At 7:13 PM -0400 7/2/00, A440A@AOL.COM wrote:
>JIM writes:
><< In a truly "consistent" setting of aftertouch throughout the scale the
>"dip" would look more like a sawtooth edge rather than a straight line. >>
>
>Yes, I agree, and it does look like that (though very slightly).

This is a very interesting effect, and entirely explainable, as g'boy JimBeau points out:

>>>This is because of inconsistencies in knuckle size, string heights,
>>>hammer heights, etc.

>The area we are dealing with in the "aftertouch arena" is in the perception
>of evenness for the performer.

This is why I made the analogy to ET and HT (and I'm thankful Bill didn't go there; he's got a lot to offer on *this* subject... it was actually part joke). Due to the lack of gun-barrel-straight lines in the action, errors in the system (or rather, deviations from straight lines) will also show up here.

Where do you put this error? With a decided blow and a decided dip, it's determined that the aftertouch will be "in error". Most of us would agree that we don't want that error in the blow, with a slightly rumpled hammer line. And we also seem to feel that for the pianist, aftertouch is a far more immediate perception than "bulk" dip. That's why it seems to me such a natural to put the error in the dip. That is, to base the top action regulation on a decided blow, and then adjust the dip, however raggedly, to achieve uniform aftertouch.

Ragged dip? Who cares. A 0.010" step in dip between adjacent naturals is far better than a level dip at that spot and a 0.010" shift in the starting location of aftertouch. That 1/100" is 2.5% of the dip but 20% of the aftertouch. Nothing is happening during the aftertouch; the work of launching the hammer is complete and the parts are "heading for the back wall". You might say that aftertouch is wasted motion. That's similar to saying that actions should have no friction. I think it was David Stanwood who once said: if you can measure it, you can regulate it.
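The 2.5%-versus-20% comparison above can be checked with a quick sketch. The nominal dimensions below (0.400" total dip, 0.050" aftertouch) are assumed typical values for illustration, not figures from the post; only the 0.010" error is from the text:

```python
# Why a 0.010" error hurts less in the dip than in the aftertouch.
# DIP and AFTERTOUCH are assumed typical dimensions, not from the post.
DIP = 0.400         # total key dip, inches (assumed)
AFTERTOUCH = 0.050  # aftertouch portion of that travel, inches (assumed)
ERROR = 0.010       # step between adjacent naturals, inches (from the post)

dip_pct = ERROR / DIP * 100                # fraction of the whole dip
aftertouch_pct = ERROR / AFTERTOUCH * 100  # fraction of the aftertouch

print(f"{dip_pct:.1f}% of dip, {aftertouch_pct:.0f}% of aftertouch")
```

The same absolute error is perceptually eight times larger when it lands in the short aftertouch phase than when spread over the full dip, which is the whole argument for parking it in the dip.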
Here we are fine-tuning aftertouch, and doing it directly, without the side effects on the action regulation that come from changing the blow.

>A .010" difference in aftertouch from one key to the next seems to be
>more apparent than the same amount of difference in two keys' actual dip.

Hear, hear!

>For the highest-order regulation, I first set the keydip to as close a
>consistent, static dimension as possible, then if the need for consistent
>aftertouch wants me to add or subtract more than .010", I move the hammer up
>or down to make up the difference. This splits the differences caused by
>inconsistent geometry into two separate areas, both of which see only half of
>the variation, and essentially renders it below the level of perception.

Very interesting strategy. BTW, are you doing this for individual hammers or section by section? (IOW, is the hammer line a little bumpy, or are different sections of the action set to different blows?) If the former (bumpy hammer line), does this pose a problem for re-regulation later on, after the hammer line (and everything else) settles under use and you have to re-establish that hammer line? I've always found a straight line is easier to recall than a bumpy one.

Finally, the dip is the best, most efficient place to put this error. To remove a 0.010" deviation (error) from the aftertouch, we'd have to alter the blow by just under 1/16", that is, (0.010") x (action ratio of 6:1). IOW, the effectiveness of the blow as a place to vary in pursuit of a proper aftertouch is inversely proportional to the action ratio, which can be anywhere from 5:1 to 8:1.

Fascinating thread, this.

Bill Ballard, RPT
New Hampshire Chapter, PTG

"No one builds the *perfect* piano, you can only remove the obstacles to that
perfection during the building."
...........LaRoy Edwards, Yamaha International Corp.
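The leverage arithmetic in Bill's closing paragraph can likewise be put in numbers. A minimal sketch, assuming only what the post states (a 0.010" aftertouch error and action ratios running from 5:1 to 8:1); the function name is mine:

```python
# Blow-distance change needed to absorb a given aftertouch error:
# the error at the key is multiplied up through the action ratio.
def blow_change_for(aftertouch_error, action_ratio):
    """Change in blow that produces the same change in aftertouch."""
    return aftertouch_error * action_ratio

ERROR = 0.010  # inches of aftertouch deviation (from the post)

for ratio in (5, 6, 7, 8):  # typical action ratios, per the post
    print(f"{ratio}:1 ratio -> {blow_change_for(ERROR, ratio):.3f}\" of blow")
```

At 6:1 the required blow change is 0.060", just under 1/16" (0.0625"), matching the post; the higher the ratio, the larger the hammer-line disturbance needed, which is why the dip is the cheaper place to absorb the error.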
This PTG archive page provided courtesy of Moy Piano Service, LLC