Sorry to keep harping on this subject, but I have a couple more observations.

One: I have managed to confirm aurally the pitch difference between single- and three-string unisons, at least in parts of 8ves 5 and 6. I hadn't been able to hear this difference earlier, when I compared beat rates of 10ths and 17ths with and without a mute inserted. I had played the 10th or 17th with all three strings sounding, then with only a single string, relying on my memory to compare the beat rates. I guess my memory wasn't good enough. The way I have managed to hear the difference is to compare the beat rate against a benchmark: a parallel 10th or 17th a semitone higher or lower, with a beat rate equal or very close to that of the interval in question. When I compare the benchmark beat rate to the rates of the 10th or 17th played alternately with a single string and then with all three strings, I can hear a difference that experience tells me is in the .1 to .3 cent range - just perceptible. I can hear this difference in the area from about E5 to G6. Above G6, I simply can't judge well enough to be certain. So enrol me among the ranks of the aural believers, with the caveat that I can only hear this in 8ves 5 and 6, never in 8ve 4.

In spite of being able to hear the difference in beat rates, I cannot detect it with my SAT. Choosing unisons whose three strings all "read" clearly and reliably, I continue to come up with identical readings for any individual string and for all three strings read together. I raise the cents value by .2 cents and both read flat (barely); I lower it by the same amount, and both read sharp (again barely). So the question I have, which maybe Dean Reyburn can answer, is why the RCT is able to pick this up while the SAT is not. I don't think it is just a matter of relative accuracy (the SAT is very reliable within .2 cents, even accounting for the difficulty of interpreting the lights).
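For a rough sense of the numbers involved, here is a minimal Python sketch of how a .1-.3 cent shift in the upper note changes the beat rate of a major 17th. The frequencies assume A4 = 440 Hz equal temperament, the beat is taken between the lower note's 5th partial and the upper note's fundamental, and inharmonicity is ignored (in a real piano it widens these intervals considerably), so treat this as an illustration, not a tuning calculation:

```python
def cents_to_ratio(cents):
    # Convert a cents offset to a frequency ratio (100 cents per semitone)
    return 2 ** (cents / 1200)

def beat_rate_17th(f_lower, f_upper, upper_offset_cents=0.0):
    # Beats per second between the lower note's 5th partial and the
    # upper note's fundamental, with the upper note offset by some cents
    return abs(5 * f_lower - f_upper * cents_to_ratio(upper_offset_cents))

# Nominal equal-tempered frequencies (Hz) at A4 = 440
C4, E6 = 261.626, 1318.510

base = beat_rate_17th(C4, E6)                          # all in tune
shifted = beat_rate_17th(C4, E6, upper_offset_cents=0.2)  # upper note +0.2 cents
print(f"beats/s: {base:.2f} -> {shifted:.2f}, change {shifted - base:.3f}")
```

A 0.2 cent shift at E6 moves the beat rate by only about 0.15 beats per second against a baseline near 10 beats per second, which is consistent with needing a steady benchmark interval to hear it at all.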
Maybe it has to do with the time element - how long a sample the RCT takes before measuring vs. the SAT. BTW, in 8ve 4 I am able with the SAT to measure a pitch difference of about .3 cents between a note played with a fairly loud blow and one played pianissimo. (I can't measure the same in 8ve 5, but I think that is due to the short sustain, and the SAT's difficulty in getting a "fix" on pitch at low decibel levels.) Dean observed that a given hammer-blow strength might generate greater amplitude in a single string with the other two muted than in all three strings excited by an equal blow - more of the force of the blow would be focused on the one string - and that this might account for the pitch difference between single- and three-string unisons. I think that is the most satisfactory explanation I have heard, but I don't see why the SAT wouldn't pick it up.

My second observation: I still caution against making more of this than is appropriate. I don't think we need worry about taking it into account when tuning the bass or midrange. Why not? Because it doesn't seem to affect those areas at all. And for "average" tunings, with pitch changes of 5 cents or more (at least that describes my average tuning), time and effort are much better spent getting stability and good, solid unisons than fussing with a possible .1 - .3 cent difference in 8ve widths. That said, when it comes to concert tuning, where the piano is within about .5 cents, I can certainly see taking this into account. Just my two, or maybe three, cents' worth.
This PTG archive page provided courtesy of Moy Piano Service, LLC