[Stk] Bug in ADSR.setAllTimes decay rate?

Perry R Cook prc at cs.princeton.edu
Mon Jan 20 18:15:17 PST 2014


One last chime in from me:

Actually, if we think about what an ADSR is, or
was in the analog days, the existing behavior 
is the compatible one.  There's no way an analog
circuit would adjust capacitance/resistances for
decay and release when sustain level is adjusted.
In that case, decay and release would likely mean "time
to go from full scale to zero," and changing the sustain
level would make the actual times different, as
they are in our ADSR.

I now propose no change to anything.

PRC

----- Original Message -----
Perry, I think I've confused things for everybody. Here's a recap. 


- ADSR.setAllTimes effectively misinterprets the decay times.

- Correcting ADSR.setAllTimes is simple but changes the behavior of all existing calls.

- With the correction, the resulting decay times will be what the caller specifies, but these usually differ from the current effective decay times.

- If that difference is audible, the calls to ADSR.setAllTimes could be modified, changing the decay argument to match the "incorrect" decay time that ADSR has been producing.

- However, expecting that all existing calls to ADSR.setAllTimes from outside STK would be modified is impractical.


I want to make it clear that, as a total newbie to STK, I'm not taking a position on whether a change should be made. It seems like a quandary from the standpoint of backward compatibility.


Bob H 




_______________________________________________
Stk mailing list
Stk at ccrma.stanford.edu
http://ccrma-mail.stanford.edu/mailman/listinfo/stk


