In most 3rd-party libraries the "Drive" level is set by default to 0 dB, but in VST Plugin Analyser the frequency response at that level does not match the corresponding reference response in the manual. At -30 dB, however, everything matches.
As I understand it, the true hardware response should always be at the minimum Drive level in all libraries, and the rest is just fake harmonic distortion?
Am I correct about this? If so, why do all developers set it to 0 dB?
An example with the AlexB Modern Tube Console channel is attached, with links:
Decrease the Nebula input by around 3 or 6 dB and try again. You are feeding Nebula a 0.00 dBFS signal, which is at the limit of distortion; an emulation preset should be calibrated to work at nominal levels (EBU R68: -18 dBFS, SMPTE RP 155: -20 dBFS, etc.).
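To put numbers on that calibration point, here is a quick sketch (helper names are mine; just a plain dB-to-amplitude conversion) of how far a full-scale signal overshoots a preset calibrated to the EBU R68 nominal level:

```python
def db_to_linear(db: float) -> float:
    """Convert a level in dB to a linear amplitude ratio."""
    return 10.0 ** (db / 20.0)

# Hypothetical example: a preset calibrated for EBU R68 (-18 dBFS nominal)
nominal_dbfs = -18.0
signal_dbfs = 0.0   # a full-scale test signal, as in the measurement above

overdrive_db = signal_dbfs - nominal_dbfs
print(f"Signal is {overdrive_db:.0f} dB above the calibrated nominal level")
print(f"That is roughly a {db_to_linear(overdrive_db):.1f}x amplitude ratio")
```

So a 0 dBFS sweep hits the preset about 18 dB (nearly 8x in amplitude) hotter than the level it was calibrated for, which is why the measured response differs from the manual's reference.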
Enrique Silveti. Acustica Audio customer and technical support.
But I thought that developers sampled their libraries at +6 and -6 dB (the true dynamic range), and that I should send the signal to Nebula at -18 dBFS (0 VU) with the Nebula IN knob at its default. And if I need to push into that true driven dynamic range, I set IN to +6 dB while decreasing OUT by 6 dB.
So I still do not understand why the fake Drive knob is set to 0 dB by default and adds fake harmonic distortion in the emulation preset.
Sorry for the persistence, I just want to understand)
Last edited by Shibata on Sat Aug 03, 2013 1:33 pm, edited 1 time in total.
it isn't generating fake harmonics. the harmonics are sampled from whatever hardware was sampled.
you could say you get an 'unnatural' result if you boost OR lower it from 0db. the drive control is really just a mixer level adjustment control for the sampled harmonics. so it just allows you to raise or lower the level of the distortion directly, relative to the sampled fundamental.
if you increase that control, you are artificially raising the level of the distortion relative to the fundamental. so that could be considered unnatural. if you lower it below 0db, again, you are getting an unnatural result. you leave it at 0db, and you have the natural ratio/relationship between the sampled distortion and fundamental.
if anything, it's probably a bit misleading for the control to be called 'drive'. you could consider it a 'fake' drive control, because drive is typically what you call it when you boost the input level going into something, which generates more distortion. this drive control doesn't do that. it just directly raises the level of the distortion. so in that sense, you could say it's a fake drive control. but the harmonics themselves are not fake. anyway, this is why i always rename the control to 'dist' in my presets, which i think is a bit more accurate of a description.
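cupwise's "mixer level for the sampled harmonics" description can be sketched as a toy model (illustrative only, not Nebula's actual engine; all names here are mine):

```python
import numpy as np

def render(fundamental: np.ndarray, sampled_harmonics: np.ndarray,
           drive_db: float = 0.0) -> np.ndarray:
    """Toy model of the 'drive' control: it only scales the sampled
    harmonic content relative to the fundamental; it does not push
    the input harder into a nonlinearity."""
    drive_gain = 10.0 ** (drive_db / 20.0)
    return fundamental + drive_gain * sampled_harmonics

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
fund = np.sin(2 * np.pi * 100 * t)          # sampled fundamental
harm = 0.05 * np.sin(2 * np.pi * 200 * t)   # sampled 2nd harmonic

natural = render(fund, harm, drive_db=0.0)  # the sampled ratio, untouched
boosted = render(fund, harm, drive_db=6.0)  # ~2x the harmonic level: 'unnatural'
```

At 0 dB the sampled distortion-to-fundamental ratio passes through unchanged; any other setting rescales the (real, sampled) harmonics away from that natural ratio.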
Oh, thank you, Tim! Finally everything comes together) That was my fault; I figured that the sampled distortion and the fundamental were the same thing in the EP. Sorry for the misleading topic.
It is a little more accurate to describe it as an 'independent' harmonic distortion control than as a fake drive control. To add some information that I hope is helpful to the question:
Having the drive knob is an added feature that is not strictly necessary, but if it is there, there is a reason for it to be at 0 dB. This does not mean there is 0 dB of drive. It means that at the 12 o'clock position it is neither boosting anything extra nor cutting anything from the pre-edited level.
The program itself can be edited to set the amount of harmonics present, and any rule can be used to decide this level.
For instance, you may not get the equipment's best, most obvious harmonic distortion characteristic by sampling at a low average level; you may need to drive the input a little to get the best response. But when you edit the program, you want it to be useful under a wide range of circumstances. Some developers may start at a median or average level and edit so that going above this level at the NebPro input can add too much distortion; others may also take into account how loud a WAV file can actually be in digital.
None of these program-creation decisions are based on the level of the 'drive' feature. It defaults to "0dB" at start-up unless edited to be different.
I think the confusion may be the assumption that this should mean there is no distortion, and it does not mean that. It instead means that at 12 o'clock you are using the amount of distortion that was programmed into the preset. By turning it above this position you are adding the 'fake' response mentioned by Cupwise. If you turn it down, you are removing some of the real harmonic response as well. Both of these moves can work fine within the context of using the program, but you get the most accurate harmonic distortion by leaving this control alone and instead turning the input gain knob up or down. That way the distortion tracks the dynamics (volume) of the program and never acts independently.
The "Drive" knob, by contrast, does act independently of gain, so, especially when boosting the drive, the result can become more and more unrealistic.
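The contrast between real drive (more input gain into a nonlinearity) and the independent Drive knob can be shown with a toy soft-clipping stage (a hypothetical cubic nonlinearity, not how Nebula works internally):

```python
import numpy as np

def toy_device(x: np.ndarray, in_gain_db: float = 0.0) -> np.ndarray:
    """Toy nonlinearity: driving the input harder raises the distortion
    *together with* the level, the way a real input gain control does."""
    y = 10.0 ** (in_gain_db / 20.0) * x
    return y - 0.1 * y ** 3              # illustrative soft cubic stage

def h3_ratio(y: np.ndarray) -> float:
    """3rd-harmonic level relative to the fundamental, via FFT bins."""
    spec = np.abs(np.fft.rfft(y))
    return spec[3] / spec[1]

# exactly one cycle of a half-scale sine
x = 0.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, 1024, endpoint=False))

quiet = toy_device(x, in_gain_db=0.0)
hot = toy_device(x, in_gain_db=6.0)

# With real drive, the distortion *ratio* grows with level (here roughly
# 4x for +6 dB of input gain). The Drive knob instead rescales the harmonic
# level by exactly the amount dialed in, independent of the input level.
print(h3_ratio(quiet), h3_ratio(hot))
```

That is why riding the input gain keeps the distortion behaving dynamically like the sampled hardware, while the Drive knob only offsets the harmonic mix.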
Hopefully that adds helpful info to the already good answers above!
the vst analyzer actually measures the frequency response using a low-level signal, so it shows you the softest sampled layer plus ringing artifacts caused by the Nebula playback engine. The image you posted differs from the "official" one probably because a lower input signal was used for the frequency-response measurement published in the manual.
If fake harmonic distortion were imparted, the dynamic response of every piece of sampled gear would be exactly the same, like the dynamic response of most algorithmic plugins; luckily, it isn't.