
0dbFS and r2r, crank the input?


Re: 0dbFS and r2r, crank the input?

Postby TranscendingMusic » Tue Aug 23, 2011 10:55 pm

ngarjuna wrote:I hear what you mean but if you're talking about peaks as the constraining value (I presume you mean 0dBFS, right?) then you still don't need a peak meter: all you need is a simple clip light.

Yes, 0dBFS, and sorry I didn't make it clear enough: I'm not stressing an absolute peak meter, but ITB you are usually working with a peak meter anyway, since there are no freestanding virtual LEDs, of course.

And for that matter: with Nebula how many programs can you safely push up to 0dBFS? The real constraining factor on these programs is not a peak ceiling at all but rather overdriving the program too far (if you've gone over 0dBFS then you have bigger problems than just how hard you're hitting Nebula!); so while peak metering would certainly give you clues about that, no more so than average metering as far as I can reason.


Well, yes and no. In general, as we discussed, and very fundamentally so, analog gear also has limitations where you can make it sound just as ugly in its own way as digital. Right? So yes, we should be mindful of that fact. But in the ins and outs of our case here, it should still be looked at as a digital problem. Look at it as a digital container that possesses an analog interlude for character and flavor but comes back through that same container. Our problem is still, for example, driving programs too hard and producing artifacts that involve digital errors, because you are capping off not against an analog unit but against digital code holding the analog flavor. So on these programs, as long as we are working with 1's and 0's, your limitation is always the ceiling. The sonic exception to this is to create virtual soft clipping, which can change the perception that you are driving towards a ceiling rather than a hard "yes" or "no" thing. The average meter, again, won't tell you that you are crossing the line, because you don't know what your peaks are doing "up there".
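To put a number on that, here's a tiny Python sketch (helper names are my own, nothing Nebula-specific): an RMS/average meter can read a comfortable -23 dBFS while a single transient sits a fraction of a dB from the ceiling.

```python
import math

def dbfs_peak(samples):
    # Peak level in dBFS: loudest absolute sample relative to full scale (1.0).
    return 20 * math.log10(max(abs(s) for s in samples))

def dbfs_rms(samples):
    # RMS ("average") level in dBFS.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms)

# One second of a quiet 440 Hz sine, with a single near-full-scale transient:
signal = [0.1 * math.sin(2 * math.pi * 440 * i / 44100) for i in range(44100)]
signal[100] = 0.99

print(round(dbfs_rms(signal), 1))   # about -23 dBFS: the average meter looks safe
print(round(dbfs_peak(signal), 1))  # about -0.1 dBFS: one sample from the ceiling
```

The average reading barely notices the transient at all, which is exactly the "you don't know what your peaks are doing up there" point.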

There's also the matter of calibration: many programs have a fairly small sweet spot (compared to purely digital effects, anyway) which is entirely derived from VU/average value. So if Alex advises you try to hit his processor at about -18dBFS average and you're watching a peak meter...how does that work?


But understand this: it's all about the character, not necessarily a clip point. We aren't saying that once you go past -18 it's an absolute clip as far as the gear is concerned, and I think that's where users have to distinguish and remember what realm they are operating in. An "about" value is the key phrase here. And again, just as when working on an analog piece of gear, you are always dealing with about values, at least initially.

We've already agreed that peaks don't necessarily reflect useful information about their corresponding average values. You're always guessing/going by ear in terms of hitting your processors in their sweet spots. I'm in favor of using one's ears but that's going to be a lot harder for someone who has never used a Vintage Blue Console in real life (for one random example). Seems to me to maximize getting into those sweet spots you definitely want an averaging meter even if one were taking your advice and also paying close attention to their peaks.


This is tricky, because what is referred to as an average level is different pre-processing and post. When a mix or track has been massaged, the average level changes; usually you are looking at a more controlled, reduced dynamic range. And if I'm going back through a processor I used in the analog domain, I'm still listening to how I am driving it/pushing it to find its sweet spot. Again, there's no problem if you want to see how your average level is sitting at this point using a meter, I'm not against it, but you are still dictated by your peaks. In other words, you can't just decide to place the signal at -18; sometimes that works, sometimes it won't, all case dependent.

The converse (monitoring VU by meter and peak by ear) is a lot easier: your VU meter gets you right into the ballpark and you know if your crest factor is abusive because most Nebula programs will start ringing like a dinner glass when you pump excessive transients into them. Whereas the opposite (monitoring VU by ear and peak by meter) you may or may not easily hear that sweet spot as you fluctuate between -4dB VU and +1dB VU (depending on one's experience, how busy the mix is, etc.).


But you see, monitoring by VU gets you into a ballpark. But what is the adjustment to avoid clipping and artifacts based on? It's based on peak values. That ballpark would have been found in passing while monitoring the peaks. Your final resting place would be based on those peaks.
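The gap between those two readings (peak minus average) is the crest factor, which is easy to sketch in plain Python (helper name is mine):

```python
import math

def crest_factor_db(samples):
    # Crest factor: peak level minus RMS level, in dB.
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(peak / rms)

# A steady sine sits at ~3 dB; a mostly-quiet signal with one hard transient
# can easily exceed 20 dB -- the kind of signal that makes programs ring.
sine = [math.sin(2 * math.pi * i / 100) for i in range(1000)]
spiky = [0.05] * 999 + [1.0]

print(round(crest_factor_db(sine), 1))   # ~3.0 dB
print(round(crest_factor_db(spiky), 1))  # ~24.6 dB
```

A VU meter alone can't distinguish those two cases; a peak reading on top of it can.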

But I guess we approach gain staging from a pretty radically different perspective; I am firmly in the camp of engineers who hold that you treat ITB the same as you treat OTB, and you end up with the same good mixes ITB that you used to get OTB. Sure, you don't get all that clean headroom digital could be offering you, but you don't lose anything by not driving your digital gain either.


I do treat it similarly, but knowing that my limitation behavior does differ and that working with a digital container imparts an absolute ceiling, I have to be mindful of who the boss ultimately is. Gain staging, no matter what, is an approximation, always. Whatever hovers near the ceiling is what dictates how I set levels. This is much more pertinent ITB. A VU meter in and of itself can help me understand a value of loudness or density, but for gain staging I don't use it, and in turn I don't have any artifacts propagated within my work, because I stay well within an approximation that gives me the color I need and keeps me safe from the digital container it finds itself in.
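For anyone keeping score of the -18 numbers in this thread: the VU/dBFS relationship is just a fixed offset once you pick a calibration. A minimal sketch, assuming the common 0 VU = -18 dBFS alignment discussed here (constant and function names are mine):

```python
# Assuming the 0 VU = -18 dBFS calibration used throughout this thread.
REF_DBFS = -18.0

def dbfs_to_vu(dbfs, ref=REF_DBFS):
    # A signal averaging at `ref` dBFS reads 0 VU.
    return dbfs - ref

def vu_to_dbfs(vu, ref=REF_DBFS):
    return vu + ref

print(dbfs_to_vu(-18.0))  # 0.0 -> right at the sweet spot
print(vu_to_dbfs(1.0))    # -17.0 dBFS -> "+1 VU", still 17 dB below the ceiling
```

Different developers may calibrate differently, so the reference is a parameter rather than a constant baked into the math.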


Postby ngarjuna » Tue Aug 23, 2011 11:48 pm

Thanks for the thoughtful reply. It occurs to me that I should mention: I really don't do much mastering at all. So everything I say is from a mixing perspective. Obviously if I was mastering I would be quite damned worried about 0dBFS and precision metering at the top of the dynamic range. I do sometimes forget that there are other people in audio besides mixers.
TranscendingMusic wrote:Well, yes and no. In general, as we discussed, and very fundamentally so, analog gear also has limitations where you can make it sound just as ugly in its own way as digital. Right? So yes, we should be mindful of that fact. But in the ins and outs of our case here, it should still be looked at as a digital problem. Look at it as a digital container that possesses an analog interlude for character and flavor but comes back through that same container. Our problem is still, for example, driving programs too hard and producing artifacts that involve digital errors, because you are capping off not against an analog unit but against digital code holding the analog flavor. So on these programs, as long as we are working with 1's and 0's, your limitation is always the ceiling. The sonic exception to this is to create virtual soft clipping, which can change the perception that you are driving towards a ceiling rather than a hard "yes" or "no" thing. The average meter, again, won't tell you that you are crossing the line, because you don't know what your peaks are doing "up there".

That’s an interesting thought and you’re right. I just look at it a little differently: you’re right that the behavior in question (the problem behavior) is essentially a digital problem as you describe above; but the actual range, the sweet spot in which you can avoid said problematic behavior, is essentially derived from the analog audio path itself (as locked in by the developer; obviously Nebula programs don’t have to resemble the operating ranges of the original hardware, but they often do, since fidelity and integrity of the model are so prized in the Nebula community). So while the symptom itself and the precise underlying cause are indeed digital, the limitation itself is actually introduced as part of the analog path, preceding even the sampling. In a practical sense: it’s pretty unlikely, if you’re within the boundaries of what would sound decent on the hardware itself, that you’re going to exceed the Nebula program of that hardware within the same boundaries. That's not inherent but, the way things are going with Nebula emulations, it's a pretty safe bet right now.
TranscendingMusic wrote:But understand this: it's all about the character, not necessarily a clip point. We aren't saying that once you go past -18 it's an absolute clip as far as the gear is concerned, and I think that's where users have to distinguish and remember what realm they are operating in. An "about" value is the key phrase here. And again, just as when working on an analog piece of gear, you are always dealing with about values, at least initially.

Yes, very much so. It’s implicit in that it’s an “average” value but it’s worth pointing out explicitly. If one were to complain (not that anyone has but still) that average values lack precision in terms of what they are informing you about that would be 100% on the mark. I would nonetheless posit that peak values have sort of the opposite problem: they offer total precision and little context. For me, I want context from my meters. I just don't have a lot of call for precise values when I'm mixing, it's usually TMI. The meters are sort of like the lights on the runway.
TranscendingMusic wrote:This is tricky, because what is referred to as an average level is different pre-processing and post. When a mix or track has been massaged, the average level changes; usually you are looking at a more controlled, reduced dynamic range. And if I'm going back through a processor I used in the analog domain, I'm still listening to how I am driving it/pushing it to find its sweet spot. Again, there's no problem if you want to see how your average level is sitting at this point using a meter, I'm not against it, but you are still dictated by your peaks. In other words, you can't just decide to place the signal at -18; sometimes that works, sometimes it won't, all case dependent.

The ear is always right as far as I’m concerned. I definitely don’t rigidly set every track I have to -18.00000dBFS with some kind of complicated averaging system; like you said it’s just a ballpark, from there my ear makes all the decisions. But I think what you’re saying (and I agree it merits repeating) is that you don’t treat -18 average/VU/RMS as an absolute or clip line, it’s neither.
TranscendingMusic wrote:But you see, monitoring by VU gets you into a ballpark. But what is the adjustment to avoid clipping and artifacts based on? It's based on peak values. That ballpark would have been found in passing while monitoring the peaks. Your final resting place would be based on those peaks.

Technically I guess. But I seriously can’t remember the last time I had a signal locked in on an input at 0VU (I work at 18 across the board) and had to worry about my peaks at 0dBFS. Even in classical/jazz contexts I don’t recall having that problem much less in the considerably less dynamic genres like funk, rock, country (seems to be what dominates my work lately anyway).
To me that’s the systematic advantage of observing VU gain staging in a digital context: you really don’t have to sweat the output metering (other than the two buss maybe) in terms of transients and level violations. In Nebula we all kind of have to observe these levels (or many of the programs will sound pretty craptastic) but it’s something we kind of take for granted: not that many people are observing these kinds of gain rules in digital. The old “you should record as hot as you can to use all of the 16bit range that you can” combined with good old psychoacoustics (who doesn’t like it better a little louder??) has made Nebula style levels outdated, which is a shame because the systematic advantage is worth having.
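The old "use all the bits" advice also doesn't survive simple arithmetic at modern bit depths. A back-of-envelope sketch (~6.02 dB of range per bit, which is just 20·log10(2)):

```python
import math

db_per_bit = 20 * math.log10(2)  # ~6.02 dB of dynamic range per bit

for bits in (16, 24):
    full_range = bits * db_per_bit
    below_minus_18 = full_range - 18  # range left under a -18 dBFS average
    print(f"{bits}-bit: {full_range:.1f} dB total, "
          f"{below_minus_18:.1f} dB still available below -18 dBFS")
```

Even at 16-bit, averaging around -18 dBFS leaves roughly 78 dB underneath the signal; at 24-bit it's about 126 dB, far below any analog noise floor in the chain.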
I’m sure Bob knows this but I’ll say it anyway: none of what I just said is a rule. Rule of thumb maybe (according to some anyway) but it’s not strictly required. One way or another you are going to have to kind of understand what you’re doing level wise to make it all work out but, assuming you do, there are indeed many ways to do so. Bob’s quote (below) illustrates that perfectly: he does his thing without even using a VU meter. Certainly a very different approach than mine:
TranscendingMusic wrote:I do treat it similarly, but knowing that my limitation behavior does differ and that working with a digital container imparts an absolute ceiling, I have to be mindful of who the boss ultimately is. Gain staging, no matter what, is an approximation, always. Whatever hovers near the ceiling is what dictates how I set levels. This is much more pertinent ITB. A VU meter in and of itself can help me understand a value of loudness or density, but for gain staging I don't use it, and in turn I don't have any artifacts propagated within my work, because I stay well within an approximation that gives me the color I need and keeps me safe from the digital container it finds itself in.


Postby yr » Wed Aug 24, 2011 8:14 am

ngarjuna wrote:the actual range, the sweet spot in which you can avoid said problematic behavior, is essentially derived from the analog audio path itself (as locked in by the developer; obviously Nebula programs don’t have to resemble the operating ranges of the original hardware, but they often do, since fidelity and integrity of the model are so prized in the Nebula community).


I don't believe this statement is correct; there are enough presets out there with a smaller usable dynamic range than the actual hardware (at least upwards).

ngarjuna wrote:I seriously can’t remember the last time I had a signal locked in on an input at 0VU (I work at 18 across the board) and had to worry about my peaks at 0dBFS.


I have the opposite experience, even after gentle gain-riding/compression. I normally end up using many presets at lower values if I want to stay clear of the digital ringing/artifacts. A good workaround, assuming you are indeed dealing with a wide dynamic range, is to process the peaks separately: clone the tracks and use different/lower settings for the highest peaks only.
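The clone-the-track idea can be sketched roughly like this (my own illustration of the splitting step, not any particular plugin chain): limit a copy at a threshold to get the "body", keep the remainder as the "peaks", and run each through differently set presets before summing:

```python
def split_peaks(samples, threshold=0.5):
    # Split a signal into a "body" limited at `threshold` and a "peaks"
    # remainder holding only what exceeded it; body + peaks == original.
    body, peaks = [], []
    for s in samples:
        clipped = max(-threshold, min(threshold, s))
        body.append(clipped)
        peaks.append(s - clipped)
    return body, peaks

sig = [0.1, 0.4, 0.9, -0.7, 0.2]
body, peaks = split_peaks(sig)
print(body)  # [0.1, 0.4, 0.5, -0.5, 0.2]

# The two halves sum back to the original, sample for sample:
assert all(abs(b + p - s) < 1e-9 for b, p, s in zip(body, peaks, sig))
```

Because the split is additive, the "peaks" track can be driven into a lower-calibrated preset without the body of the signal losing its intended drive level.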
Reuven | post-production & sound-design | scenography |
website | nebula presets


Postby ngarjuna » Wed Aug 24, 2011 2:09 pm

yr wrote:I don't believe this statement is correct; there are enough presets out there with a smaller usable dynamic range than the actual hardware (at least upwards).

For example?

yr wrote:I have the opposite experience, even after gentle gain-riding/compression. I normally end up using many presets at lower values if I want to stay clear of the digital ringing/artifacts. A good workaround, assuming you are indeed dealing with a wide dynamic range, is to process the peaks separately: clone the tracks and use different/lower settings for the highest peaks only.

This claim is bizarre: after compression you are producing tracks which are locked in at 0VU = -18dBFS and still flirting with 0dBFS? That's incredibly unlikely; steady-state signals flat out don't work that way (even if this were true for the majority of percussion, which is still a pretty big stretch).

The much more likely scenario is that your average is nowhere near -18dBFS and you're mixing hotter than that.

Even if it were happening that way, mixing with crest factors that high is a big disadvantage in Nebula: a huge percentage of the most popular 3rd party libraries will not tolerate that kind of signal when calibrated to -18dBFS = 0VU. Try hitting an AlexB program with signals peaking around 0 and it's going to be like reentering the atmosphere. One could compile a set of Nebula tools that simply didn't include any libraries calibrated that way (excluding pretty much all AlexB and Rhythminmind libraries), but I don't see what the advantage of that would be versus working in the designated operating ranges.


Postby yr » Wed Aug 24, 2011 4:08 pm

Nothing "bizarre" about it. I said gentle compression, and you clearly don't mix a lot of classical music. If you aim for 0VU for loud peaks, you will almost always hit a much higher value a couple of times during a symphonic piece.

There are plenty of presets showing digital artifacts long before 0 that I believe have a limited dynamic range. Just to name a few: some of the new tape presets and the MLC.


Postby ngarjuna » Wed Aug 24, 2011 5:09 pm

yr wrote:Nothing "bizarre" about it. I said gentle compression, and you clearly don't mix a lot of classical music. If you aim for 0VU for loud peaks, you will almost always hit a much higher value a couple of times during a symphonic piece.

Not a ton, no. Nor is it a genre I'm unfamiliar with. After more than 15 years of recording and mixing, I stand by my notion that it's bizarre to have the majority of your tracks with a crest factor greater than 18dB. Keeping in mind what crest factor is, the fact that the piece has a wide dynamic range is a totally different issue, somewhat tangential to this discussion.
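That distinction is easy to show in numbers (helper name is mine): two sine passages about 29.5 dB apart in level each have a sine's ~3 dB crest factor, and even the whole "piece" never approaches an 18 dB crest.

```python
import math

def crest_db(samples):
    # Crest factor: peak minus RMS, in dB.
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(peak / rms)

# A pp passage and an ff passage, 20*log10(0.9/0.03) ~= 29.5 dB apart in level:
quiet = [0.03 * math.sin(2 * math.pi * i / 100) for i in range(2000)]
loud = [0.9 * math.sin(2 * math.pi * i / 100) for i in range(2000)]

print(round(crest_db(quiet), 1))         # ~3.0 dB
print(round(crest_db(loud), 1))          # ~3.0 dB
print(round(crest_db(quiet + loud), 1))  # ~6.0 dB: wide dynamics, modest crest
```

A piece can span 30 dB between passages while no individual moment ever shows a crest factor anywhere near 18 dB; the two measurements answer different questions.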

There are plenty of presets showing digital artifacts long before 0 that I believe have a limited dynamic range. Just to name a few: some of the new tape presets and the MLC.

Those hardware devices have limited dynamic ranges too. What makes you think that the tolerances of Eric's tapes, for example, are different from the actual ferromagnetic limitations and tolerances of those actual tape machines? If your sweet spot on a deck is around 0VU (-18dBFS), you're not going to get anywhere near 18dB of saturation headroom; it's going to burn up right at the same point Eric's program burns up (it's just a different distortion sound, being analog instead of digital).

Same with the MLC. What's the actual headroom rating for the hardware console? I'd be willing to bet it's pretty much the same as Alex's programs.


Postby yr » Wed Aug 24, 2011 6:51 pm

I'm sorry, but you simply have no idea what you're talking about. Try recording a symphonic piece using SDC mics and a Forssell preamp (or other fast hi-end preamp), then come back and report what you've learned.

I know quite well what some of the new tape presets are capable of and how they are tuned, because I did some of the fine tuning myself. They are edited to reflect the actual THD at nominal level. While a modern tape formula gives you 14dB of headroom (sometimes more) for overshoots, I wouldn't recommend using these presets at those levels because they will ring...

As for the MLC, this is the line input @ -5dBFS, and no it doesn't reflect the headroom of the actual hardware (in case you were wondering):
[Attachment: MLC L1 -5dBFS.PNG]


Postby ngarjuna » Wed Aug 24, 2011 7:07 pm

yr wrote:I'm sorry, but you simply have no idea what you're talking about. Try recording a symphonic piece using SDC mics and a Forssell preamp (or other fast hi-end preamp), then come back and report what you've learned.

Wow, that's some bold talk there. I'm pretty lucky to subscribe to a thread which can give me such an advanced education! Because in 15 years I've never used an SDC or fast hi-end preamp, you're totally right. Nor have I ever used a tape deck. Totally.


Postby Finnish » Wed Aug 24, 2011 7:21 pm

Calm down, boys... I think we have a good conversation going on about levels in general, and especially about levels, Nebula, and the different Nebula programs.

I've learned a lot, keep the good stuff floating!!!


Postby yr » Wed Aug 24, 2011 7:22 pm

It's the reality: you don't have this specific experience/knowledge concerning classical music (as reflected by your posts), but you insist on arguing about it. I'm usually very diplomatic around here, but I do find it slightly odd that you would refer to things I'm saying (which are based on practical and theoretical knowledge) as "bizarre". I think that shows little respect...

Going back to Nebula: I only used a couple of examples, but the same holds for many presets. Nebula is great and does a wonderful job of mimicking the sound/character of analog equipment. It is not a 1:1 copy, and many times creating presets involves making decisions that are both aesthetic and technical. Understanding that invites you to do some testing and exploration, just as you would with a piece of hardware...
