I have done the upgrade to Server now, and I'm trying to check out the RAM-sharing feature. So I opened a 44 kHz project, loaded a Nebula Server instance with a TB+ program (it's called 'Tapesat44 2dB'), and copy/pasted it a few times to see whether RAM use grows or not.
I use Reaper, and looking at the Performance Meter window the results are as follows:
So I can at least see that RAM use is lower than before. It's not 80 MB (which would be full RAM sharing), but at least it's not 80 MB multiplied by 8 instances (640 MB). Well, this is incredible and amazing, it's Christmas in April for me. Anyway, is there something I can do inside Nebula to get better RAM sharing, or is it something inside Server that's untouchable?
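To put rough numbers on it (a back-of-the-envelope sketch using only the figures from this post; the 20 MB of per-instance state in the last line is a made-up illustration, not a measured value):

```python
# Rough RAM estimate for N identical Nebula instances.
# The 80 MB program size and 8 copies come from the post above.

def ram_no_sharing(program_mb, instances):
    """Every instance keeps its own full copy of the sampled program."""
    return program_mb * instances

def ram_full_sharing(program_mb, instances, per_instance_mb=0):
    """One shared copy of the program, plus any unshareable per-instance state."""
    return program_mb + per_instance_mb * instances

print(ram_no_sharing(80, 8))        # 640 MB: the pre-Server worst case
print(ram_full_sharing(80, 8))      # 80 MB: the ideal, fully shared case
print(ram_full_sharing(80, 8, 20))  # 240 MB: shared program + hypothetical 20 MB/instance
```

The observed value landing somewhere between 80 MB and 640 MB is consistent with the program data being shared while some per-instance state is not.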
But the problem is that playback through this chain is very bad, with a lot of dropouts. The CPU meter sits at about 40%, which was never a serious issue in my usual Reaper work with Nebula Pro; I mean, I didn't run into this dropping-hard-to-work problem before, at least not unless I used a much bigger number of Nebulas in the project. My PC is a Core Duo with 2 GB RAM. It's not a beast, but before Server I didn't hit this issue often.
So is this common for any of you?
I mean, I had hoped that RAM sharing would eliminate the duplicated RAM use entirely, and that turns out not to be true, but at least the usage is much lower than before, so that's OK. The dropping worries me, though, because if this is my new normal it means I haven't got a real enhancement. Less RAM use but more dropouts...
Oops, please enlighten me a little, because maybe I'm being thick and can't see the forest for the trees...
You could try setting your DSP buffer to 2048 on the MAST page in Nebula and in your DAW. You could also try some PC tuning: under the Advanced Performance Options of System Properties, choose "Background services" for processor scheduling, and choose best performance instead of nice looks for visual effects...
Yes, you "save resources", but not ALL resources can be shared. For example, real-time resources like vector data can't be, because they are merged in real time and every instance is in a different state. Everything that can be shared now is shared; we can't optimize it further...
If you have a new installation, check your MAST page. The buffer is set pretty low by default, so bring it up until you don't hear any dropouts. The DAW ASIO buffer can be set to the same value or double it, to reduce CPU load.
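For what it's worth, the trade-off when raising the buffer is latency: one buffer adds buffer_size / sample_rate seconds of delay. A quick sketch at the 44.1 kHz rate of the project above (the buffer sizes are just common example values):

```python
# Delay contributed by one audio buffer: buffer_size / sample_rate seconds.
SAMPLE_RATE = 44100  # Hz, matching the 44 kHz project mentioned above

def buffer_latency_ms(buffer_samples, sample_rate=SAMPLE_RATE):
    """One buffer's worth of delay, in milliseconds."""
    return buffer_samples / sample_rate * 1000

for size in (256, 512, 1024, 2048):
    print(f"{size:5d} samples -> {buffer_latency_ms(size):6.1f} ms")
```

So a 2048-sample buffer adds roughly 46 ms per buffer stage, which is why it helps against dropouts but hurts a low-latency tracking workflow.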
System 1: Windows 8 32-bit - Samplitude Pro X, Tracktion 6, Reaper. System 2: Mac OS X Yosemite - Reaper (32+64-bit), Tracktion 6 (32+64-bit)
Both systems on: MacBook Pro (late 2009), Core 2 Duo 3.06 GHz, 4 GB RAM, graphics: NVIDIA GeForce 9600M GT 512 MB
mathias wrote: If you have a new installation, check your MAST page. The buffer is set pretty low by default, so bring it up until you don't hear any dropouts. The DAW ASIO buffer can be set to the same value or double it, to reduce CPU load.
Yes. Nebula on first run will copy your ASIO buffer size. This could be a problem if you have a low-latency workflow.
Enrique Silveti. Acustica Audio customer and technical support.
I am also a Reaper user, and you can save yourself this issue if you get familiar with batch processing. It's awesome, especially as you get to know the many Nebula presets. I typically batch-process the early Nebula stages (tubes, tapes, preamps, "mojos"), so by the time I am actually mixing, my project effectively has hundreds of Nebula presets on it but hardly any of them running on the project itself. It took me a long time to adjust my brain to this workflow, but now I am comfortable.

The most interesting thing about it for me (and it may not work for others) is that it urges me to MAKE DECISIONS based on my ears and then commit to them, like working in actual analog. And when I am at the mix stage, it is fun and fast, because Reaper is not bogged down like it was when I used to try to run as many Nebulas as possible. I come from an all-analog background, and this is by far the closest to my old workflow I have ever gotten with digital audio, apart from the day or more of boring, highly technical, geeky batch processing. Best of all worlds! The act of committing has actually helped my work too: things flow better because I have eliminated some of the endless tweaking that used to haunt me. Cheers!
I do indeed work that way too, applying lots of Nebulas and rendering, going through the project destructively and making decisions without looking back. And it's true that this way you don't need a lot of Nebulas running at the same time, yes. But I had thought I would be able to set up a few inputs loaded with Nebulas and record through them in real time, emulating a desk at the tracking stage, hearing how every track will sound. That way you could record and render immediately, getting a set of "like analog" tracks and avoiding having to start from dry tracks.