Quote:
Originally Posted by wehurlbert
Now let me get this straight - you're comparing the sound from the analog
outputs of the Virus with what you're getting from your soundcard by way
of the Virus USB and your DAW. And you expect them to be the same?
Like Marc said, at the minimum you'd have to use the Virus as your sound
card. And you'll need to be sure your DAW is not touching the bits ...
...
If you've got really good A/D converters, you could record the analog outs
of the Virus back into your DAW, and clinically compare that with the USB
recording i.e. invert phase of one and listen/look. You could also do this
via the S/PDIF out of the Virus i.e. USB vs S/PDIF.
-Wayne
It's not so complicated to compare a couple of sound sources.
There are three minimal signal paths for listening to the Virus, not including the headphone jack:
- Virus USB -> D/A
- Virus S/PDIF -> D/A
- Virus Analog [ -> A/D -> D/A ]
All three of these can be recorded as digital audio (after an A/D step, in the analog case). You can then play them back through a single D/A. You can use a MIDI sequencer to be sure you are getting the same thing all three times. Since you don't care much about the timing in this case, you could probably record all three in one pass if you wanted to. In any case, it's not any harder than recording multiple tracks or takes in a song.
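Once the three captures exist as files, the "invert phase and listen/look" comparison Wayne mentions reduces to a sample-by-sample difference. A minimal sketch (the file names are placeholders, and it assumes mono 16- or 24-bit WAV exports from your DAW):

```python
# Hypothetical null test: load two mono recordings of the same Virus
# patch (e.g. the USB capture and the S/PDIF capture), subtract one
# from the other (equivalent to inverting its phase and summing),
# and report the peak residual.
import struct
import wave

def read_samples(path):
    """Read a 16- or 24-bit mono WAV file into a list of signed ints."""
    with wave.open(path, "rb") as w:
        width = w.getsampwidth()
        raw = w.readframes(w.getnframes())
    if width == 2:
        return list(struct.unpack("<%dh" % (len(raw) // 2), raw))
    # 24-bit: unpack 3 bytes at a time, little-endian, sign-extended
    return [int.from_bytes(raw[i:i + 3], "little", signed=True)
            for i in range(0, len(raw), 3)]

def null_test(a, b):
    """Peak absolute difference between two aligned sample streams."""
    n = min(len(a), len(b))
    return max(abs(x - y) for x, y in zip(a[:n], b[:n]))

# peak = null_test(read_samples("virus_usb.wav"),
#                  read_samples("virus_spdif.wav"))
# A peak of 0 means the two paths delivered identical bits.
```

The one practical catch is alignment: the takes have to start on the same sample, which is why driving the Virus from a sequencer (rather than playing by hand) matters.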
None of the major DAWs changes the bits of a track that is simply recorded as audio and played back at unity gain (and at the original tempo, in the case of Live). Live is the only DAW that silently applies time-stretching to audio (!) so it's the only one where it's really necessary to be so precise about when the warping algorithm is being applied.
In any event, the real question is not better or worse, but rather:
are there more bits of precision feeding the internal D/A converters than there are coming over the USB and/or S/PDIF connection? This simple question would be easy for Access to answer, but as far as I know they have chosen to remain silent on the point. Consequently, the only way to get at the answer is to probe with an experiment like the one we're discussing here.
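The truncation half of the question is also checkable from a capture. If a 24-bit recording of one of the paths really carries only 16 bits of precision, the low 8 bits of every sample will be zero. A sketch, assuming samples arrive as signed 24-bit integers (as from the reader above):

```python
# Hypothetical precision probe: OR together all the samples and count
# trailing zero bits. If the capture was truncated to 16 bits, the
# bottom 8 bits of a 24-bit word never light up.
def effective_bits(samples, width_bits=24):
    """Bits of precision actually used across a list of signed ints."""
    used = 0
    for s in samples:
        used |= s & ((1 << width_bits) - 1)  # mask to the word width
    low_zero = 0
    while low_zero < width_bits and not (used >> low_zero) & 1:
        low_zero += 1
    return width_bits - low_zero

# effective_bits(read_samples("virus_usb_24bit.wav")) == 16 would
# suggest the USB return path is truncating; 24 would suggest not.
```

One caveat: dither or analog noise will make the low bits non-zero on any path that passes through a converter, so this probe is only conclusive on the purely digital (USB and S/PDIF) captures.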
Marc's proposal is nice because it uses a single D/A (in the Virus) for everything. On the other hand, it depends on the return audio path over the Virus's USB link. If that link truncates the audio to 16 bits, then it will give a misleading result -- it removes any extra bits of precision (beyond 16) that the S/PDIF or analog outputs of the Virus may have.
-Luddy