I think I don’t really understand the depth of digital sound…
So, stereo 16 bit sound = 1411 kbps. 705 on the left, 705 on the right, right?
Mono 16 bit sound = 705 kbps. 352 on the left, 352 on the right?
That’s because I hear it on both channels…
Or is the signal duplicated, so I get 705 kbps in each ear?
Thanks to anybody who knows.
Bit rate, as in bits per second: bit depth × sample rate × number of channels.
e.g. 16-bit @ 44.1 kHz, mono: 16 × 44100 = 705,600 bits/s (≈705.6 kbps).
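The arithmetic above can be sketched in a few lines; the function name is my own, not anything from an audio library:

```python
def bit_rate_bps(bit_depth: int, sample_rate_hz: int, channels: int) -> int:
    """Uncompressed PCM bit rate in bits per second."""
    return bit_depth * sample_rate_hz * channels

# 16-bit / 44.1 kHz, the CD-audio format:
mono = bit_rate_bps(16, 44100, 1)    # 705600 bits/s  (~705.6 kbps)
stereo = bit_rate_bps(16, 44100, 2)  # 1411200 bits/s (~1411.2 kbps)

print(mono, stereo)
```

Stereo is simply twice the mono figure, which is where the 1411 kbps for CD audio comes from.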
I agree with Jarno.
I think the confusion is that mono really is one "source channel", but our "problem" is that practically all of us have stereo systems, so we always have two speakers. Two speakers = two channels. So to play back a mono source, it actually gets duplicated at some point, and we get identical content in both speakers.
That's why mono content often ends up with the same bit rate as stereo: the mono signal has been duplicated into both channels of a stereo file.
BUT, you can also play back mono content in a stereo system without converting the file itself to dual-mono. So what I mean is that you can in many applications select a mono file, a file with just one channel, and it will by default play back equally in both speakers. If you audition a file for import in Cubase/Nuendo for example, and the file is mono, it will still play back in both speakers.
So, if there's a "bandwidth" issue to deal with, like in gaming software, a true mono file uses half the bandwidth in part of your system: it takes half the bandwidth streaming from storage (disk or whatever), and only then gets duplicated and played back through both speakers. A dual-mono file, by contrast, costs the same bandwidth as a stereo file but in the end sounds just like the mono file (with pan law/level being a caveat).