Home Entertainment

 

Signal to Noise - Dolby TrueHD & DTS-HD MA vs. Uncompressed PCM

July 14, 2008 By David Birch-Jones



[Images: David Birch-Jones in the Dolby Lab; the Dolby listening room; the DTS listening room]

How well do the new compression schemes from Dolby and DTS stack up against uncompressed audio? We buff up our golden ears to audition and compare the latest Blu-ray audio codecs, in the design labs that developed them.

Blu-ray's tremendous increase in storage capacity over DVD, necessary to carry the larger high definition video payload, also provides for expanded audio options, including the ability to carry so-called lossless audio formats such as Dolby TrueHD and DTS-HD Master Audio.

While many of the initial Blu-ray movie titles feature conventional Dolby and DTS digital soundtracks, a number of them also feature high resolution uncompressed pulse code modulation (PCM) soundtracks to appeal to owners of high end surround sound systems.

The upgrade of the High Definition Multimedia Interface (HDMI) to version 1.3 includes cable, transmitter and receiver improvements that support substantially higher overall bitrates, allowing lossless audio formats to be sent from source components and decoded by newer A/V receivers and processors.

As with their movie theater equivalents, both Dolby and DTS home audio formats use what is called lossy compression in order to fit into the relatively narrow amount of data space allotted on DVDs (and, in the case of Dolby, in HDTV broadcasts as well). The need to compress digital audio stems from the way conventional PCM audio works – the bitrate remains the same at all volume levels and frequencies, even when there is little or no signal actually being coded.
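
For the numerically inclined, here is a minimal back-of-the-envelope sketch of that fixed cost; the 48 kHz / 24-bit / 5.1 figures are illustrative assumptions for a typical Blu-ray master, not numbers quoted by Dolby or DTS.

```python
# Rough PCM bitrate math (assumed figures: 48 kHz, 24-bit, 6 channels for 5.1).
sample_rate_hz = 48_000
bits_per_sample = 24
channels = 6

pcm_bitrate_bps = sample_rate_hz * bits_per_sample * channels
print(f"Uncompressed PCM: {pcm_bitrate_bps / 1e6:.3f} Mbps")  # ~6.9 Mbps, constant

# PCM spends that rate whether the scene is an explosion or near silence.
# Compare with the lossy Dolby Digital rates mentioned in the article:
for label, kbps in [("Dolby Digital on DVD", 448), ("Dolby Digital on Blu-ray", 640)]:
    print(f"{label}: {kbps} kbps "
          f"(about {pcm_bitrate_bps / (kbps * 1000):.0f}:1 smaller than PCM)")
```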

With Blu-ray's five-fold increase in data storage capacity (compared to DVD), both Dolby and DTS have developed new audio encoder/decoders (codecs) whose decoded output is 100% bit-for-bit identical to the original PCM master, yet with substantial bitrate reduction as well, freeing up more space on the disc for added content, extended/alternate versions and the like.
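
To put that disc-space saving in perspective, here is a rough sketch for a two-hour soundtrack; the PCM figures are the same assumed 48 kHz / 24-bit / 5.1 numbers as above, and the 2:1 and 3:1 ratios are the typical TrueHD reductions mentioned later in this article.

```python
# Rough disc-space arithmetic for a two-hour soundtrack (assumed figures).
pcm_mbps = 48_000 * 24 * 6 / 1e6        # ~6.912 Mbps for 48 kHz / 24-bit / 5.1
movie_seconds = 2 * 60 * 60

pcm_gb = pcm_mbps * movie_seconds / 8 / 1000   # megabits -> gigabytes
print(f"Uncompressed PCM track: ~{pcm_gb:.1f} GB")

for ratio in (2, 3):
    lossless_gb = pcm_gb / ratio
    print(f"Lossless packing at {ratio}:1: ~{lossless_gb:.1f} GB "
          f"(frees ~{pcm_gb - lossless_gb:.1f} GB for extras)")
```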

To get the latest scoop on these new codecs, Editor-in-Chief Geoffrey Morrison and I made arrangements to visit both companies’ respective headquarters, where we would be able to hear definitive A/B comparisons that would be otherwise impossible to properly set up in our own facilities.

Our first stop was at Dolby Laboratories’ headquarters in San Francisco. After a short tour of their impressive facilities, our hosts ushered us into what one of their engineers called their “codec killer room.” The specially designed room adheres to the ITU-R BS.1116-1 critical listening evaluation specification and its companion BS.1284-1 document, which together specify in great detail the precise conditions, procedures and protocols necessary to achieve repeatable and truly useful results in the ongoing development of these codecs. A suitably high resolution 5.1 system resides in the room: five Revel Ultima Studio full range loudspeakers, a Paradigm subwoofer and a stack of Bryston power amplifiers round out the gear.

The control panel allowed for selection between a number of sources, including the original PCM multi-channel audio track, as well as TrueHD, Dolby Digital Plus, high bitrate 640 kilobits per second (kbps) Dolby Digital, and lower 448 kbps DVD-format Dolby Digital choices that have all been through the full encode/decode process.

The process of codec evaluation includes pre-screening potential listeners for their aural acuity as well as their consistency over multiple trials. Panelists are asked to listen to a reference clip, and then compare it against another clip that may be the identical reference source, or a version that has been through the codec. They are then asked to score their perception of the audio quality on a five point scale. The lowest 1.0 grade is rated very annoying, the 2.0 grade annoying, and the middle 3.0 grade slightly annoying. The 4.0 grade is rated perceptible, but not annoying, while the highest 5.0 grade is rated imperceptible – the goal of the codec designers is to make the codec itself disappear, from an audio standpoint.

The computer chooses which clips are presented to the listener on a randomized basis so that the test remains blind, and the post-session scoring data is then entered into a database and statistically validated against the actual presentation order of the test clips. From that, the engineers can glean a useful score for the performance of the codec compared to the reference uncompressed source clip, and the process ensures that individual biases are eliminated along the way. It is both time-consuming and, given the repetitive nature of listening to dozens or hundreds of clips in a given session, mind-numbingly boring (at least to me). This is why even keen-eared reviewers simply can’t perform an honest evaluation of codec sound quality in their own home theaters – it can only be done under these rigidly controlled conditions, with specialized equipment and software designed expressly for the task.
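
To make the procedure concrete, here is a minimal sketch of that kind of randomized, blind scoring loop; the clip names, trial counts and pairing logic are purely illustrative assumptions, not Dolby's actual test software.

```python
import random
import statistics

# The five-grade impairment scale described above.
GRADES = {5: "imperceptible",
          4: "perceptible, but not annoying",
          3: "slightly annoying",
          2: "annoying",
          1: "very annoying"}

def run_session(clips, repeats, get_listener_score):
    """Present each clip both as a hidden reference and as a coded version,
    in randomized order, logging the hidden condition alongside each score."""
    trials = [(clip, coded) for clip in clips
              for coded in (True, False)
              for _ in range(repeats)]
    random.shuffle(trials)  # the listener never knows which version is playing
    return [{"clip": clip, "coded": coded,
             "score": get_listener_score(clip, coded)}
            for clip, coded in trials]

def summarize(log):
    """Mean grade for genuinely coded trials vs. hidden-reference trials;
    a transparent codec should score essentially the same on both."""
    coded = statistics.mean(t["score"] for t in log if t["coded"])
    refs = statistics.mean(t["score"] for t in log if not t["coded"])
    return coded, refs

# Example: a hypothetical listener who grades everything "imperceptible".
log = run_session(["dialog", "bells"], repeats=5,
                  get_listener_score=lambda clip, coded: 5.0)
coded_mean, ref_mean = summarize(log)
print(f"coded: {coded_mean:.1f} ({GRADES[round(coded_mean)]}), "
      f"hidden reference: {ref_mean:.1f}")
```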

Due to the masking of sounds that inevitably occurs during complex and bombastic passages, the best evaluation results are obtained using relatively simple program clips, limited in duration to around 10 seconds and kept on constant replay. For our limited test, our hosts chose a brief audio clip from the movie American Beauty, the so-called “Spectacular” dream sequence where Kevin Spacey’s character ruminates on his life while looking upward at the inviting Mena Suvari, barely dressed in rose petals and surrounded by additional petals that fall towards him. The track features simple, center-channel anchored dialog, along with gentle percussive bell-like notes (xylophone, perhaps?) and even more delicate triangle embellishments – just the ticket for an A/B codec comparison.

Neither Geoff nor I could hear any differences between the original PCM track and the TrueHD version, which should be the case, as they’re bit-for-bit identical. The lossless coding process is analogous to “zipping” computer files—it’s simply a function of more efficient packing that loses nothing along the way. With movies, TrueHD typically provides a two- or three-to-one bitrate reduction compared to the original PCM source.
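
That zipping analogy is easy to demonstrate with a general-purpose lossless compressor; the sketch below uses Python's zlib on a synthetic sine-wave clip, which is only an analogy for, not an implementation of, the packing used inside TrueHD.

```python
import math
import struct
import zlib

# Synthesize two seconds of quiet, tonal 16-bit PCM at 48 kHz -- the kind of
# simple, sparse material that lossless packing shrinks most effectively.
sample_rate = 48_000
samples = [int(3000 * math.sin(2 * math.pi * 440 * n / sample_rate))
           for n in range(2 * sample_rate)]
pcm = struct.pack(f"<{len(samples)}h", *samples)

packed = zlib.compress(pcm, level=9)   # "zip" it
unpacked = zlib.decompress(packed)     # "unzip" it

print(f"original:   {len(pcm):,} bytes")
print(f"compressed: {len(packed):,} bytes "
      f"({len(pcm) / len(packed):.1f}:1 reduction)")
print("bit-for-bit identical after decoding:", unpacked == pcm)  # True
```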

Next, we compared the original to the Dolby Digital Plus version (that codec is found on numerous BD titles, and like TrueHD, is fully backward compatible with regular Dolby Digital decoders). Even on this extremely high-end system, we couldn’t hear any difference between the uncompressed and the compressed. Then we compared the higher bitrate (640 kbps) Dolby Digital found on Blu-ray discs to the original. "Golden Ears" Morrison was able to hear the difference, but I, and most others in the room with us, did not. Each of us had our turn in the prime listening chair, and none of us knew the origin of the clips or their order of presentation.

The shocker came when we compared the lower 448 kbps Dolby Digital DVD bitrate to the original. There was an audible difference, but it was only ever-so-slightly noticeable (and this on a high end audio system in an acoustically controlled environment, far beyond what typical home theater systems are capable of resolving). There was just the slightest decrease in presence with the DD version, not exactly a softening of the sound, but just a tad less ambience and a similarly small tightening of the front soundstage’s depth. Quite a remarkable result, I thought, and I was highly impressed with how much fidelity can be packed into such a relatively small amount of bitspace. If I were doing actual scoring, I would have awarded a 4.8 grade to the results I heard – the audible difference was that subtle.

PAGE 2: On to DTS . . .

Comments

That was a great article, Geoffrey. Thanks for telling us what your ears told you and not exaggerating the differences like so many publications do in order to keep the advertisers happy...

OK, here's my 2 cents!... 95% of so-called "Home Theaters" are in a home. Bonus room, family room, whatever. A room full of people. Adults, kids and pets. It's never completely quiet, as in, say, a recording studio. What makes the whole "watching a movie at home" experience is a big screen with a nice sharp, detailed picture, great color and clean, crisp sound. We don't need "microscopic detail" in our pictures or National Research Council (NRC) anechoic-chamber-perfect audio. And these days... we don't need to spend $25,000 to build it either. Then again... a Holodeck would be cool, right? Ron Davidson

Good article. I suppose that the ultimate result of the experience is to trust your own hearing: if you are not able to discern any quality difference between a $10 system and a $1K system, it would be foolish for you as a consumer to buy the $1K system. The moral: to everyone according to their own hearing.

So cool!

Hi man great review and thx

Dolby TrueHD & DTS-HD are absolutely awesome!

Thanks for sharing this stuff!

Geoffrey, Geoffrey, Geoffrey, come on! Use some common sense. Well, OK, use some (rather uncommon) math. Listening to the new audio formats is akin to listening to CD music thru a string and a tin can. Most people's ears, and probably yours too, are accustomed to hearing CD quality music. At 16 bits, the signal-to-noise ratio is 96 dB. Most audio equipment, even the best consumer level (or maybe pro), barely goes above 100 dB SNR. 24 bit audio is capable of producing a whopping 144 dB SNR. Where are you going to find equipment even capable of reproducing such sonic differences? Granted, the amps and speakers you mentioned (you didn't mention what players or wires were used) are excellent, but even they don't match up. I have listened to DVD-Audio discs, using an Audigy sound card and Logitech PC speakers, and even I could tell a huge difference between CD quality and 24 bit audio.....

You seem to be using common sense by only taking into account certain facts, while ignoring others?! Let's look at the important facts you ignored:

Commercial recordings (CD or 24-bit) never have a dynamic range of more than about 60 dB and therefore utilise 10 bits or fewer. The remaining 6 or so bits on a CD are just noise. On a 24-bit recording there are at least 14 bits of just noise. The reason for this limit is the noise floor. The noise floor of an average sitting room is usually 50 dB or so, therefore to hear a dynamic range 96 dB higher than the room's noise floor would require an incredibly powerful system and immunity from pretty much instant deafness!!

The other fact you omitted is that for more than a decade it has been standard practice to use a procedure known as "noise-shaped dither". This extends the perceived dynamic range of CDs (16-bit audio) to in excess of 120 dB, although of course this extended range is still not utilised.

If you heard differences between CD and 24bit there is either a problem with your ears, a problem with your system or mastering/production differences between the two versions. It is not humanly possible to tell the difference, on any system or in any environment!
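
For readers following the numbers in this exchange, the 96 dB and 144 dB figures (and the roughly 60 dB / 10-bit point above) come from the standard quantization rule of thumb of about 6.02 dB per bit; here is a quick sketch of that arithmetic.

```python
import math

def dynamic_range_db(bits):
    """Theoretical dynamic range of ideal N-bit quantization (~6.02 dB per bit)."""
    return 20 * math.log10(2 ** bits)

for bits in (10, 16, 24):
    print(f"{bits:2d}-bit PCM: about {dynamic_range_db(bits):.0f} dB")
# 10-bit: ~60 dB  (the usable range claimed above for commercial masters)
# 16-bit: ~96 dB  (CD)
# 24-bit: ~144 dB (high-resolution masters)
```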

> Commercial recordings (CD or 24bit) never have a dynamic range of more than about 60dB and therefore utilize 10 bits or fewer.

---Anonymous, good post - we have a debate! OK, I agree that the dynamic range of CDs is limited to below that of the medium itself. However, as Wikipedia states, "The 'intensity' range of audible sounds is enormous. Our ear drums are sensitive only to variations in the sound pressure, but can detect pressure changes as small as 2×10⁻¹⁰ atm and as great or greater than 1 atm." (http://en.wikipedia.org/wiki/Psychoacoustics)

Also, see http://en.wikipedia.org/wiki/Dynamic_range
The human senses of sight and hearing have a very high dynamic range. A human is capable of hearing (and usefully discerning) anything from a quiet murmur in a soundproofed room to the sound of the loudest rock concert. Such a difference can exceed 100 dB which represents a factor of 10,000,000,000 in power.

Thus, if you are correct that the dynamic range in a typical CD is purposely limited to 60 dB, then certainly Blu-rays can exceed 96 dB.

> On a 24bit recording there are at least 14bits of just noise. The reason for this limit is the noise floor. The noise floor of an average sitting room is usually 50dB or so

------ It may be that this is typical, but my argument was not about what is typical; it was regarding the assertion of this article that human beings cannot tell the difference between CD quality and 24 bit quality. Thus, we are talking potential, not necessarily typical.

> The other fact you omitted is that for more than a decade it has been standard practice to use a procedure known as "noise-shaped dither". This extends the perceived dynamic range of CDs (16bit audio) in excess of 120dB, although of course this extended range is still not utilized.

---- Not sure what this has to do with my argument.

The above arguments you make are regarding the volume level / dynamic range of the two media formats. This may indeed be the weakest aspect to the greater aural quality of the new media. However, consider:

2 speakers for CD sound, versus 5.1 (up to 11.2) speakers. Certainly, a greater number of speakers delivering sound to a listener will create more psychoacoustic information in that person's brain!

The greater sampling rate of up to 96 kHz, at up to 8 channels. The human ear/mind is analog. We can certainly process the greater amount of information contained in 24 bit discs.

Bruce

Signal to noise is only one aspect of sound reproduction. Also, read the article over again, I'm not saying you can't hear a difference. In fact, I'm saying the exact opposite.
