
Huh? Why we're all watching TV with subtitles on

Eric McCormack as Will Truman, Debra Messing as Grace Adler and Sean Hayes as Jack McFarland in NBC's Will & Grace. Photo: Getty Images

Analysis: No, it's not just you: TV and film dialogue is now harder to hear and the use of subtitles has gone through the roof

If you find yourself turning on subtitles when you’re watching TV, rest assured you’re not alone: use of subtitles has gone through the roof. The numbers differ, but so far surveys suggest the same conclusion: most of us are using subtitles and younger viewers in particular appear to be driving the trend.

Netflix says 40% of viewers have closed caption subtitling on all the time, while 80% of viewers use them at least once a month. A recent YouGov survey of 3,600 people found 61% of people aged 18 to 24 watch TV in their native language with subtitles on, falling to 31% for 25 to 49-year-olds, 13% for those aged 50 to 65 and going back up to 22% for people aged 65 and over.

Another study found 85% of Netflix customers in the UK used subtitles while streaming, 54% used them on Amazon Prime and 37% used them on Disney Plus. As for why, 66% of those surveyed said subtitles help them better understand the storyline, while 40% said captions help them concentrate, a motivation that was more common among younger audiences. A Vox poll found a lot of people feel they simply can’t understand what’s being said.


From RTÉ Radio 1's Today with Claire Byrne, RTÉ Gold's Rick O'Shea and Entertainment.ie's Deirdre Molumby on watching foreign language series and films dubbed or subbed

The number of people using subtitles is "magnitudes bigger" than what you would anticipate based on the number of people with hearing impairments, so "the question is why?" says Dr Andrew Hines, Assistant Professor in the School of Computer Science at UCD and lead investigator on a collaborative project at the Insight SFI Research Centre for Data Analytics looking at streaming content quality and the growing use of subtitles.

If you ask anyone anecdotally why this is occurring, you'll get the same six or seven answers, says Hines. But there are two key reasons why we're all using subtitles: a shift to a naturalistic approach to acting that can make dialogue hard to understand (hello, mumblecore), and a change in the way that sound is recorded, designed and then played back in our homes and on our devices.

Let’s start at the very beginning, so to speak, with the classic 1952 musical comedy Singin’ in the Rain, whose film-within-a-film scenes nicely illustrate how sound used to be captured on a set and how important clear, crisp articulation was for intelligibility:

From Warner Bros, Lina Lamont vs the Mic in Singin' in the Rain (1952)

"I like to call it the Doctor Who effect," Hines says. "If you've ever seen Doctor Who back from the 60s, you'd never see the doctor talking while the Daleks are shooting at him. He stops and talks to the camera, it's almost like a stage production. Nowadays we expect things to be far more natural." We also see that characters might speak off-screen or with their backs turned, which means we can’t see their lips move. "People don't realise that even though we're not experts at it, everybody is actually pretty good at lip reading."

Some directors simply want you to have a hard time hearing what’s being said, the classic example being Christopher Nolan (Tenet, Dunkirk, Inception, Interstellar). "He’s actually making it so that it's unintelligible or mumbled and he doesn't expect you to understand what they're saying," says Hines. "There were movie theatres in the States who, when they were airing [Tenet], had signs on the back of the seats saying 'it’s the director, it's not us, don't complain about our sound system’."

From Thomas Flight, why you can't hear the dialogue in Tenet

We of course turn on our subtitles for many reasons. There are times we can't hear the dialogue properly, but we could also be multi-tasking, looking at our phones, struggling with our concentration and attention span, or watching content in something other than our native language (think shows like Elite, Extraordinary Attorney Woo, Squid Game, Money Heist, Call My Agent! and more). Some people need subtitles for the accents (Peaky Blinders, Derry Girls) and sometimes the closed caption subtitles can even be a source of humour or an added layer of enjoyment (Stranger Things). Part of the problem is we don’t yet have enough research to find out what the biggest reasons are, says Hines, and therefore how to address the issue, if it needs addressing.

Can’t you just turn up the volume on the dialogue?

Why people aren't hearing dialogue properly probably comes down to a number of different factors in how the content is created in the first place, says producer and mixer Kieran Lynch, who works as a dialogue editor, sound designer and re-recording mixer in film and television, and lectures at IADT. "Actors tend not to project as much as they used to, they tend to deliver lines quieter," says Lynch. "Most people would think, well, can't we just make that louder when the programme is being mixed, or in post-production? Just turn up the voice?"

James Dean and Betsy Palmer filming "Danger" on CBS Studio One, August 25, 1953 Photo: CBS Photo Archive/Getty Images

But it’s not that simple, and it comes down to how we’re recording that quieter, naturalistic, sometimes mumbling, dialogue. "On set, most people use lavalier mics" — clip mics, also known as lav mics — "they've got very, very close proximity to the voice: the closer you are to the microphone, particularly a directional microphone, the more bass content will be recorded." This is called the proximity effect.

Now, a regular microphone on a boom stand wouldn’t pick up those bass frequencies in the same way, and that boost in bass frequencies is what can make voices sound muddy. "If that's not dealt with in post-production then the clarity is missing from the voices," says Lynch.

On top of that, when somebody shouts or speaks loudly there are a lot of mid-range frequencies, and that's what our ears like to prioritise; when you speak very quietly or hushed, a lot of low-frequency sound is included and picked up by the lav mic, he explains. The dialogue also has to compete with all the extra sound effects and ambient sound that gets added in post-production, including ADR (automated dialogue replacement), for when the original dialogue just doesn’t sound right and needs to be recorded again.
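For the curious, the kind of clean-up Lynch describes can be sketched in a few lines of Python: a high-pass filter to roll off the proximity-effect bass, plus a gentle boost in the mid-range band the ear prioritises for speech. This is a minimal illustration using scipy; the cutoff, centre frequency and gain values are assumptions for demonstration, not a production recipe.

    import numpy as np
    from scipy import signal

    FS = 48_000  # assumed sample rate (Hz)

    def highpass(x, cutoff_hz=80.0, order=2):
        # Roll off the proximity-effect bass below cutoff_hz
        sos = signal.butter(order, cutoff_hz, btype="highpass", fs=FS, output="sos")
        return signal.sosfilt(sos, x)

    def presence_boost(x, centre_hz=2500.0, gain_db=3.0, q=1.0):
        # Gentle peaking boost in the speech-intelligibility band
        # (standard RBJ audio-EQ biquad formulas)
        a_lin = 10 ** (gain_db / 40)
        w0 = 2 * np.pi * centre_hz / FS
        alpha = np.sin(w0) / (2 * q)
        b = np.array([1 + alpha * a_lin, -2 * np.cos(w0), 1 - alpha * a_lin])
        a = np.array([1 + alpha / a_lin, -2 * np.cos(w0), 1 - alpha / a_lin])
        return signal.lfilter(b / a[0], a / a[0], x)

    # e.g. cleaned = presence_boost(highpass(raw_dialogue))  # raw_dialogue: mono float array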

Lucille Ball and husband Desi Arnaz appear on 'Toast of the Town', aka 'The Ed Sullivan Show', circa 1954. Photo: Getty Images

It seems obvious, but we’re watching films and TV shows on devices that understandably can’t reproduce the high-quality sound from the studios. When the sound is mixed for a film, there may be thousands of audio tracks or audio elements for each scene. They’ll get mixed down into the various formats that are used for streaming, for cinema and for broadcast, says Lynch. All the frequency content is technically maintained, so in theory we are getting all the information that is in that cinema mixing room, but we just can’t hear it.
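To make that mixdown concrete: one common route from a cinema mix to a stereo device is an ITU-style downmix, in which the centre channel, the one that usually carries the dialogue, is folded into left and right at reduced level. Here is a minimal sketch assuming six discrete 5.1 channels as NumPy arrays; the -3 dB (0.7071) coefficients are the textbook values, though real streaming and broadcast pipelines vary.

    import numpy as np

    def downmix_51_to_stereo(L, R, C, LFE, Ls, Rs, c_gain=0.7071, s_gain=0.7071):
        # Fold centre (dialogue) and surrounds into left/right at -3 dB;
        # the LFE channel is commonly dropped in a stereo downmix
        left = L + c_gain * C + s_gain * Ls
        right = R + c_gain * C + s_gain * Rs
        peak = max(np.abs(left).max(), np.abs(right).max(), 1.0)
        return left / peak, right / peak  # normalise to avoid clipping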

Something else to consider is that we like visual clues when it comes to our understanding. There’s a psychoacoustic phenomenon where our brain prioritises visual information over auditory information, known in psychology as visual capture, explains Lynch. The effect is "very, very powerful in our brains and how we interpret sound. A lot of what we interpret [in sound] is through what we see, similar to [how] a lot of what you taste comes from what you smell."


From RTÉ Radio 1's Brendan O'Connor, tech expert Colin Baker explains how to optimise the sound on your telly

He mentions an auditory illusion called the McGurk effect, where you think you're hearing something based on what you're seeing, whereas in fact you're hearing something completely different. "So, there is a phenomenon, I’d say, where if it's hard to hear the dialogue, if there's a visual stimulus presented we'll naturally want to keep that visual stimulus. So if there's text on screen, then our brain will prioritise looking at that text to get the information from the auditory system."

Is there anything we can do? Not a whole lot

Lynch says dialogue can sound better on phones and laptops than on TVs because their speakers are so small that they filter out those low frequencies. Hines adds that it’s worth checking the settings on your TV and using the sound modes designed for the type of content you’re watching (films, music, sport and so on).

"The Christopher Nolans of this world will tell us that the problems they're creating are not problems, they’re artistic choices, so we shouldn't be trying to fix them," says Hines. But 'the fundamental answer is not the one we want, which is, there is no silver bullet for this."

Nevertheless, streamers are definitely aware of the sound problem and interested in improving it: Amazon Prime recently introduced a dialogue boost option on anything produced by its own studio, which would obviously improve intelligibility but would probably horrify Christopher Nolan, adds Hines.
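Amazon hasn't published how its feature works under the hood, but the general idea of a dialogue boost can be illustrated with the same channel layout as above: raise the centre channel, where dialogue normally sits, relative to everything else. This is purely a sketch; a real system would use speech detection and loudness management rather than a flat gain.

    import numpy as np

    def dialogue_boost(channels, boost_db=6.0):
        # channels: dict of NumPy arrays keyed "L", "R", "C", "LFE", "Ls", "Rs"
        gain = 10 ** (boost_db / 20)  # convert dB to a linear factor
        out = dict(channels)
        out["C"] = np.clip(channels["C"] * gain, -1.0, 1.0)  # boost centre, avoid clipping
        return out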



The views expressed here are those of the author and do not represent or reflect the views of RTÉ