How Does Closed Captioning Work?
The art of closed captioning has come a long way from its humble beginnings. What started as a visual aid for the deaf and hard of hearing has become an everyday part of the media landscape, from live television broadcasts to YouTube videos and the latest Netflix blockbuster streamed around the world. Closed captioning and subtitles have also become a vital part of today’s digital education landscape, with real benefits for learning.
Understanding Closed Captioning
One of the clearest indications of its popularity is the finding that four out of five people who use closed captioning while watching video don’t actually need it. They use it because they like it and because it lets them make a deeper connection with the material they’re watching. Captioning reinforces the visual messaging, and in noisy places like bars and gyms, it’s often the only way for people to follow what’s happening on screen.
The terms ‘closed captioning (or CC)’ and ‘subtitles’ are frequently used interchangeably, but there are differences. Closed captions are a timed transcript of the dialogue in the viewer’s own language that also describes non-speech sounds, such as music and sound effects, for viewers who cannot hear the audio. Subtitles assume the viewer can hear the audio and translate the dialogue into another language.
How People Get Closed Captions
Most businesses and individuals who need closed captions for their video content get them through a closed captioning service like Rev.com. They send their online videos or video files to the service and receive a caption file, such as an SRT file, back in around 12-24 hours, which they can then use to add captions to a variety of online video platforms and software.
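An SRT file is just plain text: numbered cues, each with a start and end timestamp and the caption text. As a rough sketch of how such a file is built (the helper names here are illustrative, not part of any particular service’s tooling):

```python
def srt_timestamp(seconds: float) -> str:
    """Format a time in seconds as an SRT timestamp: HH:MM:SS,mmm.
    Note the comma before the milliseconds, an SRT quirk."""
    total_ms = round(seconds * 1000)
    h, rem = divmod(total_ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1_000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def srt_cue(index: int, start: float, end: float, text: str) -> str:
    """Build one numbered SRT cue: cue number, time range, caption text."""
    return f"{index}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n"

# Two cues make a complete, if tiny, SRT file:
captions = srt_cue(1, 1.5, 4.0, "[door slams]") + "\n" + srt_cue(2, 4.2, 6.0, "Who's there?")
print(captions)
```

Caption tools and services differ in the details, such as line-length limits and positioning codes, but this timed-cue structure is common to all of them.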
Captioning online content, news programs, television episodes, movies, and other types of video content significantly benefits deaf or hard of hearing individuals, and helps to make content more accessible to more people.
Closed Captions in The Early Days
When people first began watching films during the silent movie era, cards with written text on them were known as ‘intertitles’, and they were edited into the film between the scenes of a silent movie to deliver snippets of dialogue to the audience, while live musicians provided soundtracks when possible.
The transition from intertitles to recorded sound happened gradually in the late 1920s. Musicals and gangster films were the two genres where sound was an absolute game-changer. As the New York Times wrote a few years back, “they flourished when sound introduced the sensational elements of chattering machine guns, screaming tires and, most importantly, the varied timbres of contemporary American speech, bursting with vivid idioms (Aw, go slip on the ice!) and filtered through every accent known to man.”
Closed Captions as We Know Them Today
Closed captioning only resurfaced in the early 1970s, when ABC television studios teamed up with the National Bureau of Standards to try to answer the question: might it be possible to send captions encoded in the television signal? In February 1972, they demonstrated their results at Gallaudet College in Washington, DC, using closed captioning embedded in the regular TV broadcast of a show called “Mod Squad”. From those humble beginnings, captioning caught the imagination of the public and made its way into the halls of Congress, where it became law that video programming distributors must closed caption all of their output.
The National Captioning Institute tells the fascinating history of captioning from that point on right here.
The FCC’s 4 Guiding Principles for captioning television programs:
- Accurate: Captions must match the spoken words in the dialogue and convey background noises and other sounds to the fullest extent possible.
- Synchronous: Captions must coincide with their corresponding spoken words and sounds to the greatest extent possible and must be displayed on the screen at a speed that can be read by viewers.
- Complete: Captions must run from the beginning to the end of the program to the fullest extent possible.
- Properly placed: Captions should not block other important visual content on the screen, overlap one another or run off the edge of the video screen.
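The “synchronous” principle requires captions to stay on screen long enough to be read. The FCC does not publish a single numeric reading-speed limit, but captioning tools commonly flag cues that exceed a characters-per-second threshold. A minimal sketch of such a check, assuming a limit of 17 characters per second (a common industry guideline, not an official FCC figure):

```python
MAX_CPS = 17  # assumed readability limit in characters per second; not an FCC number

def reading_speed_ok(text: str, start_s: float, end_s: float) -> bool:
    """Check whether a caption cue is on screen long enough to be read comfortably."""
    duration = end_s - start_s
    if duration <= 0:
        return False  # a cue with no display time can never be read
    return len(text) / duration <= MAX_CPS

reading_speed_ok("Who's there?", 4.2, 6.0)              # 12 chars over 1.8 s: readable
reading_speed_ok("A very long caption " * 4, 0.0, 1.0)  # 80 chars in 1 s: far too fast
```

A real captioning workflow would also check line length and on-screen position, but the duration check above is the core of the readability test.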
The educational benefits of closed captioning have been understood for a long time. The ability to hear audio in one language while reading it in another has a profound effect on the way that students learn. But even when working in one language, many students rely on closed captioning when they work with video, because it helps them to focus and to retain information. When surveyed, students reported that captions are also very useful for learning difficult vocabulary, overcoming poor audio quality, and understanding teachers whose accents can be challenging.
How Does Closed Captioning Work on Live Television?
Closed captioning live events is one of the most important and most difficult tasks to achieve. During major broadcasts, such as sporting events, political debates and breaking news, television channels strive to produce timely and accurate closed captioning within a few seconds of the words being spoken.
They employ teams of world-class stenographers to capture and transcribe the audio in real time. These captioners generally hold certificates and degrees in broadcast stenography and are among the best at what they do. Part of the skill is dealing with overlapping dialogue and a wide variety of noises and sound effects during a live broadcast.
How Does Closed Captioning Work on Streaming Services?
Most of the closed captioning that we experience today comes from streaming services like YouTube and Netflix where the text doesn’t have to be captured in real-time. Once the final video edit is complete, professional captioners (like the ones who work for Rev.com) add the captions using software that allows them to go frame by frame and craft the text in order to make sure they have the right timing and feel of the piece.
Caption writers are able to work off a script or a recorded video, as well as the soundtrack to build up the captioning for the content. Read more here about how captions are delivered to streaming services.
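Streaming platforms do not all accept the same caption format: web players generally expect WebVTT, which is very close to SRT. A hedged sketch of the conversion (the two formats differ mainly in the file header and the millisecond separator; real converters also handle styling and positioning cues, which this sketch ignores):

```python
import re

def srt_to_vtt(srt_text: str) -> str:
    """Convert SRT caption text to WebVTT, the caption format most web video players accept.
    WebVTT adds a 'WEBVTT' header and uses a period, not a comma, before the milliseconds."""
    body = re.sub(r"(\d{2}:\d{2}:\d{2}),(\d{3})", r"\1.\2", srt_text)
    return "WEBVTT\n\n" + body

srt = "1\n00:00:01,500 --> 00:00:04,000\n[door slams]\n"
print(srt_to_vtt(srt))
```

Delivering one master caption file and converting it per platform is a common way to avoid re-timing captions for every service.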
At a minimum, today’s streaming services offer English captions as standard, but there is huge demand for captions, or subtitles, in other languages, and the market for foreign language captioning will only grow as the rest of the world embraces digital and streaming services.
How Does Closed Captioning Work on Movies?
When movies make use of captioning, they tend to provide a more limited experience than TV or streaming. Movie subtitles don’t describe the on-screen action or sound effects; they usually just translate the dialogue into another language for a wider audience.
For some reason, many moviegoers are still resistant to the idea of English subtitles on a foreign language film. As Bong Joon-ho, the acclaimed South Korean director of Parasite, which was nominated for an Oscar in 2020, said during his Golden Globes acceptance speech for Best Foreign Language Film: “Once you overcome the one-inch-tall barrier of subtitles, you will be introduced to so many more amazing films.”
The whole world of international cinema opens up to viewers who embrace subtitles in movies.
How Does Closed Captioning Work in Movie Theaters?
Movie theaters are a challenging space for closed captioning. In the past, if a movie wasn’t subtitled on-screen then there were no other options, but some cinemas have begun to offer options that dramatically improve the experience for deaf patrons.
One of the most useful inventions is a closed caption stand, which slots into the cup holder beside the seat or attaches to the back of the seat in front. The patron can then adjust the screen to suit their height, so they can switch easily between watching the movie and reading the captions. It’s not ideal: the viewer has to work quite hard to watch both screens, and the light from the caption display can distract other patrons, but it’s better than nothing.
Another idea in development is closed caption glasses. This system uses a tiny projector to cast the captions onto glasses that hard-of-hearing patrons wear while following the action on screen. It’s a much more personal experience that doesn’t interfere with other patrons’ enjoyment of the movie.
On the downside, the technology is expensive, and the glasses must be attached by wires to a small control box, which can be uncomfortable. If you already wear glasses, wearing another pair on top is far from ideal.
In the digital economy, it’s far more efficient and cost-effective for content providers to use a specialist captioning service like Rev.com than to build their own teams. Rev employs specialist, experienced teams of stenographers who boast accuracy levels of 99%, can deliver a job in a matter of hours, and charge just $1.25 per minute of captioning.
It’s a quick and hassle-free way of serving millions of consumers and staying compliant with regulations at the same time.
A Force for Good in Society
The use of closed captioning on all forms of TV and video has had a profound effect for good in society. Not only has it brought those with hearing difficulties into the mainstream, but it has also changed the way that many people learn and enjoy content in situations where audio is unavailable. Think of students trying to study quietly while their siblings are sleeping, travelers in airports keeping up with the news, or pensioners enjoying a movie on Netflix that they would not have been able to hear in the past.
Closed captioning has democratized society’s interactions with video, empowered millions of people who struggle with their hearing and changed the way we all consume video for the better.