I have to confess – a career in subtitling was not a burning ambition of mine. My encounter with the world of subtitles was a happy accident. I'd studied French and Spanish at university, and wanted to work with language in some way. After graduating, I spent a year in Montreal and was offered a temporary admin role in a school for deaf and disabled children. It was amazing to see the classes being delivered in ASL. I really enjoyed learning more about and interacting with the deaf community. So on my return to London, when I saw a vacancy for a subtitling assistant, I didn't think twice. I worked as an assistant in the subtitling and signing departments before becoming a fully-fledged subtitler.

Subtitling departments were established in the BBC in the early '80s, but the first BBC programme to carry subtitles aired back in 1979 – a documentary about deaf children called Quietly In Switzerland. This also marked the first use of CEEFAX subtitles for the deaf in the world. In 1986, Blue Peter became the first programme with live subtitles. As a result of the 1990 Broadcasting Act, subtitling output increased on the BBC, and in response to the growing demand from deaf and hard-of-hearing audiences to see subtitles on network news programmes, the BBC set up a specialised live subtitling unit.

Fast forward a couple of decades, and Red Bee Media now provides subtitles for 100% of programmes on both Channel 4 and the BBC. Sports channels like Sky Sports and ESPN have more live content, which means more real-time subtitling. Such a huge increase in subtitled output required the development of new technology. This is where voice recognition software comes in.

Contrary to popular belief, live subtitling does not involve typing really fast. Some subtitles are produced by stenographers, who transcribe speech by writing in shorthand on a stenograph machine. But the vast majority of us now use speech recognition software to repeat – 'respeak' – what's being said on the TV in a computer-friendly, robotic voice to produce on-screen text. We also add in punctuation, move the subtitles' position and change colours when there's a new speaker. This accounts for the slight delay in the subtitles reaching your telly, and also for some of the mistakes you may notice.

I promise you that we subtitlers do know the difference between having 'patience' and 'patients'. While the technology we use is impressive, it's not quite as sophisticated as human language (yet!). It particularly hates homophones – words of different meaning that sound the same. Unfortunately, the English language is full of them. Wherever possible, I use the software to create vocal commands which can get around these. If I didn't, you could end up reading 'The finish athlete eight at a tie restaurant in soul' instead of 'The Finnish athlete ate at a Thai restaurant in Seoul'.

Similar problems arise with tricky names and obscure vocabulary. We prepare thoroughly for all output, but it's impossible to predict exactly what vocabulary we're going to need. If there's breaking news about the president of Turkmenistan and I've not entered his name into my dictionary, the computer's going to go with its best guess. And with a name like Gurbanguly Berdimuhamedow, the results probably won't be pretty!

You're probably wondering why there's a need for the middle man. Why can't the software pick up the audio directly from the studio? Again, the technology is not advanced enough for that. Once my own 'voice model' is loaded, the computer only understands me. Think of a debate show like Question Time, where there can be several people talking (or shouting) at the same time, often at high speed and in different regional accents. In this case, I'd need to carefully edit where necessary, so that the most essential points are being subtitled. This requires an on-the-spot judgment call – from a human being, and not a computer.

Fortunately, there's no chance of any dictionary malfunctions or real-time edits when subtitling pre-recorded programmes. We receive these clips in advance and have time to research vocabulary and ensure the punctuation is just so. Many of my colleagues have language degrees, which may seem strange considering we generally subtitle from English into English. Personally, I've always felt that monolingual subtitling involves a certain level of translation. I think it indulges the linguist's love for words – and our pedantry for grammar. Whether it be accurately rendering regional and foreign accents, puns that are essential to a punch-line, or adding labels to depict sounds, our job is to provide the same information and experience that a hearing viewer enjoys.