My Current Pet Project: Cinema Subtitles (pt 2)
So, in the previous post I laid out all my ‘pain points’ and annoyances with the existing service for cinema captions.
It was around October that I thought about making an iPhone app to solve all these problems. After all, an iPhone can do all the things that the Capti-View can. Perhaps it can even improve on the Capti-View system?
I did some research and uncovered some interesting facts:
The online subtitle community is incredible
How incredible?
One week after its cinema release, there were subtitles online for the new Star Wars movie.
Less than half an hour after the final Game of Thrones episode aired, there were subtitles in English, Spanish, Farsi and French!
Basically, like most healthy online communities, people do it for the love of it. They do it for the “karma”. Some of the best subtitle authors enjoy a very good reputation for their work.
So, sourcing subtitles was not going to be an issue. They are already out there, hosted on quite a few large platforms.
There are existing subtitle apps
If subtitle apps already exist, should I bother developing one? Well, I downloaded one and checked it out. It worked pretty well except for one key point: synchronisation.
Have you ever watched a movie where the audio slipped out of time with the video? It is very annoying and can be downright confusing. Having correctly synchronised subtitles is a total “must have” feature for any application providing this service.
The app that I downloaded used a slider bar, which is a terrible solution for "scrubbing" the subtitles into sync.
Representing an entire movie with 4 inches of screen is very silly: sliding across 3 pixels is equivalent to about 2 minutes of movie time. So, in its current form, I think the existing app in the App Store is pretty useless.
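The arithmetic behind that claim is easy to sanity-check. Here's a minimal back-of-envelope sketch, assuming a 2-hour film and a slider roughly 320 pixels wide (both numbers are my assumptions, not measured from the app):

```python
# Back-of-envelope: how coarse is a slider that spans a whole movie?
movie_seconds = 2 * 60 * 60      # assume a 2-hour film = 7200 seconds
slider_pixels = 320              # assume a ~4-inch screen, ~320 px usable width

# Each single pixel of slider travel covers this much movie time:
seconds_per_pixel = movie_seconds / slider_pixels

# The smallest nudge a thumb can realistically make (~3 px) jumps this far:
three_pixel_nudge_minutes = 3 * seconds_per_pixel / 60

print(seconds_per_pixel)            # 22.5 seconds of movie per pixel
print(three_pixel_nudge_minutes)    # 1.125 minutes per tiny nudge
```

Over a minute of movie time per 3-pixel nudge, when subtitles need to be synced to within a second or so — the same order of magnitude as the problem described above.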
So what’s my idea? I have two!
I have two different approaches that I’m going to use.
Idea 1: A better UI
One approach is to develop a UI that solves the problem of “scrubbing” or syncing a sub. This should be interesting.
I think that selecting subtitles using a scrolling table view would be a much better solution. But I will have to experiment.
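To make the table-view idea concrete, here's a rough sketch of what sits behind it (the function names are mine, purely illustrative): parse the cue start times out of an SRT file so each subtitle line can become one table row, and when the user taps the line being spoken right now, the gap between that cue's timestamp and the wall clock is the sync offset.

```python
import re

# Matches SRT timestamps like "00:01:00,500" (comma or dot before milliseconds).
SRT_TIME = re.compile(r"(\d+):(\d+):(\d+)[,.](\d+)")

def srt_time_to_seconds(stamp):
    h, m, s, ms = (int(x) for x in SRT_TIME.match(stamp).groups())
    return h * 3600 + m * 60 + s + ms / 1000.0

def parse_cues(srt_text):
    """Return [(start_seconds, text), ...] — one entry per subtitle cue,
    i.e. one row of the scrolling table."""
    cues = []
    for block in srt_text.strip().split("\n\n"):
        lines = block.splitlines()
        if len(lines) >= 3 and "-->" in lines[1]:
            start = lines[1].split("-->")[0].strip()
            cues.append((srt_time_to_seconds(start), " ".join(lines[2:])))
    return cues

def offset_from_tap(cues, tapped_index, wall_clock_seconds):
    """When the user taps the line being spoken right now, the difference
    between 'now' and the cue's own timestamp is the shift to apply."""
    return wall_clock_seconds - cues[tapped_index][0]
```

Tapping one row pins the whole file to the movie in a single gesture — no pixel-level scrubbing required.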
Idea 2: A clever algorithm
The other idea is a bit more outlandish. Could an algorithm sync the subtitle by listening to the movie?
I’ll expand on this idea: Perhaps the application could use the iPhone microphone and “listen” to a sample of audio. There are plenty of open source algorithms that do a pretty good job at this.
The actor might say: “Let’s get out of here. It’s going to blow!”
The application would hear: “Let’s get our beer. It’s going to snow.”
But remember, the application would be able to assign a certainty weighting to each word it heard. So perhaps “beer” and “snow” get replaced with wildcards.
Now we just have to search our subtitle file for the pattern “Let’s get our *@#* It’s going to *@#*”.
It might take a few lines before it gets a possible match, but when it does it can time shift the subtitles.
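That search step could be sketched like this (a toy sketch, not a real recogniser — the names are illustrative, and I'm using a plain `*` as the wildcard token for any low-confidence word):

```python
import re

def words(text):
    """Lowercase the cue text and strip punctuation, keeping apostrophes."""
    return re.findall(r"[a-z']+", text.lower())

def matches(heard, cue_words):
    """True if the heard pattern lines up word-for-word with a cue,
    where '*' stands for a word the recogniser wasn't sure about."""
    if len(heard) != len(cue_words):
        return False
    return all(h == "*" or h == c for h, c in zip(heard, cue_words))

def find_sync_offset(heard_pattern, cues, heard_at_seconds):
    """Scan the subtitle cues [(start_seconds, text), ...] for the heard
    pattern; on a match, the gap between when we heard it and the cue's
    own timestamp is the time shift to apply to every subtitle."""
    heard = heard_pattern.split()
    for start, text in cues:
        if matches(heard, words(text)):
            return heard_at_seconds - start
    return None   # no match yet — keep listening
```

Even with two words misheard and wildcarded out, a line of eight or nine words is usually distinctive enough to pin down a single cue.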
Perhaps this process only needs to happen once? Maybe the first person who uses the app on “Spiderman 7” could rate the sync quality and send a small signature back to a server. This signature could be a pattern of the first minute of the film. Future viewers of “Spiderman 7” would then use this signature to sync their subtitles before any audio is even spoken.
Anyway, there are lots of ideas here. Before I could explore them, I would need to be able to program, yeah? So that’s kind of how I got started down this road.