It’s like a guide dog, but instead of guiding blind people through the streets, it guides them through the movies. How?
It helps visually impaired and partially sighted people watch movies. Audio description translates pictures into words. It may not make the blind see the movie, but it makes them feel the movie. And if something feels the same, it is the same.
Most cinemas in Poland are not equipped to screen audio-described movies. Worse, every cinema has different equipment: different projectors, different soundtrack formats, different everything.
Since it is a mobile app, Movie Guide Dog works regardless of the equipment in the theater, outsmarting the formats, the projectors and all the rest. It lists movies with audio description available; you download the description and go to the movies. The dog has very good ears, so it synchronizes the audio description with the movie soundtrack. And ta-dam! You watch the movie.
Movie Guide Dog is cheaper than any cinema equipment, but it requires some training. The app must be blind-friendly, and synchronizing audio descriptions with movie soundtracks takes some technology. We will raise and train the dog in Poland, because we live here, as do 145 000 visually impaired and 700 000 partially sighted people. But when it’s ready, it will be a platform available to anyone. For free.
- In-house audio fingerprinting technology that enables Shazam-like movie recognition
- Large database of movies and their fingerprints
- Searching & learning algorithms
- Built a prototype of custom hardware based on the ARM Cortex-M7, enabling our solution to work without a smartphone or even an internet connection
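The fingerprinting idea above can be illustrated with a minimal sketch. This is not the production Movie Guide Dog code, just an assumption of how a Shazam-style landmark scheme might look: frame the signal, find the dominant frequency bin per frame with a DFT, and hash pairs of neighbouring peaks into landmarks. A real system would use a fast FFT library, overlapping windows and many peaks per frame.

```python
# Hypothetical sketch of Shazam-style audio fingerprinting; frame sizes,
# hashing scheme and function names are illustrative assumptions only.
import cmath
import hashlib

def dft_magnitudes(frame):
    """Naive DFT magnitudes; a production system would use a fast FFT."""
    n = len(frame)
    return [abs(sum(frame[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def peak_bin(frame):
    """Dominant frequency bin of one frame (skipping the DC bin)."""
    mags = dft_magnitudes(frame)
    return max(range(1, len(mags)), key=lambda k: mags[k])

def fingerprint(samples, frame_size=64):
    """Return (hash, frame_index) landmarks for an audio signal."""
    frames = [samples[i:i + frame_size]
              for i in range(0, len(samples) - frame_size + 1, frame_size)]
    peaks = [peak_bin(f) for f in frames]
    landmarks = []
    for i in range(len(peaks) - 1):
        # Pairing a peak with the next one gives the classic
        # (f1, f2, time-delta) landmark triple, which we hash.
        key = f"{peaks[i]}:{peaks[i + 1]}:1".encode()
        landmarks.append((hashlib.sha1(key).hexdigest()[:10], i))
    return landmarks
```

The key property is determinism: the same soundtrack always produces the same landmark hashes, so they can serve as database keys for recognition.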
A few words from Marcin (UX) & Tomasz (Developer)
Among the many challenges in the project, one kept bothering me: how do you design an interface that is beautiful, usable and invisible at the same time? In the application, every screen is a single button with no other elements on it. We also used large icons, voice readout of messages and high colour contrast.
The project required us to devote significant time to research and development: we had to teach Movie Guide Dog how to listen to and learn movies, how to identify individual fragments, and how to synchronize a soundtrack with what is happening on the screen. We used the fast Fourier transform (FFT) and distributed GPU processing. A separate challenge was developing dedicated hardware that lets the audio description work without the internet or even a phone. To do this we used our own dedicated Linux build and Embedded Python.
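The synchronization step can be sketched as an offset vote. This is an assumption about the general technique, not the actual implementation: each fingerprint hash heard from the cinema's speakers is looked up in a database of (hash, movie timestamp) entries, and each match votes for a playback offset; the most common offset tells the app where the movie is, so the description can be cued to match.

```python
# Hypothetical offset-voting sketch; `db` and `heard` are illustrative
# data shapes, not the real Movie Guide Dog API.
from collections import Counter

def estimate_offset(db, heard):
    """Estimate the movie's playback offset.

    db:    dict mapping fingerprint hash -> list of movie timestamps
    heard: list of (fingerprint hash, local listening time) pairs
    Returns the most-voted (movie_time - local_time) offset, or None
    when nothing matched.
    """
    votes = Counter()
    for h, local_t in heard:
        for movie_t in db.get(h, []):
            # A true match produces the same offset for every landmark,
            # so correct offsets accumulate votes while noise scatters.
            votes[movie_t - local_t] += 1
    if not votes:
        return None
    return votes.most_common(1)[0][0]
```

Voting makes the match robust to cinema noise: spurious hash collisions land on random offsets, while genuine matches all agree on one.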
The United Ideas team gave the Movie Guide Dog project their absolute best. They worked to a very tight deadline under tremendous pressure. Even so, at every stage of the project I was kept posted with fact-based, structured updates, delivered in layman's terms so that I understood everything (I'm not a programmer). The project also required large-scale R&D work, and they delivered on that 100%, too!