EVE débuts in Berlin

We had the chance to test EVE for the first time, under real-life conditions, at the Microsoft Explained digital conference in Berlin. And it started off with a surprise: Microsoft found the product so interesting that our colleague, Maxi Krause, was asked to present it on stage. After all, the theme of the event was “artificial intelligence”. Moderator Daniel Finger asked him a few questions:

What is the idea behind your project?

The main idea behind EVE is to help hearing-impaired people participate more actively in events. Until now, this has been very costly: to provide subtitles that can be read in real time, a stenographer had to be hired to transcribe the spoken word into text.

According to a report by the World Health Organization (WHO), 466 million people worldwide are hearing impaired (2018 figures). Many of them rely on subtitles to follow what is being said. The sad truth is that subtitles are not available often enough on the web, on TV and at events.

That’s why we developed EVE: to make this service available at as many events as possible. EVE uses AI and Microsoft Cognitive Services to produce subtitles, which attendees can then read live on a monitor or via a link on a personal device. The view is fully responsive, so it also works in the browser on a mobile phone. All of this happens in real time and at a much lower cost.
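Under the hood, this comes down to streaming speech recognition. As a rough illustration only, here is a minimal sketch of live transcription with the Azure Cognitive Services Speech SDK for Python; the key, region, language and the console output are placeholders, and EVE’s actual pipeline (monitors, browser links, proofreading) sits on top of results like these.

```python
# Minimal sketch: live transcription with the Azure Cognitive Services Speech SDK.
# Key, region and locale are placeholders; EVE's real pipeline additionally pushes
# the interim results to monitors and to browsers on personal devices.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="YOUR_KEY", region="westeurope")
speech_config.speech_recognition_language = "de-DE"   # the Berlin keynote was in German
audio_config = speechsdk.audio.AudioConfig(use_default_microphone=True)

recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config,
                                        audio_config=audio_config)

# Interim hypotheses arrive while the speaker is still talking ...
recognizer.recognizing.connect(lambda evt: print("partial:", evt.result.text))
# ... and the finalized sentence follows once the utterance is complete.
recognizer.recognized.connect(lambda evt: print("final:  ", evt.result.text))

recognizer.start_continuous_recognition()
input("Press Enter to stop captioning...\n")
recognizer.stop_continuous_recognition()
```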


How well did the AI perform during the event?

I followed the subtitles during the keynote. In my opinion, EVE’s performance was at approximately 90% of where we want it to be. But we are still developing EVE. What we are looking at today is only a beta release. It is definitely not perfect and still needs improvement.

Another thing you need to know to make an objective assessment is that it is possible to have a human revise the text before it is released. The proofreader checks the transcribed sentences and corrects minor errors, such as misrecognized technical terms. Typos and punctuation errors are corrected by our service in real time. By the way, the proofreader can be sitting anywhere in the world. All they need is an Internet connection and a browser.


EVE ran for 8 hours straight, and Maxi spent 30 minutes explaining to the audience what our service is all about. (Source: Filmgsindl GmbH)

How quickly will the AI improve?

The service learns automatically with each correction the proofreader makes. For example, if the word “ice tea” is changed to “IoT” (Internet of Things) five times, EVE takes note of this and uses the correct term the next time. By default, these learned corrections stay with the customer’s account. However, the user can allow EVE to share its learned corrections with others. After all, there are many very important data protection questions in the age of AI.
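To make the idea concrete, here is a small, purely illustrative Python sketch of such a correction memory. The class name, the threshold of five and the sharing flag are assumptions for the example, not EVE’s real implementation.

```python
# Illustrative sketch (not EVE's actual code): learned corrections are kept per
# customer account and applied automatically once the same fix has been made
# a few times by a proofreader.
from collections import defaultdict

AUTO_APPLY_THRESHOLD = 5  # e.g. "ice tea" -> "IoT" corrected five times

class CorrectionMemory:
    def __init__(self, share_with_others: bool = False):
        # By default corrections stay with the customer's account;
        # the flag models the opt-in sharing described above.
        self.share_with_others = share_with_others
        self.counts = defaultdict(int)   # (wrong, right) -> number of proofreader fixes
        self.learned = {}                # wrong -> right, once the threshold is reached

    def record_correction(self, wrong: str, right: str) -> None:
        self.counts[(wrong, right)] += 1
        if self.counts[(wrong, right)] >= AUTO_APPLY_THRESHOLD:
            self.learned[wrong] = right

    def apply(self, transcript: str) -> str:
        for wrong, right in self.learned.items():
            transcript = transcript.replace(wrong, right)
        return transcript

memory = CorrectionMemory()
for _ in range(5):
    memory.record_correction("ice tea", "IoT")

print(memory.apply("Our platform connects every ice tea device."))
# -> "Our platform connects every IoT device."
```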


What are the biggest challenges EVE faces?

The biggest challenge is when more than one person is speaking at the same time. But then even we humans would have difficulty understanding what is being said. Accents are also not easy, for example an American giving a speech in German, or someone from Munich speaking in a Bavarian dialect. However, there are things we can do to overcome these challenges. In the future, we will be able to train EVE using language models. Customers can ask their presenters to record 10 sentences on their mobile phones and send them to EVE, which improves the recognition rate immensely. Technical terms can be learned easily by uploading a dictionary.
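As one illustration of the dictionary idea, the Cognitive Services Speech SDK offers a phrase-list feature that hints domain terms to the recognizer. The sketch below uses it with an invented glossary and should be read as one possible mechanism, not as EVE’s actual training workflow; the key, region and glossary entries are placeholders.

```python
# Illustrative sketch: biasing recognition toward uploaded technical terms with
# the Speech SDK's phrase-list feature. EVE's own dictionary upload and the
# planned custom language models are separate features; this only shows the idea.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="YOUR_KEY", region="westeurope")
speech_config.speech_recognition_language = "de-DE"
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)

# "Uploading a dictionary": every glossary entry becomes a hinted phrase,
# which raises the chance that rare technical terms are recognized correctly.
glossary = ["IoT", "HoloLens", "Cognitive Services", "Filmgsindl"]
phrase_list = speechsdk.PhraseListGrammar.from_recognizer(recognizer)
for term in glossary:
    phrase_list.addPhrase(term)

result = recognizer.recognize_once()
print(result.text)
```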


Will jobs be lost because of EVE?

No, just the opposite! And this was important to us from the very beginning. We offer EVE licenses to stenographers at a special price. They can use it to transcribe events without actually being there, which means less travel and lower costs. It also lets them appeal to clients who would otherwise not have paid for their service. So you see, jobs are not lost; they become more digital and remote.


What will the future look like for EVE?

EVE is a project very close to my heart and Tom Papadhimas’, and we believe that we can achieve a lot with it. We have many plans, including, for example, many new languages and live translation. We also want to use speaker recognition to output text in different colours, so that each colour represents a different speaker. Over the long term we want to support HoloLens as well. Imagine it: you are at a football stadium, you put on your HoloLens, and you see real-time subtitles of what the stadium announcer is saying.


When will it be available for purchase?

EVE should go live in February 2019. Until then, we are gathering feedback from selected test customers and refining the service based on their input. Contact us if you want to test EVE in your environment or share your ideas with us!

EVE live at work for the first time!

Here is what the venue looked like just before the guests arrived. We were still pretty nervous, though! (Source: Filmgsindl GmbH)