The future of accessibility, now.
NHK Enterprises, Inc. (NEP) has long been committed to better serving the deaf community in Japan, becoming an accessibility pioneer in the media industry. This commitment has led to extensive research into Japanese Sign Language (JSL) and investment in technology that can make important news available to JSL speakers. Our solution: Kiki, a virtual human interpreter that generates full, dynamic JSL interpretations from text input.
Tech and creativity for good.
You might wonder: why not just use subtitles in broadcasts? JSL is its own distinct language, and native speakers may not speak or read Japanese for various reasons. That made reaching the deaf community in its own language an important priority. While Kiki’s primary use case is interpreting emergency broadcasts for natural disasters on NHK’s website and TV channels, the platform is also available for licensing to other major organizations, with initial engagements focused on generating sign language interpretation for museum exhibit guides, public announcements and more.
Turning text into a magical performance.
Kiki is a cutting-edge, highly realistic virtual human for sign language interpretation. Built from 16,000 individual motion-capture recordings, the stunning 3D character looks, moves and feels human. The back-end application, powered by Unity, turns sentences into welcoming, dynamic sign language interpretations, outputting a rendered video file of each sentence in near real time. Altogether, the platform achieves significant cost and time savings compared to a traditional live professional interpreter, ensuring JSL speakers have better access to the news and information that’s important to them.
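The core idea of such a pipeline can be sketched at a high level: map each input token to a pre-recorded motion-capture clip, then assemble the matched clips into a single animation timeline for rendering. The sketch below is purely illustrative; the clip names, durations and lookup logic are invented for this example, and NEP's actual Unity pipeline is not public.

```python
# Minimal sketch of a text-to-sign pipeline, assuming a clip-lookup design.
# All clip names and durations below are hypothetical examples.

from dataclasses import dataclass


@dataclass
class MocapClip:
    name: str
    duration_s: float  # playback length of the clip in seconds


# Toy clip library standing in for a large mocap collection.
CLIP_LIBRARY = {
    "earthquake": MocapClip("jsl_earthquake", 1.2),
    "warning": MocapClip("jsl_warning", 0.9),
    "evacuate": MocapClip("jsl_evacuate", 1.5),
}


def build_timeline(tokens):
    """Return the matched clips and their total duration for an input sentence."""
    clips = [CLIP_LIBRARY[t] for t in tokens if t in CLIP_LIBRARY]
    total = sum(c.duration_s for c in clips)
    return clips, total


clips, total = build_timeline(["earthquake", "warning", "evacuate"])
print([c.name for c in clips], round(total, 1))
```

In a production system, the timeline of clips would then be blended and played back on the 3D character inside the engine, rather than simply concatenated as here.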