Simulating the Distorted Vision of a Drunk Driver with Real-Time Streaming to Virtual Reality Glasses
The application runs on Android devices and streams distorted visuals to virtual reality glasses. It is used in special training sessions to improve safe driving skills.
- Technologies used: Java, Android SDK, Google VR SDK, OpenGL ES, NativeStackBlur, GPUImage for Android
- Industry: virtual reality
The customer is a global automobile consortium, which also runs a driving school of its own. At the school, teenagers can gain and improve safe driving skills at no cost. The trainings are conducted by professional tutors all over the world. The training program covers four key areas that are critical factors in more than 60% of car accidents: hazard recognition, vehicle handling, speed management, and space management.
When the customer turned to HQSoftware, one of the activities under the safe driving program was distorted driving. It helped the students understand what physical difficulties a drunk driver experiences and where those difficulties may lead. For that purpose, the trainees put on darkened-lens glasses and tried to drive a course with obstacles along the way.
The customer wanted to bring the experience closer to reality and develop an application that would simulate the visual disruptions experienced when intoxicated. The application had to run on Android devices and stream distorted visuals to virtual reality glasses (Google Cardboard, Samsung Gear VR).
In the course of the project, the development team at HQSoftware faced the following challenges:
- the application had to apply distortion filters while the image was streamed to the virtual reality glasses in real time;
- when the project kicked off, Samsung Gear VR was a brand-new technology, and our developers had to enable smooth app integration into its ecosystem.
The delivered application simulates five major distortion effects: latency vision, swirl, blur, vignette, and double vision. In addition, a user can combine these effects and tune any of them.
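Combining and tuning effects can be modeled as a chain of frame-transforming filters applied in order. The following is a minimal, Android-independent sketch of that idea; the `FilterChain` class and the toy per-pixel filters are illustrative, not taken from the actual codebase:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Illustrative sketch: each filter transforms a frame (here, an int[] of pixel values).
public class FilterChain {
    private final List<UnaryOperator<int[]>> filters = new ArrayList<>();

    // Append a filter to the chain; returns this for fluent configuration.
    public FilterChain add(UnaryOperator<int[]> filter) {
        filters.add(filter);
        return this;
    }

    // Run the frame through every filter in order.
    public int[] apply(int[] frame) {
        int[] result = frame;
        for (UnaryOperator<int[]> f : filters) {
            result = f.apply(result);
        }
        return result;
    }

    public static void main(String[] args) {
        // Two toy "filters": darken (halve each value), then invert.
        FilterChain chain = new FilterChain()
                .add(px -> {
                    int[] out = px.clone();
                    for (int i = 0; i < out.length; i++) out[i] /= 2;
                    return out;
                })
                .add(px -> {
                    int[] out = px.clone();
                    for (int i = 0; i < out.length; i++) out[i] = 255 - out[i];
                    return out;
                });
        int[] result = chain.apply(new int[]{200, 100});
        System.out.println(result[0] + "," + result[1]); // prints "155,205"
    }
}
```

Tuning an effect then amounts to parameterizing the filter before adding it to the chain.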
With the OpenGL ES API, engineers at HQSoftware enabled the application to overlay visual distortion filters in real time. Most of the filters themselves were also implemented on top of this API. The blur filter was developed using the NativeStackBlur library, and to create the vignette effect, our specialists utilized the GPUImage library for Android.
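To illustrate how such a distortion filter works, the swirl effect can be expressed as a coordinate remapping: each output pixel samples from a source position rotated around the image center by an angle that falls off with distance. A shader would perform the same computation per fragment; the plain-Java sketch below shows only the mapping, with illustrative parameter names:

```java
public class SwirlMapping {
    /**
     * Maps an output pixel (x, y) to the source coordinates it should sample,
     * rotating around the center (cx, cy) by an angle that decreases linearly
     * from `strength` at the center to zero at `radius`. Returns {srcX, srcY}.
     */
    public static double[] swirl(double x, double y, double cx, double cy,
                                 double radius, double strength) {
        double dx = x - cx, dy = y - cy;
        double dist = Math.sqrt(dx * dx + dy * dy);
        if (dist >= radius) {
            return new double[]{x, y}; // outside the swirl: image untouched
        }
        double angle = strength * (1.0 - dist / radius); // strongest at center
        double cos = Math.cos(angle), sin = Math.sin(angle);
        // Standard 2D rotation of the offset vector around the center.
        return new double[]{cx + dx * cos - dy * sin,
                            cy + dx * sin + dy * cos};
    }
}
```

For example, with `radius = 20` and `strength = π`, a pixel at distance 10 from the center is rotated by π/2, while pixels beyond the radius are left in place.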
Furthermore, the development team at HQSoftware wrote an algorithm that puts the streamed images into a queue and transmits them back with a delay at pre-set time intervals.
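The delay mechanism described above can be sketched as a timestamped frame queue: incoming frames are enqueued on capture, and a frame is released for display only once it has aged past the configured delay. A minimal, Android-independent sketch (the class and method names are illustrative):

```java
import java.util.ArrayDeque;

// Illustrative sketch of a delayed-playback queue behind a latency effect.
public class DelayedFrameQueue<T> {
    private static final class Entry<T> {
        final T frame;
        final long timestampMs;
        Entry(T frame, long timestampMs) {
            this.frame = frame;
            this.timestampMs = timestampMs;
        }
    }

    private final ArrayDeque<Entry<T>> queue = new ArrayDeque<>();
    private final long delayMs;

    public DelayedFrameQueue(long delayMs) {
        this.delayMs = delayMs;
    }

    /** Enqueue a captured frame together with its capture time. */
    public void push(T frame, long nowMs) {
        queue.addLast(new Entry<>(frame, nowMs));
    }

    /** Returns the oldest frame if it has aged past the delay, else null. */
    public T poll(long nowMs) {
        Entry<T> head = queue.peekFirst();
        if (head != null && nowMs - head.timestampMs >= delayMs) {
            queue.pollFirst();
            return head.frame;
        }
        return null;
    }
}
```

With a 300 ms delay, a frame pushed at time 0 is not returned at time 100 but is returned at time 300, which is exactly the lag a viewer perceives as latency vision.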
By exploring the under-the-hood mechanisms of Samsung Gear VR, engineers at HQSoftware successfully integrated the application into the Gear VR ecosystem.
Cooperating with HQSoftware, the customer received an application that simulates the visual distortions of an intoxicated driver and overlays them on the image streamed to the virtual reality glasses in real time. At the moment, the pilot version features five major filters (latency vision, swirl, blur, vignette, and double vision) and their combinations, all of which can be further tuned.
Now, the application is available for Android devices and is compatible with Google Cardboard and Samsung Gear VR glasses.