Dezeen Magazine

Audi uses gaming technology to test vehicle designs in the virtual world

German manufacturer Audi is adapting gesture-control technology used in the gaming industry to make its vehicle design processes more efficient.

Engineers will use the technology to assemble prospective vehicle designs in a virtual 3D environment.

Audi's team currently uses a manual controller to manipulate components in the company's virtual testing space – known as the Cave Automatic Virtual Environment (CAVE).

"We want to make picking up and moving the components more intuitive in the future," said Audi's development engineer Katharina Kunz. "Technologies from the gaming world are ideal for us because they are relatively inexpensive and are being developed rapidly."

Using a wearable device called Myo in conjunction with a virtual-reality headset, engineers and designers working inside CAVE will have the ability to interact with components more freely. According to Audi, the system will also allow designers to test the usability of their concepts.

Myo is an armband that measures muscle activity in the wearer's forearm. The device interprets movement before sending the data to a computer, which also tracks the wearer's position using Microsoft's Kinect infrared camera system.


"All this information, processed together, allows the engineer to instantly pick up and move items in the CAVE virtual world with great precision," said Audi in a statement.
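The idea of processing the two data streams together can be sketched in code. The snippet below is a minimal illustration, not Audi's system: it assumes hypothetical gesture labels ("grab", "release") derived from the armband's muscle readings, and a hand position supplied by skeletal tracking, and shows how a virtual component could follow the tracked hand while a grab gesture is held.

```python
from dataclasses import dataclass

@dataclass
class ArmState:
    # Hypothetical fused input: position from camera-based skeletal
    # tracking, gesture label from the armband's muscle-activity data.
    hand_position: tuple  # (x, y, z) in the virtual environment
    gesture: str          # e.g. "grab", "release", "rest"

def update_component(component_pos, state, held):
    """Move a virtual component with the tracked hand while it is grabbed.

    Returns the component's new position and whether it is still held.
    """
    if state.gesture == "grab":
        # While grabbing, the component snaps to the hand's tracked position.
        return state.hand_position, True
    if state.gesture == "release":
        # On release, the component stays where it was dropped.
        return component_pos, False
    return component_pos, held
```

In this sketch the camera supplies *where* the hand is and the armband supplies *what* the hand is doing; combining the two is what lets a single gesture pick up and place an object precisely.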

The wearer activates Myo by touching their thumb and middle finger together. This ensures that the wearable device doesn't interpret every movement as a control gesture.
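That activation step amounts to a simple gate in software. The sketch below is a hypothetical illustration of the principle, assuming made-up event labels and assuming the tap toggles activation on and off: gestures are ignored until the activation tap arrives, so ordinary arm movement is not misread as a command.

```python
def filter_gestures(events):
    """Pass gesture events through only while the device is 'armed'.

    A thumb-to-middle-finger tap (hypothetical label "thumb_middle_tap")
    toggles the armed state; all other events are ignored while disarmed.
    """
    armed = False
    accepted = []
    for event in events:
        if event == "thumb_middle_tap":
            armed = not armed  # the tap switches activation on or off
        elif armed:
            accepted.append(event)  # only armed gestures become commands
    return accepted
```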

Audi is currently piloting the gesture-control technology, and aims to roll it out fully in the coming months.

A number of car manufacturers are investigating the possibilities of gesture controls for drivers. In January, Volkswagen debuted a concept car with a sunroof that opens with the wave of a hand. Land Rover is also developing a sight-activated dashboard.

Earlier this year, Google unveiled a new type of interaction sensor that uses radar to translate subtle hand movements into gesture controls, with the aim of changing how electronic devices are designed.

Musician Imogen Heap has also been experimenting with gesture control through her Mi.Mu gloves, which allow her to perform live without instruments or control panels and could lead to more accessible ways of making music.