Royal Shakespeare Company to stage live play in virtual Midsummer Night's Dream forest

The Royal Shakespeare Company is set to put on a virtual play called Dream that will be performed by actors in motion capture suits and animated in real time.

The digital world, revealed above in an exclusive video shared with Dezeen, was designed by London creative studio Marshmallow Laser Feast using the same game engine that powers the popular online game Fortnite.

Performer in motion capture suit rehearsing for virtual Dream play by the Royal Shakespeare Company
Dream will be performed by actors in motion capture suits

Dream follows Puck, the mischievous sprite from William Shakespeare's A Midsummer Night's Dream, as he embarks on a journey through a magical woodland.

It will be performed live ten times between 12 and 20 March in a seven-by-seven-metre cube in Portsmouth's Guildhall.

The play will be live-streamed to audiences, who will see not the performers or the stage but their movements and voices expressed through avatars in the virtual world.

Still of Peaseblossom character from Dream
Peaseblossom is one of the sprites featured in the play

This will be made possible through the same technology that has long been used to create characters for video games and feature films, such as Gollum in The Lord of the Rings.

But while this process has previously involved lengthy pre- or post-production to produce a fixed outcome, Dream is animated as it is acted out, leaving unprecedented room for spontaneity.

Trailer for Dream

"There's space for performers to improvise and for us to allow for happy accidents to occur and to find beautiful moments through the rehearsal process," Marshmallow Laser Feast (MLF) founder Robin McNicholas told Dezeen.

"This just hasn't been done in the past because motion capture is generally quite a laborious and painting-by-numbers approach. We're using the same techniques but in a live setting. As a result, it's really high-risk but really exciting. It feels like an adventure."

Animation of virtual forest created using Unreal Engine
The virtual forest was created using Unreal Engine

With the help of a web player that was specially designed for this project, viewers will be able to manipulate Dream's virtual world via their laptop, phone or tablet.

MLF has designed both the digital sprite characters and their surrounding forest world in advance using Unreal Engine. The 3D production platform, created by American video game developer Epic Games, powers Fortnite and has also been used on several Star Wars films.

Rose bush animation from Dream play by the Royal Shakespeare Company
Rose bushes are among the flora found in the forest

To populate the virtual woodland, MLF worked with the play's writer Pippa Hill to incorporate the flora mentioned in A Midsummer Night's Dream that would be found in English forests at the time the play was written.

"Instead of presenting the traditional Victorian illustrated fairy world that Midsummer Night's Dream is often portrayed with, we thought: let's focus on the wonders of the natural world and the magic of it," McNicholas explained.

"We take people from the canopy of the trees down to underneath the forest floor, which provided us with lots of different spaces to be creative within and to frame characters."

Animation of virtual forest in Dream play by the Royal Shakespeare Company
MLF designed the world to highlight the beauty of nature

Although the world was pre-designed, the way the characters move and interact with their environment will be determined by the actors, who will wear both facial rigs and mocap suits.

Vicon motion capture cameras will record the actors' actions and expressions while large LED screens surrounding the performers help them to imagine where their avatar is in the virtual world.
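
The production's actual capture pipeline has not been published, but the basic idea – frames of skeleton data arriving many times a second and being applied to a digital character immediately, rather than in post-production – can be sketched in a few lines. Everything below, from the frame format to the function names, is a hypothetical illustration rather than the RSC's or Vicon's real code.

```typescript
// Illustrative sketch of a live motion-capture loop. All names are invented.

interface Joint {
  name: string;
  rotation: [number, number, number]; // Euler angles in degrees
}

interface MocapFrame {
  timestamp: number; // milliseconds since the start of the performance
  joints: Joint[];   // one entry per tracked body part
}

// Stand-in for the stream of frames a capture system would deliver.
function* fakeCaptureStream(): Generator<MocapFrame> {
  let t = 0;
  while (true) {
    yield {
      timestamp: t,
      joints: [{ name: "rightArm", rotation: [0, Math.sin(t / 500) * 45, 0] }],
    };
    t += 8; // roughly 120 frames per second
  }
}

// Retarget the performer's pose onto the digital sprite and "render" it.
// In a real engine this would drive a character rig; here we just log the pose.
function driveAvatar(frame: MocapFrame): void {
  for (const joint of frame.joints) {
    console.log(`${frame.timestamp}ms ${joint.name} -> ${joint.rotation.map(r => r.toFixed(1))}`);
  }
}

// The key difference from conventional mocap: there is no post-production pass.
// Each captured frame is applied to the avatar the instant it arrives.
const stream = fakeCaptureStream();
for (let i = 0; i < 5; i++) {
  driveAvatar(stream.next().value);
}
```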

Staging blocks, which are used to give the actors elevation, will be the only props involved in the play and will be animated as anything from rocks to roots and branches.

Performer in motion capture suit rehearsing for virtual Dream play
Staging blocks help give the virtual character elevation

At key points, the actors will also be able to use their movements and gestures to manipulate the play's score, which was recorded by London's Philharmonia Orchestra.

"We have all the different instruments recorded separately, so one performer's arm could be a cello and another's could be a harp," McNicholas explained. "But it is tailored in such a way that any note that is played corresponds to the overall score."

Behind the scenes of real-time animation for Dream play by the Royal Shakespeare company using Unreal Engine
The play is being animated in real time

Viewers will be able to participate in the play as fireflies, which can be controlled via a mouse, trackpad or touchscreen. These fireflies will appear both on the viewers' screens as well as on the LED screens surrounding the performers.
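
How the web player transmits those firefly movements has not been made public, but a plausible, much-simplified version is a page that normalises the viewer's pointer position and streams it to the show. The endpoint and message format below are assumptions for illustration only.

```typescript
// Browser-side sketch of the firefly interaction; nothing here reflects the
// production's actual web player.

const socket = new WebSocket("wss://example.org/dream/fireflies"); // hypothetical endpoint

function sendFireflyPosition(event: PointerEvent): void {
  // Normalise to 0..1 so the same coordinates work on any screen size,
  // whether a phone, a laptop or the LED walls around the performers.
  const x = event.clientX / window.innerWidth;
  const y = event.clientY / window.innerHeight;
  socket.send(JSON.stringify({ type: "firefly", x, y }));
}

// Mouse, trackpad and touchscreen input all arrive as pointer events.
window.addEventListener("pointermove", sendFireflyPosition);
```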

"The audience at home are part character, part lighting department, part costume," said McNicholas. "And neither they nor the performers, the cast or the crew know the outcome of the performance. Each one is unique – it will look and sound different."

Bushes in virtual play
It takes viewers through different levels of the woodland from the canopy to the forest floor

The production is the outcome of a multi-year research and development collaboration, which was funded by the UK government's Industrial Strategy Challenge Fund and aimed at finding new ways to engage audiences through technological innovation.

The Royal Shakespeare Company previously staged a mixed reality production in 2016, using live performance capture via Unreal Engine to render Ariel, the spirit from The Tempest, as a 3D hologram on stage.

The technology is set to become more readily available to smaller creators, as Epic Games announced last month that it would be releasing an app called MetaHuman Creator, which allows anyone to create a realistic digital character in a matter of minutes.