Life of Us

The studio
Bringing social VR to life

Life of Us creators Chris Milk and Aaron Koblin are LA-based VR innovators known for “The Wilderness Downtown,” an interactive Arcade Fire video that won a Grand Prix at the 2011 Cannes advertising awards.

Over the years, the two have put together a string of ground-breaking projects that blend cool technology and consumer entertainment. Today, the pair collaborates with organizations like Apple, The New York Times, Vice Media, and the United Nations.

Their brainchild is WITHIN, a production and distribution company of more than 35 people whose mission is to define VR as a new medium for experiential storytelling and use it to seed empathy. “Virtual reality is the ultimate empathy machine,” Milk says. “It allows you to connect on a real human level, soul to soul, regardless of where you are in the world.”

Life of Us is WITHIN’s first prototype for what a shared VR experience looks and feels like.

The project

Produced in association with Annapurna Pictures, Life of Us features original music by Pharrell Williams. The action-packed piece premiered at Sundance’s New Frontier Festival and Tribeca in 2017 and lasts just under 10 minutes.

Through the end of 2017, you can experience Life of Us in:

Los Angeles (IMAX VR Center and AFI Fest LA)
New York (IMAX VR Center and Future of Storytelling)
Montreal (Phi Centre)
Paris (mk2)
London (Raindance Festival)
Beijing (SoReal)
Shanghai (IMAX)

Check the venues and festivals for exact dates.

A billion years in 10 thrilling minutes

Blow bubbles, breathe fire, survive the ice age, and fly over volcanoes as you experience billions of years of evolution from the perspectives of different creatures. Life of Us is neither a game nor a film: it’s a VR adventure you can enjoy in your own room, with friends or with strangers, while taking in the remarkable story life on Earth has to tell.

Up to six players at a time move freely along their chosen course and interact with each other, for example by “talking” in non-human voices. As one Engadget reviewer put it: “It feels like the difference between watching a movie alone and watching it with friends. Even with a headset on, shut off from the world, you can still connect with someone.”

The reveal
Telling the story with Timeline

Just as some prehistoric fish eventually grew legs and left the oceans behind, Life of Us helped evolve WITHIN’s VR production in leaps and bounds. One vital tool that enabled them to compose their sophisticated story was Unity’s Timeline, which they employed to drag and drop content into a rough cut.

“That’s the most important feature we used,” says Koblin. “The experience is highly scripted and linear, and Timeline made it easy to trigger animations, music and events so we could just slide things around and change them in real-time.”  
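Unity Timeline’s actual API isn’t shown in this piece; as an illustration of the underlying idea, here is a minimal Python model (the class and method names are ours, not Unity’s) of a scripted, linear timeline in which clips can be slid around and then queried at any playback time:

```python
class ScriptedTimeline:
    """Minimal linear timeline: clips occupy time ranges, and
    scrubbing to a time reports which clips should be active,
    so events can be retimed without touching any other code."""

    def __init__(self):
        self.clips = []  # list of (start, end, name) tuples

    def add(self, start, end, name):
        self.clips.append((start, end, name))

    def active_at(self, t):
        # A clip is active from its start (inclusive) to its end (exclusive).
        return [name for start, end, name in self.clips if start <= t < end]
```

For example, a music clip spanning 0–5 seconds and a roar cued from 2–4 seconds would both report active at t = 3, while only the music remains at t = 4.5; retiming the roar is just a change to its start and end values.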


WITHIN used Timeline to trigger and change animations, events and music in real-time.

In Edit mode they were able to trim, move and finesse hundreds of different clips and view their results instantly, making it “easy to iterate on a project where timing is crucial because it has a scrubber . . . it saved us countless hours,” says Scott Cairns, lead engineer.

Additionally, Timeline’s granular controls let them adjust clips frame by frame to keep players synced across the network. And because every transition was planned on a single timeline, they relied heavily on its blending capability, setting inter-clip blend weights to control the strength of each transition.
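The blend weights described above amount to a cross-fade over the region where two clips overlap. A minimal sketch in Python (the function and parameter names are ours, not Timeline’s API) of linear blend weights that always sum to 1:

```python
def blend_weights(t, clip_a_end, clip_b_start):
    """Linear cross-fade weights for two overlapping clips.

    Before the overlap only clip A plays; after it only clip B;
    inside the overlap the outgoing weight falls linearly while
    the incoming weight rises, and the two always sum to 1.
    """
    if t <= clip_b_start:
        return 1.0, 0.0  # only clip A
    if t >= clip_a_end:
        return 0.0, 1.0  # only clip B
    # Inside the overlap: interpolate linearly across its length.
    w_b = (t - clip_b_start) / (clip_a_end - clip_b_start)
    return 1.0 - w_b, w_b
```

Halfway through the overlap — say t = 5 when clip B starts at 4 and clip A ends at 6 — both clips carry equal weight, (0.5, 0.5). Shaping this curve (e.g. ease-in/ease-out instead of linear) is what manipulating the transition strength comes down to.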

Speaking in prehistoric tongues

To deepen the shared experience, WITHIN wanted players to be able to utter sounds and hear each other but not in any known language. To achieve this, the team innovated a unique sound design so that when players speak – as tadpoles, dinosaurs, apes, etc. – their human voices are translated into “warble” voices.

“We worked out real-time modulation to change your voice as you embody each new creature,” says VR engineer Jake Jeffrey. To achieve the effect, they employed frequency modulation to alter the pitch, and granular synthesis to redesign the modulated sound: “By chopping up your voice into thousands of smaller pieces, we could turn it into the voice of anything from a grunting gorilla to a roaring pterodactyl.”
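WITHIN hasn’t published its audio code, so as a toy illustration of the granular idea — chopping a voice into small grains and re-reading each grain at a new rate — here is a crude granular pitch shifter in Python (the function name and the simple linear-interpolation resampler are ours):

```python
def granular_pitch_shift(samples, grain_size, pitch_ratio):
    """Toy granular pitch shifter.

    The input is chopped into fixed-size grains; each grain is
    re-read at pitch_ratio times normal speed (wrapping around
    within the grain), which shifts pitch while preserving overall
    duration. The discontinuities at grain boundaries are exactly
    the kind of artifact that gives a "warble" character.
    """
    out = []
    for start in range(0, len(samples), grain_size):
        grain = samples[start:start + grain_size]
        for i in range(len(grain)):
            pos = (i * pitch_ratio) % len(grain)
            j = int(pos)
            frac = pos - j
            nxt = grain[(j + 1) % len(grain)]
            # Linear interpolation between neighboring samples.
            out.append(grain[j] * (1 - frac) + frac * nxt)
    return out
```

With a pitch ratio of 1.0 the signal passes through unchanged; ratios above or below 1.0 raise or lower the pitch grain by grain. A production system would also window and overlap the grains to soften the boundaries.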

Reducing the voice latency was another hurdle that the team worked through. “In the end, we combined a slew of existing technologies in order to achieve a system for voice manipulation. For upcoming projects, we’re especially eager to use the Unity Native Audio Plugin SDK with its own Granulator.”

Watch this behind-the-scenes video to learn more about the advanced techniques WITHIN pulled off.

Character-rigging brings predators alive

In addition to the unique sound design, the mechanics of the player movements – flying like a pterodactyl, running from a T. rex, swimming like an amoeba, and so on – also presented major challenges. “Those were very difficult to get right and to make feel comfortable,” says Koblin. “We’re proud of our character rigs – which combine keyframing, inverse kinematics, and ragdolls – supporting up to six people in the space.”

They used keyframing to animate movements that were independent of player input, and inverse kinematics to determine where necks, shoulders and elbows would be, based on the data coming in from the headsets and controllers. To determine how monkeys, for example, would react when grabbed off fellow players’ backs and thrown into the air, they tapped ragdoll physics.
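WITHIN’s rig code isn’t shown here, but the elbow-placement problem it describes is classically solved with a two-bone, law-of-cosines IK step. A minimal sketch in Python (the function and parameter names are ours) that derives the elbow’s interior angle from tracked shoulder and hand positions:

```python
import math

def two_bone_ik(upper_len, lower_len, target_dist):
    """Return the elbow's interior angle (radians) that places the
    wrist at target_dist from the shoulder, via the law of cosines.

    The target distance is first clamped to the arm's reachable
    range so an out-of-reach controller yields a straight arm
    rather than a math error.
    """
    d = max(abs(upper_len - lower_len), min(upper_len + lower_len, target_dist))
    cos_elbow = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    return math.acos(max(-1.0, min(1.0, cos_elbow)))
```

For equal bone lengths of 1, a target at distance 2 gives an angle of π (a fully straight arm), while a target at √2 bends the elbow to a right angle. A full rig would additionally choose the elbow’s swivel direction and blend this result with keyframed and ragdoll poses.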

Advanced character rigs give players the sensation of real flight and movement.

Koblin recalls the magical moment when the life in Life of Us all came together: “You could see the humanity pouring out of our creation despite the fact that it wasn’t photorealistic at all. We realized that through our innovations we were suddenly in this weird world together!”

Explore other amazing VR experiences like Life of Us made with Unity.


Want to build a game with Unity’s open, flexible architecture? Check out the latest version of Unity.