Highlights from Unite Austin 2017

- by John Luxford

By John Luxford, CTO & Co-founder - Flipside


I made my first visit to Austin, Texas for Unity’s Unite conference, which ran October 3-5, and wanted to share some highlights from the amazing week I had there.

Sign up for early access to Flipside!

Neill Blomkamp interview

Neill Blomkamp, director of District 9 and Chappie, has been collaborating with Unity to produce the next two short films in Unity’s Adam series. The series is meant to demonstrate Unity’s ability to render near-lifelike animated content in real time, and it is just beautiful.


The first of the two new short films, Adam: The Mirror, was shown during the Unite keynote just before Neill Blomkamp was invited on stage to talk about the move from pre-rendered CG to real-time rendering, and how his team felt like it was almost cheating because you no longer have to wait for each frame to render before watching the results. It was super cool to hear him speak, having been a big fan ever since I first saw District 9.

Rick and Morty talk

It’s always a pleasure to hear Devin and Alex from Owlchemy Labs talk about VR, not just because they’re super entertaining, but because of the careful consideration they bring to each aspect of design and development.

This talk deconstructed their Rick and Morty: Virtual Rick-ality game, showing how they solved issues like designing for multiple playspace sizes, 360° and 180° setups, making teleportation work in the context of the game, and designing interactions for a single trigger button control scheme.


They also showed a spreadsheet of the possible permutations that the Combinator machine lets players create, and it reminded me a lot of their talk about Dyscourse having around 180 separate branching narratives all woven together. Sometimes solving a hard problem the right way takes an awful lot of hard work.

Airbnb roomies

I arrived in Austin just in time to head over to the Unite Pre-Mixer put on by the local Unity User Group. As I was walking around the room, going between the drink station and the various VR and AR demo stations, and chatting with the occasional person, I heard a “Hey you, you look like you’re just wandering around. Come talk to us!” from someone who then introduced himself as Byron. We all had a great chat, and it was a nice welcome into the Unite community.

Fast-forward a day: I head back to my Airbnb around 1am after cruising 6th Street with some friends I ran into / met that day, and sitting on the couch is a guy who introduces himself as Mike, just back from Unite as well. Awesome! So we get to talking, and about 15 minutes later Byron walks into the house, looks at me, and says “I didn’t know we were roommates!”

We all laugh about it, find out we’re all Canadian too, and stay up until around 4am laughing and telling stories. Couldn’t have been a better living situation to find yourself in for your first Unite :)

AR roundtable

I got invited to participate in a roundtable with a handful of other developers to share our thoughts on how Unity can better support AR development. It was an honour and super fun too! We talked about so many aspects of both AR and VR, challenges with input limitations, tracking limitations, trying to create useful apps instead of short-term gimmicks, and lots more.

This highlighted for me how receptive Unity is to learning from its community of developers and artists. Coming from the open source world, you can really see which projects listen to their user base, and which ones assume the user must have done something wrong - “the bug couldn’t possibly be our fault.”

It’s very encouraging to see Unity fall squarely on the right side of that cultural divide - something I felt echoed in each conversation I had with Unity’s developers over the course of the week.


Other notables for a post that’s already too long

  • Got to try Microsoft’s Mixed Reality headset and controllers (which were quite impressive!), and our own dev kit arrived back at the office that same day.
  • Got to try the Meta 2 AR headset. Much better field of view than the HoloLens, but interactions are equally limited at this stage. Someone please integrate Leap Motion into your AR headset!
  • Our first package was approved in the Unity Asset Store during the conference - VRKeys, a free drum-style virtual keyboard for Oculus Touch and HTC Vive.
  • So much BBQ meat and tacos - Austin is a great city for food!
  • Thanks to Unity for the awesome party with fancy cupcakes, donuts, live band, and great conversations. I may have had more fun though at the Jackalope afterwards with Byron and the Owlchemy Labs crew :P
  • Unity’s new Entity and Job System for Data-Oriented Design and writing code that can more easily utilize all CPU cores looks AWESOME! Watching Joachim Ante (CTO/Co-founder of Unity) give an hour-long talk going over the new system in detail was really cool.

Now I can’t wait for Unite 2018. I look forward to learning lots of new things again, and seeing many familiar faces. Thanks to everyone who made my first Unite such an awesome one!

 

Working with live actors in VR - Part 3 of 3

- by John Luxford

Part 3: Technical lessons learned

By John Luxford, CTO & Co-founder - Flipside

This is part 3 of our blog post series about acting in VR and working with actors in virtual environments. Here are the two previous posts:

  • Part 1: General lessons learned
  • Part 2: Lessons for actors

Now that we’ve explored some general lessons learned, as well as lessons from actors for actors wanting to act in VR, here are some of the more technical discoveries we've made that can have a big impact on the quality of your final output.


Jordan Cerminara recording an episode of Earth From Up Here using Flipside Studio.

Work hard to minimize latency

Actors need to respond quickly to verbal and physical cues from the other actors present, as well as to changes in the environment. This is not a problem in the real world, because there is no latency between actors who are present in the same physical space, but actors connected over multiplayer are always seeing each other's actions from slightly in the past. That delay is the latency between them.

In a virtual space, latency is impossible to avoid. Even the time it takes for an action taken by the actor to be shown in their own VR headset can be upwards of 20 milliseconds. Remote actors will see each other's actions with latencies of 100 milliseconds or greater, even over short distance peer-to-peer connections.

Depending on the distance and connection quality, that can be as much as half a second or more, in which case reaction times are simply too slow. Past the 100 millisecond mark, actor-to-actor response times can degrade quickly, making the reaction to a joke fall flat, or creating awkward pauses similar to those you see on a slow Skype connection. For this reason, a virtual studio needs to be designed to keep latency to an absolute minimum.
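As a back-of-the-envelope check before a shoot, you can measure the round trip between two machines yourself. Here's a minimal sketch in Python, assuming a trivial UDP echo server is running on the peer (the address is a placeholder for your own LAN); one-way latency is roughly half the round trip.

```python
import socket
import time

# Minimal UDP round-trip probe. Assumes the peer machine runs a
# trivial UDP echo server; the address below is a placeholder.
PEER = ("192.168.1.50", 9999)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(1.0)

samples = []
for _ in range(20):
    start = time.perf_counter()
    sock.sendto(b"ping", PEER)
    try:
        sock.recvfrom(16)
    except socket.timeout:
        continue  # dropped packet; skip this sample
    samples.append((time.perf_counter() - start) * 1000.0)

if samples:
    # One-way latency is roughly half the measured round trip.
    print(f"avg RTT: {sum(samples) / len(samples):.1f} ms "
          f"({len(samples)} of 20 replies)")
```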

Fortunately, a virtual studio doesn't have a lot of the requirements that make peer-to-peer connections disadvantageous for video games. For example, the number of peers is relatively low, and you don't need to protect against cheating or wait for the slowest player to catch up before achieving consensus on the next action in the game. So for VR over shorter distances, peer-to-peer is a better option than routing through a server in the middle (although a server can often decrease latency over greater distances because of the faster connections between data centers).

Buffering needs to be minimized as much as possible too. Minimal buffers also mean the system can't smooth over network hiccups as easily, so a stable and fast network connection is needed at both ends.
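That trade-off is easy to see in a toy jitter buffer: the shallower the buffer, the less latency it adds, but the less room it has to absorb late packets. A sketch, illustrative only and not Flipside's implementation:

```python
from collections import deque


class JitterBuffer:
    """Toy fixed-depth jitter buffer: a shallower buffer means lower
    added latency, but fewer late packets can be absorbed."""

    def __init__(self, depth=2):  # depth in packets; 2 is an aggressive, low-latency choice
        self.depth = depth
        self.queue = deque()

    def push(self, packet):
        self.queue.append(packet)
        # Drop the oldest data if the sender gets ahead of playback,
        # trading a small glitch for bounded latency.
        while len(self.queue) > self.depth:
            self.queue.popleft()

    def pop(self):
        # Returns None on underrun; the caller must conceal the gap
        # (e.g., repeat or fade the last audio frame).
        return self.queue.popleft() if self.queue else None
```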

Soundproof between multiple actors in the same location

A great way to keep latency to a minimum is to make sure the actors are physically located close together, preferably connected via Ethernet to the same network.

If you're recording a show with multiple actors in the same physical space, soundproofing between them becomes critical, because the microphone in each VR headset can pick up the other actors' voices, causing lip sync issues where a character's lips move when its actor isn't speaking, or even one actor's voice faintly coming out of another actor's mouth.


Even feet on the ground, or the clicks from the older Oculus Touch engineering samples, can be picked up and become audible, or cause the character's lips to twitch. Wearing socks and using the consumer edition of the Oculus Touch controllers can make a big difference.

In-ear earphones are also key to ensuring voices don't bleed from the earphones into the wrong actor's microphone.

Audio quality is as important as visual quality

On the simplest level, this means adjusting the VR headset microphone levels in the system settings so that the voices at their loudest aren't clipping (i.e., causing audio distortion). It also means getting the audio mix right between the actors, the music, and other sound effects.


Clipping in a digital audio signal.
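As an illustration (not part of Flipside, just a quick diagnostic), here's one way to sanity-check a recorded take for clipping, assuming float samples normalized to [-1, 1]:

```python
import numpy as np


def check_levels(samples: np.ndarray, clip_threshold=0.99) -> bool:
    """Report the peak level of a float audio buffer in [-1.0, 1.0].

    A peak pinned at (or very near) full scale means the waveform was
    flattened on its way in - turn the microphone gain down.
    """
    peak = float(np.max(np.abs(samples)))
    clipped = int(np.sum(np.abs(samples) >= clip_threshold))
    print(f"peak: {peak:.3f}, samples at/near full scale: {clipped}")
    return peak < clip_threshold


# Example: a sine wave recorded too hot, flattened at full scale.
t = np.linspace(0, 1, 48000)
too_hot = np.clip(1.5 * np.sin(2 * np.pi * 220 * t), -1.0, 1.0)
check_levels(too_hot)  # -> peak 1.000, fails the check
```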

For traditional 2D output, a spatialized audio mix is not ideal, since that means the mix will be relative to the position and direction of the local actor's head in the scene. For this reason, a stereo mix is important if you're recording for 2D viewers, so with Flipside we built a way to output a stereo mix while recording and still replay the spatialized version in VR.

Another challenge is that VoIP-quality voice recording is substandard for recorded shows. A digital recording can only capture frequencies up to half its sample rate (the Nyquist limit), so a 16kHz sample rate captures nothing above 8kHz, losing the higher frequencies of an actor's voice and leaving them sounding muffled.

In practice, the ceiling where voice stops being captured properly is around 7.2kHz, and to capture the full frequency range of a voice you want everything up to 12kHz or even higher, which means sampling at 24kHz or more. But this is a trade-off between quality and the size of the audio data being sent between actors. If the data is too large, it can slow things down, adding to the latency problem.
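The numbers behind that trade-off are easy to work out. A rough sketch for uncompressed 16-bit mono audio (real systems compress this with a codec, but the ceiling on capturable frequency is the same):

```python
# Back-of-envelope: capturable frequency vs. raw data rate
# for uncompressed 16-bit mono audio (no codec assumed).
for sample_rate in (16_000, 24_000, 48_000):
    nyquist_hz = sample_rate / 2    # highest capturable frequency
    bytes_per_sec = sample_rate * 2  # 16-bit mono = 2 bytes/sample
    print(f"{sample_rate} Hz -> up to {nyquist_hz / 1000:.0f} kHz, "
          f"{bytes_per_sec / 1000:.0f} kB/s per actor before compression")
```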

Rift vs Vive, pros and cons for acting purposes

Both are amazing platforms in their own right, and which one is anyone's favourite usually comes down to personal preference.

That said, for the purposes of acting, the Oculus Rift with Touch controllers has certain advantages, and the HTC Vive has others.

On the Oculus Rift, the microphone generally sounds better, and the Oculus Touch controllers offer more expressiveness in the hands, as well as additional buttons which allow for control of things like a teleprompter or slideshow in the scene. We've found the Oculus Touch's joystick easier for precision control of facial expressions than the thumb pad on the HTC Vive controllers, and the middle finger button easier to grab with than the Vive's grip buttons.

On the other hand, the HTC Vive's larger tracking volume is better suited to actors looking to move around, although a 3-sensor setup can easily achieve a sufficient tracking volume for Oculus Rift users. The Vive also wins on cord length from the PC to the headset, and the Vive Trackers are awesome for doing full-body motion capture!


Closing thoughts

Working with professional actors in Flipside Studio over the past few months really opened our eyes to the subtle balance needed to provide an environment they feel not just comfortable acting in, but inspired to be in too.

We're glad we could share what we've learned with you, and as we continue seeking out new actors for our Flipside Studio early access program, we hope these lessons will inform creators and help you create better content, faster.

 

The process behind Earth From Up Here

- by John Luxford

Earth From Up Here is a news-style comedy show produced in Flipside that features an alien newscaster named Zēblō Gonzor, who sits at a UFO-shaped desk, with a screen behind him for showing photos and video clips. On his desk are a number of props - a coffee cup, a paper and pencil - and some buttons that trigger the show's intro, outro, and alien laugh tracks. The concept is similar to Saturday Night Live's Weekend Update and Last Week Tonight with John Oliver.

Sign up for early access!

Earth From Up Here was designed for a two-person team made up of a producer/director and a writer/performer. Here is how we make an episode of the show.

Preparing for the show

The producer/director searches for funny news stories and videos online and adds them to a preparation document.


The writer/performer takes this document and writes a script based on the materials provided.


Based on the script, the writer and producer also compile the needed photos and video clips. We use Photoshop and Adobe Premiere for this step.


We then upload the images and video files to the web. We use Backblaze B2 to store these files, but you could also use Amazon S3 or any website or file hosting service that provides public links to the files.
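If you'd rather script this step than use a web console, a minimal sketch against Amazon S3 using boto3 might look like the following (the bucket and file names are hypothetical, and Backblaze B2 has its own API and CLI):

```python
import boto3

# Hypothetical bucket and file names; assumes AWS credentials are
# already configured for boto3.
s3 = boto3.client("s3")
bucket = "efuh-episode-assets"

for filename in ["slide-01.jpg", "slide-02.jpg", "clip-01.mp4"]:
    s3.upload_file(
        filename, bucket, filename,
        ExtraArgs={"ACL": "public-read"},  # make the link publicly viewable
    )
    print(f"https://{bucket}.s3.amazonaws.com/{filename}")
```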

Producing the show

Now we're ready to launch Flipside and enter the studio. Once inside, we select the Earth From Up Here set.

To add our slideshow links, we right-click the Slideshow button under the Activities tab and edit the slideshow settings. We paste the links in one-per-line in the order they need to appear for the show.

Next, we do the same thing for the Teleprompter and paste the script into its settings, so the talent can read from the script while they're performing.

Now that the show is ready to be recorded, we also launch OBS Studio to record the video output. Flipside controls OBS Studio transparently using the obs-websocket plugin. This allows Flipside to focus on recording the VR version of the show, while OBS Studio simultaneously handles recording the video output and can stream a live feed to Twitch, YouTube, Facebook Live, and other streaming services.
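For anyone scripting a similar pipeline, here's a minimal sketch of driving OBS Studio over the obs-websocket plugin using the community obs-websocket-py client (v4 protocol, default port 4444; the host and password are placeholders, and this is not Flipside's actual integration code):

```python
import time

from obswebsocket import obsws, requests

# Placeholder host/port/password; obs-websocket must be installed in OBS.
ws = obsws("localhost", 4444, "secret")
ws.connect()

ws.call(requests.StartRecording())  # roll the 2D video capture
time.sleep(5)                       # stand-in for the length of the take
ws.call(requests.StopRecording())

ws.disconnect()
```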

To ensure OBS Studio is ready, we click the Connect button and wait for the OBS Status to say Connected.

When we click Record, the show starts and the actor performs while the director cuts camera angles. Either the actor or the director can control the teleprompter and slideshow, depending on what works best for the team members.

When a take is good, the director or the actor can click the star icon on that take to keep track of the good ones for later.

So that's the process of producing an episode of Earth From Up Here in Flipside. As you can see, Flipside cuts out much of the time it takes to produce an animated show, and enables the talent to focus on what they do best.

 

Working with live actors in VR - Part 2 of 3

- by John Luxford

Part 2: Lessons for actors

By Lauren Cochrane, Improviser & Writer - Bucko

Welcome to part 2 of our blog post series on working with live actors in VR (click here to read part 1). This part is about working as an actor in VR, and is a guest post courtesy of our friends at Bucko (aka Genefur and 2B from Super Secret Science Island).


Now listen here!

Improv is and always will be about listening. Connecting to your scene partner requires communication both verbally and physically. In VR improv, the same is true, but you will need to listen in a whole new way.

You won’t have the same physical connection as you would in a live theatre setting, so it’s imperative that you take your time and speak and move with intention. It’s actually quite a gift for an improviser to have the opportunity to play in a format that slows you down and makes you come at a scene from a completely new perspective.

Yes, you will absolutely react to and communicate with the characters’ movements (both your own and your scene partner’s), but the movements themselves are different. For example: if I am in a regular live scene and I put my hand to my ear to listen to something in the distance, that’s a pretty simple gesture I can make to communicate an offer, and I know my scene partner will follow along. In VR, my character’s body and limitations are different than my own.

Same scenario as above: Genefur’s ear is about half a foot away from my head because the character’s head is wider than mine. So I, as the actor, have to think about how to move my body and puppet Genefur to put her hand to her ear, to communicate the same offer of listening to something in the distance.

This adds a few extra seconds to my offer, and both scene partners have to be aware of this added time. If you are just talking and flailing around, you will miss things and the audience won’t be able to immerse themselves in the world. So it takes a little while to get used to the timing and new reaction process. Practice and play.


Space, man!

It’s important to be aware of your virtual surroundings and space, so you don’t bump into things you can’t feel. Using the monitors inside the VR scene is really important. You will want to get familiar with the environments and props to know where they end and you begin.

Yes, it can be funny to use the world and hide inside it (like when Genefur hid inside the fridge to scare 2B), but it can take away the reality if you walk through the table or through each other when you’re not supposed to be able to. It’s just like in live improv: if I set up the scene and indicated that “this is where the grand piano is, it’s big and beautiful and right there” and then walked through it, it would dispel the reality I worked so hard to create. The same goes for VR. It will take the viewer out of the magic, and then the “oops” becomes the focus rather than the scene.

Trust me, I know what I’m doing…

Trust is still conveyed between virtual avatars, even though you’re acting against the other person's character and not their real self. If you are playing with someone you are used to playing with, it will make the VR scene process MUCH easier. It’s a little funny to get used to, but knowing that the person you are playing with is right there with you in the scene, while at the same time being in a room down the hall - it’s kind of a new sensation.

Your brain will honestly believe you can reach out and physically touch your partner. Aaron and I usually high five to connect and ground ourselves together in live scenes. And without talking about it, we naturally did that between takes in VR scenes.

When you come out of VR and realize this person your brain swore was beside you isn’t, it’s a very cool/weird/eerie/amazing feeling. It’s a feeling that is probably easiest to process if you are super comfortable with your scene partner. It’s a new level of trust. As an improviser, that is quite the experience!


Ahem… Is this thing on?

Lack of an audience can take a little bit of adjustment. Live improv is pretty clear with its audience communication. You, as the improviser, will know if the audience is digging what you are putting out by their laughter, their clapping - even just the vibe of the room. You can feel and see if the audience is enjoying themselves and connecting.

In VR, you won’t have that, which can take a little getting used to. You have to trust the same instincts and timing as you would in a live theatre, but be okay without the instant payoff of a room laughing and giggling with you. You’re a seasoned improviser. You got this. Trust that it’s working, and you can go back and try again if it’s not.

Cheat to the camera. Cheat to the audience: Same diff’

In live improv, you will constantly be told “Cheat to the audience!” They need to be able to see what you are doing, what you are holding (mimed or real) and what you are trying to do with it. It’s the absolute same in VR improv.

The cool thing about props in VR scenes is that they can add even more to the world than you could ever have in live scenes. (Ahem: Super Science Pencil - YES!) So if you use or make something, be sure to check the monitors to see that its purpose is being communicated.

So if you drew a sword and are now using it, make sure the cameras and your scene partner can see it clearly. Otherwise, you will have a break in communication. If that happens, scroll up and re-read this again. It’s all there, my friends. :)

 

Working with live actors in VR - Part 1 of 3

- by John Luxford

Part 1: General lessons learned

By John Luxford, CTO & Co-founder - Flipside

With Flipside’s first two show productions in full swing, we've now been through a number of production days with live actors working inside the platform. We learned a ton as a result, and wanted to share what we learned with you.

This first post explores some of the more general lessons we learned that have helped streamline our productions and helped us empower our actors to do the best job they can.

Get your processes down

Processes need to be honed, but they also need to be documented. These are living documents that will evolve rapidly, but you won’t be able to iterate on them as fast if you don’t have them written down to begin with.

We currently have documentation covering:

  • How to set up the physical studio and equipment
  • The recording process
  • Best practices (where a lot of this series comes from)
  • Known issues and workarounds
  • Show-specific notes for each production

These act as checklists to make sure we don’t miss a step that could cost us time or, worse, a usable end result.

Because Flipside is still in beta, our “known issues and workarounds” document is critical. Its purpose is to provide quick actionables we can turn to when an issue arises, without having to find a creative solution on the spot. Not having a workaround ready can quickly eat into your production time.

Keep the actors in VR between takes

At first, there were little things that we would have to reset between takes, and early on some of these even required restarting the app, which doesn’t help the actors stay in the right frame of mind. Context switching hurts creativity, and our goal isn’t just to be a virtual studio, but to use that opportunity to eliminate as much of the friction that goes into show production as possible.

So we iterated on ways of reducing the time between takes as much as possible. We now have a process that is impressively automated, with one person manning the camera switcher and director tools, while the actors are free to concentrate their attention on what they do best.


Build tools to help actors rehearse on their own

Actors need to learn and get into their characters, how they move, how they talk. They also need to get comfortable acting with a VR headset on. One request we got was a simple mirror scene, so the actors could practice their parts while seeing themselves from the front, side profile, and back all at the same time. Actors can now hop in there and see exactly how their movements translate to their virtual counterparts.

Give the actors clear virtual directions

The actors need to know when something is about to change, or where they should be standing and facing to be in frame. For example, we added virtual marks for each camera position, which update prior to the next camera change so the actors can know to move into place or turn if needed for the next shot.

This can also be as simple as counting down to “Action!” in VR when the director clicks record instead of starting recording immediately after pressing the button. These little things add up to make a more intuitive experience for everyone.

Fake what you can to make characters feel alive, but don’t go for realism yet

This means we put a lot of work into making eye contact feel right, and blinking feel natural, because we don’t have eye tracking available in consumer VR headsets just yet.
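To make the blinking half of that concrete, here's a hypothetical sketch of the kind of randomized blink timing that reads as more alive than a fixed interval; the numbers and function names are illustrative, not Flipside's actual tuning:

```python
import random

# Humans blink roughly every 2-6 seconds at irregular intervals, so a
# randomized timer feels more natural than a metronomic one.


def next_blink_delay() -> float:
    return random.uniform(2.0, 6.0)


def blink_schedule(duration_s: float):
    """Yield blink timestamps over a take of the given length."""
    t = next_blink_delay()
    while t < duration_s:
        yield t
        t += next_blink_delay()


print([round(t, 1) for t in blink_schedule(30.0)])
```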

We also devised our own system for more natural neck and spine movement, as well as arms that emulate more traditional animation techniques, emphasizing style over accuracy, since today's full-body inverse kinematics options still don't feel quite natural. And the closer you get to a character feeling alive, the more you risk falling into the uncanny valley.

The more you play into the strengths of the medium, the more the quality of the content can shine. Counter-intuitively, the better things get, the more noticeable the remaining issues become.


Give the actors maximum expressiveness

We quickly realized that even with lip syncing and natural eye movements, the avatar faces felt dead. To solve this, we created an expression system, controlled with the joystick on the actor's motion controller, that lets them express four distinct emotions and also blend between them (smoothly transitioning from happy to upset, for example, while blending naturally with the lip syncing).

With a little practice, these expressions can become reflexive actions for the actors, giving them a new level of expressive control as they embody their characters.
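To make the idea concrete, here's a hypothetical sketch of how a joystick position might map to blend weights across four expressions; the emotion names, layout, and math are illustrative, not Flipside's actual mapping:

```python
import math

# Four emotions at the cardinal stick directions, blended by direction
# and deflection. Hypothetical layout, not Flipside's actual mapping.
EMOTIONS = {"happy": (0, 1), "upset": (0, -1), "angry": (-1, 0), "surprised": (1, 0)}


def expression_weights(x: float, y: float) -> dict:
    """Map stick position (x, y in [-1, 1]) to per-emotion blend weights."""
    deflection = min(1.0, math.hypot(x, y))
    if deflection < 1e-6:
        return {name: 0.0 for name in EMOTIONS}  # neutral face at rest
    weights = {}
    for name, (ax, ay) in EMOTIONS.items():
        # Cosine similarity with each emotion's axis, clamped to >= 0,
        # scaled by how far the stick is pushed.
        alignment = max(0.0, (x * ax + y * ay) / deflection)
        weights[name] = alignment * deflection
    return weights


# Pushing up-and-right halfway blends happy with surprised:
print(expression_weights(0.35, 0.35))
```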

Avoid all causes of motion sickness

There are lots of unsolved problems in VR, most notably locomotion without causing motion sickness. But there are other, subtler causes of motion sickness too, which can include anything that creates even slight disorientation.


Image source: techradar.com

One of the strangest examples we encountered in Flipside was in our preview monitors (which are just flat virtual screens for the actors to see the 2D output). We found that there was a perceived parallax in the preview monitors which caused a tiny amount of motion sickness over time. Nothing crazy, but present nonetheless. The solution we came up with was to flip the video on the preview screens horizontally. This had the effect of making any on-screen text appear backwards, but eliminated the perceived parallax which slowly caused discomfort for the actors.
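The fix itself is just a horizontal mirror wherever the preview frame is drawn. As an illustration with the frame held as a numpy image array (how Flipside applies it internally isn't shown here):

```python
import numpy as np


def mirror_preview(frame: np.ndarray) -> np.ndarray:
    """Flip a video frame horizontally (height x width x channels).

    Mirroring the preview removes the perceived parallax described
    above, at the cost of on-screen text reading backwards.
    """
    return frame[:, ::-1]


# Example: a dummy 720p RGB frame.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
assert mirror_preview(frame).shape == frame.shape
```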

The reason this is so critical is that actors are likely spending prolonged periods of time in the virtual sets, doing several takes before they get it just right, or doing batches of episodes in a single shoot. Anything that causes discomfort can potentially cut your shoot short in an unpleasant way.

These are some of the more general lessons we took away from working hands-on with live actors in a virtual world. They’ve helped us hone our vision, and Flipside is already way better because of it.

Stay tuned for the next post in this 3-part series on live acting in virtual reality. We have lots more to share! And if you're a content creator, make sure to sign up for early access to Flipside Studio!

 

How Magic's Secrets Apply to Virtual Narratives

- by John Luxford

By John Luxford, CTO & Co-founder - Flipside

Storytelling in virtual reality is about directing attention and engaging via interaction with the user. In film, there is the frame to direct attention, but in VR the user can look away when you didn't intend for them to and miss a key plot point or instruction.

Here, I'd like to explore how Teller's secrets that magicians use to manipulate audiences can be applied to build stronger narratives in VR.

Note: This post uses examples from common VR experiences, so it may contain spoilers if you haven't played them. I'll do my best to keep the examples simpler and relatively spoiler-free.

1. Exploit pattern recognition

Repetition builds expectation, which will draw the user's attention back to the repeating action. For example, Accounting repeatedly uses the act of putting on a virtual headset to move to the next part of the narrative, and the miniature lawyers keep popping out of the briefcase in the courtroom scene.

Accounting VR - Virtual Reality Device

If we look to create common metaphors and repetition around choices the user is given, it helps build familiarity and understanding of their surroundings so they can get lost in the details of the experience itself as they progress through it.

2. Make the secret a lot more trouble than the trick seems worth

By making something more elaborate, we can string the audience along through multiple steps, or wow them with the perceived complexity of the task, even if it involves just two or three elements. In the extreme case, a Rube Goldberg machine comes to mind, whose complexity can easily wow audiences both in and out of VR. Many simpler examples exist in VR due to the perceived physical nature of the medium.

Rube Goldberg Machine

In terms of immersion, this can be as simple as confirming a single interaction via multiple senses (visually via depth cues and believable movement, accurate spatialized audio, and haptic feedback to touch), which tricks the mind into believing the interaction was with a real object. This can lend believability to any interaction, from pressing a button to mimicking the feel of shooting a bow and arrow.

3. It’s hard to think critically if you’re laughing

Probably the simplest but best example of this is a technique that originated in Job Simulator: as soon as you pick up an object, your virtual hand disappears and is replaced by the object itself.

Job Simulator - Hot Sauce

Owlchemy Labs discovered that when you're busy pouring virtual hot sauce on things and laughing at the hot sauce coming out of the virtual bottle, you don't even notice your hand isn't wrapped around the bottle anymore. This has the benefit of avoiding having to make the virtual hand look right when wrapped around a wide variety of objects with varying shapes, which, when done wrong, can highlight the limitations of the experience and break the sense of immersion.

As an interesting aside, we opted not to use this technique in Flipside after much testing, because the perception of the viewer is more important than the perception of the actor in a show, and viewers found it odd that the actors' hands kept disappearing.

4. Keep the trickery outside the frame

The frame, in both stage magic and VR storytelling, is where the user's attention has been drawn. Fortunately in VR, the trickery is often a matter of making something appear or disappear, which can happen in a variety of ways.

You can make something appear right in front of the user to create an element of surprise, or worse, you can make a monster appear beside them when they're not looking to create a jump scare.

To make a character disappear when they're no longer needed, they might walk out of a door or behind an obstruction and can be turned off when safely out of view. Meanwhile the user's attention is directed towards the next part of the experience.

5. To fool the mind, combine at least two tricks

This also relates to the example of combining multiple senses I used earlier, since much of VR's ability to immerse comes from the reinforcement of information across multiple sensory inputs. The eye can't be wrong if the ear heard it too.

But in terms of narrative and progression, combining two tasks or actions can be a great way to reinforce the direction of a user's attention, helping ensure they're seeing the key points along the way.

For example in Rick and Morty: Virtual Rick-ality, the need to use the microverse machine to complete the task of recharging the batteries adds a second layer of interaction to the original goal, which immerses the user in a deeper way.

Rick and Morty: Virtual Rick-ality - Microverse Machine

Bringing physicality into the experience by requiring the user to carry out a task themselves is a way of combining the story element with participation. When the user is busy doing, we can be busy setting up the next steps for them.

6. Nothing fools you better than the lie you tell yourself

To use an example from Flipside: we have a VR episode of Super Secret Science Island in our not-yet-released Flipside viewer app where 2B throws a magic pencil out to the audience, breaking the fourth wall. If the viewer catches the magic pencil, it grows to match the size of the viewer's hand. Most users we've tested with don't even notice it happen, which means they've suspended their disbelief.

In fact, the magical action of throwing something out of the pre-recorded show and into your hand, akin to a character throwing something out of the television and into your living room, doesn't even seem that far fetched as it happens. The pencil appears in your hand as expected, and most users simply begin to draw with it.

Whenever the user can inspect an object for themselves, they'll believe what their senses tell them.

7. If you are given a choice, you believe you have acted freely

The simplest example here isn't exclusive to VR games: any sandbox narrative-style game offers the user a limited set of choices, and as you make them you begin to feel that the game's outcomes are tailored to your individual path, when in reality there are only a handful of final outcomes in the game's sandboxed world.

Some VR examples include Push for Emor, the previously mentioned Rick and Morty: Virtual Rick-ality or the highly-anticipated Fallout 4 VR.

Conclusion

As you can see, there are many parallels between VR narratives and traditional forms of live entertainment like theatre, improv, and magic. Magicians especially have had millennia to hone the craft of manipulating perception, which makes for a rich body of work to learn from as we craft our narratives in this new medium. Once again, what is old becomes new - which might be the oldest trick in the book.

 

Virtual Reality Will Disrupt the Stage

- by Lesley Klassen

By Lesley Klassen, CEO & Co-founder - Flipside

I often hear how VR will enable us to fly around virtual worlds, explore vast virtual spaces, and bring a new reality into the lives of its users. This is somewhat true already, however, VR really can’t achieve all these promises without some locomotion trickery.

Sign up for early access to Flipside!

Locomotion is the technical term for how users move around in virtual reality. When you use desktop VR like the HTC Vive, you can walk around the tracking volume that was mapped out when you set up the Vive. The blue dotted lines below represent the tracked space you can move around in.


Image source: Upload VR

For every step you take in the real world, you take one step in the virtual world. In order to fly, or explore vast worlds, you need to use locomotion techniques that make you feel like you are traveling beyond the tracking volume. For some people, like me, that can feel uncomfortable. But we haven’t explored the limits of the tracked space yet. How much can we achieve in VR without resorting to locomotion tricks?

The tracking volume as a stage

Rather than flying around and exploring vast new virtual worlds, what happens if you think of the tracking volume as a virtual stage? If you see VR from this perspective the opportunities for creators are endless. It is possible to turn every tracking volume in the world into a performance space. This is a complete game changer.

When new technologies are introduced there are winners and losers. Blogging changed the landscape of the newspaper industry forever and virtual reality will disrupt the stage.


The stage plays a huge role in the entertainment industry. Concerts, theatrical plays, musicals, live TV shows, sitcoms, and of course dance all use a stage, on which the show occurs. It’s big business with large viewership.

Who is doing this right now?

The Foo Show

Will Smith (not that Will Smith) co-founded tested.com and started The Foo Show, a talk show produced in VR. He launched its initial episode in early 2016 and is making more.

Gunter’s Universe in VR Chat

If you’ve read Ready Player One you will recognize the name Gunter. The team at Gunter’s Universe have been producing a VR talk show since 2014. They have interviewed over 50 leaders in the VR industry.

Fireside chats in High Fidelity

Philip Rosedale, co-founder of High Fidelity and creator of Second Life, has brought his open-world vision to VR. High Fidelity is an open source platform that allows developers to craft new worlds and places, including VR talk shows and virtual performances.

Reggie Watts and Justin Roiland in AltSpace

AltSpace is a social VR platform. They hold the record for the most concurrent viewers watching a live show in virtual reality, and they were also one of the first to bring celebrity entertainers into VR.

But who’s watching?

Virtual reality is still in its infancy. The only people who use it regularly are innovators and early adopters.


There are two challenges holding back mass adoption. The hardware needs to come down in price and VR needs more content. Seeing the tracking volume as a stage and empowering creators to make more content will help solve the content problem. We will have to wait on the hardware manufacturers to make the hardware more affordable, but fear not, it’s happening. Our future will be filled with amazing, captivating, and engaging VR performances on virtual stages.

 

It's Demo Day at Boost VC

- by John Luxford

Today is Demo Day for Tribe 9 at Boost VC, and we are super excited to be a part of it and proud of the work everyone has done over the past few months. The growth and polish in everyone's products, pitches, and businesses is incredible.

The past week alone has been crazy for us. We announced Flipside Studio in early access as well as the first two shows created with it, Earth From Up Here and Super Secret Science Island. We demoed like mad with Exit Reality at VRLA, and had some great coverage in Tom's Hardware, VR Status, and The VR Base.

Anyway, we better get back to rehearsing and getting our setup looking spiffy for the event. Looking forward to showing off the future of animation and interactive content creation in VR!

 

Introducing Flipside Studio, the fastest way to create animated shows

- by John Luxford

We’ve been hinting at big things for a while now at The Campfire Union, and we’re super excited to finally unveil the first part of those things. Flipside has been our semi-stealth mode project since early last year, and we’re now ready to start opening it up to a wider audience.

Today, we are making Flipside Studio available to a select group of content creators. We need your feedback to shape it into the best product it can be, and it’s time to see what you guys come up with. If you are a content creator and are interested in participating, please sign up here:

What is Flipside Studio?

Flipside Studio transforms your standard PC-based VR setup into a complete virtual TV studio, where you can create live-acted animated shows in real time. Your headset and controllers are the motion capture system as you embody characters on your own virtual sets.

Shows are recorded for both 2D screens as well as VR and AR playback. 2D shows can be viewed on any of today’s screens, and can even be streamed live to Facebook, Twitch, or YouTube.


Our first show: Super Secret Science Island

Today we’re also proud to share the first show made in Flipside Studio, created in collaboration with Winnipeg-based improv comedy duo Bucko Comedy. Super Secret Science Island is an improv comedy show about two failed science experiments, 2B and Genefur, who are stranded on a deserted island. Follow their antics as they look for a way off the island and search for their missing creator, Dr. Whoosh.

Our second show: Earth From Up Here

As if one show wasn’t enough, we’re also launching with a weekly alien news update about what’s happening down here on Earth. The alien newscaster, named Zeeblow Gonzoar, is written and performed by San Francisco-based comedian Jordan Cerminara.

We will be sharing lots more info about the making of Super Secret Science Island and Earth From Up Here, including advice and tips on acting in VR courtesy of Lauren and Aaron of Bucko Comedy, in the weeks to come.

Creators wanted!

If you are a content creator, streamer, writer, actor, producer, or director and would like early access to Flipside Studio for a show of your own, enter your email below to be added to our early access list:

Stay tuned for more Flipside productions in the weeks to come! We’ve only just begun to discover what’s possible in terms of finding new ways of interacting with audiences, and of bringing the impossible to life.

Sincerely,
The Campfire Union

 
