The Animator's Dilemma: Beating the YouTube Algorithm

- by Lesley Klassen

At the end of each year, YouTube creates its Rewind video celebrating the videos, people, music, and memes that hit big. For the first time, YouTube featured animators in its Rewind video.

In fact, animation is really growing on YouTube, and you can see it at conferences like VidCon, which are devoting more and more programming to animation.

You may be saying to yourself, “What’s the big deal? Animations are becoming more popular on YouTube, so what?” It all comes down to the YouTube algorithm and the added challenges animators face compared to live-action content creators.

If you have an hour or more to spare, watch this enlightening video on the YouTube algorithm by the Fine Brothers and MatPat from The Game Theorists. There are some great insights to take from it. Primarily, creators need to post regularly and have videos with longer watch times, preferably over 10 minutes long.

Creating 10 minutes of live-action video is challenging enough, but it’s nowhere near as tough as creating 10 minutes of animated content on a regular basis. That’s damn near impossible. Animation takes a long time.

YouTube made it even harder for animators with the recent changes to its monetization policy. You now need 1,000 subscribers and at least 4,000 hours of viewership over the last 12 months to qualify for the YouTube Partner Program. This means animators need to create content 10 minutes or longer on a frequent basis, preferably 2-3 times a week, and build a subscriber base of 1,000 before they can even turn on monetization. For many animators, the question becomes: why bother?


Channel Frederator represents over 3,000 animators on YouTube and provides services that help YouTubers market their animations. They felt the need to respond to the new YouTube monetization policy. "Channel Frederator Network has decided to allow members of the network, who have been kicked out of YouTube's Partnership Program, to stay and use our tools & resources to help them reach new audiences," states Director of Networks Kenneth Ash.

Animators really do have a dilemma, especially those up-and-coming channels that haven't reached monetization. That's the thing about animators: they are so passionate about animating that they will persist for the love of the art. But persistence won’t change the YouTube algorithm; technology, however, can help.

Flipside Studio is a real-time animation studio. We simulate a film set or TV studio inside virtual reality, allowing an animator to use filmmaking techniques along with real-time motion capture to make animated content. Because it’s real-time, it takes about the same amount of time to make a Flipside animation as it does to record a live-action video. Flipside makes it easier to produce longer animations more frequently, so animators can start competing against live-action creators for subscribers and watch time.

There are multiple strategies animators are currently using to keep pace with the demands of the YouTube algorithm, and we know Flipside can help:

  • Interstitial content
    Many animators release monthly, which is not optimal. Interstitial content can be created with Flipside and released in between the larger monthly releases to keep a channel freshly updated.
  • Add to an already existing animation format
    Augmenting an already produced show with new elements made in Flipside will speed up the process. For example, Liam the Leprechaun is using Flipside to introduce his gaming channels. 
  • Create new animations
    Use Flipside to make a new show that satisfies the YouTube algorithm by taking advantage of the real-time nature of Flipside. FunPack is an example of this.
  • Live stream an animated show
    Real-time animation is great for live streaming.  Each stream can be captured and re-edited too.  Take advantage of connecting with a live audience while making content well suited for YouTube. 

We believe that real-time animation technology will play a big role in the future of animation, especially on competitive platforms like YouTube, where algorithms play a role in surfacing content. Animators have it tough, but new technologies like Flipside are emerging to level the playing field. Sign up for our private Alpha to give Flipside a try.

 

Flipside Alpha Update #5

- by John Luxford

This week's update has a number of features we're really excited to share, so I'll jump right in!

More new characters! Meet Jess, Phillip, and Meg

Jess, Phillip, and Meg round out our set of default human characters, and they couldn't be happier to join the others in Flipside to help you make your shows.

Texture fixes on imported props

We were having issues with textures not loading properly on imported custom props, which hampered the usefulness of our custom prop importer. We've made some big improvements that should make the importer much more reliable.

Major improvements to our character importing process

The Flipside Creator Tools got a massive overhaul with this update that we've been working on for the past few weeks. In addition to fixing some issues with shader importing, there are three big areas of improvement:

1. The workflow has been made much simpler, with a new panel placed front and center that walks you through each step in the process.

2. You no longer have to match Flipside's exact list of blend shapes for facial expressions and lip syncing, because the Creator Tools now let you map our list to yours, including skipping missing blend shapes and mapping one of ours to multiple blend shapes on your characters (see the sketch after this list). This means less time tweaking your characters in Blender before bringing them into the Creator Tools.

3. You can now hit Play in the Unity editor with your character's scene loaded, then hop into VR to see your character standing in front of you and check for scale and other issues.
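
To give a concrete sense of how the mapping in step 2 might work, here's a minimal Python sketch. The blend shape names, the empty-list-means-skip convention, and the apply_expression helper are illustrative assumptions for this post, not the actual Creator Tools implementation.

```python
# Minimal sketch: map one tool's blend shape list onto a character's own
# blend shapes. Names and structure are illustrative assumptions, not the
# actual Flipside Creator Tools implementation.

# Each source shape maps to zero or more target shapes on the character.
# An empty list means the character doesn't have that shape, so it's skipped.
BLEND_SHAPE_MAP = {
    "smile":      ["mouthSmileLeft", "mouthSmileRight"],  # one-to-many
    "jaw_open":   ["jawOpen"],                            # one-to-one
    "brow_raise": [],                                     # missing on this character
}

def apply_expression(weights, set_shape):
    """Forward source-shape weights (0..1) to the character's own blend shapes."""
    for source_shape, weight in weights.items():
        for target_shape in BLEND_SHAPE_MAP.get(source_shape, []):
            set_shape(target_shape, weight)

# Example: drive a smile at 80% strength on whatever shapes the character defines.
apply_expression({"smile": 0.8}, set_shape=lambda name, w: print(name, w))
```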

This is just the start, and we have lots more planned to make character creation faster and more fun.

Roadmap to 1.0

I mentioned we were working on a public roadmap in the last update post, and that's ready now too! Head over to our Roadmap page to see what we're currently working on, what's already completed, and what is coming up on the road to Flipside Studio 1.0. There's also a Roadmap link in the Creator Community so it's always just a click away from the discussions.

Misc changes & fixes

  • Scaling the set in Set Builder mode now goes down to 25% size and up to 1:1 scale, making it easier to place big objects quickly.
  • The puppet was able to grab itself, causing a puppet inception bug that was neat but ultimately still a bug. That's been fixed too.
  • Improved colliders and interaction points on props. Still lots of work to be done on improving prop interactions, but they should interact with each other a bit better now (e.g., setting an object on the desk).
  • We've enabled Oculus Dash support for our Rift users.

Try it out, then head over to the Creator Community and let us know what you think!

 

Flipside Alpha Update #4

- by John Luxford

Starting with last week's update, we decided to try for weekly updates as we approach Flipside Studio 1.0.

With that in mind, each update will be smaller and mainly focused on these areas:

  • Bug fixes
  • Stability
  • Performance
  • Usability
  • Finishing incomplete features

We're also working on a public roadmap to share with the community, so everyone can better understand where we're going, not just for 1.0 but for future versions as well.

Meet Ruth and Steve, our newest avatars!

Ruth and Steve (internal names only, since Flipside characters don't have visible names) are the newest members of our built-in character set. They're an elderly couple who just don't understand what all the fuss is about with this new technology.

Highlighting objects

One big usability change in this update is that objects you hover over will now highlight if they're grabbable. We've noticed that a lot of newcomers to VR don't reach out far enough to grab things, so this is meant to provide a visual cue that you're within reach. It should also help you know which object you're grabbing when several objects are close together.
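
As a rough illustration of the kind of within-reach check that could drive a highlight like this, here's a small Python sketch. The reach distance, the object data, and the highlighted_object helper are made up for illustration; this isn't Flipside's actual interaction code.

```python
import math

# Rough sketch of a "highlight if grabbable and within reach" check.
# The reach distance and object data are illustrative assumptions,
# not Flipside's actual interaction code.
REACH_METERS = 0.12  # how close the hand must be to an object's grab point

def highlighted_object(hand_pos, objects):
    """Return the closest grabbable object within reach of the hand, or None."""
    in_reach = [
        (math.dist(hand_pos, obj["grab_point"]), obj)
        for obj in objects
        if obj["grabbable"] and math.dist(hand_pos, obj["grab_point"]) <= REACH_METERS
    ]
    return min(in_reach, key=lambda pair: pair[0])[1] if in_reach else None

# Example: two props on a desk, with only the mug within reach of the hand.
props = [
    {"name": "mug",    "grabbable": True, "grab_point": (0.05, 1.00, 0.30)},
    {"name": "pencil", "grabbable": True, "grab_point": (0.50, 1.00, 0.30)},
]
print(highlighted_object((0.0, 1.0, 0.3), props))  # -> the mug
```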

Bug fixes

  • Hand interactions could sometimes get flipped on the Vive controllers (but not anymore!)
  • Fixed more bugs in the character importing process
  • Minor visual fixes

As always, the updates should happen automatically via Oculus Home or Steam. Looking forward to seeing your feedback and creations in the Flipside Creator Community!

 

Flipside Alpha Update #3

- by John Luxford

Hot on the heels of update #2, we have a fresh batch of bug fixes for our Flipside creators. You should see the latest version automatically updated on Steam or Oculus Home.

Slideshow video issues fixed

The slideshow had some issues playing video files, which should now be fixed. For users that still have issues, see our Installing Flipside help page for a link to download additional video codecs from Microsoft.

Eye movement fixes

In the process of getting our Flipside Creator Tools ready for everyone to import their own characters, we ran into some snags that left our characters' eyes less expressive than they used to be. Eyes should once again look naturally at points of interest, which we call eye targets. This includes other characters as well as things like the coffee mug, which the eyes will focus on briefly as the cup is picked up.
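
For the curious, here's a loose Python sketch of how an eye-target system might pick what to look at. The recency bonus, the target data, and the pick_eye_target helper are assumptions for illustration only, not Flipside's actual implementation.

```python
import math

# Loose sketch of choosing an "eye target" from nearby points of interest.
# The recency bonus, weighting, and data are illustrative assumptions only.
def pick_eye_target(head_pos, targets, now):
    """Prefer targets that were just activated (e.g. a mug being picked up),
    otherwise look at the nearest point of interest."""
    def score(target):
        recency_bonus = 2.0 if now - target["activated_at"] < 1.5 else 0.0
        return recency_bonus - math.dist(head_pos, target["position"])
    return max(targets, key=score) if targets else None

targets = [
    {"name": "other_character", "position": (1.5, 1.6, 0.0), "activated_at": 0.0},
    {"name": "coffee_mug",      "position": (0.3, 1.1, 0.4), "activated_at": 9.2},
]
# The mug was picked up a moment ago, so the eyes glance at it briefly.
print(pick_eye_target((0.0, 1.6, 0.0), targets, now=10.0)["name"])  # -> coffee_mug
```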

Magic pencil default colour is now black

It used to be white. Now it's black. Watching how people are using Flipside, this was a small change that should have a big positive impact :)

Unity upgrade and misc fixes

There were several other behind-the-scenes fixes, including updating the project and Creator Tools to Unity version 2017.3.1p1. If you're using the Creator Tools to import your own characters, you'll need to ensure that you're using that version of Unity or you may run into issues like your character's textures showing up wrong. You can find the correct version of Unity on their Patch Releases page.

As always, please post any fixes, feature suggestions, or things you create over on the Creator Community.

 

Flipside Alpha Update #2

- by John Luxford

We're excited to share our second big update to the Flipside Alpha! Existing users should see the latest version automatically updated on Steam or Oculus Home.

This update features too many little improvements to name them all, but here are the highlights:

In-app tutorials

We spent a lot of time designing an in-app tutorial system to help new users get started and to make sure existing users have a clear overview of Flipside's controls and interface.

Audio synchronization fix

A performance optimization we made in the last update caused audio playback to be off by a small but noticeable amount. We're happy to say this is now completely fixed.

Blue screen

When you have lots of green on a set or in your characters, it can be hard to key them out correctly using a green screen. We added a blue screen option to the skies so you now have a choice of green or blue for keying out your backgrounds.

Slideshow supports Dropbox links

Dropbox links are now automatically converted to the direct photo or video link, so you can paste Dropbox links into the slideshow and everything will just work.
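
For a sense of what that conversion involves, here's a small Python sketch of one common way to turn a Dropbox share link into a direct link: forcing the dl=1 query parameter. This shows the general technique rather than Flipside's exact code.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Sketch of one common way to make a Dropbox share link serve the file
# directly: force the dl=1 query parameter. Illustrative only.
def to_direct_dropbox_link(url):
    parts = urlparse(url)
    if "dropbox.com" not in parts.netloc:
        return url  # leave non-Dropbox links untouched
    query = dict(parse_qsl(parts.query))
    query["dl"] = "1"  # dl=0 shows the preview page; dl=1 serves the file itself
    return urlunparse(parts._replace(query=urlencode(query)))

print(to_direct_dropbox_link("https://www.dropbox.com/s/abc123/photo.jpg?dl=0"))
# -> https://www.dropbox.com/s/abc123/photo.jpg?dl=1
```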

New and improved hands

We completely redesigned your hands so it's easier to press the right button and interact with objects.

Many bug fixes and improvements

Some bug fix highlights:

  • Thinner lines on the magic pencil make it easier to draw finer details
  • Numerous fixes to the custom character importing process
  • Improvements to our HTC Vive Tracker support for full body tracking
  • Improvements to the teleporter, puppet tool, and calibration process
  • Improvements to the UI, camera system, and director tools
 

The Making of the Interface for Flipside Medium Post

- by Lesley Klassen

Today we posted an article on Medium about the making of the interface for Flipside. The process was a team effort. We explored many ideas, and now, as we ready Flipside for public release on early access for both the HTC Vive and the Oculus Rift, we wanted to share the process and give people a sneak peek at our palette interface.

 

Flipside Alpha Update #1

- by John Luxford

Our first big update to the Flipside Alpha is out! Alpha users will see the option to update on the Flipside Steam page and in the Oculus Home app (or if you have auto-updates enabled, you should automatically have the latest version).

Since this is a big update with lots of improvements, let's start with the highlights first:

Highlights

Simplified and consolidated user interface

Based on your feedback, we made several changes to the user interface:

No more wrist menu

This felt too video game-like and also had accessibility issues for single-handed use.

As a result, the following user interface changes have been made:

The teleporter has been moved to your right hand joystick/thumbpad

This falls in line with many other VR apps and games, and should make it easier for users to pick up.

The camera and puppet tools have been moved to your waist

We created a utility belt (more like floating holsters) by your waist where you can always grab these common tools.

The main menu has been moved to the bottom of the palettes

The palettes have also been unified to act like a single menu system for the whole app.


Menus now use a more app-like colour scheme

This helps distinguish the Flipside user interface from show elements like props and sets.

Wrist watch stop button while recording

To stop recording, you can press the usual A/X button on Oculus Rift or the Application Menu button on HTC Vive, but you'll also see a wrist watch appear on your left wrist while recording is underway, with a stop button on it. We decided to keep that bit of skeuomorphism :)

Character improvements

Import your own custom characters!

We've released the first public release of the Flipside Creator Tools, which enable you to import your own custom characters from 3D models, complete with full-body movement, hand animations, facial expressions, lip syncing, and natural eye movement. Now you can be anything you can imagine in Flipside!

Height calibration

We've added a calibration button on the underside of the Characters palette. It measures and stores your height, shoulder height, and arm span, and uses these to provide more accurate motion capture.
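
As a rough sketch of the kind of measurements a calibration step like this might take, here's some illustrative Python. The device positions, proportions, and the calibrate helper are assumptions, not Flipside's actual calibration code.

```python
import math

# Rough sketch of measuring a user from tracked device positions while they
# stand with arms held straight out to the sides. Positions and proportions
# are illustrative assumptions, not Flipside's actual calibration code.
def calibrate(hmd_pos, left_hand_pos, right_hand_pos, floor_y=0.0):
    height = hmd_pos[1] - floor_y                          # headset height above the floor
    arm_span = math.dist(left_hand_pos, right_hand_pos)    # hand-to-hand distance
    shoulder_height = (left_hand_pos[1] + right_hand_pos[1]) / 2  # arms held level
    return {"height": height, "shoulder_height": shoulder_height, "arm_span": arm_span}

print(calibrate(hmd_pos=(0.0, 1.68, 0.0),
                left_hand_pos=(-0.85, 1.40, 0.0),
                right_hand_pos=(0.85, 1.40, 0.0)))
```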

Foot planting improvements

The first alpha release had a bug where your character's feet would lift too easily off the floor. The calibration as well as some other character changes have made big improvements to how the feet feel and connect with the floor.

Improved neck movement

Looking up, down and all around doesn't cause your character's body to move nearly as much, which helps convey body language that much better.

HTC Vive Tracker support

Flipside now supports both two- and three-Vive Tracker configurations, tracking either both feet, or both feet plus your waist. This opens up whole new possibilities for physical acting in Flipside.

Camera improvements

Handheld camera streaming fix

The handheld camera had a bug where it wasn't projecting to the 2D output, which made it impossible to capture handheld shots. Now whenever you grab the handheld camera from your side, it will become the active camera.

Non-selfie mode fix

The handheld camera now defaults to facing outward. A bug previously had it defaulting to selfie mode; facing outward makes the camera much more useful for quickly capturing the right shots for your shows.

Steadicam smoothing effect

We've improved the handheld camera's steadicam smoothing effect, helping you capture more stable footage of your shows.
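
For readers curious what a smoothing effect like this typically involves, here's a minimal Python sketch of frame-rate independent exponential smoothing of a camera position, a common technique behind steadicam-style effects. The half-life constant is an illustrative assumption, not Flipside's actual value.

```python
import math

# Minimal sketch of frame-rate independent exponential smoothing, a common
# technique behind "steadicam" effects. The half-life is an illustrative
# assumption, not Flipside's actual value.
def smooth(previous, target, dt, half_life=0.15):
    """Move `previous` toward `target`, closing half the remaining gap
    every `half_life` seconds regardless of frame rate."""
    alpha = 1.0 - math.pow(0.5, dt / half_life)
    return tuple(p + (t - p) * alpha for p, t in zip(previous, target))

# Example: the smoothed camera lags slightly behind a jittery handheld position.
camera = (0.0, 1.6, 0.0)
for hand_pos in [(0.02, 1.61, 0.0), (0.05, 1.59, 0.01), (0.04, 1.62, 0.0)]:
    camera = smooth(camera, hand_pos, dt=1 / 90)  # 90 Hz headset refresh
    print(camera)
```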

Other fixes

  • Adjusted the weight of all built-in props to feel more natural.
  • The puppet's inactive arm feels more natural.

We can't wait to hear your feedback on these changes and see what you guys make in Flipside!

 

That's a wrap! Where Flipside has been in 2017

- by John Luxford

This year started with the decision to focus exclusively on Flipside as a company. That was a hard decision because up until that point we were a bootstrapped company relying on service-based income to survive. It was even harder because it meant deciding to cancel our then-imminent plans to release Lost Cities on the Oculus Rift. But far and away it was the right decision.

Sign up for early access!

Our first show

We were just starting the art production on our first Flipside show, Super Secret Science Island, in collaboration with the super creative comedy duo Bucko. Super Secret Science Island is an improv comedy set on a deserted island which really stretched our multiplayer and avatar capabilities. It also taught us reams and reams about what actors need to perform well in a virtual environment (see our 3-part acting in VR series).


Maximum acceleration

We had also just been accepted into Tribe 9 of Boost VC, a startup accelerator focused on frontier tech companies like us working in areas like VR, AR, AI, and Blockchain. Boost VC believing in our vision was all the proof we needed that we made the right move. Throughout the program, we made some amazing relationships and connections, learned a metric ton, and moved our product forward by leaps and bounds.

While at Boost VC, we were connected with San Francisco comedian Jordan Cerminara, who became the writer and actor in our second Flipside show, Earth From Up Here. This show made extensive use of our camera system, our teleprompter, and our slideshow for delivering SNL Weekend Update-style news. In the show, Jordan plays Zeblo Gonzor, an alien newscaster who makes jokes about how crazy Earthly news must seem from the outside.

Learning and growth

Having produced a complete season of two shows, we went back to the drawing board and determined what it would take to provide the same experience for creators who could work on their own, without our help troubleshooting issues on the fly. We talked to lots of creators, and really honed our vision for what Flipside 1.0 ought to be.

We also demoed Flipside at a ton of events, from All Access to VidCon, grew into our very own office space from our previous co-working spaces, and grew to a team of seven. Compared to the year before, having a whole team working on a single shared vision has been amazing, to say the least.

Early access, at long last

And now, just in time for the holidays, we're sending our first Flipside Early Access release out into the wild to our first group of beta testers, warts and all.

They say if you're not at least a little embarrassed about showing your app to the world, you've waited too long. I wouldn't say we're embarrassed, because we're all immensely proud of the work and creativity that's gone into this release, but there's a list a mile long of things we can't wait to fix or add in.

2017 has been the craziest year yet for us, and ending it by getting Flipside into the hands of its first users feels like exactly how it should end. I expect 2018 to be even crazier, with more beta updates, a budding new creator community to grow, a wider public release and more content in the works.

We'd like to end with a huge THANK YOU to everyone who has been along for this ride and supported us in any way this past year. Flipside is the most ambitious and creative thing I think any of us have attempted to make, and it wouldn't be where it is today without you.

And another huge THANK YOU to our creator community who have waited patiently for us to get Flipside into your hands. The desire to help people share their stories has been at the heart of our company from before the beginning, and we can't wait to see what all of you come up with!

Sincerely,
The Flipside Team

PS. Have a safe and warm holiday, and an inspired new year!

 

We brought the xkcd web comic into VR!

- by John Luxford


By Rachael Hosein (CCO / Co-founder) & John Luxford (CTO / Co-founder)

The Winnipeg Winter Game Jam was this past weekend, which conveniently overlapped with the itch.io xkcd Game Jam, so we chose to make something for both.

The result is xkcd vr, a virtual reality experience that lets you be all the characters from the xkcd web comic in VR using the same motion capture tech found in Flipside.

Concept

The first step was to see what being a stick figure avatar felt like in VR, which, it turns out, is ridiculously fun! From there we wanted to let people make their own comic cells with speech bubbles and props and save them as images that matched the xkcd style.

We did have to deviate from the style in some places, like adding outlines to our speech bubbles because without outlines they were harder to grab and place. But overall, we're pretty proud of how well we were able to match the look and feel.

Features

Here are the features we managed to finish over the course of the jam:

  • Be one of 8 xkcd characters (VR as mocap)
  • Choose from 12 different props
  • Make speech bubbles with our VRKeys virtual keyboard
  • Compose wide, regular or narrow comic cells
  • Switch between light and dark themes
  • Take screenshots to make your own comics
  • Switch the on-screen view between the comic cell and a mirror of the in-VR perspective


This was a super fun project that we'd love to incorporate pieces of into Flipside proper. Imagine making your own animated shows as the xkcd characters in a web comic world. How cool is that?

You can download xkcd vr for Oculus Rift and HTC Vive here. We hope you enjoy it as much as we enjoyed making it, and feel free to reach out - we'd love to hear what you think!

 

The XR Generation

- by John Luxford

“The changes are dynamic and take place in real time. The show reconfigures itself dynamically depending upon what happens moment to moment… It’s a smart play.”
– Neal Stephenson, The Diamond Age


By John Luxford, CTO & Cofounder - Flipside

We started Flipside out of a shared passion for using technology to empower people to be creative and to tell their stories. We love building creation tools, and our mission is to enable a new generation of creativity using VR and AR, or as people have been dubbing the collective Virtual/Augmented/Mixed Reality combo, just XR.

When we started Flipside, we reflected a lot about where we saw these technologies going. We wondered: what will kids growing up in a time when AR headsets are commonplace expect from their apps?

If we can understand this, we can avoid building things that seem novel today but won’t have staying power. We call this target user The XR Generation.

Some key insights emerged from these reflections:

Real-time 3D will succeed 2D content

Just like our kids don't share our passion for the movies and shows we loved growing up, the next generations will prefer content made for their generation.

We're just starting to see the potential of 3D content today, but this content consumption pattern led us to the realization that as display resolutions improve and miniaturization lets hardware reach a consumer-friendly level of style, 3D content will overtake 2D content at some point in the future. At that point, kids will all want to wear XR glasses, because everyone else will be doing it and they won't want to miss out on sharing that experience.

2D content will still be a part of that experience, but it will simply become a virtual flat surface in a larger 3D context. And the reason users will prefer real-time rendering over pre-rendered content like stereoscopic 360 video is because real-time content can be interacted with. A 2D screen or even a 360 video just can't compete on that front. And as interactions become increasingly physical at the same time, the level of engagement will be profoundly different than today's touchscreen world.

Video games are today's 3D content rendered to a 2D screen, and the immersion is lacking. The content never feels like it's part of your world, and you never feel truly transported to another world, because there's a flat piece of glass always in the way. High resolution VR and AR will enable true immersion and physical interaction with the games and entertainment people consume in the future, and today's video games will seem antiquated by comparison, just like our favourite movies growing up seem to kids today.

Social is not a feature, it's basic infrastructure

The XR Generation will expect that they’ll be able to invite their friends to join them in any experience worth their time, and that it will be a rich and expressive shared experience. 

Avatars are the tattoos of the future, except ever changing and adapting with you.

Interactivity will be an expectation, too

The video game industry is already bigger than the movie industry, and games like Night in the Woods keep showing us how interactive storytelling can tell deeply personal and human stories while giving players the agency to explore the game's world, go where they choose, and feel a stronger affinity with the characters they become while playing.

Like Neal Stephenson’s vision of “ractives” in The Diamond Age, we see the line between performer and viewer blurring until they’re almost indistinguishable. Not every piece of content will necessarily be fully interactive, but new forms of interactivity will emerge that haven't even been imagined yet.

VR and AR are inherently creative mediums

Users already expect some degree of agency in their virtual worlds, and they will feel a need to participate in the creation of the impossible things that they are going to experience. Humans are born curious and unafraid to express their creativity, and that creativity is the key to the process of discovery and learning about the world.

When our entertainment blurs the line between real and virtual, it will empower our creativity.

If you can think it, why can't it become virtually real? And with advancements in haptic feedback, it may feel just as good as the real thing, too.

The metaverse is a decoy

The idea of a single metaverse that everyone visits sounds great in sci-fi novels, but doesn't quite add up in practice.

There will be countless virtual worlds, not just one. But we can only interact with so many people at one time, and the intimacy of interaction is what gives it much of its inherent value.

We anticipate that there will be hugely popular metaverse-like apps, but no one app will be able to satisfy everyone's creative and entertainment needs.

Individual games will run outside of that metaverse, even if you launch them from inside of it and end up back there later, like Steam or Oculus Home today. With AR, there won't be a need for a metaverse at all, just for pieces of a metaverse like a unified avatar system.

For these pieces that make up a metaverse, standards will likely emerge just like we've seen on the web: your avatar will go with you from world to world and place to place, and there will be an operating system that organizes and launches your XR apps, much like we have now, gluing the various standards together into a whole. But that operating system won't be where users spend most of their time.

Where does all of this lead us?

The key to realizing this long-term vision is to build the features necessary to bridge the gap between what is possible today, and what will become possible as the technology matures. This means providing real value to creators and viewers now, and building the future out in careful steps, which leads us to the following axioms.

Axiom 1: Render in real-time

360 video is going to get much better, but it still has inherent limitations, like the viewer being stuck at a fixed point in the scene, and it will never be fully interactive in the way a real-time rendered scene can be. Lightfield technology may get us closer to interactivity, but it has its own limitations too, and it's still years away from hitting the market.

Today's real-time rendering is also limited in its ability to render truly lifelike scenes, but this is improving rapidly and won't be a problem in the future we're talking about. Nothing but real-time rendering can provide users with an immersive as well as interactive experience, allowing them to affect its outcomes.

Interactivity is the superpower of real-time rendering that other formats just can't compete with.

We're focusing on real-time now, because it aligns perfectly with the way we see the XR content of the future being consumed.

Axiom 2: Craft from the user out, not the world in

We are not building the metaverse. We are intentionally building a specific show production and viewing app for actors and viewers, centered on their needs.

This means that actors need to have the tools to help them act, and viewers need to have the tools to engage with the content. It's our job to take care of the rest, which should largely remain unseen to both sides.

Axiom 3: Make content, including live content, fast to produce

Live shows and single-take recordings make production faster today, and allow for real-time engagement.

The easier we make it to produce content, the more content can be produced. That provides creators with a tangible benefit today, and helps us get to a stage where the audience can jump inside of, and become part of, the show itself.

Axiom 4: Be social from the start

We started Flipside as a multiplayer experience from day one. This helps us in achieving axiom 3, because multiple users can act together in real-time, without the need to jump between characters over a series of takes.

It also helps us push ourselves when it comes to the expressiveness we want in our avatars. It's one thing to act a part and watch it back, but it's another to see someone in front of you, and assess in real-time whether their expressions are being sufficiently conveyed.

We must breathe life into our avatars by capturing the essence of the person behind them.

Axiom 5: Produce a wide spectrum of content

Since VR and AR are new mediums, we know that the best way to accelerate the pace of discovery is to bring as wide of a variety of content creators into them as possible. Because Flipside is a social show production tool, users can craft simple shareables or more elaborate live and recorded productions like comedies, dramas, and game shows.

Axiom 6: Interact with the audience

An engaged audience comes back.

Our social interactivity framework allows each show to have a game element or custom activity that creates unique levels of engagement on a per-show basis.

While these are simple interactions in the beginning, they will become increasingly varied and rich over time as more creators use Flipside to create new worlds, stories, and experiences. This axiom comes more from live performance and theatre than from television or movies, which we explore in further detail in Les' post Virtual Reality Will Disrupt the Stage.

Axiom 7: Share your learning

From the beginning, we understood the value of onboarding teams and scaling production.

Together, we are building the future of entertainment, and we’ve only just scratched the surface of what’s possible. We envision Flipside as nothing less than the technology that powers interactive entertainment of the future, something that empowers millions of creators to reach billions of viewers, not that those distinctions even make sense to the XR Generation to come.

If you want to join us on this creative journey, make sure to sign up for early access to Flipside.

 
