Storytelling in the round – experiments in 360 video

Recently, Chris presented a workshop at a conference in Greece on what 360 video can bring to digital storytelling. Here he explains his reasons for exploring the technology and some of the conclusions he came to, including how to address some of the technical and artistic challenges.

Argassi beach in Greece at sunrise

It’s not often we get to do overseas work in this job, but last weekend I was at the international digital storytelling conference in Greece, running a workshop on digital storytelling using 360 video.

I’m going to hold my hands up at the start of this and admit that I’ve not been a fan of VR and AR technologies. There are some really strong use cases in education and entertainment but “mixed realities” are one of those things that are often touted as a “transformative technology”, able somehow to radically change the way we “do digital” if we’d only give it a chance. My inner sceptic feels uncomfortable with this.

But sometimes you get called out on your assumptions, which is what my colleague John Sumpter did in Leicester a few months ago, and I realised I was trying to argue a position without ever having properly engaged with the technology.

That’s why I was in Greece presenting the workshop, having been through the process from start to finish. The aim was not to convert myself to the technology, nor to try to debunk it, but rather to keep an open mind.

Why 360 video and storytelling?

I’ve been a digital storytelling practitioner for over 10 years now, so combining something I knew well with something I didn’t seemed the easiest way to explore a new technology.

I wanted to see whether 360 video was a good fit for the sort of personalised storytelling we do at Jisc as part of our workshops and consultancy. Usually we’d just be using still images and voiceover to create a simple video. This different medium would involve a different production process and maybe create a different sort of audience experience.

I’ve spent the last month, with the help of John, Zac, Suhad Aljundi and Matt Ramirez, creating some proofs of concept.

There’s probably scope for some “How To” posts in all this but for now I’ll stick with the big picture stuff.

The first one, Tides, was made using 360 images captured using Google Street View on an iPhone.

Notes on viewing: If you watch this on a desktop browser you can scan round the scene by clicking and dragging the video. If you’re watching on a mobile device, I’d recommend viewing it in the YouTube app, which will let you scan round by pointing your phone in different directions. If you have a VR headset like Google Cardboard, that’ll probably give you a more intense and isolated experience with fewer distractions. Use headphones if you can.

This was edited in Adobe Premiere, complicated and expensive software that handles “monoscopic” VR like this reasonably smoothly. It also allowed me to try out some nifty tricks with overlays, graphics and sound design.

The second one, Bridges, was filmed using a Samsung Gear 360 video camera and edited in the software that comes bundled with the camera. It’s a lot simpler, but more limited in what you can do. For example, you can’t record a narration straight onto the movie. I had to use Audacity to create the finished MP3 and then time the transitions of the different scenes accordingly. Storyboarding was doubly important for this one.

I’m less happy with the second one, I think. One of my errors was filming at too low a resolution. I put them out on Twitter to get some feedback before the conference, which I’ll come back to, but if you have any thoughts on how effective they are, please leave them in the comments section.

These are the main things I learned from doing it…

The technical stuff

Expect hiccups.

The whole experience felt like trying to do video editing 10 years ago. The technology isn’t particularly refined or standardised, and not all software handles all types of 360 media in the same way, so be prepared for some frustration while you get up to speed.

Learn some new terminology.

In order for editing software and websites like Facebook and YouTube to recognise 360 media, it needs to be tagged with the correct metadata. The main thing to know is how to change the “projection” to “equirectangular” or to flag up that the video is “spherical”. Also, the aspect ratio of an equirectangular frame is 2:1, rather than the usual 16:9.
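To make that last point concrete, here’s a quick sanity check I could have run on my stills before uploading. It’s only a sketch, using the Pillow library and a made-up file name, but it catches exports that aren’t the 2:1 shape an equirectangular projection implies:

```python
from PIL import Image  # Pillow: pip install Pillow

# Hypothetical file name for a stitched still exported from the camera app
path = "tides_scene_01.jpg"

with Image.open(path) as img:
    width, height = img.size

# An equirectangular frame wraps 360 degrees horizontally and 180 degrees
# vertically, so its width should be exactly twice its height (2:1).
if width == 2 * height:
    print(f"{path}: {width}x{height} - looks equirectangular (2:1)")
else:
    print(f"{path}: {width}x{height} - not 2:1, check the stitching/export settings")
```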

Altering metadata

There are some simple tools for changing the metadata. I recommend eXifer for editing still images and the Spatial Media Metadata Injector (choose a better name, folks!) for video. Neither needs a lot of technical skill to use and they mostly worked fine. Software like Adobe Premiere lets you change this after the fact, but it does involve going deep into the settings.
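For the curious, this sort of tagging can also be done from the command line. Here’s a rough sketch of how that might look wrapped in Python; the file names are made up, and the exact spatialmedia invocation can vary between versions of Google’s tool, so treat the flags as an assumption and check each project’s documentation first:

```python
import subprocess

# Tag a stitched still as a 360 photo sphere. ExifTool's XMP-GPano group
# holds the Photo Sphere metadata; "equirectangular" is the projection
# most viewers look for. (File name is hypothetical.)
subprocess.run(
    ["exiftool", "-XMP-GPano:ProjectionType=equirectangular", "tides_scene_01.jpg"],
    check=True,
)

# Flag a video as spherical with Google's Spatial Media Metadata Injector
# (https://github.com/google/spatial-media), run as a Python module from a
# checkout of that repository. "-i" injects the metadata and writes a new
# file; the flag may differ in newer versions.
subprocess.run(
    ["python", "spatialmedia", "-i", "bridges_flat.mp4", "bridges_360.mp4"],
    check=True,
)
```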

The Samsung Gear 360 camera

Bump up the resolution.

When you’re using a 360 camera, you’ll want it on the highest resolution possible. 4K video is 3840 x 2160 pixels, which sounds like a lot, and it is when you’re looking at a traditional screen. With 360 video, however, the “screen” is actually a massive sphere with you at the centre. Even the highest quality video will start to look a bit blocky. But raising the resolution will lead to another issue…
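Here’s the back-of-the-envelope sum that convinced me. The 100-degree field of view is my assumption for roughly what a phone or headset shows at any one moment:

```python
# A 4K equirectangular frame spreads 3840 pixels around the full 360 degrees.
frame_width = 3840
pixels_per_degree = frame_width / 360             # about 10.7 px per degree

# Assume the viewer sees roughly a 100-degree slice of the sphere at once
# (this varies by phone and headset).
fov_degrees = 100
visible_pixels = pixels_per_degree * fov_degrees  # about 1,067 px across the view

print(f"{pixels_per_degree:.1f} px per degree")
print(f"~{visible_pixels:.0f} px across a {fov_degrees}-degree view")
# So the slice of a "4K" frame you actually look at is closer to
# standard-definition width than to 4K - hence the blockiness.
```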

Get a powerful editing computer.

Your file sizes will be pretty huge given the resolution of the images involved. A finished 3-minute video can come in at anything up to 2GB, so when you consider all the raw footage you’ve collected, a single project could be 5-10 times that! It’s also heavy on the processor and RAM. My laptop’s fans sounded like a hairdryer for most of the project trying to stop it overheating, even though its spec is quite high.
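If you want to see where those numbers come from, here’s a rough estimate. The 60 Mbit/s bitrate and the 8x raw-footage multiplier are assumptions, just to show the order of magnitude:

```python
def gigabytes(minutes: float, mbit_per_sec: float) -> float:
    """Minutes of footage at a given bitrate, converted to gigabytes."""
    return mbit_per_sec * 60 * minutes / 8 / 1000  # megabits -> megabytes -> GB

bitrate = 60          # assumed 4K 360 recording bitrate, Mbit/s
finished_minutes = 3  # length of the finished story
raw_multiplier = 8    # assumed ratio of footage shot to footage actually used

print(f"Finished video: ~{gigabytes(finished_minutes, bitrate):.1f} GB")                   # ~1.4 GB
print(f"Whole project:  ~{gigabytes(finished_minutes * raw_multiplier, bitrate):.1f} GB")  # ~11 GB
```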

Software is a bit of a compromise at the moment. You can either go for something that’s powerful and flexible but expensive, or something free but rudimentary. There isn’t really a middle ground yet, and there won’t be until we see the likes of iMovie and WeVideo inhabiting this space. That’s the reason I think this sort of media will remain stubbornly outside the mainstream for the time being.

I followed this Adobe tutorial to get the hang of things.

The artistic stuff

This bit was actually more interesting and challenging.

It’s all about taking your time.

Rapid cutting between 360 scenes is horribly disorientating. You don’t have time to explore your environment before the scene changes. Normally, I’d feel nervous about holding a still image on screen for longer than 5 or 6 seconds but with 360 video, it worked better when the scenes lasted 30 seconds or more. This inevitably affects the writing too, requiring a more reflective, meditative mood.

How do you “frame” a shot when there is no “frame”?

You want to create images and video that are engaging and interesting whichever direction someone is looking but it’s also important to direct the viewer’s attention to particular things at certain times to help tell your story. I didn’t wholly succeed here but there’s a whole separate blog post in it! The main thing is to think in terms of location, rather than image. Honestly, it’s a geographer’s dream and a cinematographer’s nightmare!

Sound is important.

The mic in the Samsung camera isn’t that great, especially in windy conditions (see the Bridges story), but I found completely removing the “live” or diegetic sound made the experience too disconcerting. If a car goes past, your brain expects to hear it. For the Tides story there was obviously no “live” sound, so I just used my iPhone’s audio recorder to record some in situ. I did cheat a little here in the edit. See if you can spot any inconsistencies in the Tides story soundtrack…

Filming in 360 is a pain in the butt.

Firstly, finding somewhere discreet and secure to place your camera in public places to get a decent shot takes creativity and a certain amount of chutzpah, and then you have to find somewhere to hide from its all-seeing eye. Fun game: watch the Bridges story again and see if you can work out where I’m hiding in each scene! It’s like a pixellated “Where’s Wally?”! I ended up having to explain myself to passers-by quite a bit. Using Street View to capture still images was easier in this regard, as you can always be behind the camera as it builds its shot, but it’s difficult to keep your shadow or tripod out of shot (see below).

“Stitched” 360 image from the Tides story

Impressions

My overall impression from doing this is that it was worth exploring, and a lot of fun once I’d worked out what I was doing with the technology. I would say that I’m about 65% happy with the results. I think my main problem was taking the form and pace of storytelling I’m used to and trying to shoe-horn the 360 media into that process. I’ll need to do a lot more work to get the storytelling style and medium better aligned.

The viewing experience is really important. Of the people I asked via Twitter, it seemed that those who had viewed it using the click-and-drag version in a desktop browser had the worst experience. It’s too active and repetitive, taking attention away from the story. The mobile app was much better, either for full-screen viewing (an iPad worked best) or through a VR headset.

The whole point of digital storytelling is that it should be open to anyone to create their own without being as much of a video nerd as I am. My eye was always on whether this was a process that gave good results at the end without requiring a lot of complex jiggery-pokery. The Bridges story is probably the closest I came to finding that simple workflow. It’s rough around the edges production-wise, but it was possible to do with no more pain and stress than using iMovie, which bodes well. Facilitating the whole process, from story writing to selecting and filming locations to editing, would still need some hand-holding, though.

Lastly, this isn’t a one-size-fits-all solution. There will be some storytelling projects this will work well with and others where it’s an unnecessary complication. I found that capturing the media and writing the stories helped me get a much better understanding of the places I was in and how I related to them, which might give an indication of the storytelling purposes 360 will work well for.

Going back to the beginning

Reflecting on the mission I set out to complete, I don’t think I’m finished with this yet. There are still lots of things I want to try, like mixing video with still images in 360, using surround sound and so on.

VR and 360 video still aren’t rocking my world entirely, but they seem much more achievable and useful now, in this context at least.

Feedback from others

https://twitter.com/i/moments/1045349367211855872

By Chris Thomson

I'm a Subject Specialist at Jisc focusing on online learning and digital student experience.

