: Guy Gadney

Just attended a really interesting talk by Guy Gadney, CEO and founder of Charisma, hosted by Falmouth University.

I’m dropping this post here to remind me of everything I heard and thought about so I don’t forget anything 😀  

Bringing Natural Language Processing to non-linear storytelling in immersive installations, Broadcast TV, Interactive Graphic Novels, games and potentially VR Animation. 

This app has been in development for a while and has been tried and tested by the BBC and Sky, to name just a few.

Why not test your wits and interact with a serial killer, uncover an unscrupulous villain at a murder mystery dinner table, or even arrange to bump off a sinister uncle called Kevin 😀 (purely by speaking with them.. naturally) 

It’s pretty cool.

BulletProof 2


Check out more about this project here..

Experiences like Bandersnatch led the audience to 4 different alternative endings, whereas Charisma aims to place the viewer inside the story, dealing directly with the characters, using the power of language… which results in a much more immersive, naturalised experience.

Creating truly naturalised, immersive interaction with characters, and in turn authorship of the storyline, is a really tall order, but it’s something that Guy Gadney, at Charisma, is aiming for.

To look at how I might be able to use this app in my current project (and my next one), I’ve made a note of some key take-home points that will help me think about writing content specifically for AI interaction (with Charisma in mind).

  • Language: The scope and specificity of the words the user might use. This reminded me of The Hobbit text adventure (from the ’80s), where one tried to get into the mind of the coder by discovering various action words 😀
  • Nudge: Can the scope be nudged using Barnum statements, emotional responses and subtle movement etc.? (A little like cold reading lol)  
  • Communication: Text or voice control 
  • Narrative Structure: Branching, open, linear… or a mix? 
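To get these points straight in my own head, here is a tiny sketch of how they might fit together: a branching story beat whose authored keyword scope drives the next branch, with a Barnum-style nudge as the fallback when the player’s words fall outside that scope. All names here are hypothetical illustrations, not Charisma’s actual API.

```python
from dataclasses import dataclass, field


@dataclass
class Node:
    """One beat of a branching story (hypothetical structure)."""
    prompt: str
    # Language: the scope of words the author anticipates, each mapped
    # to the id of the next node (the branching narrative structure).
    scope: dict = field(default_factory=dict)
    # Nudge: a Barnum-style line steering the player back into scope.
    nudge: str = "You strike me as someone who notices the small details..."


def respond(node, player_input):
    """Match the player's words against the authored scope.

    Returns (line_to_speak, next_node_id) -- next_node_id is None
    when we nudged instead of branching.
    """
    words = player_input.lower().split()
    for keyword, next_id in node.scope.items():
        if keyword in words:
            return f"(branch: {keyword})", next_id
    # Out of scope: nudge rather than break immersion.
    return node.nudge, None


dinner = Node(
    prompt="The uncle eyes you over the soup. What do you say?",
    scope={"poison": "accuse", "kevin": "smalltalk", "leave": "exit"},
)
```

Even this toy version makes the trade-off clear: the wider the keyword scope, the more natural the conversation feels, but the more branches there are to write.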

I am particularly interested in the use of volumetric capture and how this could be used to communicate emotion through micromovements. Usually I convey a lot of information about a character through their design, props, clothing etc., but I have recently been playing with Mixamo, where it is possible to export a character I have modelled and add various different responses like dancing, shouting etc. Given the blanket use of face masks in this pandemic, it would be interesting to see how we interact with each other without being able to see the lower part of the face (something nurses have mentioned in the news).. how do you convey emotion through means other than the face?

I am going to explore this further. 

I also need to think about how I might be able to integrate Charisma into my Unity projects.

To begin with I’m going to write a short story, 8–9 minutes long, which is set in a room (maybe the mystic interaction at the end of my lunarium project).

I will be coming back to this blog post later to add a few more links from research…