Pushing the boundaries with VR post-production in Unity w/ Randa Dibaje
Hey friends and welcome back to the Alex Makes VR podcast. In today's episode, I am joined by post-production powerhouse Randa Dibaje. She is an XR developer for PwC UK, and in this episode she takes us through the whole post-production process for creating a VR training experience in Unity. Now, this specific project that we talk about in this episode is one that I've talked about in several podcasts now, but specifically we talked at length about it in last week's episode, where I sat down and talked with Louise Lu and Jeremy Dalton from PwC UK about the 'In My Shoes' project. For those of you who haven't listened to that yet, I highly recommend you go listen to it. The 'In My Shoes' project is a diversity and inclusion training project that is being used globally to transform the way employees are trained in diversity and inclusion. It was a really exciting project to be a part of, not just because of the impactful content, but specifically from a technical point of view: this was the most ambitious project I have ever been involved with. Instead of going the safe route and making this a nice, simple 360 video experience, we decided to push the limits of all of our skill sets and of the technology itself. We decided to make this a volumetric video, interactive branching narrative VR experience for Oculus Quest.
This is a full transcription of the podcast episode. Alternatively, you can listen here:
Now for those of you who don't know the technicalities of something like that, this project on paper should have been impossible, close to impossible anyway, but lo and behold, in this episode Randa talks us through how they made it possible. She talks us through all of the challenges that they overcame and all of the intricate details that were considered in order to make this project a success. By the end, we also talk about Unity in general and how you yourself, listener, can get started with the basics of working in a program like Unity, which is so powerful and one of the industry standards for the VR industry. So we talked about loads, and I can't wait for you to listen to this one. If you've got any questions or future episode suggestions, I would love to hear from you at @alexmakesvr on Instagram and Twitter.
Alex: Randa, you are here to demystify all of the complex post-production process that went into the recent volumetric interactive branching narrative piece from PwC - 'In My Shoes'. Randa, welcome to the podcast. Please introduce yourself.
Randa: What an intro, Alex. Well, thank you so much for having me on this podcast. I'm Randa Dibaje. I am an XR/VR/AR developer at PwC, and I work closely with Jeremy and Louise, and we all worked very closely on this project together.
Alex: I got excited, and I've just dived straight in there, because I'm excited to talk to you about this project that we worked on for several months. But for those people that haven't listened to that podcast with Jeremy and Louise, and for those who haven't heard me talk about this project before... the In My Shoes project, why don't you give us a bit of a rundown of what that project is and what it entailed?
Randa: Okay, so this project is about diversity and inclusion, being in the shoes of someone going through micro-aggressions at work. And the main element of this project was making sure that everything looks really realistic. So the characters had to show certain emotions that the user could read properly. The environments had to feel real: making sure that if it's morning, it really feels like morning; if it's evening, it feels like evening. And it's little things like making sure rooms actually look like there are people living in them, people using the actual kitchen and the bedroom. It was so hard making it seem like people really live there. So from the technical side, the focus of the project was making it look realistic, and that's where volumetric capture comes into play, which is what you spoke with Jeremy and Louise about previously. We had a number of options when we were trying to pick what kind of characters we wanted to use in this project. We went from stylised to body scans to volumetric, and we spent a good month trying to figure out the pros and cons of each. With stylised, it was difficult because we're working in a corporate environment, and we need people to really feel like these are real humans. But then with 3D modelled characters or with body scans, there was always this fear that the characters might look creepy. In a lot of the scenes you had a character, and sometimes they blink at you, and maybe they're blinking too fast, or too slow, and you're like - okay, this looks creepy, I can't connect with this.
Yes, so that's how we ended up deciding on volumetric capture, even though we knew there was one big limitation to it, one of the biggest challenges that we'd have to face: the file size of the videos.
Alex: Give people a bit of an indication of how big we're talking with volumetric capture file sizes.
Randa: So if we wanted to go for the best quality, like if we wanted this to look like 8K, with large files and super high quality, really good colours, it would have been, for the entire experience, maybe 40 or 50 gigabytes in size. But we went for 2K resolution in the end, and I think we ended up with 18 gigabytes. So you can imagine that does not fit in one APK, because you're limited to four gigabytes, I think. So we had to create a system where you can load the videos from outside the APK. Now, the challenging part is that because we use enterprise headsets, we push the APKs through the cloud to the headsets, but we can't do that with the videos. So we have to load all the videos manually, plugging each headset into the computer and pushing the videos onto it. And the problem is, if the APK moves off the headset, all these videos are gone. So we have to tiptoe around pushing these onto headsets and make sure we don't remove the wrong files from the headsets. Because if the headsets are with someone on the other side of the world and they open up the application, there are going to be voice-overs, but there are going to be no characters. That's something we're still dealing with as well.
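[The constraint Randa describes boils down to a simple check: the footage can't ship inside the APK, so the app has to verify its external clips before playing. A rough sketch in Python rather than Unity C#; the sizes, function names, and clip filenames are illustrative, not the project's actual code.]

```python
import os

# Illustrative numbers from the episode: ~18 GB of 2K footage,
# against a roughly 4 GB Android APK ceiling.
APK_SIZE_LIMIT_GB = 4
TOTAL_VIDEO_GB = 18

def fits_in_apk(total_gb, limit_gb=APK_SIZE_LIMIT_GB):
    """True if the video payload could ship inside the APK itself."""
    return total_gb <= limit_gb

def missing_clips(required, external_dir):
    """Return the clips the app expects but cannot find in external storage.

    If any are missing, the user would hear voice-overs with no characters
    (the failure mode described above), so the app should refuse to start.
    """
    present = set(os.listdir(external_dir))
    return [clip for clip in required if clip not in present]
```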
Alex: Wow, that is fascinating. Because part of the process of this project for me was understanding all of these limitations right up front, to be able to write and direct it properly. But then, luckily for me, as soon as we finished the shoot it was kind of hands-off, over to you guys. So I was stressed out of my mind up to the point where production was completed. I didn't even know that there was all of this going on in the background. I mean, I always knew that it was going to be a technically complicated process getting it to work, especially on Quest. But I had no idea that you were having to come up with all these different solutions, and even now you're having to almost work around them, which is fascinating. It just shows what an innovative project it actually is. And I feel like other people or other organisations might have been put off by that. But that's one of the beautiful things about you guys and your team at PwC: you're like, 'Oh, that's impossible? We'll see about that.'
Which we're gonna do, exactly. So that's really fascinating. You've given a really nice overview of what the project is and what some of the challenges were along the way. But let's rewind, and take me through from before we actually captured the actors in the volumetric studio. What was going on? Talk me through the actual process, step by step: who was involved, and what were the elements that had to come together from a post-production standpoint?
Randa: Okay. So obviously, you were involved in the beginning steps of this project, and we all knew that one of the limitations with volumetric capture was the fact that we only had a two-metre radius to work with. So say that we wanted some characters to walk into the room: we had to make sure the environment was designed so the user didn't pick up on the fact that they'd actually just appeared out of nowhere, rather than walked in through a door. So before we did the capture, we worked with a concept artist, and we passed the script to him. And we said, okay, we need characters to walk from here to here, we need this character to stand here but we don't want to see their feet, little things like this. And eventually we came up with some prototypes of the environments, where we drew the two-metre radius around, and we could see at what point the characters were going to walk. Now, the good thing with working with CGI as well is that even if afterwards someone made a mistake during the volumetric capture period, where we're capturing the characters, and it didn't really match the environment we were making, we could still change the environment. So that was always a good thing to know: we can still change where the wall is going to start and where it's going to end.
So, I was there on the day of the shoot, and we had an Oculus headset, and we had everyone kind of envision the environment, what it would look like, and the space that they could walk in, and then we showed that to the actors as well, and they were able to imagine where they were going to come through. Also, when they were sitting on a stool and they had to pretend to work on a laptop, even that was so challenging: you're thinking, oh my god, I hope their hands are in the right position, otherwise we'll have to make this table shorter or taller. There were a lot of things like that which were very challenging, and you just hope it's going to go well, you just hope this is the right path we're going down, because you only find out once you get the footage and put it into the environments.
Alex: Well, and that's kind of it, because even just then... do you remember we had these meetings where, for example, we'd have to tell the actors: you can't lean on the chair, because if your elbow moves even a centimetre it's going to go through the chair, and that we maybe can't do much about in post, so make sure you keep your arms in. And there were so many different things like that. Like you say, although you had loads of control in post, like move a wall here or move a door there so that they're not walking through things, there were still these things where we weren't sure, until it was over and you could bring it into the environment, whether it was actually going to work. Which was a fascinating way to work, but yeah, sorry, carry on.
Randa: Yeah, well, actually there are still some things now that I see and others don't see: there are hands going through tables and stuff. And then after capturing everything, we had to start working on the environments, but without the actual footage. So we had 3D modellers and 3D artists, and we had an environment artist work with us on creating the environments and the materials. A lot of the rooms, design-wise, were very bespoke. I remember walking around the office space and just thinking, what is it that makes an office seem like an office? It's kind of like interior design in a way, because I'm thinking, okay, what is it that goes with a whiteboard, what makes the curtain look better, all these things are factors. And also ensuring, for example, that if you have a kitchen and a bedroom in the same space, they need to have the same wall colours, they need to have the same curtains. If you're trying to show two different offices, you don't want them to look the same; you have to make sure they look very different, so that people understand quickly that you're in another office at this point. So yes, it was communicating these ideas, working with the 3D artists to create these environments. And then obviously, when we got the volumetric videos, there were some things we realised didn't work, or we had to change the environments a little bit.
Alex: Talk to me about what specific things you might have to change about an environment, and why.
Randa: It was things like... there was a kitchen scene where the character had to walk out of the kitchen, and she started walking into the wall, and you could kind of see that she was walking back into the kitchen again. Little things like that. We panicked a little bit at first, but we realised we could just mask her out, so we didn't have to make much of a change in that sense. But also, for example, in the kitchen where the character is doing something over the stove, we had to move the stove a little bit, to a different location, because if she turns around, she's no longer looking at you. So it was small details like that. And there were also some walls that we had to move to make sure you don't see the character just randomly appear from nowhere. Yeah, and it's actually funny, because a lot of users tend to stand up and look around. So if you stand up, you can look to your left, look to your right, and you can see these mistakes.
Alex: That's really funny. Well, that was something that I wanted to chat to you about, and we will come back, because I realise we were going chronologically there, through the different elements. But on that note, I remember one thing that I couldn't wrap my head around, and still, to be quite frank, can't wrap my head around. With 360, I know exactly where the user's head is going to be. So I know that I can tell my actors to look at a specific lens on the camera, and I know that that is going to be direct eye contact with my audience. But when it came to volumetric, the thing that I couldn't fathom is that we were in this volume, the studio that we were filming in, with all these cameras everywhere, and I was like, well, surely we just pick one camera and then that is the audience. But no, it wasn't like that; the eye line constantly moved. And then, taking that a step further, your audience, because it's a six degrees of freedom piece, technically they could, if they wanted, walk around the whole scene. How do you guys in post-production think about that? In terms of... we've got these actors who are supposed to be talking to you as the main character, but how do you deal with that if someone does start to stand up and move about and walk around? How did you think about that, I guess?
Randa: So, I mean, we obviously do not advise anyone to stand up with their headset on and just start walking around the room, generally for safety reasons. Yeah, it was literally just making a wall a bit longer, to make sure that even if they stand up and look over, they don't see things. It was also things under the table. Because obviously, what you're seeing is that they're sitting at a table using their laptops, but what you don't see is that they're actually inside their chair, rather than sitting on it. So it was hiding things like that, and that was a major challenge. But in terms of eye line, working with volumetric video, you just have to play the scene, get the actual video to play, and move your head around. There were some scenes where we had three characters, and two of them were filmed together and one of them was filmed separately. So the eye lines were off a little bit, and you just have to find the centre point, find where it feels like all of them could plausibly be looking. And sometimes, you know, when people adjust their Guardian differently, they might end up in a slightly different spot than others, so you're kind of hoping that it is along those lines. But in most of the scenes the eye line was very successful; the director and the producer at the studio did a really good job at making sure the eye line was good enough to make it look like they're really looking at the user in the right position. And the good thing about working in Unity is that you can always just change where the user is, to make sure that it aligns with everything.
I guess you don't have that limitation; you're not forced to stick with a wrong eye line, you can always fix that in post-production. So there's ways around it. There's ways around everything.
Alex: We'll definitely come on to talking about Unity, and then your journey into post-production, because I'm fascinated by that as well. But let's keep on this treadmill of talking about each individual step. So we've talked about the environments, that they were done pre-shoot and then adjusted post-shoot, things like that. So what happens next? You've got the environments vaguely in a place where they're ready, then you've been given the volumetric capture from the studio. What happens next? What's the process of bringing them all together?
Randa: So the next step was putting them into the environment. We were also going back and forth with the studio, because there were several takes that we could pick from, and about what second a clip comes in and what second it goes out. And we were going back and forth with the studio about the colouring, because obviously they're working on a flat screen, and what you see on a flat screen is very different from what you're going to see in the Quest. So, the colouring: was it looking similar to the environment, was it looking realistic? That was the back and forth with the studio. And at the same time, we were working with an environment artist who was creating all the shaders for the environment, because there were a lot of things that were shiny, and things that we wanted to look realistic.
Alex: Explain what a shader is, for those who don't know, including myself.
Randa: It's like a programmed material, if that's the right way to describe it; you kind of program the way you want the material to act. So if you want to make something shiny, that can use a lot of the power and performance of the headset, and you can kind of code your way into making it more optimised and able to work better on the headset. So we had an environment artist working on that for us, and he did a really good job at making things look real. It was little things like the fridge: it just feels like you want to go up and open it and see what's in there. And then on top of that, the environment is obviously very separate from the volumetric video, and bringing them together, and making sure the lighting works for both of them, was a bit of a challenge. Because some lighting could make the character look a bit weird, and some could make the environment look strange, so it was about finding the middle ground between them. So that was basically the next step. We did a lot of testing, going into the headset and making sure it's running smoothly, that things aren't crashing or slowing down or lagging or anything like that. So yeah, that was so cool.
Alex: Because when we actually filmed it in the studio, the lighting is consistent, the same regardless. So is the lighting of the scene all done in post? And do you have to light the characters and the environment separately, or when you get into Unity, are you just casting the same lighting on both?
Randa: Um, yeah, so you cast the same lighting on both. Because even though they have lighting in the studio itself, they're lighting it to get the most out of the details of the actor or actress. But when you bring it into Unity, you might, for example - and this is just a random example, nothing to do with the project - want that character to be in a dark room, but they're quite lit up. So you need to make sure that they're on the same level of lighting, I guess. That's a good way to explain it, I think.
Alex: Yeah, that makes total sense. So you've got the environments in, you've got the characters in, you've made it look beautiful, the environments are as realistic as they can be, so that the photorealism of the characters and the environments isn't jarring and it's all nice and smooth. And then you've got to chuck interactivity on top, because after all, it is a branching narrative. Talk to me about that, because even a linear volumetric video would have been quite the task, especially to run on Quest. Now add on top of that this layer of decision-making. Where does that come into the process? Is that the very last thing, or is that a consideration beforehand? Talk to me about that.
Randa: It was a key consideration throughout, I think, because knowing that there were two branches across three different scenes, it just added more storage, more material we needed to work with. But then, I guess, the challenge for all of us, you included, was getting it to look like a smooth continuation. So you have, for example, scene one and then two decision points, so scene one A and scene one B, and they're all filmed separately, and you need to make sure they continue and look smooth enough for the user to really feel like the scene hasn't been discontinued. Now, it was a difficult thing to deal with in production, and in post-production we had to make sure that when we're going from scene one to scene one A, it doesn't seem like there's a break. So all we really did is put in a fade out and a fade in, and it was really smooth; you really don't feel that they were filmed at separate times, it feels like one is a continuation of the other. I think the interaction was more of a challenge during production than in the post-production part.
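[The branching structure Randa describes can be sketched as a small scene graph that maps each decision to the next separately filmed clip, with a fade hiding every hop. This is an illustrative Python sketch, not the project's Unity code; the scene names and choice labels are made up.]

```python
# Hypothetical branch-narrative graph: each scene maps decision labels
# to the next clip. A fade-out/fade-in hides every hop, since the
# takes were filmed separately.
SCENES = {
    "scene1":  {"a": "scene1a", "b": "scene1b"},
    "scene1a": {},
    "scene1b": {},
}

def playthrough(start, decisions):
    """Walk the graph with the user's decisions, returning clips in play order."""
    order = [start]
    current = start
    for pick in decisions:
        nxt = SCENES[current].get(pick)
        if nxt is None:  # no such choice here: this branch has ended
            break
        order.append(nxt)
        current = nxt
    return order
```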
Alex: That's very true. To give the listeners a bit of an insight, the way we had to do that is by choreographing, yeah, choreographing the action of the actor to end in a kind of neutral pose, so that every time they got to a split, they could resume that exact neutral pose to continue the scene. So there was this natural place the actor could go back to, so that it looked as close as possible to as if we hadn't cut the scene in half. For me, that, and getting the actors to perform against a recorded take of the other actors, those two were the biggest things where I thought, if there are any technical challenges I'm responsible for, those two were it. And it sounds like we managed to pull it off, which is really exciting. But again, like I said, we went into this with all of these massive challenges to overcome anyway to make this project what it was, but then add in those decision points... I mean, we really did go for it.
Randa: Yeah, no, I think we did a good job. I remember, towards the end of the project, I hadn't gone through the entire thing yet, I hadn't watched it from one end to the other. And when I finally watched it, I wanted to cry for two reasons. One of them being that it was really emotional; I didn't realise how powerful the experience was, you really feel like you're in the shoes of someone else. And secondly, I was just so proud that we managed to pull this off. I was like, I can't believe it. I can't believe that the vision we had, not just for the characters but also for the environment, actually came to life. Because I was seeing all these other applications and I was like, oh, the environments might not look that great, they might not look realistic. I was really worried that this was going to fail or something. But really, I think we pulled it off. And you know how we work as a team: whatever is impossible, we're going to go for it. That's exactly what we did. So, yeah, it was worth the tears and the sweat, that one.
Alex: Definitely. And those are the moments where, like you say, halfway through it you think, why did we think this was a good idea? But actually, like you say, there's the impact of it when you see it pulled together, and especially when you start to see people that had nothing to do with the project experience it for the first time. They're not looking at the hand that might be slightly going through a laptop, and they wouldn't know that those three characters are not actually in the same room when they are recording; they don't know any of that stuff. Yeah, that makes it all the more magical when you pull it off. But I want to talk a little bit about... so we've talked through the three key elements, as I see it anyway, of the post-production. Is there anything else apart from sound? Because I do want to get on to sound in a minute. But is there anything else from a visual point of view that you had to consider along the way?
Randa: No, I believe we've covered everything.
Alex: Let's talk about sound, because sound, even just in this conversation, for example, is the last thing that you discuss. But weirdly, the sound makes or breaks a project, right? We can get away with so much visually, but if the sound is off... Especially for a project like this, where you are in the shoes of the main character, for selling that to the audience, sound is so important. So talk me through the way that you worked with Luke, the sound engineer who did the post-production.
Randa: So, I think Louise also worked very closely with him on the sound stuff, and Luke was working with us to put these sounds into the environment and everything. But it was a challenge, because you have the inner voice, and then you also have the sounds from the environment. So how do you make sure that the user understands that it's the inner voice that's talking, and not someone else around you that you can see? It was things like, when the inner voice is on, you have to make sure that all the sounds of the environment are cut out from the scene. We also had the sound separate from the videos, so he was working on the sounds separately, while we had the videos in the environment. So we had to basically find a way to match exactly where the sounds were coming from, making them spatial, and also aligning them with the timing of everything. Because normally people just put the sound in with the video, and you just have the video there and it works seamlessly that way. But we chose to have it separately because we wanted to add an effect: an inner voice sounds very different from how you would have someone else speak. It was also things like, for example, when you're in a lift, what are the things you hear in a lift? You have to hear the door open, you have to hear the lift going up and down. So it was very crucial. And I think music also came into play a lot; sometimes, when you have an emotional or strong scene, adding a bit of music makes it even more emotional and empowering. And I remember small things like covering up mistakes. You know about this: there was a line in the script that we had an issue with, where we said something that we weren't supposed to say.
So coming up with an idea of how we could make sure that didn't appear was also a challenging part. But yeah, I think the sound was amazing; Luke did a really good job there, and it was everything I had envisioned for the environment. It was just perfect.
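[The inner-voice treatment Randa describes - environment sound cut out while the thought plays - is essentially audio ducking. A minimal Python sketch of the gain logic; the function name and dB values are illustrative assumptions, not the production mix.]

```python
def mix_gains(inner_voice_active, duck_db=-60.0):
    """Gain in dB for each source at one moment of the mix.

    While the inner voice speaks, the environment is pulled right down so
    the user reads the voice as a thought, not another person in the room.
    """
    return {
        "inner_voice": 0.0 if inner_voice_active else -120.0,  # silenced when inactive
        "environment": duck_db if inner_voice_active else 0.0,
    }
```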
Alex: That's amazing. Firstly, I totally agree with you about how music elevates, especially, an experience like this that is supposed to have so much emotional impact; even just a little bit of music throughout really elevates it. But I remember, when we were going through the post-production phase - and like I've mentioned to the listeners, I wasn't very close to that, because I had already handed over to you guys at that point - I remember there being a conversation about things like making sure that the sound and the video were loading at the same time, so there weren't any sync issues. So again, was this a byproduct of choosing to create this piece for Quest specifically, or a consideration because audio and video were being treated separately? How did you guys deal with that?
Randa: I wasn't the one that worked most closely with the sound and the videos per se. But it took time to understand, because we had so many different clips. So you have the internal voice, and then you have the environment sounds, and then you have another internal voice, and you're stitching them together to make sure it's the same length as the whole environment, and also making sure that the environment changes scenes after everything has played. It was a very bespoke process of working, I would say. It was understanding, okay, this happens here, so we should probably put this here instead. Yeah, it was quite bespoke; there was no right or wrong way to do it. I think we just went with what we thought would be right to do.
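[That stitching - laying clips end to end and only changing the environment once everything has played - comes down to simple cue arithmetic. A hypothetical sketch; the function and the clip lengths are illustrative.]

```python
def cue_times(clip_lengths):
    """Start time of each clip when they play back-to-back, plus the total.

    The total is the earliest moment the environment can change scene
    without cutting off any audio.
    """
    cues, t = [], 0.0
    for length in clip_lengths:
        cues.append(t)
        t += length
    return cues, t
```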
Alex: Yeah, amazing. And was that using Unity as well? The whole piece, post-production wise, came together in Unity, is that right? Everything?

Randa: Yeah, except I think Luke probably used some other sound tools, but he helped us put them into Unity as well.

Alex: And speaking of Unity, is there a particular reason why? Because, for those listening, in case they are even less educated on the subject than I am, and I'm pretty uneducated when it comes to game engines, the two big ones are Unity and Unreal. Is there a particular reason why Unity was better for this project than Unreal, other than it being the skill set of the people on board?
Randa: It was literally because all of us just use Unity. And I mean, we only use Unity; I don't think we really use Unreal at all.
Alex: And why is that? Explain, because again, I'm totally oblivious to any of this stuff, so it'd be great to understand: what is the difference between those two game engines?
Randa: So from my understanding, they use two different languages, Unity uses C# and Unreal uses C++, so if you're stuck on one, it's unlikely that you'll move to the other one, although some people have expertise in both. But I think the main reason is that we see a lot of the VR stuff more in Unity rather than in Unreal. Don't get me wrong, you can create VR applications in Unreal, but what we're finding is that the majority of mainstream companies and consumers use Unity for VR; the tools are more apparent there. Also, a lot of the time I find that if companies are providing an SDK or something, it's almost always for Unity rather than Unreal. Unreal is more used for things that have a visually pleasing look to them, so a lot of people are now creating movies and stuff using Unreal. If you're looking in that direction, Unreal is good. If you're looking towards the VR stuff, it will be Unity.
Alex: Right. And so for someone who is quite interested in all of this, how do people start to get educated on Unity? How would you suggest people get started with learning this tool?
Randa: Okay, so the good thing about Unity is that it's free, so you can go and download the application and start using it whenever you want. The other good news is that Unity has a lot of free tutorials on getting started: if you want to go into 2D gaming, 3D gaming, VR, AR, it's all there. Unity provides free tutorials for all of that. At the same time, you will find so many YouTubers and YouTube tutorials that will teach you everything you need to know. That's how I started. I literally just created a very simple game, just to understand the physics and what goes on behind that, and the input, like how do I make sure that when the user clicks left, they actually go left. It's little things like that. And from there, you realise, oh, okay, there's so much here. You should come up with some sort of idea and try to create a project out of it. So I do really recommend just getting into it and finding a tutorial that you want to do. Start simple, don't go crazy initially. And when you get the basics, you realise how everything is so much easier, like it's so easy to do eventually.
Alex: That's amazing, I love that. And I love the advice of just getting stuck in and making something as you go along. I feel like I preach the same thing when it comes to 360 video and learning how to direct and learning about frame composition and all that kind of stuff: just get out there and get your hands dirty. Especially because with something like Unity, if you've got a laptop, chances are it's probably going to be at least good enough to run a little something. It might not be able to run super high-end VR making in Unity, if that's even a phrase, but the barrier to entry is low: if you've got a laptop, you can probably start to learn Unity. Whereas on the filmmaking side of things, you obviously have to buy the kit or rent it or whatever. So the barrier is super low. But I love this, because I think that with a project like this, what we've proven is that if things seem impossible, if things seem too out of reach, if you feel like you've got this idea but you just don't feel like there is a way to actualise it, with a tool like Unity you can make all of those things possible, right?
Randa: Mm hmm. Yeah, 100%. Nothing is impossible, because you're just recreating things. If you want to recreate an environment, you literally recreate it in Unity. It almost feels like it's limitless. It becomes limiting when it comes to the hardware, I think, but Unity as a software feels limitless. You can do whatever you want with it, and you can create whatever you want with it.
Alex: That's interesting, and it's a perfect segue into talking about distribution. Because, like you say, this piece would not have been a problem if we were rolling out onto a high-powered PC, a high-end VR headset with unlimited power. We could have done whatever we wanted with this, essentially; as long as we had the processing power and the graphics cards, we were golden. The big bottleneck was rolling out for Quest. You mentioned it a little bit at the beginning, but now that we've given the audience the context of the whole project, let's come back to distributing this and talk through the process. You've built it in Unity, and now what?
Randa: Okay, so I mentioned earlier the problems with the file size of the videos and everything. In order to get the videos onto the headsets, we have to plug into a computer and move these videos into the Quest, and then you can basically run it. Now, the problem we have is that a lot of our headsets are all around the world, and we use an enterprise version of the headset. In order for you to get applications onto that, you have to do it through a cloud: you send them to these headsets through a cloud. You can send the actual application itself, but you cannot send the videos themselves. So what we did initially was just sideload them: we plugged the headsets into computers and dragged and dropped the videos in. But now you have bigger numbers of headsets, and only one or two people to do this. We're talking about 100 or 200 headsets, and eventually there might be more, so we wanted to find a way to remotely download these videos. So we created an APK which can directly download them from the cloud. But for some reason even that's proving difficult, because sometimes it crashes and sometimes it doesn't. We obviously don't want a headset that's in New Zealand, say, where they've just tried to run it and it's crashing right before a workshop they're about to run. So we have to make sure that there are troubleshooting guidelines; we have to warn people, please go in beforehand and make sure the app isn't crashing, and if it is crashing, flag that to us and we'll fix it for you. So there's a lot of ongoing maintenance with this project. Now, if it was not enterprise and it was a consumer version of the headset, or if we were just using it inside the office and not sending it across the world, it would have been much easier.
Because, you know, before we give it to anyone, we like to double-check and make sure that everything is there, so they can go to the workshop and come back. Because it's international, it's quite challenging. But we check things multiple times, and that's how we work as a team.
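The pattern Randa describes, an app whose large video assets are downloaded separately from the cloud and verified before a session, can be sketched in general terms. To be clear, this is not PwC's implementation (which lives in Unity/C# and isn't shown here); it's a minimal Python sketch of the idea, and the file names, URLs, and the `MANIFEST` structure are all hypothetical:

```python
import hashlib
from pathlib import Path
from urllib.request import urlretrieve

# Hypothetical manifest: file name -> (download URL, expected SHA-256 hash).
# In a real deployment the videos would live in a cloud bucket.
MANIFEST = {
    "scene_01.mp4": ("https://example.com/videos/scene_01.mp4",
                     "<sha256 of scene_01.mp4>"),
}


def sha256_of(path: Path) -> str:
    """Hash a file in chunks so multi-gigabyte videos don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def missing_or_corrupt(manifest: dict, video_dir: Path) -> list:
    """Return the files that must be (re)downloaded before a session can run."""
    bad = []
    for name, (_url, expected) in manifest.items():
        path = video_dir / name
        if not path.exists() or sha256_of(path) != expected:
            bad.append(name)
    return bad


def ensure_assets(manifest: dict, video_dir: Path) -> None:
    """Download anything missing, then fail loudly rather than letting the
    user enter an environment with no characters in it."""
    video_dir.mkdir(parents=True, exist_ok=True)
    for name in missing_or_corrupt(manifest, video_dir):
        url, _expected = manifest[name]
        urlretrieve(url, video_dir / name)  # retry/backoff omitted for brevity
    leftover = missing_or_corrupt(manifest, video_dir)
    if leftover:
        raise RuntimeError(f"assets failed verification: {leftover}")
```

The key design point, which matches what Randa describes, is that the launch check runs every time: even if the videos have been wiped from the headset, the app notices and re-downloads rather than silently playing an empty scene.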
Alex: You've gotten over the feeling of being comfortable and bored; now it's all about, what's the next impossible task you're going to have to tackle? But with that, will it become easier? Are you still looking into ways to optimise, or is it literally a case of you've tapped out now? Is this as small as the files are going to get, and you're just going to have to deal with that until we get 5G, for example, which will let you download 50 gigabytes in however many seconds? Until then, is this it, or are you still trying to find ways of optimising this?
Randa: We tried, we tried. We came down to this point and we were like, okay, this is the only way we can make sure that if someone is somewhere remote, and the APKs have literally disappeared off their headset, or someone did something and was asked to remove them, we know for sure that if they run this application, it will still try to download the videos. They won't go into the environment and then find out that there are actually no characters. And we know from our side, and we're sure, that no one's going to go into that environment without the characters being there. That was one of the most important things that we wanted. But no, it does not get any better than this. If we had really fast WiFi... I mean, even I had really, really slow WiFi three months ago, and it would have been impossible to download these videos. So before sending anything, we always make sure that we've downloaded them from our end first. But funnily enough, we found that the office doesn't even have fast enough WiFi to do this.
Alex: If PwC don't have fast enough WiFi, what hope does anyone have? But I think this is fascinating, right? Because this really does remind me of the first days of my experience of getting into VR, when I first started getting into 360 and trying to understand how on earth I was going to crunch a 20 gigabyte MP4 to the point where it would run smoothly and be easy to send, like you say, over WiFi remotely to people to see my 360 projects. And it's so funny that no matter how far we've come, it's still a problem. Obviously, as we're getting more advanced forms of capture, as we're getting more advanced with the utility of VR, we're still bumping up against that friction point between the software and the hardware. Which I find fascinating, because I think that's one thing that, until it's solved... well, not only until it's solved, because we're already making massive strides in this industry, phenomenal strides, and the fact that we keep coming up against these things and innovating around them is why the industry is successful. But I feel like the second that becomes frictionless, we're a rocket ship; there will be nothing that stops us. Because, like you said, you experienced this project having worked so closely on it, so you would have thought that would have dulled the emotional experience, but you still had this very visceral, emotional reaction to watching this piece. And imagine if we had the kind of hardware, with those limitations removed, to give it to thousands of people, and they could all have that experience. That would be amazing.
Randa: Yeah, yeah. But hopefully we can make it work as a team, and we'll make sure that thousands of people will go through what I mentioned.
Alex: The plan is to get it seen by all 22,000 PwC UK employees, right? And that's before you even consider the global impact of this project, right?
Randa: Yeah, yeah. That's when the pressure is on, so we have to make this happen. And as well, it would have been so much easier if we had just gone for the typical 360 film. Because of COVID, we couldn't, and we knew we were going to have to find a common ground and make it seem like this was almost a 360 film, even though it's not. That's how realistic we wanted it to be. So there was even that pressure of having to make sure it fits into this corporate environment and this model for the training that they're doing. It was really important, because at one point we had a conversation about making it stylised, and we were like, a lot of people will not accept this, or they won't really understand it, or they might not feel the connection with the characters. So yeah, it was a crazy challenge that we went through, and we're not even done yet.
Alex: Which is crazy, right? Because I remember Louise and I had our first call about this project last May or June, I think it was. So it's wild that even nearly a year on, this is when the fun just begins, because this is when people start to actually put it into action and it starts to be used in the way it's supposed to be. I just want to quickly rewind, because you said the word stylised a couple of times. I know what that means, but for any listener who might not know what we're talking about when we talk about the difference between stylised versus photorealistic, do you want to clarify and maybe give an example of what something stylised might be?
Randa: So a stylised character is more like what you see in cartoons, for example, where they have big eyes, or a specific style. All the characters have a specific kind of style to their faces. It's like when you're watching a cartoon film and they all have a specific style to how their faces and features look. That's what stylised is.
Alex: Amazing, thank you for clarifying that. Because I think it's interesting, like you say: there is something so powerful about VR, but when you come face to face with another human being in a potentially real environment, that to me is when it levels up. And that's, like you say, why we chose to go down the route of making it volumetric, even though it just about gave us all strokes. That's essentially why we chose it: the connection you get from being face to face with a human is slightly different to being face to face with a characterisation of a human.
Randa: Yeah, there was also the possibility of using 3D scans, I remember, but the face kind of looked creepy, because it's supposed to be a real human standing in front of you, and once they started talking it was really hard to make it seem real. So volumetric was a good choice.
Alex: Talk to me about that, because I don't think we went into too much detail about it in the podcast I did with Jeremy and Louise. What would that have looked like? If we had done just a 3D scan of a character, would that have been animated in Unity? How would that have differed?
Randa: So with a 3D scan, you just get the texture around the character. What we would have had to do is bring in a 3D animator; actually, with the many characters we had, we would probably have needed multiple animators at that point. They would need to create a skeleton for the character, and it's really hard sometimes to create a skeleton out of a 3D scan. The next step would have been to go into a motion capture studio, and also get a rig for the face, because you have facial movements, smiles, blinks, and all of these need to be captured. After you leave the motion capture studio, you have the motion capture data, which then needs to be cleaned up. Now, maybe 10 years ago that would have been a very difficult, almost impossible task. Nowadays it's become much easier, but you still get some glitches, and with the timeframe we had, it seemed almost impossible to go down that route as well. You do still have the power to change things from how they were when you filmed them: you can have the animators change where a hand is or where it goes, for example. But these things take time, and you need a bigger team to work on that, a lot more animators. And then you obviously bring that into the environment in Unity and need to make sure it all works together. And again, the facial emotions were very, very important, and with that kind of character model they would have been quite a challenge for the number of characters we have.
Alex: Yeah, definitely. And like you say, it's interesting, because one of the downsides of volumetric is similar to 360 film: if it's not in the camera, it doesn't exist, and you can't change it after the fact. Whereas with motion capture, maybe you'd have a bit more control, but it comes along with so many more complications, taxing the post-production process, which wouldn't have been timeline or budget friendly. Really interesting. Well, I feel so lucky to have gotten to work with you on this project, Randa, seriously, it's been an absolute pleasure. Let the people know what even brought you into xR, because obviously I've known you since you started with PwC, but tell the people a little bit about how you got into this kind of crazy stuff.
Randa: So how I got into it was during my master's degree. I was doing a lot of 3D stuff, and I had no experience in 3D, and I absolutely loved it. I was like, oh, this is amazing, you can create whatever you want, you can recreate everything, you can do whatever you want, basically. And we had a Unity module. Initially I was like, oh, it's coding, I don't want to do coding again, because I did my undergrad in coding. But then we had to create a game, and for some reason I started to really enjoy it. I was like, oh, this is really nice. I created this runner game where you run, and it's about health, picking up water and making sure your heart rate stays at the right level. Then during the year we were taught about the things you can do with Unity, and one of them was VR and AR. I loved the idea of doing VR and AR, but my biggest worry was that I'm not really a typical gamer. I didn't want to work in the gaming industry, because I don't really fit there, and I wanted to do more of the kind of work we're doing now. This, to me, was where I wanted to be: to benefit society, to benefit different industries, especially things like the medical field and the construction field. The amount of training you can give people off-site is insane, even for things where people would be in dangerous situations; you don't want to put them in there to train them, but you can do it via VR. Those were the things that really made me want to get into xR. When I was looking for jobs and still applying to these gaming companies, I knew deep down that I was probably not the perfect candidate for them, because I don't game a lot. I do game sometimes.
But I'm not a typical gamer; it's not what I do every day. So then I found this role at PwC and I got it, so I'm here now, doing the things that I want to be doing. And this project especially, things like this are exactly what I came in for.
Alex: So, making an impact with your work. Because when we first met and you were just starting, you were having to dip your toes in every different aspect across the board, from stitching 360, to, like you say, rolling out apps in Unity, to AR, to 2D capture placed in AR applications. What's been the best thing that you've enjoyed learning about since being a part of it?
Randa: I really enjoy every part of it. There was nothing that I didn't enjoy. Especially now, it's so nice knowing that I can have multiple different types of projects. I'm not just working on CGI, I'm not just working on 360 film. I know that one project might be AR and the other might be something else entirely; I never thought I would be working with volumetric capture. When I first joined the team, a lot of the work we were doing was 360 film, so for the opportunity to come up where we're doing CGI and all these different things, and trying things out, was very exciting. I find it more exciting than anything; I enjoy this part of the job a lot.
Alex: That's amazing, the variety of stuff. And to put you on the spot: as a consumer or in a work context, what's been your favourite piece of VR you've ever experienced yourself?
Randa: Recently I've really been enjoying Beat Saber. I never thought I would say this, but it's fun. And the social parts as well: recently me and my friend have been experimenting a lot, going into Rec Room and stuff, and we've just been having a blast. You'd never think about this, but going into these rooms and listening to people's conversations... there was this party we went into where they put on a whole talent show. It was really funny.
Alex: I remember when the pandemic first started and I went into AltspaceVR for the first time and had a snowball rave. And I was like, what is this? This is wild; this is a whole side of the industry I'd never seen before.
Randa: Yeah. I think everything I experience in VR is different. I don't think I really have a favourite piece; it comes in periods. Right now I'm really enjoying Beat Saber; maybe in a month's time I'll find something else.
Alex: Beat Saber is obviously such a front runner in the VR industry, especially in terms of getting people into it, and I think that's for such a good reason, because no matter who I talk to, no matter how long they've been in the industry, Beat Saber is usually in their top three experiences. Same with myself: as a consumer, the only thing that has gotten me into a VR headset, just as a general person, is Beat Saber. It's so well done. I'll be interested to see what the next Beat Saber is, whatever that application might be that just gets everyone into it. Because even my brothers, who do work with me on VR projects but aren't massively into it, and they're all gamers, so I'm surprised it hasn't caught their attention more, even they ended up buying headsets purely for Beat Saber.
Well, friend, I hope you enjoyed listening to this conversation as much as I enjoyed having it. Randa, thank you so much again for joining me, giving us your time and sharing your expertise on all of this. And friend listening, I want to hear from you. I want to hear what you thought of this episode. Are you inspired to go and download Unity and start educating yourself? Are there any other questions or subjects on post-production that you'd be curious about me covering in future episodes? Reach out and let me know @alexmakesvr on Instagram and Twitter. And as I mentioned, I've included all of the links to Randa's social media
in the description. Please go and show her some love and support for giving us her time today. For now though, friend, enjoy your day wherever you are in the world, and I look forward to speaking to you in the next one.
Listen to the Alex Makes VR podcast here
Subscribe to the newsletter here
Follow Alex on Instagram here