If you represent a brand, an entertainment studio, or a creative agency interested in augmented reality, or if you have an AR project in mind, you should check out this webinar: Take Your AR Project from Imagination to Reality.
Join an engaging discussion with Unity's Dan Miller and Daniel Usedom, along with Will Humphrey of Sugar Creative, the studio behind Dr. Seuss's ABC – An Amazing AR Alphabet! and other immersive experiences, to learn about:
- The challenges of creating a truly authentic, real-world experience
- What brands, studios, and agencies should consider before developing in augmented reality
- How to solve problems of unpredictable locations, device support, accuracy, and scale
[DAN] Hello, everyone, and welcome. We're really excited. We'll go ahead and get started here in a couple of minutes. This is a webinar, so everyone will be on mute. But there is a chat channel if you want to talk to us, and there's a question and answer panel if you want to ask us questions live. We are here live, this is not pre-recorded. So we'll get started here momentarily, and welcome.
[MUSIC] Welcome, welcome. Good morning, good afternoon, good night, depending on where you're joining us from. I'm UK-based, I'm just at the end of my day, and Dan, you're just waking up and filling yourself with coffee. Yup, pretty excited. Good start to the morning. I see a lot more people shuffling in. Thanks a lot for coming. Just as a reminder, this is a Zoom webinar. You're not going to be able to talk verbally, but you can talk to us in the chat if you want to let us know about something. If you have any questions throughout the presentation, you can ask them in the Q&A panel. I will be monitoring that throughout, and then we should have plenty of time at the end to get to any questions you have in mind. So ask away, we're here live, and we're excited to share lots of fun information, experience, and all the good stuff. We'll be getting started in one minute. I'm seeing "Unity rules" in the chat. Yeah, just going to say it. Pumping this up, I love it. Alright, so let's go ahead and get started. This webinar is: Take Your AR Project from Imagination to Reality. We're all really excited to be here. My name is Dan Miller, I'm a Senior Developer Advocate here at Unity focused on XR. I build a lot of different sample content, do presentations, and used to travel around. And next, we have Daniel. Hey, everybody. Thanks for joining us today. I'm Daniel. I work on the Business Development team for Media and Entertainment here at Unity. My team helps studios, agencies, and brands leverage real-time 3D technology for projects like animation, interactive XR, and location-based digital experiences. From ideation to delivery, we have the products and services to help teams be successful. And now, I want to introduce the star of the show: Will Humphrey, joining us from Sugar.
[WILL] Thanks, Dan, Daniel. Hey, everybody. Yeah, as they said, I'm Will Humphrey. I head up the creative arm at Sugar Creative. We're a small studio with some interesting projects under our belt. There, I'm mostly responsible for innovation: new ways of telling stories, new ways of creating experiences.
[DAN] Alright. So for those who don't know, Sugar Creative is a creation and innovation studio based in the UK. They're one of the industry-leading pioneers in this emerging technology sector. We've actually worked pretty closely with them as we were developing Unity MARS. They've done a lot of work and built a lot of experiences, as you can see now, and looking at their specific use cases and working closely with them throughout development was pivotal.
[DANIEL] It's really great to have you here, Will. We'd love to begin by having you walk us through some of the projects that you've been working on as of late.
[WILL] Yeah, that sounds great. We've done a lot over quite some time. As you would imagine for a studio, it's been quite varied. Some of the work has been more on the straight-up technical end, and some of it's been a lot more immersive. So, the things that you see on screen, just to get the ball rolling: on the left-hand side, you can see an educational-focused portal, an MR portal — there's a doorway, you walk through it in order to learn more about specific areas of history, so we're embodying a sort of explorative learning. On the far right, you've got some of the more experimental stuff we're doing with building-scale occlusion. In the center, we've got the Dr. Seuss's ABC AR app that's currently live. We'll chat more about that later; it will be the star of our show. Alongside that, we've done a whole bunch of work for VR learning. We've done some interesting bits and pieces for medical as well. We've deployed an audio- and machine-learning-based tool to assist the diagnosis of head injuries, which was probably one of the most unusual applications we've seen.
[DAN] All these were made in Unity?
[WILL] Yep, 100%, made in Unity.
[DAN] Awesome. We know that there's obviously some real importance with augmented reality. It's 2020, we're in a unique time, as this technology continues to evolve. And I want to point out that Sugar Creative was part of a consortium in the UK that was awarded £4.1 million by the UK government, specifically to create and demonstrate the potential for technology-driven immersive storytelling on an international stage. I had to read that off there, to make sure I got it correctly. Can you tell us a little bit about that, Will? I mean, it seems like, A) an awesome opportunity, and B) just a lot of fun to continue to explore.
[WILL] Yeah, it's been absolutely amazing to be a part of that. The objective was to explore the thing we already know: everybody in the industry recognizes immersive technologies will be an integral part of the future of entertainment. That's almost a given at this point. The question is, "How? What will the audience be like?" That's where the UK government stepped in and went, "We really want to have stuff in the world that's demonstrating, from a creative angle, to inspire people, and from a commercial angle to say, 'This works. People engage with it, people love it, people want this.'" So that's what we went out... what we're in the process of making now; we're about four or five months away from a go-live date. We had the pleasure before this of working with Aardman Animations, so their Wallace and Gromit characters — you can see the trailer up on the screen for a minute for the Big Fix Up, which is the main project. Those guys are obviously creative leaders and giants in the industry, and it has been absolutely fascinating and inspiring to work with them throughout all of this. Interestingly, we're very much looking at creativity and the technology as the thing which will define entertainment in the future.
[DANIEL] You guys have obviously worked with some big-name clients at Sugar. One of the questions I'm always getting is what sort of business impact augmented reality has, and how can I prove this is a good project to go after? Can you provide a little bit of insight into that area for us?
[WILL] I can certainly give it a shot. The logos you see on screen are a collection of partners we're currently working with. You can see how incredibly broad they are: you've got the entertainment industry, you've got governmental stuff, you've got product-based brands, you've got communication-based brands. The most important thing we've discovered is that irrespective of your industry, there has been an enormous range of applications. All of these sectors have said, "We see a potential. What could that potential look like?" — even when it comes to industries that you don't normally associate with it. They have said, "You know what? This has opened our eyes to the potential. It's a potential not just for creating an unusual thing, but a potential for creating something interesting and meaningful." And as if by magic, a lovely little quote from Susan Brandt, the President of Dr. Seuss Enterprises. She worked incredibly closely with us on Dr. Seuss's ABC AR. When talking to her right at the start of that project, the thing that meant she was 100% on board, and loved the potential, was that she saw it as creating meaningful story and learning experiences for children. She went, "Not just what is it, but what can it do?" So there is very much a focus there on... not "is the technology there," but what is the purpose of the technology in terms of creative imagination.
[DANIEL] I think it's pretty much at the point where there's no doubt that businesses across all industries should at least be evaluating and beginning to embrace AR. It's been around for a while, but it's still relatively new on the scene in a lot of areas. Can you go a little bit deeper into the ways AR is being leveraged across each of these industries and the projects you have worked on?
[WILL] Yeah, no probs. Starting with probably what people would consider the most direct entertainment: with AR, you have the possibility of bringing characters to life. That will always be immensely valuable. We associate the things we see, the entertainment we have, with a degree of imagination. This is another platform. You had radio, you were limited to sound. You had comics, you were limited to paper. You had TV — cool, moving image, big screen, you can get more immersive with it. You had theater, you can get up on stage. With AR, you've got yet another boundary that you can be pushing, another space to work in. You can then put in play components. Your entertainment doesn't have to be dictatorial, doesn't have to be, "I am telling you a story." It can be, "I've made a story, and we're experiencing it together." Actually, that's a huge shift. In terms of some of the more unusual business applications? Construction. We know that digital twinning is a huge component of the construction industry going forward. BIM data is how we interpret all building work. Being able to take all of that and move it into a visual space can be incredibly powerful. If you're product-based, we've had some amazing products here, and some really interesting ones around beers especially, showing provenance, showing purpose. If you are a small craft beer, and you've lovingly sourced all of your materials, you know exactly where all of your grain comes from, you want to be able to tell people about it. And you can't do that with just a label, so how do you enhance that, how do you get that provenance out there? Education — we've touched on it with Seuss, and with the MR portal. But again, you want to take content, you want to show it to someone in a new way; it's that new frontier. A cool example that inspired me was Innerspace, the film from, what, the 1980s? In which you got to see inside the body from a different angle. Now, with MR and AR, you can do this in different ways.
You can take imagined content, tiny content you wouldn't normally see, and put it on the side of a table in a room. That's magical, but it's also very compelling from an educational standpoint. Lastly, medical. You've got diagnosis. We know that data is an integral part of this. But by visualizing that data in relation to the patient, you've got the ability both to demystify it for them, and to create meaningful context, which helps you use it from a diagnostic angle. Incredibly broad, but no matter the industry, we found that combining data and imagination has created meaningful outcomes almost universally.
[DAN] You're talking here about the evolution of storytelling and visualizing data — bringing some of that information into a contextual state, as well as providing a new medium that is fun and exciting, that people are interested in engaging with. Obviously, we're in a unique state in the world right now. How do you see this being reflected in this new landscape for this emerging technology... as people are sheltering in place and things like that?
[WILL] Well, certainly, first from the digital angle. You've got something which already existed. We already knew that how we interpret the world shapes our understanding of it. You take that understanding, you take that context, and you're able to put it into the world. And so you're able to be not more separate with your information, but more unified. We do this as people, and you're able to do it within the thing we're building. In terms of currently — obviously, we're in a pandemic situation, and you have a vast number of people who have had to turn to technology for communications. Personally, I see it as more of a willingness to consider those technologies going forward. You have people who used Zoom for the first time. It's something which has been an industry standard, but a lot more people are engaging with it and going, "Okay, cool. What else can we do?" You have an awareness, and you have a willingness to engage with it. The point that I'm making, mostly, is that you get an upshot of acceleration, of both a need and a desire to bridge that digital gap. To make it more tangible: we've all probably watched some sort of theatery kind of entertainment, virtually, during the last five, six months. That doesn't mean that when we go back to normal we're going to go, "Oh, I'll never go to the theater again" — because it's amazing. It's amazing being there. But say I want to see something that's on stage in New York; it would be impossible for me to get there on short notice. Why shouldn't it come to me? Why can't I be transported there for the performance? You don't have a restriction; you have an expanding of the potential marketplace, because people are more willing to engage with that potential. Does that hit some of what you were hoping to grasp with that?
[DAN] Yeah, I think so. Obviously, things will change, and as you mentioned, it's not just going to go back to normal. I think people are getting more used to, and more expecting, that they get content on demand — they want it when they want it. We've always wanted it, we just couldn't necessarily have it. Stuff like that has been successful because human beings want things. Even if you've only heard something described to you and never been able to see it, you will have imagined it, and you will have wanted to be there. That's why we go home and tell stories — it's human nature. It's why we were painting "and then I attacked the mammoth" onto the wall with some chalk way back when: because we imagined it. If we can bridge that gap and make that imagination the thing, you're giving people what they want. And from an industry view, give people what they want, and that will be incredible.
[DANIEL] I think one thing customers have actually been coming to us with is this idea of COVID-proof experiences — re-imagining what experiences will look like with longer queues, and obviously people trying to be involved from their homes and things like that. It's almost safe to say that this is acting as an accelerator, forcing people to think in new and creative ways about these projects.
[WILL] We've seen, as well as acceleration, a move towards more democratization. We're getting a lot of entertainment brands and entertainment companies, a lot of performance companies... realizing that they were being really exclusive. They were going, "Hold on a minute. We've got fans in rural areas, or disadvantaged areas, who have never been able to see any of the things that we've made. Why on earth are we not making moves to help that? Where's our social responsibility? What can we do?" Not just COVID-proofing it, but also socially engaging much more, which can only be a positive thing.
[DAN] In my role, I'm fortunate that I get to see a lot of different AR experiences. I also try to keep a pulse on the industry and the market, downloading various apps and testing them out. And one thing I find is that they're not always what we'd call natural AR experiences. They don't always feel like they were designed or built around my small San Francisco apartment, or the office, or things like that. Can you tell us a bit about what it means, from an authoring standpoint, to build these more natural AR experiences? How does it feel like it's integrated into your environment when you're actually using the application?
[WILL] The reason why you haven't had many of those is because they didn't exist. We were always stuck with, "Cool, I want AR. What can I put in front of somebody?" There wasn't the ability to create a natural experience. It becomes one-dimensional. If you want nature, you need intelligence. You need to be aware of who you are, where you are, what you're doing. Your cat is a natural thing, because when you put it on the table, it doesn't fall through it. It doesn't run into a wall, mostly. It does this because it knows the environment. If you're building an experience in AR, you want it to feel natural, because that's what will make people want to engage with it, and for that it needs to be aware of its surroundings. If you want interactive, natural-feeling AR, the central conceit is: make it intelligent.
[DAN] Obviously, we're doing a bit of leading here into Unity MARS and what's possible with it, and we'll dive even more into that. But can you talk a bit about devices? There are different devices out there, each with somewhat different technologies. There are also just different individual users, and this idea of accessibility, and how you get users integrated and engaging with the content.
[WILL] Obviously, first, devices. Devices are going to be super duper broad. Unity has always had AR Foundation as a way of unifying between Apple and Android, and that's great in terms of "can you use it." Hardware benchmarking means that we are able to have more things on our devices more of the time. Technology will continue to push forward; we've seen consistently that it becomes more democratized. We can almost say, "Yes, device capability can be handled." It's one of those questions which is a given, as long as you approach it right. The really exciting stuff in natural AR is the individual users — not just "I live in a flat in San Francisco, I live in a hut in Basingstoke," or wherever you're living; that variation you can deal with, because you're interpreting that world space. The really cool stuff is the individual. Why am I different to you? How am I going to interact with it? What abilities or disabilities do I have? We are in an era when neuro-divergence is a central thing... or at least an awareness of it is a central thing for most educated societies. It's something that we build in as a team within the industry. But also, when it comes to our consumers: how do they differ? You can go quite neurological, you can go quite complex and quite psychological; you can look at what a person does when they interact, in terms of how they learn from the thing; you can create feedback loops and machine learning to support them. Can you identify risk-based behavior? If you're creating something where you have loot boxes, where you have... anything which has any sort of gambling component, can you identify people who are at risk and who need support in terms of their behavior? We actually work with a lab of neurological specialists on building toolkits for highlighting at-risk people.
Or, if you want to be more direct, with a very specific known disability: if you have dyslexia, and you have an experience based upon reading, or you have text, why should you not support that? You can when it comes to digital, and when it comes to AR, because you can handle it well. That was a very specific thing that we wanted to include in Dr. Seuss's ABC, because originally it's a book. It's designed to help you learn to read. And if you're there, teaching your child to read, and you are dyslexic, why should we not have a toolkit inside there to help you do that as best you can? All of the text inside it can switch to a dyslexic-friendly font. It will not solve the problem, but it makes it a damn sight easier, and that's worth doing as far as we're concerned. I was slightly rambling, but it's one of those things I get really passionate about, because we are moving away from entertainment being something I'm giving to you and telling you, to being something that we can share together, that embraces the fact that we are inherently different.
[DAN] Yeah, I think it's something, as you mentioned, to be conscious of — realizing there are different levels and different approaches, but that you can make those accommodations. And realizing, too, that every user's experience with this AR content, especially as it gets more intelligent, is unique. If you can think about those and take them into account, I think that's cool.
[DANIEL] I'm constantly talking to people every day — a lot of times, people who are not as technical, who have an idea for augmented reality or some sort of interactive application. What advice do you have for them to get started if they don't have a technical background but want to bring their story to life?
[WILL] It's an interesting one, because I'm lucky enough to work with technologists, and when I don't get something, I can poke them and say, "What am I missing?" But the more you work with people who are able to handle the technology, the more that question goes out the window, and the more it's replaced by something more central, which is: what do you want to imagine? What can you have that you want to turn into reality? If you start with an idea, so long as that idea is consistent with your brand, and so long as it will be meaningful for your consumers, it will be worth doing. There is a point that comes before the technology... it never fails to amuse when a Fiffer-feffer-feff starts wagging his head. You have the question of how it fits. With Dr. Seuss, the things you're seeing on screen are showing the original illustrations. Arguably, Dr. Seuss is the most important illustrator we've had in the last hundred years. His work is defining for so many children, and so many adults as a result. And if you want to take that content and put it into AR, what you mustn't lose is the nature of that illustration, the nature of the animation that was on the page. Now, you are going digital, you are going 3D, but that doesn't preclude it being true to its original nature and vision. To give a specific example, what you can see here is how we've created custom models, custom shaders, and custom animations to replicate the drawn feel. It's not the same. But it is in keeping. It has the same kind of soul. And if you, as a brand, can imagine a thing, and are committed to keeping that, and work with creative technologists, and work with Unity, you can figure out the best way to transform it into these new environments. Once you have that, you have the start of a project.
[DAN] I will say, for me, one of the most impressive things was that this is not just a 3D model. There are obviously some unique shaders going on; it's got that illustrative feel with the different textures and things like that. When I first saw it, that was what impressed me most — it obviously, visually stands out right away.
[WILL] It would obviously differ for every brand, for every IP, for everything. So long as you embody the thing that makes whatever you do you, then that's the thing that's going to shine through. And again, it goes back to that question of natural. It's not just natural in terms of "does it fit the real world"; it's natural in terms of "is that imagined thing true to our imagination?" It doesn't matter what it is — in this case, it was Seuss. But if we had Terry Pratchett's tentacled monstrosities from the Dungeon Dimensions, and we had that coming out of a wall into our kitchen, and it felt kind of low-poly, it wouldn't be right, because it has to feel sticky and oozy. Because that's true to how you imagine it. That naturalness extends to the imagined thing.
[DANIEL] From our side, my team in particular loves working with customers on crazy ideas, innovative ideas, projects like these, and helping to bring them to life. Can you walk us through the project life-style — life-cycle — of this application, and how that looks?
[WILL] You're right with your Freudian slip; it is a life-style. You've got to approach it as a mindset thing. You've got up on the screen a ferociously inaccurate, but kind of right, depiction of how we tend to do things at Sugar. It's all based around having an audience, understanding them, then understanding the experience, then creating designs from it that enable you to develop, and then delivering it. To break that down: we start by working with a brand and going, "Who are your audience? What do they want, and how do they currently engage with whatever your IP is?" In the case of Dr. Seuss, we know their audience is phenomenally broad. One of the things I became aware of in this project is how important Oh, the Places You'll Go! is to university graduates in the US. It is an integral part of graduation. You have parents who were taught using Seuss by their parents, who are now teaching it to their children. That sense of continuation matters for audience engagement. Understanding that then lets you work out what sort of experience you can create, what sort of expectations your audience have, what sorts of things motivate them. Also, what perceptions they have — and possibly more important, what preconceptions. If we're saying we will build an AR or digital output of any sort, does the audience demographic, the people that love your brand, have any preconceptions about what that should or shouldn't be? Has there been anything created before that can be utilized or built upon? Once you have all of that — and part of that is your idea, your imagined thing: "I want to turn this into something that is the size of a building, I want to bring it to life in ways like never before" — you can start to work out the design. What is your content, what is its messaging, and in this case, how does it work? And then, what is the user experience? Now, that design — we've got it as a kind of experience going through design — actually, it's a fluid, iterative, cyclical thing.
Your user experience is the thing that asks: "What do they feel? How do they go through a process? What is the nature of it?" And it's that which forms the basis of what you then develop. That development is a standard industry cyclical, iterative thing, whereby you create an imagined space, you build it, you author it, you test it, you look at feedback, and then you iterate again based upon that feedback. That feedback there is very much in the develop box, and we do internal feedback. But for every brand, especially entertainment brands, feedback is and should be part of every key stakeholder, and every loyal brand member. That, especially with Seuss, included children, included loyal fans, included the senior management team, because everyone has to understand and enjoy it. You then have delivery; you have the packaging of it and, importantly, the communication of it. When it comes to AR — and indeed any new technology, any immersive technology — if it's new and you don't communicate what it is, you fall foul of preconceptions, because people won't understand what it is that they're doing, and they won't have correct expectations. Here you see that it does feed back into itself. The last thing, and possibly the most important thing for feeding back into the process, is listening. Whatever you create, listen to the outcomes. You will not get it right 100%, but you will inspire people. Some of the most incredible things, in terms of the projects we're doing next, have come from that listening. One of the things that came out of the MR portal we built was that we listened to feedback from teachers, and they went, "Yeah, this is great. But what if I can't walk around? What if I need to put something in front of the whole class?" We're currently working on a new project whereby large bacteria are able to be placed; in this instance, it was specifically to help children who have obviously been psychologically affected by COVID-19 understand, and thus demystify, what a virus is and what it does.
Listening to that feedback informs the whole process and closes that loop at the end.
[DAN] Obviously, there are many complexities in the whole process in general, but I appreciate you going in and giving us a lot of those tidbits around each stage.
[WILL] It's one of those things where I could talk for days. It's an integral part of how you build creatively. The development was the bit that I glossed over the most, in part because I'm talking to massive Unity experts. I don't want to overstep my bounds. You guys have created the thing that we use; we use it to create interesting things ourselves. You're probably the best placed to be able to go, "Okay, we intended for you as developers to be able to use it in the following ways." Could you break down MARS a little bit? Almost to see if it fits with what our expectations are.
[DAN] Yeah. I think there are many parts that go into being a successful creative studio, and at Unity, as the platform holders, we're deeply focused on that develop stage with our platform and our product — but obviously also happy to help out in the early creative stages and the exploration of the technology. One of the big ideas with Unity MARS, as we worked with you, Will, and Sugar Creative, was: how do we take this development cycle, break it down, and provide specific tools and features that help out with it? Unity MARS is a tool within the Unity Editor that adds additional functionality. You're looking at things like authoring within a simulated environment — getting this what-you-see-is-what-you-get workflow. You're able to test without deploying to a device. You're able to test out this AR logic, which for me was one of the most exciting things: I can start to say, "Okay, first I place this thing here. And then I find another plane." What happens? How do I continue to place content, how do I author it? And from there, just getting feedback and being able to iterate more quickly. That's always been one of the key drivers of MARS: how do we take this whole development cycle, which we know can be long and tedious, and speed it up? How do we provide things like the simulation view, like the device view, like the different rules and logic for how content is spawned? You begin thinking about this as procedural content: you can author once for many different environments. To me, that's what gives applications built with Unity MARS much more of this natural, intelligent feeling, which is exciting as more apps continue to build on top of it and come out, like the Dr. Seuss ABC experience.
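The "rules and logic for how content is spawned" that Dan describes can be sketched as a list of condition-to-content rules matched against discovered surfaces. The sketch below is plain Python pseudologic for illustration only — Unity MARS expresses this in C# with proxies and conditions, and all names and thresholds here are invented, not the MARS API:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Surface:
    """A surface the AR system has discovered in the user's room."""
    kind: str        # "horizontal" or "vertical"
    width: float     # extent in metres
    height: float    # metres above the floor

# Each rule pairs a condition with the content to spawn when a surface
# matches it. Rules are checked in order; the first match wins.
RULES: list[tuple[Callable[[Surface], bool], str]] = [
    (lambda s: s.kind == "horizontal" and s.height < 0.1, "grass"),    # floors
    (lambda s: s.kind == "horizontal" and s.width >= 0.5, "props"),    # tables
    (lambda s: s.kind == "vertical", "picture"),                       # walls
]

def content_for(surface: Surface) -> Optional[str]:
    """Return the content whose condition this surface satisfies, if any."""
    for condition, content in RULES:
        if condition(surface):
            return content
    return None
```

With rules like these, a near-ground horizontal plane gets grass, a raised horizontal plane of sufficient size gets props, and a wall gets a picture — which is what makes the same authored scene adapt to any room the app is run in.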
[WILL] On those points — I can kind of cheat here; I know I'm going to dive into some of the more intelligent bits later on — but one of the things I can't pick up on enough is the fact that that speeding up has a massive benefit. Just to make that tangible, putting my creative director hat on for a second: if you are a director of a studio currently listening to this, the tangible benefit was that three-fifths of our iteration time was completely cut from the project. Now, that did not impact the creative time or the development time. That was just "do I have to put it onto a phone and walk around the room?" No. Amazing. That is a huge benefit. And rather than just not hurting the creative time, it massively reinforced it. You had all of the creative team able to look at and experience something as it was being modeled. It was paper prototyping on speed; it was incredible. It was that ability to explore creativity and experience that had a massive knock-on effect. So, a massive shout-out to the Unity MARS team, who both supported and inspired us, and created a thing we have found genuinely and tangibly useful.
[DAN] It gives you the opportunity to almost hit the ground running. It allows for a bit more creative freedom by doing some of the heavy lifting for you. There aren't these specific things you have to test or think about; we're providing that platform or tool to help out and assist you, and potentially give you more time for creative work, or more time for some of those explorations.
[WILL] Putting my developer hat on, the ability to have live debugging in the visual scene,
[EXHALE] so nice! You're not scanning through error messages; you literally just get a pop-up that goes, "Yeah, this thing's broken." Amazing. It has a cool, tangible benefit. If you're watching from a studio, give it a try; you'll be surprised.
"[DAN] Let's hop in and let's see some real use cases. Let's see how this is utilized basically, within the experience itself. We know that you worked here on the Dr. Seuss application, and it's thinking about and talking about how does this actually work, what does this look like on the device or on the app itself? Let's go ahead and watch
this video quickly, and we'll talk through it."
[WILL] What you are seeing on your screen right now, on the left-hand side, is Unity, with Unity MARS in Simulation view. This is where the experience was authored. And on the right-hand side, you have some footage from the new Playground mode that will be going live as soon as it gets through the App Store review process; it's going through that at the minute. It knows your world. It's what we've been talking about in terms of natural AR. It knows the world: it knows what a floor is, it knows where the table is, it knows how big these things are. Then it generates content in response to it. On the floors, it will generate grass. On the tables, it will generate suitable objects. We have authored that with a logic: we've set walls to display pictures, we've set characters to know to walk on the floors. And interestingly, to know if there is a bridge between a floor and something higher, a table or a wall: I know I can't walk on a wall, but I can walk on a table. There are parameters for "Can I jump that far?", and in this case, if you tell Ichabod to jump up onto the table, he'll run up to it, go, "Yeah, I can do that," and leap up.
[DAN] I just want to be clear: on the left, it almost looks like the real experience, but you're in a gray world, so that's the Simulation view or the Device view.
[WILL] It's a horrific gray bedroom
[LAUGHTER] and it's one of many gray bedrooms. You have a collection of gray, soulless environments that range from tiny bedrooms to massive warehouses, in part because you know you're going to have to adjust your experience. If you didn't have that, you couldn't make a dynamic, reactive experience without having all these spaces. We've got a couple of different rooms in the studio, but I'm going to struggle to replicate a child's bedroom, certainly on demand in the development cycle. That is really handy in and of itself.
[DAN] It looks like there's a hide-and-seek game going on here, using that environmental understanding.
[WILL] Obviously, some of this was inspired by the cool things we can now do. Hide and seek is great; it's one of the most natural games for children, or animals, to play. You can see lemurs doing it in the zoo; it's great. Your content knows where it is and what the real world is, and in this case, Ichabod goes, "I'd like to play hide and seek." And if you go, "Yeah, I'd like to play with you," a timer conveniently comes up. You can't peek, and he will run off based on the real world, hiding behind objects, and then encouraging you to go find him. And to find him, it's not just looking around; you physically have to move around the world space, your real world, because occlusion is a thing. Occlusion just means, for those of you less versed in tech language, that when you have a thing in front of a thing, the thing behind it is hidden. Normally, you can only do this with digital content. MARS lets us do it with real-world content. You can see on the right-hand side, the tree that's behind the table is behind the table. You might go, "Of course it is," but that's what we mean by natural: of course it is, but until now that hasn't been viable.
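For the more technical readers, the mechanism Will is describing boils down to a per-pixel depth test: when real-world geometry and digital content compete for the same pixel, the nearer one wins. This toy sketch is purely illustrative (it is not how ARKit, ARCore, or Unity MARS implement occlusion, which happens on the GPU against a depth map):

```python
def composite(fragments):
    """Resolve occlusion with a depth test: for each pixel, keep only the
    nearest fragment, whether it came from real-world geometry or from
    digital content. Each fragment is a (pixel, depth, label) tuple."""
    nearest = {}
    for pixel, depth, label in fragments:
        # A fragment wins the pixel if nothing closer has claimed it yet.
        if pixel not in nearest or depth < nearest[pixel][0]:
            nearest[pixel] = (depth, label)
    return {pixel: label for pixel, (depth, label) in nearest.items()}

# A real table (depth 1.0) sits in front of a virtual tree (depth 2.0)
# at pixel (0, 0): the table wins there, so the tree is hidden behind it.
scene = [((0, 0), 2.0, "tree"), ((0, 0), 1.0, "table"), ((1, 0), 2.0, "tree")]
visible = composite(scene)
```

The "tree behind the table" effect in the app is exactly this test, with the table's depth coming from the device's understanding of the real room rather than from authored geometry.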
[DAN] Yeah, go ahead. I was just saying, for me, I think it's so impressive that we're able to develop this entire application. What we're seeing on the left and on the right are obviously different runs, but it's actually the same experience; it will be a bit unique every time. I think that's one of the most exciting parts of it. You're talking about things like occlusion: in plain mobile augmented reality, those are not out-of-the-box features, but in Unity MARS, they are. This is where we're giving tool sets, features, these additional functionalities, so that all of your AR applications are more unique, more intelligent, more alive, which I think is exciting.
[WILL] It does require a mindset shift. Most creators are very used to creating a thing where they know how it will be experienced. You're not doing that anymore if you're creating like this; you're saying, "Okay, what is the world I'm building? How are you going to experience it?" We have a lot of people talking about creatives in terms of Dungeons & Dragons DMs. You're not prescribing what every person does. You're being aware of what they might do, guiding intelligently, and allowing for that variation. The result of this is absolutely magical, because you will have those moments where you can say, "Mine did this. What did yours do?" You don't have that with a film or a book, because they are consistent. Here you can embrace that individual variation creatively, and it is the toolkit that is helping do that. I could talk for probably hours on some of the details around this. One thing that we haven't mentioned is the painting; the painting I find really fun. If you imagine you're a kid, what would you want to be able to do? You want to be able to paint the walls, you want to be able to Etch A Sketch, you want to run around and make footprints. We identified that, we talked to children about it, and we went, "Okay, we can make this happen." So we make clouds spawn in relation to flat surfaces, intelligently understanding the terrain, and then make it rain paint. The paint lands on those surfaces, so they're real-world surfaces at this point. And then the characters are able to move around, and when they do, they make footprints through it. Now, again, it sounds quite simple. And it is quite simple, because it's designed for very young children. But it's one that's quite magical. We can get businessy, or very techy, about it, and forget that something which is magical is ageless, and it is the fulfillment of imagination.
Alice in Wonderland resonates with everybody because the idea of stepping into something imagined, and it being real for you is something that will resonate. That's the toolkit, and those are the tools that enable us to do this effectively, and that's the thing that we've had enormous fun exploring when it comes to Dr. Seuss.
[DAN] Very cool. It's awesome to see this sneak peek of what's coming, and how...
[WILL] There should be a date soon, hopefully; bear in mind the review process.
[DAN] To me, it's like, obviously, you're using all the different parts of Unity MARS and building these, did you say, "unique, but simple, but magical" experiences? The simplicity is on the experience side, not necessarily on the backend. Unity MARS provides the tools, and there's a lot of smart logic happening behind the scenes, which I think is beneficial and allows for these more interesting experiences.
[WILL] The idea that something has to feel complex on the surface was debunked by design years and years ago. The best designs are the designs you don't notice. You don't notice something performing naturally, performing well. And by putting more intelligence into the backend, you're able to create both uniqueness and that naturalness, so it seems right. We've only scratched the surface of it. I'd love to sit here and go, "Yeah, we've built a huge story." But we haven't yet. This is just getting going. Ten years from now, we'll look back and go, "Wow, that's how it started," as you walk through an imagined world space with an effortlessly personal, personable, and unique story, because this is where it starts; this is what the future will be.
[DANIEL] Unity MARS, it's relatively new on the scene, only out for a couple of months now to the general public. Will and Dan, you guys have had some time to dive into the tool. What sort of advice do you have for people that are considering using it or thinking about beginning to dive in?
[WILL] In terms of practical tips, the ability to rapidly iterate with MARS is the thing that lets you test quickly, deploy even faster, and then accelerate through that loop. You're not limited by what you can imagine to build. This is where WYSIWYG is vital. From a creative studio angle, that is probably the most important technical and practical point. Get it into people's hands, and get them testing and engaging with it. One thing we always do is put a portal in front of somebody, let them walk through into a new world, watch them giggle like schoolgirls, and go, "This is amazing." Once you've got that, you're there. The rest of it is then imagination and making sure you've got people backing you up who have the technical know-how to make it happen.
[DAN] And what does WYSIWYG stand for, for people who might not know?
[WILL] That's bad on my part. "What You See Is What You Get." You're able to deploy stuff and then see it. Obviously, with AR, that's a unique problem, because you have to see it inside the real world. That's where the Unity MARS toolkit comes in.
[DAN] I think, from my perspective, my advice would be: check it out. There's a 45-day free trial, so you can go and try it out and start to understand what Unity MARS provides. There are lots of different layers and features. We also ship with a handful of templates: there's a game template, there's a walk-through or instruction template, to start to utilize and get you up to speed even quicker. For me, it's exciting to be able to hop in there, start testing things out, bring in my own content, and understand it. At the end of the day, Unity MARS is an additional framework, but we're providing a lot of the tools and, let's say, templates, methods, and rules to enable a lot of these experiences. I always encourage people to just dive in and try it out. There are templates, videos, documentation; there's a lot of content out there to get up and running. From there, I think that's where it becomes exciting. You start to understand some of the possibilities, or start to see some of the things that can be created in the future. Then you just get that head-explode moment of "I can do anything," and there are so many fun things...
[WILL] That's what happened with us. The folks at Unity reached out to us and said, "Let us show you what the possibilities are." And we took the demo scenes that were supplied and put them in the hands of everybody, from the developers to the creatives to the illustrators. And they were hammering on my door, going, "Yeah, we need this, we need it in our lives, and we need it now." So we then had that buy-in internally, and the belief that it would have to be something meaningful. Those out-of-the-box demos were the things that we started with, and what we showed clients. They are polished; they're good enough to go, "Let me explain visually this thing that's difficult to explain verbally." Invaluable base assets.
[DAN] For sure. So... I was going to say, we've been watching the Q&A, we've been checking out the chat. Lots of excitement, especially when we showed the actual app there and how it's taking advantage of things. Of course, thanks, everyone, for joining. We're not going anywhere. We see lots of Q&A questions; we'll start knocking off some of those. And feel free to ask more questions. If some of our answers spawn more questions, we're happy to hang out and try to answer all of them. I want to start with the first one, and Daniel, I think you have some insight into this; Will, obviously, you do too. What are we seeing in terms of specific numbers on the benefits that augmented reality has brought? How has it specifically increased retention? This question's kind of, "We know it's catchy, but are there specific numbers?" I know at Unity we have some white papers and different things like that. Can you guys try to narrow in on some of the key benefits that we're seeing here from AR?
[WILL] We've worked recently with a university that's involved in the Audience of the Future Programme, and they published a white paper in conjunction with us on user engagement, using numbers, if anyone's interested in that. I can find it; it should be all over my social media as well. But we'll make sure that's shared. That's got some nice numbers. Obviously, some of the projects that we work on, we can't go into numbers, because our partners wouldn't like us to. But there are things that we can share on a more one-to-one basis in terms of justifications, where we go into a lot of detail. I would imagine, Dan, you've probably got better broad industry numbers to share.
[DANIEL] Off the top of my head, I know that Pokémon GO has generated $3.5 billion for one of our great partners.
[WILL] That's alright!
[DANIEL] Which is pretty decent. I think that's the question that we always get, especially in more traditional industries: on the manufacturing side, in architecture, engineering, and construction, people are like, "What are the actual business benefits?", beyond engagement, or it being that fad sort of tool. And I think it's something that we're finding out more and more every day. There are people pushing the boundaries of these sorts of applications: using them to drive engagement, but also at stadiums and location-based areas, pushing people to become more involved and spend more money at whatever that application is tied to. It's really up to you, and like I always say, there are so many different avenues you can take as far as driving real engagement and having people actually spend more money with these applications. The possibilities are endless.
[WILL] One tangible example that springs to mind that might be quite useful here is an AR holographic training tool we built for a client. That used branching narratives and footage, which was turned into holograms of real people, to simulate one-on-one training sessions. This was normally done in a centralized location; this was able to be deployed globally and simultaneously. Instead of having to move the people around, you were able to move the experience to the people. This saved significant cost in terms of transport, significant cost in terms of man-hours, and brought significant benefit in terms of environment and travel, and it's particularly relevant currently in terms of being able to do personalized training even when you're remote. Although I can't give you exact numbers, obviously, you can extrapolate: if you don't have to take five people to locations all over the world, you very quickly rack up a very sizable and tangible benefit.
[DAN] I just want to briefly mention that Unity itself is doing a bit more research into this. We're going to be publishing something along the lines of engagement and ROI in the coming months, so stay tuned on that front as well. The next question, Will, I think is more directed to you. It's about the earlier, pitch stage of a project. How do you manage the client's imagination? With AR and VR technologies, there's this big change; people maybe see things that are completely pre-rendered, or not even fully there yet. How do you manage that? How do you approach creating that vertical slice or prototype, and then presenting it, given that a 2D video or a still image might not always convey what's possible with the technology?
[WILL] It is by its nature a thing that you get the most from when you do it. If I was to tell you about a doorway you could step through, you might go, "Oh, kind of nice." If I draw a picture of it, you go, "Oh, that looks fun." But if you do it, you are blown away. How do you do that? We touched on some of it: with out-of-the-box resources from MARS, or, we're lucky at Sugar, because we had some time to do some R&D, so we can just go, "Here's one we made earlier," proper Blue Peter. You have to put it into people's hands. In terms of making a slice and convincing a client: so long as you're honest, we found most clients, once they have that imagination buy-in, believe in a product enough to create a vertical slice. And a vertical slice, we're not talking inordinate amounts of money, either. That iteration speed thing that we talked about earlier makes it much more viable in terms of industry and studio costs.
[DAN] Makes a lot of sense. The next question is, "How do you approach AR and VR for clients who are less technical?" Maybe they don't have a full understanding or experience, or haven't tried out the technology fully.
[WILL] Put the technology in their hands. Put them in a headset. The number of times I catch myself, doing either pitches or initial discussions, going, "Sorry, have you tried any of this before?" And they go, "Funny you should ask. No." And I go, "Okay, right, this whole thing stops right now," and I start unboxing a VR headset, and they go, "What, now?" Yeah, because otherwise you have no frame of reference. There are some amazing early illustrations from Renaissance-era France, where you have animals that were described by explorers to artists. And the things that come out of it don't look like the animal, rather unsurprisingly; it's Chinese whispers. Until you let people experience something that, even if it isn't the same, at least gives them a frame of reference, it will be really difficult. We always use the rule of: put them in the thing. It doesn't matter if it's AR, VR, or MR; just get them doing it.
[DANIEL] I think it's important to note that Unity can help you at any point of a project. As I mentioned earlier, even in the ideation, proof-of-concept stage, we have resources internally that can assist you in developing something that shows people who aren't very adept in AR and VR what the possibilities are. We're here to help, and that's what I always like to tell people, so feel free to reach out.
[DAN] Yeah, for sure. The next question, I think, Will, was directed at you, obviously, the star of the show here. When we're looking at AR experiences, and I certainly have some experience with this myself, the engagement, the use, the time in AR can be quite short. You're not looking at the kind of engagement that you get on social media or other applications. Do you think it's worth trying to make AR experiences longer? And how do you keep engagement? How do you keep people excited and engaging with the AR content longer, I guess, is the question there.
[WILL] It makes sense, I mean, when you're reading a good book, you don't notice how long it is; watching a good film, you don't notice how long it is. If you're experiencing good AR content, you won't notice how long it is. And the length will be a natural product of the creative and imagination process. It's a thing that will be enabled by having more intelligent content. If you have "I place the thing, I can poke it," you're inevitably going to end up with a short experience, because the story you tell with that is rather short. It's up there with "Spot lost his ball. Where's Spot's ball? There it is." You're not going to have a long experience. As we have more intelligent AR, more playful AR, deeper AR, we're able to build larger, more complex stories that aren't defined by being long, but that can be. We aren't setting out to make longer things, but we know full well that we will make longer experiences. That's the point we're at with it.
[DAN] Yeah, that makes sense. The next question is a bit technical, around AR and shadows, so I will take this one myself. The question is basically about something that is a clear part of the ABC Dr. Seuss app: when you place content on physical surfaces, you want a transparent plane or quad underneath it to catch shadows. Everything in the real world with light casts shadows, and that's helpful for grounding, for understanding the depth of certain objects. The question mentioned LWRP, the Lightweight Render Pipeline; that has evolved and is now the Universal Render Pipeline. So the question is, "How do I get shadows, or is there any content available for shadows, using the Universal Render Pipeline?" I will say yes. I've recently been working on a lot of different shaders specifically for augmented reality. Right now, there is a foundation-demos repository that is completely open source and available on the Unity-Technologies GitHub page. There is a shadow shader in there right now, but it's not fully documented and complete. You can go and check that out; I will be working on that later today and tomorrow to officially release it. There will be two different types of shadow shaders available using the Universal Render Pipeline. It's built using AR Foundation, but because it works in AR Foundation, because it works in Unity, it can also work in Unity MARS. The two types are a hard shadow, and then another one that uses a stochastic technique to create a soft-looking shadow based on the distance from the object to the plane. I hope that answers that; check out the foundation-demos repository. Then, the next question, which comes with a nice "Thanks for the webinar": regarding the Dr. Seuss example, I imagine there are a lot of additional challenges, considering an AR experience may be targeted at a younger audience. There are obviously things like attention span and so on.
Can you talk a bit, Will, about some of the approaches, or how you think about building these experiences for a lower age range, let's say?
[WILL] That is something I glossed over massively earlier: understanding the target age groups. We work with educational and developmental psychologists, as well as drawing on my creative background; my background is also in genetics and developmental psychology, so I have a heavy hand in that as well. Making sure you understand the risks, the needs, and the behavior of the target group, especially when it comes to children, is super-duper important. I can't give a 40-second answer to that one, because it opens up a massive floodgate of something that I have a propensity to talk way too much about. But yes, you do have to have specific considerations, though no more specific than for any other target group, and your best friends there are specialist psychologists.
[DAN] Awesome. This is one I think we should all give a bit of an opinion on; there's no right answer, and we won't share any secret information. Where do we think the future of AR will be? Do we think it's on the software side or on the hardware side? Or maybe somewhere in the middle?
[WILL] You won't have one without the other. As hardware improves, the data that you're harnessing will improve. There are new components to understanding the environment based on scanning, LiDAR scanning, technology like that, which creates a richer point cloud. That means you can have a more detailed understanding. In terms of the software, I think that is where most of the innovation lies now; we've got the ability to drag that data into pieces of software. The other bit alongside that: is it software, is it hardware, or is it creative technology? I think that will have a massive role to play as more people start experimenting with it. We will see some interesting questions. I attended a few panels recently that talked about the potential for misuse as well, and I think it will be important that the industry takes that seriously; it isn't an unrestricted environment. To have creativity that is meaningful, we need to be aware of what its impact might be. I think the simple answer, from our point of view, from a creative studio angle, is: all of it.
[DANIEL] Just to chime in on that: as things become more accessible, as headsets get smaller, as more people have access to more powerful mobile devices, and with the addition of things like 5G, the uses of AR will expand, and companies will be able to leverage those sorts of things all the better. It is a combination of both, I think, and they will work in tandem. That's what Unity does so well, working with the different hardware providers and so on. Even though AR, or XR, whatever you want to call it, has been around forever, we are just scratching the surface as far as accessibility goes, and adoption is just beginning to take off across all the different industries we touched on today.
[DAN] Yeah, for sure. I think it's really exciting for me as a developer working at Unity. A lot of our partners, I don't want to say they're battling, but they're releasing and continuing to iterate on their platforms and their content. I feel like I'm the winner in all of it. There's lots of cool new innovation, there's new hardware, there's new software. For me, it's just fun and exciting to continue to see how it improves, take advantage of it, use the new features, things like that. There are a couple more technical questions that I will knock out here. One was, "Is there a way to detect light sources in a room, in the environment, and then utilize that for shadow casting?" The ARCore platform, which is enabled through AR Foundation and available in Unity MARS, has the ability to rotate a directional light. You can make this a real-time light in Unity driven by the environmental lighting. This is part of what's called light estimation: it's basically understanding the different light sources in the environment, then allowing you to rotate a light, which will then cast shadows and emulate what it's like in the real world. There are also things like brightness estimation. So it is possible; it's a feature available on the ARCore platform, and it does add that next level of realism. I'm also seeing some people go pretty far into looking at things like GPS coordinates and time of day. If you're doing an outdoor experience, the sun's position at a certain time of day is a known mathematical formula, so you can use that position and rotate your light to the same angle the sun would be at. The next one was about augmented reality on mobile and what we call image markers; the question referenced QR codes specifically.
I will say that is a feature available on ARCore and ARKit, surfaced through AR Foundation, and then Unity MARS takes it a step further and creates an easy authoring environment. What you can do is create an image marker proxy, and then any content you drag underneath it will show up once the image marker is recognized. Now, there are certain limitations, where the platforms themselves want larger images, and you also want a certain level of contrast; you don't want a big blank image or things like that. But that is available, and Unity MARS's image marker authoring experience is one of the most streamlined I've seen. I highly recommend checking that out. Then the last question we have is, "What does the process look like for getting your Unity project to devices like HoloLens? And is MARS required?" MARS is not required. We're doing some work on our side to support more of the head-mounted platforms in Unity MARS. Right now, MARS is built on, or utilizes, AR Foundation; that's the underlying layer Unity uses when surfacing the different AR platforms. For something specific to HoloLens, you're looking at the UWP platform. That's the Microsoft platform, and it's just a build target within Unity. There's lots of documentation from Microsoft, and a large majority of HoloLens applications are made with Unity, so I recommend checking that out. And we're doing some work so that MARS will be enabled on HoloLens as well.
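Backing up to the sun-position idea from the lighting answer: given GPS coordinates and the time of day, a standard astronomical approximation gives the sun's elevation and azimuth, which you could then feed into a directional light's rotation. This is a simplified sketch of that well-known formula (it ignores the equation of time and atmospheric refraction, so it's only accurate to a degree or two, which is plenty for aiming a light), not code from the app or from Unity:

```python
import math
from datetime import datetime, timezone

def solar_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth (degrees) for a location
    and UTC time. Simplified: no equation-of-time or refraction terms."""
    n = when_utc.timetuple().tm_yday
    # Approximate solar declination for day-of-year n.
    decl = math.radians(-23.44 * math.cos(math.radians(360.0 / 365.0 * (n + 10))))
    # Local solar time from UTC plus longitude (15 degrees per hour).
    solar_hour = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(lat_deg)
    # Elevation above the horizon.
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    sin_el = max(-1.0, min(1.0, sin_el))
    elevation = math.degrees(math.asin(sin_el))
    # Azimuth measured clockwise from north.
    cos_el = max(1e-9, math.cos(math.asin(sin_el)))
    cos_az = (math.sin(decl) - sin_el * math.sin(lat)) / (cos_el * max(1e-9, math.cos(lat)))
    azimuth = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if hour_angle > 0:  # afternoon: the sun has moved west of the meridian
        azimuth = 360.0 - azimuth
    return elevation, azimuth
```

In Unity you would then set the directional light's rotation from these two angles each frame or at startup; the point is simply that outdoor sun direction is deterministic, so no light estimation is needed for it.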
[WILL] If you are not somebody technical who asked that question, the studio-level answer is: it's really easy. If you have a Unity project, that all sounded quite daunting, but the build target just means you press Go to deploy to that platform. As long as you've built the right type of experience, it just works. It's one of the things we have clients talk about, in terms of, "Oh, it could be on VR as well, could it?" And it's like, "Yeah, wow." That shows there is this lack of understanding that an engine like Unity can just deploy to these different outputs, as long as you've engineered it right. Sorry, Dan, I didn't want to steal your thunder there; yours was awesome, beautiful, and very technical, and I had the sense that maybe it wasn't a technical person asking.
[DAN] No, I think it's always helpful to have that. I think that's all the questions we have today. I want to say thanks a lot to everyone for coming. There are lots of different resources, as well as videos and FAQs; the developers are active on the forums, and there are blogs. Definitely check out the Unity MARS product page for some additional resources.
[DANIEL] Yeah, I think, as I mentioned, from the project side, if anyone in the audience is thinking about an AR project, feel free to reach out to me. I posted my name and email in the chat. Add me on LinkedIn, whatever is the easiest way to reach me. Any customer or potential project that I talk to is valuable. Feel free to reach out, and I look forward to hearing about some of the cool projects you're creating with Unity and MARS.
[DAN] Any final thoughts, Will?
[WILL] Obviously, reach out if there are any questions; it's always great to hear from people. For a final thought, with my creative director's hat off, just as somebody who makes creative things: envisage the possibilities, have an imagination here. We've got the technologies to make it happen now. Come with some ideas, and let those be your guide to what you're making. You can make some very fun, very interesting things.
[DAN] Awesome. Thanks again, everyone, for coming. This has been recorded and will be available later. Definitely reach out; there are lots of resources out there, and we're excited to continue to see what people create with Unity MARS.
[WILL] Great, thanks, everyone.
[DANIEL] Bye, everybody. Thank you.