Blend into virtual reality with Unreal Engine - Part I | Tiago Sagban | Skillshare


Tiago Sagban, Virtual Reality Architect

12 Lessons (1h 7m)
    • 1. Introduction (1:10)
    • 2. Definition (3:47)
    • 3. History (3:00)
    • 4. Modern Devices Autopsy (17:03)
    • 5. Hardware Ecosystem (8:13)
    • 6. Software Ecosystem (3:24)
    • 7. Challenges (3:45)
    • 8. Download and install Unreal Engine (4:17)
    • 9. Configure Unreal Engine for Oculus Rift (1:55)
    • 10. Configure Unreal Engine for Gear VR (4:33)
    • 11. Find your way around Unreal Engine (3:08)
    • 12. Tweak virtual reality rendering (12:54)

About This Class

Build a strong foundation in the world of virtual reality, and get to know Unreal Engine as a tool to design virtual worlds. No background required!

Transcripts

1. Introduction: Welcome aboard Blend into Virtual Reality with Unreal Engine. This is Part One, which covers a full virtual reality overview together with a section on how to start developing for virtual reality using Unreal Engine. First, the virtual reality overview is a theory-versus-practice part: theory because, of course, we'll see the basics of virtual reality, its definition, history, acronyms and so on; and practice because you will perform a modern virtual reality devices autopsy, together with a review of the hardware and software ecosystems. The good thing is, there is no required background to understand this content. So whether you're just curious about virtual reality or you're a senior developer, it doesn't matter: you will be able to follow at the same pace. And the same goes for the second part of this course, the first-day-as-a-developer section, in which we will cover everything from downloading Unreal Engine to tweaking its virtual reality rendering. So don't hesitate to jump in, even if you don't know what Unreal Engine is, and even if you don't have a virtual reality device with you, as everything is covered from scratch.

2. Definition: Hello guys, welcome aboard Blend into Virtual Reality, Part One. In this first video, we're going to define virtual reality. First things first: it's a computer-generated environment, pretty much like any video game, that you're able to explore and interact with. But most importantly, virtual reality is meant to simulate physical presence. So whether you want to be a pilot in a cockpit or just walk down the street, VR is meant to give you this compelling feeling of presence. Of course, such a thing is impossible without dedicated hardware that connects with your senses. And when I say connect, I mean taking over what's fed to your senses. First in line is sight, and while it might sound silly, it's done by putting a screen right in front of one's face.
Of course, it's not that simple, as we will see in another video, but it gave birth to lots of funny devices, for instance this cardboard viewer, which uses one smartphone, two lenses and a cardboard frame to feed 3D images. Next is hearing, and nothing new here: good old headphones will play 3D audio to one's ears. But to generate pertinent 3D data for both eyes and ears, the sense of posture has to be taken into account. The first component of posture is balance, which is how rotations and accelerations are sensed. It's located in the ears, and it pretty much works like a spirit level. A spirit level uses one or more cylinders filled with fluid but with an air bubble trapped inside. So when the cylinder is tilted, the bubble will move in relation to the tilt angle. But it's not only about tilting: the bubble will also move depending on accelerations. As a result, head movements can be measured by the brain. The next component of posture is proprioception. In short, it means that the human brain knows body parts' positions based on joints and all sorts of feedback. The funny thing is, you can try it yourself now. Just close your eyes and shake both your left and right arms in various directions. And when you're ready, just try to clap your hands with your eyes still closed: it should work like a charm. The next and final component of posture is touch. While obvious, what's important to understand is that touch cannot be cheated: holding an object in real life will mean holding an object in virtual reality. In the end, while we will see in another video how to handle those three posture components, I can already tell you that it means using computer vision or devices like glove controllers. So now we know sight, hearing and posture, and we know they have to play together to create a nice virtual experience. But there are lots of other senses, and while less important, they have to be remembered.
For instance, thermoception would have to kick in if you were to go near a campfire in virtual reality. So auxiliary senses form an important group, and together with sight, hearing and posture, these four groups of senses will be at the core of the relation between a human mind and the current virtual world. In a word, controlling the senses means creating a virtual reality experience, and that's where interaction devices have a big role to play. In the next video, we'll see what the story of these interaction devices is, but most importantly, we'll see why it took so long for virtual reality to kick into our lives. See you later.

3. History: Okay, so in this video, we're going to take a look at the history of virtual reality. First, at a philosophical level, virtual reality is very old. For instance, 400 years before Christ, Plato already depicted illusions that were interpreted as reality. In the allegory of the cave, Plato describes people watching shadows projected on the wall from things passing in front of a fire behind them. But as they can only see the shadows, the shadows become reality. In the end, it shows that the human mind can be tricked easily, but only if the illusion is good enough. Next stop is the Renaissance, approximately 500 years ago, when painters tried to create 3D by using 2D surfaces and perspective painting techniques. For instance, what you're seeing here is painted on the wall of an Italian Renaissance villa. Another leap ahead brings us to the Thirties. During those years, literary pieces of work described virtual reality experiences as we might imagine them today, but with all human senses involved. Twenty years later, the burning desire to use all senses kicked in again, but this time it was cinematographers trying to push the boundaries of the motion pictures industry. For instance, the Sensorama that you can see here on your screen involved lots of senses, not only sight and hearing.
However, it never went mainstream. But 10 years later, virtual reality was back as computer vision projects for the military. It's back then that the first head-mounted displays were created, as an attempt to enhance helicopter cockpits. So the next 20 years were dedicated to improving hardware and software for virtual reality. That's why in the Eighties, space agencies, for instance, had VR devices capable of multi-sensory simulations. Ten years later, and it's already time for virtual reality to enter our living rooms: as video game devices, head-mounted displays could be found for a few hundreds of dollars. However, such devices only involved sight, which, as you may remember from the definition video, is not enough to have a compelling virtual reality experience. That's why we had to wait almost 20 years to see virtual reality's rebirth in our living rooms. We had to wait this long mainly because hardware and computing power weren't strong enough at the time. But they are now, thanks to the smartphone ecosystem, and we'll see in detail how it works in the next video, where we will perform an autopsy of modern virtual reality devices.

4. Modern Devices Autopsy: In this video, we're going to do the autopsy of modern virtual reality devices. First, what I mean by modern virtual reality devices is consumer products, something that you can buy for a few hundred bucks. So the deal is to accommodate the now famous four groups of senses with an accessible price, easy daily use and, of course, a virtual reality experience that delivers. First in line are cardboard devices. Those devices are very cheap and easy to build: they just require a smartphone, two lenses and a cardboard frame. As you can guess, using a smartphone is a really good combination of both screen and computer. So displaying a 3D world with the help of the two lenses will be easy? Well, not that much.
The first problem is that a human eye behaves like a camera, which means it can focus at only one distance at a time. For instance, on the left image is what you would see if you were to look at the buildings in the background: you would see the buildings clearly, but the rocks in the foreground would be blurred. On the other side, if you were to look at the rocks, you would see them clearly, but the buildings would become blurred in the background; this is what you can see on the right image. So if you want to see something clearly between 10 centimeters and six meters, you have to focus on it. As a side note, trying to see something below 10 centimeters won't work: it will be blurred. However, the good thing is, trying to focus on something that is more than six meters away is like focusing on something that is six meters away. So, for instance, if you try to look at an object that is 10 meters away and at the same time at another object that is 20 meters away, you will see both objects in focus. Well, it's not important to keep these details in mind, but what's important is to understand how focus works. And the good thing is that you can try it right now: just close one eye and place your fingers as shown on the picture. Then looking at one finger, then the other, then back to the previous one, will make you observe focus differences. So now, if we go back to a cardboard device, we easily understand why there are two lenses. Without lenses, placing a smartphone a few centimeters away from the eyes would be completely useless: it would be blurred. But using lenses, it will be like the smartphone is about one meter away, so that's good. But that's a fixed distance. What it means is that if I generate a picture on a smartphone, it will always feel like it's one meter away. So it's not that convenient, but it's a good middle ground, because clearly it's not possible to tweak the lenses depending on the distance a virtual object is supposed to appear at.
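To put rough numbers on that fixed-distance trick, here is a minimal Python sketch of the thin-lens equation. The 50 mm focal length and 45 mm screen distance below are illustrative assumptions, not the specs of any real viewer.

```python
def virtual_image_mm(focal_mm, screen_mm):
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image
    distance d_i. With the screen placed inside the focal length
    (screen_mm < focal_mm), d_i comes out negative: a virtual image
    that appears much farther away than the physical screen."""
    return 1.0 / (1.0 / focal_mm - 1.0 / screen_mm)

# Illustrative numbers: a 50 mm lens with the screen 45 mm away makes
# the display appear about 45 cm in front of the eye.
apparent_mm = virtual_image_mm(50.0, 45.0)  # -450.0 (negative = virtual image)
```

Whatever the exact optics, the point stands: the apparent distance is a single fixed value baked into the hardware.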
However, the good thing is that focus has only a small impact on how the brain perceives 3D. Instead, the main mechanism our brain uses is to compare the images produced by the left and the right eyes. Let's say we take a cube, a simple cube, and imagine we look at it right now. There will be a different perspective, different cue points in the image, between the left and the right eyes, which the brain will be able to pick up and reconstitute a 3D image from. So even if the focus is fixed at one meter because of the lenses, displaying a different point of view for the left and the right eyes will be sufficient to create a 3D image. And the good news is that it works perfectly, because if an object is supposed to be near us in a virtual reality, it will appear big on screen, and there will be lots of difference between the two views of each eye. But if an object is far away, it will be much smaller, its images will be much smaller, and there will be very little variation between the two images. That's basically what I'm trying to show here in the video, but you can do the test yourself: just look at an object around your room, then at one far away, and compare by moving around, and you will see what I mean by different points of view. The best way to do this is to switch between the left and the right eye by closing one of them, of course not both at the same time: you close one, then you close the other, switching between the two of them, and you will see that the point of view shifts a little, and that's what creates a 3D vision for your brain. As a final note, this effect works really great, but we are not born equal when it comes to the distance between our two eyes. So we have to use what is called the interpupillary distance of each individual to generate an accurate distance between the two images displayed to your left and right eyes.
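As a rough illustration of that near/far difference, here is a small Python sketch computing the angle between the two eyes' lines of sight; the 64 mm IPD is a commonly quoted average, used here only as an assumption.

```python
import math

def angular_parallax_deg(ipd_m, distance_m):
    """Angle between the left and right eyes' lines of sight to a
    point straight ahead at distance_m. A larger angle means a larger
    difference between the two eyes' images, so a stronger 3D cue."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

near = angular_parallax_deg(0.064, 0.5)   # object at 50 cm: ~7.3 degrees
far = angular_parallax_deg(0.064, 10.0)   # object at 10 m: ~0.4 degrees
```

The nearby object produces a view difference roughly twenty times larger than the distant one, which is exactly the cue the brain exploits.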
The IPD is just a software parameter that has to be configured, and there are lots of ways to do this. But you can see on this picture that there is a direct relationship between what's displayed on the screen and an individual's IPD. So, in the end, using a smartphone and lenses will work well. Of course, the focus won't feel as real as in real life, but it won't really kill the 3D effect, and it will be very compelling anyway. Well, of course, it's not that simple, otherwise it would have been perfect. Lenses bring two secondary problems to the table, which are shape and color distortions. What you can see here is a screenshot of what's on a smartphone screen when installed in a cardboard device. What we can see is that the images have kind of a bulged shape, and also that the three primary colors, red, green and blue, get shifted when coming close to the border. And if you look closely at both frames, you'll see what I mean by color shifting. To explain the color shifting first: we won't go into physics, but it's just that different colors take different paths when entering a lens, so shifting the colors on the screen before the light enters the lenses will compensate for this. Then, what's the bulged shape for? It compensates for the contracting effect of the lenses. For instance, if you want to display a grid to a user, you would display a kind of magnified grid on screen; then the lenses will apply a contraction distortion, and the user will see a perfect grid in the end. So it's the same kind of trick as we saw for color shifting. So that's it for sight in virtual reality. What we just saw is enough to create a compelling 3D view of the world. But of course we want to be able to navigate through this world, so the view that is generated will have to correspond to the posture in the real world.
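Those two pre-corrections (the magnified grid and the per-color shift) can be sketched in a few lines of Python. The radial polynomial is a standard distortion model, but the coefficients below are made-up placeholders, not values from any real headset.

```python
def predistort(x, y, k1, k2=0.0):
    """Radial pre-distortion of a normalized screen point (x, y):
    push it outward (the 'magnified grid') so the lens's contraction
    brings it back to where it belongs."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def predistort_rgb(x, y, k_red, k_green, k_blue):
    """Chromatic-aberration compensation: use a slightly different
    coefficient per color channel, so that after the lens bends red,
    green and blue by different amounts, they land on the same spot."""
    return (predistort(x, y, k_red),
            predistort(x, y, k_green),
            predistort(x, y, k_blue))

# The center is untouched; a border point is pushed outward, each
# channel by a slightly different amount (placeholder coefficients):
center = predistort(0.0, 0.0, 0.25)                      # (0.0, 0.0)
red, green, blue = predistort_rgb(1.0, 0.0, 0.27, 0.25, 0.23)
```

In a real headset this runs on the GPU per pixel, but the idea is the same: distort on screen so the lens un-distorts for the eye.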
Of course, it's not a necessity to track the entire body, but in order to generate accurate 3D views, and in order not to make you sick, at least head movements have to be tracked. There are six ways a body can move in a 3D environment. Three of them are called translations; in short, they are the three directions. You may wonder why three, but the answer is quite simple. Let's imagine you want to tell a friend where a balloon is located in a room. First, you'd have to give a 2D position on the floor, and it's exactly like giving the position of a cell in a table, meaning which row and which column it's located at. So that's two coordinates. But you would also give the height, because the balloon can be high up to the ceiling, or close to the floor, or maybe in between. So in the end, to give a complete location, you would have to give three coordinates. Now, remember, I told you there were six ways a body can move in 3D. Well, we have three translations, but what's left is three rotations. Now, it's a bit difficult to explain why there are three rotations, but you can do a simple test to convince yourself. Try looking around by just moving your eyes. You will see that you can target any object by moving your eyes first up and down, then left or right. So two movements are necessary to aim at any object; in other words, two movements are necessary for your eyes to reach any orientation. However, should you be a machine, you could still imagine your eyes rotating on themselves, a bit like if you were to rotate your head upside down. So, in the end, to describe the full 3D orientation of something, three rotations have to be defined. So, in the end, if we can monitor the three translations and three rotations we just talked about, then movement will be fully known. First, to measure a translation, we will use inertia. As an example of inertia, let's imagine a ball which is freely rolling in a bus.
Well, the ball will always go in the opposite direction to where the bus is going, or accelerating to be more precise. And it's exactly the same as sitting in a car and feeling accelerations and braking. And that's exactly how a sensor called an accelerometer works: an accelerometer just uses a mass which is stuck in between springs. And what's great is that if you use three of them, then you can almost measure the three translations. I say almost because, of course, it senses acceleration. So imagine you're going in a car at a constant speed: you won't feel any acceleration. So to fully know the three translations, we also need to use a clock. I know it can be a bit hard to grasp, but what's important to remember is that 3D accelerometers are on board most smartphones today, so our cardboard device is perfectly able to know the three translations. So that's good for the translations, but what about the three rotations we talked about earlier? Well, good news: we will use what's called a gyroscope. Well, it's a bit more complicated to explain how a gyroscope works, but in a nutshell, it works like an accelerometer on a disk, and by doing so, it's able to sense how many degrees of rotation there are. Take a second to understand this in detail. Imagine you are standing on a rotating disc. Of course, if you're close to the center, your speed relative to the environment is really small, as shown on the left icon. But if you're standing near the border of the disc, your speed will be much higher, as you can see with the right icon. But the thing is, you can't really tell at which speed you're going if you're standing still on the disk. In order to know how much the disk has rotated, you have to move between the external and internal parts of the disk, and then you will feel some kind of acceleration. That's basically how a gyroscope works.
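The disc intuition can be put into numbers. This Python sketch uses the textbook centripetal-acceleration formula; the rotation rate and radii are arbitrary illustration values, and note that real MEMS gyroscopes actually sense the closely related Coriolis force on a vibrating mass.

```python
import math

def centripetal_accel(turn_rate_deg_s, radius_m):
    """Acceleration felt by a mass riding a spinning disc at a given
    radius: a = w^2 * r, with w the turn rate in rad/s. No rotation
    means no signal, and the signal grows with the radius, which is
    why moving a mass between an inner and an outer radius lets a
    device tell how fast the disc is turning."""
    w = math.radians(turn_rate_deg_s)
    return w * w * radius_m

still = centripetal_accel(0.0, 0.10)     # no rotation: zero signal
inner = centripetal_accel(90.0, 0.001)   # near the center: tiny
outer = centripetal_accel(90.0, 0.10)    # near the border: 100x larger
```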
It attaches an accelerometer and makes it switch between the internal and the external parts of the disc at a very high speed, in order to sense these accelerations. So this is what we would get if we were to rotate a disc clockwise. But it also works, I mean, it gets no signal, if there is no rotation, as you can see here. And of course, if the disk goes counter-clockwise, then there is a force in the other direction. So it's very easy, in the end, to be able to sense the rotation of the disc with this device. So again, putting three gyroscopes in a 3D array will give us the three rotations we need. We are using accelerometers to make gyroscopes, so of course, in order to know the three rotations fully, we have to use a clock. So in the end, adding the three gyroscopes we just saw to the three accelerometers we saw previously will define any 3D movement, providing we add a clock. This device is called an inertial measurement unit, also known as an IMU, and most smartphones you can buy today have an IMU. So in the end, a smartphone can sense the posture of the head, which is very great for virtual reality. I guess now you understand why a smartphone is used for the cardboard device, because it really possesses what we need in order to have the sight, hearing and posture senses for virtual reality. Of course, there will be no auxiliary senses, meaning you won't be able to make people feel temperature, pain or anything. But of course, that's not something mandatory, just a design constraint for developers when they develop a game: not to put you close to a fire or anything like that, for instance. So, in the end, a cardboard device is able to generate a 3D world, visuals and sound, on board a smartphone thanks to its processors. It's able to track movements, to have a convincing experience, thanks to the smartphone's IMU. And of course, it's eye-compatible.
That means there are lenses to correct focus, as we saw earlier, but also corrections done on the smartphone, which are the color and shape corrections. Okay, so now you have kind of a summary of what a cardboard virtual reality device is able to do and how it performs. But what's interesting is to compare it to an ideal virtual reality experience, meaning an experience where you couldn't tell whether it's real or not. I won't go into details, but just bear in mind three things. First, cardboard VR specifications are really the minimum required to have a compelling experience. The second thing is that the ideal is very far from current cardboard VR devices, but it doesn't keep us from creating good experiences, as we'll see later. Just compare with the video game industry 10 years ago from now: it didn't keep people from playing. It's not because it's not real-life accurate that it won't be amusing. The third thing is that, of course, this gap will always play a big role in pushing people to enhance software and hardware for virtual reality. That's why the next stage for virtual reality makers is to increase computing power, have better tracking and also dedicate hardware to the task. That's why, for instance, there are famous devices like the Oculus Rift. In short, the Rift works like a cardboard device, but everything has been optimized for virtual reality. For instance, there are integrated headphones. Also, the screen and optics have been optimized for the experience. Then, of course, there are better graphics, because it's linked to a gaming PC, so it has really much more computing power than a smartphone. And there is also better tracking, which is done with infrared. I want to give you a little more detail on this point, because the thing is, a smartphone can't really track an absolute 3D position: even if, as we saw, we use accelerometers and a clock, there is still a bit of drift that adds up over time.
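To see how that drift builds up, here is a tiny dead-reckoning simulation in Python. The 0.01 m/s² sensor bias is a made-up value purely for illustration.

```python
def integrate_position(accels, dt):
    """Dead reckoning: integrate acceleration samples twice, using the
    clock step dt, to get velocity and then position."""
    vel, pos = 0.0, 0.0
    for a in accels:
        vel += a * dt
        pos += vel * dt
    return pos

dt = 0.01  # 100 Hz sampling clock

# A perfectly still, perfectly calibrated headset reports no movement,
# but a tiny constant bias (0.01 m/s^2, made up for illustration)
# grows quadratically: after just 10 seconds of sitting still, the
# headset believes it has moved about half a meter.
drift_m = integrate_position([0.01] * 1000, dt)  # ~0.5
```

This quadratic error growth is why IMU-only tracking needs an external reference, such as the infrared camera described next.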
And in the end, this leads to very bad accuracy. So the idea behind infrared tracking is to add infrared LEDs on the helmet. Why infrared? Of course, because the human eye cannot see them. The idea is to have a known pattern on the helmet, and then an external camera tracks the helmet by recognizing the known pattern. So in the end it's much more stable than using an IMU, because there is no drift: if the helmet doesn't move, the image doesn't move. So it's really accurate, sub-millimeter accuracy, which is necessary for good virtual reality. So here is a spec sheet for the Oculus Rift, with the cardboard device values as well. You can see we're still far from ideal, but it's been improved quite a bit compared to a smartphone used in a cardboard device. So in the end, the cardboard device is on the low-end side of virtual reality, while the Oculus Rift is the high-end experience. But of course there are other players, and that's what we're going to see in the next two videos: the software and hardware ecosystems. See you there.

5. Hardware Ecosystem: In this video, we're going to talk about the virtual reality hardware ecosystem. First, let's see head-mounted displays, or HMD as an acronym. What you're seeing here is the Samsung Gear VR, which is kind of a solid cardboard. Well, it's much more than just a new plastic frame: it's also better optics and even ergonomics, which are optimized for Samsung flagship smartphones. Of course, Samsung didn't only polish the hardware but also the software. So on the customer side of things, there are dedicated apps, an app market, etcetera, while on the developer side of things, there are help resources and developer releases as well. And finally, the good news is that it's currently available in retail. The next HMD is PlayStation VR, which is really pretty much like the Oculus Rift, but for PlayStation, so it's much more on the gaming console side of things than on PC.
But all of this is not confirmed yet, because it hasn't been released; we're supposed to see it in Q1 2016. Around the same time, the HTC Vive is set to be released. What you see here is the current prototype, but the thing is, it really is an Oculus Rift competitor. There are only two main differences. The first one is a close collaboration with Valve, which is a famous software company, and the second point is that the tracking technology is different, kind of reversed compared to the Oculus Rift. So instead of having a camera picking up lights coming from the HMD, it's the other way around: one or more base stations emit light, which is picked up by the HMD. Long story short, why this instead of the Oculus Rift tracking? Mainly because of cable management: with the Rift, you have to connect both the camera and the HMD to your computer, while with this solution you only have to connect the HMD; the base is only a light-emitting station, so it can simply be connected to a wall socket. The next head-mounted display to know about is FOVE. This one is again really close to the Oculus Rift, but with eye tracking. Why that? Well, if you remember what we talked about in the earlier video about focus and how it's fixed for the Oculus Rift, well, the idea here is to have tiny cameras inside the HMD in order to watch where the eyes are looking. And thanks to this information, the HMD will generate a foveated image. In short, it will blur the parts of the image where the eyes are not looking, in order to simulate focus. What's great about this is that, of course, it's more comfortable to watch, but it also means that less computing power is required, because what's blurred doesn't need to be as detailed as the rest. So that's it about HMDs. Now let's talk about controllers. So far, we've only talked about head tracking, but what about full body tracking? Well, there are products already available, and most of them work as full-body wireless IMUs.
As you can see on the left image, different spots hold different IMUs. Of course, you're seeing cables on this image, so you must think I'm crazy talking about wireless IMUs. But actually the IMUs are attached to a base that you can see on the right image: it's a red box on the chest, and it's that red box that is connected wirelessly to the computer that is rendering the virtual reality world. The next controller is the famous Leap Motion. It contains two 2D cameras, which act as a 3D camera, and also an infrared LED array, all of this in a single bar. Its software is able to recognize hands, movements, patterns, fingers, etcetera. So it's really able to track people's hands, as you can see on the right image. What's great is that it can be mounted on any HMD, so it's really able to track people's hands even when they rotate in space or turn their heads around. The next product is Oculus Touch, and it's linked to the Oculus ecosystem, meaning you will need an Oculus tracking camera. What's interesting is that it pretty much looks like a gamepad that has been cut in half, and what's great is that it's tracked like the Rift itself, thanks to the camera. So users get both hands tracked and at the same time can use a gamepad as usual. Here you can see a product that does the same thing but is not linked to the Oculus ecosystem. There is a small difference in the tracking technology, because here magnetic-based IMU systems are used. We will not see in detail how magnetic tracking works, but bear in mind that it works pretty much like a compass. So that's it for controllers. Now we're going to see motion platforms, which are kind of a leap ahead regarding posture controllers. There are two types of motion platforms in virtual reality. The first type gives you freedom: whether you want to run, jump, sit, etcetera in the virtual world, well, you can, because the motion platform is interactive.
The second type of motion platform is meant to make you feel movement. It's most of the time used for cockpits, whether it's in an aircraft or in a car, and, of course, the idea is to make you feel the movements of the vehicle, to make you feel like you're behind the wheel. So, in conclusion, all the hardware we just saw, whether it's a head-mounted display, a controller, or a motion platform, is there to explore and interact with the virtual world. But while a virtual world can be generated only with virtual assets, which we will see in the software ecosystem video, sometimes it's also interesting to generate it with real assets, meaning filming what's in the real world and projecting it in the virtual world. So, of course, dedicated cameras have to be used. First, this is the simplest virtual reality camera. It uses two normal video cameras with fisheye lenses, and in order to capture what a human would see, the distance between those two cameras is equivalent to the distance between two human eyes. So it gives a really good 3D effect once the result is seen in a head-mounted display; however, it won't be possible to look around. That's why 360-degree cameras were invented. One really works like the previous camera, but of course stitching software will have to be used. So, in the end, recording with this type of camera will allow the viewer to look around; however, the user has to stay at the center of the sphere for a good 3D effect. So, for instance, the viewer won't be able to lean forward. The solution to this is to use light field cameras. We won't see in detail how they work, because it's very expensive hardware and it's pretty rare so far. Just know that it's similar to a 360-degree camera, as you can see on the right image, but instead of being composed of normal cameras, each camera will record a few different shots, as you can see on the left image.
With the proper software, this will allow you to know not only the color of a pixel, like with a normal camera, but also the direction of the light coming to that pixel. And knowing the direction is key, because if you put those cameras on a sphere, knowing the direction of the light at its surface means knowing the direction of the light at any point inside the sphere. In other words, a viewer wearing a head-mounted display will be able to move inside this sphere and feel complete, full 3D. So this really is a top-notch virtual reality camera, because a viewer won't be able to tell the difference between virtual and reality thanks to this kind of device. Well, this concludes this virtual reality hardware ecosystem review, as of the end of 2015. See you around.

6. Software Ecosystem: This video is meant to review the virtual reality software ecosystem. We left the previous video reviewing real cameras for virtual reality, and on this occasion we talked about tools called stitchers. Well, there are a lot of them, with which to make a 360-degree video or panorama. Such software is very important, because without it, it's not possible to film for virtual reality. Meanwhile, the much simpler video players are also a necessity. Most 3D videos are recorded as raw as possible, and this is quite normal, because it's impossible to tell in advance on which HMD the video is going to be played. For instance, here you can see the projections and corrections for the cardboard device, as we saw in the modern devices autopsy video. Of course, it's not only about visuals, meaning the player will be able to play 3D audio and also take posture into account. Something similar exists for video games, and those are called converters. Those are drivers which modify how the game is rendered, but they will also allow a user to use motion platforms or controllers, as we saw in the previous hardware ecosystem video.
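To make the stitcher/player step concrete, here is a Python sketch of the equirectangular mapping most 360 tools use; the axis convention (x right, y up, z forward) is an assumption for this sketch, as conventions differ between tools.

```python
import math

def direction_to_equirect(x, y, z):
    """Map a 3D view direction to (u, v) coordinates in [0, 1] on an
    equirectangular panorama, the flat layout most stitchers output
    and most 360 players sample from."""
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length
    yaw = math.atan2(x, z)    # -pi..pi, turning left/right
    pitch = math.asin(y)      # -pi/2..pi/2, looking up/down
    u = 0.5 + yaw / (2.0 * math.pi)
    v = 0.5 - pitch / math.pi
    return u, v

# Looking straight ahead samples the center of the panorama, and
# looking straight up samples its top edge:
ahead = direction_to_equirect(0.0, 0.0, 1.0)   # (0.5, 0.5)
up = direction_to_equirect(0.0, 1.0, 0.0)      # (0.5, 0.0)
```

A player runs this per pixel for the current head orientation, which is how posture ends up driving what part of the recording you see.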
So yes, converters are a great way to experience content that has not been designed for virtual reality. But of course, it's preferable to design a virtual reality experience from scratch, and that's why engines are very important. Well, to be more precise, most of the time we will talk about gaming engines. Why so? Simply because gaming engines have been built for real-time rendering, and real time is very important in virtual reality, because it's impossible to predict what a viewer is going to do. On top of that, gaming engines produce 3D worlds able to run on most graphics cards that you can find in any computer and even on gaming consoles. So creating a virtual reality experience is pretty much like creating a video game, meaning it's really like running a plugin on top of a game engine to make things VR-oriented. As a virtual reality developer, in consequence, it's going to be your main tool. Then, of course, it's possible to go a bit deeper, and that's what SDKs are for. In short, there will be an SDK per product, whether it's a software or hardware product, and what it provides to a developer is a way to interact with the functionality of a product, or even to modify that functionality. So it really is a toolset, and, for instance, a developer could create a new automatic color-blind mode for a virtual reality headset like the Oculus Rift. To conclude this software ecosystem review, let's talk about app stores. Those are now very famous and common on smartphones, so of course there are dedicated virtual reality app stores. But what's interesting is that they have to be accessible inside virtual reality, and also that they have to target and be accessible on very different hardware, whether it's a smartphone in a cardboard or a complete gaming PC with an Oculus Rift. So it's a bit of a challenge for developers to create app stores.
But it's more of a challenge for developers to target many platforms, as we'll see in the next video, which is about the challenges that await virtual reality developers. 7. Challenges: Okay, so this is the last video of this section, and it's about the challenges that await virtual reality. There are three of them. The first one is technology versus price. Of course, virtual reality asks for a very specific ecosystem, as we saw earlier, but what about its price? It's quite expensive so far. Even if it's just for a Cardboard device, you have to possess a very good smartphone in order to have good results. And for what? Watching 360 videos and playing 2D games, something you can already do on a normal smartphone or gaming PC? Well, of course, true virtual reality is much more compelling, but is it worth $500 minimum? On top of that, we saw that specifications are lagging behind what's necessary, and it's even more true for sight, which is the most important virtual reality sense. It doesn't sound good for virtual reality, right? Well, not really, because TVs used to be worth thousands of dollars, and gaming PCs the same thing. So it means two things: it's only a matter of time before virtual reality is cheap and efficient, and it also means that there is no need to wait. Why not start having fun right now and build virtual reality? So yes, consumer virtual reality hardware and software are here. True, they are not perfect, but they are here, so there has to be something to do with them, which brings up the next challenge: content. The good news is, everything is to be built and rethought. But the sad news is, the ecosystem is too blurry for now. What I mean is, try to imagine you have to develop a bowling game, for example. Which head mounted display are you to target? Which posture hardware is your user going to have? Is your user wearing headphones? Does he have a controller? Is he seated or standing? There are lots of questions, and you don't even know where to start.
It's not like a PC game, where you know the user has a mouse, a keyboard and a screen. That's why, for instance, the guys at Oculus have decided to ship the Rift with an Xbox controller: as a developer, you know you can always rely on the user having this controller. So that's why these are quite pioneering times, because it's quite difficult to create a compelling gameplay if it's not possible to know what the user has with him. Pioneering times also mean a third challenge, which is social. By social, I mean the big picture: locally, is it good for the economy? Globally, is it good for the environment? Questions like that, and not only the bubble it will create around you when doing virtual reality. To be pretty honest, I think that the status is currently unknown, because nobody can really predict what's going to happen. If you ask me, I'd say that what we're going to see will be mostly good. First, virtual reality will help to create empathy. Why? Simply because the feeling of presence and interaction will be enough to trigger an emotional response. Also, virtual reality doesn't have to be something done alone in the living room: it can be done in a theme park, in something like a cinema, or around board games. So it can really be a social improvement. So yes, I would say there is no need to fear the social side of things for now. I know it worries lots of people, but it's not something to really consider on an everyday basis as a developer. Well, this video concludes the virtual reality overview. I truly hope you enjoyed it, and you will enjoy the developer side of things even more. See you later. 8. Download and install Unreal Engine: Okay, so how to download and install Unreal Engine? First, you have to go to unrealengine.com through your favorite search engine, or directly type in the address, it doesn't matter. And then, on the top right corner, click on Get Unreal.
The website will ask you to create an account in order to be able to download Unreal Engine, and this account will also be necessary later on to use Unreal Engine, so it's quite mandatory to create one. So just fill the form, or sign in if you already have an account, then just download it accordingly, whether you are under Windows or running a Mac. As a side note, just know that using Windows is much better to develop: this is because most HMDs are not compatible with Mac, so you would only be able to create a virtual world, but not experience it at the same time. Okay, so just download the version you wish to use, and this will get you an installer. Here I have the Windows version on my desktop, and just double clicking on it will run it. It's quite simple: it's just an extractor which will install Unreal Engine at a given location. Well, when I say Unreal Engine, I mean the Epic launcher, which contains the engine. Once installed, it will give you a shortcut somewhere; just double click on it and it will ask you for the credentials you created just before, prior to downloading Unreal Engine. So just enter your credentials, wait for loading, and you will access the Epic launcher interface, which contains Unreal Engine. Then just go to the Unreal Engine tab, and you should see a yellow button asking you to install Unreal Engine. On mine, I guess, it says Launch, because it's already installed. You also have submenus, which are Community, Learn, Marketplace and Library. The first one, Community, is pretty important in order to access communities like forums, wikis, etcetera, and to find some articles. Learn is also pretty important, because it contains all the example tutorials and assets that are free and available for you to learn Unreal Engine in all its aspects, really. Then you have Marketplace, where you can buy some assets, whether it's textures, sounds, maps, etcetera, so it can be useful to you at some point.
Then we have Library, where all Unreal Engine versions are installed. Here, you can see I already have two, but you just have to click the yellow button to install one, or just click on Add Versions and install the 4.10 version. So don't install 4.8, contrary to what I'm showing here, but really install the last version available, for better virtual reality support. Just click Install, and it will start to get Unreal Engine from the Internet and install it. Okay, then, once it's installed, you just have to click Launch in order to boot it. It will boot the Unreal Engine editor, which is where all the work is done. It will ask you to select a project you wish to open, or to create a new project. In our case, of course, we will select a new project. Then it's between Blueprint and C++; we'll see this later, just go to Blueprint and create a First Person template. Then you have three icons under it. They don't really matter much now, but just make sure to set Maximum Quality and Starter Content. You can put No Starter Content, it doesn't really matter. Then you see where your project is stored and what the name of your project is. Then just click Create Project, and it will load. So what we have here is the main interface, and as you can see, the world is already loaded. It's a first person template, so I am able to play as a first person, using the arrow keys and the mouse to go around this level. I can also already edit stuff: for example, I will move the box on the left, and as you can see, it's working pretty well. So now we just have to set it up for virtual reality, which is the subject of the next video. See you there. 9. Configure Unreal Engine for Oculus Rift: this video is to configure Unreal Engine for the Oculus Rift. Okay, so we are back again in the Epic launcher, where Unreal Engine is hosted. And as you can see here, we are able to access the first person project we created in the previous video, so it's time to double click on it.
It will open the project under the Unreal editor, as we saw previously. The thing is, for now Unreal Engine is not set for virtual reality; for this, we'll have to install some drivers and configure Unreal Engine. The problem is, those won't work under Mac. That's why it's really interesting to have a Windows PC if you are to develop for virtual reality. But let's say you have a Windows PC, since this video is about the Oculus Rift: let's see how it's configured. So first, for the Oculus Rift, you have to go to oculus.com, then click Developers, and get the PC runtime; here it's 0.8. Then you have to agree to the terms, and you can finally download the runtime. This will download the installer, which I have placed on my desktop here, so you just have to double click on it and follow the instructions to go through. Here I cannot really demonstrate how it's done, because it will install drivers that will make the recording jump. When it's over, the installer will ask you to reboot, so just follow the instructions. Then, when it's back, go to the Epic launcher and launch the project we created in the previous video. If you have an Oculus Rift connected to your PC, then you will see that under the Play section there is a VR Preview item. Just click on it: it will create a VR view on the screen, and it will also display the game in the headset, so you will be ready to really interact with it. There is nothing more to configure at first. Of course, we will tweak it a bit later, but to enjoy the VR view, there is nothing more to do. 10. Configure Unreal Engine for Gear VR: this video is to configure Unreal Engine for the Gear VR. Okay, so we are back again in the Epic launcher, where Unreal Engine is hosted. And as you can see here, we are able to access the first person project we created in the previous video, so it's time to double click on it. It will open the project under the Unreal editor.
As we saw previously, the thing is, for now Unreal Engine is not set for virtual reality; for this, we'll have to install some drivers and configure Unreal Engine. The problem is, those won't work under Mac. That's why it's really interesting to have a Windows PC if you are to develop for virtual reality. This video is about the Samsung Gear VR, so let's see how it's configured. First, we have to go to oculus.com, then go to Developers and Tools. We want the osig generator, which will produce a signature file for your mobile. For this, we need an Oculus account, so if you have one already, you just have to sign in, but if not, you have to create a new account. Once you're logged in, you just have to follow the instructions on how to generate the osig file. Here I have generated and downloaded one, which is on my desktop now. Then we need to place it in the Unreal Engine folder. For this, just go to C:\Program Files\Epic Games, which should be the place where the engine is installed. Then click on the Unreal Engine version that is installed; here it's 4.10. Then go to Engine, Build, Android, and in the Java folder create a folder named assets, and put the osig file we just downloaded inside this folder. That's good for the signature file. Now we need to install the Android dependencies. To do this, just go back four times until you reach the Engine subfolder, then click Extras, AndroidWorks, Win64, and run the installer. All you have to do is follow through the installer, no need to tweak anything, but just pay attention to the folder it's going to install to. When the installation is finished, it will ask for a reboot, so please do it. Then you can go back to the Epic launcher and launch Unreal Engine. This time, we go to a new project, Blueprint, then First Person again, and then for the three icons below: the first one will be set to Mobile / Tablet, the second one will be set to Scalable 2D or 3D, and the next one to No Starter Content.
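As a quick aside on the signature file from a moment ago: the destination folder can be sketched as below. This is illustrative Python, not something Unreal Engine requires, and the install root and engine version are just examples to adjust to your own machine.

```python
import os

def osig_destination(install_root, engine_version):
    # Build the folder where the Gear VR signature (osig) file goes,
    # following the layout described in the lesson:
    # <install root>/<version>/Engine/Build/Android/Java/assets
    return os.path.join(install_root, engine_version,
                        "Engine", "Build", "Android", "Java", "assets")

# Example with the default Windows install location and version 4.10:
print(osig_destination(r"C:\Program Files\Epic Games", "4.10"))
```

Drop the osig file generated on oculus.com into that assets folder, creating the folder first if it doesn't exist.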
The reason is that the smartphone is less powerful than a computer, so you have to tell Unreal Engine in some way that it's running on a mobile with low graphics quality; that's why we are setting it like this. Okay, we're almost done, we just need to configure the project. To do that, just go to Edit, Project Settings, then go to the Android section, then Target SDK. For now, it's 19, but it can change with time, so beware of that. Then configure the manifest for Gear VR. Then go to the Android SDK section and make sure all of the paths are set; it should be the case from when AndroidWorks was installed, and that's all. Now, to package the project for Android, just go to File, Package Project, Android. You just have to target a folder, and it will run. In the meantime, you can switch your smartphone to developer mode. You may have to look on the Internet for more info, as I cannot guess the smartphone you're using, but let's just say that it's very simple to set: just look up how to activate the developer mode, and also how to activate the USB debugging mode. Then just connect your smartphone to your PC and accept the RSA fingerprint that the smartphone will pop up; this is just to recognize the PC as legit. Then, once the Android project is ready to deploy, just go to the folder and run the install script. As you can see, it will look for your smartphone. Here I have none connected, but if you connect one, it will install automatically. Then, to run it on your smartphone, just click on the icon that was created on your home screen, and that's it: you should be able to enjoy your project on your Gear VR. 11. Find your way around Unreal Engine: Okay, so how to find your way around Unreal Engine? The good news is, you can access almost anything you need from the Epic launcher, of course. First, you're going to like the Community and Learn tabs, as in those you will find a lot of help, tutorials and everything you need.
But of course, it requires time, and this is not the subject of this tutorial, though you'll have at some point to go through all of this content. Well, not all of it, you understand what I mean, but if you want to be independent at some point, you have to look for some answers by yourself and learn by yourself. Thankfully, for the complete Blend into virtual reality series, there is no need to learn this stuff, but maybe later, or in between courses, you will want to go through the learning content. Okay, let's go back into one of the projects we created in the previous videos. First, at the center, of course, you notice the viewport. Basically, this viewport allows you to place objects, a bit like composing a movie scene. If you use the left and right buttons of your mouse, independently or at the same time, you will be able to navigate in the scene. A left click will allow you to select an object, to move it around, scale it or rotate it. Also, around this viewport are menus which will allow you to tweak your viewport rendering. Then, on the right, are all the elements loaded in this viewport, and if you click on any element, right below you will be able to edit its basic parameters. Here you can also notice green and blue buttons. Those are quite important, because they will allow us to create more complex objects. For instance, Add Component will allow you to create text on one of those white cubes, or even attach a sound to it to make it behave like a sound source. The other button, Blueprint, will allow you to define a behavior. For instance, we could make a white cube rotate when it's looked at, or simply make it explode when the user is nearby. What's quite nice is that you don't need any programming background if you want to create a Blueprint: all you have to do is connect basic pre-existing blocks to create functions and functionalities. Then, the next panel is below, and that's your Content Browser.
Here, all custom objects will be stored, and you will be able to use them later in the project. Next is the toolbox on the left, from where you can import basic objects, lights or visual effects, and you can even modify them directly from here. Finally, here is the top toolbar, and basically it's where you can play with your project: you can launch it, you can review it, and you can share it on the Marketplace. You can also see that there is a cinematics icon, and this is for when you want to stage your project rather than playing with it. So that's it for this Unreal Engine look-around video. See you in the next one, about how to tweak it for virtual reality. 12. Tweak virtual reality rendering: in this last video, we are going to tweak Unreal Engine for virtual reality. We're going to start from scratch to make sure everybody's on the same page. So just launch the Unreal Engine editor and create the First Person template in Blueprint; just make sure that you include starter content, and then you can create the project. The first thing we're going to do is to delete the weapon and arms from the first person character. To do this, you just need to select it, click on Edit Blueprint, and then just go in the Viewport tab and delete what we don't want to use. Then we just have to click Compile to rebuild the Blueprint, and as you can see, it doesn't work, simply because what we just deleted is referenced in the Blueprint. So we have to remove these references before we can continue. We just have to go in the Event Graph, and as you can see, there is a reference error in the Spawn Projectile system. No problem: we can delete this Spawn Projectile entirely, because we won't use it anymore now that we don't have the gun, and we also remove Jump while we're at it. Here, to select, I just use the left mouse button, and to access node actions I use the right mouse button. Okay, let's try to compile again, and it still doesn't work this time.
We'll just click on the error to see where the problem is, and as you can see, it's in the Construction Script, which was attaching the weapon and arms to the body. But as they are not here anymore, we can delete this, no problem. So now we can compile again, and as you can see, it's finally working. Then we can hit the down arrow next to Play and select Play in a new viewport, so we'll be able to test. As you can see, there is no more weapon and no more arms, and I am able to move around with the arrow keys on my keyboard or using the mouse. If you have a head mounted display connected to your computer, like an Oculus Rift or an HTC Vive, you can also try the VR Preview mode. As you can see, it's working quite well. The only small problem is that when you use the mouse, it rotates the view, and that is kind of making people sick, so we'll just go back in the Blueprint events and remove the mouse section. Okay, that's better. Now let's just get rid of this annoying red cross in the middle of the screen. To do that, we leave the Blueprint editor and go back in the Content Browser. Make sure you go in FirstPersonBP (BP for Blueprints), go in the Blueprints section, and then open the first person HUD. Then we just right click on the exec node and break links, to make sure it never reaches the section that draws the cross. As you can see, if I run the project, there is no cross anymore. By the way, you don't have to hit Compile and then Play if you want to run the example: it will compile automatically before playing. Okay, so it's working quite well. If I were to wear the HMD right now, it would perform correctly, meaning that the level scales correctly depending on my head movements. The only thing that doesn't really do the trick is that the distance to the ground is not realistic.
The system doesn't know if I am seated or if I am standing up. So we have to tell the system, in a way, which position we are in, if we want the floor to be at a realistic distance. Also, maybe on your computer you can see there is some lag or judder when you move around, because the graphics card is not able to deliver the quality which is required to run the project. For that, you have two solutions. The first one is to change the game settings. Here it's really, really basic, meaning you can switch between low, medium and high for various parameters. And of course, you can also modify your scene to make sure it doesn't consume a lot: change texture resolutions and stuff like that. Here there's not much texture, so there is not much we can do, but we can change shadows, etcetera. The second thing, which is specific to VR, is to change the resolution scale. To do that, we will go in our Level Blueprint, and we will affect two keystrokes to increase or decrease this resolution scale: the R key will be to decrease, and the T key will be to increase. Here I'm simply using the right click to create a node and the left click to connect nodes together. As the scale is an integer parameter, I create a variable in order to be able to store it in between increases and decreases. Also, the command itself is called hmd sp. Just one thing to watch out for: there is a space after sp, and we just have to append the integer right after it in order to execute. So here, as you can see, on the increase chain: each time T is pressed, it will look for the resolution scale variable, add ten to it, convert it to a string, append it to the hmd sp command, and execute this command.
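To make that increase chain concrete, here is the same logic written out in plain Python (the Blueprint itself stays visual, of course); the decrease chain mirrors it with minus ten, and the clamping bounds are my own addition, not something from the lesson:

```python
def adjust_resolution_scale(scale, delta, lo=30, hi=300):
    # Mirror the R/T Blueprint logic: shift the stored scale variable
    # by +/- 10, then build the console command string. Note the space
    # after "sp", exactly as pointed out above.
    scale = max(lo, min(hi, scale + delta))
    return scale, "hmd sp {}".format(scale)

scale = 100
scale, cmd = adjust_resolution_scale(scale, +10)   # T key pressed
print(cmd)   # hmd sp 110
scale, cmd = adjust_resolution_scale(scale, -10)   # R key pressed
print(cmd)   # hmd sp 100
```

In the Blueprint, the last step hands that string to an Execute Console Command node instead of printing it.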
Then, for the decrease section, I just have to duplicate, and of course change the key to R and make sure I subtract ten instead of adding ten. Then let's launch the VR window to see how it looks, and as you can see, if I press R to decrease or T to increase the parameter, the quality lowers or gets higher. The good thing is that it allows us to simulate various devices or configurations, even if we don't have them. For example, let's say you have a very powerful PC and you want to emulate a not-that-strong PC: then you can increase the screen percentage parameter to generate higher resolution pictures, and it will ask for more drawing time. So yes, by increasing this parameter, we will see if your VR game would run on a less powerful PC. However, if you are to decrease it, it's to gain more FPS: you sacrifice resolution quality in order to have more FPS, so a much more fluid experience. So that's it for quality. Now let's have a look at how to have a better scale for the floor, I mean a better distance-to-floor impression. In order to do this, you have to possess a tracking device, I mean an external device tracking your HMD, like a camera, or a Lighthouse system for stuff like the Vive, for example. The reason is that it's impossible for the system to guess whether you're seated or standing up. So the starting point of the virtual reality experience has to be defined beforehand. When I say defined, I mean that you have to stick the initial conditions of your virtual experience to a static point in reality, and there is nothing better than your tracking camera, because it's static: it won't move during the entire experience, and you can define where it is compared to your head mounted display. So, for example, let's say you have your camera sitting on your desk: you want to tell the system the distance between your camera and the floor.
Of course, there is no need to tell precisely where it is, you just need a height. But before we go any further, we need to see our ground better. For this, just go to the floor by clicking on it and edit its material property; for example, we can put oak. I know it's not that appealing to see, but at least we have a good 3D effect for our ground. So now you can run it again to see how it works in VR. Of course, it's very likely that it's going to lag a bit, so don't hesitate to use the R and T keys we defined before to switch the resolution scale, and also don't hesitate to modify the settings. Okay, so let's get on with this tracking thing. Basically, the first person capsule is a representation of a human being in the level, so gravity acts on it and it can collide with other objects. But this capsule isn't alone: it also hosts the camera. So basically, if you run the project in classic mode, so not in VR, you will have the point of view of that camera. However, in virtual reality mode, this camera represents where the HMD would be if it was to start in front of the physical tracking camera. So if I were to wear my HMD right now, look at my tracking camera and then start the game, I would have the point of view of the camera as if I was running the game in classic mode. So then it's simple: if I want the distance from the HMD to the ground to be accurate, I just have to tweak the height of the camera. So here, it's as if the camera was on a shelf, and here, in the second position, it would be almost as if it was on the ground. Of course, you can set it step by step, or you can take real life measurements to tweak it. Of course, you may not want this behavior: maybe you just want the height to be constant. In this case, we just have to edit the Blueprint of the character and then assign a key for the user to reset the position when he's ready to take part in the virtual reality action.
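Coming back to the camera-height tweak for a moment, in rough numbers the idea can be sketched as below. One Unreal unit is one centimeter, and the camera offset is measured from the capsule's center; the 96 cm half height used here is only my assumption about the template's default capsule, so check the actual value in your own project:

```python
def camera_z_offset(real_camera_height_cm, capsule_half_height_cm=96.0):
    # The capsule's center sits one half height above the floor, so to
    # place the in-game camera at the real tracking-camera height we
    # subtract that half height from the measured value.
    return real_camera_height_cm - capsule_half_height_cm

# A tracking camera measured 120 cm above the floor:
print(camera_z_offset(120))   # 24.0
```

So a camera on your desk gives a small positive offset, and a camera below the capsule's center would give a negative one.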
And in this case, the action will start from wherever he currently is. So here is a demo: if I want to reset, I just launch it, look around, prepare my position, and when I'm ready to start, I just hit the key and start the experience. I really invite you to try out these two different styles of virtual reality, because the first one, with the distance to the ground being realistic, is very compelling: I mean, if you have a good distance to the ground, you really feel like the virtual ground is your real ground. However, it's a little bit more complicated to tweak for the user, because he has to measure the distance between his camera and the ground. That's why the second method is preferred right now, and it is also more efficient for cockpits and stuff like that, because you don't need a ground reference. But with time, you will see that the absolute ground reference will become necessary for virtual reality at room scale. That's why it's important to think about both of them right now. Okay, so one last thing, as a bonus, will be for sound. In order to have spatialization, we need first to activate a plugin, then we have to reboot the project, and when it's done, just drag and drop the sound which is attached to this project. The sound has to be mono; that's why I'm giving it to you, because it can be hard to find, even if you have lots of MP3s, because most of our sounds are stereo. Here you just have to import this mono file and configure it as shown on the screen. Of course, the best is to wear headphones and to run the project, in VR or classic mode, it doesn't matter, and we hear that the sound is spatialized. So that's the end of part one of Blend into virtual reality. In part two, we will make a board game from scratch, and in part three, it will be about advanced techniques, like photogrammetry to capture real life scenes for VR, and we'll also have a look at realistic rendering inside Unreal Engine. I truly hope this course was helpful to you.
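One small aside on the mono requirement from the sound bonus above: if you're not sure whether a file is mono, a quick check outside the engine could look like this. This is plain Python, nothing to do with Unreal itself, and it only covers WAV files:

```python
import wave

def is_mono(path):
    # Spatialization needs a single-channel sound; most files you find
    # are stereo, so check the channel count before importing.
    with wave.open(path, "rb") as wav:
        return wav.getnchannels() == 1
```

If the check comes back False, convert the file to mono in any audio editor before importing it.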
Don't hesitate to give feedback, and a big thank you for sticking around. Thanks for watching!