Transcripts
1. Intro to Virtual Production: Hello and welcome to this course on virtual filmmaking inside Unreal Engine. Virtual production is essentially the point where physical filmmaking meets a virtual environment. For years, virtual production has been used throughout various high-budget Hollywood films to shoot virtual CG scenes with physical filmmaking equipment. Today, making a virtual production on a low budget is more achievable than ever, thanks to real-time game engines and VR positional tracking. In this course, I'll be guiding you through creating a virtual camera rig and setting it up to work within Unreal Engine, finding the right assets to make your film, an introduction to the Unreal sequencer, and the process of lighting and rendering your shot. This course is aimed at a beginner level; however, it does expect you to know the absolute basics of Unreal Engine.
2. Finding Your Assets: To make your films, you'll need assets. There are many different places you can go to find these assets, but I'll show you the main places I like to find them. For me, the main place I look for assets is the Unreal Marketplace. There's a huge range of free and paid assets from various other creators in the community, from objects to characters to full environments. You can even find pre-made effects and animations. Every month, the Unreal Marketplace lists a handful of products that are 100% off, meaning if you claim them that month, you're then free to use them in your projects forever. Once you've acquired the assets on the marketplace, adding them to your scene is as simple as going to your Unreal Engine vault and clicking Add to Project. In 2019, Epic Games, the creators of Unreal Engine, partnered with Quixel Megascans to provide free photoscanned meshes for environments via the Quixel Bridge app. All you need to get started is an Epic Games account. A great place to go for finding animations is Mixamo; it's a free service for anyone with an Adobe ID and has a huge library of motion capture animations. Otherwise, you can always create your own assets and animations in software like Maya or Blender, if you know how to use them.
3. Building Your Virtual Camera: Our virtual camera is what we'll be using to take a real-world position and translate it into the virtual environment. For this, we need a method of tracking our physical rig. There are various options available for tracking the rig, but for this demonstration, we'll be using a VR system. For my current rig, I'm using an Oculus Rift S controller. The Rift S tracks the environment with an inside-out tracking method that uses the built-in cameras, accelerometers, and gyroscopes. This is great if you have a small space and are unable to set up base stations to track movements. However, I've found that inside-out methods of tracking can be unreliable if your movements are too small. When doing smaller, more stable shots, I noticed the headset tends to cut out for a few seconds and stop tracking the scene, which can be rather irritating if you're trying to get simpler shots with minimal movement and camera shake. The alternative option is to use a base-station-based VR system, such as the original Oculus Rift, the HTC Vive, or the Valve Index. Personally, I'd go with the Vive Pro or the Valve Index if you want higher-quality tracking with the newer tech. Technically speaking, for a base station system, all you need is the base stations and a controller or tracker; the headset itself is unnecessary for this setup. For the actual camera tracking, your two options are a VR controller or a standalone tracking puck. The benefit of the controller is that it gives you options for mapping buttons to various controls such as focus, focal length, or movement. However, it's more complicated to mount; tracking pucks come with a built-in thread for easy mounting. For the rig itself, you can either build it up piece by piece with accessories from a company like SmallRig, or you can buy a basic handheld or shoulder rig straight off Amazon. Building it up piece by piece will allow you to be more modular with it.
But regardless, there are a few main things you'll want to make sure it has to make your life easier: a 15-millimeter rail system to attach everything to, a set of handles to grip your rig, an external monitor so you can view the environment from all directions without having to look back at your computer monitor, and a way of attaching the tracker to the rig. If you're using a tracking puck, you can simply screw it onto a mount, whereas if you're using a controller, you'll need a way to hold it in place. For my Rift S rig, I used a SmallRig ball head clamp mount. Alternatively, you can also try 3D printing a mount to fit the exact dimensions of your rig. There are a few other bits and pieces that can help make your life easier, such as a wireless HDMI system or a shoulder pad for comfort, but you can decide if those are things you need for your projects.
4. Setting Up Your Camera in Unreal: Before setting up the camera in Unreal, you'll want to go into your plugin settings and make sure Oculus VR is disabled and SteamVR is enabled in the Virtual Reality tab. Then under the play options, click Advanced Settings and enable HMD control. Now in the Content Browser, create a new Blueprint and select Actor. Name your Blueprint and double-click to open it. Add a Cube component to your Blueprint, then add a Cine Camera and attach it to your cube; this will allow us to adjust the offset later if we need to. Right-click in your Blueprint graph and search for Get Tracked Device Position and Orientation. Then drag out the Orientation pin and search for Make Transform, and connect the Out Position to the Location. Right-click in your Blueprint again and search for Set Relative Transform for your cube. Connect the Event Tick to your Set Relative Transform. Then right-click again and add Get Relative Transform. Now add a Lerp (Transform) node and feed the Relative Transform and Make Transform return values into the Lerp. Right-click and add a Get World Delta Seconds node. Then drag the return value out and search for float times float. Feed the output into the Lerp alpha, then set your multiplier to ten. This controls the smoothing intensity for the camera: the lower the number, the smoother it will be. Now plug the Lerp return value into the New Transform. Finally, search for Get Player Controller and add an Enable Input node. Link your Event BeginPlay to the Enable Input node. Then search for Set View Target with Blend and turn off context sensitivity. Link your Enable Input to the view target and connect the Player Controller return value to the Target. Then drag off the New View Target and search for a self-reference. The final step is to identify which device in SteamVR you want as your camera.
Generally speaking, the HMD is device 0, followed by base stations and controllers in the order of connection. Compile and save, then drag your virtual camera into the scene. Hit the play button, and your virtual camera should now be tracked to your physical controller.
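The per-tick smoothing described above (Lerp alpha = World Delta Seconds × 10) can be sketched outside the engine. This is a minimal Python illustration of the same math, not Unreal API code; positions are simplified to 3D tuples, and the function and variable names are my own:

```python
def lerp_position(current, target, alpha):
    """Linearly interpolate each component of a position, clamping alpha to 0..1."""
    alpha = min(max(alpha, 0.0), 1.0)
    return tuple(c + (t - c) * alpha for c, t in zip(current, target))

def smooth_camera(current_pos, tracked_pos, delta_seconds, strength=10.0):
    """One tick of camera smoothing.

    Blueprint equivalent: alpha = Get World Delta Seconds * strength,
    fed into the Lerp node. A lower strength gives a smoother,
    but laggier, camera.
    """
    return lerp_position(current_pos, tracked_pos, delta_seconds * strength)

# Simulate the camera chasing the tracked controller at 60 fps:
pos, target = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)
for _ in range(100):
    pos = smooth_camera(pos, target, 1 / 60, strength=10.0)
# After enough ticks, the virtual camera converges on the tracked position.
```

Because the alpha scales with delta seconds, the smoothing behaves consistently across frame rates, which is why the Blueprint multiplies by World Delta Seconds rather than using a fixed alpha.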
5. Intro to Cameras: Unreal is built with two different types of cameras: the default basic camera and the Cine Camera. The default camera is limited to only being able to modify the field of view and aspect ratio, whereas the Cine Camera has a wide range of options to choose from that better resemble the settings of a real-world camera. Drag a Cine Camera onto your level, then under the Perspective tab, choose to pilot the camera. The first section to look at is the Filmback options. Here you have a drop-down list of the most common camera sensor sizes. These filmback settings only really become important when trying to match cameras for VFX or LED wall shoots, where you need your virtual scene to match perfectly with your real-world camera. However, that isn't something we will be covering in this course. For now, I'd recommend leaving your filmback on the default 16:9 Digital Film preset. However, if you'd like a more widescreen, film-like look, you can adjust the sensor width and height to achieve that. For a 2.39:1 CinemaScope aspect ratio, you can set your sensor width to 31.95 millimeters. Next, we have our lens settings. You can leave most of this as default. The main point of interest here is the diaphragm blade count. Here you can set the number of blades on your aperture, changing the shape of the out-of-focus bokeh. To adjust our focus, we can choose to either focus manually or automatically. To set it manually, drag the Manual Focus Distance slider back and forth. You can turn on the debug focus plane to get a better sense of where your focal point is. For autofocus, change your focus method to Tracking, then under the tracking settings, select the actor you wish to track. To change the zoom of your lens, play around with the Current Focal Length slider, or set the value manually if you're after a specific focal length. You can also set your aperture here. This essentially changes how much of a shot is in focus.
With a lower aperture, you'll have a shallower depth of field, leaving everything but your focal point out of focus. You also have the option to change various camera effects, such as highlight diffusion, bloom, lens flares, lens vignetting, and chromatic aberration. You also have access to other post-process and rendering features in the camera settings, but for now I recommend sticking to the basics shown here.
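The 31.95 mm figure above follows directly from the sensor geometry: to widen the default preset to 2.39:1, you keep the sensor height and scale the width. A small Python sketch of the arithmetic (the 13.365 mm height of the 16:9 Digital Film preset is my assumption here; check your own filmback panel for the exact value):

```python
def sensor_width_for_aspect(sensor_height_mm, target_aspect):
    """Sensor width needed so that width / height equals the target aspect ratio."""
    return sensor_height_mm * target_aspect

# Assumed height of Unreal's 16:9 Digital Film preset:
PRESET_HEIGHT_MM = 13.365

# Widening to a 2.39:1 CinemaScope frame:
scope_width = sensor_width_for_aspect(PRESET_HEIGHT_MM, 2.39)
# scope_width is roughly 31.94 mm, matching the ~31.95 mm quoted above.
```

The same function reproduces the preset's own 16:9 width, so you can sanity-check any custom aspect ratio before typing it into the filmback settings.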
6. Intro to Sequencer: To create a sequence, right-click in your Content Browser and, under Animation, select Level Sequence. Double-click it to open it. To add an object to the sequencer, click on the object in your viewport, then go to Track, then select the object you're adding at the top of the list. Now that it's added, we can create a transform keyframe by pressing the circle. Move the playhead down the timeline a few frames and move the object in your viewport, then set a new keyframe at that point. Each keyframe records the data for its category at the point you create it. To automatically create keyframes when properties are changed, select the diamond with the key in the toolbar. Now if you scroll a few more frames ahead and move your object, it will automatically add a keyframe. Next, I'll show how to add animations to a character in the level sequence. Add your character to the sequence. Now under Animation, you can select from all the current animations you have imported for your character. I'm using the free Unreal mannequin and Animation Starter Pack for this example. You can loop your animation by dragging out its tail. Now we can add a camera to our sequencer. Drop a Cine Camera into your scene, and in the Perspective dropdown, you can choose to pilot your camera, then start tracking your camera in the scene. You can also choose to pilot and un-pilot your camera directly in the sequencer. As with the object transform earlier, you can also keyframe and control elements of your camera, such as the focus and focal length. With this, you can create animated camera moves, zoom pulls, and focus pulls. By default, the keyframe interpolation is set to cubic, which provides gradual smoothing between keyframes. To make more direct movements, you can change this to linear by right-clicking on your keyframe.
7. Intro to Lighting: In Unreal, there are five main sources of lighting: the directional light, point light, spotlight, rectangular light, and skylight. The skylight acts as an ambient light, increasing the overall brightness of the scene. It captures distant parts of the environment and applies them as light. Alternatively, you can import a cubemap if you want to take a more manual approach. The directional light essentially emulates the sun. It projects an intense directional light onto the scene, casting harsh shadows. Combining the directional light and the skylight together will give you the basis you need for creating realistic outdoor lighting. When lighting your scene, it's a good idea to turn off the auto exposure to get a better and more consistent sense of the brightness intensity. You can do this within a post-process volume: under the exposure tab, make sure your minimum and maximum brightness are set to the same value. Then you can control the overall exposure via the exposure compensation setting. The next light we have available is the point light. This lights the scene in all directions from a single point, but is the most intensive light for performance. The next light is the spotlight. This casts light from a single point in the shape of a cone. The spotlight is a very directional light and can be good for casting focused and controlled light onto characters in your scene, or drawing the viewer's attention to an object or set piece. You can also adjust the radius of the cone to control how much you're shaping the light. The final light is the rectangle light. It emits light from a plane in a rectangular shape. You can change the width and height of the rectangle, and with ray tracing enabled, you can get some very soft shadows as a result of the larger area size. This is great for simulating large diffused light sources. You also have the option of shaping the light via the angle and length of the barn doors.
Every light also gives you the ability to adjust its color. There are three different mobility settings for the lights that can be used to optimize your scene. Static lights are the least resource heavy, but require you to bake your lighting for every change made. Stationary lights allow you to make changes in intensity and color without baking; however, shadows and bounce lighting need to be baked. Movable lights are the most resource intensive, but allow for full control without the need for baking. It's recommended that you keep these lights to a minimum if you don't have a particularly powerful PC. Combining these all together gives you full flexibility to light your scene how you want it.
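The exposure compensation setting mentioned above works in photographic stops: each stop doubles or halves the image brightness. A minimal Python sketch of that relationship, assuming the standard 2^stops behavior (the function name is my own, not an Unreal API):

```python
def apply_exposure_compensation(scene_brightness, ec_stops):
    """Scale brightness by exposure compensation measured in stops.

    +1 stop doubles brightness, -1 stop halves it, 0 leaves it unchanged.
    This assumes the conventional photographic 2^stops relationship.
    """
    return scene_brightness * (2.0 ** ec_stops)

# With min and max brightness locked to the same value (auto exposure
# disabled), exposure compensation becomes your single manual dial:
brighter = apply_exposure_compensation(1.0, 1.0)   # one stop up -> doubled
darker = apply_exposure_compensation(1.0, -1.0)    # one stop down -> halved
```

This is why small exposure compensation changes have a large visible effect: the scale is exponential, not linear.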
8. Filming with Take Recorder: Before we begin, we first need to make sure the Take Recorder plugin is enabled. Go into your plugins and scroll down to Virtual Production, then enable the Take Recorder and restart. You should now see the Take Recorder panel on the right of your screen. Next, we'll need to set up an input. Go to your project settings and navigate down to Input. Create a new Action Mapping and name it, then assign the button you want to use to start the recordings. Now open up your Blueprint and make sure the return value on your Get Player Controller is connected to your Enable Input player controller. Then right-click and search for the name of your new action event. Drag out the Pressed node and add a FlipFlop. This is our start and stop switch. Drag out the A switch and search for Open Take Recorder Panel. Then drag that out and search for Start Recording under the Take Recorder Panel subcategory, then link the return value to the target. Now drag out the B switch and add a Stop Recording node. Hit Compile and save your Blueprint. To add our virtual camera to the Take Recorder, click the camera in your scene, then in the Take Recorder panel, click on Source and add your virtual camera. Click on the virtual camera in the panel and open up the Take Recorder settings. Scroll down and disable Remove Redundant Tracks. Now scroll back up and de-select the virtual camera hierarchy. You can now go through and enable the specific properties you would like to track or change. Make sure the virtual camera spawn and transform properties are selected, then add any additional camera settings you'd like to change later. Now play your scene and press your button to start the Take Recorder. It will record the camera motion until you press the button once again. Once you're done, look for the Cinematics folder in your Content Browser and open up the take you want. Click on the padlock icon to unlock the sequencer properties.
You can now watch back your recording. To make changes to your camera settings, double-click on the subsequence in the timeline. Then you can start making changes to your shot. You can also keyframe changes like we did earlier. You can also use the Take Recorder on a pre-made sequence. Just right-click on your sequence and hit Open in Take Recorder. Then select all the properties you want to record again and start your scene. You can now adjust the focus to follow your character or set the field of view to suit the shot. If your sequence is longer than your animations, navigate back to the main scene and drag the marker to the end of your animation. The scene will now play within these boundaries.
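The FlipFlop node used for the start/stop switch above simply alternates between two outputs on every trigger. A minimal Python sketch of that toggle logic (class and event names are my own, for illustration only):

```python
class RecordToggle:
    """Mimics the Blueprint FlipFlop node: alternates A/B on each button press."""

    def __init__(self):
        self.recording = False
        self.events = []  # log of what each press triggered

    def press(self):
        if not self.recording:
            # FlipFlop "A" pin: open the panel and start recording
            self.events.append("start_recording")
        else:
            # FlipFlop "B" pin: stop recording
            self.events.append("stop_recording")
        self.recording = not self.recording

# Three presses: start, stop, then start a second take.
toggle = RecordToggle()
for _ in range(3):
    toggle.press()
```

Because the state flips on every press, a single mapped button is enough to both start and stop each take, which is what makes the one-button workflow possible while you're holding the rig.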
9. Rendering Your Scene: Before we can render our scene, we first need to make sure our virtual camera is disconnected during the render capture. Open up your Blueprint and add a Get Game Mode node. Drag that out and add a To String node. Then drag that out, type "equals equals", and select Equals (String). Now in the string input box, type MoviePipelineGameMode_0. Then add a Branch and connect it to the Event Tick. Then feed the False output into your relative transform. Compile, save, and close the Blueprint. Now when the game mode is set to the movie pipeline, our virtual camera will be disabled. Now open up the sequence you want to render, then hit the clapperboard button in the toolbar. You can choose to export an image sequence or a video sequence. An image sequence gives you the option to go back and re-render specific faulty frames. However, for this case we'll set it to an AVI video sequence. You can then set your video resolution and compression quality: the higher these are, the longer it will take to render, and the larger your file sizes will be. Finally, under the General tab, click the Advanced dropdown, and under Game Mode Override, select the movie pipeline game mode to block our VR input. Now hit Capture Movie and wait for your shot to render. Now you can load that file into your editor of choice and start piecing together your scene.
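The branch logic above boils down to one string comparison per tick: update the camera transform only when the current game mode is not the movie pipeline one. A minimal Python sketch of that guard (the mode name string is taken from the narration and may differ in your project; the function name is my own):

```python
# Game mode name during render capture, as dictated in the narration above.
# Verify the exact string in your own project before relying on it.
MOVIE_PIPELINE_MODE = "MoviePipelineGameMode_0"

def should_track_camera(current_game_mode_name):
    """Blueprint equivalent: Get Game Mode -> To String -> Equals (String) -> Branch.

    The False branch (names differ) is the one wired to Set Relative Transform,
    so tracking only runs outside of render capture.
    """
    return current_game_mode_name != MOVIE_PIPELINE_MODE

# During normal play the camera keeps tracking; during capture it is frozen:
tracking_in_play = should_track_camera("BP_MyGameMode_0")
tracking_in_render = should_track_camera(MOVIE_PIPELINE_MODE)
```

Without this guard, the per-tick tracking update would fight the recorded camera track during rendering, which is why the VR input has to be blocked while the movie pipeline game mode is active.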
10. Your Project: Now that you know the fundamentals of creating a virtual camera, finding and importing assets, navigating the sequencer, lighting, and rendering, your project is to create a short scene using the skills you've learned. The Unreal Marketplace has plenty of animations and assets to get you started. If you don't have a virtual camera just yet, try playing around with the base Cine Cameras inside Unreal. I hope the information in this beginner's course has been useful to you, and I look forward to seeing what you create.