Hello, this is Juyoung Lim of Westworld. Today we will learn about the VFX pipeline and how it is changing with real-time game engines. You will deepen your understanding of the typical VFX pipeline and learn how new content is made using real-time engines.

The diagram you see shows a typical VFX pipeline. It is divided into pre-production, production, and post-production, and each stage includes preparation steps before filming and the actual work that follows it. VFX is involved in every step, from pre-production, where filming is prepared, through the shoot itself, to post-production based on the filmed data, and we provide solutions for all of them.

Understanding the VFX pipeline

Now I will explain in detail what is done at each stage of the VFX pipeline and how the procedures break down into smaller steps.

First, the most important first step in the VFX pipeline is, as you can see, photogrammetry and scanning. In this step you scan the objects or environments you need before or during filming and save them as data. The filmed footage is captured as 2D images, and later some object or environment will be overlaid on those 2D images. Here, the props and environment on set, and information about the location, become the most important data. Without that data, you would have to guess, just by looking at the images, how the scene was filmed, what lighting was used, and what objects were present when creating the final plate. The final plate would then turn out awkward, or the balance of the scene would break.

So before filming begins, we decide what will be filmed, what will be scanned, and how it will be turned into data. Once the actual shoot starts, a database is built by 3D-scanning each item, scanning spaces with lidar, or capturing them with drones, covering every piece of data, prop, and environment used in the shoot. This data-preparation process is scanning, or photogrammetry.

Once filming is done and post-production begins, the first task is preparing the assets. An asset is nothing complicated: it means the individual parts used in a project. It could be a character, or it could be a specific object.

The first step in preparing an asset is modelling. Modelling is used to create the actual characters, environments, and props. There are many approaches: sculpting in dedicated software, connecting polygons as in hard-surface modelling, or photogrammetry based on photographs. Fundamentally, modelling is the basic step for creating a 3D asset.

Once modelling is done, the second step is rigging. You may have heard of it, and some of you may have already tried it. A simple bowl or a glass may not need rigging: when you pick one up, like this remote control, how it moves is easy to tell. It just lifts up and down, so you only need to animate how it moves along the world axes. But humans, animals, cars, helicopters, and the like have parts that move organically, and there must be a way to control that movement. All of this is handled in rigging. Rigging is essentially building a skeleton for 3D data and preparing it to move based on that skeleton. It is not just making the skeleton: once the skeleton lets the 3D data move, the mesh can tear or stretch. Controlling how much it stretches, adding muscles to create a muscular feel, or making hair and clothes follow along when the rig's skeleton moves all belong to rigging.

Once this movable asset is ready, the animation process takes place. I think many of you know about animation: it is where movement is created. As you can see, you can create the movements of dragons, or, as in Avatar, use motion capture to create character movements. Animation can use the keyframe method, posing each rig control frame by frame to create movement, or motion capture as used in virtual production, or data from facial capture can serve as a base for animation as well. Animation is the part that requires the most labor and time. Having data does not guarantee a perfect result; the director's intention or the way the content is made can exaggerate the motion or tone it down. Through this process, animation is completed.

Once the character's movement animation is ready, we can say we enter the area where VFX specializes. Match moving, or layout, becoming possible is the basis of, and the reason for, the development of film VFX and drama VFX. Match moving is a technology that creates 3D data from previously filmed 2D images; in particular, among that 3D data, it is the technology that reconstructs the camera. In movies or drama series, 2D images are captured through cameras, and objects or specific effects are added to those 2D images through the VFX process. To do this, where the camera was filming from, and the space it was filming, must be reproduced as a 3D space so that the created objects, environments, and effects can be placed exactly where you want them. This whole process is match move.
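The keyframe method described above can be sketched in a few lines: an animator sets values on a rig control at certain frames, and the in-between frames are interpolated. This is a minimal illustrative sketch (linear interpolation only, with hypothetical function and key names), not code from any actual animation package.

```python
def interpolate_keys(keys, frame):
    """Linearly interpolate a rig control's value at `frame`.

    `keys` is a sorted list of (frame, value) keyframes, as an
    animator would set them on a single rig control.
    """
    # Clamp to the first/last key outside the keyed range
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    # Find the surrounding pair of keys and blend between them
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

# Example: a control keyed at frames 0, 10, and 20
keys = [(0, 0.0), (10, 5.0), (20, 0.0)]
print(interpolate_keys(keys, 5))  # halfway between the first two keys
```

Real rigs interpolate many controls at once and use spline curves rather than straight lines, but the principle, filling frames between sparse keys, is the same.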
Match move is the process of selecting certain areas of the image plate and building a 3D camera from them. Through this process, a 3D space can be reconstructed even when the camera moves a lot, and once that 3D space exists, objects that fit into it can finally be added using the data obtained. So when working, match move artists focus more on obtaining accurate data than on artistic aspects.

And once a character has been modelled and can move after rigging, the processes of texturing and look development take place to give it realism. As you can see, characters in their plain grey 3D mesh form cannot look realistic. They need textures that react to light: creating and painting those textures is texturing, and making those textures react to lighting is look development. Look development usually aims for photorealism, creating the most realistic feel possible. Depending on the project it may instead aim for a cartoon style or a slightly faded look, but usually the work focuses on how the asset would look if it existed in real life. The most important part of this process is choosing the right rendering engine, because depending on which rendering engine is used, the look can turn out completely different.

Next is lighting. You might confuse lighting with look development or texturing, but they are different. Look development and texturing are processes applied to a single asset, while lighting is done on a given environment. If scene A has been filmed and you want to add an object into that scene, the object must be rendered under the same lighting as scene A's environment. That means if you filmed a dark scene, the object being added must also be rendered with dark lighting to make it look like it belongs in the same environment. So lighting artists use the on-set data the most: the filmed footage, the weather and environment of that day, 360-degree images from that day, and other references are used to create lighting identical to the filmed environment. But renderers cannot perfectly simulate real-world lighting, so various tricks and methods are used in the lighting process to get as close as possible.

Movies and drama series, unlike full-3D animation, add objects, environments, or effects to filmed live-action images. So there needs to be a process that separates the unnecessary parts from the most important parts in each filmed image. This is called rotoscoping. We usually say we "get a roto", and what this means is: as you can see, many objects appear in the footage, and for each of them, some may need to be deleted, some moved, or a related 3D object added in. To separate the objects, you mask each one on every frame, which is rotoscoping. Rotoscoping is in a way the basis of 2D compositing, so the final quality differs greatly depending on how the rotoscoping is done. Rotoscoping therefore also needs a lot of time and effort.

There are also element shots. Usually the characters or objects we make are structured 3D data: an object moving, the arms and legs of characters or animals moving, or a creature moving are all based on rigging and on simulated bone and muscle movement. Element shots are not like this; they are unstructured, like fire, water, or wind that varies by scene or shot. The term refers to generating elements that change continuously. They can be produced by filming, or with the simulation tools we call FX. Smoke, fire, water, wind, and the like are made in specific areas like this and added into scenes.

Lastly, I will discuss compositing. You could say that everything so far has been preparation for compositing. The 3D characters, 3D environments, FX, lighting, and look development are all, in fact, elements prepared so that 2D plates can be composited into one final render. With these elements, the final compositing is done to create the final plate. Compositing artists think about how each element will be composited, how the elements should be combined and connected to create a natural image, and how they should be blended so that they look integrated into one scene rather than separate from each other. Even if an individual asset or environment is prepared at high quality, the final quality is determined by the compositing, so compositing is a very important step.

Lastly I will talk about rendering. Rendering does not belong to a particular stage; rather than a single task, it is a term used in every step of VFX, from assets and environments to 2D compositing. Rendering refers to the whole process of simulating light and producing a final image. It could simply render a grey state to create a grey video, or render something that only has textures to show how the light reacts. This whole process is called rendering, and it requires time and resources. The resource needed for rendering may be CPU or GPU, and depending on the quality you want for each frame it can take a long time or only a short while. All of this together determines the final quality, so it will help you a lot to know how the rendering process works and what steps it involves.

It is impossible not to mention USD when talking about today's VFX pipelines, so let me briefly touch on it. USD is an open file format, or rather a file framework, created by Pixar. It is not completely accurate to call it just a file format, because USD can be extended with many functions and those extensions can be shared. Beyond saving files, you can save, share, and edit all the data of a scene. Previously, sharing and collaborating on 3D data and 3D assets was difficult because there was no standardized format and each file format supported different features. But since USD emerged, it has been treated as the industry standard: production is based on USD, and files are exchanged to fit USD, meaning that different tools, different renderers, and different software no longer matter, and content of the same quality can be created. Another reason USD is important relates to the virtual production we will talk about later. Virtual production also uses 3D data, and if that data is in USD format, it can be used in virtual production without any additional conversion. So if you study USD along with the VFX pipeline, it will help you greatly.

I would also like to tell you about management in the VFX pipeline. The processes I described can be applied to a shot in full or only in part; some shots need nothing but compositing, without any other process. But a movie has around 2,000 cuts, and a drama series eight to ten thousand, so working through them can mean tens of thousands of individual tasks, and the process of collecting, delivering, and confirming the data from each task inevitably requires a lot of labor.
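The 2D compositing described above ultimately comes down to layering elements over the plate. A minimal sketch of the standard "over" operator, assuming premultiplied-alpha RGBA pixels (the convention most compositing packages use internally); this is illustrative only, not any specific compositing tool's API:

```python
def over(fg, bg):
    """Composite a premultiplied-alpha foreground pixel over a background.

    Each pixel is (r, g, b, a) with color premultiplied by alpha.
    The over operator is: out = fg + bg * (1 - fg.alpha).
    """
    fa = fg[3]
    return tuple(f + b * (1.0 - fa) for f, b in zip(fg, bg))

# A half-transparent red element over an opaque blue plate
element = (0.5, 0.0, 0.0, 0.5)  # premultiplied: red at 50% alpha
plate = (0.0, 0.0, 1.0, 1.0)
print(over(element, plate))      # (0.5, 0.0, 0.5, 1.0)
```

A real composite applies this per pixel across whole frames, with the rotoscoped masks supplying the alpha channels that decide which parts of each layer survive.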
So management is the most important element. If VFX artists, or anyone working in the VFX industry, do not properly understand this management, they cannot form a full picture of how they should deliver and confirm the data they worked on, and how the final output will turn out. So knowing what management tool you are using, what it is composed of, and how it relates to the tools and data you work with will help you greatly.

Conditions for linking the VFX pipeline and Unreal Engine

Until now I have talked about the typical VFX pipeline; from now on I will talk about how real-time engines, that is, game engines, integrated into the VFX pipeline and what changes they brought.

The first example I will explain is using a game engine to create a VFX crowd. First, if you use a game engine, you can check the modelling, rigging, and animation processes in real time. Previously, this kind of collaborative work required each piece of data to be modelled, textured, and animated in different tools, or the data to be converted every time; but with a game engine, various data can be brought into one project, where the animation can be checked or edited in real time while making characters. Before, this collaborative work took a lot of time to check each object's animation, since it required high-resolution rendering, but with a game engine, a unified system lets you check animation quickly and easily.

Another strength that game engines have is AI. Here, AI does not refer to generative AI, the LLMs such as GPT that are famous these days, but AI in games. AI in games means specifying how NPCs move: for example, where a character should go when I throw a fist at them, or what action to perform on seeing a certain character. These general pieces of logic are all called AI. This AI can be built and edited in real time through visual scripting or source code. In the video tools used before, this kind of AI had to be tied directly to animation, which means that generating animation and effects between subjects, and their interactions, needed many steps; but with a game engine, relationships and actions between subjects are, as you are seeing, produced and checked much faster with visual scripting.

Each character made this way must be placed to fit the scene, and here we see another advantage of the game engine. The landscape data you are seeing is 3D dummy data made from an actual filmed plate. Dummy data refers to data in which a certain area is turned into rough guide 3D geometry while building the 3D camera in the match moving process. On this dummy data, we determine where the AI crowd is spawned, and then, within that area, whether the density should be higher or lower. We can also mark, with the dummy data, the areas the crowd must not occupy, so that regions where obstacles such as objects or buildings sit generate no crowd, and only the unobstructed areas do. With all of this, the AI characters and the match move data made from the original shots allow continuous simulation to be checked in real time, in a short time.

In this way, the AI characters and the placed crowd can be live-composited in the game engine. Previously, to check the composition of this data, you would have had to render the generated 3D crowd data to a 2D image, then bring that rendered image and the original plate into a compositing tool; but with a game engine, the characters are placed in real time, and how they appear over the 2D image plate can be composited live. Of course, this cannot reach final compositing quality, but checking the basic placement and depth allows easier editing and review.

Lastly, this was one of the challenges. From the images made like this in the game engine, data is then extracted for the actual compositing. But a limit that game engines still have is that they cannot perfectly separate the many passes and layers used in the original compositing pipeline. So the various passes the game engine does provide are extracted, and then the desired pass is derived from them using various color-manipulation software. As you can see, what you are seeing are the passes for the 3D crowd data.
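The spawn-area and density logic described above can be sketched as follows. This is a hypothetical stand-in for the engine-side behavior, not actual Unreal Engine code: crowd agents are scattered over a region at a given density, and any point falling inside an exclusion mask (buildings, props marked on the dummy geometry) is rejected.

```python
import random

def spawn_crowd(width, height, density, excluded):
    """Scatter crowd spawn points over a width x height region.

    `density` is the expected number of agents per unit of area, and
    `excluded` is a predicate returning True where no crowd may spawn
    (e.g. inside building footprints marked on the dummy data).
    """
    count = int(width * height * density)
    points = []
    for _ in range(count):
        x, y = random.uniform(0, width), random.uniform(0, height)
        if not excluded(x, y):  # reject points inside masked areas
            points.append((x, y))
    return points

# Example: keep the crowd out of a square "building" footprint
building = lambda x, y: 20 <= x <= 40 and 20 <= y <= 40
crowd = spawn_crowd(100, 100, 0.05, building)
```

Raising or lowering `density` per region, as the lecture describes, is just a matter of calling this with different parameters for different patches of the dummy geometry.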
461 00:22:42.679 --> 00:22:45.629 And these passes are 462 00:22:45.629 --> 00:22:50.179 used to make the final passes 463 00:22:50.179 --> 00:22:53.520 with adapted titles and compositing 464 00:22:53.520 --> 00:22:56.920 But this as well, as game engine technology developed, 465 00:22:56.920 --> 00:22:58.720 with updates, is 466 00:22:58.720 --> 00:23:02.020 expected to create functions that can 467 00:23:02.020 --> 00:23:04.320 customize and apply on all pass areas of 468 00:23:04.320 --> 00:23:06.360 general VFX pipelines 469 00:23:06.360 --> 00:23:09.110 From the passes that are made like this, the 470 00:23:09.110 --> 00:23:14.039 selected final 5 passes are used in the final compositing process 471 00:23:14.039 --> 00:23:16.074 This final compositing process 472 00:23:16.074 --> 00:23:18.039 is not done in the game engine 473 00:23:18.039 --> 00:23:21.389 but in the software that is 474 00:23:21.389 --> 00:23:23.279 most commonly used in VFX 475 00:23:23.279 --> 00:23:25.441 And the advantage of using the software 476 00:23:25.441 --> 00:23:27.741 most common in VFX is 477 00:23:27.741 --> 00:23:30.641 that it does not interrupt the structure that 478 00:23:30.641 --> 00:23:32.080 VFX pipeline originally had 479 00:23:32.080 --> 00:23:36.182 and that if only the data source that is delivered at last is modified 480 00:23:36.182 --> 00:23:39.432 the whole process of compositing can be structured 481 00:23:39.432 --> 00:23:41.682 So it uses the flexibility of pipeline 482 00:23:41.682 --> 00:23:45.119 and the advantages that the original pipeline has 483 00:23:45.119 --> 00:23:47.469 and also cover for 484 00:23:47.469 --> 00:23:50.239 its disadvantages 485 00:24:00.870 --> 00:24:03.970 The video you just saw is 486 00:24:03.970 --> 00:24:06.670 a final video that uses a crowd 487 00:24:06.670 --> 00:24:10.279 that used a game engine and a filmed image plate 488 00:24:10.279 --> 00:24:13.729 Comparing it to a crowd video we worked on before 489 00:24:13.729 
--> 00:24:17.720 it is an example that is not too far off 490 00:24:17.720 --> 00:24:21.921 and even now, we work on these crowd characters 491 00:24:21.921 --> 00:24:24.679 using two tracks 492 00:24:24.679 --> 00:24:26.479 Next, the second example is 493 00:24:26.479 --> 00:24:29.329 a case where the original pipeline 494 00:24:29.329 --> 00:24:32.879 was maintained while integrating a game engine 495 00:24:32.879 --> 00:24:34.600 into it 496 00:24:34.600 --> 00:24:38.100 The title is Specular Workflow and Unreal Engine 497 00:24:38.100 --> 00:24:39.252 It is an effort to reduce 498 00:24:39.252 --> 00:24:41.650 the gap between the original commercial renderer, 499 00:24:41.650 --> 00:24:45.065 that is, the renderer used in VFX, and 500 00:24:45.065 --> 00:24:48.411 the rendering system that 501 00:24:48.421 --> 00:24:50.000 the game engine has 502 00:24:50.000 --> 00:24:54.450 First, it can differ from 503 00:24:54.450 --> 00:24:56.900 company to company, but 504 00:24:56.900 --> 00:24:59.839 four textures are most commonly used: 505 00:24:59.839 --> 00:25:03.289 diffuse, specular, normal, and roughness 506 00:25:03.289 --> 00:25:06.689 These four texture layers are used 507 00:25:06.689 --> 00:25:09.720 to make 3D data 508 00:25:09.720 --> 00:25:12.489 What you're seeing are images of the 509 00:25:12.489 --> 00:25:15.799 four textures we used on a drone asset that we made 510 00:25:15.799 --> 00:25:18.899 and the four textures made like this 511 00:25:18.899 --> 00:25:21.949 are turned into materials using a method 512 00:25:21.949 --> 00:25:24.480 called the specular workflow 513 00:25:24.480 --> 00:25:26.530 The specular workflow works 514 00:25:26.530 --> 00:25:30.720 the complete opposite way to how the metallic workflow does 515 00:25:30.720 --> 00:25:32.470 Physically based rendering 516 00:25:32.470 --> 00:25:35.320 can be considered to use one of these two methods 517 00:25:35.320 --> 00:25:38.670 Metallic and specular differ, 518 00:25:38.670 --> 00:25:41.887 first, in how they 519 00:25:41.887 --> 00:25:43.799 express metallicity 520 00:25:43.799 --> 00:25:46.952 The metallic workflow expresses metallicity based on 521 00:25:46.952 --> 00:25:49.352 the value of a metallic map, that is, 522 00:25:49.352 --> 00:25:51.160 whether the surface actually is metallic 523 00:25:51.160 --> 00:25:52.491 while the specular workflow 524 00:25:52.491 --> 00:25:54.760 expresses metallicity according to 525 00:25:54.760 --> 00:25:58.279 the albedo (reflection value) 526 00:25:58.279 --> 00:26:01.429 The problem is that the game engine that we use 527 00:26:01.429 --> 00:26:03.839 uses the metallic workflow 528 00:26:03.839 --> 00:26:05.639 while normal commercial renderers use 529 00:26:05.639 --> 00:26:08.440 the specular workflow 530 00:26:08.440 --> 00:26:10.740 So if one brings a texture 531 00:26:10.740 --> 00:26:13.760 directly to a game engine 532 00:26:13.760 --> 00:26:17.559 the metallic would be expressed but the specular might not be 533 00:26:17.559 --> 00:26:21.109 or the specular would be expressed but the metallic might not be 534 00:26:21.109 --> 00:26:24.471 So as you can see 535 00:26:24.471 --> 00:26:26.971 to solve this problem 536 00:26:26.971 --> 00:26:29.399 a master material was created 537 00:26:29.643 --> 00:26:32.043 Master material for specular workflow 538 00:26:32.839 --> 00:26:34.989 If you took game engine courses 539 00:26:34.989 --> 00:26:36.559 you might know about materials 540 00:26:36.559 --> 00:26:38.309 but to give a brief explanation, 541 00:26:38.309 --> 00:26:40.959 it is the shading system used in game engines 542 00:26:40.959 --> 00:26:44.509 Using this shading system, you can make 543 00:26:44.509 --> 00:26:47.959 node-based shaders with various functions 544 00:26:47.959 --> 00:26:50.759 Through this material production process 545 00:26:50.759 --> 00:26:55.320 we made a master material that can convert between specular 546 00:26:55.320 --> 00:27:00.470 and metallic materials, creating the same look
without changing the textures 547 00:27:00.470 --> 00:27:05.720 but only converting the shading 548 00:27:05.720 --> 00:27:10.120 And with this master material 549 00:27:10.120 --> 00:27:14.062 we were able to implement the specular workflow 550 00:27:14.062 --> 00:27:17.359 on a game engine without texture conversion 551 00:27:17.359 --> 00:27:22.409 If we had used the game engine's default material 552 00:27:22.409 --> 00:27:24.920 the specular would not have been expressed at all 553 00:27:24.920 --> 00:27:27.820 but using the master material that we made 554 00:27:27.820 --> 00:27:29.570 allowed the specular map 555 00:27:29.570 --> 00:27:32.799 to express specular as well 556 00:27:32.799 --> 00:27:34.699 This work is 557 00:27:34.699 --> 00:27:37.849 being shared on a marketplace 558 00:27:37.849 --> 00:27:40.049 So if you are interested 559 00:27:40.049 --> 00:27:42.499 you can use the Westworld Plugin 560 00:27:42.499 --> 00:27:44.449 while taking the 561 00:27:44.449 --> 00:27:46.640 game engine courses 562 00:27:46.640 --> 00:27:52.105 This video shows how the Westworld Plugin works 563 00:28:07.890 --> 00:28:11.440 What's important here is that in the USD pipeline that we made, 564 00:28:11.440 --> 00:28:14.740 the metallic workflow used in Unreal Engine 565 00:28:14.740 --> 00:28:16.990 and the specular workflow are integrated 566 00:28:16.990 --> 00:28:20.775 allowing us to create a 567 00:28:20.775 --> 00:28:22.725 final 3D asset that has the same look 568 00:28:22.725 --> 00:28:25.025 with just a single asset in a 569 00:28:25.025 --> 00:28:26.519 game engine 570 00:28:26.519 --> 00:28:29.951 and this allows assets, environments, or creatures 571 00:28:29.951 --> 00:28:33.701 created in VFX to be used right away 572 00:28:33.701 --> 00:28:35.960 in a game engine without particular conversions 573 00:28:35.960 --> 00:28:38.060 and this in turn increases 574 00:28:38.060 --> 00:28:42.386 the number of assets that can be used 575 00:28:42.386 --> 00:28:45.599 in
a virtual project 576 00:28:45.599 --> 00:28:48.955 The third example I will introduce is 577 00:28:48.955 --> 00:28:50.719 the live virtual camera and animation production process 578 00:28:50.719 --> 00:28:54.719 This isn't anything special, but I wanted to tell you about 579 00:28:54.719 --> 00:28:57.969 real-time pre-viz, pre-visualization, and how the game engine 580 00:28:57.969 --> 00:28:59.599 is used there 581 00:28:59.599 --> 00:29:01.670 You might have heard of pre-visualization 582 00:29:01.670 --> 00:29:02.960 It is also called pre-viz 583 00:29:02.960 --> 00:29:07.660 Pre-viz is the process of creating a 2D or 3D image of 584 00:29:07.660 --> 00:29:11.159 a certain scene or sequence to show the director 585 00:29:11.159 --> 00:29:13.359 and determine the method of filming 586 00:29:13.359 --> 00:29:16.760 or the filming angles 587 00:29:16.760 --> 00:29:19.860 Before, this process used DCC tools 588 00:29:19.860 --> 00:29:22.817 like Maya or Blender 589 00:29:22.817 --> 00:29:25.317 to shorten the VFX process 590 00:29:25.317 --> 00:29:27.439 and do rough modelling, rough rigging, 591 00:29:27.439 --> 00:29:30.479 and rough animation 592 00:29:30.479 --> 00:29:33.479 But now with game engine, motion capture, 593 00:29:33.479 --> 00:29:36.229 and virtual camera technology 594 00:29:36.229 --> 00:29:38.679 it can be done in real time 595 00:29:38.679 --> 00:29:41.817 This helped to enhance 596 00:29:41.817 --> 00:29:44.159 production efficiency a lot 597 00:29:44.159 --> 00:29:47.880 and not only certain scenes 598 00:29:47.880 --> 00:29:49.430 but all sequences, 599 00:29:49.430 --> 00:29:52.430 that is, every sequence of a movie or a drama series, 600 00:29:52.430 --> 00:29:55.840 can be pre-visualized 601 00:29:55.840 --> 00:29:59.029 The video you're seeing shows 602 00:29:59.029 --> 00:30:01.000 the motion-captured movement of the actor 603 00:30:01.000 --> 00:30:04.100 and the red camera movement, which 604 00:30:04.100 --> 00:30:08.479
is made using real-time camera tracking technology 605 00:30:08.479 --> 00:30:10.529 It wasn't used in a project 606 00:30:10.529 --> 00:30:13.080 We had made it for a test 607 00:30:13.080 --> 00:30:17.330 Using the same method, under the name of real-time pre-viz, 608 00:30:17.330 --> 00:30:19.446 one can film 609 00:30:19.446 --> 00:30:21.094 and use the camera image at the bottom 610 00:30:21.094 --> 00:30:23.520 to make an actual video 611 00:30:23.520 --> 00:30:26.719 and form pre-viz data 612 00:30:26.719 --> 00:30:29.419 As you can see, if these animations had been 613 00:30:29.419 --> 00:30:31.519 filmed with real people 614 00:30:31.519 --> 00:30:33.869 and even the camera animation had been captured by hand 615 00:30:33.869 --> 00:30:37.325 the process would have taken a lot of time 616 00:30:37.325 --> 00:30:41.061 but with motion capture, the performance capture of virtual production technology, and 617 00:30:41.061 --> 00:30:43.911 camera tracking technology, it can be 618 00:30:43.911 --> 00:30:47.520 done in a short period of time 619 00:30:47.520 --> 00:30:49.285 Virtual production, especially the game engine, 620 00:30:49.285 --> 00:30:51.350 has a lot of advantages in 621 00:30:51.350 --> 00:30:53.640 this camera animation 622 00:30:53.640 --> 00:30:56.290 Before, camera animation 623 00:30:56.290 --> 00:30:59.590 would take place where there was 624 00:30:59.590 --> 00:31:01.880 no actual lighting 625 00:31:01.880 --> 00:31:04.280 so it was hard to check what feel it would give off 626 00:31:04.280 --> 00:31:06.730 and how it would film 627 00:31:06.730 --> 00:31:10.230 but in game engines, not only real-time lighting effects 628 00:31:10.230 --> 00:31:12.289 but also lens effects, actual 629 00:31:12.289 --> 00:31:14.799 physical effects, can be simulated 630 00:31:14.799 --> 00:31:18.549 and making a track or a movement itself 631 00:31:18.549 --> 00:31:20.449 can be done like assembling something 632 00:31:20.449 --> 00:31:22.340 as if in a game,
so camera animation can be 633 00:31:22.350 --> 00:31:24.919 produced much more quickly 634 00:31:24.919 --> 00:31:27.169 Not just what's behind 635 00:31:27.169 --> 00:31:30.319 but certain SF scenes 636 00:31:30.319 --> 00:31:32.181 can also be produced 637 00:31:32.181 --> 00:31:34.919 using this camera animation process 638 00:31:34.919 --> 00:31:39.619 What you're seeing now is an actual camera's 639 00:31:39.619 --> 00:31:43.159 movement flow with simulation data 640 00:31:43.159 --> 00:31:46.559 You can see here that physically impossible 641 00:31:46.559 --> 00:31:50.479 camera animation can be done in game engines 642 00:31:50.479 --> 00:31:53.329 This cannot be used in actual filming 643 00:31:53.329 --> 00:31:55.879 but once it's connected with 644 00:31:55.879 --> 00:31:58.429 full 3D characters and full 3D environments 645 00:31:58.429 --> 00:32:00.829 the scene can be generated 646 00:32:00.829 --> 00:32:03.919 using this data 647 00:32:03.919 --> 00:32:05.369 So if camera animation 648 00:32:05.369 --> 00:32:07.619 and motion capture technology are actively used, 649 00:32:07.619 --> 00:32:09.969 using this real-time pre-viz technology, 650 00:32:09.969 --> 00:32:12.119 angles or scenes that were unheard of 651 00:32:12.119 --> 00:32:14.799 can be made very quickly 652 00:32:14.799 --> 00:32:17.799 Lastly, one thing that 653 00:32:17.799 --> 00:32:21.349 was challenging was that these technological elements, 654 00:32:21.349 --> 00:32:23.499 like moving with the camera 655 00:32:23.499 --> 00:32:25.679 or capturing the camera animation, 656 00:32:25.679 --> 00:32:30.792 can be difficult for directors or cinematographers 657 00:32:30.792 --> 00:32:34.559 who have less experience with game engines 658 00:32:34.559 --> 00:32:38.303 But in this project 659 00:32:38.303 --> 00:32:41.003 we built a process that let them control the camera 660 00:32:41.003 --> 00:32:43.200 as if they were playing a game 661 00:32:43.200 --> 00:32:47.050 As
you can see, the movement, recording, and editing, 662 00:32:47.050 --> 00:32:50.400 all functions of a camera, were connected to a game console 663 00:32:50.400 --> 00:32:54.250 and the director could 664 00:32:54.250 --> 00:32:57.450 create the environment they wanted 665 00:32:57.450 --> 00:33:00.159 without specific assistance 666 00:33:00.159 --> 00:33:02.209 If this had been done 667 00:33:02.209 --> 00:33:04.320 through communication 668 00:33:04.321 --> 00:33:07.671 the director would have to tell the animator 669 00:33:07.671 --> 00:33:11.221 and the animator, after doing their part once given the order, 670 00:33:11.221 --> 00:33:15.187 would have to render the final data and get it confirmed again 671 00:33:15.187 --> 00:33:18.708 This would have taken at least a few days, and that would 672 00:33:18.708 --> 00:33:20.719 cost a lot of time and money 673 00:33:20.719 --> 00:33:23.769 But if real-time rendering is possible 674 00:33:23.769 --> 00:33:26.369 and a real-time interactive system is 675 00:33:26.369 --> 00:33:29.334 possible, one can create 676 00:33:29.354 --> 00:33:32.204 what is desired by controlling it oneself, 677 00:33:32.204 --> 00:33:36.239 which makes production efficient 678 00:33:36.239 --> 00:33:40.089 What you see now is a final pre-viz video using 679 00:33:40.089 --> 00:33:43.839 real-time pre-viz, camera tracking, 680 00:33:43.839 --> 00:33:46.239 and movement generation 681 00:33:46.239 --> 00:33:48.719 through Xbox controls 682 00:33:48.719 --> 00:33:50.619 I have talked about 683 00:33:50.619 --> 00:33:52.869 the VFX pipeline and how it is composed, 684 00:33:52.869 --> 00:33:54.657 looking at each stage 685 00:33:54.657 --> 00:33:57.207 And also how virtual production technology 686 00:33:57.207 --> 00:34:01.257 is used within the VFX pipeline 687 00:34:01.257 --> 00:34:03.880 and how it's integrated 688 00:34:03.880 --> 00:34:07.380 As you continue to learn virtual production technology and realistic content technology 689 00:34:07.380 -->
00:34:10.380 and create videos or animations 690 00:34:10.380 --> 00:34:11.730 I hope this can give you an idea 691 00:34:11.730 --> 00:34:14.441 of how you can make use of these 692 00:34:14.441 --> 00:34:15.241 Thank you 693 00:34:17.059 --> 00:34:22.509 Understanding VFX pipeline VFX pipeline 694 00:34:22.509 --> 00:34:23.809 Conditions of linking VFX pipeline and unreal engine VFX crowd production using unreal engine Character setting AI controller and character blueprint 695 00:34:23.809 --> 00:34:25.059 Crowd positioning using PCG Crowd positioning using PCG and real time composition using composure 696 00:34:25.059 --> 00:34:26.309 Specular workflow and unreal engine Westworld texture construction Metallic workflow that expresses texture according to metallic value 697 00:34:26.309 --> 00:34:27.559 Specular workflow that expresses texture according to the albedo (reflection value) Master material for specular workflow 698 00:34:27.559 --> 00:34:31.159 Live virtual camera animation production process
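[Editor's note] The metallic-versus-specular distinction discussed in the lecture can be sketched numerically. The following Python snippet shows a common textbook approximation for converting metallic-workflow textures (base color + metallic) into specular-workflow textures (diffuse + specular reflectance): diffuse fades out as a surface becomes metallic, while the reflectance takes on the base color. The 0.04 dielectric reflectance and the function name are illustrative assumptions, not the logic of Westworld's actual master material.

```python
import numpy as np

DIELECTRIC_F0 = 0.04  # reflectance commonly assumed for non-metals

def metallic_to_specular(base_color, metallic):
    """Approximate conversion from metallic-workflow textures
    (base color + metallic map) to specular-workflow textures
    (diffuse map + specular/F0 map)."""
    base = np.asarray(base_color, dtype=float)
    m = np.asarray(metallic, dtype=float)[..., None]  # broadcast over RGB
    diffuse = base * (1.0 - m)                        # metals have no diffuse
    specular = DIELECTRIC_F0 * (1.0 - m) + base * m   # metals reflect tinted
    return diffuse, specular

# Fully metallic pixel: diffuse goes to zero, specular takes the base color
d_metal, s_metal = metallic_to_specular([[1.0, 0.76, 0.33]], [1.0])
# Fully dielectric pixel: diffuse keeps the base color, specular is flat 0.04
d_plastic, s_plastic = metallic_to_specular([[0.2, 0.5, 0.8]], [0.0])
```

A master material doing this per-pixel in the shader is what lets the same texture set drive both workflows without re-exporting textures.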