WEBVTT 1 00:00:04.201 --> 00:00:09.072 Reality Advanced Media Server Disguise Utilization (1) 2 00:00:09.072 --> 00:00:11.626 GCC Academy 3 00:00:27.181 --> 00:00:29.776 Hello, I am Jiyong Park 4 00:00:29.776 --> 00:00:34.080 and I will be giving the first lecture on the utilization of Disguise today 5 00:00:34.080 --> 00:00:39.230 In this lecture, we aim to understand various technical concepts 6 00:00:39.230 --> 00:00:42.180 and workflows, such as XR, 2D, and 2.5D, 7 00:00:42.180 --> 00:00:46.156 and learn how to apply them in practice 8 00:00:46.156 --> 00:00:48.756 To this end, we will focus on 9 00:00:48.756 --> 00:00:51.279 the following four topics 10 00:00:51.279 --> 00:00:55.029 First, we will learn what each workflow, 11 00:00:55.029 --> 00:00:56.827 2D and 2.5D, along with XR, 12 00:00:56.827 --> 00:01:01.077 means and in what production environments they are used 13 00:01:01.077 --> 00:01:04.160 We will compare the advantages and limitations of each technology 14 00:01:04.160 --> 00:01:07.060 and understand which is the most suitable 15 00:01:07.060 --> 00:01:09.400 for which situation 16 00:01:09.400 --> 00:01:12.850 Next, we will learn about mapping, 17 00:01:12.850 --> 00:01:16.499 which accurately projects content onto displays or surfaces 18 00:01:16.499 --> 00:01:18.790 We will learn about 2D, 3D 19 00:01:18.790 --> 00:01:21.552 and various types of mapping methods 20 00:01:21.562 --> 00:01:23.908 Third, we will explore the components 21 00:01:23.908 --> 00:01:26.377 necessary to implement XR 22 00:01:26.377 --> 00:01:32.056 and understand the core technologies required to build an XR environment 23 00:01:32.056 --> 00:01:35.556 such as media servers, LED screens, tracking systems, 24 00:01:35.556 --> 00:01:37.519 and rendering engines 25 00:01:37.519 --> 00:01:39.919 Lastly, we will learn the process of 26 00:01:39.919 --> 00:01:42.320 setting up an actual XR environment 27 00:01:42.320 --> 00:01:45.520 We will learn practical settings for connecting and synchronizing 28 00:01:45.520 --> 00:01:48.584 various displays and tracking systems 29 00:01:48.584 --> 00:01:51.884 in various environments 30 00:01:51.884 --> 00:01:55.007 and for stable shooting and rendering 31 00:01:55.007 --> 00:01:58.160 Through these four topics, we will take the time 32 00:01:58.160 --> 00:02:01.310 to understand the latest technologies 33 00:02:01.310 --> 00:02:04.175 related to XR and learn how to use them 34 00:02:04.175 --> 00:02:05.799 to create creative and efficient content 35 00:02:06.135 --> 00:02:09.986 Understanding XR, 2D, 2.5D, and Mapping 36 00:02:10.531 --> 00:02:13.381 First, let's learn about XR 37 00:02:13.381 --> 00:02:15.531 At Disguise, XR 38 00:02:15.531 --> 00:02:19.065 is a general term for live production 39 00:02:19.065 --> 00:02:21.848 combining AR, MR, and VR elements 40 00:02:21.848 --> 00:02:24.287 to provide a fully immersive experience 41 00:02:24.287 --> 00:02:26.449 Using XR, you can provide 42 00:02:26.449 --> 00:02:29.233 an immersive virtual environment to the audience 43 00:02:29.233 --> 00:02:32.126 and convey the functions and design of the product 44 00:02:32.126 --> 00:02:34.276 more realistically 45 00:02:34.276 --> 00:02:37.186 You can offer an experience where the audience 46 00:02:37.186 --> 00:02:39.851 can directly feel the product's appeal 47 00:02:39.851 --> 00:02:41.970 beyond a simple presentation 48 00:02:41.970 --> 00:02:44.232 We've briefly looked at XR 49 00:02:44.232 --> 00:02:46.995 so now let's learn about the 2D workflow 50
00:02:46.995 --> 00:02:50.095 The 2D workflow is a method of capturing 51 00:02:50.095 --> 00:02:53.240 still images or videos, editing them, 52 00:02:53.240 --> 00:02:57.039 and using them as a background for an LED screen 53 00:02:57.039 --> 00:03:00.989 This method lacks depth and parallax effects 54 00:03:00.989 --> 00:03:04.733 but can be used effectively in certain environments 55 00:03:04.733 --> 00:03:08.475 First, the 2D workflow consists of plates 56 00:03:08.475 --> 00:03:12.380 Plates are created based on 57 00:03:12.380 --> 00:03:14.430 on-site photos or videos 58 00:03:14.430 --> 00:03:16.960 and edited using various tools 59 00:03:16.960 --> 00:03:18.662 The completed plates 60 00:03:18.662 --> 00:03:21.800 are used as backgrounds in virtual production 61 00:03:21.800 --> 00:03:25.550 This method is especially useful 62 00:03:25.550 --> 00:03:27.893 when the camera is fixed or requires 63 00:03:27.893 --> 00:03:30.593 a limited range of movement 64 00:03:30.593 --> 00:03:33.880 The main advantage of the 2D workflow is that 65 00:03:33.880 --> 00:03:37.230 it is flexible and scalable 66 00:03:37.230 --> 00:03:40.491 It can be easily applied to various shooting environments 67 00:03:40.491 --> 00:03:43.087 because it can produce high-resolution images 68 00:03:43.087 --> 00:03:46.737 using multiple camera settings 69 00:03:46.737 --> 00:03:49.831 It can also be scaled to meet a specific scene's needs 70 00:03:49.831 --> 00:03:53.039 while maintaining the background's texture and details 71 00:03:53.039 --> 00:03:55.839 This provides high usability in shooting environments 72 00:03:55.839 --> 00:03:58.929 beyond just visual elements 73 00:03:58.929 --> 00:04:01.782 However, the 2D workflow has some limitations 74 00:04:01.782 --> 00:04:03.182 and caveats 75 00:04:03.182 --> 00:04:06.679 The biggest limitation is the limited depth 76 00:04:06.679 --> 00:04:09.279 and lack of parallax effect 77 00:04:09.279 --> 00:04:12.099 meaning that the background does not follow the camera 78 00:04:12.099 --> 00:04:14.399 naturally when it moves 79 00:04:14.399 --> 00:04:16.920 This can be a limitation in scenes 80 00:04:16.920 --> 00:04:19.220 with large camera movements 81 00:04:19.220 --> 00:04:22.861 It may also not be suitable for environments requiring flexibility 82 00:04:22.861 --> 00:04:26.559 as repeated changes on set are impossible 83 00:04:26.559 --> 00:04:29.309 So, what should you pay attention to 84 00:04:29.309 --> 00:04:31.272 when using 2D plates? 
85 00:04:31.272 --> 00:04:35.200 The most important thing is the quality of the original plate 86 00:04:35.200 --> 00:04:39.557 To get good results when re-shooting in an LED volume 87 00:04:39.557 --> 00:04:41.457 it is important to maintain 88 00:04:41.457 --> 00:04:44.532 the highest possible resolution, dynamic range, and clarity 89 00:04:44.532 --> 00:04:48.232 in the original camera data and background plate 90 00:04:48.232 --> 00:04:52.386 Since the image is displayed on the LED screen in VP 91 00:04:52.386 --> 00:04:57.555 insufficient resolution or visible noise 92 00:04:57.555 --> 00:04:59.455 can hinder the realism 93 00:04:59.455 --> 00:05:03.338 Therefore, it is recommended to use a camera with 8K resolution 94 00:05:03.338 --> 00:05:06.538 or higher and to shoot at the highest possible bitrate 95 00:05:06.538 --> 00:05:10.020 in log or raw format 96 00:05:10.020 --> 00:05:14.594 In conclusion, 2D plates are a powerful tool for implementing cost-effective 97 00:05:14.594 --> 00:05:17.959 and realistic backgrounds in VP environments 98 00:05:17.959 --> 00:05:20.755 However, it is important to maximize this method's advantages 99 00:05:20.755 --> 00:05:24.601 while being aware of its clear disadvantages 100 00:05:24.601 --> 00:05:27.176 In environments where the camera is fixed 101 00:05:27.176 --> 00:05:29.576 or limited movement is required 102 00:05:29.576 --> 00:05:32.119 2D plates may be a more efficient 103 00:05:32.119 --> 00:05:35.679 and convenient choice than a 3D workflow 104 00:05:35.679 --> 00:05:38.429 The first case we introduce is 105 00:05:38.429 --> 00:05:42.208 the 2D plate in the movie Exhuma 106 00:05:42.208 --> 00:05:46.143 This footage was used as the background for the hotel window 107 00:05:46.143 --> 00:05:48.399 They filmed the night view of the city 108 00:05:48.399 --> 00:05:50.981 and installed LEDs outside the hotel set 109 00:05:50.981 --> 00:05:53.399 in the studio 110 00:05:53.399 --> 00:05:56.801 2D plates can be used effectively in this way 111 00:05:56.801 --> 00:05:59.551 with a relatively fixed camera position 112 00:05:59.551 --> 00:06:01.201 bringing about natural harmony 113 00:06:01.201 --> 00:06:04.531 between the background and the characters 114 00:06:04.531 --> 00:06:07.399 In the case of a luxury hotel like the one in Exhuma 115 00:06:07.399 --> 00:06:09.399 it is difficult to film on location 116 00:06:09.399 --> 00:06:11.222 and even if we do film on location 117 00:06:11.222 --> 00:06:14.372 it is a high-rise building where we cannot film outside the window 118 00:06:14.372 --> 00:06:16.282 so this is a very suitable solution 119 00:06:16.282 --> 00:06:18.642 for special location situations like this 120 00:06:18.642 --> 00:06:22.522 Since it provides an accurate background image according to the camera angle 121 00:06:22.522 --> 00:06:26.480 the audience can feel immersed 122 00:06:26.480 --> 00:06:28.480 as if looking out the window 123 00:06:28.480 --> 00:06:31.320 The next work we worked on was 124 00:06:31.320 --> 00:06:34.370 Netflix's D.P.
Season 2 125 00:06:34.370 --> 00:06:38.670 Here, we worked on a 2D train driving scene 126 00:06:38.670 --> 00:06:42.448 and in that scene, the actors acted on the train set 127 00:06:42.448 --> 00:06:44.248 and the background seen outside the train 128 00:06:44.248 --> 00:06:47.538 was implemented with 2D plates 129 00:06:47.538 --> 00:06:52.619 As you can see from the series, the train scene is very intense 130 00:06:52.619 --> 00:06:56.857 the route is complex, and there are many characters 131 00:06:56.857 --> 00:06:58.707 so you can see that 132 00:06:58.707 --> 00:07:01.760 2D driving plates are an excellent choice 133 00:07:01.760 --> 00:07:04.160 We were able to freely film the actors' actions 134 00:07:04.160 --> 00:07:05.633 in a safer environment 135 00:07:05.633 --> 00:07:09.880 and we were able to film various repetitive scenes without time constraints 136 00:07:09.880 --> 00:07:12.635 This train scene was planned through 137 00:07:12.635 --> 00:07:14.399 testing in advance 138 00:07:14.399 --> 00:07:18.099 We used the set and LED that Disguise had implemented 139 00:07:18.099 --> 00:07:21.409 in advance to do techvis 140 00:07:21.409 --> 00:07:23.600 Here, techvis 141 00:07:23.600 --> 00:07:26.350 is the process of planning such complex filming 142 00:07:26.350 --> 00:07:30.296 and you can think of it as a technical simulation done in advance 143 00:07:30.296 --> 00:07:32.496 by combining virtual elements 144 00:07:32.496 --> 00:07:33.929 with real objects and equipment 145 00:07:33.929 --> 00:07:36.636 Techvis is an important preparation process 146 00:07:36.636 --> 00:07:39.686 that is necessary before filming 147 00:07:39.686 --> 00:07:43.914 covering camera movement, placement, and lens selection 148 00:07:43.914 --> 00:07:48.324 The difference between a team that actually performed techvis 149 00:07:48.324 --> 00:07:51.239 and a team that did not is huge on-site 150 00:07:51.239 --> 00:07:54.881 Techvis is an essential process that minimizes on-site work time 151 00:07:54.881 --> 00:07:59.181 and prepares for costs 152 00:07:59.181 --> 00:08:02.318 and various variables 153 00:08:02.318 --> 00:08:05.235 So far, we have learned about 154 00:08:05.235 --> 00:08:08.185 2D workflow and its uses 155 00:08:08.185 --> 00:08:11.933 2D is a powerful tool for efficiently implementing 156 00:08:11.933 --> 00:08:13.683 fixed backgrounds 157 00:08:13.683 --> 00:08:16.679 and simple visual elements 158 00:08:16.679 --> 00:08:20.429 However, in cases where complex scenes, camera movements, 159 00:08:20.429 --> 00:08:24.111 and higher immersion are required, 160 00:08:24.111 --> 00:08:28.000 2D alone has limitations 161 00:08:28.000 --> 00:08:30.450 The 2.5D workflow emerged 162 00:08:30.450 --> 00:08:33.760 to solve these limitations 163 00:08:33.760 --> 00:08:38.453 The 2.5D workflow is a technology that bridges the gap 164 00:08:38.453 --> 00:08:42.753 between existing 2D video plates and fully 3D-generated scenes 165 00:08:42.753 --> 00:08:48.411 In simple terms, it implements parallax effects 166 00:08:48.411 --> 00:08:52.000 by stacking layers on images or video plates 167 00:08:52.000 --> 00:08:55.150 and adding depth and shape to them 168 00:08:55.150 --> 00:08:58.222 Through this process, more realistic scenes 169 00:08:58.222 --> 00:09:02.161 can be created quickly and efficiently 170 00:09:02.161 --> 00:09:04.811 So, what are the main advantages 171 00:09:04.811 --> 00:09:07.280 of a 2.5D workflow?
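NOTE
Editor's aside: a minimal sketch of the parallax principle that the 2.5D workflow exploits, before the advantages are listed below. This is illustrative Python, not Disguise code, and the lens, sensor, camera move, and layer depths are assumed example values.
# With a pinhole camera, a sideways move of t metres shifts a layer
# at depth z by roughly f * t / z pixels: near layers slide more.
f_mm, sensor_mm, image_w_px = 35.0, 36.0, 3840  # hypothetical lens/sensor/output
t_m = 0.5                                       # hypothetical 0.5 m camera move
f_px = f_mm / sensor_mm * image_w_px            # focal length in pixels
for name, z_m in [("foreground", 2.0), ("midground", 10.0), ("sky", 1000.0)]:
    print(f"{name:10s} at {z_m:6.0f} m shifts by {f_px * t_m / z_m:7.1f} px")
# A single flat 2D plate shifts uniformly, which is why it shows no parallax;
# stacked 2.5D layers each shift by their own depth-dependent amount.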
172 00:09:07.280 --> 00:09:10.430 First, parallax effects and depth perception 173 00:09:10.430 --> 00:09:13.921 A 2.5D workflow is ideal for scenes 174 00:09:13.921 --> 00:09:17.719 requiring small camera movements and a little depth 175 00:09:17.719 --> 00:09:20.569 It has the great advantage of 176 00:09:20.569 --> 00:09:24.672 creating realistic scenes without sacrificing photorealism 177 00:09:24.672 --> 00:09:29.498 When the camera moves, the depth of the background is naturally expressed 178 00:09:29.498 --> 00:09:33.359 allowing the audience to experience a more immersive environment 179 00:09:33.359 --> 00:09:36.839 Second, it saves time and budget 180 00:09:36.839 --> 00:09:40.924 The 2.5D workflow uses layered plates 181 00:09:40.924 --> 00:09:45.574 to create a sense of depth 182 00:09:45.574 --> 00:09:47.424 It can create immersive environments 183 00:09:47.424 --> 00:09:49.760 without the need for long pre-production times 184 00:09:49.760 --> 00:09:52.060 and save on the cost of 185 00:09:52.060 --> 00:09:55.960 3D asset production or VADs 186 00:09:55.960 --> 00:10:00.610 This allows small-scale and large-scale productions 187 00:10:00.610 --> 00:10:02.011 to enjoy the benefits of 188 00:10:02.011 --> 00:10:04.799 virtual production without budget constraints 189 00:10:04.799 --> 00:10:09.399 Third, it increases the flexibility of virtual production 190 00:10:09.399 --> 00:10:12.649 The 2.5D workflow extends the flexibility 191 00:10:12.649 --> 00:10:15.840 that virtual production already offers 192 00:10:15.840 --> 00:10:18.690 It makes it easier to create realistic environments 193 00:10:18.690 --> 00:10:22.440 by adding additional depth to the scene and creating 194 00:10:22.440 --> 00:10:27.040 intermediate points that differentiate it from the background 195 00:10:27.040 --> 00:10:29.690 In particular, it gives directors and cinematographers 196 00:10:29.690 --> 00:10:33.490 more possibilities to explore what else 197 00:10:33.490 --> 00:10:37.709 they can add to the scene within budget 198 00:10:37.709 --> 00:10:39.931 In conclusion, the 2.5D workflow 199 00:10:39.931 --> 00:10:42.631 can be seen as an efficient and flexible solution 200 00:10:42.631 --> 00:10:46.440 for creating realistic scenes and immersive environments 201 00:10:46.440 --> 00:10:49.640 within budget and time constraints 202 00:10:49.640 --> 00:10:53.040 The 2.5D workflow can be widely applied 203 00:10:53.040 --> 00:10:55.520 in a variety of production environments 204 00:10:55.520 --> 00:11:01.020 The most common example is set extension in films and TV dramas 205 00:11:01.020 --> 00:11:03.670 You can easily expand 206 00:11:03.670 --> 00:11:06.070 large-scale backgrounds or environments 207 00:11:06.070 --> 00:11:09.599 that are physically difficult to produce 208 00:11:09.599 --> 00:11:11.849 using image plates and depth layers 209 00:11:11.849 --> 00:11:17.599 This can greatly reduce production costs and time 210 00:11:17.599 --> 00:11:21.849 Now that we have learned about 2D, 2.5D, and 3D 211 00:11:21.849 --> 00:11:25.442 let's take a moment to briefly understand 212 00:11:25.442 --> 00:11:29.799 how content is mapped in Disguise 213 00:11:29.799 --> 00:11:32.549 First of all, what is Mapping?
214 00:11:32.549 --> 00:11:37.159 Mapping is an important feature used to accurately deliver 215 00:11:37.159 --> 00:11:41.509 content to a specific screen or stage surface 216 00:11:41.509 --> 00:11:44.359 You can convey the content of a single video layer 217 00:11:44.359 --> 00:11:47.640 to single or multiple screens 218 00:11:47.640 --> 00:11:51.440 or map the content to screens 219 00:11:51.440 --> 00:11:54.119 with irregular shapes and resolutions 220 00:11:54.119 --> 00:11:58.919 Mapping allows you to combine digital content and physical space 221 00:11:58.919 --> 00:12:02.269 to provide a more immersive 222 00:12:02.269 --> 00:12:04.239 and multidimensional visual experience 223 00:12:04.239 --> 00:12:06.989 Next, we will look at the various types of 224 00:12:06.989 --> 00:12:10.159 Disguise mapping and the characteristics of each type 225 00:12:10.159 --> 00:12:13.963 Mapping is largely divided into 226 00:12:13.963 --> 00:12:16.400 2D, 3D, and geometry categories 227 00:12:16.400 --> 00:12:21.750 Each category is selected based on the nature of the content and the target of Mapping 228 00:12:21.750 --> 00:12:24.250 and can be divided into static or dynamic Mapping, 229 00:12:24.250 --> 00:12:26.919 2D, and 3D 230 00:12:26.919 --> 00:12:29.619 Static Mapping is a method suitable for 231 00:12:29.619 --> 00:12:32.069 projecting fixed images or videos on a screen 232 00:12:32.069 --> 00:12:35.640 and is useful for settings with little environmental change 233 00:12:35.640 --> 00:12:38.440 Dynamic Mapping is used when mapping content 234 00:12:38.440 --> 00:12:39.740 that changes in real-time 235 00:12:39.740 --> 00:12:42.919 and is widely used in live events 236 00:12:42.919 --> 00:12:46.119 The standard for dividing 2D and 3D Mapping is that 237 00:12:46.119 --> 00:12:50.359 2D Mapping is suitable for placing content on a flat screen 238 00:12:50.359 --> 00:12:55.599 and is mainly used on rectangular displays such as LED walls 239 00:12:55.599 --> 00:12:59.299 On the other hand, 3D Mapping means 240 00:12:59.299 --> 00:13:03.119 Mapping complex-shaped surfaces, objects, or entire spaces 241 00:13:03.119 --> 00:13:06.019 Geometric Mapping means 242 00:13:06.019 --> 00:13:09.919 Mapping onto a spherical or cylindrical surface, 243 00:13:09.919 --> 00:13:13.630 as in the case of Spherical or Cylindrical Mapping 244 00:13:13.630 --> 00:13:17.580 Parallel, Perspective, and Radial Mapping 245 00:13:17.580 --> 00:13:22.305 mean Mapping by using multiple surfaces as a single canvas 246 00:13:22.305 --> 00:13:25.440 or by applying perspective to project content 247 00:13:25.440 --> 00:13:28.340 Let's look at some of the more 248 00:13:28.340 --> 00:13:30.359 common mapping methods 249 00:13:30.359 --> 00:13:33.359 The first is direct mapping 250 00:13:33.359 --> 00:13:36.919 Direct mapping is one of the static mapping methods 251 00:13:36.919 --> 00:13:41.469 and the most basic form of mapping in Disguise 252 00:13:41.469 --> 00:13:46.119 It is a mapping method that is automatically set when creating a screen 253 00:13:46.119 --> 00:13:51.219 Since the content is directly applied to the texture coordinates of the object 254 00:13:51.219 --> 00:13:53.319 the content is fixed to 255 00:13:53.319 --> 00:13:56.239 the corresponding screen even if the object moves 256 00:13:56.239 --> 00:13:59.889 When the content is initially loaded, the Disguise software 257 00:13:59.889 --> 00:14:04.389 sets the default value to Clip to Canvas 258 00:14:04.389 --> 00:14:07.479 so the aspect ratio is
fixed and displayed 259 00:14:07.479 --> 00:14:11.529 However, if the content's aspect ratio differs from the screen's 260 00:14:11.529 --> 00:14:17.209 the operator can crop, fit, expand, 261 00:14:17.209 --> 00:14:21.759 or apply a pixel-perfect fit to the content 262 00:14:21.759 --> 00:14:23.679 on a pixel-by-pixel basis 263 00:14:23.679 --> 00:14:26.079 The second is feed mapping 264 00:14:26.079 --> 00:14:30.080 Feed mapping is a very useful and flexible mapping method 265 00:14:30.080 --> 00:14:34.280 It allows you to route specific areas 266 00:14:34.280 --> 00:14:36.559 to specific screens and surfaces 267 00:14:36.559 --> 00:14:40.659 The main advantage of feed mapping is that it allows you to supply content 268 00:14:40.659 --> 00:14:45.440 to multiple screens simultaneously using a single content source 269 00:14:45.440 --> 00:14:46.990 This is especially useful for 270 00:14:46.990 --> 00:14:49.840 show programming that uses multiple screens 271 00:14:49.840 --> 00:14:53.119 and it also helps improve server performance 272 00:14:53.119 --> 00:14:55.069 by reducing the number of layers required 273 00:14:55.069 --> 00:14:58.039 to supply content to multiple screens simultaneously 274 00:14:58.039 --> 00:15:00.539 The third is parallel mapping 275 00:15:00.539 --> 00:15:02.889 Parallel mapping works by 276 00:15:02.889 --> 00:15:05.880 geometrically projecting content onto the scene 277 00:15:05.880 --> 00:15:09.280 almost as if a projector or other light source 278 00:15:09.280 --> 00:15:13.559 were casting the content virtually 279 00:15:13.559 --> 00:15:18.059 Unlike other mapping methods, parallel mapping projects the content 280 00:15:18.059 --> 00:15:22.159 onto the screen rather than directly attaching it to UVs 281 00:15:22.159 --> 00:15:24.509 This allows the content 282 00:15:24.509 --> 00:15:27.509 to dynamically update as its position 283 00:15:27.509 --> 00:15:30.280 in 3D space changes 284 00:15:30.280 --> 00:15:33.730 This means that, unlike direct or feed mapping 285 00:15:33.730 --> 00:15:37.706 the content is not directly attached to the surface 286 00:15:37.706 --> 00:15:40.356 so the content will look different 287 00:15:40.356 --> 00:15:42.906 when the viewpoint changes or objects move 288 00:15:42.906 --> 00:15:45.806 This is advantageous in environments where the physical screen arrangement 289 00:15:45.806 --> 00:15:50.299 is not fixed, such as the exterior of a car 290 00:15:50.299 --> 00:15:53.949 So far, we have looked at 2D, 3D, and 2.5D concepts 291 00:15:53.949 --> 00:15:57.200 and the various mapping methods 292 00:15:57.200 --> 00:15:59.400 The important question is 293 00:15:59.400 --> 00:16:02.750 in what situations should you choose 294 00:16:02.750 --> 00:16:05.239 between 2D, 3D, and 2.5D?
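NOTE
Editor's aside: a minimal sketch of the parallel mapping idea described above. This is illustrative Python with NumPy, not the Disguise API; the projector axes, canvas size, and vertex positions are assumed example values. Unlike direct mapping, where content is pinned to an object's own UVs, here the texture coordinates come from projecting the vertex's world position onto a virtual projection plane, so they change when the object moves.
import numpy as np
def parallel_uv(world_pos, right, up, origin, size):
    # Orthographic projection: measure the vertex along the projector's
    # right/up axes and normalise into 0..1 texture space.
    rel = world_pos - origin
    return np.array([rel @ right, rel @ up]) / size + 0.5
right = np.array([1.0, 0.0, 0.0])   # projector's horizontal axis
up = np.array([0.0, 1.0, 0.0])      # projector's vertical axis
origin, size = np.zeros(3), 4.0     # canvas centred at the origin, 4 m across
v = np.array([1.0, 0.5, 2.0])
print(parallel_uv(v, right, up, origin, size))                              # [0.75 0.625]
print(parallel_uv(v + np.array([1.0, 0.0, 0.0]), right, up, origin, size))  # UVs change as the object moves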
295 00:16:05.239 --> 00:16:09.889 Each workflow and mapping method has advantages and disadvantages 296 00:16:09.889 --> 00:16:12.689 and choosing the method that suits 297 00:16:12.689 --> 00:16:16.039 your project's purpose and requirements is very important 298 00:16:16.039 --> 00:16:17.943 2D is suitable when you need a fixed background 299 00:16:17.943 --> 00:16:21.320 or simple visual elements 300 00:16:21.320 --> 00:16:25.570 Its main advantages are that it is cost-effective and takes less time to produce 301 00:16:25.570 --> 00:16:28.880 and post-production is simpler than 3D 302 00:16:28.880 --> 00:16:30.930 However, its limitations include a lack of depth 303 00:16:30.930 --> 00:16:34.479 and limited camera movement 304 00:16:34.479 --> 00:16:38.129 Therefore, it is suitable for static or simple scenes 305 00:16:38.129 --> 00:16:41.479 but may have limitations in complex visual production 306 00:16:41.479 --> 00:16:44.279 3D is often used in large-scale projects 307 00:16:44.279 --> 00:16:46.215 such as movies and dramas, 308 00:16:46.215 --> 00:16:50.915 and it provides more immersive visual effects by integrating actual physical sets 309 00:16:50.915 --> 00:16:54.520 and digital environments in real-time 310 00:16:54.520 --> 00:16:57.820 The biggest advantage is that it can implement realistic screens 311 00:16:57.820 --> 00:17:01.679 through real-time camera tracking 312 00:17:01.679 --> 00:17:03.429 Since the digital background reacts dynamically 313 00:17:03.429 --> 00:17:05.779 according to the camera's movement 314 00:17:05.779 --> 00:17:09.129 the director and staff can immediately check 315 00:17:09.129 --> 00:17:10.800 and modify the scene 316 00:17:10.800 --> 00:17:14.200 However, this method requires higher production costs 317 00:17:14.200 --> 00:17:17.040 and more complex setup processes than 2D 318 00:17:17.040 --> 00:17:20.140 In addition, it may be difficult to apply to small-scale projects 319 00:17:20.140 --> 00:17:23.199 because it requires a high-performance hardware system 320 00:17:23.199 --> 00:17:26.449 As I mentioned, you can think of 2.5D 321 00:17:26.449 --> 00:17:29.719 as a method that sits between 2D and 3D 322 00:17:29.719 --> 00:17:31.719 It is a middle ground 323 00:17:31.719 --> 00:17:35.719 that provides cost and time efficiency while taking advantage of each approach 324 00:17:35.719 --> 00:17:39.219 You can use multiple layers to work 325 00:17:39.219 --> 00:17:42.479 in 2D while achieving some 3D effects 326 00:17:42.479 --> 00:17:46.779 If you clearly understand and utilize these advantages and disadvantages well, 327 00:17:46.779 --> 00:17:50.825 you can achieve the best results while producing efficiently 328 00:17:51.228 --> 00:17:55.119 Components and Workflow of XR 329 00:17:55.400 --> 00:17:58.400 Let's go back to XR 330 00:17:58.400 --> 00:18:02.700 As I mentioned, XR is an extended reality concept 331 00:18:02.700 --> 00:18:07.350 that includes virtual, augmented, 332 00:18:07.350 --> 00:18:09.719 and mixed reality technologies 333 00:18:09.719 --> 00:18:12.369 XR is an innovative technology 334 00:18:12.369 --> 00:18:15.519 that blurs the boundaries between virtual and real by combining 335 00:18:15.519 --> 00:18:20.239 real-time virtual environments with physical elements such as LED screens 336 00:18:20.239 --> 00:18:23.189 This technology is used in various industries 337 00:18:23.189 --> 00:18:26.689 such as music videos, live broadcasts, and film production 338 00:18:26.689 --> 00:18:28.560 providing viewers 339 00:18:28.560 --> 00:18:31.959
with amazing visual experiences 340 00:18:31.959 --> 00:18:35.059 Now, let's look at how 341 00:18:35.059 --> 00:18:37.640 the XR stage is structured 342 00:18:37.640 --> 00:18:40.959 First, let's explain the camera view 343 00:18:40.959 --> 00:18:44.109 The camera view shows how the XR environment 344 00:18:44.109 --> 00:18:47.239 is ultimately rendered 345 00:18:47.239 --> 00:18:50.889 Through this view, the director and production team 346 00:18:50.889 --> 00:18:53.389 can check the final visual results 347 00:18:53.389 --> 00:18:56.359 in real-time and immediately apply any necessary adjustments 348 00:18:56.359 --> 00:18:58.709 Therefore, the camera view plays 349 00:18:58.709 --> 00:19:00.880 an important role in production 350 00:19:00.880 --> 00:19:04.880 Next is the LED stage, the central element of the XR stage 351 00:19:04.880 --> 00:19:08.180 The LED stage is the core technology of the XR environment 352 00:19:08.180 --> 00:19:12.839 where the virtual background is physically expressed through the LED panel 353 00:19:12.839 --> 00:19:17.039 Actors and hosts can act or present naturally 354 00:19:17.039 --> 00:19:20.319 while viewing the virtual background through the LED screen 355 00:19:20.319 --> 00:19:22.319 And the outside of the LED stage 356 00:19:22.319 --> 00:19:25.160 is expressed as a virtual set extension 357 00:19:25.160 --> 00:19:27.660 This extension allows the virtual environment 358 00:19:27.660 --> 00:19:29.810 to continue beyond the LED panels 359 00:19:29.810 --> 00:19:33.280 giving the audience a sense of infinite space 360 00:19:33.280 --> 00:19:36.530 and allowing for a production that transcends 361 00:19:36.530 --> 00:19:39.319 the constraints of actual physical space 362 00:19:39.319 --> 00:19:41.079 Finally, the floor 363 00:19:41.079 --> 00:19:44.879 The floor combines physical elements and LED panels 364 00:19:44.879 --> 00:19:48.229 to help actors and presenters interact 365 00:19:48.229 --> 00:19:51.760 naturally with the XR environment 366 00:19:51.760 --> 00:19:56.210 The floor with LED panels enhances lighting and reflection effects 367 00:19:56.210 --> 00:19:59.327 to create a more realistic environment 368 00:19:59.327 --> 00:20:01.977 To implement immersive content 369 00:20:01.977 --> 00:20:04.427 and visual effects in an XR studio 370 00:20:04.427 --> 00:20:08.640 various components must be organically connected 371 00:20:08.640 --> 00:20:11.090 Here, we will look at five key elements 372 00:20:11.090 --> 00:20:13.880 essential for XR implementation 373 00:20:13.880 --> 00:20:16.719 First, the LED screen 374 00:20:16.719 --> 00:20:19.453 The LED screen is a display device 375 00:20:19.453 --> 00:20:21.019 that forms the physical foundation of XR 376 00:20:21.019 --> 00:20:24.080 and displays the virtual background in real-time 377 00:20:24.080 --> 00:20:27.440 Second, the camera tracking system 378 00:20:27.440 --> 00:20:31.244 The camera tracking system records 379 00:20:31.244 --> 00:20:34.094 the camera's position, angle, and movement data in real-time 380 00:20:34.094 --> 00:20:37.160 and transmits it to other systems 381 00:20:37.160 --> 00:20:39.959 Third, the media server 382 00:20:39.959 --> 00:20:42.509 The media server acts as the hub 383 00:20:42.509 --> 00:20:44.680 of the entire XR system 384 00:20:44.680 --> 00:20:47.360 Fourth, the Genlock Sync 385 00:20:47.360 --> 00:20:50.460 The Genlock Sync maintains frame synchronization 386 00:20:50.460 --> 00:20:53.880 between all devices and systems 387 00:20:53.880 -->
00:20:58.240 Lastly, the real-time content rendering engine 388 00:20:58.240 --> 00:21:01.940 The real-time rendering engine provides high-resolution 3D graphics 389 00:21:01.940 --> 00:21:04.879 and physically-based rendering in real-time 390 00:21:04.879 --> 00:21:09.120 and is a key technology for completing visual elements in XR 391 00:21:09.120 --> 00:21:13.470 In this way, the components of XR play their roles 392 00:21:13.470 --> 00:21:17.503 while organically connecting to create an immersive environment 393 00:21:17.503 --> 00:21:21.959 where the virtual and the real are naturally blended 394 00:21:21.959 --> 00:21:26.239 Let's take a closer look at each of these elements 395 00:21:26.239 --> 00:21:29.139 One of the most important components of an XR stage 396 00:21:29.139 --> 00:21:31.839 is the LED screen 397 00:21:31.839 --> 00:21:35.480 The LED screen physically implements the virtual environment 398 00:21:35.480 --> 00:21:37.430 allowing actors and props to interact 399 00:21:37.430 --> 00:21:40.559 with the virtual background in real-time 400 00:21:40.559 --> 00:21:44.509 The core of an XR stage is a structure that includes walls, 401 00:21:44.509 --> 00:21:48.400 floors, and sometimes even ceilings made of LED panels 402 00:21:48.400 --> 00:21:50.400 This setup is very important 403 00:21:50.400 --> 00:21:54.000 because the resolution and LED pixel pitch 404 00:21:54.000 --> 00:21:56.440 directly affect the visual quality 405 00:21:56.440 --> 00:21:59.559 and clarity of the digital content 406 00:21:59.559 --> 00:22:02.440 In XR, every detail matters 407 00:22:02.440 --> 00:22:04.990 The pixel density of these LED panels 408 00:22:04.990 --> 00:22:07.440 ensures that the virtual environment looks smooth 409 00:22:07.440 --> 00:22:11.600 and realistic even when shot up close with a camera 410 00:22:11.600 --> 00:22:14.400 LED screens can be installed on walls, floors, 411 00:22:14.400 --> 00:22:17.240 and even in dome shapes, 412 00:22:17.240 --> 00:22:20.839 allowing for flexible background implementation in any environment 413 00:22:20.839 --> 00:22:24.360 In particular, floor LEDs are the most important element of XR 414 00:22:24.360 --> 00:22:27.320 and they are essential 415 00:22:27.320 --> 00:22:30.020 for XR to become Extended Reality 416 00:22:30.020 --> 00:22:32.919 or spatial expansion, as its name suggests 417 00:22:32.919 --> 00:22:35.769 The next core component of XR that we will introduce is 418 00:22:35.769 --> 00:22:38.039 the camera tracking system 419 00:22:38.039 --> 00:22:41.539 This technology is essential for 420 00:22:41.539 --> 00:22:45.080 synchronizing the virtual background with the actual camera movement in an XR environment 421 00:22:45.080 --> 00:22:47.530 The camera tracking system 422 00:22:47.530 --> 00:22:50.440 tracks the position, angle, and movement of the camera in real-time 423 00:22:50.440 --> 00:22:52.940 allowing the virtual background to 424 00:22:52.940 --> 00:22:55.119 automatically adjust to the camera’s viewpoint 425 00:22:55.119 --> 00:22:58.519 This makes the viewer feel like the camera 426 00:22:58.519 --> 00:23:02.000 is moving in real space 427 00:23:02.000 --> 00:23:05.479 As the camera moves, the virtual environment displayed on the LED screen 428 00:23:05.479 --> 00:23:09.399 also changes accurately according to the camera’s viewpoint 429 00:23:09.399 --> 00:23:12.161 This makes it appear that the virtual background and the real person 430 00:23:12.161 --> 00:23:15.639 are in a consistent space 431 00:23:15.639 -->
00:23:17.806 So, what technologies are used 432 00:23:17.806 --> 00:23:19.880 in these camera tracking systems? 433 00:23:19.880 --> 00:23:23.000 Representative camera tracking systems include 434 00:23:23.000 --> 00:23:26.200 stYpe, Ncam, 435 00:23:26.200 --> 00:23:28.839 and the OptiTrack mentioned earlier 436 00:23:28.839 --> 00:23:31.889 These devices precisely track the position and movement 437 00:23:31.889 --> 00:23:35.880 of the camera using high-precision sensors 438 00:23:35.880 --> 00:23:39.730 The camera tracking data is combined with Disguise software 439 00:23:39.730 --> 00:23:42.919 and reflected in the virtual environment in real-time 440 00:23:42.919 --> 00:23:45.519 It is necessary to add and activate a driver 441 00:23:45.519 --> 00:23:48.320 that matches the tracking system in Disguise 442 00:23:48.320 --> 00:23:50.070 create a position receiver 443 00:23:50.070 --> 00:23:52.960 and connect it to the tracking system 444 00:23:52.960 --> 00:23:57.210 Camera tracking systems are essential for all types of XR production, 445 00:23:57.210 --> 00:23:59.960 such as movies, advertisements, and live events 446 00:23:59.960 --> 00:24:03.260 They are a technology that perfectly merges 447 00:24:03.260 --> 00:24:05.399 the virtual and real in the XR environment 448 00:24:05.399 --> 00:24:09.240 Next, we will learn more about the real-time rendering engine 449 00:24:09.240 --> 00:24:12.600 that plays a key role in the Disguise XR environment 450 00:24:12.600 --> 00:24:16.550 The rendering engine is the core technology 451 00:24:16.550 --> 00:24:20.480 that determines the quality of XR content and real-time responsiveness 452 00:24:20.480 --> 00:24:24.530 A real-time rendering engine is a program like Unreal Engine or Unity 453 00:24:24.530 --> 00:24:28.640 that creates and renders 3D content in real-time 454 00:24:28.640 --> 00:24:31.240 and provides immediate visual feedback 455 00:24:31.240 --> 00:24:34.480 on LED screens and virtual environments 456 00:24:34.480 --> 00:24:37.320 Epic Games' Unreal Engine is a prime example 457 00:24:37.320 --> 00:24:41.220 It allows for a high level of realism 458 00:24:41.220 --> 00:24:44.600 and dynamic environment control 459 00:24:44.600 --> 00:24:47.500 You can build environments composed of 3D assets 460 00:24:47.500 --> 00:24:50.239 and use a studio space as a multi-purpose space 461 00:24:50.239 --> 00:24:52.439 to continuously create 462 00:24:52.439 --> 00:24:55.200 new content in the same space 463 00:24:55.200 --> 00:24:59.800 You can render scenes as the camera sees them 464 00:24:59.800 --> 00:25:02.920 maximize realism 465 00:25:02.920 --> 00:25:05.970 and create perspectives for the scenes 466 00:25:05.970 --> 00:25:08.480 to provide deeper and more vivid screens 467 00:25:08.480 --> 00:25:11.930 The real-time rendering engine also supports 468 00:25:11.930 --> 00:25:14.598 integrated workflows with 469 00:25:14.598 --> 00:25:17.000 Notch, Unity, and Unreal tools 470 00:25:17.000 --> 00:25:19.900 The camera can move freely in the 3D environment 471 00:25:19.900 --> 00:25:22.387 and is not tied to fixed shots 472 00:25:22.387 --> 00:25:24.279 or limited camera positions 473 00:25:24.279 --> 00:25:27.200 Let’s take a look at the XR workflow 474 00:25:27.200 --> 00:25:32.079 Networking is a key element in any XR production 475 00:25:32.079 --> 00:25:35.191 After all, the data flow between the server and the rendering engine 476 00:25:35.191 --> 00:25:38.920 heavily depends on the network infrastructure 477 00:25:38.920 --> 00:25:42.386 The
network must handle the bandwidth 478 00:25:42.386 --> 00:25:44.799 of all the generated and transmitted data 479 00:25:44.799 --> 00:25:46.699 So, what types of networks do we need 480 00:25:46.699 --> 00:25:48.679 in an XR system? 481 00:25:48.679 --> 00:25:51.279 To build an effective XR system 482 00:25:51.279 --> 00:25:54.160 we need the following networks 483 00:25:54.160 --> 00:25:58.722 First is the media server network, d3Net 484 00:25:58.722 --> 00:26:02.822 d3Net handles communication between all the servers 485 00:26:02.822 --> 00:26:04.720 in the XR system 486 00:26:04.720 --> 00:26:09.720 The media server supports fast and reliable data transmission 487 00:26:09.720 --> 00:26:13.480 Next is the metadata and video Ethernet transmission 488 00:26:13.480 --> 00:26:15.040 This is called RenderStream 489 00:26:15.040 --> 00:26:18.490 which is the way to communicate with third-party render engines, 490 00:26:18.490 --> 00:26:24.160 such as Unreal Engine 491 00:26:24.160 --> 00:26:27.710 This allows us to transmit metadata 492 00:26:27.710 --> 00:26:30.600 and video data between the VX and RX 493 00:26:30.600 --> 00:26:32.900 For high-quality video streams 494 00:26:32.900 --> 00:26:34.839 we need to transmit uncompressed data 495 00:26:34.839 --> 00:26:37.589 and support 10-bit transmission 496 00:26:37.589 --> 00:26:40.000 to maintain color accuracy and detail 497 00:26:40.000 --> 00:26:42.550 We also need to utilize a 25Gbps network 498 00:26:42.550 --> 00:26:45.440 to quickly transmit large amounts of data 499 00:26:45.440 --> 00:26:49.079 The system must also track camera movement data 500 00:26:49.079 --> 00:26:51.179 through automation and camera tracking networks 501 00:26:51.179 --> 00:26:54.480 to provide real-time synchronization 502 00:26:54.480 --> 00:26:56.430 Lastly, the content management network 503 00:26:56.430 --> 00:26:58.680 provides the connectivity needed 504 00:26:58.680 --> 00:27:01.079 to upload, download, and manage content 505 00:27:01.079 --> 00:27:04.629 It is used to retrieve media files 506 00:27:04.629 --> 00:27:07.320 from external sources for downloading 507 00:27:07.320 --> 00:27:11.559 Let’s look at the multiple networks required for an XR system 508 00:27:11.559 --> 00:27:13.709 Looking at the VX series 509 00:27:13.709 --> 00:27:16.920 the system is equipped with multiple NICs 510 00:27:16.920 --> 00:27:22.200 These NICs enable a variety of bandwidth-intensive activities 511 00:27:22.200 --> 00:27:28.650 The VX4+ has one 1Gb port, two 10Gb ports 512 00:27:28.650 --> 00:27:30.819 and two 100Gb ports 513 00:27:30.819 --> 00:27:33.069 and this configuration supports real-time rendering 514 00:27:33.069 --> 00:27:35.679 and high-speed data transfer 515 00:27:35.679 --> 00:27:39.114 By separating the various data types, 516 00:27:39.114 --> 00:27:42.799 such as communication, generation, content streaming, or camera tracking 517 00:27:42.799 --> 00:27:46.160 each channel can operate efficiently without congestion 518 00:27:46.160 --> 00:27:48.839 optimizing system performance 519 00:27:48.839 --> 00:27:52.189 So, let’s organize the system network configuration 520 00:27:52.189 --> 00:27:55.000 diagram based on the configurations above 521 00:27:55.000 --> 00:27:57.200 First, the Disguise equipment 522 00:27:57.200 --> 00:28:00.559 has the VX and RX systems in the center 523 00:28:00.559 --> 00:28:02.970 The VX is connected to the LED processor 524 00:28:02.970 --> 00:28:05.239 to output high-quality content 525 00:28:05.239 --> 00:28:08.489 and the RX receives
content from a rendering engine 526 00:28:08.489 --> 00:28:12.640 such as Unreal Engine and transmits it to the VX 527 00:28:12.640 --> 00:28:16.390 This is connected via d3Net 528 00:28:16.390 --> 00:28:19.640 a dedicated network between Disguise equipment 529 00:28:19.640 --> 00:28:24.000 On the left are the video cameras and the tracking system 530 00:28:24.000 --> 00:28:26.950 Each camera captures real-time data 531 00:28:26.950 --> 00:28:30.040 and the tracking system transmits the camera’s position 532 00:28:30.040 --> 00:28:33.359 and movement data to the VX4 based on this data 533 00:28:33.359 --> 00:28:35.659 The entire system is also perfectly synchronized 534 00:28:35.659 --> 00:28:38.679 via the Genlock system 535 00:28:38.679 --> 00:28:40.679 Next is the network 536 00:28:40.679 --> 00:28:44.979 The 10Gb network switch and the 25Gb Mellanox switch 537 00:28:44.979 --> 00:28:47.119 support high-capacity transmission 538 00:28:47.119 --> 00:28:50.520 to process the RenderStream and KVM signals 539 00:28:50.520 --> 00:28:53.070 On the right are the LED processor, the display 540 00:28:53.070 --> 00:28:55.200 and output-related parts 541 00:28:55.200 --> 00:28:59.100 This processor receives the data transmitted from the VX4 542 00:28:59.100 --> 00:29:03.039 and accurately displays the content on the LED screen 543 00:29:03.039 --> 00:29:06.080 We’ve covered the network 544 00:29:06.080 --> 00:29:11.039 but a good system starts with a solid plan 545 00:29:11.039 --> 00:29:13.440 As with any VP shoot 546 00:29:13.440 --> 00:29:16.590 the system administrator 547 00:29:16.590 --> 00:29:18.520 plays an important role in an XR production 548 00:29:18.520 --> 00:29:23.200 They are responsible for designing and planning the overall system 549 00:29:23.200 --> 00:29:26.200 configuring the individual networks, and ensuring 550 00:29:26.200 --> 00:29:29.239 smooth communication between the various devices 551 00:29:29.239 --> 00:29:32.089 We’ve covered some of the important 552 00:29:32.089 --> 00:29:34.719 individual components of XR so far 553 00:29:34.719 --> 00:29:36.969 This time, I’ll explain 554 00:29:36.969 --> 00:29:39.735 how these elements are organically 555 00:29:39.735 --> 00:29:41.960 linked in the workflow 556 00:29:41.960 --> 00:29:46.760 Everything starts in the Disguise Extended Reality Controller 557 00:29:46.760 --> 00:29:48.960 Disguise Designer software 558 00:29:48.960 --> 00:29:51.060 sets up the exact virtual environment 559 00:29:51.060 --> 00:29:54.039 through camera and lens calibration 560 00:29:54.039 --> 00:29:56.940 spatial and color mapping, etc.
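NOTE
Editor's aside: a back-of-envelope sketch of why the uncompressed 10-bit RenderStream traffic mentioned earlier calls for 25Gb links. Illustrative Python; the resolution, frame rate, and sampling figures are assumed example values, not a Disguise specification.
def stream_gbps(w, h, fps, bits_per_sample, samples_per_px):
    # Raw video bandwidth: pixels per second times bits per pixel.
    return w * h * fps * bits_per_sample * samples_per_px / 1e9
print(stream_gbps(3840, 2160, 60, 10, 3))  # 4K 60fps 10-bit RGB: ~14.9 Gbps, beyond a 10Gb link
print(stream_gbps(3840, 2160, 60, 10, 2))  # the same frame as 10-bit 4:2:2: ~10.0 Gbps, saturates a 10Gb link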
561 00:29:56.940 --> 00:29:59.950 This includes the Disguise 3D simulation 562 00:29:59.950 --> 00:30:02.900 and video mapping server 563 00:30:02.900 --> 00:30:05.340 so that each element is precisely synchronized 564 00:30:05.340 --> 00:30:08.859 The second step is real-time graphics generation 565 00:30:08.859 --> 00:30:11.007 This process uses 566 00:30:11.007 --> 00:30:14.780 a rendering engine like Unreal Engine 567 00:30:14.780 --> 00:30:17.739 and the virtual environment is output to the LED screen 568 00:30:17.739 --> 00:30:20.789 The RenderStream plugin creates 569 00:30:20.789 --> 00:30:23.260 the AR content in real time 570 00:30:23.260 --> 00:30:26.010 and this data is rendered in real time 571 00:30:26.010 --> 00:30:28.460 along with the camera tracking data 572 00:30:28.460 --> 00:30:33.539 All of this content is then fed to the LED wall 573 00:30:33.539 --> 00:30:37.299 This is where the LEDs play a key role 574 00:30:37.299 --> 00:30:39.799 The lighting control system 575 00:30:39.799 --> 00:30:44.340 also synchronizes the light from the LED screen with the physical lighting 576 00:30:44.340 --> 00:30:47.219 to create a more immersive environment 577 00:30:47.219 --> 00:30:51.020 The final step is the Vision Mixer 578 00:30:51.020 --> 00:30:55.380 This is where the final image composition takes place 579 00:30:55.380 --> 00:30:58.380 The AR content and lens data are combined 580 00:30:58.380 --> 00:31:01.230 to create a perfectly integrated screen with a real camera 581 00:31:01.230 --> 00:31:03.419 and virtual background 582 00:31:03.419 --> 00:31:07.119 This result is ready to be broadcast or streamed 583 00:31:07.119 --> 00:31:08.900 directly to streaming platforms 584 00:31:08.900 --> 00:31:11.550 Overall, the Extended Reality Workflow 585 00:31:11.550 --> 00:31:14.940 integrates all XR components in real-time 586 00:31:14.940 --> 00:31:17.640 streamlining the process from pre-production through real-time lighting 587 00:31:17.640 --> 00:31:19.340 to post-production 588 00:31:19.340 --> 00:31:23.340 to ensure high quality while saving time and money 589 00:31:23.340 --> 00:31:25.390 Let’s examine the core elements of 3D stage calibration 590 00:31:25.390 --> 00:31:29.299 in the Disguise XR workflow 591 00:31:29.299 --> 00:31:32.399 Accurate calibration ensures that 592 00:31:32.399 --> 00:31:35.380 the real and virtual worlds are aligned seamlessly 593 00:31:35.380 --> 00:31:37.979 enabling immersive productions 594 00:31:37.979 --> 00:31:40.179 There are two main types of calibration 595 00:31:40.179 --> 00:31:43.700 Spatial calibration and lens calibration 596 00:31:43.700 --> 00:31:47.219 Spatial calibration involves World Alignment 597 00:31:47.219 --> 00:31:48.892 which is 598 00:31:48.892 --> 00:31:52.260 the process of accurately aligning the real world 599 00:31:52.260 --> 00:31:57.059 such as the actual stage or camera position, with the Unreal virtual world 600 00:31:57.059 --> 00:31:59.609 It accurately interprets tracking data 601 00:31:59.609 --> 00:32:02.500 to match the coordinate systems 602 00:32:02.500 --> 00:32:06.350 so that the camera's movement in physical space 603 00:32:06.350 --> 00:32:08.020 is reflected in the Unreal virtual environment 604 00:32:08.020 --> 00:32:11.270 It adjusts the virtual content to blend naturally 605 00:32:11.270 --> 00:32:14.140 with the real environment without distortion 606 00:32:14.140 --> 00:32:16.590 In other words, spatial calibration 607 00:32:16.590 --> 00:32:20.260 is the foundation for aligning the real and virtual
spaces 608 00:32:20.260 --> 00:32:22.860 Lens calibration profiles the optical characteristics 609 00:32:22.860 --> 00:32:25.539 of a real camera lens 610 00:32:25.539 --> 00:32:28.260 to compute a unique profile for the lens 611 00:32:28.260 --> 00:32:32.619 This profile is essential for matching the distortion 612 00:32:32.619 --> 00:32:35.669 and field of view of the rendered virtual extended set 613 00:32:35.669 --> 00:32:38.059 with the real camera lens 614 00:32:38.059 --> 00:32:42.099 The virtual 3D models inside Disguise must be 100% accurate 615 00:32:42.099 --> 00:32:44.749 as even a small error can break the illusion 616 00:32:44.749 --> 00:32:47.820 and disrupt the audience’s immersion 617 00:32:47.820 --> 00:32:50.220 if the physical stage 618 00:32:50.220 --> 00:32:52.500 and virtual model do not match 619 00:32:52.500 --> 00:32:55.250 Finally, there are countless factors to consider 620 00:32:55.250 --> 00:32:57.700 when designing an actual XR system 621 00:32:57.700 --> 00:33:01.020 but here are a few examples to keep in mind 622 00:33:01.020 --> 00:33:04.500 If you use different types of LED processors 623 00:33:04.500 --> 00:33:07.850 each LED processor may have different color mapping, gamma correction 624 00:33:07.850 --> 00:33:10.739 and brightness levels 625 00:33:10.739 --> 00:33:12.639 which increases the likelihood of color differences 626 00:33:12.639 --> 00:33:16.820 and each processor requires its own settings and calibration 627 00:33:16.820 --> 00:33:20.500 This can ultimately lead to longer installation and setup times 628 00:33:20.500 --> 00:33:23.179 If the refresh rate is not high enough 629 00:33:23.179 --> 00:33:26.780 visible flicker can occur on the screen 630 00:33:26.780 --> 00:33:29.619 If the signal formats do not match, 631 00:33:29.619 --> 00:33:32.018 that is, when devices using different video signal formats 632 00:33:32.018 --> 00:33:34.140 are connected, 633 00:33:34.140 --> 00:33:35.890 the signal may not be recognized 634 00:33:35.890 --> 00:33:38.480 or transmitted 635 00:33:38.480 --> 00:33:41.340 If the signal formats between devices do not match 636 00:33:41.340 --> 00:33:45.059 downscaling or upscaling may occur 637 00:33:45.059 --> 00:33:49.099 which may cause image quality degradation and color distortion 638 00:33:49.099 --> 00:33:52.140 If the signal formats do not match, signal conversion is required 639 00:33:52.140 --> 00:33:55.900 which may cause additional latency 640 00:33:55.900 --> 00:34:00.299 In addition, audio and video may be out of sync 641 00:34:00.299 --> 00:34:03.419 I hope you will continue to explore with curiosity 642 00:34:03.419 --> 00:34:06.319 and gain practical experience 643 00:34:06.319 --> 00:34:08.780 and knowledge necessary for the field 644 00:34:08.780 --> 00:34:12.580 In this first lecture on using Disguise 645 00:34:12.580 --> 00:34:16.630 we aimed to understand 646 00:34:16.630 --> 00:34:19.486 various technical concepts and workflows such as XR, 2D, 647 00:34:19.486 --> 00:34:22.020 and 2.5D and learn how to apply them in practice 648 00:34:22.020 --> 00:34:25.419 I hope this was a time for you to understand the latest technologies in XR 649 00:34:25.419 --> 00:34:27.700 and learn more creative 650 00:34:27.700 --> 00:34:29.350 and efficient ways to create content 651 00:34:29.350 --> 00:34:31.580 for your works 652 00:34:31.580 --> 00:34:33.880 Then, let's finish by reviewing 653 00:34:33.880 --> 00:34:35.191 what we learned this time 654 00:34:35.191 --> 00:34:35.900 Thank you 655 00:34:36.251 -->
00:34:37.251 Understanding XR, 2D, 2.5D, and Mapping XR (Extended Reality) 656 00:34:37.251 --> 00:34:38.201 A general term for live production that combines AR (Augmented Reality), MR (Mixed Reality), and VR (Virtual Reality) elements to provide a fully immersive experience 657 00:34:38.201 --> 00:34:39.201 2D Workflow Capture and edit still images or video to use as LED screen backgrounds 658 00:34:39.201 --> 00:34:40.201 Advantages and limitations of 2D Workflow Excellent flexibility and scalability Scaling to suit the needs of a specific scene while maintaining the texture and detail of the background 659 00:34:40.201 --> 00:34:41.230 Limited depth, no parallax effect Impossible to make repeated changes on set 660 00:34:41.230 --> 00:34:42.071 2.5D Workflow A method of implementing the Parallax Effect by stacking layers of image or video plates to add depth and form 661 00:34:42.071 --> 00:34:42.930 Advantages of 2.5D Workflow Provides Parallax Effect and Depth Saves time and budget Expands flexibility of virtual production 662 00:34:42.930 --> 00:34:43.730 Mapping An important feature used in the process of accurately delivering content to specific screens and stage surfaces 663 00:34:43.730 --> 00:34:44.480 Representative mapping methods Direct Mapping: One of the Static Mappings, Disguise's basic mapping method that is automatically set when creating a screen 664 00:34:44.480 --> 00:34:45.330 Feed Mapping: A useful and flexible method that allows routing specific areas to specific screens and surfaces 665 00:34:45.330 --> 00:34:46.220 Parallel Mapping: A method that geometrically projects content onto a scene, acting as if the content were being projected virtually 666 00:34:46.220 --> 00:34:47.170 Components and Workflow of XR Components of XR Camera View: Shows the final rendering result of the XR environment 667 00:34:47.170 --> 00:34:48.120 LED Stage: Core technology of the XR environment, where the virtual background is physically represented through LED panels 668 00:34:48.120 --> 00:34:49.120 Floor: Combining physical elements and LED panels to help actors and presenters interact naturally with the XR environment 669 00:34:49.120 --> 00:34:50.120 XR Implementation Essentials LED Screens (Stage) Camera Tracking System 670 00:34:50.120 --> 00:34:51.140 Media Server (Disguise) Genlock Sync Real-Time Content Rendering Engine 671 00:34:51.140 --> 00:34:52.740 Network required for building an XR system Media server network (d3Net) Metadata and video Ethernet transmission (RenderStream) Content management network 672 00:34:52.740 --> 00:34:54.390 Types of 3D stage calibration Spatial calibration: Spatial calibration including World Alignment Lens calibration: Profiling the optical characteristics of real camera lenses and calculating the lens-specific profile 673 00:34:54.390 --> 00:34:56.030 XR Workflow Extended Reality Controller Real-time graphic generation LED Wall (Stage) Vision Mixer
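NOTE
Editor's appendix: a minimal sketch of the idea behind the lens calibration described above, using generic OpenCV rather than Disguise's own calibration tool. The intrinsics, distortion coefficients, and board poses are assumed example values: observe a known 3D pattern through the lens, then solve for the intrinsics and distortion profile that best explain where its points landed in the image.
import numpy as np
import cv2
# A synthetic 9x6 planar "chessboard" of 3D points on the z=0 plane
objp = np.zeros((9 * 6, 3), np.float32)
objp[:, :2] = np.mgrid[0:9, 0:6].T.reshape(-1, 2)
# Ground-truth camera, used only to synthesise the observations
K_true = np.array([[800.0, 0.0, 640.0], [0.0, 800.0, 360.0], [0.0, 0.0, 1.0]])
dist_true = np.array([-0.2, 0.05, 0.0, 0.0, 0.0])  # mild barrel distortion
views = [  # three hypothetical board poses (rotation vector, translation)
    (np.array([0.2, 0.0, 0.0]), np.array([-4.0, -2.5, 9.0])),
    (np.array([0.0, 0.3, 0.0]), np.array([-4.5, -2.0, 10.0])),
    (np.array([0.15, -0.2, 0.1]), np.array([-3.5, -3.0, 11.0])),
]
obj_pts, img_pts = [], []
for rvec, tvec in views:
    proj, _ = cv2.projectPoints(objp, rvec, tvec, K_true, dist_true)
    obj_pts.append(objp)
    img_pts.append(proj.astype(np.float32))
# Recover the lens profile from the observations alone
rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, (1280, 720), None, None)
print("reprojection RMS:", rms)                   # near zero on noise-free data
print("recovered distortion:", dist.ravel()[:2])  # close to -0.2 and 0.05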