Realistic Advanced Concept for ICVFX System Configuration
GCC Academy

Hello everyone, I am Park Ji-yong, in charge of the integrated ICVFX workflow. This lecture on the integrated ICVFX workflow is divided into four sessions, covering the overall technology for implementing in-camera VFX using a real-time game engine. It includes fundamental concepts for configuring an ICVFX system, collaboration and version control for practical use, multi-display setup using LED displays, and real-time collaboration and control settings. I encourage you to build on the concepts learned in this lecture.

Understanding VP

Before discussing virtual production, let's briefly review traditional filming methods. The first is on-location shooting, where filming takes place at real-world locations, ranging from urban settings to remote natural environments. This approach provides diverse background options; for instance, films and dramas set in modern times are often shot in city locations. While it allows for realistic imagery, it is highly affected by external factors such as weather and time of day, making the shooting environment difficult to control.

The second method is open-set shooting, where backgrounds and props are physically built for filming. This approach is commonly used in films and dramas, especially for period pieces and fantasy scenes.

Lastly, there is chroma-key shooting, which involves filming in front of a green screen and replacing the background in post-production. This technique offers great flexibility in creating various locations and settings in post-processing. However, it depends heavily on post-production, making the workflow complex and potentially increasing costs.

Now, what exactly is virtual production? Many might think it refers only to ICVFX using LED walls, but virtual production is an umbrella term covering various new aspects of media and entertainment. VP can refer to any of the following techniques or a combination of them. Let's briefly go over each one.

First is the previously mentioned ICVFX. This method displays a virtual background on an LED wall so that it is captured directly in-camera.

Second is motion capture, or mocap. Mocap captures the real-time movements of actors or objects and applies them to virtual characters or objects.

Third is previsualization, or previs. This visualizes scenes in 3D simulation before shooting, allowing pre-planning of composition, movement, and lighting. By simulating in advance, it helps detect and resolve unexpected issues before filming, making it highly efficient.

Next is VR. VR is useful for exploring virtual sets or simulating shooting plans in advance. A director can wear a VR headset to navigate the set and adjust plans beforehand, a process called virtual scouting.

Next is MR. MR provides a mixed environment of real and virtual elements, allowing actors to interact with virtual objects in real time. Its key feature is the seamless integration of virtual and real elements.

Additionally, there is XR. XR integrates AR, VR, and MR to provide an extended experience that bridges virtual and real environments, enhancing immersion.

Next is the virtual camera function. A virtual camera allows free exploration of angles and movements in a virtual environment. By integrating mocap data, it helps pre-determine optimal angles before actual filming.

Lastly is Simulcam. As an extended concept of AR, Simulcam composites virtual elements into real-time filming scenes. By applying mocap data in real time, it allows immediate visualization of how actors interact with virtual characters.

Many of these technologies existed before the concept of VP emerged. So why have they been grouped under virtual production? Let's explore the common characteristics of VP features.

First, real-time interaction and feedback: VP connects virtual environments and cameras in real time, enabling immediate feedback. Second, the fusion of virtual and real environments: VP naturally integrates physical and virtual sets, creating a seamless environment. This allows actors to interact naturally with virtual backgrounds as if performing on a real set. Third, previsualization and optimized planning: VP allows 3D previews of scenes in advance, helping adjust optimal angles and movements. This helps identify and fix potential issues before shooting.

Thanks to these features, VP is increasingly becoming a viable option that integrates well with traditional methods despite being a new technology. Technologies sharing these common characteristics are broadly included in VP.

Now, let's briefly look at three virtual production technologies that use LED walls. First is In-Camera VFX, or ICVFX. Mainly used in films and dramas, it projects real-time 3D backgrounds onto an LED wall for shooting. This allows actors to perform while seeing a realistic virtual background, achieving high-quality scenes directly on set.

Second is 2D In-Camera VFX. This method is also mainly used in film and drama production. It projects a pre-rendered 2D background video onto an LED wall and is best suited for scenes that require minimal background movement. Since source production is relatively simple, it saves time and cost while maintaining high image quality.

Lastly, there is XR. XR is used in live events and broadcasting, integrating LED walls with AR and VR technology. It creates an immersive environment, particularly useful in interactive settings with audiences. These three technologies are chosen based on their purpose: film, drama, or live entertainment.

Now, let's explore each technology in more detail. In-Camera VFX relies on Unreal Engine, which is its key feature. Using Unreal Engine, high-quality 3D environments can be rendered in real time and projected onto an LED wall. This allows for dynamic background changes, immediate lighting and angle adjustments, and real-time feedback on set. As seen in the video, as the camera moves, the inner frustum moves accordingly, dynamically adjusting the virtual background in real time.

Now, let's discuss 2D In-Camera VFX. It involves projecting pre-recorded or pre-rendered 2D footage onto an LED wall. This method is mainly used for fixed backgrounds or specific scenery, such as a landscape seen through a window or the outside view during a car-driving scene. It enables easy background rendering without complex 3D environments. Since setup and rendering are simpler, it reduces time and cost. Additionally, as it uses real footage or pre-rendered videos, it offers a high level of realism.

Before diving into more details, let's take a quick look at an ICVFX system diagram. The ICVFX system consists of a camera, a tracking system, render machines, LED panels, an LED image processor, and a sync generator. These components work together to integrate the virtual environment into real camera footage. Each piece of equipment is connected by different cables, transmitting large amounts of data and signals in sync. Understanding how these components interact is crucial for system setup and adjustments. Since ICVFX is a complex process, any misconfiguration can cause system-wide errors. If this diagram seems unfamiliar now, don't worry; after completing this course, you'll find it much easier to understand.

Now, let's break it down further. This diagram represents the ICVFX system setup shown in the previous video. Each component is connected via various cables, exchanging data and signals in real time. We will go through each component in detail, but first, here's a brief overview.

The tracking system continuously tracks the camera's movement, lens focus, and zoom data, updating the virtual environment in real time. The OptiTrack cameras positioned to the left of the main camera are connected to a switch. These cameras detect the sensors attached to the main camera, allowing precise tracking of its position.

The diagram shows a total of four render machines, which play a crucial role in rendering real-time 3D graphics and displaying them on the LED screen. Each render machine runs Unreal Engine, ensuring seamless real-time rendering of virtual environments.

The LED screens and LED processors are responsible for outputting the final visuals. The render machines generate images in real time, which are then processed and displayed on the large LED screens.

The sync generator keeps all devices perfectly synchronized by generating precise timing signals. Since numerous devices must operate simultaneously without any delay or misalignment, synchronization is critical in ICVFX.

Setup in ICVFX

To set up ICVFX, various hardware components are required. I will first introduce the key components from the previously shown diagram before explaining each in detail.

The first major component is the LED wall, which serves as the primary display for virtual backgrounds. It is installed on the actual filming set, providing both the background for actors and the environment captured by the camera.

Next is the image processor, which converts the rendered data into a signal optimized for the LED wall. This device adjusts resolution, color, and brightness, ensuring high-quality visuals on the display.

The camera and lens system is also a critical component. In ICVFX production, the camera and lens must be precisely tracked to align with the virtual background, ensuring that real and digital elements blend seamlessly.

The media server, or render PC, is another essential element. It runs software such as Disguise or Unreal Engine, rendering 3D virtual environments in real time.

The genlock device manages synchronization between all components. In an ICVFX system, it is crucial that cameras, LED walls, and render machines stay in sync, preventing visual discrepancies.

The last key component is the camera tracking system and its related elements. This includes lens data streaming and six-axis tracking, which updates camera position, lens focus, and zoom data on the virtual background in real time. Each component of the ICVFX system functions in synchronization.

Now, let's take a closer look at each of these components. First, we will examine the various types of LED walls used in virtual production. LED walls generally come in three main configurations: flat, curved, and MR set. The choice depends on the specific purpose.

Flat LED walls are the most basic type. They are relatively easy to install and are best suited for shooting from specific angles. This setup is often used for interviews or static scenes with minimal camera movement. Recently, it has become popular for window-background solutions in production.

Curved LED walls are the most widely used configuration today. They provide a consistent background across various camera angles and allow for smooth panning and rotation. They are particularly useful for driving scenes, as they can cover a wide reflection area.

MR-set LED walls include an LED floor, extending the virtual background beyond the physical set. Even with a smaller LED setup, this approach allows for background extensions beyond the LED boundaries.
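Stepping back to the camera tracking component introduced above: per frame, it streams a six-degree-of-freedom pose together with lens data (focus and zoom). A minimal sketch of what one tracking sample might carry — all field and function names here are hypothetical, not taken from OptiTrack or any real tracking protocol:

```python
# Sketch of a per-frame camera tracking sample: a 6DoF pose (three
# translation axes, three rotation axes) plus streamed lens data.
# Field names are illustrative only, not tied to any real protocol.
from dataclasses import dataclass

@dataclass
class TrackingSample:
    x: float; y: float; z: float           # camera position (meters)
    pitch: float; yaw: float; roll: float  # camera rotation (degrees)
    focus_m: float                         # lens focus distance
    zoom_mm: float                         # lens focal length

def apply_to_virtual_camera(sample: TrackingSample) -> dict:
    """Map one tracked sample onto the virtual camera's transform and
    lens settings (a stand-in for what the engine would consume)."""
    return {
        "location": (sample.x, sample.y, sample.z),
        "rotation": (sample.pitch, sample.yaw, sample.roll),
        "focus_distance": sample.focus_m,
        "focal_length": sample.zoom_mm,
    }

state = apply_to_virtual_camera(
    TrackingSample(1.2, 1.5, -3.0, 0.0, 90.0, 0.0,
                   focus_m=4.0, zoom_mm=35.0))
print(state["location"], state["focal_length"])
```

The point of the sketch is simply that every frame carries both where the camera is and how its lens is set, so the virtual camera can mirror both.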
Now, let's briefly compare MR-set LED walls and ICVFX LED walls. MR sets include LED floors, as mentioned earlier, allowing a fully extended virtual background that integrates the floor. ICVFX setups do not use LED floors; instead, they rely only on LED walls and ceilings. This distinction depends on the specific production requirements. In an MR set, the floor is also a digital element, forming part of the fully extended virtual background, while in ICVFX, practical props and set decorations are used on the floor to enhance realism. Therefore, it is important to use a set without floor LEDs in ICVFX.

Now, let's take a look at the second component of the ICVFX system: the LED processor. The image processor is responsible for accurately transmitting the virtual background to the LED wall. It receives the rendered image from the render machine, distributes it properly across the LED screen, and adjusts color and brightness to manage the final image quality. Since image quality on the LED wall is crucial in ICVFX, the image processor plays a key role. The most commonly used models include the Helios from Megapixel and the Tessera SX40 from Brompton. Each manufacturer offers image processors with different specifications, allowing studios to choose based on their specific requirements and environment.

The LED wall consists of multiple panels. For example, if an LED wall is built to be 20 meters wide and 6 meters high, it would typically use 480 panels, each measuring 50 cm by 50 cm. Each panel functions as an individual display, and the image processor ensures that all 480 panels operate as a single large screen by distributing and linking all the data.

Let's take a quick look at the operating mechanism of an LED image processor through a diagram. As shown, the image processor transmits data to multiple small LED panels via network cables, ensuring that each panel accurately displays its portion of the overall image. Each panel functions like an independent monitor, and the processor integrates them into a single cohesive display across the entire LED wall.

The next essential components for ICVFX shooting are the camera and lens. Not all cameras can be used; it is recommended to use models that provide specific features. Most cinema cameras are compatible, but there are some key functions to consider. The first and most important feature is genlock. Genlock ensures that the camera, LED wall, and all other equipment are precisely synchronized for stable shooting. In ICVFX, multiple devices must operate simultaneously, and if they are not perfectly synchronized, the footage may become unstable. The second essential feature is a global shutter or a high-speed rolling shutter. This is crucial for accurately capturing the virtual background displayed on the LED wall.

Now, let's move on to the render machine, also known as the render PC, used in the ICVFX system. A render machine is not a specific piece of equipment but rather a high-spec computer equipped with a powerful GPU. Since ICVFX requires real-time image output to the LED wall, fast and stable rendering performance is essential. Typically, NVIDIA RTX A6000 or RTX 6000 Ada graphics cards are recommended. However, when dealing with extremely high-resolution LED walls, a single render machine may struggle to handle the entire resolution.

For example, let's assume that the green box in the image represents an LED wall with a resolution of 9504x1769 pixels, approximately 10K. Such a high resolution cannot be managed by a single computer, so multiple render machines are used to divide the rendering workload. In the image, the red lines illustrate an example of dividing the LED wall into four sections.
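The arithmetic behind that division is straightforward: splitting a 9504-pixel-wide wall into four equal vertical strips gives each render machine a 2376x1769 viewport. A minimal sketch of the calculation — purely illustrative; in practice nDisplay defines these regions in its own configuration assets, not in code like this:

```python
# Sketch: divide an LED wall's resolution into equal viewports, one
# per render machine (illustrative; not nDisplay's actual config API).

def split_wall(width: int, height: int, nodes: int):
    """Return an (x_offset, y_offset, w, h) viewport per render node,
    splitting the wall into vertical strips of equal width."""
    assert width % nodes == 0, "wall width must divide evenly across nodes"
    strip = width // nodes
    return [(i * strip, 0, strip, height) for i in range(nodes)]

# The ~10K wall from the lecture, split across four render machines:
viewports = split_wall(9504, 1769, 4)
for i, (x, y, w, h) in enumerate(viewports):
    print(f"node {i}: offset=({x},{y}) size={w}x{h}")
# each node renders a 2376x1769 strip of the full image
```

Each machine then only has to sustain real-time rendering for its own strip, which is why the per-node load drops as nodes are added.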
Each render machine is responsible for rendering an area of 2376x1769 pixels, collectively achieving the full resolution. This method reduces the load on each render machine, ensuring stable video output. This is made possible by nDisplay, a multi-display system provided by Unreal Engine. It allows multiple render machines to work together to form a single LED wall. In the upcoming practical session, we will observe the nDisplay function in action.

While the render machines handle only the rendering process, a separate operating computer is required to manage image control and content distribution. The operating computer synchronizes and controls all render machines, ensuring that the entire system operates smoothly. To achieve high-resolution LED walls, multiple render machines are necessary, and with systems like nDisplay, the rendering workload can be distributed efficiently.

Now, let's take a look at the genlock system, another key component of ICVFX. The diagram below provides an overview of genlock synchronization within the system. It illustrates how the genlock master device connects with cameras, render machines, and image processors, ensuring system-wide synchronization. At the center of the system, the genlock master device generates timing signals so that all equipment operates in perfect synchronization. With this setup, the LED wall, camera, and render machines remain precisely aligned, maintaining a seamless display. This perfect synchronization of all machines prevents flickering and instability, allowing smooth and reliable ICVFX shooting.

Next is the NVIDIA Quadro Sync function, which enables genlock synchronization across multiple render machines. As shown in the diagram, each render machine is connected via NVIDIA Quadro Sync, allowing them to share a common genlock signal and synchronize their rendering tasks. This function is crucial when multiple render machines are used together, as it ensures that all machines operate in perfect sync. The following image illustrates how NVIDIA Quadro Sync is configured: the sync card connects to the graphics card and links multiple computers using LAN cables, forming a daisy-chain connection.

The tracking system is one of the most critical components of ICVFX. Its primary function is to track the camera's position and lens data and transmit this information in real time to the virtual camera in Unreal Engine. This allows the virtual environment to match the exact viewpoint and framing of the real-world camera, ensuring seamless integration between physical and virtual elements. A six-degree-of-freedom tracking method is used, capturing translation along three axes and rotation around three axes to accurately reflect changes in the virtual space.

With this, we have covered the key components of ICVFX systems. Next, we will explore each component in greater detail, starting with shutter types and scanning methods.

First, let's look at an image comparing a rolling shutter and a global shutter. A rolling shutter captures an image line by line, from top to bottom in sequence. When capturing fast-moving objects, this can produce the skewed or wobbly effects often seen in high-speed footage. A global shutter, on the other hand, exposes all pixels simultaneously, preventing distortion and ensuring clear, accurate images even in fast-moving scenes.

Next is genlocking. Each render machine in the ICVFX setup must be genlocked to stay synchronized. For example, let's assume that two computers are each responsible for rendering half of an LED wall. Both computers must render their respective sections simultaneously and send the output to the LED panels. If the two computers are not genlocked, the images will be out of sync, resulting in the visible misalignment seen on the right. When properly genlocked, both machines output frames at the exact same timing, ensuring a synchronized image across the entire LED wall.

This video shows an error that occurred during my genlock test. As you can see, we played a test video in which the same color should appear sequentially. However, the colors do not display at the same time, indicating that the sync is significantly off.

How does genlock work? Many assume that timecode-based synchronization keeps everything in sync. However, without proper genlock synchronization, the system will not function correctly. Both LED panels and cameras operate using digital signals. For example, let's say the LED panel outputs 24 frames per second, and the camera captures at 24 frames per second. If these two signals are not genlocked, their timing will gradually drift, as seen in this image. For a correct image, the LED must output frames at the exact same timing that the camera captures them.

You've likely seen genlock issues in everyday situations. For example, when filming a TV screen with a smartphone, you might notice rolling black bars or flickering. This flicker occurs because the two devices are not synchronized, that is, there is no proper genlock. Without proper genlock, flickering or tearing will appear in the final image. To capture accurate and realistic images from an LED wall, genlock is a critical component.
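The drift described above can be quantified: two devices both nominally running at 24 fps, but with free-running clocks that differ by even a tiny fraction, slide apart steadily; genlock removes the drift by driving everything from one timing reference. A toy calculation, with an illustrative (assumed) clock error:

```python
# Toy model: two devices both nominally at 24 fps, but one clock runs
# 50 ppm (0.005%) fast -- an assumed, illustrative error figure.
# Without genlock, how far apart are their frame boundaries after a
# 10-minute take?

NOMINAL_FPS = 24.0
PPM_ERROR = 50e-6          # 50 parts-per-million clock error (assumed)
TAKE_SECONDS = 10 * 60     # a 10-minute take

frame_period = 1.0 / NOMINAL_FPS          # ~41.667 ms per frame
drift_seconds = TAKE_SECONDS * PPM_ERROR  # accumulated timing offset
drift_frames = drift_seconds / frame_period

print(f"drift after 10 min: {drift_seconds * 1000:.1f} ms "
      f"(~{drift_frames:.2f} frames)")
# 600 s * 50e-6 = 30 ms, i.e. ~0.72 of a frame -- enough for visible
# tearing or flicker. Genlock locks both devices to one reference
# signal, so this drift never accumulates.
```

This is also why timecode alone is not enough: timecode labels frames but does not phase-lock the clocks that generate them.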
Let's take a closer look at how perspective changes depending on the camera's distance from an object. The diagram illustrates how an image is formed based on the distance between the camera center and an object. As the camera moves closer to the object, the object appears larger when projected onto the image plane. This distance-to-projection ratio is the fundamental principle behind perspective formation.

Here is a more detailed example. In the upper image, both the person in the midground and the tree in the background are visible together. However, as the camera moves closer, the lower image shows that the tree appears relatively smaller. Objects in the background shrink relative to the foreground as the camera gets closer; this is known as perspective distortion.

This perspective effect is crucial in ICVFX because it ensures that virtual backgrounds and physical props appear seamlessly aligned. Since the size and depth of captured images change based on the camera's distance and angle, this principle helps maintain visual continuity between virtual and real elements. Thus, perspective is an essential concept for understanding how camera position and distance affect an image.

Now, let's explore how perspective is applied in ICVFX. In ICVFX, perspective is handled using two zones on the LED wall: the Inner Frustum and the Outer Frustum. The left image shows the Unreal Engine interface, where the green area represents the Inner Frustum. This zone is directly captured by the camera and dynamically adjusts its position and angle based on the camera's movement. Outside the green area, the remaining LED wall belongs to the Outer Frustum. This area is not directly captured by the camera, but it plays a crucial role in providing accurate lighting and reflections on the actors and objects in the scene. Without this system, filming in front of an LED screen would be no different from traditional green-screen shooting, resulting in an unnatural and less immersive final image. By combining real-time rendering and camera tracking, ICVFX allows virtual backgrounds and physical environments to blend seamlessly, creating a highly immersive experience.

Now, let's explore why lens data streaming is crucial in ICVFX and how it works. So far, we've focused on camera position tracking, but lens data also plays a vital role. Consider a real-world scenario: while the dog in the foreground and the tree in the background remain blurry, only the child at 10 m is in focus. Now, let's apply this scenario to an LED wall. If the red line represents the LED screen, the virtual background
projected onto it will always remain in focus 396 00:28:50.910 --> 00:28:53.439 because it exists on a flat plane at a single distance 397 00:28:53.859 --> 00:28:56.020 This creates a problem because, 398 00:28:56.020 --> 00:28:59.216 in real-world optics, the tree at 20m should be out of focus, 399 00:28:59.217 --> 00:29:01.577 but on an LED wall, it appears unnaturally sharp 400 00:29:01.577 --> 00:29:03.342 This discrepancy reduces realism in the final shot 401 00:29:03.702 --> 00:29:08.000 To address this issue, ICVFX uses real-time lens data tracking 402 00:29:08.400 --> 00:29:11.280 By calculating the distance between the camera and the virtual tree, 403 00:29:11.547 --> 00:29:16.347 the system can intentionally blur the tree on the LED wall 404 00:29:16.627 --> 00:29:20.240 As a result, objects that would naturally be out of focus in reality 405 00:29:20.241 --> 00:29:23.441 also appear out of focus in the virtual background, 406 00:29:23.442 --> 00:29:26.871 ensuring consistent depth perception 407 00:29:27.440 --> 00:29:31.420 By tracking camera position and lens data in real time 408 00:29:31.421 --> 00:29:36.201 and adjusting the focus and depth of images projected onto the LED wall, 409 00:29:36.201 --> 00:29:41.660 ICVFX ensures highly realistic visuals 410 00:29:42.160 --> 00:29:46.960 This technology allows virtual backgrounds and physical props 411 00:29:46.961 --> 00:29:49.144 to blend naturally, resulting in high-quality scenes 412 00:29:49.660 --> 00:29:53.920 Now, let’s discuss LED meshes, which are essential 413 00:29:53.921 --> 00:29:56.181 for using nDisplay in ICVFX 414 00:29:57.060 --> 00:30:02.500 To accurately project images onto different sections of an LED wall, 415 00:30:02.501 --> 00:30:07.061 we need the exact shape and size of the physical LED wall 416 00:30:07.340 --> 00:30:10.560 The LED mesh is the 3D model that serves as this fundamental structure, 417 00:30:10.561 --> 00:30:14.561 and it must be created to match the exact shape of the
physical LED wall 418 00:30:15.060 --> 00:30:21.760 Each mesh determines which render machine is responsible for which section of the LED wall 419 00:30:21.760 --> 00:30:25.060 By importing this mesh into Unreal Engine, 420 00:30:25.061 --> 00:30:29.181 precise area segmentation and rendering become possible 421 00:30:29.720 --> 00:30:33.680 Therefore, accurately creating an LED mesh file 422 00:30:33.681 --> 00:30:36.081 and properly configuring its UV mapping are crucial 423 00:30:36.081 --> 00:30:41.061 for maintaining consistency between the virtual background and the real set, 424 00:30:41.061 --> 00:30:44.861 thus enhancing immersion 425 00:30:45.260 --> 00:30:47.840 Next, let’s look at camera tracking systems 426 00:30:48.820 --> 00:30:54.520 In ICVFX, it is essential to synchronize physical camera movements 427 00:30:54.520 --> 00:30:56.321 with the virtual background 428 00:30:56.820 --> 00:30:58.760 There are two main tracking methods: 429 00:30:59.227 --> 00:31:02.907 outside-in tracking and inside-out tracking 430 00:31:03.540 --> 00:31:08.700 Outside-in tracking uses external cameras or sensors 431 00:31:08.701 --> 00:31:11.221 to track objects inside a defined space 432 00:31:11.420 --> 00:31:16.220 This setup allows for highly precise position tracking since the external device remains fixed, 433 00:31:16.221 --> 00:31:18.861 and it has the advantage of covering a wide area 434 00:31:19.260 --> 00:31:24.740 However, installation complexity and obstructions can affect accuracy, 435 00:31:24.741 --> 00:31:27.281 making it ideal for fixed studio environments 436 00:31:27.741 --> 00:31:32.420 Inside-out tracking, on the other hand, uses sensors or cameras attached to the camera rig 437 00:31:32.421 --> 00:31:35.841 to track its position relative to the surroundings 438 00:31:36.000 --> 00:31:39.700 This method is more portable and adaptable 439 00:31:39.701 --> 00:31:42.761 to changing environments, 440 00:31:43.120 --> 00:31:47.560 but its accuracy may 
be lower than outside-in tracking, 441 00:31:47.561 --> 00:31:50.105 especially in complex settings 442 00:31:50.820 --> 00:31:57.280 Finally, a markerless inside-out tracking system, such as NCam, can be used 443 00:31:57.280 --> 00:32:02.620 NCam employs sensors that analyze real-time environmental data 444 00:32:02.621 --> 00:32:06.661 and generate point clouds to track positions 445 00:32:07.000 --> 00:32:12.860 The key advantage is that no markers are needed, and installation is quick, 446 00:32:13.412 --> 00:32:17.532 though lighting variations may cause errors 447 00:32:17.980 --> 00:32:23.981 While professional tracking systems offer 448 00:32:23.981 --> 00:32:26.587 high precision and stability, 449 00:32:26.927 --> 00:32:29.400 they are also very expensive, 450 00:32:29.401 --> 00:32:33.981 making them difficult to acquire for individual users 451 00:32:34.461 --> 00:32:38.621 However, if an individual wants to test ICVFX or use it for simple applications, 452 00:32:38.621 --> 00:32:40.161 there are a few alternatives 453 00:32:40.820 --> 00:32:45.141 For example, VR device tracking features, such as those found in Oculus Quest controllers 454 00:32:45.141 --> 00:32:48.101 or Vive trackers, can be utilized 455 00:32:48.440 --> 00:32:53.220 These devices provide lower accuracy 456 00:32:53.221 --> 00:32:56.921 but serve as viable solutions for small projects 457 00:32:57.120 --> 00:33:01.500 These devices may have lower precision compared to professional tracking equipment, 458 00:33:01.501 --> 00:33:07.301 but they can still serve as practical alternatives for personal testing or small-scale projects 459 00:33:07.920 --> 00:33:13.540 Now, let’s take a look at how the lens data streaming system is applied to Unreal Engine 460 00:33:13.880 --> 00:33:18.320 The core of this system is the ARRI lens data tracking system, 461 00:33:18.321 --> 00:33:24.541 which measures camera lens data, such as focus, zoom, and aperture, in real time 462 00:33:24.542 --> 
00:33:26.688 and transmits it to Unreal Engine 463 00:33:26.948 --> 00:33:30.640 To achieve this, a device like UMC4 is required 464 00:33:30.641 --> 00:33:36.121 This equipment collects lens data from the camera and transmits it to a computer 465 00:33:36.640 --> 00:33:41.300 Then, using the Live Link function and the ARRI metadata plugin, 466 00:33:41.301 --> 00:33:44.501 the data is streamed into Unreal Engine 467 00:33:44.900 --> 00:33:47.200 By reflecting this data in real time, 468 00:33:47.200 --> 00:33:50.402 the virtual environment and camera movements align perfectly, 469 00:33:50.402 --> 00:33:53.501 enabling more natural visual effects 470 00:33:53.880 --> 00:33:57.960 Now, let’s take a closer look at ARRI’s UMC4 and LDS 471 00:33:58.380 --> 00:34:05.200 The UMC4 is a device designed to accurately stream lens data within the ARRI camera system 472 00:34:05.400 --> 00:34:09.600 This device collects lens metadata and directly transmits it to Unreal Engine, 473 00:34:09.601 --> 00:34:12.836 with data transmission occurring via a network cable 474 00:34:13.141 --> 00:34:16.961 Understanding ICVFX Through a Diagram 475 00:34:17.081 --> 00:34:22.560 So far, we have examined the key components and concepts necessary for ICVFX 476 00:34:23.000 --> 00:34:26.660 Now, let’s go back to the initial ICVFX system diagram 477 00:34:26.661 --> 00:34:30.901 and summarize how the entire system is structured and operates 478 00:34:31.200 --> 00:34:36.000 Looking at the diagram, the core components of the ICVFX system are as follows 479 00:34:36.820 --> 00:34:38.940 First, the operating machine 480 00:34:39.540 --> 00:34:42.861 It controls and manages the entire system 481 00:34:42.861 --> 00:34:45.081 It is responsible for network communication 482 00:34:45.082 --> 00:34:48.562 and is connected to multiple rendering machines and network devices 483 00:34:49.000 --> 00:34:51.000 Second, the rendering machines 484 00:34:51.540 --> 00:34:56.580 They process real-time 3D
environments using Unreal Engine 485 00:34:56.581 --> 00:34:58.461 and output them to the LED wall 486 00:34:59.340 --> 00:35:05.800 Third, a network switch hub and a sync generator synchronize all devices 487 00:35:06.280 --> 00:35:11.660 These components ensure that signals between devices remain in sync, 488 00:35:11.661 --> 00:35:14.541 preventing frame drops or synchronization errors 489 00:35:15.180 --> 00:35:17.740 Fourth, the LED image processor 490 00:35:18.060 --> 00:35:22.041 It optimizes the rendered images for display on the LED wall, 491 00:35:22.041 --> 00:35:25.801 ensuring high-quality visual output 492 00:35:26.180 --> 00:35:31.100 Finally, the camera tracking system and lens data tracking system 493 00:35:31.720 --> 00:35:38.601 They ensure that the movements of the real camera and lens data are accurately reflected in the virtual environment 494 00:35:38.601 --> 00:35:42.521 This synchronization helps create seamless and realistic scenes 495 00:35:42.840 --> 00:35:48.620 With this, we now have a comprehensive understanding of the essential elements that make up ICVFX 496 00:35:49.120 --> 00:35:53.380 This diagram illustrates how these components interact to integrate virtual environments 497 00:35:53.381 --> 00:35:57.321 smoothly into real-world production 498 00:35:57.700 --> 00:36:00.600 Next, we will apply everything we have learned 499 00:36:00.601 --> 00:36:05.294 and explore how these components and equipment 500 00:36:05.295 --> 00:36:08.037 are set up and connected in an actual studio 501 00:36:08.524 --> 00:36:12.344 VX Studio On-Site Demonstration 502 00:36:33.900 --> 00:36:38.860 We are now at VX Studio, located inside the Gwangju Immersive Content Cube or GCC 503 00:36:39.690 --> 00:36:42.461 Here, we will demonstrate each piece of equipment discussed in the theoretical lecture 504 00:36:42.461 --> 00:36:46.742 and provide a brief explanation of their functions 505 00:36:47.680 --> 00:36:51.580 First, you can see a large-scale LED 
wall 506 00:36:51.880 --> 00:36:55.800 This LED wall is controlled from the Brain Bar, 507 00:36:55.801 --> 00:37:03.161 which serves as the central command center for ICVFX operations 508 00:37:03.541 --> 00:37:07.420 Supervisors and operators manage and fine-tune the visuals displayed on the LED wall, 509 00:37:07.421 --> 00:37:11.981 ensuring that they align with the director’s vision 510 00:37:12.540 --> 00:37:16.000 Now, let’s take a closer look at each component in detail 511 00:37:17.000 --> 00:37:20.531 The LED wall, which is the most critical element in in-camera VFX, 512 00:37:20.531 --> 00:37:23.232 may appear as a single, large display 513 00:37:23.232 --> 00:37:26.706 However, in reality, it is composed of multiple small modular display panels 514 00:37:26.707 --> 00:37:29.493 that are combined to create a massive screen 515 00:37:29.673 --> 00:37:32.140 Now, let’s take a quick look at this panel 516 00:37:33.300 --> 00:37:37.080 The LED modules currently in use are typically 517 00:37:37.081 --> 00:37:43.400 50cm × 50cm in size, and when combined, they form a single cabinet 518 00:37:43.600 --> 00:37:50.700 As you can see, the distance between individual LED pixels is referred to as pixel pitch, 519 00:37:50.700 --> 00:37:55.660 and the installed LED wall here has a pixel pitch of 2.6mm 520 00:37:56.100 --> 00:38:03.020 At the back, there is a power board with dedicated connectors that allow multiple modules to be attached 521 00:38:03.340 --> 00:38:08.532 Each power board connects to four modules, 522 00:38:08.533 --> 00:38:09.533 forming a single panel, 523 00:38:09.534 --> 00:38:14.879 which can then be expanded into a large LED wall 524 00:38:15.799 --> 00:38:20.480 The LED wall you see here is designed for vertical installation, 525 00:38:20.481 --> 00:38:25.281 but LED panels for ceilings and floors are a different type 526 00:38:26.840 --> 00:38:29.840 Now, I have moved behind the LED wall 527 00:38:29.840 --> 00:38:31.840 Let's take a closer 
look 528 00:38:32.540 --> 00:38:37.300 From the front, it may not be obvious, 529 00:38:37.300 --> 00:38:41.280 but each rectangular section here 530 00:38:41.281 --> 00:38:43.768 represents a single LED module 531 00:38:44.008 --> 00:38:47.300 Each cabinet is made up 532 00:38:47.301 --> 00:38:50.311 of four LED modules 533 00:38:50.311 --> 00:38:55.660 The more cabinets added, the larger the total LED display becomes 534 00:38:56.060 --> 00:38:59.280 Looking at the cables, there are four main connections 535 00:38:59.281 --> 00:39:01.243 Data in, 536 00:39:01.244 --> 00:39:02.063 power in, 537 00:39:02.063 --> 00:39:03.643 data out, 538 00:39:03.644 --> 00:39:05.099 and power out 539 00:39:05.500 --> 00:39:13.080 The biggest advantage of this system is that power and data are not supplied individually to each panel from a central unit 540 00:39:13.080 --> 00:39:18.660 Instead, they are daisy-chained from a single input point 541 00:39:18.660 --> 00:39:22.740 Because of this setup, if an issue occurs on one side of the system, 542 00:39:22.740 --> 00:39:24.361 it may cause the entire chain to shut down 543 00:39:24.580 --> 00:39:29.000 This modular structure is a key aspect of the system’s design 544 00:39:29.160 --> 00:39:32.420 Additionally, if a single panel malfunctions, 545 00:39:32.421 --> 00:39:36.356 it can be removed and replaced immediately 546 00:39:36.357 --> 00:39:41.349 This makes maintenance much more efficient and convenient 547 00:39:41.669 --> 00:39:46.500 Now, let’s move to the main control area and take a closer look 548 00:39:47.100 --> 00:39:50.200 I have moved further back from the LED wall, 549 00:39:50.380 --> 00:39:53.780 and as you can see, there are a huge number of racks here 550 00:39:54.000 --> 00:39:59.045 This area serves as the central hub that powers and controls the entire LED wall 551 00:39:59.334 --> 00:40:01.755 If you look here, you will see a small unit, 552 00:40:01.755 --> 00:40:07.418 which is the power supply box 
responsible for delivering power to the entire LED wall 553 00:40:07.875 --> 00:40:12.960 Next, these units here are Brompton’s XD processors 554 00:40:12.960 --> 00:40:18.059 These devices distribute the image data received from the LED image processor 555 00:40:18.060 --> 00:40:23.227 to the individual LED panels, ensuring that each receives the correct visual information 556 00:40:23.227 --> 00:40:26.622 All of these connections are established through network cables 557 00:40:26.982 --> 00:40:30.175 Now, let’s take a brief look at the back of these units 558 00:40:30.490 --> 00:40:33.349 I have moved to the rear of the XD processors, 559 00:40:33.675 --> 00:40:36.912 and as you can see, there are a large number of LAN cables here 560 00:40:37.065 --> 00:40:40.773 To explain their roles briefly: 561 00:40:40.774 --> 00:40:43.214 The cables neatly lined up at the top 562 00:40:43.214 --> 00:40:47.805 transmit data to the LED panels 563 00:40:47.882 --> 00:40:51.542 The fiber optic cables at the bottom of each device 564 00:40:51.542 --> 00:40:55.233 receive image data from the LED image processor 565 00:40:55.335 --> 00:41:01.168 This entire setup operates through a network infrastructure 566 00:41:01.221 --> 00:41:04.758 Now, let’s move on to the LED image processor 567 00:41:04.758 --> 00:41:09.410 As you can see, everything is structured like this from here onward 568 00:41:09.410 --> 00:41:16.268 These devices are LED image processors, whose role is to calculate, 569 00:41:16.269 --> 00:41:20.064 manage, and deliver images to the LED wall 570 00:41:20.064 --> 00:41:25.651 In simple terms, they optimize video signals before sending them to the LED panels, 571 00:41:25.652 --> 00:41:30.248 ensuring that the panels display images accurately 572 00:41:30.250 --> 00:41:32.189 Looking at the front of the processor, 573 00:41:32.189 --> 00:41:34.363 there are actually not many physical controls 574 00:41:34.659 --> 00:41:39.130 In fact, there are only two main 
buttons, 575 00:41:39.130 --> 00:41:42.834 but most operations can be managed remotely from a computer 576 00:41:42.840 --> 00:41:48.207 Once these devices are installed in the server room, there is usually no need for frequent manual adjustments 577 00:41:48.207 --> 00:41:51.378 Now, let’s move to the back of the image processors 578 00:41:51.378 --> 00:41:55.074 and examine how they are configured 579 00:41:55.800 --> 00:41:59.709 I have now moved to the rear of the image processor 580 00:41:59.709 --> 00:42:02.774 This space is narrow and densely packed with cables, 581 00:42:02.775 --> 00:42:05.024 but I’ll briefly explain the key components 582 00:42:05.024 --> 00:42:09.820 The blue fiber optic cables here transmit data 583 00:42:09.823 --> 00:42:14.936 from the image processor to the XD processors that we previously discussed 584 00:42:15.092 --> 00:42:22.524 Each image processor sends data to four XD processors 585 00:42:23.033 --> 00:42:29.496 Additionally, the green cables here carry the genlock signal 586 00:42:29.871 --> 00:42:32.796 There are also other network cables here 587 00:42:32.797 --> 00:42:39.207 that allow remote control of the entire system 588 00:42:39.471 --> 00:42:41.413 Although the wiring might look complex, 589 00:42:41.414 --> 00:42:43.844 everything is neatly arranged 590 00:42:43.844 --> 00:42:47.815 The cables are carefully labeled and systematically organized 591 00:42:47.815 --> 00:42:51.050 If any issue occurs, this structured setup 592 00:42:51.051 --> 00:42:53.369 allows for quick identification 593 00:42:53.370 --> 00:42:58.276 of the problematic cable, making repairs 594 00:42:58.277 --> 00:43:00.161 and adjustments far more efficient 595 00:43:00.415 --> 00:43:03.086 Now that we’ve covered the hardware aspects, 596 00:43:03.087 --> 00:43:07.243 let’s move on to how these devices are controlled 597 00:43:07.244 --> 00:43:10.151 and utilized on a computer 598 00:43:10.411 --> 00:43:14.029 This time, I’ll demonstrate how the 
image processor 599 00:43:14.030 --> 00:43:17.166 can be controlled via a computer 600 00:43:17.167 --> 00:43:19.873 and explain its functions 601 00:43:19.874 --> 00:43:22.509 and roles in detail 602 00:43:23.449 --> 00:43:27.373 On the desktop, you can see the Tessera Remote software 603 00:43:28.167 --> 00:43:32.508 This software is network-based, allowing remote control of the image processor 604 00:43:32.887 --> 00:43:34.518 By clicking Start, 605 00:43:34.519 --> 00:43:41.396 we can remotely access all image processors we saw in the server room 606 00:43:41.641 --> 00:43:48.145 Here, the system is categorized into ceiling, main wall, and floor 607 00:43:48.145 --> 00:43:51.143 Each group is labeled accordingly, the ceiling processors as ceiling 608 00:43:51.143 --> 00:43:53.084 and the floor processors as floor 609 00:43:53.634 --> 00:43:58.000 For now, we’ll focus on controlling just the main wall 610 00:43:58.877 --> 00:44:00.112 Once launched, 611 00:44:08.707 --> 00:44:13.129 the UI appears, showing the main interface 612 00:44:13.457 --> 00:44:16.236 I’ll briefly explain this UI 613 00:44:16.236 --> 00:44:21.040 and then go over the specific features it offers 614 00:44:22.207 --> 00:44:28.622 The UI is divided into tabs such as Project, Edit, View, and Tools 615 00:44:28.855 --> 00:44:31.122 The Project tab 616 00:44:31.123 --> 00:44:35.719 allows different users to manage 617 00:44:35.720 --> 00:44:37.623 and configure image processor settings 618 00:44:37.623 --> 00:44:39.608 based on their needs 619 00:44:39.727 --> 00:44:42.836 The Edit tab enables modifications, 620 00:44:42.837 --> 00:44:45.128 undo/redo actions, 621 00:44:45.128 --> 00:44:48.374 and other configuration adjustments 622 00:44:49.040 --> 00:44:52.330 Changes made here are not permanent until saved, 623 00:44:52.331 --> 00:44:56.382 meaning we can revert to a previous state using Ctrl+Z if needed 624 00:44:56.382 --> 00:45:03.345 So even if there was a mistake, 625 00:45:03.346 --> 00:45:06.210 we can go back 626 00:45:07.103 -->
00:45:11.295 I don't know if you can see this here, 627 00:45:11.296 --> 00:45:14.905 but if I move it like this, 628 00:45:14.906 --> 00:45:18.054 you can see a change on the LED screen, 629 00:45:18.147 --> 00:45:23.000 which can be undone with Ctrl Z 630 00:45:23.360 --> 00:45:28.612 This software essentially allows for LED wall configuration 631 00:45:29.178 --> 00:45:31.337 These arrows indicate 632 00:45:31.338 --> 00:45:35.110 how data is transmitted from one panel to the next, 633 00:45:35.111 --> 00:45:39.385 in the form of data cable chains as we saw before 634 00:45:39.775 --> 00:45:46.727 ensuring the entire LED wall functions as a single display 635 00:45:48.190 --> 00:45:50.509 Below, we have 636 00:45:50.510 --> 00:45:56.182 preset options for different projects 637 00:45:56.182 --> 00:46:01.375 Since color settings and configurations vary across projects, 638 00:46:01.376 --> 00:46:04.518 presets are used to store and apply settings efficiently 639 00:46:05.828 --> 00:46:09.000 Now, let’s take a look at the most important section at the top 640 00:46:09.000 --> 00:46:12.045 This area is quite intuitive 641 00:46:12.045 --> 00:46:14.485 Input, processing, 642 00:46:14.486 --> 00:46:16.011 how it's being done, 643 00:46:16.012 --> 00:46:19.685 whether to override this, 644 00:46:19.686 --> 00:46:21.711 and then finally the output 645 00:46:21.712 --> 00:46:24.722 All these elements are accessible in one interface 646 00:46:25.049 --> 00:46:29.037 In terms of input sources, as mentioned earlier, 647 00:46:29.038 --> 00:46:32.618 the primary input used here is HDMI, 648 00:46:32.632 --> 00:46:34.319 but SDI can also be connected 649 00:46:35.259 --> 00:46:38.268 If we select the HDMI input, 650 00:46:38.268 --> 00:46:40.527 click on it, 651 00:46:40.528 --> 00:46:43.211 we can view key details such as 652 00:46:43.943 --> 00:46:47.200 resolution, frame rate, 653 00:46:47.200 --> 00:46:51.126 color settings, and bit depths 654 00:46:51.540 --> 
00:46:56.183 Based on the image color values being transmitted, 655 00:46:56.184 --> 00:46:59.892 we can choose to use PQ (Perceptual Quantizer) or SDR settings 656 00:47:01.645 --> 00:47:08.334 Additionally, we can select between Rec. 2020 or Rec. 709 657 00:47:08.334 --> 00:47:12.632 Currently, this studio is using a custom color space 658 00:47:13.722 --> 00:47:16.396 Once we verify the input settings, 659 00:47:16.397 --> 00:47:19.891 we move to the processing stage, where additional adjustments can be made 660 00:47:21.158 --> 00:47:24.260 For example, we can modify the resolution 661 00:47:26.220 --> 00:47:30.265 and adjust basic color settings 662 00:47:30.625 --> 00:47:33.818 However, in most cases, we recommend controlling these settings 663 00:47:33.819 --> 00:47:37.630 via the media server rather than directly from the image processor 664 00:47:38.384 --> 00:47:42.000 In some cases, LUTs (Lookup Tables) can be applied for additional processing 665 00:47:47.793 --> 00:47:50.301 Depending on the situation, 666 00:47:50.729 --> 00:47:54.707 if we want to achieve more precise color calibration, 667 00:47:54.934 --> 00:48:00.238 we can load a cube file following a specific color pipeline 668 00:48:01.458 --> 00:48:04.335 Next, we have the Override and Freeze functions 669 00:48:04.337 --> 00:48:06.201 In the Override section, 670 00:48:06.202 --> 00:48:11.671 you can select a pattern to determine 671 00:48:11.671 --> 00:48:13.735 which screen will be displayed during standby mode 672 00:48:14.283 --> 00:48:19.521 Usually, we use a scrolling mode, 673 00:48:19.522 --> 00:48:24.438 where a basic color bar continuously cycles on the display 674 00:48:27.341 --> 00:48:30.040 Alternatively, the Freeze function is used 675 00:48:30.041 --> 00:48:34.484 when we need to display a specific image on the LED screen while working on other tasks 676 00:48:34.485 --> 00:48:35.921 Since the display would normally update in real time, 677 00:48:35.921 --> 00:48:38.322
Freeze locks the screen to prevent changes 678 00:48:38.323 --> 00:48:42.249 The Blackout function, on the other hand, completely turns off the display, making it go dark 679 00:48:43.176 --> 00:48:48.731 The Nits adjustment allows us to control the brightness of the LED display, 680 00:48:49.253 --> 00:48:52.425 increasing or decreasing it as needed 681 00:48:54.560 --> 00:48:57.470 Here, you can see the network section 682 00:48:57.470 --> 00:49:01.448 Each XD device we previously looked at 683 00:49:01.449 --> 00:49:04.386 can be monitored individually, displaying its data processing status 684 00:49:04.809 --> 00:49:08.144 However, if there is an issue, these green indicators 685 00:49:08.144 --> 00:49:10.609 may turn red, signaling a problem 686 00:49:10.609 --> 00:49:12.726 So that's the gist of it 687 00:49:13.502 --> 00:49:16.155 Additionally, we can check the Genlock synchronization status 688 00:49:16.156 --> 00:49:21.045 In this setup, we are using 59.94 Hz for Genlock 689 00:49:21.046 --> 00:49:24.486 The system receives an external synchronization signal 690 00:49:24.487 --> 00:49:26.361 via the Reference In input 691 00:49:26.612 --> 00:49:30.288 This concludes the overview of the image processor 692 00:49:30.821 --> 00:49:34.833 Throughout a project, this system remains active at all times 693 00:49:34.834 --> 00:49:37.679 to monitor potential issues, 694 00:49:37.871 --> 00:49:42.584 manage the display, and perform maintenance as needed 695 00:49:43.101 --> 00:49:47.534 Now, let’s look at one of the most critical components of ICVFX, 696 00:49:47.535 --> 00:49:50.362 which is camera tracking 697 00:49:50.673 --> 00:49:55.013 At VX Studio, two primary tracking systems are in use 698 00:49:55.013 --> 00:49:56.738 The first one 699 00:49:56.739 --> 00:50:00.495 is Spider, which we have over here 700 00:50:00.511 --> 00:50:04.061 Spider does not operate independently 701 00:50:04.062 --> 00:50:07.951 If you look above the LED wall, you’ll see small cameras
installed 702 00:50:08.101 --> 00:50:10.272 These are called followers 703 00:50:10.607 --> 00:50:15.594 These cameras detect and capture the light emitted from Spider 704 00:50:15.910 --> 00:50:22.335 The biggest advantage of this system is that any camera 705 00:50:22.336 --> 00:50:24.542 equipped with Spider can be tracked instantly 706 00:50:25.007 --> 00:50:27.761 If you look closely, Spider has a circular design, 707 00:50:27.761 --> 00:50:29.534 allowing cameras to seamlessly track motion 708 00:50:29.535 --> 00:50:32.256 when placed within the system 709 00:50:33.137 --> 00:50:35.208 Since Spider emits light, 710 00:50:35.208 --> 00:50:39.081 it requires a power source, 711 00:50:39.087 --> 00:50:42.240 and it must communicate with the lens system to relay positional data accurately 712 00:50:43.761 --> 00:50:46.884 This concludes the basic explanation of Spider tracking 713 00:50:46.885 --> 00:50:49.652 Now, let’s move on to the second tracking system 714 00:50:50.149 --> 00:50:51.284 If we move over here, 715 00:50:53.196 --> 00:50:57.673 you’ll see a Red Spy camera mounted on the rig 716 00:50:57.897 --> 00:51:01.078 This camera system features a lens 717 00:51:01.079 --> 00:51:03.231 and an LED emitter 718 00:51:03.231 --> 00:51:07.055 Red Spy works by emitting light, though it's not visible, 719 00:51:07.056 --> 00:51:08.916 onto reflective stickers placed on the ceiling 720 00:51:09.072 --> 00:51:13.060 These stickers reflect the light back to the camera, 721 00:51:13.061 --> 00:51:15.645 allowing it to calculate positional data 722 00:51:15.646 --> 00:51:17.326 based on the reflection 723 00:51:17.326 --> 00:51:21.967 A key advantage of this system is that, as long as the sticker placements remain unchanged, 724 00:51:21.968 --> 00:51:26.294 the tracking system automatically functions whenever it’s turned on 725 00:51:26.843 --> 00:51:30.152 Now, let’s move to the computer interface 726 00:51:30.152 --> 00:51:32.752 to see how this data is processed
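The fixed-marker principle described above can be sketched numerically. The snippet below is a simplified, hypothetical illustration in plain Python, not the actual Red Spy algorithm: assuming a known, static marker map and a translation-only camera (no rotation), the camera position falls out of a simple average over the markers.

```python
import numpy as np

# Hypothetical map of ceiling stickers in world coordinates (metres).
marker_map = np.array([[0.0, 0.0, 4.0],
                       [2.0, 0.0, 4.0],
                       [0.0, 2.0, 4.0],
                       [2.0, 2.0, 4.0]])

def estimate_camera_position(observed, marker_map):
    """Translation-only pose estimate: with no rotation, each observation is
    marker_world - camera_position, so averaging (marker_world - observation)
    over all markers recovers the camera position."""
    return (marker_map - observed).mean(axis=0)

# Simulate a camera at a known position "seeing" the static markers.
true_camera = np.array([1.0, 0.5, 1.8])
observed = marker_map - true_camera   # markers measured relative to the camera
print(estimate_camera_position(observed, marker_map))
```

This is also why the dots in the mapping interface stay put while the camera moves: the map itself is fixed, and only the camera's estimated position relative to it changes.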
727 00:51:33.087 --> 00:51:34.463 Let's go 728 00:51:36.239 --> 00:51:44.000 Now, if you look at the screen here, this is the Red Spy mapping interface we just saw earlier 729 00:51:44.000 --> 00:51:47.330 Each of these dots represents a sticker affixed to the ceiling 730 00:51:48.629 --> 00:51:53.117 The system continuously calculates the positions of these stickers, 731 00:51:53.118 --> 00:51:56.673 creating a comprehensive map for tracking 732 00:51:57.748 --> 00:51:59.387 Let’s move the camera to demonstrate 733 00:52:03.256 --> 00:52:05.528 As the camera moves, 734 00:52:05.987 --> 00:52:07.658 these dots are staying 735 00:52:07.658 --> 00:52:11.045 You might think these dots are shifting, 736 00:52:11.047 --> 00:52:13.019 but in reality, they are static 737 00:52:13.231 --> 00:52:21.780 The system tracks movement by calculating the positional data changes relative to the fixed points 738 00:52:23.205 --> 00:52:26.076 So far, we’ve covered the basic setup, 739 00:52:26.077 --> 00:52:28.563 but there is an additional tracking system that enhances this process 740 00:52:28.563 --> 00:52:30.985 Let's take a look at that 741 00:52:32.103 --> 00:52:38.531 Until now, the tracking systems we discussed focus on positional, or six-axis data 742 00:52:39.047 --> 00:52:41.608 However, for precise rotation, focus adjustments, 743 00:52:41.609 --> 00:52:43.964 and other camera parameters, 744 00:52:43.964 --> 00:52:48.291 an additional Steadif tracking device is used 745 00:52:48.678 --> 00:52:51.680 This device is mounted on the camera rig 746 00:52:51.681 --> 00:52:53.125 and connected to the camera head 747 00:52:53.125 --> 00:52:56.673 It captures the rotation data of the head 748 00:52:56.674 --> 00:52:58.277 and streams it in real time 749 00:52:58.516 --> 00:53:02.000 These data points are then transferred to a dedicated control panel for further adjustments 750 00:53:11.031 --> 00:53:15.272 The additional camera tracking data we discussed earlier is sent here, 751 
00:53:15.273 --> 00:53:19.402 allowing operators to fine-tune and manage settings directly from this panel 752 00:53:20.000 --> 00:53:22.228 The devices you’re looking at now 753 00:53:22.852 --> 00:53:27.619 play a crucial role in the ICVFX system 754 00:53:27.620 --> 00:53:29.224 These are the Disguise media servers 755 00:53:29.224 --> 00:53:32.777 The specific hardware model here is the VX4, 756 00:53:32.777 --> 00:53:34.726 which is responsible 757 00:53:34.727 --> 00:53:38.457 for real-time rendering 758 00:53:38.457 --> 00:53:41.328 and high-quality video output 759 00:53:41.393 --> 00:53:44.269 The system handles all aspects of Genlock synchronization 760 00:53:45.075 --> 00:53:46.790 To explain each part briefly, 761 00:53:46.791 --> 00:53:51.881 the Reference setting allows the device to receive sync signals 762 00:53:51.881 --> 00:53:55.086 from external equipment if needed 763 00:53:55.518 --> 00:53:58.978 In this case, since the device is set as the master, it is configured to Internal mode 764 00:53:59.699 --> 00:54:02.613 One of the most critical sections is the Black Burst output setting 765 00:54:02.615 --> 00:54:08.917 This setting sends synchronization signals externally 766 00:54:09.510 --> 00:54:15.742 If we change the format here, 767 00:54:15.743 --> 00:54:19.540 the sync signals of all connected equipment will be updated accordingly 768 00:54:20.472 --> 00:54:25.145 A key feature of this system is the ability to adjust signal formats for each individual output terminal 769 00:54:27.912 --> 00:54:35.498 Currently, the settings indicate a 1080-59.94i format 770 00:54:35.780 --> 00:54:38.809 Additionally, if timecode synchronization is required, 771 00:54:38.810 --> 00:54:42.042 it can be set to LTC or Linear Timecode 772 00:54:43.073 --> 00:54:47.599 The most essential task here is 773 00:54:47.609 --> 00:54:49.528 to monitor the output status panel 774 00:54:49.635 --> 00:54:54.107 At times, when using multiple tools, it is easy to overlook 
certain details, 775 00:54:54.108 --> 00:54:57.597 so it is crucial to check this setting first before proceeding 776 00:54:58.216 --> 00:55:02.283 Finally, while we have verified the Genlock sync status on other devices, 777 00:55:02.644 --> 00:55:07.487 let’s now check how Genlock synchronization is managed within the media server 778 00:55:07.935 --> 00:55:11.357 At the top of the software interface, there is a Feed tab 779 00:55:11.790 --> 00:55:13.960 Clicking this tab 780 00:55:13.961 --> 00:55:20.513 allows us to see all the settings related to output transmission 781 00:55:20.955 --> 00:55:26.610 In this section, we can confirm that the Refresh Rate is set to 59.94 782 00:55:27.007 --> 00:55:29.908 Additionally, 783 00:55:29.909 --> 00:55:33.225 frame rates can be adjusted according to our needs 784 00:55:33.906 --> 00:55:35.813 There is also a Latency Mode setting 785 00:55:35.814 --> 00:55:38.816 that allows switching between Full Speed and Half Speed, 786 00:55:38.817 --> 00:55:40.602 depending on the project requirements 787 00:55:41.227 --> 00:55:42.972 One of the most important indicators in this interface 788 00:55:44.227 --> 00:55:49.406 is the green status box in the upper-right corner 789 00:55:49.720 --> 00:55:55.816 This indicates whether the Genlock signal 790 00:55:55.823 --> 00:55:58.961 is correctly received by the system 791 00:55:59.536 --> 00:56:05.759 Currently, all indicators are green, meaning that all devices are synchronized nicely 792 00:56:06.021 --> 00:56:11.315 It means that all devices are operating under a single synchronized timing system 793 00:56:12.118 --> 00:56:14.725 Right now, this section appears black, 794 00:56:14.726 --> 00:56:19.704 which indicates that no signal is being sent or received 795 00:56:20.069 --> 00:56:23.442 If this indicator turns red, 796 00:56:23.443 --> 00:56:26.421 it means there is an issue that needs to be addressed 797 00:56:26.593 --> 00:56:28.378 To resolve this, 798 00:56:28.378 --> 
00:56:32.812 we need to reapply the Genlock signal 799 00:56:33.477 --> 00:56:38.068 from the Master Device settings that we saw, 800 00:56:38.069 --> 00:56:41.742 and refresh the synchronization settings 801 00:56:42.000 --> 00:56:46.023 Once applied, the system will take approximately 3 to 5 minutes to resynchronize 802 00:56:46.024 --> 00:56:49.479 This process ensures that 803 00:56:49.479 --> 00:56:54.410 the correct synchronization signal is distributed properly 804 00:56:55.482 --> 00:57:00.429 This concludes our ICVFX system overview at VX Studio 805 00:57:00.907 --> 00:57:03.667 So far, we have covered both the theoretical principles 806 00:57:03.668 --> 00:57:09.432 and practical demonstrations of ICVFX system integration 807 00:57:09.950 --> 00:57:13.530 As we have seen, various hardware and software components 808 00:57:13.531 --> 00:57:16.064 must work together in a synchronized manner 809 00:57:16.064 --> 00:57:21.352 Because of this, each component plays a crucial role 810 00:57:21.524 --> 00:57:27.693 We have explored the fundamental concepts and key components of ICVFX 811 00:57:27.823 --> 00:57:32.281 I hope this lecture has helped clarify that ICVFX is much more than simply using an LED wall, 812 00:57:32.281 --> 00:57:37.825 but rather a highly sophisticated integration of advanced technology and precision-engineered equipment 813 00:57:38.417 --> 00:57:44.389 I hope the knowledge you have gained today will be valuable for your future projects and work 814 00:57:44.718 --> 00:57:47.850 Since ICVFX is a rapidly evolving field, 815 00:57:47.851 --> 00:57:51.857 new technologies and equipment are constantly emerging 816 00:57:52.330 --> 00:57:55.840 I encourage you to stay curious, 817 00:57:55.841 --> 00:58:00.207 continue exploring, and accumulate hands-on experience and practical knowledge 818 00:58:00.378 --> 00:58:04.353 With that, let’s summarize today’s lesson and wrap up 819 00:58:04.354 --> 00:58:05.782 Thank you 820 00:58:05.782 --> 00:58:07.013 Basic Concept of VP
Traditional Filming Methods: Location shooting, open set filming, chroma background shooting, post-production VFX 821 00:58:07.014 --> 00:58:10.794 VP Technologies ICVFX, MOCAP, PREVIZ, VR, MR, XR, Virtual Camera, SimulCam 822 00:58:10.794 --> 00:58:15.754 Basic Concept of ICVFX VP technology utilizing LED Walls: ICVFX 2D, ICVFX XR 823 00:58:15.754 --> 00:58:20.739 ICVFX Components LED Walls, Image Processor, Camera and Lenses, Media Server or Render PC, Genlock Sync Across All Devices, Camera Tracking System 824 00:58:20.739 --> 00:58:22.059 ICVFX System Overview via Diagram Operating Machine: Controls the entire system, manages network communication and connects to rendering machines 825 00:58:22.059 --> 00:58:22.841 Rendering Machines: Uses Unreal Engine to render 3D environments in real-time and output them to LED screens 826 00:58:22.841 --> 00:58:23.722 Network Switch Hub & Sync Generator: Synchronizes all devices to maintain accurate signal timing, preventing screen tearing or mismatches 827 00:58:23.722 --> 00:58:24.363 LED Image Processor: Adjusts the rendered images to fit the LED screen, ensuring high-quality visual output 828 00:58:24.363 --> 00:58:24.844 Camera and Lens Tracking System: Ensures real-world camera movement and lens data are accurately reflected in the virtual environment 829 00:58:24.844 --> 00:58:25.603 Helps create seamless, realistic scenes 830 00:58:25.603 --> 00:58:30.523 VX Studio On-Site Demonstration LED Wall, Brain Bar, LED Image Processor, Camera Tracking, Media Server, Master Genlock
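The marker-based tracking idea demonstrated earlier (fixed ceiling stickers whose apparent offsets reveal camera movement) can be sketched in a few lines. This is an illustrative sketch only, not the vendor's tracking algorithm: real systems solve a full six-axis pose with rotation, lens distortion, and noise handling, whereas this toy assumes the camera does not rotate. All names and coordinate values below are hypothetical.

```python
# Minimal sketch of inside-out marker tracking (illustrative, not vendor code).
# The markers never move; the camera infers its own position from how the
# markers appear to be offset relative to itself.
def estimate_camera_position(marker_world, observed_offsets):
    """Average (world position - observed offset) over all markers."""
    n = len(marker_world)
    est = [0.0, 0.0, 0.0]
    for world, off in zip(marker_world, observed_offsets):
        for axis in range(3):
            est[axis] += (world[axis] - off[axis]) / n
    return tuple(est)

# Hypothetical ceiling markers (world coordinates, metres)
markers = [(0.0, 0.0, 4.0), (2.0, 0.0, 4.0), (0.0, 2.0, 4.0), (2.0, 2.0, 4.0)]
# Offsets a camera sitting at (1.0, 1.0, 1.5) would observe for each marker
offsets = [(-1.0, -1.0, 2.5), (1.0, -1.0, 2.5), (-1.0, 1.0, 2.5), (1.0, 1.0, 2.5)]

print(estimate_camera_position(markers, offsets))  # -> (1.0, 1.0, 1.5)
```

Averaging over several markers is what makes the map "comprehensive": any single sticker measurement is noisy, but many fixed reference points together give a stable estimate.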
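The "59.94" figure shown in the media server's output format is shorthand for the exact NTSC-family rate 60000/1001 Hz. A small arithmetic sketch (illustrative only, not tied to any particular software) shows why the genlock discussed above matters: a free-running device at exactly 60 Hz and one at 59.94 Hz drift apart by roughly 216 fields every hour unless both are locked to a common reference.

```python
from fractions import Fraction

# Exact NTSC-family field rate behind the "59.94" label
ntsc_rate = Fraction(60000, 1001)   # = 59.9400599... Hz
even_rate = Fraction(60, 1)         # a free-running "exactly 60 Hz" clock

# Without genlock, the clocks accumulate a field-count difference over time
seconds_per_hour = 3600
drift_fields = (even_rate - ntsc_rate) * seconds_per_hour

print(float(drift_fields))  # roughly 215.78 fields of drift per hour
```

This is why every device in the chain, from the media servers to the LED processors, must follow the single master sync signal rather than its own internal clock.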