WEBVTT 1 00:00:23.802 --> 00:00:24.811 Hello 2 00:00:24.811 --> 00:00:28.235 Today, let's take some time 3 00:00:28.235 --> 00:00:31.320 to analyze new technologies in more detail 4 00:00:32.340 --> 00:00:36.967 Today, we will utilize various sensors and 5 00:00:36.967 --> 00:00:41.099 how they can be used to interact with users 6 00:00:41.099 --> 00:00:46.350 Next, we'll explore the importance and potential applications of XR technology 7 00:00:46.350 --> 00:00:49.564 So, based on these aspects 8 00:00:49.564 --> 00:00:53.730 we aim to utilize new technologies in immersive content 9 00:00:53.730 --> 00:00:58.091 and to discuss what kinds of content 10 00:00:58.091 --> 00:01:00.273 can emerge from these technologies 11 00:01:00.955 --> 00:01:04.737 Definitions and Applications of XR 12 00:01:04.737 --> 00:01:10.235 First, let's discuss the definition and applications of XR 13 00:01:10.235 --> 00:01:16.082 I assume the definitions of XR, VR, AR, and MR 14 00:01:16.082 --> 00:01:17.339 have been covered in many lectures 15 00:01:17.820 --> 00:01:23.228 However, I'll try to explain these concepts 16 00:01:23.228 --> 00:01:25.451 focusing less on theoretical and foundational aspects 17 00:01:25.451 --> 00:01:27.814 in a more simple way 18 00:01:28.220 --> 00:01:31.791 AR is simply based on smartphones 19 00:01:32.919 --> 00:01:36.922 You can think of it as viewing through the camera lens 20 00:01:36.922 --> 00:01:39.379 Perhaps it's easier to think of it that way 21 00:01:39.379 --> 00:01:40.850 Then, what's VR? 22 00:01:40.850 --> 00:01:47.562 VR is based on HMD, which stands for Head-Mounted Display 23 00:01:47.562 --> 00:01:51.497 it's entering a virtual world to enjoy content 24 00:01:51.497 --> 00:01:54.620 VR can be defined as immersing oneself in a virtual environment to experience content 25 00:01:54.620 --> 00:01:56.875 Then, what's MR? 26 00:01:56.875 --> 00:02:02.557 MR can be defined as being based on HMD 27 00:02:02.557 --> 00:02:07.518 But incorporating lens-based technology to simultaneously show the real world 28 00:02:07.518 --> 00:02:10.779 allowing for augmented experiences 29 00:02:10.779 --> 00:02:16.501 Actually, the definitions I just provided for AR, VR, and MR were 30 00:02:16.501 --> 00:02:20.660 to help everyone understand 31 00:02:20.660 --> 00:02:24.206 It might be helpful to blend those examples a bit 32 00:02:24.206 --> 00:02:26.979 for easier understanding 33 00:02:26.979 --> 00:02:29.894 When I approach technology 34 00:02:29.894 --> 00:02:35.779 I often see technology based on content from movies 35 00:02:35.779 --> 00:02:41.082 In fact, if you enjoy sci-fi movies, you'll see all sorts of futuristic technologies in them 36 00:02:41.964 --> 00:02:46.746 So, practicing matching terms with scenes 37 00:02:46.746 --> 00:02:48.763 from such movies might make it easier 38 00:02:48.763 --> 00:02:54.060 for you to encounter new technologies 39 00:02:54.060 --> 00:02:58.699 Thus, we've concluded with the definitions of AR, VR, and MR, returning to the basics 40 00:02:58.699 --> 00:03:00.660 What's the definition of XR? 
41 00:03:00.660 --> 00:03:06.651 XR is a definition that combines all definitions of AR, VR, and MR 42 00:03:07.216 --> 00:03:14.177 So, some people actually refer to AR as XR as well 43 00:03:14.177 --> 00:03:17.330 or VR as XR 44 00:03:17.330 --> 00:03:20.264 or even MR as XR 45 00:03:20.264 --> 00:03:22.025 However, it's not an incorrect statement 46 00:03:22.025 --> 00:03:25.689 But if you see XR as an overarching concept 47 00:03:25.689 --> 00:03:28.109 it could be seen as a somewhat incorrect statement in that context 48 00:03:28.109 --> 00:03:31.754 So, going forward, the definition of XR would be 49 00:03:31.754 --> 00:03:35.609 I see it as spatial-based or spatial-content 50 00:03:35.609 --> 00:03:38.510 which is likely to expand 51 00:03:38.510 --> 00:03:42.548 That's not to say that AR, VR, and MR aren't part of XR 52 00:03:42.548 --> 00:03:46.339 But it's important to clearly distinguish the differences between them 53 00:03:46.339 --> 00:03:51.107 Perhaps we could define AR, VR, MR and XR 54 00:03:51.107 --> 00:03:53.168 in terms of their potential for expansion 55 00:03:53.168 --> 00:03:59.460 and extension into these areas 56 00:03:59.460 --> 00:04:03.560 Then, if I were to give a simple definition of XR 57 00:04:03.560 --> 00:04:07.681 let's delve a bit deeper into the definition of AR this time 58 00:04:07.681 --> 00:04:10.268 AR stands for Augmented Reality, which means augmenting the real world with digital elements 59 00:04:11.070 --> 00:04:16.225 Augmented Reality, it's on top of the real world, in physical space 60 00:04:16.225 --> 00:04:20.133 It combines images or videos existing in real space with computer-generated graphics 61 00:04:20.707 --> 00:04:24.591 Therefore, it synthesizes images, whether they exist or not, into the real world 62 00:04:24.591 --> 00:04:27.881 It's a technology that creates virtual spaces within the real world 63 00:04:27.881 --> 00:04:30.555 Where? Where do they create it? 
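As just described, AR combines camera images of real, physical space with computer-generated graphics and shows the combined result on a screen. Below is a minimal sketch of that compositing idea, assuming Python with the OpenCV library (not named in the lecture); the green rectangle and label are placeholders for whatever virtual content an AR application would actually render and register to the scene.

```python
# Minimal sketch of the AR idea described above: take real-world camera frames
# and composite computer-generated graphics on top, shown on a monitor.
# Assumes Python with OpenCV installed; the overlay content is a placeholder,
# not a real tracking/registration pipeline.
import cv2

cap = cv2.VideoCapture(0)  # default camera = the "real world" input

while True:
    ok, frame = cap.read()  # real-world image from the camera
    if not ok:
        break

    # "Virtual" content generated by the computer and drawn over the real scene
    h, w = frame.shape[:2]
    cv2.rectangle(frame, (w // 4, h // 4), (3 * w // 4, 3 * h // 4), (0, 255, 0), 2)
    cv2.putText(frame, "virtual object", (w // 4, h // 4 - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)

    cv2.imshow("AR sketch", frame)  # the combined image appears on the monitor
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```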
64 00:04:30.555 --> 00:04:33.176 Yes, it's created within a monitor 65 00:04:33.889 --> 00:04:36.524 This kind of augmented reality technology 66 00:04:36.524 --> 00:04:39.602 immerses users in virtual reality 67 00:04:39.602 --> 00:04:43.179 enabling real-time interactions that you can't have in the actual environment 68 00:04:43.179 --> 00:04:46.179 That's how it's defined in the dictionary 69 00:04:46.179 --> 00:04:52.120 Actually, this immersion is incredibly present in both VR and AR 70 00:04:52.120 --> 00:04:54.299 It seems to be present in MR and XR as well 71 00:04:54.299 --> 00:04:58.512 But the idea of immersion seems to involve a similar format to the real environment 72 00:04:58.512 --> 00:05:01.420 It seems to involve implementing that concept 73 00:05:01.420 --> 00:05:05.851 It relates to various industries such as broadcasting, video, and healthcare 74 00:05:05.851 --> 00:05:08.489 It's being applied in various areas 75 00:05:08.489 --> 00:05:12.204 However, the most representative example of AR 76 00:05:12.204 --> 00:05:14.722 would be Pokemon GO 77 00:05:14.722 --> 00:05:18.961 and the second most prominent thing we encountered after COVID-19 78 00:05:18.961 --> 00:05:21.009 could be considered QR codes 79 00:05:21.009 --> 00:05:22.972 I actually thought QR codes were going to disappear 80 00:05:23.477 --> 00:05:26.834 since I found them ugly and inconvenient 81 00:05:26.834 --> 00:05:30.114 In fact, after COVID-19, there's been a significant increase in the use of QR codes 82 00:05:30.114 --> 00:05:33.500 and they have come into very wide use 83 00:05:33.500 --> 00:05:39.583 When you go to places like restaurants, you often see kiosks, tablets, or similar devices for ordering, right? 84 00:05:39.583 --> 00:05:42.012 In the future, these kinds of processes will likely 85 00:05:42.012 --> 00:05:44.388 be mostly done through QR codes 86 00:05:44.388 --> 00:05:47.213 And these QR codes 87 00:05:47.213 --> 00:05:49.806 come in a wide variety 88 00:05:49.806 --> 00:05:53.138 serving as a means to promote oneself and express various aspects 89 00:05:53.138 --> 00:05:54.966 Next up is virtual reality 90 00:05:54.966 --> 00:05:58.230 Virtual reality is a completely imagined space 91 00:05:58.230 --> 00:06:00.398 It's a space that doesn't exist in reality 92 00:06:00.398 --> 00:06:05.767 That's why it requires special devices 93 00:06:05.767 --> 00:06:08.829 It's represented by devices like HMDs 94 00:06:08.829 --> 00:06:11.925 Through head-mounted displays 95 00:06:11.925 --> 00:06:16.189 they allow users to access virtual worlds from the real world 96 00:06:16.189 --> 00:06:19.362 and transform the real-world experience 97 00:06:19.362 --> 00:06:22.220 into accessing an artificial environment 98 00:06:22.220 --> 00:06:29.297 In fact, virtual reality can be composed of real places or videos 99 00:06:29.297 --> 00:06:33.859 or it could be 100% computer-generated CG environments 100 00:06:33.859 --> 00:06:37.694 Real environments can be created using 360-degree cameras 101 00:06:37.694 --> 00:06:42.420 and those cameras can produce videos of actual locations 102 00:06:42.420 --> 00:06:44.737 By creating videos of these environments 103 00:06:44.737 --> 00:06:48.500 we try to represent actual places as if you're really there in 360 degrees 104 00:06:48.500 --> 00:06:52.122 So far, these have been represented mainly through HMDs 105 00:06:52.122 --> 00:06:56.100 accessing these realities through head-mounted displays, but 106 00:06:56.100 --> 00:07:01.133 as mentioned in the
previous lecture, square and similar areas 107 00:07:01.133 --> 00:07:04.398 extensions of XR are also possible 108 00:07:04.398 --> 00:07:07.043 Extensions into the virtual realm are where 109 00:07:07.043 --> 00:07:08.443 much of the expansion is happening 110 00:07:09.017 --> 00:07:14.870 What are the differences between virtual and augmented reality? 111 00:07:15.385 --> 00:07:16.430 The biggest difference is 112 00:07:16.430 --> 00:07:20.460 whether it utilizes information from the real world 113 00:07:20.460 --> 00:07:24.552 In case of augmented reality, it involves using the camera 114 00:07:24.552 --> 00:07:27.120 by receiving information from the real world 115 00:07:27.120 --> 00:07:29.653 and it operates based on that information 116 00:07:30.247 --> 00:07:34.364 But virtual reality is entering entirely new spaces, spaces that don't exist 117 00:07:34.364 --> 00:07:39.761 Even though it can recreate existing spaces, it also creates entirely new virtual reality spaces 118 00:07:40.266 --> 00:07:43.415 The data for spaces in virtual reality 119 00:07:43.415 --> 00:07:46.479 doesn't measure data from real world 120 00:07:46.479 --> 00:07:50.143 Therefore, the biggest difference between virtual reality and augmented reality is 121 00:07:50.143 --> 00:07:54.789 whether it utilizes information from the real world 122 00:07:54.789 --> 00:07:59.700 And whether it creates entirely new spaces from scratch 123 00:07:59.700 --> 00:08:04.364 it's worth considering that virtual reality can come into existence 124 00:08:05.246 --> 00:08:08.900 For examples of augmented reality 125 00:08:09.534 --> 00:08:15.540 content extensions can be seen in games like Minecraft 126 00:08:15.540 --> 00:08:19.068 It can also be exemplified in experiences like Pokemon GO 127 00:08:19.068 --> 00:08:23.480 Currently, the most promising developments could be 128 00:08:23.480 --> 00:08:26.048 applications in the logistics system 129 00:08:26.048 --> 00:08:28.310 Possible advancements in the logistics system as well 130 00:08:28.310 --> 00:08:32.083 And it can be personalized and tailored to individual needs 131 00:08:32.083 --> 00:08:35.680 It would be beneficial to consider personalized marketing ventures as potential content 132 00:08:35.680 --> 00:08:40.659 for virtual reality and augmented reality in the future 133 00:08:40.659 --> 00:08:42.876 I'm sure you all know what algorithms are 134 00:08:42.876 --> 00:08:44.973 What algorithms do you know? 
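The lecture goes on to describe how platforms such as YouTube surface personalized content based on what each viewer has already watched. As a rough, purely illustrative sketch of that idea (not any platform's actual algorithm), candidate items can be scored by how strongly their tags overlap with an interest profile built from the viewing history; the profile, titles, and tags below are made-up Python values.

```python
# Rough sketch of content-based recommendation, as discussed in the lecture:
# score candidate items by how much they overlap with topics the user already
# watches. Purely illustrative; not any platform's real algorithm.
from collections import Counter

# Pretend interest profile built from the user's watch history (tag -> weight)
profile = Counter({"vr": 3, "hardware": 2, "ai": 1})

candidates = {
    "New VR glove hands-on":      {"vr", "hardware"},
    "Politics roundup":           {"politics", "news"},
    "AI texture generation demo": {"ai", "graphics"},
    "Celebrity interview":        {"celebrity"},
}

def score(tags):
    """Higher score = more overlap with what the user already watches."""
    return sum(profile.get(tag, 0) for tag in tags)

for title, tags in sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{score(tags):2d}  {title}")
```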
135 00:08:44.973 --> 00:08:47.693 You've probably heard of 136 00:08:47.693 --> 00:08:50.500 the YouTube algorithm, the Google algorithm, and the Naver algorithm 137 00:08:50.500 --> 00:08:56.290 When you've looked into how algorithms determine what content surfaces, 138 00:08:56.290 --> 00:09:00.020 going forward, personalized content tailored to each individual 139 00:09:00.020 --> 00:09:04.792 will continue to be provided on YouTube 140 00:09:04.792 --> 00:09:06.481 Thus, some contents that you're interested in, such as 141 00:09:06.481 --> 00:09:09.979 if you're interested in politics, you'll see a lot of political content 142 00:09:09.979 --> 00:09:12.841 If you're interested in VR 143 00:09:13.406 --> 00:09:15.885 you'll find a lot of VR content 144 00:09:15.885 --> 00:09:20.219 If you're interested in artificial intelligence, you'll receive a lot of recommendations for AI content 145 00:09:20.219 --> 00:09:23.719 If you're interested in celebrities, you'll find a lot of celebrity content 146 00:09:23.719 --> 00:09:27.260 If you're interested in Hallyu (Korean Wave), you'll also find a lot of content related to that 147 00:09:27.260 --> 00:09:28.786 When you've explored these areas 148 00:09:28.786 --> 00:09:30.591 I believe people can create new algorithms 149 00:09:30.591 --> 00:09:33.213 based on these existing algorithms 150 00:09:34.332 --> 00:09:37.303 Based on the aspects that you've already formed 151 00:09:37.303 --> 00:09:40.893 through your algorithms 152 00:09:40.893 --> 00:09:44.243 I can probably tell what specific colors you like 153 00:09:44.807 --> 00:09:45.484 Can I? 154 00:09:46.008 --> 00:09:47.828 Well, based on the content 155 00:09:47.828 --> 00:09:52.370 through algorithmic analysis, it's possible to determine what specific colors people might like 156 00:09:52.370 --> 00:09:55.460 Color preference and matching outfits: Algorithmic analysis and recommendations If, for example, when you shop online 157 00:09:55.460 --> 00:10:01.183 and looked at clothing related to the specific colors you like 158 00:10:01.183 --> 00:10:05.539 it can be seen as something that could be included in recommendations by the algorithm 159 00:10:05.539 --> 00:10:08.563 I'll imagine this scenario a bit further 160 00:10:08.563 --> 00:10:10.171 This is a product that hasn't been released yet 161 00:10:11.290 --> 00:10:13.096 The phone bends, right? 162 00:10:13.096 --> 00:10:15.962 Through a flexible display, the phone bends 163 00:10:16.566 --> 00:10:20.780 And fiber-based displays are currently being developed 164 00:10:21.473 --> 00:10:25.900 Let me share an imaginative scenario about future content 165 00:10:25.900 --> 00:10:29.752 While preparing for the lecture this morning, 166 00:10:29.752 --> 00:10:34.140 I was considering which suit to wear and looked at various colors 167 00:10:34.140 --> 00:10:36.689 There's navy blue, black 168 00:10:36.689 --> 00:10:40.981 and even charcoal gray suits, among others 169 00:10:40.981 --> 00:10:43.979 I ended up wearing the most classic black suit 170 00:10:43.979 --> 00:10:46.324 However, when I stood in front of the camera 171 00:10:46.324 --> 00:10:49.321 I started to think, "Maybe I should have worn navy blue instead?" 
172 00:10:49.321 --> 00:10:51.659 and felt a slight regret 173 00:10:51.659 --> 00:10:54.947 Decisions and dilemmas in everyday life could be solved through algorithms It seems like these aspects could be addressed in the future 174 00:10:54.947 --> 00:10:57.859 I'm imagining content right now 175 00:10:57.859 --> 00:11:02.003 It's a prudent imagination about 176 00:11:02.003 --> 00:11:04.257 whether these aspects might actually come true in 10 years 177 00:11:05.475 --> 00:11:09.251 Flexible displays have already emerged in smartphones 178 00:11:09.251 --> 00:11:10.475 The display bends 179 00:11:10.881 --> 00:11:15.007 And there's a lot of development and 180 00:11:15.007 --> 00:11:16.420 emergence of fiber-based displays right now 181 00:11:16.420 --> 00:11:18.462 So, in the future 182 00:11:18.462 --> 00:11:22.000 it's possible that suits could be made 183 00:11:22.000 --> 00:11:25.059 entirely from fiber materials like this in the future 184 00:11:25.059 --> 00:11:29.513 Someday, it's conceivable that 185 00:11:29.513 --> 00:11:32.649 there could be features on phones where you can say 186 00:11:32.649 --> 00:11:34.900 "Today, I'd like to wear this color," or "Today, I'd like to wear this pattern" 187 00:11:34.900 --> 00:11:38.110 If that were the case, using your phone in the morning 188 00:11:38.110 --> 00:11:43.358 you wouldn't buy clothes based on 189 00:11:43.358 --> 00:11:44.938 patterns like A today, B tomorrow, C the next day 190 00:11:44.938 --> 00:11:47.807 clothing becomes more like a tool 191 00:11:47.807 --> 00:11:51.781 and I think it might change into a form 192 00:11:51.781 --> 00:11:54.460 where people buy designs created by you 193 00:11:54.460 --> 00:12:01.627 In that case, industries like VR and XR, which produce immersive content 194 00:12:01.627 --> 00:12:04.179 could be seen as connected to the fashion industry 195 00:12:04.179 --> 00:12:08.467 Until now, designers and design, as well as content 196 00:12:08.477 --> 00:12:12.350 have been somewhat disconnected from industries 197 00:12:12.350 --> 00:12:14.946 That's why, in fact, design has been 198 00:12:14.946 --> 00:12:18.820 more about my own pride and satisfaction 199 00:12:18.820 --> 00:12:22.503 In reality, many people, especially older generations 200 00:12:22.503 --> 00:12:24.820 have the perception that design is something that can be done in just one day 201 00:12:24.820 --> 00:12:30.460 However, with the emergence of content and immersive experiences, there's been a significant industrial connection 202 00:12:30.460 --> 00:12:33.223 You should be aware of the significant changes 203 00:12:33.223 --> 00:12:35.976 happening in industries, which would be beneficial 204 00:12:37.144 --> 00:12:40.465 Next, we have an example of virtual reality 205 00:12:40.465 --> 00:12:43.156 Currently, the virtual reality we know of is 206 00:12:43.156 --> 00:12:47.979 primarily focused on the entertainment sector 207 00:12:47.979 --> 00:12:50.760 If you look at the cases, you might already be familiar with them 208 00:12:50.760 --> 00:12:53.739 But in this entertainment context 209 00:12:53.739 --> 00:12:57.091 the VR content we create is 210 00:12:57.091 --> 00:13:00.368 now becoming usable 211 00:13:00.368 --> 00:13:02.979 in a wide variety of fields 212 00:13:02.979 --> 00:13:07.497 While the majority of VR content is still based on devices 213 00:13:07.497 --> 00:13:11.580 like head-mounted displays (HMDs) 214 00:13:11.580 --> 00:13:18.868 these applications span across various industries including gaming, film, 
fine arts, design, architecture, medicine, aviation, military, etc. 215 00:13:18.868 --> 00:13:23.809 VR device Oculus/ Artwork created with VR by Giovanni Nakpil It's reported that there's a lot of utilization happening across various fields 216 00:13:24.219 --> 00:13:26.097 Indeed, in the United States 217 00:13:27.067 --> 00:13:31.456 military personnel are actively using VR technology 218 00:13:31.456 --> 00:13:35.550 for various aspects of military training 219 00:13:35.550 --> 00:13:37.951 Samsung Electronics in South Korea, too 220 00:13:37.951 --> 00:13:44.053 They supply VR training content to the Ministry of National Defense 221 00:13:44.053 --> 00:13:46.173 where personnel wear Gear VR for virtual training 222 00:13:46.896 --> 00:13:51.133 I've analyzed some of these cases 223 00:13:51.133 --> 00:13:55.210 You can see that these technologies are also widely utilized in South Korea 224 00:13:55.210 --> 00:14:00.659 especially in reserve forces training 225 00:14:00.659 --> 00:14:04.845 And these advancements are further developed in the field of MR 226 00:14:04.845 --> 00:14:06.249 Mixed Reality 227 00:14:06.249 --> 00:14:08.615 Mixed Reality is a blend of virtual and real worlds 228 00:14:08.615 --> 00:14:12.181 This mixed reality involves real environments and augmented reality 229 00:14:12.181 --> 00:14:16.246 This includes both augmented reality and virtual reality, 230 00:14:16.246 --> 00:14:21.232 creating a digital world within the real world 231 00:14:21.232 --> 00:14:23.419 that can be integrated with augmented reality and similar technologies 232 00:14:23.419 --> 00:14:28.202 Imagine a scenario where 233 00:14:28.202 --> 00:14:29.618 these real and digital worlds merge 234 00:14:29.618 --> 00:14:35.789 While wearing the MR device, you could navigate the streets and engage in various activities 235 00:14:35.789 --> 00:14:41.933 Mixed Reality (MR) examples: Microsoft HoloLens 2 / Meta Quest 3 Could it be a world similar to the one Iron Man creates in the Avengers? 236 00:14:42.775 --> 00:14:46.341 While using MR technology, you could have you own J.A.R.V.I.S., assisting you with various tasks and providing information 237 00:14:46.341 --> 00:14:49.059 It narrates information like an encyclopedia, based on your knowledge 238 00:14:49.059 --> 00:14:52.116 But what do we have? 
239 00:14:52.116 --> 00:14:58.956 We have AI from Google for Android devices, and Siri for iPhones 240 00:14:58.956 --> 00:15:02.219 With these features, currently, 241 00:15:02.219 --> 00:15:07.260 many aspects of future content depiced in Avengers movies 242 00:15:07.260 --> 00:15:11.838 are now reflected in the real world 243 00:15:11.838 --> 00:15:15.940 Apple has released products like the Apple Vision Pro 244 00:15:15.940 --> 00:15:20.524 These developments indicate 245 00:15:21.246 --> 00:15:25.859 a gradual convergence of the real and virtual worlds in our reality 246 00:15:25.859 --> 00:15:28.614 Because I was curious 247 00:15:28.614 --> 00:15:35.059 I also wanted to try out devices for Gear VR and mixed reality 248 00:15:35.059 --> 00:15:38.380 so I bought one and tried using it 249 00:15:38.380 --> 00:15:41.699 When watching movies 250 00:15:41.699 --> 00:15:46.739 you feel like you're completely immersed in the scene, as if you're right there in the square 251 00:15:46.739 --> 00:15:51.543 And when playing games, you experience a tremendous increase in immersion 252 00:15:51.543 --> 00:15:55.260 with a heightened sense of realism 253 00:15:55.260 --> 00:15:58.858 In Korea, there aren't many people 254 00:15:58.858 --> 00:16:00.724 using these devices to walk around the streets just yet 255 00:16:00.724 --> 00:16:03.979 However, I've personally seen many people doing so in the United States 256 00:16:03.979 --> 00:16:07.727 There are even people working in cafes using these devices 257 00:16:07.727 --> 00:16:10.803 Additionally 258 00:16:10.803 --> 00:16:13.435 some people use MR devices 259 00:16:13.435 --> 00:16:15.919 to walk around and view objects in real life 260 00:16:15.919 --> 00:16:19.360 And there are also people who engage in continuous conversation with Siri 261 00:16:19.360 --> 00:16:24.624 I believe that in the future, a world blended with Mixed Reality 262 00:16:24.624 --> 00:16:28.618 Virtual Reality, and XR will unfold before us 263 00:16:28.618 --> 00:16:31.790 I assume that because we anticipate such a world emerging 264 00:16:31.790 --> 00:16:37.460 it's likely why you're listening to this lecture now 265 00:16:37.460 --> 00:16:41.803 Then, if we consider whether XR was always just XR from the beginning 266 00:16:41.803 --> 00:16:44.243 the answer is likely no 267 00:16:44.243 --> 00:16:48.739 If we ask whether VR was always just VR from the beginning, the answer is likely no as well 268 00:16:48.739 --> 00:16:52.856 In fact, the origins of VR can be traced back to simulations and visualizations 269 00:16:52.856 --> 00:16:55.527 used for various purposes 270 00:16:55.527 --> 00:16:58.340 before evolving into the immersive experiences we know today 271 00:16:58.340 --> 00:17:00.555 So it started with simulators 272 00:17:00.555 --> 00:17:03.928 Then it was expressed as VR, AR, MR, XR, and so on 273 00:17:03.928 --> 00:17:09.285 It's being implemented as a time 274 00:17:09.285 --> 00:17:11.517 when we're going back to simulators based on VR and XR 275 00:17:11.517 --> 00:17:13.421 And these aspects 276 00:17:13.421 --> 00:17:18.413 now more people are buying HMDs and using them on their own 277 00:17:18.413 --> 00:17:23.298 And now there's a need for extensions of content 278 00:17:23.298 --> 00:17:27.140 that can be used beyond just wearing HMDs 279 00:17:27.140 --> 00:17:31.696 Let's delve a bit deeper 280 00:17:31.696 --> 00:17:34.803 into the technical aspects of these developments 281 00:17:34.803 --> 00:17:37.420 Now, XR technologies 282 00:17:37.420 
--> 00:17:41.158 They encompass a wide range of applications across various industries 283 00:17:41.158 --> 00:17:46.380 Describing these technologies as encompassing a diverse array of capabilities would be apt 284 00:17:46.380 --> 00:17:52.064 XR is currently based on tasks such as 3D scanning 285 00:17:53.262 --> 00:18:00.500 motion capture, CGI, and real-time rendering technology in Unreal Engine 286 00:18:00.500 --> 00:18:06.946 Furthermore, deep learning techniques for tactile sensation recognition through VR gloves 287 00:18:06.946 --> 00:18:12.540 are being analyzed as production elements for XR in VR 288 00:18:12.540 --> 00:18:16.526 Indeed, 3D scanning encompasses not only avatars' skeletal structures 289 00:18:16.526 --> 00:18:20.393 but also building architecture 290 00:18:20.393 --> 00:18:25.699 So, the concept of a digital twin for buildings and the digital mirror world mentioned in the lecture 291 00:18:25.699 --> 00:18:30.614 can be related to the use of 3D scanning 292 00:18:30.614 --> 00:18:34.583 These days, not only do people work with actual modeling, but 293 00:18:34.583 --> 00:18:38.459 they also express concepts through 3D scanning operations 294 00:18:38.459 --> 00:18:42.590 And along with that, there's motion capture technology to capture movements and actions 295 00:18:42.590 --> 00:18:45.115 All the actions I'm doing 296 00:18:45.115 --> 00:18:48.067 they can be replicated 297 00:18:48.819 --> 00:18:53.296 by a 3D-scanned avatar with natural movements 298 00:18:53.296 --> 00:18:57.180 This capability can be implemented not only with high-end equipment 299 00:18:57.180 --> 00:19:00.670 But also with the smartphones you already have 300 00:19:00.670 --> 00:19:04.180 which is worth mentioning 301 00:19:04.180 --> 00:19:06.565 So, as you can see on your phones 302 00:19:06.565 --> 00:19:10.019 most come equipped with at least two to three built-in cameras 303 00:19:10.019 --> 00:19:13.969 One is known as a depth camera, and the other is an RGB camera 304 00:19:13.969 --> 00:19:17.900 One is based on vision cameras, including IR cameras 305 00:19:17.900 --> 00:19:22.217 Smartphones with three lenses, including depth-sensing and vision cameras 306 00:19:22.217 --> 00:19:27.979 can perform motion capture tasks 307 00:19:27.979 --> 00:19:32.712 In addition, CGI technology, which stands for Computer-Generated Imagery 308 00:19:32.712 --> 00:19:36.060 refers to the process of generating and manipulating visual content using computer software 309 00:19:36.060 --> 00:19:39.939 It's one of the oldest and most foundational technologies in the realm of digital graphics 310 00:19:39.939 --> 00:19:44.260 You could also think of CGI as advanced post-processing for achieving high-quality visual effects 311 00:19:44.260 --> 00:19:46.056 In the past 312 00:19:46.056 --> 00:19:49.773 when creating content based on 3D 313 00:19:49.773 --> 00:19:54.939 All of these elements were primarily developed through rendering technologies 314 00:19:54.939 --> 00:19:58.525 The content created through Unreal Engine and Unity Engine, for example 315 00:19:58.525 --> 00:20:01.955 can now express realistic visuals live, in real time 316 00:20:01.955 --> 00:20:05.300 even those that previously had to be pre-rendered 317 00:20:05.300 --> 00:20:06.753 The following is deep learning 318 00:20:06.753 --> 00:20:11.150 If artificial intelligence technology is incorporated here, it would enable the creation of higher quality textures 319 00:20:11.150 --> 00:20:16.819 and backgrounds for the technologies that
one's developing 320 00:20:16.819 --> 00:20:23.028 Additionally, you can see that gloves and head-mounted can provide 321 00:20:23.028 --> 00:20:27.110 input-sensing that allows for more diverse expressions 322 00:20:27.110 --> 00:20:31.140 through facial expressions and other features 323 00:20:31.140 --> 00:20:34.668 Based on these aspects, it would be good to see that the metaverse platform 324 00:20:34.668 --> 00:20:37.387 it would be good to see that the metaverse platform 325 00:20:37.387 --> 00:20:39.819 can create a wider variety of XR experiences 326 00:20:39.819 --> 00:20:42.446 These aspects allow 327 00:20:42.446 --> 00:20:47.080 for the creation of content in real-time 328 00:20:47.080 --> 00:20:50.860 based on various hardware and motion capture 329 00:20:50.860 --> 00:20:54.540 utilizing diverse inputs 330 00:20:54.540 --> 00:20:57.947 While the performance of the device is also very important 331 00:20:57.947 --> 00:21:01.357 Because device performance has improved significantly 332 00:21:01.357 --> 00:21:03.453 You can see that there are hardly any content 333 00:21:03.453 --> 00:21:08.739 that are being created now that cannot be run 334 00:21:08.739 --> 00:21:11.595 Furthermore 335 00:21:11.595 --> 00:21:14.645 As these aspects become more diverse in terms of devices and sensors 336 00:21:14.645 --> 00:21:19.660 The applications of XR technology are increasingly expanding 337 00:21:19.660 --> 00:21:24.660 Particularly centered around simulators 338 00:21:24.660 --> 00:21:29.699 In fields such as manufacturing, healthcare, education, culture, and defense 339 00:21:29.699 --> 00:21:32.604 Based on these aspects 340 00:21:33.476 --> 00:21:37.550 You can also see that XR can no longer exclude elements 341 00:21:37.550 --> 00:21:40.619 Such as artificial intelligence 342 00:21:40.619 --> 00:21:43.597 These, including artificial intelligence, now involve creating virtual assistants 343 00:21:43.597 --> 00:21:47.203 So, what we were talking about earlier - creating a virtual assistant - 344 00:21:47.203 --> 00:21:50.150 is actually content 345 00:21:50.150 --> 00:21:56.702 However, the technologies that transform what I'm saying into text 346 00:21:56.702 --> 00:22:01.739 and translate into different languages 347 00:22:01.739 --> 00:22:04.330 can be summarized as aspects of artificial intelligence 348 00:22:04.330 --> 00:22:07.110 within the context of virtual assistants 349 00:22:07.110 --> 00:22:12.261 These could lead to a future 350 00:22:12.261 --> 00:22:16.001 where you can naturally utilize everything 351 00:22:16.001 --> 00:22:19.346 by simply wearing earphones 352 00:22:19.346 --> 00:22:21.699 similar to how you might wear earbuds or AirPods now with your Buzz or iPhone 353 00:22:21.699 --> 00:22:25.796 When these aspects become more commercialized 354 00:22:25.796 --> 00:22:28.001 for mixed reality (MR) devices 355 00:22:28.001 --> 00:22:33.260 They can be expressed in a much broader range of applications 356 00:22:33.260 --> 00:22:39.417 Then why would we want to express these aspects of XR? 357 00:22:40.516 --> 00:22:44.160 Why would XR be suitable for industries? 358 00:22:44.160 --> 00:22:46.619 We can actually create real content 359 00:22:46.619 --> 00:22:50.909 Why would industries consider transitioning to XR virtual and augmented reality worlds 360 00:22:50.909 --> 00:22:53.050 transitioning to XR virtual and augmented reality worlds 361 00:22:53.050 --> 00:22:58.755 when they already have existing content to work with? 
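Earlier, the lecture points out that motion capture no longer requires high-end equipment and can be done with the cameras already in an ordinary smartphone. As one concrete illustration, the sketch below uses Google's MediaPipe Pose library (an assumption on my part; the lecture does not name a specific tool) to extract skeletal landmarks from a regular RGB camera feed.

```python
# Minimal sketch of camera-based motion capture, as described in the lecture:
# extract a human skeleton (joint landmarks) from an ordinary RGB camera, with
# no suits or markers. Uses Google's MediaPipe Pose, which is an assumed choice
# of library, not one named in the lecture.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture(0)

with mp_pose.Pose(model_complexity=1) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 normalized landmarks (nose, shoulders, elbows, hips, ...)
            nose = results.pose_landmarks.landmark[0]
            print(f"nose at x={nose.x:.2f}, y={nose.y:.2f}")

cap.release()
```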
362 00:22:59.398 --> 00:23:01.201 Would you like to think about it for a moment? 363 00:23:03.320 --> 00:23:06.970 These factors are also evolving in tandem 364 00:23:06.970 --> 00:23:10.381 with our environmental impact 365 00:23:11.133 --> 00:23:17.660 I think you've probably heard a lot about ESG (Environmental, Social, and Governance) issues worldwide 366 00:23:17.660 --> 00:23:21.357 There are indeed many issues related to the environment being addressed 367 00:23:21.357 --> 00:23:24.667 These aspects are also relevant to virtual reality and augmented reality 368 00:23:24.667 --> 00:23:28.336 XR Benefit 1: Waste Reduction And XR helps to reduce waste 369 00:23:28.336 --> 00:23:30.728 Since separate training facilities are not built 370 00:23:30.728 --> 00:23:35.518 It means that training can be conducted multiple times without being discarded after a single use 371 00:23:35.518 --> 00:23:36.971 Because it is a content 372 00:23:36.971 --> 00:23:39.202 XR Benefit 2: Cost Reduction And it can reduce costs 373 00:23:40.014 --> 00:23:42.278 Content related to these aspects 374 00:23:42.278 --> 00:23:45.540 Content that is created once and then discarded 375 00:23:45.540 --> 00:23:47.622 needs to be continually discarded 376 00:23:47.622 --> 00:23:50.932 Therefore, it needs to be continually produced 377 00:23:50.932 --> 00:23:54.099 However, virtual content that is produced once 378 00:23:54.099 --> 00:23:57.782 can be continuously utilized 379 00:23:57.782 --> 00:24:01.979 XR Benefit 3: Improved Understanding Additionally, it can enhance comprehension 380 00:24:01.979 --> 00:24:05.542 XR Benefit 4: Project Efficiency And it allows for efficient project management 381 00:24:05.542 --> 00:24:10.663 Because it's essentially composed of virtual elements 382 00:24:10.663 --> 00:24:13.720 XR Benefit 5: Enhanced Safety Safety is extended and improved 383 00:24:13.720 --> 00:24:19.219 XR Benefit 6: Better Decision Making It can even lead to better decision-making, making a total of six benefits 384 00:24:19.219 --> 00:24:25.237 We anticipate that these aspects will expand further in the future 385 00:24:25.237 --> 00:24:29.097 due to environmental and ESG considerations 386 00:24:29.813 --> 00:24:33.733 Sensor-based interaction research 387 00:24:33.733 --> 00:24:36.489 XR technology encompasses various sensors 388 00:24:36.489 --> 00:24:42.646 User movements, positions, voice, emotions, and more 389 00:24:42.646 --> 00:24:46.300 It involves collecting various information 390 00:24:46.300 --> 00:24:48.921 and reflecting it in the virtual environment 391 00:24:48.921 --> 00:24:52.370 So, because of these aspects 392 00:24:52.370 --> 00:24:55.699 It's often tied closely to the real world 393 00:24:55.699 --> 00:24:59.188 Augmented reality similarly relies heavily on various sensor-based foundations 394 00:24:59.188 --> 00:25:02.264 for its expression 395 00:25:02.264 --> 00:25:06.063 Actually, you could say that 396 00:25:06.063 --> 00:25:08.595 Each of you also has these sensors one by one 397 00:25:09.328 --> 00:25:12.596 Gyroscope and accelerometer sensors Both the gyroscope and the accelerometer sensors are present in VR devices as well 398 00:25:12.596 --> 00:25:15.986 This gyroscope sensor is calibrated based on gravity here 399 00:25:15.986 --> 00:25:22.402 aligning with movements accordingly 400 00:25:22.402 --> 00:25:26.060 It initially started as a sensor primarily for gaming purposes 401 00:25:26.060 --> 00:25:30.167 Various contents are being developed 402 00:25:30.167 --> 00:25:33.184 
Based on these sensor foundations 403 00:25:33.184 --> 00:25:35.302 You've probably seen a lot of content on mobile devices 404 00:25:36.401 --> 00:25:41.301 Where you tilt to control the movement 405 00:25:41.301 --> 00:25:44.793 Thus, these contents are expressed 406 00:25:44.793 --> 00:25:49.801 through the interactions with sensors 407 00:25:49.801 --> 00:25:52.210 found in mobile and smartphones, among others 408 00:25:52.210 --> 00:25:54.053 And there's also the accelerometer sensor 409 00:25:54.053 --> 00:25:57.014 Using various sensors 410 00:25:57.014 --> 00:26:00.438 The accelerometer measures 411 00:26:00.438 --> 00:26:06.699 The current position and the next position to determine the acceleration 412 00:26:06.699 --> 00:26:12.015 The current position is determined based on past positions 413 00:26:12.015 --> 00:26:15.314 The sensor also incorporates aspects like velocity 414 00:26:15.314 --> 00:26:21.371 assessing how much movement occurs within a certain time frame 415 00:26:21.371 --> 00:26:24.231 Say, in one second 416 00:26:24.231 --> 00:26:27.245 It allows for a more extensive expression of the interaction 417 00:26:27.245 --> 00:26:31.060 in VR 418 00:26:31.060 --> 00:26:32.520 and XR 419 00:26:32.520 --> 00:26:36.400 Next up is the motion sensor, what is a motion sensor? 420 00:26:36.400 --> 00:26:41.168 It's a sensor that detects gestures and motions to express actions 421 00:26:41.168 --> 00:26:44.665 In the past, motion sensors typically relied on recognizing this kind of shape 422 00:26:44.665 --> 00:26:49.520 (A rectangle cut in half-like shape) to initiate recognition 423 00:26:49.520 --> 00:26:52.674 Nowadays, content based on human skeletons can be extracted 424 00:26:52.674 --> 00:26:57.180 Not only from specialized motion cameras 425 00:26:57.180 --> 00:26:58.739 But also from regular cameras 426 00:26:58.739 --> 00:27:01.016 It recognizes human faces as well 427 00:27:01.016 --> 00:27:04.806 By recognizing the face 428 00:27:04.806 --> 00:27:07.934 It can then extend recognition to the hands, feet, and individual joints 429 00:27:07.934 --> 00:27:10.748 Now, it's possible to recognize human motion 430 00:27:10.748 --> 00:27:13.430 incorporating recognition of various body parts 431 00:27:13.430 --> 00:27:16.257 Radar sensors that utilize electromagnetic waves Next up is the laser sensor 432 00:27:16.257 --> 00:27:20.540 In reality, laser sensors could be based on devices like laser pointers 433 00:27:20.540 --> 00:27:25.240 The ones you see used in presentations or lectures 434 00:27:25.240 --> 00:27:28.422 Actually, these laser sensors can also display colors in various ways 435 00:27:28.422 --> 00:27:29.900 Right now, it's emitting red light 436 00:27:29.900 --> 00:27:33.214 It can also display the red as blue 437 00:27:33.690 --> 00:27:35.500 It can also be expressed as green 438 00:27:35.500 --> 00:27:37.886 And in fact, it can also be represented as sensors 439 00:27:37.886 --> 00:27:41.380 that utilize forms invisible to the naked eye using visible light 440 00:27:41.380 --> 00:27:45.778 So, by positioning the sensor from under the palm of the hand to above it 441 00:27:46.194 --> 00:27:51.739 It can diversify the content accordingly 442 00:27:51.739 --> 00:27:56.574 And these aspects enable the expression of content in a wider variety of forms 443 00:27:56.574 --> 00:28:00.772 Speaking of these laser sensors 444 00:28:00.772 --> 00:28:04.568 They can actually indicate whether a person has passed by 445 00:28:04.568 --> 00:28:06.450 representing the flow 
of movement 446 00:28:06.955 --> 00:28:07.884 It can also detect speed 447 00:28:07.884 --> 00:28:12.678 The utilization of laser sensors involves 448 00:28:12.678 --> 00:28:15.373 expressing applications in various fields 449 00:28:15.373 --> 00:28:17.500 each focusing on the detection of flow 450 00:28:17.500 --> 00:28:21.246 through the first and last sensors 451 00:28:21.246 --> 00:28:23.429 Next up is the voltage sensor 452 00:28:23.429 --> 00:28:27.760 The Capacitive Touch Sensor detects changes in electrical capacitance to determine user contact These sensors are commonly used in smartphones 453 00:28:27.760 --> 00:28:34.676 In fact, older smartphones used pressure sensors rather than capacitive touch sensors 454 00:28:34.676 --> 00:28:39.676 So, they recognized when pressure applied 455 00:28:39.676 --> 00:28:43.458 Nowadays, it's not the pressure applied 456 00:28:43.458 --> 00:28:49.511 But rather the user's touch sensation based on their electrical capacitance 457 00:28:49.957 --> 00:28:53.639 Therefore, it detects regular changes in capacitance to sense the user's touch sensation 458 00:28:53.639 --> 00:28:56.976 When it's applied to clothing or similar items, it prevents responses based on user contact 459 00:28:56.976 --> 00:29:03.154 and instead detects changes in capacitance 460 00:29:03.154 --> 00:29:07.601 These aspects are expressed through the touch method of capacitive touch sensors 461 00:29:07.601 --> 00:29:11.232 These aspects are utilized in various industries such as homes 462 00:29:11.232 --> 00:29:15.089 Such as homes, smartphones, automotive systems 463 00:29:15.089 --> 00:29:18.459 and electrical control systems 464 00:29:18.459 --> 00:29:24.037 The utilization of capacitive touch is 465 00:29:24.037 --> 00:29:26.606 indeed extensive 466 00:29:27.447 --> 00:29:31.229 In sensor-based interaction, there are also image sensors 467 00:29:31.229 --> 00:29:36.073 An image Sensor reads object information and converts it into electrical image signals This image sensor is a device that 468 00:29:36.073 --> 00:29:42.049 converts visual information into electrical signals, essentially what cameras do 469 00:29:42.049 --> 00:29:46.764 These components convert the light from real-world objects into electrical energy 470 00:29:47.595 --> 00:29:51.099 transforming it into digital signals 471 00:29:51.772 --> 00:29:57.270 which allows physical scenes to be captured and rendered digitally 472 00:29:57.270 --> 00:29:59.130 The size of the image sensor directly determines the quality and resolution of the images you see 473 00:30:00.140 --> 00:30:03.032 Larger sensors typically capture more light and detail 474 00:30:03.032 --> 00:30:05.140 leading to higher-quality images 475 00:30:05.140 --> 00:30:08.536 The image sensor is crucial when transitioning from the physical world to the virtual world 476 00:30:08.536 --> 00:30:11.836 As it serves as the core technology that captures 477 00:30:11.836 --> 00:30:15.140 and converts real-world objects into digital signals 478 00:30:15.140 --> 00:30:17.385 The next component is the GPS sensor 479 00:30:17.385 --> 00:30:19.707 Satellite-based Navigation System: GPS What is a GPS sensor? 
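The passage above explains that modern touchscreens detect a change in electrical capacitance rather than applied pressure. The sketch below shows that detection logic in outline; read_capacitance() is a hypothetical placeholder for the raw value a real touch controller would report, and the baseline and threshold numbers are illustrative only.

```python
# Sketch of capacitive-touch detection as described above: compare the measured
# capacitance against a calibrated no-touch baseline; a finger raises the value.
# read_capacitance() is a hypothetical placeholder for the hardware reading;
# the sample count and threshold are made-up illustrative values.
import time

def read_capacitance() -> float:
    """Placeholder: raw capacitance reading from the sensor electrode."""
    raise NotImplementedError

def calibrate(samples: int = 50) -> float:
    """Average a few readings while nothing is touching to get a baseline."""
    return sum(read_capacitance() for _ in range(samples)) / samples

def touch_loop(threshold: float = 30.0) -> None:
    baseline = calibrate()
    while True:
        delta = read_capacitance() - baseline
        if delta > threshold:          # a finger adds capacitance
            print("touch detected")
        time.sleep(0.01)
```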
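As the lecture explains next, a GPS receiver works out the user's position from satellite signals, and applications such as automotive navigation and fitness tracking build on that by measuring the distance between successive position fixes. Here is a minimal worked example of that distance step using the standard haversine formula; the two coordinates and the 10-second interval are arbitrary sample values, not data from the lecture.

```python
# Distance between two GPS fixes using the haversine formula - the building
# block behind the navigation and fitness-tracking uses discussed here.
# Coordinates below are arbitrary sample points.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points in degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# Example: two fixes taken 10 seconds apart -> distance and average speed
d = haversine_m(37.5665, 126.9780, 37.5700, 126.9820)
print(f"distance: {d:.1f} m, avg speed: {d / 10:.1f} m/s")
```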
480 00:30:20.341 --> 00:30:21.985 GPS sensor receives signals transmitted from the satellites 481 00:30:22.579 --> 00:30:26.194 So, based on the signals sent from these satellites 482 00:30:27.045 --> 00:30:30.819 The current location of the user is determined 483 00:30:30.819 --> 00:30:34.894 Therefore, in a way, one of the most common applications of GPS sensors is 484 00:30:34.894 --> 00:30:38.020 navigation system 485 00:30:38.020 --> 00:30:42.695 Things like your metro card or other similar systems 486 00:30:42.695 --> 00:30:46.339 often rely on GPS-based technology for distance-based integration 487 00:30:46.339 --> 00:30:51.815 These applications are widely used in various fields such as navigation for sailing 488 00:30:51.815 --> 00:30:54.638 aviation, automotive navigation, hiking, and fitness tracking 489 00:30:54.638 --> 00:30:59.606 Recently, there has been a trend towards creating content 490 00:30:59.606 --> 00:31:04.099 that shows outdoor landscapes and environments 491 00:31:04.099 --> 00:31:08.398 As these areas become more diverse 492 00:31:08.398 --> 00:31:12.379 They are based on indoor positioning systems 493 00:31:12.379 --> 00:31:14.391 Contents based on Wi-Fi positioning 494 00:31:14.391 --> 00:31:18.421 The development of technologies that measure 495 00:31:18.421 --> 00:31:22.243 indoor positioning 496 00:31:22.243 --> 00:31:24.440 based on the distance of Wi-Fi signals is also considered 497 00:31:24.440 --> 00:31:26.388 It's likely that services based on these technologies 498 00:31:26.388 --> 00:31:29.998 will become increasingly common in the future 499 00:31:29.998 --> 00:31:36.895 In reality, due to Korea being a country that's technically at war or in a ceasefire situation 500 00:31:36.895 --> 00:31:41.269 Its industries may not have flourished as much as those in peaceful countries 501 00:31:41.269 --> 00:31:44.522 However, in countries like the United States and others 502 00:31:44.522 --> 00:31:49.581 The signals from GPS satellites are being leveraged 503 00:31:49.581 --> 00:31:54.212 for various forward-looking commercial ventures in the private sector 504 00:31:54.955 --> 00:31:59.560 In places like the California desert 505 00:31:59.560 --> 00:32:03.140 You can witness scenes where private satellites are actually being launched 506 00:32:03.140 --> 00:32:04.997 The gyroscope sensor detects and measures rotation speed Next up is the gyroscope sensor 507 00:32:04.997 --> 00:32:10.339 It detects and measures the rate of rotation 508 00:32:10.339 --> 00:32:13.429 It's expressing that a lot of movement can be represented 509 00:32:14.864 --> 00:32:18.479 based on content using gyro sensors 510 00:32:18.479 --> 00:32:22.088 The gyro sensor-based content is widely utilized in drone-related aspects 511 00:32:22.088 --> 00:32:25.489 to represent such activities 512 00:32:25.489 --> 00:32:29.741 In drone sensors 513 00:32:29.741 --> 00:32:32.420 Gyro sensors are extensively used to depict rotation and movement 514 00:32:32.420 --> 00:32:35.519 So, the drone you're seeing now is 515 00:32:35.519 --> 00:32:38.444 based on four gyro sensors 516 00:32:40.514 --> 00:32:44.420 to control its movement 517 00:32:44.420 --> 00:32:48.554 You can consider these sensors as part of a composite sensing system 518 00:32:48.554 --> 00:32:53.044 that contributes to a sense of realism 519 00:32:53.724 --> 00:32:57.595 The convergence of cutting-edge technology and content in the field 520 00:32:57.595 --> 00:33:02.104 In such various sensors 521 00:33:02.104 --> 00:33:04.779 
it's fascinating how various emerging technologies are shaping content creation 522 00:33:04.779 --> 00:33:09.014 across different fields 523 00:33:09.014 --> 00:33:12.957 XR technologies are revolutionizing the architecture and engineering construction industries 524 00:33:12.957 --> 00:33:16.702 by integrating sensors 525 00:33:16.702 --> 00:33:22.370 and content creation in immersive environments 526 00:33:22.370 --> 00:33:23.608 For another example 527 00:33:24.430 --> 00:33:28.172 XR in the automotive industry encompasses various aspects 528 00:33:28.984 --> 00:33:33.442 Electric vehicles can be seen as a combination intermediary 529 00:33:33.442 --> 00:33:40.163 for the integrated aspects of these sensors and content in terms of XR 530 00:33:40.163 --> 00:33:43.540 These aspects, before actual construction 531 00:33:43.540 --> 00:33:46.347 are based on virtual simulators 532 00:33:46.347 --> 00:33:50.921 In fact, it's even possible to refine these aspects further 533 00:33:50.921 --> 00:33:56.945 to conduct prototyping on a smaller scale 534 00:33:56.945 --> 00:34:01.034 XR in media and entertainment 535 00:34:01.034 --> 00:34:07.670 VR games, chat apps with augmented reality, AR filters, virtual exhibitions, and more are 536 00:34:07.670 --> 00:34:10.866 increasingly becoming integrated into your daily life 537 00:34:10.866 --> 00:34:13.419 in various recommended forms 538 00:34:14.033 --> 00:34:15.904 And various aspects can be seen as content-convertible from diverse sensing 539 00:34:15.904 --> 00:34:21.904 as content-convertible from diverse sensing 540 00:34:21.904 --> 00:34:24.357 based on motion recognition sensors 541 00:34:24.357 --> 00:34:27.627 mentioned earlier 542 00:34:27.627 --> 00:34:31.943 NEWJAK's Laser Training: Simulation training for avoiding lasers The content you're currently seeing is 543 00:34:31.943 --> 00:34:37.973 an e-sports style experience based on LiDAR sensor technology 544 00:34:37.973 --> 00:34:41.436 See this as an example of 545 00:34:41.436 --> 00:34:45.350 How LiDAR sensor technology is used to recognize users and 546 00:34:45.350 --> 00:34:49.345 How it's utilized to create content for e-sports 547 00:34:49.345 --> 00:34:51.717 without the need for multiple users 548 00:34:51.717 --> 00:34:54.201 to wear sensors 549 00:35:25.637 --> 00:35:33.459 NEWJAK's XR Shooting Simulator: Body movements using gyroscope sensors Another example is using sensors for firearms in XR, based on infrared (IR) technology 550 00:35:33.459 --> 00:35:40.419 and representing various silhouettes of firearms positions as part of XR content 551 00:35:40.419 --> 00:35:46.414 using methods that don't involve wearing traditional HMD VR headsets 552 00:35:46.414 --> 00:35:51.488 users could potentially enjoy content in a format 553 00:35:51.488 --> 00:35:57.419 where they can physically move around 554 00:35:57.419 --> 00:35:59.419 without having to wear a traditional VR headset 555 00:35:59.419 --> 00:36:05.107 These aspects are combined in a way 556 00:36:05.107 --> 00:36:08.775 that effectively expresses the RDW (Realistic Digital World) technology 557 00:36:08.775 --> 00:36:12.244 NEWJAK's RDW applied on Dobong-ro When it comes to VR cave setups 558 00:36:12.244 --> 00:36:15.478 and Redirected Walking (RDW) 559 00:36:15.983 --> 00:36:20.693 we might see a convergence of various forms of content 560 00:36:20.693 --> 00:36:23.072 into more expansive spaces 561 00:36:23.825 --> 00:36:28.308 In fact 562 00:36:28.308 --> 00:36:33.637 VR content, because of wearing goggles, 
has lines 563 00:36:33.637 --> 00:36:36.289 so you can't move straight ahead 564 00:36:36.289 --> 00:36:40.037 However, in a cave-like structure 565 00:36:40.037 --> 00:36:43.360 if you continue straight ahead in XR format, you'll bump into the wall 566 00:36:44.072 --> 00:36:48.821 In such cases, it would be beneficial 567 00:36:48.821 --> 00:36:53.201 to consider the use of visual illusion effects to subtly redirect the user 568 00:36:53.201 --> 00:36:55.419 making them turn slightly as if circling around 569 00:36:55.419 --> 00:36:58.746 You could observe 570 00:36:58.746 --> 00:37:01.766 a wide variety of technologies related to XR 571 00:37:01.766 --> 00:37:06.046 NEWJAK's Motion Detection Illustration And in another way 572 00:37:06.789 --> 00:37:08.815 interactive forms in galleries could be like 573 00:37:08.815 --> 00:37:13.447 the paintings in Harry Potter that 574 00:37:13.447 --> 00:37:16.231 continuously interact with the audience 575 00:37:16.874 --> 00:37:19.174 The artworks could be adapted to show 576 00:37:19.174 --> 00:37:21.825 the kind of art that each viewer prefers 577 00:37:21.825 --> 00:37:26.417 DIGIMIX's Interactive touch wall These features can be expressed 578 00:37:26.417 --> 00:37:28.993 simply through a touch wall interface 579 00:37:28.993 --> 00:37:31.768 The Wonyeong Flower at the Naju Immersive Content Experience Zone Motion recognition technology is now 580 00:37:32.213 --> 00:37:37.419 commonly used to create interactive content 581 00:37:37.419 --> 00:37:40.077 Such as particles 582 00:37:40.077 --> 00:37:42.667 or flowers that respond to movements 583 00:37:42.667 --> 00:37:46.735 The Light Playground at Jungang Art Center Additionally, particle generation and 584 00:37:47.220 --> 00:37:50.855 content interaction based on LiDAR sensors 585 00:37:50.855 --> 00:37:54.043 could become more prevalent 586 00:37:54.053 --> 00:37:57.649 Besides these, there are sensors and other foundational technologies for content 587 00:37:57.649 --> 00:38:01.033 that haven't been released yet 588 00:38:01.033 --> 00:38:07.208 And I think we could see much more utilization 589 00:38:07.208 --> 00:38:10.518 in various industrial sectors that we haven't discussed yet 590 00:38:10.518 --> 00:38:12.998 So, I've talked about 591 00:38:12.998 --> 00:38:15.627 some challenging sensor-based technologies 592 00:38:15.627 --> 00:38:18.512 and the immersive aspects associated with them 593 00:38:18.512 --> 00:38:20.667 Considering these aspects 594 00:38:20.667 --> 00:38:23.274 I suggest you to imagine 595 00:38:23.274 --> 00:38:27.612 what kind of content you could create 596 00:38:27.612 --> 00:38:29.241 based on these elements? 597 00:38:29.241 --> 00:38:33.267 If you feel a bit lacking in imagination for that creativity 598 00:38:33.267 --> 00:38:38.746 Why not try recreating existing content from movies or other references? 
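The redirected walking (RDW) technique described above relies on a subtle visual illusion: the virtual camera turns slightly more (or less) than the user's real head rotation, so someone who feels they are walking straight actually curves away from the physical walls. The sketch below is only a schematic illustration of that rotation-gain idea; the gain value is made up and is not a parameter of the NEWJAK installation.

```python
# Schematic illustration of the rotation-gain idea behind redirected walking
# (RDW): the virtual view turns slightly more than the user's real head does,
# so the walked path curves away from the walls without the user noticing.
# The gain value is illustrative only.
ROTATION_GAIN = 1.15   # virtual degrees per real degree (made-up value)

virtual_yaw = 0.0
real_yaw_prev = 0.0

def update_virtual_yaw(real_yaw_now: float) -> float:
    """Apply the rotation gain to the change in the user's real head yaw."""
    global virtual_yaw, real_yaw_prev
    delta_real = real_yaw_now - real_yaw_prev
    real_yaw_prev = real_yaw_now
    virtual_yaw += delta_real * ROTATION_GAIN   # amplified turn in the headset
    return virtual_yaw

# Example: the user turns 10 degrees in the real room ...
print(update_virtual_yaw(10.0))   # ... but sees roughly 11.5 degrees of turn
```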
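Returning to the earlier drone example, a gyroscope reports rotation rate, and a flight controller can integrate that rate into an orientation estimate and command a proportional correction to keep the craft level. The sketch below is a schematic illustration of that idea only; read_gyro_rate() is a hypothetical placeholder, and the gain and timing values are made up rather than taken from any real controller.

```python
# Sketch of how a rotation-rate (gyro) reading can drive a stabilizing
# correction, echoing the earlier drone example: integrate the measured rate to
# track orientation, then command a correction proportional to the error.
# read_gyro_rate() is a hypothetical placeholder; gain and timing are made up.
import time

def read_gyro_rate() -> float:
    """Placeholder: roll rate in degrees/second from the gyroscope."""
    raise NotImplementedError

def stabilize(target_angle: float = 0.0, kp: float = 0.8, dt: float = 0.01) -> None:
    angle = 0.0                        # estimated roll angle, degrees
    while True:
        rate = read_gyro_rate()
        angle += rate * dt             # integrate rate -> angle estimate
        correction = kp * (target_angle - angle)
        print(f"angle={angle:+.2f} deg, motor correction={correction:+.2f}")
        time.sleep(dt)
```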
599 00:38:38.746 --> 00:38:43.667 It could be a great way to experiment and get inspired 600 00:38:43.667 --> 00:38:48.696 By doing so 601 00:38:49.231 --> 00:38:54.232 You can explore and expand upon XR immersive content as content planners 602 00:38:54.232 --> 00:38:57.188 It's an exciting prospect for 603 00:38:57.188 --> 00:38:59.568 creating a variety of XR experiences 604 00:38:59.568 --> 00:39:00.330 That's all for now 605 00:39:01.935 --> 00:39:02.735 Definition and application of XR Understanding AR, VR, MR, and XR 606 00:39:02.735 --> 00:39:04.285 AR Technology that creates a virtual space in the real world by combining images or videos with CG Use Medical training / Retail business / Repair and maintenance / Design and modeling / Business distribution / Tourism / Education / Entertainment etc. 607 00:39:04.285 --> 00:39:05.985 VR Immersive spaces that do not exist in reality requiring special devices (HMDs) Use Entertainment(Games, movies, etc.) / Fine art and design / Architecture / Medical / Aviation / Army etc. 608 00:39:05.985 --> 00:39:07.135 MR Showing a combined real world of AR and VR through HMD-based lenses 609 00:39:07.135 --> 00:39:08.385 XR: Technology with the potential to expand into spatial-based content, a hybrid of AR, VR, and MR Use Manufacture / Medical / Culture / Defense field etc. 610 00:39:08.385 --> 00:39:09.335 The benefits of utilizing XR Reduced waste, cost savings, improved understanding, project efficiency, enhanced safety, better decision-making 611 00:39:09.611 --> 00:39:13.561 Sensor-based interaction research Types of sensors Accelerometer Motion Sensor Radar Sensor 612 00:39:13.561 --> 00:39:17.561 Capacitive Touch Sensor Image Sensor Global Positioning System Gyro Sensor 613 00:39:17.710 --> 00:39:19.360 The integration of emerging technologies and content XR in the architecture, engineering, and construction industries Integrate sensors and content to use in the field or environment 614 00:39:19.360 --> 00:39:20.960 XR in product design, manufacturing, and automotive Electric vehicles are a comprehensive medium for sensor fusion and content, etc. 615 00:39:20.960 --> 00:39:22.610 XR in Media and Entertainment VR games, chat apps, augmented reality, AR filters, virtual exhibitions, etc. 616 00:39:22.610 --> 00:39:23.960 Harmony of content and new technologies in practice NEWJAK XR Laser Training NEWJAK XR Shooting Simulator 617 00:39:23.960 --> 00:39:25.010 NEWJAK RDW(Redirected Walking) application on Dobong-ro NEWJAK Motion Detection Illustration DIGIMIX Interactive touch wall 618 00:39:25.010 --> 00:39:25.760 The Wonyeong Flower at the Naju Immersive Content Experience Zone The Light Playground at Jungang Art Center