1
00:00:00.500 --> 00:00:01.950
Okay, the first question is,
2
00:00:01.950 --> 00:00:05.200
could you give us a brief summary of your presentation?
3
00:00:05.350 --> 00:00:06.400
Yeah
4
00:00:07.150 --> 00:00:11.350
So I came here today to talk about Moon Valley and Asteria,
5
00:00:11.350 --> 00:00:14.080
kind of the combined companies that form
6
00:00:14.080 --> 00:00:20.820
what we feel is the most unique and interesting use of AI
7
00:00:20.820 --> 00:00:23.449
in filmmaking and visual intelligence
8
00:00:23.449 --> 00:00:26.149
And so Moon Valley as the umbrella company
9
00:00:26.149 --> 00:00:29.909
focuses on visual intelligence research at a broad level,
10
00:00:29.909 --> 00:00:38.360
and then Asteria is sort of the media- and entertainment-focused filmmaking arm of that venture
11
00:00:38.360 --> 00:00:40.810
And so our goal is to be a bridge between
12
00:00:41.460 --> 00:00:47.240
the technology and the researchers who make incredible AI come to life,
13
00:00:47.240 --> 00:00:51.740
and the filmmaking community, which is apprehensive
14
00:00:51.740 --> 00:00:56.520
and can be nervous about the future that AI might bring to the industry
15
00:00:56.520 --> 00:01:00.020
And so we're here to facilitate a dialogue between those two sides,
16
00:01:00.020 --> 00:01:06.059
and find a way that filmmaking can remain exciting, creative, and expressive
17
00:01:06.059 --> 00:01:11.480
and represent an ethical path forward in this moment
18
00:01:12.230 --> 00:01:16.120
How do you feel being here at the event?
19
00:01:16.470 --> 00:01:21.680
Yeah, I mean, it's great to be here amongst other brilliant technologists and creatives
20
00:01:21.680 --> 00:01:24.980
And I think like I mentioned at the end of the presentation,
21
00:01:25.030 --> 00:01:29.400
I think Korea is becoming such a driving force of culture in the world
22
00:01:29.400 --> 00:01:33.700
Certainly, in the US, we're finding more and more crossover
23
00:01:33.700 --> 00:01:37.850
in how our culture admires
24
00:01:37.850 --> 00:01:41.710
how music and film are reflected here
25
00:01:41.710 --> 00:01:45.760
And so to see so many incredible people
26
00:01:45.760 --> 00:01:49.500
be at the forefront of the dialogue around how this technology
27
00:01:49.500 --> 00:01:51.639
is going to impact those industries and entertainment,
28
00:01:51.639 --> 00:01:52.959
I think is really exciting
29
00:01:52.959 --> 00:01:54.909
And it certainly, you know, I think
30
00:01:54.909 --> 00:02:01.480
bodes well for the future of the creative economy here in Korea
31
00:02:03.130 --> 00:02:05.030
As you look at the present era,
32
00:02:05.030 --> 00:02:07.579
was there ever a moment when you thought
33
00:02:07.579 --> 00:02:13.079
this is truly a new kind of movement or a completely new flow?
34
00:02:13.479 --> 00:02:16.969
Yeah, so I've been following technology
35
00:02:16.969 --> 00:02:22.259
kind of in creative industries for a decade and a half,
36
00:02:22.259 --> 00:02:25.859
and I've seen a lot of them kind of promise
37
00:02:25.859 --> 00:02:31.030
this radical transformation in how we consume things,
38
00:02:31.030 --> 00:02:37.080
whether it was an immersive, metaversal, or interactive context
39
00:02:37.080 --> 00:02:40.019
I think there's something completely different
40
00:02:40.019 --> 00:02:43.360
in the scale and magnitude of this moment with generative AI
41
00:02:43.360 --> 00:02:45.860
I think that it is creating
42
00:02:45.860 --> 00:02:49.910
a much more accessible means of filmmaking
43
00:02:49.910 --> 00:02:56.209
and liberating the filmmaking process at a time that it needs it most
44
00:02:56.720 --> 00:03:02.720
I think we're seeing how other forces are impacting the creation of films
45
00:03:02.720 --> 00:03:06.570
you know, the battle for attention
46
00:03:06.570 --> 00:03:10.839
from mobile devices and social media platforms
47
00:03:10.839 --> 00:03:13.320
and how cost-effective it is to make that content
48
00:03:13.320 --> 00:03:18.350
I think filmmaking needs that same kind of ethos imbued into it
49
00:03:18.350 --> 00:03:23.679
and to break some of the patterns of inefficient creation
50
00:03:23.679 --> 00:03:27.720
And so, yeah, that's what I think
51
00:03:27.720 --> 00:03:30.470
generative AI brings to this moment,
52
00:03:30.470 --> 00:03:33.279
in a way that I've never seen or witnessed before
53
00:03:33.279 --> 00:03:36.449
And when I saw the opportunity emerging,
54
00:03:36.449 --> 00:03:39.949
I felt it was time to go all in
55
00:03:39.949 --> 00:03:43.419
And to anyone that's looking at this moment from the sidelines,
56
00:03:43.419 --> 00:03:47.280
I recommend stepping in and really acquainting yourself with the technology
57
00:03:47.280 --> 00:03:49.759
because this is a truly revolutionary moment
58
00:03:49.759 --> 00:03:52.479
This is more significant than the internet
59
00:03:52.479 --> 00:03:54.629
This will be remembered as
60
00:03:54.629 --> 00:03:57.959
one of the most significant breakthroughs in humanity's history
61
00:03:57.959 --> 00:04:00.819
And so, try to find a place to be on the right side of that,
62
00:04:00.819 --> 00:04:02.969
try to find a place to be fluent in the technology
63
00:04:02.969 --> 00:04:07.890
I think it's incumbent on us as people,
64
00:04:07.890 --> 00:04:10.890
even if you're afraid of it,
65
00:04:10.890 --> 00:04:13.639
to learn about it and see how it's going to impact your life
66
00:04:14.689 --> 00:04:15.789
Next question is,
67
00:04:15.789 --> 00:04:20.640
as AI and humans create together, and as technology and emotion intertwine,
68
00:04:20.640 --> 00:04:24.790
how do you think the content industry or the creative stage will evolve?
69
00:04:24.790 --> 00:04:27.640
And within that, what role do you think humans will play?
70
00:04:27.640 --> 00:04:29.640
Can you ask the first part of that question again? Sorry
71
00:04:29.640 --> 00:04:35.209
Oh, as AI and humans create together, and as technology and emotion intertwine,
72
00:04:35.209 --> 00:04:39.659
how do you think the content industry and the creative stage will evolve?
73
00:04:39.659 --> 00:04:42.959
And within that, what role do you think humans will play?
74
00:04:42.959 --> 00:04:43.959
Yeah
75
00:04:43.959 --> 00:04:48.759
So, I think you're going to see an explosion of
76
00:04:50.209 --> 00:04:51.759
I mean, we call it slop
77
00:04:52.109 --> 00:04:56.259
but I think you're going to see an explosion of things
78
00:04:56.259 --> 00:05:01.659
that look and feel expensive but are missing heart and story
79
00:05:01.659 --> 00:05:04.259
and the things that sort of make us as humans
80
00:05:04.259 --> 00:05:08.829
connect with media, all the way from oral history to the present
81
00:05:08.829 --> 00:05:13.329
And so, I think what's going to be most important in this moment, in this interaction,
82
00:05:13.329 --> 00:05:17.729
is actually helping to find and curate the voices
83
00:05:17.729 --> 00:05:21.279
that represent that abstract concept
84
00:05:21.279 --> 00:05:23.279
that the machine doesn't give us by itself
85
00:05:23.279 --> 00:05:24.929
It can create images,
86
00:05:24.929 --> 00:05:30.279
but it can't conjure emotion to the same extent that we as humans do
87
00:05:30.600 --> 00:05:34.000
And so, I think it's about using it for what it's best at,
88
00:05:34.000 --> 00:05:38.950
which is creating images and moving pixels
89
00:05:38.950 --> 00:05:41.150
and doing really interesting dynamic things
90
00:05:41.150 --> 00:05:43.950
But I think it's going to be more important than ever
91
00:05:43.950 --> 00:05:49.500
to also elevate the role of the best storytellers and visionaries
92
00:05:49.600 --> 00:05:50.750
And I certainly think that
93
00:05:50.750 --> 00:05:54.550
this is going to give a lot of underrepresented voices
94
00:05:54.550 --> 00:05:57.050
an opportunity to be heard in that way
95
00:05:57.050 --> 00:06:03.120
And so, I think curation is going to become more and more important in this future
96
00:06:03.120 --> 00:06:06.920
as these technologies collide with kind of our human experience
97
00:06:06.920 --> 00:06:08.520
This is the last question
98
00:06:08.520 --> 00:06:13.920
So, if you could leave one sentence for the people here today, or the future creators,
99
00:06:13.920 --> 00:06:15.220
what would it be?
100
00:06:15.220 --> 00:06:16.320
One sentence?
101
00:06:16.320 --> 00:06:16.920
Yeah
102
00:06:23.489 --> 00:06:25.239
If you want one second, you can...
103
00:06:25.239 --> 00:06:27.239
Yeah, I'm trying to think
104
00:06:29.239 --> 00:06:31.239
Don't be afraid of the unknown
105
00:06:31.239 --> 00:06:33.289
This technology is here,
106
00:06:33.289 --> 00:06:37.239
and I think there's a way for everybody to benefit from it
107
00:06:37.259 --> 00:06:46.709
So, dive in, find a way that you can grow your ambitions from it,
108
00:06:46.709 --> 00:06:50.559
and that's the best way to look at it, I think
109
00:06:51.109 --> 00:06:52.559
A few sentences, but...