Subtitles for "Julia Reinhardt: What do GDPR and AI regulations mean for Silicon Valley?" Podcast
1
00:00:00,240 --> 00:00:04,960
As a matter of principle, I don't 
believe regulation stifles innovation.
2
00:00:04,960 --> 00:00:07,920
Very often in my daily life, 
I work with technologists 
3
00:00:07,920 --> 00:00:10,560
who feel that, actually, 
regulation gives them some
4
00:00:10,560 --> 00:00:14,000
guard rails that they actually 
appreciate, as a matter of fact.
5
00:00:14,000 --> 00:00:17,280
I think on the contrary, smart 
regulation that gets out in front 
6
00:00:17,280 --> 00:00:21,680
of emerging technology can protect 
consumers and drive innovation.
7
00:00:21,680 --> 00:00:28,160
So my focus is on protecting consumers and 
that I think is the most important thing.
8
00:00:29,280 --> 00:00:34,720
I mean we're not just a huge 
couple billion of mice in a lab, 
9
00:00:35,520 --> 00:00:38,720
we are people with rights and dignity.
10
00:00:41,520 --> 00:00:48,160
This is season two of Voices of the Data Economy,
a podcast supported by Ocean Protocol Foundation.
11
00:00:49,040 --> 00:00:52,240
We bring to you the voices 
shaping the Data Economy 
12
00:00:52,240 --> 00:00:56,160
and challenging it at the same time.
Listen to founders,
13
00:00:56,160 --> 00:00:59,600
tech policy experts and 
pioneers in impact investing.
14
00:01:00,240 --> 00:01:02,880
All sharing their relationship with data.
15
00:01:04,160 --> 00:01:07,520
So hello and welcome,
today we have Julia with us.
16
00:01:07,520 --> 00:01:12,000
She is a privacy professional 
and AI governance expert.
17
00:01:12,000 --> 00:01:15,600
She's also a Mozilla Fellow 
in Residence and has been 
18
00:01:15,600 --> 00:01:20,880
a German diplomat in the past, 
so a very versatile profile.
19
00:01:20,880 --> 00:01:21,840
Hello, Julia.
20
00:01:23,040 --> 00:01:23,840
Hi, Diksha.
21
00:01:24,480 --> 00:01:26,080
Hi, I'm sorry, I said it wrong.
22
00:01:26,080 --> 00:01:27,680
Hello, Julia. I always...
23
00:01:29,200 --> 00:01:35,920
I have American friends and German friends
and it's always confusing but here we are, Julia.
24
00:01:36,640 --> 00:01:37,760
Exactly, fine.
25
00:01:37,760 --> 00:01:40,400
So, you are in Germany today.
26
00:01:41,280 --> 00:01:47,360
Yes, I am. I am normally based in San Francisco
but for the first time since the pandemic started,
27
00:01:47,920 --> 00:01:51,600
I was able to travel and
I'm really happy to be 
28
00:01:51,600 --> 00:01:54,640
here and see everybody again 
after such a long time,
29
00:01:54,640 --> 00:01:59,440
but I will go back to San Francisco 
again in a couple of weeks.
30
00:02:00,400 --> 00:02:05,600
Yeah, summer is always a great 
time to be here in Germany.
31
00:02:05,600 --> 00:02:07,760
And it's not the best time in San Francisco.
32
00:02:07,760 --> 00:02:11,280
I think somebody said that nothing
is worse or colder than  
33
00:02:13,360 --> 00:02:15,360
summer in San Francisco because of the fog.
34
00:02:16,480 --> 00:02:18,720
Okay, great, so you have
the best of both worlds.
35
00:02:20,720 --> 00:02:24,640
And okay, so coming back to your profile.
36
00:02:24,640 --> 00:02:27,600
So tell us a little bit 
about your journey from being 
37
00:02:27,600 --> 00:02:31,520
a German diplomat to now
an advisor in data policy
38
00:02:31,520 --> 00:02:33,120
and regulations in the US.
39
00:02:33,120 --> 00:02:36,160
I mean it's a very interesting shift.
40
00:02:36,160 --> 00:02:41,040
How did one thing lead to another 
and you are doing what you are doing?
41
00:02:42,400 --> 00:02:47,040
Yeah, maybe a red thread has 
always been that I always want 
42
00:02:47,040 --> 00:02:51,840
to work on things that matter.
I enjoy cross-functional work most.
43
00:02:53,520 --> 00:02:56,240
I mean things that matter,
that sounds very mission driven.
44
00:02:56,240 --> 00:03:00,080
I think that's what I am but
I'm just very curious about things 
45
00:03:00,080 --> 00:03:04,400
that have some relevance 
for me, also for the world.
46
00:03:04,400 --> 00:03:08,400
I studied international 
relations and European studies 
47
00:03:08,400 --> 00:03:11,440
in Germany and then went to 
California and finally completed
48
00:03:11,440 --> 00:03:12,480
my studies in France.
49
00:03:13,680 --> 00:03:20,800
Went to work at a major foreign policy think tank
and then at a Commission outpost in Lebanon.
50
00:03:20,800 --> 00:03:25,120
I have, like, a love for 
the Middle East but then decided 
51
00:03:25,120 --> 00:03:27,440
to enter the German diplomatic service and spent
52
00:03:27,440 --> 00:03:34,240
almost 15 years in various, what I found super
interesting positions in government in Germany,
53
00:03:34,240 --> 00:03:36,320
but also in France and Italy.
54
00:03:36,320 --> 00:03:39,200
And I became fascinated with 
technology when I was part 
55
00:03:39,200 --> 00:03:42,320
of negotiation teams on tech 
policy frameworks in Europe.
56
00:03:43,120 --> 00:03:48,000
My focus in that role was on 
drafting and coordinating policies 
57
00:03:48,000 --> 00:03:50,960
for a new and smart regulation of technology
58
00:03:50,960 --> 00:03:53,520
and during that time, actually, 
I was deeply involved in 
59
00:03:53,520 --> 00:03:57,360
the beginning of GDPR 
negotiations on a European level.
60
00:03:57,360 --> 00:04:00,240
I mean they took forever, 
they took six years all in all 
61
00:04:00,240 --> 00:04:04,000
but I was there at the
beginning and then in 2012,
62
00:04:04,000 --> 00:04:07,520
nine years ago, I came to
the US and my job was 
63
00:04:07,520 --> 00:04:10,880
to organize multi-stakeholder 
meetings and discussions between
64
00:04:10,880 --> 00:04:14,160
Germany and tech companies 
in Silicon Valley and also 
65
00:04:15,120 --> 00:04:17,280
to do communication work around that.
66
00:04:17,920 --> 00:04:25,120
So, in a way I combine being strong on content
and being interested in content, but I also really
67
00:04:25,120 --> 00:04:28,560
like coordination and trust 
and relationship building 
68
00:04:29,200 --> 00:04:32,640
and so with that I've been 
growing this really big network
69
00:04:32,640 --> 00:04:34,720
in the tech and regulation space.
70
00:04:34,720 --> 00:04:38,720
And so after those three years 
were over at the diplomatic 
71
00:04:38,720 --> 00:04:42,480
outpost, at kind of the intersection 
of tech and policy I decided
72
00:04:42,480 --> 00:04:46,720
to become a tech policy consultant, 
bridging Europe and the US, 
73
00:04:46,720 --> 00:04:50,560
so I can be in both places 
geographically and also in terms
74
00:04:50,560 --> 00:04:56,880
of disciplines for different clients and 
right now I wear, let's say two hats.
75
00:04:58,000 --> 00:05:00,240
One is that I'm working as a privacy professional, 
76
00:05:01,040 --> 00:05:04,400
so I become part of trust and 
safety teams at tech companies
77
00:05:04,400 --> 00:05:08,560
in Europe and in the Bay Area 
and I also do the same job 
78
00:05:08,560 --> 00:05:11,840
for advisory firms and for non-profit foundations.
79
00:05:12,480 --> 00:05:16,800
And the main topic is compliance with privacy 
80
00:05:16,800 --> 00:05:20,560
and data protection, so
in Europe that's the famous
81
00:05:20,560 --> 00:05:23,600
General Data Protection Regulation,
but there's something 
82
00:05:23,600 --> 00:05:25,680
like that also now in California.
83
00:05:25,680 --> 00:05:28,560
We can talk about that if you're 
interested in that later on, 
84
00:05:29,280 --> 00:05:33,200
but I also do consequence scanning 
which is kind of a key word
85
00:05:33,200 --> 00:05:37,200
recently in Silicon Valley and on security.
86
00:05:37,200 --> 00:05:40,400
And so, I become really part
of those engineering 
87
00:05:40,400 --> 00:05:44,720
and product teams and start out
with data protection
88
00:05:44,720 --> 00:05:49,520
and then also work on other policies that
I helped draft for those companies.
89
00:05:49,520 --> 00:05:54,240
And the second hat is that, 
because I wanted to work on 
90
00:05:54,240 --> 00:05:57,280
geopolitics of tech regulation 
a bit more, maybe that's
91
00:05:57,280 --> 00:06:02,560
a continuation or I wanted to start
something again that I really 
92
00:06:02,560 --> 00:06:07,520
liked in diplomacy, the non-profit 
Mozilla Foundation offered
93
00:06:07,520 --> 00:06:09,920
me that Fellowship in Residence that you mentioned.
94
00:06:10,880 --> 00:06:15,840
I started that last year and
I work on upcoming EU regulation 
95
00:06:15,840 --> 00:06:21,200
on trustworthy AI and that's
kind of an analytical research work,
96
00:06:21,200 --> 00:06:27,040
but I do like to go out and 
talk with AI builders and also 
97
00:06:27,040 --> 00:06:30,720
with government experts and
lead workshops and
98
00:06:30,720 --> 00:06:33,840
get my insights directly from 
tech companies building AI.
99
00:06:33,840 --> 00:06:36,880
And I'm also on the policy 
team at Mozilla Corporation, 
100
00:06:37,600 --> 00:06:41,680
as opposed to the Foundation, 
and so I help identify
101
00:06:41,680 --> 00:06:44,960
and monitor and analyze policy 
issues that affect Mozilla's 
102
00:06:44,960 --> 00:06:49,520
products, at state, national 
and international levels.
103
00:06:49,520 --> 00:06:52,880
So, the team there is really 
small but still very global 
104
00:06:52,880 --> 00:06:55,680
and I'm glad I can contribute
as someone who knows
105
00:06:55,680 --> 00:06:57,520
several regions of the world really well.
106
00:06:58,640 --> 00:07:03,200
Wow. So, I mean it's great that 
you work on the public side 
107
00:07:03,200 --> 00:07:06,320
also and then you are pretty
much in the tech hub
108
00:07:06,320 --> 00:07:12,000
and you are dealing with all sorts of,
say, partners in this ecosystem.
109
00:07:12,640 --> 00:07:17,520
We will definitely discuss AI 
regulation in depth but let's take 
110
00:07:17,520 --> 00:07:21,680
a step back and go to GDPR 
and we are sort of obsessed
111
00:07:22,320 --> 00:07:26,240
in the EU with that and it's 
just completed three years 
112
00:07:26,880 --> 00:07:30,560
and now you can see things are
getting into place in the EU.
113
00:07:31,200 --> 00:07:36,880
But what has broadly been its impact 
in Silicon Valley in these three years?
114
00:07:37,600 --> 00:07:39,760
What has worked? What has not worked?
115
00:07:39,760 --> 00:07:43,840
Like a sort of report card 
of GDPR in Silicon Valley.
116
00:07:44,400 --> 00:07:52,400
Yeah. So, GDPR has had some really notable
and immediate impacts worldwide, 
117
00:07:52,400 --> 00:07:57,280
well first of all in Europe as you can
witness, but it has also brought
118
00:07:57,280 --> 00:08:01,440
an awareness to Silicon Valley and 
the US, that privacy is important 
119
00:08:01,440 --> 00:08:05,120
to people in Europe but also in 
other regions of the world and
120
00:08:05,120 --> 00:08:09,600
that it's a human right that in
the US many had not really 
121
00:08:09,600 --> 00:08:13,120
considered important, but the
surveys, even there among the
122
00:08:13,120 --> 00:08:14,400
population, prove them wrong.
123
00:08:15,040 --> 00:08:18,640
So the global conversation around
privacy has really shifted 
124
00:08:18,640 --> 00:08:23,040
in the past, yeah, three years 
but even before a little bit,
125
00:08:23,040 --> 00:08:27,360
but definitely since 2018
and so have the laws.
126
00:08:27,360 --> 00:08:32,320
So, as a direct result of GDPR 
in Europe, countries like Japan 
127
00:08:32,320 --> 00:08:36,640
or Brazil, passed GDPR-inspired 
privacy laws and India
128
00:08:36,640 --> 00:08:41,040
and even China are considering their
own laws, even though those 
129
00:08:41,040 --> 00:08:43,280
look different from what
we think they should be
130
00:08:43,280 --> 00:08:47,360
but it's definitely a big push.
131
00:08:47,360 --> 00:08:52,720
And California's new privacy law, 
which went into effect in 2020, 
132
00:08:52,720 --> 00:08:56,960
is a direct result of GDPR and 
California is the first state
133
00:08:56,960 --> 00:09:00,160
in the United States to have 
set up a privacy protection 
134
00:09:00,160 --> 00:09:01,840
agency at state level.
135
00:09:01,840 --> 00:09:04,800
No other US state has
that, although in Europe  
136
00:09:05,440 --> 00:09:09,200
each member state has at least
one on a national level,
137
00:09:09,200 --> 00:09:15,760
Germany has 16, one for every Bundesland.
But in California, 
138
00:09:16,400 --> 00:09:21,440
that's the first for the United States
and this agency has just taken up its work.
139
00:09:21,440 --> 00:09:25,600
It has a Spanish-born GDPR 
expert as a board member on 
140
00:09:25,600 --> 00:09:30,320
a California state level of this agency, 
so I think that's really notable.
141
00:09:30,320 --> 00:09:35,760
And also, overall, in more geopolitical
terms, GDPR has also shown 
142
00:09:35,760 --> 00:09:39,360
to Silicon Valley that one of 
their biggest markets, Europe, 
143
00:09:39,360 --> 00:09:42,400
has its own rules that 
they have to follow when they
144
00:09:42,400 --> 00:09:45,040
want to be a player there
and earn money there.
145
00:09:45,600 --> 00:09:48,720
And as a result,
many US-based organizations 
146
00:09:48,720 --> 00:09:51,440
that process personal data 
of people around the world
147
00:09:51,440 --> 00:09:55,600
have decided to apply GDPR 
and extend all the rights 
148
00:09:55,600 --> 00:09:58,480
that go with it to their 
customers, who don't need to
149
00:09:58,480 --> 00:10:01,520
be European residents but 
who live outside of Europe 
150
00:10:01,520 --> 00:10:04,400
and this is the case for many 
companies around the world.
151
00:10:05,120 --> 00:10:07,680
It gives them an edge in global compliance 
152
00:10:07,680 --> 00:10:11,520
and it's easier for them in terms 
of handling complaints and requests.
153
00:10:11,520 --> 00:10:16,000
So, they say, just give all
of our customers all the rights 
154
00:10:16,000 --> 00:10:19,120
that Europeans have, which you don't have to,
155
00:10:19,120 --> 00:10:22,800
it's a very high bar but it's 
still easier for the organization 
156
00:10:22,800 --> 00:10:26,560
than to sort out the customers' 
location and sometimes
157
00:10:26,560 --> 00:10:29,520
they only have an email address, 
or attribute different 
158
00:10:29,520 --> 00:10:32,080
rights according to their location,
I mean that's really
159
00:10:32,080 --> 00:10:38,240
complicated when you have a hundred 
different sets of terms and conditions.
160
00:10:39,200 --> 00:10:42,480
So, GDPR offers them a
legal framework and a set 
161
00:10:42,480 --> 00:10:46,400
of standards that is, well,
at least compared to other less
162
00:10:46,400 --> 00:10:50,000
spelled out legislation or when 
there's no legislation at all, 
163
00:10:50,000 --> 00:10:52,000
relatively clearly adoptable.
164
00:10:52,720 --> 00:10:56,800
So, my clients are mainly small 
and medium enterprises based 
165
00:10:56,800 --> 00:11:01,200
in the US with only some clients 
in Europe or sometimes just
166
00:11:01,200 --> 00:11:04,080
the mere intention of
soon expanding to Europe, 
167
00:11:05,120 --> 00:11:09,520
but this privacy management 
strategy, just take GDPR
168
00:11:09,520 --> 00:11:12,880
as a high level standard 
and apply it to everybody, 
169
00:11:12,880 --> 00:11:15,520
can be found at bigger tech firms as well.
170
00:11:16,080 --> 00:11:18,560
Because they appreciate that 
there's a standard now that 
171
00:11:18,560 --> 00:11:22,320
is law in one part of the world, 
but can serve as a guideline
172
00:11:22,320 --> 00:11:28,000
also for other parts of the world
and that it's just easier 
173
00:11:28,000 --> 00:11:30,960
to have one high profile standard 
than many different ones,
174
00:11:30,960 --> 00:11:34,160
what I call the growing global privacy patchwork.
175
00:11:34,960 --> 00:11:39,520
That made the EU the de facto 
rule setter in technology policy 
176
00:11:39,520 --> 00:11:44,880
worldwide, so the whole perspective 
on European rule setting
177
00:11:44,880 --> 00:11:47,680
and on Europe as a factor in 
the tech industry has changed
178
00:11:47,680 --> 00:11:51,440
in a very short time in Silicon 
Valley and yeah, so narratives
179
00:11:51,440 --> 00:11:55,280
have changed and internal 
data management has changed.
180
00:11:55,280 --> 00:11:59,440
I need to mention that a
disappointing factor is enforcement so, 
181
00:12:00,080 --> 00:12:04,640
even when tech companies do get 
hit with billion dollar fines,
182
00:12:04,640 --> 00:12:10,560
for them it's a tap on the wrist 
and so far GDPR hasn't changed 
183
00:12:11,200 --> 00:12:14,240
the underlying business
models, the way money is
184
00:12:14,240 --> 00:12:17,920
made on the internet, with 
surveilling people's behavior.
185
00:12:17,920 --> 00:12:21,120
And so it's not just the 
business model of a company, 
186
00:12:21,120 --> 00:12:27,760
it's the entire internet
that is based on a, yeah,
187
00:12:27,760 --> 00:12:31,360
economic model that doesn't 
have privacy top of mind, 
188
00:12:31,360 --> 00:12:34,400
so changing that, of
course, requires fundamental
189
00:12:34,400 --> 00:12:39,680
and probably painful adjustments to 
the way things have been structured.
190
00:12:39,680 --> 00:12:43,120
That's definitely something 
that GDPR so far hasn't been 
191
00:12:43,120 --> 00:12:46,960
able to achieve and that's 
definitely a bit disappointing.
192
00:12:49,120 --> 00:12:53,280
So there is GDPR in the
US, sorry, in Europe, 
193
00:12:53,280 --> 00:12:58,320
and there are also data regulations 
in the US. I think it's CIPP.
194
00:13:00,240 --> 00:13:04,720
So, when you compare the
standards, how compatible 
195
00:13:04,720 --> 00:13:09,120
are these two when implemented 
by companies together?
196
00:13:10,080 --> 00:13:11,360
Does my question make sense?
197
00:13:12,000 --> 00:13:13,120
Absolutely, yeah.
198
00:13:13,120 --> 00:13:17,040
So, yeah, I mentioned this 
global privacy patchwork 
199
00:13:17,040 --> 00:13:25,360
and definitely, so, the privacy law 
in effect in California is called,
200
00:13:25,360 --> 00:13:28,640
CCPA, the California Consumer Privacy Act.
201
00:13:28,640 --> 00:13:33,920
It will actually be replaced again 
soon by another one that's called 
202
00:13:33,920 --> 00:13:39,200
CPRA, and it's even a bit of 
an advanced version of CCPA,
203
00:13:40,400 --> 00:13:48,320
lots of abbreviations, but some principles
are the same as GDPR 
204
00:13:48,320 --> 00:13:56,240
and then again it has some elements
that even go beyond GDPR
205
00:13:56,240 --> 00:14:01,120
and I would say that enforcement 
of these will be very hard.
206
00:14:01,760 --> 00:14:08,160
I think in a way it's been a 
law that has been created very, 
207
00:14:08,160 --> 00:14:13,200
very fast in a way that is 
maybe weird for us in Europe
208
00:14:14,400 --> 00:14:21,760
by a petition of people, 
that got a lot of signatures 
209
00:14:21,760 --> 00:14:25,760
and then actually pressed 
the California legislature
210
00:14:27,120 --> 00:14:29,760
to create that law very rapidly.
211
00:14:29,760 --> 00:14:34,560
So, in a way it's not as 
systematic, not as clear as GDPR.
212
00:14:35,840 --> 00:14:42,240
It does go further in some 
areas which don't always make 
213
00:14:42,240 --> 00:14:45,920
sense to a consumer because 
they're kind of hard to understand,
214
00:14:46,960 --> 00:14:53,680
but then they do have elements that 
have just nothing to do with GDPR 
215
00:14:53,680 --> 00:14:57,760
or just, yeah, don't go as far.
216
00:14:57,760 --> 00:15:02,560
So it's a bit different but
I would say that what's important is, 
217
00:15:02,560 --> 00:15:05,440
so in a way it's easier 
for companies, I would say,
218
00:15:05,440 --> 00:15:10,320
to comply with GDPR and then 
just to say, and then add some 
219
00:15:10,320 --> 00:15:17,280
important elements of CCPA 
and hope for understanding
220
00:15:17,280 --> 00:15:19,440
on the part of the enforcement agency.
221
00:15:19,440 --> 00:15:24,160
I have to say that enforcement 
is weak in California as well, 
222
00:15:24,880 --> 00:15:28,400
so that's hard of course, as 
a privacy professional when
223
00:15:28,400 --> 00:15:31,680
you try to explain to companies 
that it's really important 
224
00:15:31,680 --> 00:15:40,240
to follow the law but then the 
whole enforcement is so, yeah,
225
00:15:40,240 --> 00:15:44,080
there are so many deficits and
there are so few people working on that.
226
00:15:46,080 --> 00:15:51,280
I think in the end in California, 
as is usual in the US, 
227
00:15:52,640 --> 00:15:55,040
it will be litigated in court, mainly.
228
00:15:56,000 --> 00:16:00,560
Okay, but as you just said 
that for Big Tech, probably, 
229
00:16:01,120 --> 00:16:05,040
this is not as difficult 
and they don't get so panicked
230
00:16:05,040 --> 00:16:09,680
about it because they have big
legal teams and deep pockets and  
231
00:16:10,800 --> 00:16:13,600
you said you deal mostly with SMEs
232
00:16:13,600 --> 00:16:17,120
and I'm not sure if you deal 
with non-tech companies also, 
233
00:16:17,120 --> 00:16:23,280
but what are particularly the
challenges that these kinds of companies face
234
00:16:23,280 --> 00:16:26,640
and have there been any
stories where they did something 
235
00:16:26,640 --> 00:16:32,720
that they were not aware was illegal 
and then they faced huge fees and fines?
236
00:16:32,720 --> 00:16:40,560
I mean take us through the struggles that 
usually these kinds of SMEs face with GDPR.
237
00:16:41,120 --> 00:16:47,680
Yeah, so, well, so as I mentioned 
I work as an independent 
238
00:16:47,680 --> 00:16:52,080
privacy professional, both 
before GDPR became enforceable
239
00:16:52,080 --> 00:16:57,440
and then also after, and working 
mostly with small companies in the US, 
240
00:16:57,440 --> 00:17:02,160
I've witnessed how hard it has been
for those small startups outside of Europe
241
00:17:02,160 --> 00:17:05,040
in a different legal system 
with a different understanding 
242
00:17:05,040 --> 00:17:10,960
or traditional understanding 
of privacy, to understand why
243
00:17:10,960 --> 00:17:13,280
and how they're exposed to GDPR.
244
00:17:14,240 --> 00:17:18,560
My work, in a way, is both legal
and technical but it's also cultural.
245
00:17:18,560 --> 00:17:24,480
So I translate mentalities in a 
way and what's important in Europe 
246
00:17:24,480 --> 00:17:29,520
and why and plus, for a startup, 
of course, it costs time and money.
247
00:17:29,520 --> 00:17:33,760
And so they took a bit longer,
I mean I'm only one 
248
00:17:33,760 --> 00:17:37,520
privacy professional,
many others worked on this as well,
249
00:17:37,520 --> 00:17:41,840
but still there was a delay 
in adaptation and it sometimes 
250
00:17:42,880 --> 00:17:45,280
resulted in vendor agreement cancellations,
251
00:17:45,280 --> 00:17:49,680
which means that they lost 
business with other bigger firms 
252
00:17:50,320 --> 00:17:55,360
and yeah, I mean we all lost 
great ideas and innovation
253
00:17:55,360 --> 00:17:57,360
and diversity among tech builders.
254
00:17:57,360 --> 00:18:01,280
So, academically speaking,
I did witness what's 
255
00:18:01,280 --> 00:18:07,120
called the Brussels effect in 
California as a, let's say,
256
00:18:07,120 --> 00:18:10,720
dyed in the wool European 
integrationist, I did celebrate it.
257
00:18:10,720 --> 00:18:16,640
I thought it was really great 
that European style rules come 
258
00:18:16,640 --> 00:18:22,640
to California and I think overall
it's a positive result,
259
00:18:22,640 --> 00:18:26,000
but I also witnessed how much 
easier it was for Big Tech 
260
00:18:26,000 --> 00:18:29,840
to adapt because they have
a presence in Brussels,
261
00:18:29,840 --> 00:18:33,440
they have huge legal teams, 
they know about compliance, 
262
00:18:33,440 --> 00:18:38,800
about loopholes and this 
over time has been leading
263
00:18:38,800 --> 00:18:41,920
to more concentration of power 
in the hands of Big Tech.
264
00:18:41,920 --> 00:18:46,080
I wouldn't say that compliance 
is more difficult for startups 
265
00:18:46,080 --> 00:18:49,120
or small firms, actually, 
the opposite is true because,
266
00:18:51,440 --> 00:18:54,960
very banal, they have less 
data to start with because 
267
00:18:54,960 --> 00:19:02,000
they're just starting to collect
data from their users in their products.
268
00:19:02,000 --> 00:19:05,520
So, in a way it's easier
for them to right away, 
269
00:19:06,080 --> 00:19:11,440
establish good and compliant 
data management systems.
270
00:19:11,440 --> 00:19:15,600
So what I always tell them 
is start early, have a system 
271
00:19:15,600 --> 00:19:23,440
and follow it and in a way that's 
much easier than to clean up
272
00:19:23,440 --> 00:19:28,160
a whole mess of collected 
data with no legal basis, 
273
00:19:28,160 --> 00:19:32,720
or that you've been storing 
forever in some place because
274
00:19:32,720 --> 00:19:35,120
you might be using it for another product.
275
00:19:35,120 --> 00:19:38,640
So, in a way when you start 
early enough and you're aware 
276
00:19:38,640 --> 00:19:43,920
of the rules it is easier and 
you just have to follow through,
277
00:19:44,800 --> 00:19:52,320
but that of course, needs some kind of a shift 
in culture but I think that shift has happened.
278
00:19:52,320 --> 00:20:00,400
It's also shown that some small companies, 
and the big ones now as well, use privacy,
279
00:20:01,760 --> 00:20:04,720
yeah, as a marketing tool 
but that sounds too negative, 
280
00:20:04,720 --> 00:20:09,040
it's actually a big asset 
because consumers demand it.
281
00:20:10,320 --> 00:20:13,840
And so when you do that in a 
smart way as a small startup, 
282
00:20:13,840 --> 00:20:19,280
you have huge tools at your hands and so
283
00:20:19,280 --> 00:20:21,040
it's not negative when you  
284
00:20:22,640 --> 00:20:29,840
start early, when you are aware of
the rules but also upcoming principles
285
00:20:31,520 --> 00:20:37,040
and you deploy them in your systems,
in your products early on.
286
00:20:38,720 --> 00:20:44,320
Yeah. Now, I just sort of picked up your 
line that they do it out of marketing 
287
00:20:44,320 --> 00:20:49,040
because I think this is also something we 
discussed in a podcast with Safiya Noble,
288
00:20:49,600 --> 00:20:53,760
the author of Algorithms
of Oppression, and I'm not sure 
289
00:20:53,760 --> 00:20:58,560
if there is a term called 'privacy washing',
like you have 'greenwashing'.
290
00:20:58,557 --> 00:20:59,360
Yeah.
291
00:20:59,360 --> 00:21:04,400
But probably it has become
a marketing gimmick to talk about 
292
00:21:04,400 --> 00:21:09,920
your customer data and how safe it is 
and having discussions around that.
293
00:21:10,800 --> 00:21:16,160
So yeah, just a comment,
not a question, but okay.
294
00:21:16,160 --> 00:21:21,440
Now, so, first we had GDPR 
and now we have the proposed 
295
00:21:21,440 --> 00:21:25,200
AI regulations in Europe and
we have never really spoken about
296
00:21:25,200 --> 00:21:30,080
it in this podcast and so
I would say that even me 
297
00:21:30,080 --> 00:21:34,560
or particularly the audience, 
are pretty new to this.
298
00:21:35,120 --> 00:21:39,360
So, if you could, and you seem to 
be very deeply involved in this, 
299
00:21:39,360 --> 00:21:43,440
so if you could take us 
through what are actually these
300
00:21:43,440 --> 00:21:49,440
AI regulations that have been proposed in Europe
and what are their challenges?
301
00:21:49,440 --> 00:21:52,480
And then probably we can go deeper into it.
302
00:21:53,280 --> 00:21:57,600
Yeah, sure. So, yeah,
the upcoming AI regulation that 
303
00:21:57,600 --> 00:21:59,280
has been tabled by the Commission,
304
00:21:59,280 --> 00:22:05,440
but is only on the negotiation 
table now for member states 
305
00:22:05,440 --> 00:22:06,640
and the European Parliament,
306
00:22:06,640 --> 00:22:12,320
is actually the core thing
I'm working on in my fellowship 
307
00:22:12,320 --> 00:22:16,800
and my intention is really
to make sure that this
308
00:22:17,760 --> 00:22:19,680
upcoming regulation, again from Europe,  
309
00:22:21,120 --> 00:22:24,400
does not again cause that
lagging behind of small players.
310
00:22:24,400 --> 00:22:29,120
Because in the field of AI, as
you all know, size clearly matters.
311
00:22:29,120 --> 00:22:32,960
The more data you can gather, 
the better your AI system works 
312
00:22:32,960 --> 00:22:35,920
and we're already pretty far 
down the road to monopolization,
313
00:22:36,720 --> 00:22:41,760
because few players in the market have access
to an impressive range of data and they can also
314
00:22:41,760 --> 00:22:45,600
afford gathering high quality 
data, which then enables them 
315
00:22:45,600 --> 00:22:50,880
to build better performing AI 
and for small scale providers,
316
00:22:50,880 --> 00:22:53,840
what's most important is the clarity of the guidance.
317
00:22:54,560 --> 00:22:56,560
I mean that sounds banal, who wouldn't say that? 
318
00:22:56,560 --> 00:23:00,880
But if you follow it wisely,
you set up an internal team analyzing
319
00:23:01,440 --> 00:23:07,840
those rules for your in-house implementation,
adapted to your AI system,  
320
00:23:08,640 --> 00:23:12,000
you get much better at estimating 
necessary additional processes.
321
00:23:14,080 --> 00:23:25,040
Yeah, so that is to say why I work on that and again 
this link to the small scale providers of AI.
322
00:23:26,400 --> 00:23:30,720
In general, the regulatory draft 
that the European Commission 
323
00:23:30,720 --> 00:23:34,400
tabled on April 21st, 
so that's not very long ago
324
00:23:34,400 --> 00:23:39,040
and in Brussels things take time,
committees are formed, 
325
00:23:40,960 --> 00:23:44,080
but the draft that the 
European Commission tabled also
326
00:23:44,080 --> 00:23:46,960
has been a very long time in the making.
327
00:23:46,960 --> 00:23:51,840
Von der Leyen, the Commission 
president, announced something 
328
00:23:51,840 --> 00:23:54,800
like it already at the
beginning of her presidency
329
00:23:54,800 --> 00:23:57,200
and then it came up this year.
330
00:23:57,840 --> 00:24:01,520
It's the most ambitious and 
the most comprehensive attempt 
331
00:24:01,520 --> 00:24:05,520
at reining in the risks linked 
to AI that
332
00:24:05,520 --> 00:24:09,920
we have seen so far across the globe,
so it's a bold new step.
333
00:24:11,440 --> 00:24:16,480
Before, the big focus of the past 
years were always just principles, 
334
00:24:16,480 --> 00:24:21,040
I'm saying, just, but I mean of course, you 
start with principles on an international level.
335
00:24:21,040 --> 00:24:26,000
The OECD principles on AI 
that were adopted in May 2019, 
336
00:24:27,280 --> 00:24:29,760
most European countries
and the US are members  
337
00:24:30,720 --> 00:24:33,520
of the Organization for Economic 
Cooperation and Development.
338
00:24:33,520 --> 00:24:35,280
So they're already...
339
00:24:36,800 --> 00:24:40,880
They promoted uses of AI that are 
innovative but also trustworthy, 
340
00:24:40,880 --> 00:24:43,840
that respect human rights
and democratic values.
341
00:24:43,840 --> 00:24:47,280
And I think now, in 2021, 
really we're at the stage of, 
342
00:24:47,920 --> 00:24:52,400
how do you transform these principles 
into practical rules and regulations?
343
00:24:52,400 --> 00:25:00,320
So, the rules that the European Commission 
proposed wouldn't cover all AI systems.
344
00:25:00,320 --> 00:25:08,000
They do cover systems that are 
deemed to pose a significant risk 
345
00:25:08,000 --> 00:25:12,000
to the safety and fundamental 
rights of people living in Europe.
346
00:25:12,960 --> 00:25:19,600
So it's, again, it's this geographical 
scope of people living in Europe, 
347
00:25:21,040 --> 00:25:25,760
so it's a risk-based approach, 
it has several layers
348
00:25:25,760 --> 00:25:32,320
and those layers have different rules 
for different classes of AI systems.
349
00:25:32,320 --> 00:25:36,400
There are some that are 
prohibited, there are some that 
350
00:25:36,400 --> 00:25:41,360
are considered high risk and 
have to follow certain rules
351
00:25:41,360 --> 00:25:48,560
and then, others where they just say,
you have to be more transparent.
352
00:25:48,560 --> 00:25:52,640
Do you want me to go more into 
details of what those rules are?
353
00:25:53,760 --> 00:25:57,440
Yes, I mean broadly, if you could 
point it out, one, two, three. 
354
00:25:57,440 --> 00:25:59,680
I don't know if it can be done that way but...
355
00:25:59,680 --> 00:26:00,459
Absolutely.
356
00:26:00,800 --> 00:26:01,593
...yeah.
357
00:26:01,920 --> 00:26:05,760
Sometimes it's confusing because 
it's really so much of a risk, 
358
00:26:05,760 --> 00:26:10,400
like you have to know in which 
category your AI system falls
359
00:26:10,400 --> 00:26:15,920
and that's really on the developers,
kind of, to understand 
360
00:26:15,920 --> 00:26:19,360
the categories and in the end 
it's actually pretty clear that
361
00:26:20,720 --> 00:26:25,360
not many AI systems actually 
fall under that rule. So..
362
00:26:25,360 --> 00:26:26,400
Oh, wow.
363
00:26:26,400 --> 00:26:29,680
...definitely something to understand. 
364
00:26:29,680 --> 00:26:34,000
So, for some uses of AI the 
Commission proposes an outright ban.
365
00:26:34,560 --> 00:26:39,280
So, that's a use where the Commission says it's 
366
00:26:39,280 --> 00:26:42,960
an unacceptable threat to 
citizens, that for example,
367
00:26:42,960 --> 00:26:48,000
it's an AI system that likely causes
physical or psychological harm by  
368
00:26:48,720 --> 00:26:50,800
manipulating a person's behavior.
369
00:26:51,520 --> 00:26:55,280
What that means, I mean that's 
a question of definition 
370
00:26:55,280 --> 00:27:00,000
or by exploiting their vulnerabilities, 
like age or disability.
371
00:27:00,880 --> 00:27:08,320
Then also something like social scoring systems,
where people collect points and also minuses,
372
00:27:08,320 --> 00:27:09,840
like those which we know in China  
373
00:27:11,040 --> 00:27:15,520
and facial recognition in public 
spaces by law enforcement authorities.
374
00:27:16,560 --> 00:27:23,040
So not all facial recognition, but just those 
used by police in public spaces, essentially.
375
00:27:23,040 --> 00:27:27,760
Although, even there you have exceptions
and I think too many of them.
376
00:27:29,280 --> 00:27:37,040
So that's, you know, the systems that the 
Commission proposes should be banned in Europe.
377
00:27:38,160 --> 00:27:41,520
But most of the regulatory draft,
actually focuses on AI 
378
00:27:42,400 --> 00:27:47,360
that is considered high risk and 
what is high risk is, of course,
379
00:27:47,360 --> 00:27:52,480
defined in the regulatory draft, 
so that's, kind of problematic 
380
00:27:52,480 --> 00:27:56,160
uses in the recruiting field, in the employment
381
00:27:56,160 --> 00:28:01,120
and admissions context, in determining a person's 
382
00:28:01,120 --> 00:28:04,640
credit worthiness or 
eligibility for public services
383
00:28:04,640 --> 00:28:10,880
and benefits and also some 
applications used in law enforcement 
384
00:28:10,880 --> 00:28:15,600
and security and judiciary and 
for those, these systems have
385
00:28:15,600 --> 00:28:21,280
to meet different requirements and 
undergo kind of a conformity assessment, 
386
00:28:22,960 --> 00:28:27,680
so, of course, before they can be 
placed on the European market.
387
00:28:27,680 --> 00:28:34,640
So, in a way, that's kind of a, 
yeah, FDA, as we call it in the US, 
388
00:28:34,640 --> 00:28:39,120
like clinical testing for, 
not for drugs or for vaccines
389
00:28:39,120 --> 00:28:44,320
but for algorithms. It's to 
make sure that an AI system 
390
00:28:44,320 --> 00:28:50,400
complies with several requirements 
around serious risk management.
391
00:28:50,400 --> 00:28:58,240
It has to use data sets in training, validation 
and testing that are relevant, representative,  
392
00:28:58,240 --> 00:29:00,000
free of errors, and complete.
393
00:29:00,000 --> 00:29:07,520
That's a super tall order, of course.
Then documentation about 
394
00:29:07,520 --> 00:29:13,040
a high risk AI system must be 
really extensive and very precise.
395
00:29:13,040 --> 00:29:18,560
Why you chose certain designs, 
why you designed it in 
396
00:29:18,560 --> 00:29:23,840
a specific way and also to 
show that the developers
397
00:29:23,840 --> 00:29:26,960
really checked all these factors diligently.
398
00:29:27,520 --> 00:29:31,280
There has to be, the key
word is always human oversight, 
399
00:29:31,280 --> 00:29:34,800
so high-risk AI systems must 
be designed in a way that
400
00:29:34,800 --> 00:29:40,960
allows people to understand the capabilities
and the limitations of a system and to counter
401
00:29:40,960 --> 00:29:46,880
the so-called automation bias and also if 
necessary reverse or override the output.
402
00:29:47,600 --> 00:29:48,100
Yeah.
403
00:29:48,880 --> 00:29:54,000
And then of course, accuracy, 
robustness, security and transparency.
404
00:29:54,000 --> 00:29:57,280
So this is, kind of, this clinical
testing for algorithms.
405
00:29:58,480 --> 00:30:04,240
So, do you see, because there’ve also
been some articles criticizing, 
406
00:30:04,240 --> 00:30:07,280
I think, I mean that's why we have media, right?
407
00:30:07,280 --> 00:30:11,600
But do you see any loopholes in this regulation, 
408
00:30:12,320 --> 00:30:18,080
proposed AI regulation or
do you feel that there was some,
409
00:30:19,040 --> 00:30:24,000
I mean, in terms of your learnings, do you 
see any challenges or loopholes in this?
410
00:30:25,200 --> 00:30:28,480
Yeah. I mean, first of all,
of course, we only have a draft 
411
00:30:28,480 --> 00:30:31,760
so far, and the European Parliament
412
00:30:31,760 --> 00:30:34,800
and also other bodies in 
Europe have already called 
413
00:30:34,800 --> 00:30:39,120
for much stricter rules in 
some elements of the draft,
414
00:30:39,120 --> 00:30:45,680
so, certain member states will 
definitely also have their say on, 
415
00:30:45,680 --> 00:30:51,360
I mean all of them will have their say, but 
certain member states are of the opinion
416
00:30:51,360 --> 00:30:56,640
that it should be stricter in some 
cases, so this is not the final word.
417
00:30:58,960 --> 00:31:06,640
Well, in my personal opinion, I do think 
that exceptions for facial recognition, 
418
00:31:06,640 --> 00:31:15,280
just to take one big part of what people 
expected of this regulatory draft, are too wide.
419
00:31:16,800 --> 00:31:26,320
So, it's just difficult when you ban 
very specific uses of facial recognition, 
420
00:31:26,320 --> 00:31:33,600
but then, actually in industry or private 
uses there's no ban at all and so,
421
00:31:33,600 --> 00:31:38,640
even in the uses of law enforcement you 
have certain areas where it can be used.
422
00:31:38,640 --> 00:31:43,280
So in a way, practically speaking, 
law enforcement in Europe 
423
00:31:43,280 --> 00:31:47,600
will buy facial recognition
systems on the market,
424
00:31:47,600 --> 00:31:54,400
wherever they're produced, and use them in those
specific cases that they would be allowed to
425
00:31:55,280 --> 00:31:59,120
and how do you really want to make sure 
that they don't use them for other things?
426
00:31:59,120 --> 00:32:01,920
So, I think that's a huge 
loophole and then of course, 
427
00:32:02,800 --> 00:32:10,240
and I think it's really dangerous, so I really 
don't want to sound alarmist but in a way,
428
00:32:10,240 --> 00:32:17,520
I do think that facial recognition has a 
potential to actually undermine our free society.
429
00:32:17,520 --> 00:32:20,800
That sounds big but I do think it is a big danger.
430
00:32:21,520 --> 00:32:22,020
Yeah.
431
00:32:23,600 --> 00:32:24,960
I mean there are also  
432
00:32:24,960 --> 00:32:32,560
other things that we're kind of used to
in the European regulatory process,
433
00:32:33,440 --> 00:32:41,040
some rules are just very loosely 
defined and over time with use, 
434
00:32:41,040 --> 00:32:47,600
once it's in effect, we will better understand 
what's actually meant by it or maybe get guidance,
435
00:32:47,600 --> 00:32:56,240
but definitely, also, from practical life, 
court decisions, practical use cases.
436
00:32:56,240 --> 00:33:06,400
All of that, just like in GDPR, wasn't or
isn't always clear for me in the end, 
437
00:33:06,400 --> 00:33:11,680
so there's a lot to criticize about this draft.
It's definitely one that has been  
438
00:33:12,960 --> 00:33:16,640
drafted in a part
439
00:33:16,640 --> 00:33:22,640
of the Commission that is mostly interested 
in encouraging AI development in Europe, 
440
00:33:22,640 --> 00:33:30,000
so, just to make it a more 
competitive market, and not by the legal
441
00:33:30,000 --> 00:33:35,840
and the justice-oriented
and human-rights-oriented  
442
00:33:37,440 --> 00:33:42,000
experts in the Commission, 
so that, I think, shows.
443
00:33:42,560 --> 00:33:47,840
But for me, the most important thing is,
actually, that there is some draft 
444
00:33:47,840 --> 00:33:52,880
on the table in one part of 
the world and other places  
445
00:33:53,520 --> 00:34:00,240
will look at it and decide if it's 
something that they want to adopt as well
446
00:34:00,240 --> 00:34:07,520
or create differently, but still
discuss what it can and what it can't do.
447
00:34:08,560 --> 00:34:15,120
I don't think it's a coincidence that just 
before the commission published its proposal, 
448
00:34:15,120 --> 00:34:20,480
the date was public, the
United States Federal Trade Commission, the FTC,
449
00:34:20,480 --> 00:34:25,280
published a reminder that it's watching 
closely what AI systems companies aim 
450
00:34:25,280 --> 00:34:30,400
to use and that they really have to make 
sure that there's no bias in those systems
451
00:34:30,400 --> 00:34:36,640
and no discrimination resulting from their use.
452
00:34:37,920 --> 00:34:42,880
So, I'm pretty hopeful that
there will be some collaboration in making 
453
00:34:42,880 --> 00:34:47,840
rules compatible between the United States and Europe.
454
00:34:48,800 --> 00:34:56,160
There's been some really positive
signs already from the US administration, 
455
00:34:56,160 --> 00:34:59,360
that the US wants to work 
with the EU in this field,
456
00:35:00,160 --> 00:35:02,480
because in the end let's not 
forget that the United States 
457
00:35:02,480 --> 00:35:08,640
and Europe share values and commitments 
to protecting rights and human dignity.
458
00:35:08,640 --> 00:35:13,040
It sounds cheesy but I do believe 
it's true at the core, we, 
459
00:35:13,040 --> 00:35:16,640
compared to the rest of the world,
we have a lot in common
460
00:35:16,640 --> 00:35:20,240
and we have certain values that are at stake.
461
00:35:21,200 --> 00:35:29,440
We should go back to those roots and know that 
if we let AI develop completely unregulated, 
462
00:35:30,560 --> 00:35:36,000
those core values of our democracies 
are at stake and that's not alarmist,
463
00:35:36,000 --> 00:35:40,320
as I mentioned, it's not
some science fiction,  
464
00:35:41,280 --> 00:35:45,600
it's actually already happening 
in a lot of developments.
465
00:35:46,800 --> 00:35:54,400
So, I think, yeah, that's a very good reason 
to cooperate internationally on these rules.
466
00:35:54,400 --> 00:36:00,000
Yeah, actually that's a very interesting 
perspective because one of my next questions, 
467
00:36:00,000 --> 00:36:06,080
I don't know if it's null and void now,
but I was going to ask you that  
468
00:36:07,600 --> 00:36:12,720
if regulation is going to impact innovation in AI,
469
00:36:12,720 --> 00:36:20,480
but you just mentioned that this is a very 
pro-innovation kind of regulation, 
470
00:36:22,400 --> 00:36:27,760
so I would rather reframe it and ask you that,
how do innovation and regulations  
471
00:36:29,120 --> 00:36:31,120
in AI go hand in hand?
472
00:36:32,640 --> 00:36:38,560
Yeah, well, as a matter of principle, I 
don't believe regulation stifles innovation. 
473
00:36:38,560 --> 00:36:43,760
Very often in my daily life, I work with 
technologists who feel that, actually,
474
00:36:43,760 --> 00:36:48,080
regulation gives them some guard rails that 
they actually appreciate, as a matter of fact.
475
00:36:49,520 --> 00:36:52,640
I think on the contrary,
smart regulation that gets out in 
476
00:36:52,640 --> 00:36:57,760
front of emerging technology can 
protect consumers and drive innovation.
477
00:36:57,760 --> 00:37:04,240
So my focus is on protecting consumers and 
that, I think, is the most important thing.
478
00:37:05,360 --> 00:37:07,025
I mean, we're not just a  
479
00:37:07,520 --> 00:37:18,880
huge, couple billion of mice in a lab, 
we are people with rights and dignity.
480
00:37:18,880 --> 00:37:21,840
And it feels to me that policymakers have,  
481
00:37:22,720 --> 00:37:28,880
kind of, over the last decades,
forgotten that regulation
can be beneficial and  
482
00:37:28,880 --> 00:37:33,360
that it's not always just good to 
give industry players free rein,
483
00:37:35,040 --> 00:37:40,160
just to deploy the technologies 
according to their business model, 
484
00:37:40,160 --> 00:37:47,040
because there's also, definitely,
an increasing backlash against tech companies.
485
00:37:48,080 --> 00:37:52,080
People kind of have this suspicion, 
kind of a dark suspicion, 
486
00:37:52,080 --> 00:37:56,080
that these companies are interested 
primarily in promoting their own dominance.
487
00:37:56,800 --> 00:38:00,480
Very low trust. I mean there are 
surveys that show this very clearly.
488
00:38:00,480 --> 00:38:05,920
The technology companies enjoy 
very low trust in general but then, 
489
00:38:05,920 --> 00:38:10,160
I mean, they do sell their products 
but people don't trust them
490
00:38:10,160 --> 00:38:16,960
and as a result, this is when 
policymakers at state and local level, 
491
00:38:17,520 --> 00:38:23,520
actually begin to consider
technology bans and that's starting mostly
492
00:38:23,520 --> 00:38:27,600
at those lower local and state
levels but I think that at some point, 
493
00:38:27,600 --> 00:38:32,480
and it's already happening, at a
higher level they're also talking about that.
494
00:38:32,480 --> 00:38:39,680
In Europe faster than in other 
places but, yeah, as I said, 
495
00:38:39,680 --> 00:38:46,640
I think those who know AI development well 
and who warn of unguarded development,
496
00:38:47,920 --> 00:38:52,960
are often considered to be 
overreacting to science fiction speculation, 
497
00:38:52,960 --> 00:38:57,760
but actually there are really creepy
AI systems already deployed in real life.
498
00:38:59,280 --> 00:39:04,240
Examples that I really find creepy
are those algorithmic selection systems  
499
00:39:04,240 --> 00:39:06,960
in hiring interviews that are 
actually based on biased data,
500
00:39:06,960 --> 00:39:10,560
so you could say that, well, it's 
just a really badly built product 
501
00:39:11,440 --> 00:39:16,240
but I mean they're just let loose 
on people before being tested
502
00:39:16,240 --> 00:39:26,080
or checked and they cause real harm or 
mood detection systems in music apps.
503
00:39:26,080 --> 00:39:32,560
Spotify presented something like that.
It sounds fun but actually those  
504
00:39:32,560 --> 00:39:39,360
systems put people in categories and they 
reinforce, sometimes, dangerous cliches.
505
00:39:40,560 --> 00:39:44,960
They do, as a matter of fact, not 
comply with existing consumer protection 
506
00:39:44,960 --> 00:39:51,120
laws or non-discrimination laws or actually 
just basic principles of decency. So...
507
00:39:51,120 --> 00:39:51,760
Yeah.
508
00:39:51,760 --> 00:39:57,200
...I think what's changing right now in the 
US, is that the idea that the technology sector 
509
00:39:57,200 --> 00:40:02,720
is in some way different from other industry 
sectors, that it needs more freedoms
510
00:40:02,720 --> 00:40:04,960
and exceptions to general market rules.
511
00:40:04,960 --> 00:40:10,080
That's fading and agencies are looking more 
into how to apply their existing rules  
512
00:40:11,120 --> 00:40:14,160
around competition, or antitrust,
513
00:40:14,160 --> 00:40:17,440
as it's called in the US, and consumer protection, safety, 
514
00:40:18,000 --> 00:40:21,760
and non-discrimination, to technology 
and that's very similar in Europe.
515
00:40:21,760 --> 00:40:28,720
I think it's time that we consider the technology 
sector as a normal part of our economies, 
516
00:40:28,720 --> 00:40:31,360
that doesn't need and shouldn't enjoy exceptions
517
00:40:31,360 --> 00:40:34,960
or special protection, compared
to other more regulated industries.
518
00:40:35,680 --> 00:40:44,000
Yeah and also I think generally, people are 
redefining innovation, it's that sentiment.
519
00:40:44,560 --> 00:40:48,720
But beautifully put and also 
a very positive note to come  
520
00:40:48,720 --> 00:40:53,440
to concluding statements for today's episode.
521
00:40:54,000 --> 00:41:02,480
So, Julia, if there was one thing
you would want our audience to take away 
522
00:41:02,480 --> 00:41:11,200
from today's discussion, what 
would it be? Very broad question.
523
00:41:11,200 --> 00:41:15,600
Yeah, very broad, but let me 
think of something good for you. 
524
00:41:15,600 --> 00:41:21,040
I think what I really want is 
that people look at technology,
525
00:41:21,040 --> 00:41:29,760
as I just mentioned, as one industry 
among many, that should be centered around 
526
00:41:29,760 --> 00:41:38,240
us people who use it as a tool
and should not be a cause in itself.
527
00:41:38,800 --> 00:41:43,040
So, those companies don't need some special status. 
528
00:41:43,040 --> 00:41:47,680
They're not, like,
small innovation sparks anymore.
529
00:41:47,680 --> 00:41:56,400
They are our biggest and most costly and 
most powerful economic players and so, 
530
00:41:57,120 --> 00:42:07,760
we should really kind of turn around the 
picture and say, who are you there for?
531
00:42:07,760 --> 00:42:14,400
Are you there for us or are you there 
for just sucking up power and money? 
532
00:42:14,400 --> 00:42:23,920
And so, if we say that you're there for us and for 
us to really benefit of all the really positive
533
00:42:23,920 --> 00:42:30,640
and promising uses that, not only AI, but 
technology in general, can have for humanity, 
534
00:42:30,640 --> 00:42:39,920
let's also create smart and easily understandable 
rules that make sure that we don't just say,
535
00:42:39,920 --> 00:42:48,320
“oops yeah, that was a mistake, I'm sorry 
that I just destroyed your existence”.
536
00:42:50,880 --> 00:42:57,120
And so, we have to make sure that
the human is at the center and that AI 
537
00:42:57,120 --> 00:43:02,480
or in general, technology, is a 
tool for humans and that humans are  
538
00:43:02,480 --> 00:43:06,480
still created equal and have dignity as individuals.
539
00:43:07,680 --> 00:43:12,800
Yeah, and we try to do a bit 
by interviewing people like you 
540
00:43:13,440 --> 00:43:16,480
and I mean it was such a pleasure,  
541
00:43:16,480 --> 00:43:22,880
it was a spontaneous plan that we had to 
record this but it turned out so great.
542
00:43:22,880 --> 00:43:27,680
I'm so happy that we could host 
you after all these months of trying 
543
00:43:27,680 --> 00:43:33,760
to get you on Voices of Data Economy and
thank you for being here today.
544
00:43:34,560 --> 00:43:38,149
Thank you, Diksha. It was a really nice conversation.
1
00:00:00,240 --> 00:00:04,960
Ως θέμα αρχής, δεν πιστεύω ότι η
ρύθμιση καταπνίγει την καινοτομία.
2
00:00:04,960 --> 00:00:07,920
Πολύ συχνά στην καθημερινή μου
ζωή, εργάζομαι με τεχνολογίες  
3
00:00:07,920 --> 00:00:10,560
που αισθάνονται ότι, στην πραγματικότητα,
η ρύθμιση τους δίνει κάποια
4
00:00:10,560 --> 00:00:14,000
προστασία που στην πραγματικότητα εκτιμούν.
5
00:00:14,000 --> 00:00:17,280
Νομίζω ότι, αντίθετα, η έξυπνη
ρύθμιση που μπαίνει μπροστά από
6
00:00:17,280 --> 00:00:21,680
την αναδυόμενη τεχνολογία μπορεί να προστατεύσει
τους καταναλωτές και να προωθήσει την καινοτομία.
7
00:00:21,680 --> 00:00:28,160
Έτσι, επικεντρώνομαι στην προστασία των καταναλωτών
και αυτό θεωρώ ότι είναι το πιο σημαντικό πράγμα.
8
00:00:29,280 --> 00:00:34,720
Θέλω να πω ότι δεν είμαστε απλώς μερικά
δισεκατομμύρια ποντίκια σε ένα εργαστήριο,
9
00:00:35,520 --> 00:00:38,720
είμαστε άνθρωποι με δικαιώματα και αξιοπρέπεια.
10
00:00:41,520 --> 00:00:48,160
Αυτή είναι η δεύτερη σεζόν της εκπομπής Voices of the Data Economy,
ένα podcast που υποστηρίζεται από το Ίδρυμα του Πρωτοκόλλου Ocean.
11
00:00:49,040 --> 00:00:52,240
Σας παρουσιάζουμε τις φωνές που
διαμορφώνουν την Οικονομία των Δεδομένων
12
00:00:52,240 --> 00:00:56,160
και ταυτόχρονα την αμφισβητούν.
Ακούστε ιδρυτές,
13
00:00:56,160 --> 00:00:59,600
ειδικούς σε θέματα τεχνολογικής πολιτικής
και πρωτοπόρους στις επενδύσεις με αντίκτυπο.
14
00:01:00,240 --> 00:01:02,880
Όλοι μοιράζονται τη σχέση τους με τα δεδομένα.
15
00:01:04,160 --> 00:01:07,520
Γεια σας λοιπόν και καλώς ήρθατε,
σήμερα έχουμε μαζί μας την Julia.
16
00:01:07,520 --> 00:01:12,000
Είναι επαγγελματίας στον τομέα της προστασίας των προσωπικών δεδομένων
και ειδικός στη διακυβέρνηση της Τεχνητής Νοημοσύνης (AI).
17
00:01:12,000 --> 00:01:15,600
Είναι επίσης μέλος του Mozilla Fellow
in Residence και έχει διατελέσει
18
00:01:15,600 --> 00:01:20,880
Γερμανίδα διπλωμάτης στο παρελθόν,
οπότε ένα πολύ ευέλικτο προφίλ.
19
00:01:20,880 --> 00:01:21,840
Γεια σου, Julia.
20
00:01:23,040 --> 00:01:23,840
Γεια σου, Diksha.
21
00:01:24,480 --> 00:01:26,080
Γεια, συγγνώμη, το είπα λάθος.
22
00:01:26,080 --> 00:01:27,680
Γεια σου, Julia. Πάντα...
23
00:01:29,200 --> 00:01:35,920
Έχω Αμερικανούς φίλους και Γερμανούς φίλους
και είναι πάντα μπερδεμένο, αλλά εδώ είμαστε, Julia.
24
00:01:36,640 --> 00:01:37,760
Ακριβώς, ωραία.
25
00:01:37,760 --> 00:01:40,400
Λοιπόν, είσαι στη Γερμανία σήμερα.
26
00:01:41,280 --> 00:01:47,360
Ναι, είμαι. Κανονικά βρίσκομαι στο Σαν Φρανσίσκο
αλλά για πρώτη φορά από τότε που ξεκίνησε η πανδημία,
27
00:01:47,920 --> 00:01:51,600
μπόρεσα να ταξιδέψω και είμαι
πραγματικά χαρούμενη που είμαι
28
00:01:51,600 --> 00:01:54,640
εδώ και που σας βλέπω όλους
ξανά μετά από τόσο καιρό,
29
00:01:54,640 --> 00:01:59,440
αλλά θα επιστρέψω στο Σαν Φρανσίσκο
σε μερικές εβδομάδες.
30
00:02:00,400 --> 00:02:05,600
Ναι, το καλοκαίρι είναι πάντα μια υπέροχη
εποχή για να βρίσκεσαι εδώ στη Γερμανία.
31
00:02:05,600 --> 00:02:07,760
Και δεν είναι η καλύτερη εποχή στο Σαν Φρανσίσκο.
32
00:02:07,760 --> 00:02:11,280
Νομίζω ότι κάποιος είπε ότι τίποτα
δεν είναι χειρότερο ή ψυχρότερο από το  
33
00:02:13,360 --> 00:02:15,360
καλοκαίρι στο Σαν Φρανσίσκο λόγω της ομίχλης.
34
00:02:16,480 --> 00:02:18,720
Εντάξει, ωραία, οπότε έχεις τα
καλύτερα και από τους δύο κόσμους.
35
00:02:20,720 --> 00:02:24,640
Και εντάξει, επιστρέφοντας στο προφίλ σου.
36
00:02:24,640 --> 00:02:27,600
Πες μας λίγο για την πορεία
σου από το να είσαι
37
00:02:27,600 --> 00:02:31,520
Γερμανίδα διπλωμάτης σε
σύμβουλο πολιτικής δεδομένων
38
00:02:31,520 --> 00:02:33,120
και κανονισμών στις ΗΠΑ.
39
00:02:33,120 --> 00:02:36,160
Εννοώ ότι είναι μια πολύ ενδιαφέρουσα αλλαγή.
40
00:02:36,160 --> 00:02:41,040
Πώς οδήγησε το ένα πράγμα στο
άλλο και κάνετε αυτό που κάνετε;
41
00:02:42,400 --> 00:02:47,040
Ναι, ίσως ένας συνδετικός κρίκος
ήταν πάντα ότι πάντα ήθελα να  
42
00:02:47,040 --> 00:02:51,840
να εργάζομαι σε πράγματα που έχουν σημασία.
Μου αρέσει περισσότερο η διαλειτουργική εργασία.
43
00:02:53,520 --> 00:02:56,240
Εννοώ τα πράγματα που έχουν σημασία, αυτό
ακούγεται πολύ προσανατολισμένο στην αποστολή.
44
00:02:56,240 --> 00:03:00,080
Νομίζω ότι αυτό είμαι, αλλά είμαι
πολύ περίεργη για τα πράγματα
45
00:03:00,080 --> 00:03:04,400
που έχουν κάποια σημασία για
μένα, αλλά και για τον κόσμο.
46
00:03:04,400 --> 00:03:08,400
Σπούδασα διεθνείς σχέσεις και Ευρωπαϊκές σπουδές
47
00:03:08,400 --> 00:03:11,440
στη Γερμανία και στη συνέχεια πήγα
στην Καλιφόρνια και τελικά ολοκλήρωσα
48
00:03:11,440 --> 00:03:12,480
τις σπουδές μου στη Γαλλία.
49
00:03:13,680 --> 00:03:20,800
Πήγα να εργαστώ σε ένα μεγάλο επιστημονικό επιτελείο για την εξωτερική
πολιτική και στη συνέχεια σε ένα φυλάκιο της Επιτροπής στο Λίβανο.
50
00:03:20,800 --> 00:03:25,120
Έχω μια αγάπη για τη Μέση Ανατολή,
αλλά στη συνέχεια αποφάσισα
51
00:03:25,120 --> 00:03:27,440
να μπω στη Γερμανική διπλωματική υπηρεσία και πέρασα
52
00:03:27,440 --> 00:03:34,240
σχεδόν 15 χρόνια σε διάφορες, τις οποίες βρήκα εξαιρετικά
ενδιαφέρουσες θέσεις στην κυβέρνηση της Γερμανίας,
53
00:03:34,240 --> 00:03:36,320
but also in France and Italy.
54
00:03:36,320 --> 00:03:39,200
And I started getting fascinated by
technology when I was part of
55
00:03:39,200 --> 00:03:42,320
negotiating teams for tech
policy frameworks in Europe.
56
00:03:43,120 --> 00:03:48,000
My role in that context focused
on drafting and coordinating policies
57
00:03:48,000 --> 00:03:50,960
for new and smart regulation of technology
58
00:03:50,960 --> 00:03:53,520
and during that period, actually,
I was heavily involved in
59
00:03:53,520 --> 00:03:57,360
the beginning of the negotiations on
the GDPR at the European level.
60
00:03:57,360 --> 00:04:00,240
It took them quite a while,
it took six years in total
61
00:04:00,240 --> 00:04:04,000
but I was there at the beginning
and then in 2012,
62
00:04:04,000 --> 00:04:07,520
nine years ago, I came
to the US and my job was
63
00:04:07,520 --> 00:04:10,880
to organize multi-stakeholder meetings
and discussions between
64
00:04:10,880 --> 00:04:14,160
Germany and tech companies
in Silicon Valley, and also
65
00:04:15,120 --> 00:04:17,280
to do communications work around that.
66
00:04:17,920 --> 00:04:25,120
So in a way I combine being strong on content
and caring about the content, but also really
67
00:04:25,120 --> 00:04:28,560
enjoying coordination and
trust and relationship building
68
00:04:29,200 --> 00:04:32,640
and in that way I have
built this really large network
69
00:04:32,640 --> 00:04:34,720
in the tech and regulation space.
70
00:04:34,720 --> 00:04:38,720
And so, after those three
years at the diplomatic
71
00:04:38,720 --> 00:04:42,480
outpost, at the intersection of technology
and policy, I decided
72
00:04:42,480 --> 00:04:46,720
to become a tech policy consultant,
bridging Europe and the US,
73
00:04:46,720 --> 00:04:50,560
so that I can be in both
places geographically and also in terms
74
00:04:50,560 --> 00:04:56,880
of sectors for different clients, and
right now I wear, let's say, two hats.
75
00:04:58,000 --> 00:05:00,240
One is that I work as a
privacy professional,
76
00:05:01,040 --> 00:05:04,400
so I join the trust and safety
teams of tech companies
77
00:05:04,400 --> 00:05:08,560
in Europe and in the Bay
Area, and I also do the same work
78
00:05:08,560 --> 00:05:11,840
for consulting firms
and for non-profit foundations.
79
00:05:12,480 --> 00:05:16,800
And the main topic is compliance in privacy
80
00:05:16,800 --> 00:05:20,560
and data protection,
so in Europe that is the famous
81
00:05:20,560 --> 00:05:23,600
General Data Protection
Regulation, but there is
82
00:05:23,600 --> 00:05:25,680
something similar now in California as well.
83
00:05:25,680 --> 00:05:28,560
We can talk about that
later if you're interested,
84
00:05:29,280 --> 00:05:33,200
but I also do consequence
scanning, which has been a buzzword
85
00:05:33,200 --> 00:05:37,200
lately in Silicon Valley, and security.
86
00:05:37,200 --> 00:05:40,400
And so, I really become part
of those engineering
87
00:05:40,400 --> 00:05:44,720
and product teams, and I start
with data protection
88
00:05:44,720 --> 00:05:49,520
and then I also work on other policies that
I helped draft for those companies.
89
00:05:49,520 --> 00:05:54,240
And the second hat is that,
because I wanted to work on the
90
00:05:54,240 --> 00:05:57,280
geopolitics of tech regulation
a bit more, maybe this is
91
00:05:57,280 --> 00:06:02,560
a continuation, or I wanted to start
something again, which really
92
00:06:02,560 --> 00:06:07,520
offered
93
00:06:07,520 --> 00:06:09,920
me that Fellowship in Residence that you mentioned.
94
00:06:10,880 --> 00:06:15,840
I started it last year and I'm working
on upcoming EU regulation
95
00:06:15,840 --> 00:06:21,200
on trustworthy AI, and that
is a kind of analytical research work,
96
00:06:21,200 --> 00:06:27,040
but I like to go out and talk to
AI builders and also
97
00:06:27,040 --> 00:06:30,720
to government experts,
and to run workshops and
98
00:06:30,720 --> 00:06:33,840
get my insights directly from the tech
companies that are building AI.
99
00:06:33,840 --> 00:06:36,880
I'm also part of the policy
team at the Mozilla Corporation,
100
00:06:37,600 --> 00:06:41,680
as opposed to the Foundation,
and so I help identify
101
00:06:41,680 --> 00:06:44,960
and track and analyze
policy issues that affect
102
00:06:44,960 --> 00:06:49,520
Mozilla's products, at the state,
national and international level.
103
00:06:49,520 --> 00:06:52,880
So the team there is really small,
but it is still very global
104
00:06:52,880 --> 00:06:55,680
and I'm glad I can
contribute as someone who knows
105
00:06:55,680 --> 00:06:57,520
several regions of the world very well.
106
00:06:58,640 --> 00:07:03,200
Wow. So, I mean, it's great
that you work on the public side
107
00:07:03,200 --> 00:07:06,320
as well, and then you are
in the tech hub
108
00:07:06,320 --> 00:07:12,000
and you deal with all, let's say,
the partners in this ecosystem.
109
00:07:12,640 --> 00:07:17,520
We will definitely discuss AI
regulation in depth, but let's
110
00:07:17,520 --> 00:07:21,680
take a step back and go
to the GDPR, and we have an obsession
111
00:07:22,320 --> 00:07:26,240
with it in the EU, and it has
just turned three years old
112
00:07:26,880 --> 00:07:30,560
and now you can see that things
are being put into practice in the EU.
113
00:07:31,200 --> 00:07:36,880
But broadly speaking, what has its impact
been on Silicon Valley over these three years?
114
00:07:37,600 --> 00:07:39,760
What worked? What didn't work?
115
00:07:39,760 --> 00:07:43,840
Something like a report card
for the GDPR in Silicon Valley.
116
00:07:44,400 --> 00:07:52,400
Yes. So the GDPR had some really
remarkable and immediate effects worldwide,
117
00:07:52,400 --> 00:07:57,280
first of all in Europe, as you
can see, but it has also brought
118
00:07:57,280 --> 00:08:01,440
awareness to Silicon Valley and the
US that privacy is important
119
00:08:01,440 --> 00:08:05,120
to people in Europe but
also in other regions of the world, and
120
00:08:05,120 --> 00:08:09,600
that it is a human right that
many in the US had not really
121
00:08:09,600 --> 00:08:13,120
considered important, but surveys,
even there among the
122
00:08:13,120 --> 00:08:14,400
population, proved them wrong.
123
00:08:15,040 --> 00:08:18,640
So the global conversation around
privacy has really shifted
124
00:08:18,640 --> 00:08:23,040
over the last, yes, three years,
but also a bit earlier,
125
00:08:23,040 --> 00:08:27,360
but certainly from 2018
onwards, and so have the laws.
126
00:08:27,360 --> 00:08:32,320
So as a direct result of the GDPR
in Europe, countries like Japan
127
00:08:32,320 --> 00:08:36,640
or Brazil have passed privacy laws
inspired by the GDPR, and India
128
00:08:36,640 --> 00:08:41,040
and even China are considering their
own laws, even though those
129
00:08:41,040 --> 00:08:43,280
look different from what we
think they should be
130
00:08:43,280 --> 00:08:47,360
but it is definitely a big push.
131
00:08:47,360 --> 00:08:52,720
And California's new privacy
law, which came into force in 2020,
132
00:08:52,720 --> 00:08:56,960
is a direct result of the GDPR, and
California is the first state
133
00:08:56,960 --> 00:09:00,160
in the United States to have established
a privacy protection system
134
00:09:00,160 --> 00:09:01,840
at the state level.
135
00:09:01,840 --> 00:09:04,800
No other US state has
anything like it, whereas in Europe
136
00:09:05,440 --> 00:09:09,200
every member state has
at least one at the national level,
137
00:09:09,200 --> 00:09:15,760
Germany has 16, one for each
federal state. But in California,
138
00:09:16,400 --> 00:09:21,440
this is a first for the United States,
and that agency has only just started its work.
139
00:09:21,440 --> 00:09:25,600
It has a GDPR expert of Spanish
origin as a member of the board
140
00:09:25,600 --> 00:09:30,320
at the California state level, in that agency,
so I think that is really remarkable.
141
00:09:30,320 --> 00:09:35,760
And also, overall, in more geopolitical
terms, the GDPR has also shown
142
00:09:35,760 --> 00:09:39,360
Silicon Valley that one of
its biggest markets, Europe,
143
00:09:39,360 --> 00:09:42,400
has its own rules and
they have to follow them if
144
00:09:42,400 --> 00:09:45,040
they want to play a significant role
and make money there.
145
00:09:45,600 --> 00:09:48,720
And as a result, many
US-based organizations
146
00:09:48,720 --> 00:09:51,440
that process people's personal
data all over the world
147
00:09:51,440 --> 00:09:55,600
decided to implement the GDPR
and extend all the rights
148
00:09:55,600 --> 00:09:58,480
that come with it to their
customers, who don't have to
149
00:09:58,480 --> 00:10:01,520
be residents of Europe
but who live outside of Europe
150
00:10:01,520 --> 00:10:04,400
and that applies to many
companies around the world.
151
00:10:05,120 --> 00:10:07,680
It gives them an advantage
in global compliance
152
00:10:07,680 --> 00:10:11,520
and it is easier for them when it
comes to handling complaints and requests.
153
00:10:11,520 --> 00:10:16,000
So they say, just give all
our customers all the rights
154
00:10:16,000 --> 00:10:19,120
that Europeans have, which
you're not obliged to do,
155
00:10:19,120 --> 00:10:22,800
it's a very high bar, but it's still
easier for the organization
156
00:10:22,800 --> 00:10:26,560
than figuring out the customer's
location, and sometimes
157
00:10:26,560 --> 00:10:29,520
they only have an email
address, or assigning different
158
00:10:29,520 --> 00:10:32,080
rights depending on their
location, I mean that is really
159
00:10:32,080 --> 00:10:38,240
complicated when you have a hundred
different sets of terms and conditions.
160
00:10:39,200 --> 00:10:42,480
So the GDPR offers them a
legal framework and a set
161
00:10:42,480 --> 00:10:46,400
of standards that is, well, at least
compared to other less
162
00:10:46,400 --> 00:10:50,000
clearly specified legislation or where
there is no legislation at all,
163
00:10:50,000 --> 00:10:52,000
relatively clear to adopt.
164
00:10:52,720 --> 00:10:56,800
So my clients are mostly
small and medium-sized businesses based
165
00:10:56,800 --> 00:11:01,200
in the US with only a few customers
in Europe, or sometimes just
166
00:11:01,200 --> 00:11:04,080
the mere intention of expanding
into Europe soon,
167
00:11:05,120 --> 00:11:09,520
but this privacy management
strategy, just take the GDPR
168
00:11:09,520 --> 00:11:12,880
as a high standard and
apply it to everyone,
169
00:11:12,880 --> 00:11:15,520
can also be seen at
larger tech companies.
170
00:11:16,080 --> 00:11:18,560
Because they appreciate that there
is now a standard that
171
00:11:18,560 --> 00:11:22,320
is law in one part of the world but
can serve as a guideline
172
00:11:22,320 --> 00:11:28,000
for other parts of the world too,
and that it is easier
173
00:11:28,000 --> 00:11:30,960
to have one high-profile
standard than many different ones,
174
00:11:30,960 --> 00:11:34,160
what I call the growing
global privacy patchwork.
175
00:11:34,960 --> 00:11:39,520
That has made the EU the real
setter of rules and of tech policy
176
00:11:39,520 --> 00:11:44,880
globally, so the whole perception
of European rule-making
177
00:11:44,880 --> 00:11:47,680
and of Europe as a player in the
tech industry has changed
178
00:11:47,680 --> 00:11:51,440
in a very short time in
Silicon Valley and yes, so the narratives
179
00:11:51,440 --> 00:11:55,280
have changed and internal
data management has changed.
180
00:11:55,280 --> 00:11:59,440
I have to mention that one disappointing
factor is enforcement, so,
181
00:12:00,080 --> 00:12:04,640
even when tech companies
are hit with billion-dollar fines,
182
00:12:04,640 --> 00:12:10,560
for them it's a slap on the wrist,
and so far the GDPR has not changed
183
00:12:11,200 --> 00:12:14,240
the underlying business models,
the way money
184
00:12:14,240 --> 00:12:17,920
is made on the internet, by
tracking people's behavior.
185
00:12:17,920 --> 00:12:21,120
So it's not just about one
company's business model,
186
00:12:21,120 --> 00:12:27,760
it's the entire internet
that is based on a, yes,
187
00:12:27,760 --> 00:12:31,360
an economic model that doesn't
have privacy in mind,
188
00:12:31,360 --> 00:12:34,400
so changing that, of course,
requires fundamental
189
00:12:34,400 --> 00:12:39,680
and probably painful adjustments to the way
things have been structured.
190
00:12:39,680 --> 00:12:43,120
That is certainly something the
GDPR has so far not
191
00:12:43,120 --> 00:12:46,960
managed to achieve, and that
is certainly a bit disappointing.
192
00:12:49,120 --> 00:12:53,280
So there is the GDPR in the
US, sorry, in Europe,
193
00:12:53,280 --> 00:12:58,320
and there are also data regulations
in the US. I think it's the CIPP.
194
00:13:00,240 --> 00:13:04,720
So when you compare the
standards, how compatible
195
00:13:04,720 --> 00:13:09,120
are the two when companies
implement them together?
196
00:13:10,080 --> 00:13:11,360
Does my question make sense?
197
00:13:12,000 --> 00:13:13,120
Absolutely, yes.
198
00:13:13,120 --> 00:13:17,040
So, yes, I mentioned this
global privacy patchwork
199
00:13:17,040 --> 00:13:25,360
and certainly, so, the privacy
law that applies in California is called
200
00:13:25,360 --> 00:13:28,640
the CCPA, it's called the California
Consumer Privacy Act.
201
00:13:28,640 --> 00:13:33,920
It will actually soon be replaced
again by another one called
202
00:13:33,920 --> 00:13:39,200
the CPRA, and it's a bit, even
an advanced version of the CCPA,
203
00:13:40,400 --> 00:13:48,320
acronyms, but some
principles are the same as the GDPR
204
00:13:48,320 --> 00:13:56,240
and then again it has some
elements that go even beyond the GDPR
205
00:13:56,240 --> 00:14:01,120
and I would say that enforcing
them will be very difficult.
206
00:14:01,760 --> 00:14:08,160
I think in a way it is
a law that was created very,
207
00:14:08,160 --> 00:14:13,200
very quickly, in a way that may
seem strange to us in Europe
208
00:14:14,400 --> 00:14:21,760
out of a citizens' petition
that collected a lot of signatures
209
00:14:21,760 --> 00:14:25,760
and then they pressured the
California legislature
210
00:14:27,120 --> 00:14:29,760
to create this law very quickly.
211
00:14:29,760 --> 00:14:34,560
So in a way it is not as
systematic, nor as clear, as the GDPR.
212
00:14:35,840 --> 00:14:42,240
It goes further in some
areas that don't always make
213
00:14:42,240 --> 00:14:45,920
sense to a consumer because
they are somewhat hard to understand,
214
00:14:46,960 --> 00:14:53,680
but they also have elements that
have nothing to do with the GDPR
215
00:14:53,680 --> 00:14:57,760
or simply don't go as deep.
216
00:14:57,760 --> 00:15:02,560
So they are a bit different, but I would
say that what matters is,
217
00:15:02,560 --> 00:15:05,440
in a way it is
easier for companies, I would say,
218
00:15:05,440 --> 00:15:10,320
to comply with the GDPR and then just say so,
and then add some
219
00:15:10,320 --> 00:15:17,280
important elements of the CCPA
and hope for understanding
220
00:15:17,280 --> 00:15:19,440
on the part of the
enforcement agency.
221
00:15:19,440 --> 00:15:24,160
I have to say that enforcement
is weak in California as well,
222
00:15:24,880 --> 00:15:28,400
so that is difficult of course, as a
privacy professional, when
223
00:15:28,400 --> 00:15:31,680
you are trying to explain to
companies that it is really important
224
00:15:31,680 --> 00:15:40,240
to follow the law, but
enforcement is so, yes,
225
00:15:40,240 --> 00:15:44,080
there are so many gaps and there are
so few people working on it.
226
00:15:46,080 --> 00:15:51,280
I think in the end, in California
it will be, as usual in the US,
227
00:15:52,640 --> 00:15:55,040
it will have to be fought for
in the courts, mostly.
228
00:15:56,000 --> 00:16:00,560
Okay, but as you just said, for
big tech companies, probably,
229
00:16:01,120 --> 00:16:05,040
it's not that hard and
they don't panic that much
230
00:16:05,040 --> 00:16:09,680
because they have big legal
teams and deep pockets, and
231
00:16:10,800 --> 00:16:13,600
you said you mostly work
with small and medium-sized businesses
232
00:16:13,600 --> 00:16:17,120
and I'm not sure if you also work
with non-tech companies,
233
00:16:17,120 --> 00:16:23,280
but what in particular are the challenges
that these kinds of companies face
234
00:16:23,280 --> 00:16:26,640
and have there been stories where they did something
235
00:16:26,640 --> 00:16:32,720
they didn't know was illegal and then
faced a huge fee and fine and so on?
236
00:16:32,720 --> 00:16:40,560
I mean, tell us about the difficulties these
kinds of SMEs typically face with the GDPR.
237
00:16:41,120 --> 00:16:47,680
Yes, well, as I mentioned,
I work as an independent
238
00:16:47,680 --> 00:16:52,080
privacy professional, both
before the GDPR became applicable
239
00:16:52,080 --> 00:16:57,440
and after, and I work
mostly with small companies in the US.
240
00:16:57,440 --> 00:17:02,160
I have seen how hard it was for
small startups outside Europe
241
00:17:02,160 --> 00:17:05,040
in a different legal
system with a different notion
242
00:17:05,040 --> 00:17:10,960
or traditional understanding of
privacy, to understand why
243
00:17:10,960 --> 00:17:13,280
and how they are exposed to the GDPR.
244
00:17:14,240 --> 00:17:18,560
My work, in a way, is both
legal and technical, but also cultural.
245
00:17:18,560 --> 00:17:24,480
So I translate mindsets in a way,
and what is important in Europe
246
00:17:24,480 --> 00:17:29,520
and why, and on top of that, for a
startup, of course, it costs time and money.
247
00:17:29,520 --> 00:17:33,760
And so it took them a bit longer,
I mean I'm just one professional
248
00:17:33,760 --> 00:17:37,520