Anna Delaney: Hello, I'm Anna Delaney, and welcome back to the ISMG Editors' Panel, where I'm joined by three of my ISMG colleagues to discuss the latest trends on the cybersecurity landscape. Very pleased to have our SVP of editorial, Tom Field, back in the studio. Also, Rashmi Ramesh, senior sub-editor for ISMG's global news desk. And our man in the EU, executive news editor Tony Morbin. Very, very good to see you all. It's been a while, Tom.

Tom Field: It has been a while, yes.

Anna Delaney: My question to you: are you flying right now? Is that your backdrop?

Tom Field: I almost have to look around to double-check. No, not at the moment. But I am in Atlanta, where we just concluded our Southeast Cybersecurity Summit yesterday. But yes, lots of travel over the past few weeks. And I'm sure over the course of this conversation, I'll share some of the gleanings.

Anna Delaney: And how's the hotel room lighting? It looks great to me, if that's where you are.

Tom Field: That's the only thing this hotel has going for it. Yes, that's all I'll say.

Anna Delaney: Speaking of color, Rashmi, you are very beautifully colorful today, let's just say.

Rashmi Ramesh: Thank you. It's the festive season in India. So much binge eating and shopping and socializing and holidays. What's not to love? Oh, and my background is a representation of one of the oldest flower markets in Bangalore.

Anna Delaney: Ah, fascinating. Lovely.

Tom Field: I think I actually have visited there, Rashmi.

Rashmi Ramesh: Oh, yeah? When was this? Was it called K.R. Market?

Tom Field: That's probably before you were born. Let's leave it at that.

Anna Delaney: Tony, sailing off into the distance?

Tony Morbin: Not particularly promising, though. It's the Titanic, which kind of gives you a hint of my view on transatlantic data flows and where they're going.

Anna Delaney: More to be revealed later. I am in the Tuileries Garden in Paris, opposite the Louvre, where I was last weekend. Paris is always a good idea. And I'm happy to report it's as lovely as ever.
And I did a lot of walking and cafe hopping in the glorious autumn sunshine. So life was good. Speaking of travel, Tom, you mentioned Atlanta. Tell us more about why you were there.

Tom Field: Well, we did have our Southeast Cybersecurity Summit yesterday. It's the first time we've been back in Atlanta for one of our summits in at least three years, maybe more. And Atlanta is always a place I enjoy going because, to me, it's got one of the tightest CISO communities that I've seen in the world. These are people that have worked together, they've been colleagues, they've been rivals, but they stay in touch no matter how the names on the business cards might change. They stay in touch with one another, support one another, and are great for networking and for just information sharing. Yesterday also coincided with the start of the baseball playoffs in Atlanta, where the Atlanta Braves are defending their World Series title, so we had a little bit of competition for attention. But the crowd that we attracted, both to the event and, of course, hybrid internationally, showed terrific engagement and great focus. I would say one of the highlights of our conversations was a panel on third-party risk, which of course rises to the top of everybody's priorities today, and we had good engagement from the folks on the stage as well as questions from the audience. In addition, we had a terrific incident response panel, mainly populated by healthcare executives. It was moderated by a healthcare security executive as well, so he was able to really get some good input from the panelists and have discussions about what works, what doesn't work, and where the holes are in incident response plans now. And there was a discussion as well, as much as people did or did not want to talk about it, on the conviction last week of former Uber CISO Joe Sullivan and the repercussions of that. That was a conversation you could find in lots of places yesterday, given the court decision was only a week old.
And the session that I particularly enjoyed: I moderated a panel with four members of the United States Secret Service, and they were talking about business email compromise, which, as you know, has been a common theme at our summits this year when the Secret Service has visited with us. They gave an update on the billions that are being lost every year to business email compromise, talked about the tactics, especially the automation now being used by the fraudsters to pull off these schemes, where organizations typically have gaps in being able to detect and respond, and the criticality of being able to respond quickly. And Rashmi, you know this from all the conversations that you have about crypto: once the money is stolen in a BEC scheme, it's dispersed so quickly to so many different places. If you're not on that immediately, your chances of recovering those funds just go down dramatically. So it was interesting, on one hand, to get this insight from these veteran Secret Service agents, but then to have the conversation that this is not what they thought they were getting into when they joined the Secret Service 15-20 years ago, when it was all about presidential protection and protecting elected officials. But there's a lot of good expertise within the Secret Service now, and I admire them going out and establishing these partnerships and being present in our discussions. I even had one of the Secret Service agents come to my subsequent lunch roundtable discussion on software supply chain security. He had nothing to offer; he just wanted to sit and listen and hear what was on the minds of the executives. So that's a brief overview of Atlanta.

Anna Delaney: It's always great for attendees when a member of the Secret Service joins a roundtable. It happened to me at our last summit in the U.S. I can't remember which state. It must have been New York.

Tom Field: I think you were in New York.

Anna Delaney: And it was just great, because you get that insight, which you can't get otherwise. So that was wonderful. Wonderful to hear it went so well. Congratulations. I was watching a bit of it online as well. So what did the members of our community want to know?
What sorts of questions were they asking? Where was their focus?

Tom Field: Well, I'd say first of all, we've finally gotten past the point where, when you came over here earlier in the year and we hosted the event in Chicago, for many people that was their first live event since COVID. I think we're past that now. People have gotten out, they've been in the community, they've been involved, they're starting to travel a lot more. So you're past the novelty of getting together. Now, it's the idea that people do want to be talking with one another. They want more access to the vendors as well, to hear their perspectives on the trends we need to be looking out for and on some of the new solutions. Everyone is faced with the same challenges: they do not have enough people, they do not have the skills that they need, and they're dealing with automated attacks with tools that likely aren't automated. So they're looking for a way to just enhance that detection and response. These are the things that you're commonly hearing people talk about - managed services, software supply chain security.

Anna Delaney: Yes, similar themes to other cities and countries. Whether it's Bangalore, Paris or London, you do hear these similar challenges. Was there anything fresh or unique in terms of solutions that you heard yesterday, to help solve our cybersecurity challenges today?

Tom Field: Oh, that's a question I wish I could answer positively. No, I didn't hear anything particularly new. It's the same challenges, with organizations trying to take some different approaches. And these things move slowly. But here's one thing I would say was a little bit different: hearing a little bit less about the potential repercussions of Russia's war in Ukraine. I think there's less concern now about something directly coming out of that, because there's too much else happening.

Anna Delaney: Yeah. Well, I'm looking forward to joining you in Phoenix very soon.

Tom Field: Phoenix in two weeks. Look forward to seeing you there.

Anna Delaney: Yeah, I'm going to have to dress for desert weather, which we like.
Rashmi, another major DeFi bridge has been exploited, and Binance's Ethereum-compatible blockchain is the target. Do share the latest from the crypto world.

Rashmi Ramesh: So, like I say, another day, another hack. Binance runs a blockchain called the Binance Smart Chain. A hacker exploited a vulnerability in the cross-chain bridge that runs on it, which is called the BSC Token Hub, to make about $570 million. It seems pretty run-of-the-mill so far, but here's where it gets interesting. The hacker didn't actually steal the crypto from the blockchain in the traditional sense. What they did was exploit a bug that allowed them to mint new cryptocurrency. So Binance, to its credit, suspended the entire chain so that the attacker couldn't move money off the chain and cash out later. But this took a while, because it's not a 100% centralized platform. Binance does not actually have full control of the blockchain. It's run by nodes called validators, who basically have to approve these transactions, and Binance has about 26 validators across the world. So it took a few hours. But here's how the hack happened: the attacker exploited the flaw and minted two million of the company's native token, which was at the time valued at about $570 million. And some experts who analyzed the attack said that the attacker could have walked away with a cool half billion and then flooded the market with the extra tokens to reduce their value for legitimate users. But there's another twist still. Something did not go as planned. Elliptic, which is a Web3 security company, says that the attacker began to move the currency off the Binance chain, but they moved a majority of it to platforms that are centrally controlled. So all Binance had to do was contact these companies and ask that the funds be frozen. Yay, centralization. I'm going to get so much hate mail for that. But Elliptic says that the attacker was eventually able to get away with only 10% of the funds, which is still a lot of free money. Binance did not confirm this, but it did release an emergency patch with a new software version and restored services. So things are back to normal, until the next attack.
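To make the bridge-minting mechanics concrete, here is a minimal, hypothetical Python sketch of the bug class Rashmi describes: a bridge that credits a withdrawal on the strength of a proof it never properly verifies. This is not Binance's code - the BSC Token Hub flaw reportedly involved a forged IAVL Merkle proof - and every name and the deliberately broken check below are illustrative only.

    # Hypothetical sketch of the bug class behind "mint" exploits on
    # cross-chain bridges. Not Binance's actual code; names are illustrative.

    class ToyBridge:
        def __init__(self, trusted_root: bytes):
            self.trusted_root = trusted_root  # commitment to real deposits on the source chain
            self.minted = {}                  # balances created on the destination chain

        def _verify_proof(self, proof: bytes) -> bool:
            # VULNERABLE: checks only the proof's length instead of recomputing
            # the Merkle root and comparing it against self.trusted_root.
            return len(proof) == 32

        def withdraw(self, account: str, amount: int, proof: bytes) -> None:
            if not self._verify_proof(proof):
                raise ValueError("bad proof")
            # Tokens are minted here rather than debited from any victim's
            # wallet, which is why no individual user "lost" funds and why
            # halting the whole chain was the way to stop the cash-out.
            self.minted[account] = self.minted.get(account, 0) + amount

    bridge = ToyBridge(trusted_root=b"\x00" * 32)
    bridge.withdraw("attacker", 2_000_000, proof=b"\xff" * 32)  # forged proof is accepted
    print(bridge.minted["attacker"])  # 2,000,000 tokens created out of thin air

A real bridge would recompute the Merkle root from the submitted proof and reject anything that does not match its trusted commitment; the sketch only shows why a broken check turns a bridge into a money printer.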
Anna Delaney: Yes, it's still $100 million in assets, isn't it? That 10%. So Binance says it will share the lessons from the incident and implement security measures to shore up cross-chain vulnerabilities, as you've said in your article. Do you get the sense that other exchanges are taking note and doing the same?

Rashmi Ramesh: I'm sure. DeFi is a very new and very niche segment, so a lot of it is something that people and companies are still discovering. And because it's grown so incredibly quickly, it's been a little difficult for people and companies to take a step back and see where the flaws are and how to fix them. So I know for a fact that a lot of Web3 companies, a lot of Web3 security companies, and the DeFi platforms themselves are trying to do the best they can. But cybersecurity is not something that you can master. It's a continuous process.

Anna Delaney: Well said! Would you say Binance responded in an efficient manner? Were you pleased with what you saw?

Rashmi Ramesh: I wouldn't go as far as saying pleased. But I think it did do a quick job, as quick as a decentralized platform possibly could, contacting the validators and all of that. But we'll see what really happened when the company releases its postmortem report, which is expected to come sometime this week.

Anna Delaney: Okay. We'll look forward to discussing that with you. Thank you, Rashmi. Tony, moving on to the EU-U.S. data flow agreement. It's all about the Titanic. So, U.S. President Joe Biden signed an executive order last week setting up a new legal framework for personal data transfers between the EU and the U.S. Hot stuff. This seems to have received a mixed reaction in the privacy world, would you say. Can you talk us through the proposed changes?

Tony Morbin: Okay, well, we're often told about how data is the new oil, and flows of data between the U.S. and EU did underpin services worth $264 billion in 2020. And the EU estimates that a loss of cross-border data flows on exports from data-reliant sectors would end up with a reduction in EU gross domestic product of 330 billion euros annually. So it's a big deal. But there's a fundamental problem, in that the EU, and specifically its European privacy advocates, are objecting to the way the U.S. handles the data of non-U.S. citizens.
It really goes right back to the revelations of NSA contractor Edward Snowden that the U.S. was effectively spying on everyone. Austrian privacy activist Max Schrems successfully brought down the Safe Harbor agreement, the rules under which data transfers had been conducted since about 2000. So a new EU-U.S. data privacy framework - Privacy Shield - took its place, and that had increased protection for the data privacy rights of Europeans. But that too was found to be inadequate, and eventually it was thrown out as well, again by the European Court of Justice. And as you say, now U.S. President Joe Biden has issued an executive order enhancing safeguards for United States signals intelligence activities. He's also signed a national security memorandum establishing new safeguards on signals intelligence gathering, and he's creating a tribunal within the Justice Department to deal with redress for complaints, which is intended to resolve any EU-U.S. data transfer issues when people object to the fact that they've been surveilled, or whatever it might be. Now, a senior administration official said at the launch: "We're confident that this addresses the concerns expressed in the court's opinion, but obviously, we can't predict the outcome of any legal challenges that might occur in the future." You kind of can predict. The first indication from Schrems - the guy who scuppered the two previous agreements - was his comment that "at first sight, it seems the core issues were not resolved, and it will be back to the CJEU sooner or later." Now, the problem for Europeans is that the U.S. Fourth Amendment enshrines a right to privacy and requires probable cause and judicial approval for any wiretap, but this only applies to U.S. citizens or permanent residents. In Europe, the CJEU - the Court of Justice of the European Union - requires that all surveillance must be targeted, and there must be judicial approval or review under the EU's Charter of Fundamental Rights. Schrems' view is that the key difference is that while the EU treats privacy as a human right that applies to any human, the Fourth Amendment applies only to U.S. citizens and permanent residents.
And he says that, in the view of the U.S., Europeans have no privacy rights, as U.S. law allows surveillance that would be illegal under the Fourth Amendment so long as no Americans are targeted. He goes on to say that it seems the EU and the U.S. agreed to copy the words "necessary" and "proportionate" into the executive order, but didn't have the same agreement on what the legal meaning was going to be. If the words did have the same meaning, the U.S. would have had to fundamentally limit its mass surveillance systems to comply with the EU understanding of what is proportionate surveillance, and the intelligence-gathering limitations agreed to by the U.S., in his view, just don't go far enough. The second issue is: who's responsible for redress? A senior administration official said what you'll see with this tribunal is a far more independent tribunal, with the backing of the attorney general when it comes to enforcement. Whereas Schrems described the tribunal and the Justice Department as simply not a court, saying the charter has a clear requirement for judicial redress, and just renaming some complaints body a court doesn't make it an actual court. This particular tribunal has already said that it will respond to any complaint, effectively no matter what your argument or case, by saying that the review either did not identify any covered violations or the Data Protection Review Court issued a determination requiring appropriate remediation. So that's kind of the situation with the U.S. and EU. In contrast, also this month, the U.K. brought into force an agreement with the U.S. on access to electronic data for the purpose of countering serious crime - the Data Access Agreement. The U.S. government says the agreement will maintain the strong oversight and protections that our citizens enjoy and does not compromise or erode the human rights and freedoms that our nations cherish and share. But the purpose is to allow information and evidence held by service providers in each country relating to the prevention, detection, investigation or prosecution of serious crime to be accessed more quickly than ever before. So it remains to be seen whether this also will impact U.K.-EU data transfers.
We're talking really big business, really important stuff. And it doesn't appear as though we're all working on a level playing field at the moment.

Anna Delaney: Well, that's a very thorough overview, Tony. So I'm guessing by your backdrop, we're going to hit an iceberg soon.

Tony Morbin: Well, yeah. I mean, the EU wants to approve Biden's new regulations, and it probably will do, but then that's subject to objection. And there will be an objection from Schrems. And on the basis of what you can see so far, it looks as though that objection will be upheld. So we're probably looking at hitting the iceberg in 2023.

Anna Delaney: Long road ahead.

Tony Morbin: A while to go. And ultimately, it does have to be resolved, because it's just too important. We've got people like Google and Meta saying they're going to have to pull out of Europe if they are not able to get this sorted, because it's to do with where they store data. It effectively means you can't be a cloud provider.

Anna Delaney: Yeah. And all the other organizations, of course, some of the smaller and medium-sized organizations too. Yeah.

Tony Morbin: And they've had interim kinds of agreements, with sort of country-by-country contracts and so on, but it makes everything a lot more complicated.

Anna Delaney: Thank you, Tony. Well, final question. You have created a new phishing awareness training program. What would you call it? Tom is ready.

Tom Field: I have an idea.

Anna Delaney: Go on.

Tom Field: Okay, just like KnowBe4 has got Kevin Mitnick as sort of its celebrity spokesperson and chief hacking officer, I am going to enlist one of your countrymen, Anna, to work for my organization, and it's going to be called FPS - For Phish Sake.

Anna Delaney: Very good. Rashmi, any thoughts?

Rashmi Ramesh: I'm going to be very cliche and name it Don't Take the Phishing Bait, with a fish and a red line crossing it as my logo.

Anna Delaney: I love it. You've even got the graphics in place. One step ahead. Tony, no icebergs in this one?

Tony Morbin: No. I mean, I am simply going to go for Gotcha, because they're going to get you.
People will fall for it. And my favorite one was the one where the email went out saying: we've had a thief stealing things from the fridge in the kitchen, and we have a camera there to check and catch the next person. And everybody clicked on the camera to see who was in the fridge.

Anna Delaney: Food always gets them. So I'm going to call my awareness training Lumpsucker Awareness Training. I'm going quite literal here. Apparently there is a fish called a lumpsucker, but it's actually quite cute looking. The "lump" comes from its round shape, and "sucker" from its ability to suction onto rocks and kelp. So a bit of deception here: it looks cute, the name sounds ugly. Suitable phishing deception, all, you know, melded into one. Well, thank you very much, everybody, for your creativity and your great insights. I've really enjoyed this, Tom, Tony and Rashmi. Thank you very much.

Tom Field: Thank you.

Tony Morbin: Thank you.

Rashmi Ramesh: Thanks, Anna.

Anna Delaney: Thank you so much for watching. Until next time.