Anna Delaney: Hello, and welcome to the ISMG Editors' Panel on day two of RSA Conference 2023. My name is Anna Delaney, and I'm joined by my colleagues, Mathew Schwartz and Tom Field. We survived day one. How was it for you?

Tom Field: And we're going to do day two.

Anna Delaney: Yes. Well, I hope so. So tell me about day one. What's the feeling so far from the conference?

Tom Field: It's exhilarating to be here and to see people again, whether it's sitting in here and talking with our guests, or standing out in the hall, having people come by whom you haven't seen for two or three years. It's just good to be back in the community. And yes, we are certainly seeing some common themes in the discussions I've had. AI has come up, and there's also been a fair amount of talk about cloud security and OT security. I think those are the highlights of the conversations I've had so far.

Anna Delaney: Tom, for ages people have been saying AI is the buzzword, ML is the buzzword. Are you hearing concrete takeaways about generative AI?

Tom Field: Not specifically. I would say what I'm hearing is backlash against the marketing buzz that has been ChatGPT, and people here trying to throw a little bit of cold water on it and get down to the real conversation about how this should impact organizations, instead of just the rush to adopt or the rush to ban. Let's have a rational conversation about it.

Anna Delaney: And overall, is the feeling positive for defenders?

Tom Field: I think so.

Anna Delaney: Mathew, feedback from yesterday, and the feeling of being back at the conference ...

Mathew Schwartz: I'll second what Tom said about the sense of community. I think there is almost a sense of relief for people to be back at an event like this, where you're not just dialing in virtually, so to speak, but actually getting to see so many people you haven't seen in so long. There's a wonderful sense of energy and exhilaration. It's great being back in our studio here - love it. Also, over at Broadcast Alley, there's a wonderful buzz. As you know, we're seeing people stream through there.
And there was a good showing, I thought, on the first day: a lot of energy, people seeing what's going on, and some great interviews that we had.

Anna Delaney: And apparently in the conference hall, lots of activity and buzz.

Tom Field: That's right. There are people coming in here; that's what they tell us.

Anna Delaney: Were there any takeaways from your interviews, the sessions you saw?

Mathew Schwartz: Yes. I didn't do any sessions yesterday, but I had some great interviews looking at how we defend better against attacks, and a lot of level setting, if I can use that term, in terms of why we're still seeing all these attacks when we have all this investment. We're doing a good job, I think, and several of the people I spoke to said we need to emphasize that fact. We're not being static; we are bringing a better response to bear. But the attackers are very savvy. They're very innovative. And we see that in terms of ransomware attacks. Of course, that's not all we're seeing, but a lot of what is being seen is what I'm continuing to hear about from organizations such as Sophos and IBM Security, which are working with organizations and helping them respond to incidents - in the case of IBM, doing managed detection and response as well - and just seeing what's going on. Obviously, as we know, cybercrime is hugely profitable, unfortunately, and the attackers are finding new ways to get in. So how do we deal with that? It's an ever-present theme, and the answers are different this year.

Anna Delaney: Tom, you spoke with Alberto Yepez. He always gives you a sort of forecast of the year ahead, the industry forecast. Did you get a sense of where we're going from what he said?

Tom Field: I'm going to say I'm a parent, so I love all my interviews and all my interviewees; they're all equal. But I would single out Alberto for some of his insight, because we talked about how now, in economic uncertainty, is the time when you build innovation. This is the time when companies start and rise. And he reminded me that we saw that in 2008 - companies such as CrowdStrike came out of the economic downturn of 2008.
And we can look forward to that in the uncertainty we have now. I buy that, because the cybersecurity concerns haven't diminished; they've increased. There's a national and global urgency to respond to the threats and the threat actors that we see. So I buy into that, and I think there's much to look forward to in the year ahead. It might not come from companies whose names we know today, but it might come out of companies that are born today.

Anna Delaney: So were there any surprises yesterday, anything that stood out as, okay, that's different?

Mathew Schwartz: Well, the lack of ChatGPT in my discussions. It was a welcome surprise. I mean, there was a little bit of discussion. That's luck, I suppose; the dice weren't thrown that way. But there wasn't too much discussion of AI and ML either, and when there was, it was possibly presented not as a savior, but as a tool that is growing in usefulness - though it's not going to do everything that we do.

Tom Field: I'll give you a little bit of tension that came out of a couple of interviews - not together, but just in the dialogue ahead. Talking with Alberto Yepez about the National Cybersecurity Strategy, when it comes to the tenet of putting more accountability on the industry, that makes him a little bit nervous, because how do we do that? But then, talking to Eric Goldstein of CISA about the same topic, it's something he's very bullish about: How can we not do that?

Mathew Schwartz: Yeah, break the regulations. Perhaps!

Tom Field: This is the place where I want to be having some of those conversations right now. I want to be talking with some of these industry leaders about it. The sentiment is that we're going to put accountability on you, just like Firestone can't sell unsafe tires for a car, and we can't put airplanes up in the sky that have any sort of issues. So how can we put software out into the ecosystem? I'd like to have some of these conversations, and this is the place it's going to happen this week. And I think some of these tensions are going to emerge.

Mathew Schwartz: Yeah. Adam Isles, the principal at the Chertoff Group, made a great point.
He said you can look at the sense of urgency that is out there about all of these cybersecurity issues we're facing by counting all of the regulations or the government efforts aimed at dealing with it. But he said that creates some fatigue, of course. How do you reconcile all these things? You've got the Biden administration's National Cybersecurity Strategy, for example. You've got a lot of things coming out of CISA now, and that's just the United States, of course. That is adding a little bit of pressure on cybersecurity professionals. But this is what happens as we get to where we need to get to. So I thought that was a great point.

I also had a really fun discussion with Winn Schwartau, a longtime deep thinker - the gentleman who coined "electronic Pearl Harbor," for example, and who had a seminal book on information warfare that came out, I forget, 20 years ago perhaps, and correctly forecast a lot of the issues we're dealing with today. It was a really fascinating deep dive into what he says is the threat posed by the metaverse - not just the Facebook VR headset kind of kerfuffle. He has a more expansive definition: anything that can alter the way you interact with, think about or feel about things - systems that augment the reality you see around you, for example. I'll leave it at that. But his point is that we're getting better and better technology, to the point where it can make us feel like what is reality is not reality. What happens when people get in there and start to mess with that? What sorts of risks are we facing? What if you've got these sensors on you that give some indication of how you're feeling or what your body's doing - how can they predict how you're going to react to different situations? How could that be used against you? So, some forward-thinking deep thoughts. But it's fun to go there.

Tom Field: There's a buzzword for it. It came out of the conversation with Alberto Yepez. You've all heard the term "shift left." Well, now, in the age of cloud migration, the term is becoming - and apparently it's trademarked - "shift up." So there you go. The T-shirt will be issued.
Anna Delaney: So let's look at today. What are you looking forward to?

Mathew Schwartz: I am really looking forward to the Cryptographers' Panel, and I'll go to my notes here so I don't get anyone's names wrong. Whitfield Diffie is going to be moderating; a seminal figure. That should be fun, because he's usually the cantankerous one, in terms of the good cop, bad cop.

Tom Field: The Mount Rushmore of cryptographers. He's right on it.

Mathew Schwartz: He's right on it, definitely. We've got other wonderful people as well. Radia Perlman, distinguished fellow at Dell - she's great, and she's been on it before. And one of my favorites, although I don't play favorites, Adi Shamir, the S in RSA. Again, just wonderfully outspoken in past years on things like blockchain. I remember Adi saying that 95%, maybe it was 98%, of what they're proposing to solve with the blockchain can be solved more easily and better using other things. So they're not married to the buzzwords. I think we're going to be hearing about cryptocurrencies, for example, quantum computing, no doubt, and ChatGPT. And I just love the breath of fresh air they bring to what can be an often awfully buzzwordy event.

Tom Field: And I would say, along the lines of fresh air, we're going to be joined in the studio today by Art Coviello, the former chair of RSA - the company. In his retirement from RSA, he's become very outspoken, and I think very circumspect on the industry, and he holds nothing back. So it's always a terrific conversation. And along the same lines, we've got Hugh Thompson of the RSA Conference, who will be coming in. He's been a big part of the programming this year and can talk about that, and maybe about the future of this event.

Anna Delaney: Very good. Well, I look forward to watching these interviews and hearing back from the Cryptographers' Panel.

Mathew Schwartz: Say that 10 times quickly, Anna.

Anna Delaney: Well, thank you. And thank you so much for watching. Until next time - until tomorrow.