Tom Field: Hi there, I'm Tom Field, senior vice president of editorial with Information Security Media Group, talking today about the state of the cybersecurity industry. Who else would I ask about that? My old friend, Alberto Yépez. He is the co-founder and managing director of Forgepoint Capital. Alberto, this is the first time in three years we've been able to do this face-to-face.

Alberto Yépez: It's a pleasure to see you again. And I'm so happy with the success ISMG is going through. You have about 300 people now?

Tom Field: All here.

Alberto Yépez: Oh, here, really?

Tom Field: The company has grown tremendously.

Alberto Yépez: And you are covering India, Israel, of course the Americas and ...

Tom Field: America as well, yeah.

Alberto Yépez: Europe.

Tom Field: Nine different brands in the company; we've come far. But so has the cybersecurity industry since the last time we sat down to talk. We've seen ups, we've seen downs. Since this year started, we've seen some companies leave the market, we've seen some layoffs, we've seen some consolidation. Okay, give me your diagnosis here: What's the state of the cybersecurity industry? And I know you're going to be optimistic.

Alberto Yépez: I have to be, because I think this is the best time to start a company. Think about the downturns: if you correlate when Palo Alto, CrowdStrike and Zscaler got created, it was around the 2008 downturn.

Tom Field: Sure.

Alberto Yépez: And, you know, obviously there is less competition because there's less noise. If you go back maybe 12 months, there were so many companies doing the same thing. A lot of it was marketing, and you couldn't recognize which ones had the real technology solution. So there are some good things about a market that, just by itself, has to correct. Now, Forgepoint, to remind you, is one of the largest, if not the largest, early-stage venture capital firms investing in cybersecurity. We have invested in 46 companies to date, and we have 35 active portfolio companies.
Then, like you, we've grown the team - not as big, but we have 25 members on the team, 12 of them investors, because imagine having to invest in all these different technologies; you cannot be an expert in everything. So we're really privileged and feel honored to be working with amazing entrepreneurs to create the next generation of companies that protect the digital future. That is our mission: protecting the digital future. Now, talking about the market at a macro level, you see we're moving toward a recession and capital is getting more expensive. SVB, you know, came through some challenges, and now there's an opportunity for other banks to serve the venture capital community and venture-backed companies. But all in all, it's an ecosystem that is very resilient, right? If the demand weren't there, these companies and this market would not go anywhere. And as you can attest, this is one of the top priorities in terms of budgets. When you talk about technology budgets with anybody in the industry, they will tell you that some of them may be going down, but cyber is the only budget that is going up, because boards of directors are now getting involved. As you know, the SEC is beginning to get very active in trying to - I wouldn't say demand, that is a strong word - require it. And why? Because as a board member in a public company or a private company, you will be liable if you don't provide the right guidance for companies to do that. So all that generates demand, and therefore there's an opportunity to create innovation. And then you combine that with the cyberattacks and the cyberthreats, which continue to get more sophisticated, using AI to make them even more difficult to detect. So you see a perfect storm brewing, because you have the need to avoid those attacks, you have regulation that is driving it, and you also have the ability to try to drive answers to the new emerging technologies that we have.

Tom Field: Well, you make a good point; you talked about 2008. And certainly, budgets are tight and staffs are lean, as they were in 2008. The attack surface is infinitely larger than it was in 2008.
The adversaries are infinitely more focused than they were in 2008. How does that foster an environment for innovation when you're constantly just trying to keep up with what's going on around you?

Alberto Yépez: Well, that's a great question. I'd say: always ground yourself in the needs of the customer. Regardless of how all these things change, they need to defend the banks, the water processing plants, the nuclear plants, whatever. Therefore, they need to look for answers, and therefore there's an opportunity to create new solutions in this changing environment. So the fostering of innovation comes from the attack surface and the sophisticated attacks - the fact that these are real and they're impacting companies. Think about the growth of cyber insurance, for instance. As a board member, you need to be able to ask: Do you have cyber insurance? Do you have the right coverage? Even so, over the last couple of years the insurance industry lost a lot of money, because the cyber policies they underwrote were not profitable. Hence the use of data analytics and predictive models to make sure you can underwrite cyber, and to try to help companies at least get the right processes in place to defend against the adversary. So I think it's more of a need than a nice-to-have.

Tom Field: Let me ask you about some themes. Certainly over the past year or two, we've seen distributed work, we've seen cloud migration, we've seen digital transformation. What are the new areas of investment you see opening up? And I bet that generative AI is going to be near the top of that list.

Alberto Yépez: It will be. But let's step back. The pandemic taught us that we could be resilient and move very quickly to enable that distributed workforce - and not only the distributed workforce; our kids were going to school on a remote basis. But you know, the decisions that were made two years ago to protect the distributed workforce were very tactical. So now many companies are not just looking at the next generation; they're asking: architecturally, do we have the right solution?
That's why you keep on hearing about zero trust again - architecturally sound solutions that can actually help this distributed environment. And it's here to stay; it's not going to go away anytime soon. In that regard, you see the next-generation companies beginning to offer that distributed infrastructure for people to do that. But you said it correctly, because it now includes the cloud, which before was maybe beginning to get included, but wasn't there yet. And when you talk about cloud, there's not one, there are three or four - to name a few, Google, Microsoft, Amazon, etc. And then you have the private cloud and the public cloud. All of this is dictated by the needs of the applications that people use. So the need for protecting the distributed environment and enabling the transformation of businesses, I think, once again creates new opportunities for investment. So zero trust, and the whole area of protecting identities across multiple clouds. Imagine: before, suffice it to say, there was the IBM stack, the Oracle stack and others, and we felt comfortable because we had stacks to manage - but across them there were silos. The same thing is happening now: there's a silo for identity for Google and a silo for identity for Microsoft, because they want to lock you in. But as a consumer, you want to work with all of them, and you have to work with all of them. So that's a whole area - the whole area of identity - because you're trying to protect information that somebody is using and consuming. How can you provide the appropriate controls and the right safety?

Tom Field: Now, we have a new U.S. National Cybersecurity Strategy, and we've already talked about that. There's a sense that there's more regulation coming. What role do you see the U.S. government continuing to play in improving critical infrastructure protection and fostering the type of innovation you've talked about?

Alberto Yépez: Wow. Loaded question. Two things on the cyber strategy. The thing that got me nervous is the shifting of liability to the providers.

Tom Field: Yes.
Alberto Yépez: Everything else is great, because we've got to get our act together and have all the right things in place.

Tom Field: Okay, everyone else says it's a great thing to make the providers liable, but it makes you nervous. Why is that?

Alberto Yépez: Well, you know, how ... are they prepared? How are they going to assess that I made my best effort to do that? The nervousness is that, you know, I think there's a lot of work to be done.

Tom Field: Sure.

Alberto Yépez: People talk about software assurance; they talk about the supply chain. How do you know that NVIDIA or, you know, SolarWinds and Zoom didn't have any bad intentions with those backdoors that existed and were created? If I'm simply going to shift the liability, those companies will not exist. So they need to show the proper hygiene and the proper investment, so that they can say: I made my best effort, therefore I shouldn't be held liable. So stepping back, the thing that makes me a little bit nervous is the shifting of liability. Not because it's not the right thing to do - it is the right thing to do, because you're going to incent companies to actually do the right things. On the consumer side, they're going to demand: What are you doing to test your applications to make sure they don't have backdoors? On the other hand, you have the vendors having to take it really seriously and invest to do that. But that said, think about the cyber strategy: I was in Japan six weeks ago, and I met with the Japanese cyber leadership, and they were waiting for the U.S. cyber strategy. We have a huge lead, and people oftentimes mirror what we have.

And I think we're really taking it very seriously when you talk about critical infrastructure. There are all these next-generation things that sometimes seem a bit mundane to us. You know, 5G was something where we didn't really take a leading role in the world, so we're trying to figure out what the role of the U.S. is in 6G and beyond. The other thing is the use of AI to protect critical infrastructure. And there's been a lot of hype in the discussions about AI.
AI has been around for a long time, right? In expert systems and machine learning - and, you know, people are talking about cognitive AI and all that. But the reality is: How can you enable the responsible use of AI to automate tasks, to try to tease the signal from the noise, and so on? There are a lot of different moving parts. So the strategy gives you a framework for how to think about it, with investment from the public sector and the private sector, and by incenting the innovation community to do that. And the world is listening, because they want to follow our lead.

Tom Field: So given all this context you've talked about, what are the areas of investment you're bullish on today?

Alberto Yépez: The whole multi-cloud migration. Everyone talks about shift left, which, you know, we've talked about, which is shifting toward the developer to build secure code. But that's kind of - I wouldn't say passé; people are still working on secure code. Instead of shifting left, we're calling it something called shift up. Shift up means you're going to shift to the cloud.

Tom Field: Right.

Alberto Yépez: Into the multi-cloud environment. So that whole area is still nascent - early innings. A lot of what we knew in our normal environment, all the way from physical environments and firewalls and all that, is moving to the cloud, to multi-cloud and hybrid environments. That's one area we're going to be talking about for the next five to 10 years. Big companies will emerge, and some of them are not going to be able to make it. The other area is, you know, AI. How do you enable the responsible use of AI? It's not about containment; it's all about enabling. And, you know, I have a thesis on responsible AI. Obviously, we've been using it in automated systems and trying to automate some basic human tasks. But the key is to make sure that we don't get in trouble. You've seen a lot of news about companies in Asia that were using OpenAI and exposing intellectual property and PII, the personal information of the people using it, right? This is not the way to do it.
Tom Field: Right.

Alberto Yépez: The way to do it is: okay, there are going to be these walled gardens. For instance, you have a massive database of videos and interviews. And if you use AI to ask: How many times did I talk to Ron Gula - remember, we talked about that the other day - or to Alberto? And what things did they say? It will probably do it faster than you and I or the whole team could. So there are ways in which AI can be used in a very responsible way. You're going to see walled gardens where the information you're using in those models is verified. The problem is when the information is not verified, so people can introduce misinformation, and then you draw conclusions that are the wrong conclusions. So one thing is: How do we look at this vertical orientation and these walled gardens in the use of AI automation to drive that? Because you're seeing the bad guys already using ChatGPT to increase the effectiveness of their ransomware attacks. All these phishing emails and all these BECs - remember, you could detect them really quickly because their English wasn't very good, and the punctuation wasn't very good. Now, through their use of it, it looks really good. It looks like it would be me sending you an email: "So, Alberto, fine, this is what happened." So we're beginning to see the bad guys using it. How do we turn it around so that we can detect some of those issues? That's one area where it's not like we're waiting for something to happen; people are already using it for the wrong reasons. So as corporations, you're beginning to see who will be the most qualified individuals that can really take that leap. At the end of the day, it's data - data analytics, data processing, data automation. So now you have many companies - very large banks, the government - setting up these joint AI centers. The same way the migration to the cloud and digital transformation had an impact, AI is going to enable us to do more things better and faster, but we have to be responsible in its use, in areas like observability or containment. Just to draw a little parallel to open source:
Everybody, when they used an open-source routine, thought it was a good thing to search an index or reuse it. But there were backdoors. The same thing will happen in AI: you're going to have these models, and who can tell you that those models don't have a backdoor? That's another area - a very tactical area. Just to give you an example of areas we're focusing on when trying to invest: the observability and the containment of AI models, so that we don't get surprised.

Tom Field: Terrific overview. I hate to wrap this up, but we've come to the end of our time here. Before I do, though: as we go into the second half of this critical year, what's your message to the investors that support your efforts, the companies you invest in, and the customers who rely on these cybersecurity solutions?

Alberto Yépez: I would say, you know, this is where the companies that endure, the companies that have resilience, are going to come out stronger, because many of them had to cut back expenses and everything else. This is the best time to actually start a company and establish the company, if you are focused on the right things, which is customer success. And I would say, you know, our lives are continuing to transform digitally. The way we think about it is protecting the digital future, and the best is yet to come.

Tom Field: And shift up. You've got to trademark that.

Alberto Yépez: And shift up. Actually, one of our companies, called Uptycs, has actually trademarked that.

Tom Field: Not surprised. It is up to Uptycs, since they're the ones that are driving that effort. It's been a pleasure. Thank you so much for your time.

Alberto Yépez: Yeah, thanks again.

Tom Field: Once again, we've been speaking with Alberto Yépez. He is the co-founder and managing director of Forgepoint Capital. For Information Security Media Group, I'm Tom Field. Thank you for giving us your time and your attention today.