
Cybersecurity in 2020: The rise of the CISO

Data breaches taught companies hard lessons in 2019. One key takeaway: they need a chief information security officer, says Forrester’s Stephanie Balaouras.

Feb 24, 2020
Sponsored Content

Produced in partnership with Microsoft Security

As the new year (and new decade) begins, one thing is certain: cybersecurity will continue to have an increasing impact on business, for better or worse. In this episode, we hear from Stephanie Balaouras, a cybersecurity expert who has spoken to thousands of customers over her 15 years at Forrester Research. She is the vice president and group director of security and risk research, as well as infrastructure and operations research.

Balaouras makes the case that all businesses should have a chief information security officer, or CISO, as the world of cyberthreats becomes more intricate and perilous. "Even companies that have a CISO should take a hard look at how high in the organization they report," Balaouras says. "Do they have the right budget? Do they have enough staff? Have you given them the right span of control?"

Balaouras also reviews some of the biggest cybersecurity trends in 2019 and makes predictions for 2020.

Business Lab is hosted by Laurel Ruma, director of Insights, the custom publishing division of MIT Technology Review. The show is a production of MIT Technology Review, with production help from Collective Next. Music is by Merlean, from Epidemic Sound.

Cybersecurity isn’t only about stopping the threats you see. It’s about stopping the ones you can’t see. That’s why Microsoft Security employs over 3,500 cybercrime experts, and uses AI to help anticipate, identify, and eliminate threats. So you can focus on growing your business, and Microsoft Security can focus on protecting it. Learn more at Microsoft.com/cybersecurity.

Show notes and links

Forrester Research: Cybersecurity

“A CISO’s Guide to Leading Change” by Jinan Budge, Forrester Research

Stephanie Balaouras

“The Need for Complete Cloud Security,” an interview with Stephanie Balaouras, on YouTube

Full transcript

Laurel Ruma: From MIT Technology Review, I'm Laurel Ruma and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace.

Security threats are everywhere. That's why Microsoft Security has over 3,500 cybercrime experts constantly monitoring for threats to help protect your business. More at microsoft.com/cybersecurity.

Our topic today is cybersecurity and more specifically the role of the chief information security officer, the CISO. We'll also review cybersecurity news from 2019 and look ahead to cybersecurity trends for 2020. One word for you: Deepfakes. My guest is Stephanie Balaouras, who is a cybersecurity analyst and has spoken with thousands of customers in her nearly 15 years at Forrester Research. Stephanie is the vice president and group director of security and risk research as well as infrastructure and operations research. Stephanie, thank you so much for talking with me on Business Lab.

Stephanie Balaouras: Thanks.

Laurel: So just to start, in 2017 Forrester published a report by Jeff Pollard, a member of your team, about the career paths of CISOs, chief information security officers. And I particularly like talking about this role because it is so new to the C-suite in business. Citibank appointed the first CISO ever in 1995, so that's really recent history. But if every company is also a technology company, why doesn't every company have a CISO?

Stephanie: If you look at the history of other roles, like the first CMO, chief marketing officer, which I think was in the 1950s, you still had companies 20, 30 years later without a CMO. So when these emerging roles first start out, it does take some time for them to become the norm. But I will say every company really should have a CISO. Publicly traded companies are required to have a CISO. But what we will often find is, depending on the size of the company, sometimes they'll get away with calling the CIO also the CISO, or some other IT executive the CISO as well. So that's pretty common with smaller companies—they'll get away without having a standalone role.

But what you'll also find is if they have a breach or some sort of major cybersecurity issue or even a major compliance violation that's data security-related, the first thing that they'll do is name a dedicated CISO. And then even companies that have a dedicated CISO, when they have a breach, a lot of times what happens is they realize the CISO didn't report up high enough in the organization or didn't have the right span of responsibilities or enough budget or enough people. So then they'll fix that. Naming a CISO should be a requirement, but I would say even companies that have a CISO should take a hard look at how high in the organization they report. Do they have the right budget? Do they have enough staff? Have you given them the right span of control?

Laurel: Because that's an expensive fix, isn't it?

Stephanie: Yes, exactly.

Laurel: Only after an attack do we look at the roles and responsibilities in a new light, in a more responsible light?

Stephanie: Exactly.

Laurel: So the perfect CISO is a bit of both: a businessperson who can talk directly to the CEO and explain the necessity for security and risk mitigation, but who can also talk to customers, and perhaps other employees as well, about security and how that role is important to the company. Where are these people coming from? Where are they getting all of this education?

Stephanie: When we looked at CISO career paths, we did find most of them did come up through the security ranks. So typically they did start off as security professionals. They gained decades of experience in the role. But what we often found is the majority of them often would go back for graduate degrees, and they would actually go after a business degree. They would often get MBAs, and it was because they needed to satisfy both of those requirements, which is, yes, I'm a technology executive, but at the same time I'm a technology executive that in a large company reports to the board on a quarterly basis or reports directly to the CIO or directly to the CEO. So it's definitely a combination of education and experience.

I would say universities are doing a better job of providing undergraduate and graduate degrees in information security. There are some areas where they're incredibly weak, like application security, which is not taught well at the undergraduate level, if at all. It's done really poorly. The other thing I'll pick on universities about is they're not doing a good job of recruiting women into undergraduate and graduate programs. There is a huge skills gap as well as a staffing issue in security, and at Forrester we like to say it's largely self-inflicted because we're not recruiting from half of the population, and we're not recruiting people from diverse backgrounds. We've got this one mold of individual that we recruit from, and then we're shocked when we can't find enough people with this very narrow skill set.

Laurel: Yeah, the report said nine out of 10 CISOs are male.

Stephanie: Exactly, exactly. We haven't looked at the end of 2019 yet, but the last couple of years that's been true. And if you look at the staff as well, across the general security industry, only about 11% of security staff are female. It's worse than general IT. General IT also has a problem, but it's somewhere between 20% and 30%, so security is even worse.

Laurel: So when we talk about lack of diversity in security in general, how are companies trying to respond to that? Are you seeing any particular companies showing best practices?

Stephanie: There are definitely some best practices. I've seen a lot of vendors, like large technology vendors, partnering with universities, and even partnering at the high school level, for example with the Girl Scouts of America, to foster programs that get girls excited and interested in cybersecurity from a very young age and then want to continue to pursue it at an undergraduate and graduate level. At a number of universities, there are pretty aggressive scholarship programs.

And then also there's just a lot of introspection that's happening at the corporate level, where we look at the culture of security teams. We look at a lot of the traditional routes we recruit from, which is conferences that are male-dominated, or again, we have these job descriptions that emphasize a lot of military experience, as an example. So it's about broadening the aperture of people that we'll recruit into the security industry, a willingness to develop their skill set, as well as doing a much better job of actually filling the funnel, filling the actual pipeline over the long term.

But to your point, diverse teams make better decisions, and in the long run they're higher-performing. And then the second thing I would say is there are so many open jobs in security, not just in the US, but also globally. It's also a math problem. We are not going to fill these open positions if we're not recruiting from half the population.

Laurel: Also, it seemed like not a lot of security talent was necessarily promoted from within. In the reports from Forrester, it sounded like you were more likely to be given a promotion if you went to a different company. Are companies now re-examining their own talent?

Stephanie: That's actually true at kind of the individual level as well as the CISO level. So we've found amongst the Fortune 500 that first-time CISOs were rare, and they weren't promoted from within. So companies like to hire externally for CISOs, and they wanted CISOs that had prior experience as a CISO. And actually, if you are somebody in security, that means if you want to be promoted into a CISO role, your best opportunity is actually to look externally, outside your company. And we also found that when companies hired CISOs externally as opposed to promoting them from within, they were more likely to have them report higher up in the organization. So yeah, at the CISO level, companies could do a better job of looking within and giving those individuals the right opportunity to report higher in the organization.

But we also found things similar again at manager levels and individual contributor levels, which is they weren't hiring from within the company or when they did hire individuals, they weren't giving them good career paths and ongoing skills development. So again, if those individuals really wanted to further their career, most of them ended up leaving. So that's why we say so much of the skills and the staffing challenges are completely self-inflicted.

Laurel: So it's a bit of a blind spot that probably everyone could do a little bit better on, right?

Stephanie: Exactly. Yeah.

Laurel: I was reading this Ponemon Institute report, and this particular phrase jumped out at me as we were talking about the CISO, what kind of person would do that role in the first place, and the experience they'd need. More than just a resume, it's also your attitude and ability to act really quickly and really smartly and also communicate very well. But the quote was, and I'm paraphrasing a bit here, “technology has transformed the internet age into a period of cruel miracles for security professionals.” All of our cruel miracles are that we have devices in every pocket. We can go anywhere, we can talk to anybody at any time, and we can do it at the speed of a lightning bolt, but at the same time, if you're a CISO, how do you secure it all?

Stephanie: Right. Yeah. All of these devices that extend the four walls of the company, they are basically extending the attack surface of the organization. So for CISOs, it's been sort of this march away from a traditional perimeter-based approach to security and actually taking more of a data-centric and application-centric, and I would even say identity-centric approach to security.

Not that the network's not important, network security is hugely important, but it's the perimeter-based approach to security that's changed dramatically. So again, there are no true four walls of the corporation. The perimeter is actually much smaller. So we tend to think of secure enclaves. How do I build a micro perimeter around our most important assets?

Think about an extended network of all kinds of devices like you mentioned, or the computing environment itself, which could be a combination of on-premise, cloud, hosted private cloud, and every variation thereof, and any kind of user population that interacts with the company systems and data. That could be your own employees, it could be consumers and customers, it could be third-party partners. So when you think about devices, user populations, and different computing models, there is no perimeter. So the focus becomes protecting the data itself, regardless of where it travels and regardless of the hosting model or the location. And really, really taking a hard look at identity, so limiting and strictly enforcing access, both human and nonhuman. So it does kind of flip the traditional security paradigm on its head. You move away from perimeter-centric to data- and identity-centric.

That's what we typically recommend to CISOs. And we call that the zero-trust model of security, which is you assume you already have a breach, and you never assume trust in your environment. You just always assume that something's going wrong somewhere, but it works. It perhaps is not the most positive spin in the world, like, “oh, zero trust,” but it works. It's very effective.
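
To make that flip concrete, here is a minimal sketch, in Python, of what a zero-trust-style access decision can look like: every request is evaluated on identity, device health, and data sensitivity, and nothing is trusted just because it originates inside the network. The field names and policy rules are illustrative assumptions, not any particular vendor's model.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_id: str
    mfa_verified: bool         # did the user pass a second factor?
    device_compliant: bool     # is the device patched and managed?
    resource_sensitivity: str  # "public", "internal", or "restricted"
    entitled: bool             # does identity governance grant this user the resource?

def allow(request: AccessRequest) -> bool:
    """Zero-trust style check: never assume trust based on network location.

    Every request must prove identity and device health; the more sensitive
    the data, the stricter the requirements. (Illustrative policy only.)
    """
    if not request.entitled:
        return False                 # least privilege: no entitlement, no access
    if request.resource_sensitivity == "public":
        return True
    if not request.device_compliant:
        return False                 # unmanaged or unpatched devices are untrusted
    if request.resource_sensitivity == "restricted":
        return request.mfa_verified  # restricted data always requires a second factor
    return True

# Example: an entitled user on a compliant device, but without MFA,
# is still denied access to restricted data.
print(allow(AccessRequest("jdoe", mfa_verified=False, device_compliant=True,
                          resource_sensitivity="restricted", entitled=True)))  # False
```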

The other thing I would say is I would really encourage manufacturers of all these devices, IoT sensors, IoT devices, everything that you can think of to really do a better job of building security into the device itself from the beginning. That would definitely make the CISO's job much easier. It's just so frustrating. It's largely out of their control as well.

Laurel: Right? Well, especially—

Stephanie: Except for the CISOs that actually work at product companies. You should be involved in product development. You should be advising the organization.

Laurel: And it's interesting because security doesn't always come first, does it?

Stephanie: No.

Laurel: Especially when you're doing product design. So do you see that happening often though? CISOs actually actively involved in product design?

Stephanie: Not to date, unfortunately, but it is something that we do recommend. And I would say some CISOs don't necessarily see it as their traditional role, like their traditional role has been to secure the back-end systems of record and infrastructure and the company's data and not necessarily get involved in development, but we actually actively encourage CISOs to get involved in product design and product development to really help the organization secure what you sell. So whatever it is you sell, whatever service it is you're delivering to a consumer, a patient, a citizen, another corporation, if you're a B2B organization, actively being involved in securing what you sell.

Laurel: And that's certainly a competitive differentiator, isn't it?

Stephanie: Yeah, absolutely. Absolutely. We found security, as well as privacy—and those aren't synonymous, but sometimes they do go hand in hand—does create competitive differentiation for companies.

Laurel: Yeah. And that's an important differentiator with all the noise that's out there. So if there's something very specific that you can market, that would be a good one. But we're also kind of talking about the CISO really taking an active role in everything. So you have to be this multi-talented person who can talk and understand product as well as be out and about in the community, right? All at the same time sharing, but not sharing, company secrets and how you defend the data, because there is this idea, especially in the tech community, where you do share your best practices and what you've learned. And I was just wondering a bit about that: how do CISOs actually share but not share everything?

Stephanie: Yeah, there is that challenge. A lot of CISOs are very loath to talk about specifics of their deployments, and I don't necessarily see that changing anytime soon. Sometimes in smaller groups, though, there are a lot of communities that support CISOs. Actually at Forrester, we have a peer networking group of about 100 CISOs. There are all kinds of ISACs and intelligence-sharing communities amongst CISOs that are industry-specific. So often in tightknit communities where there's an understanding that everything's under NDA, where there's candidness, where there are some personal relationships, CISOs will share a lot more. But I have found CISOs willing to talk about overall strategy, like when I mentioned moving from perimeter-based approaches to data- and identity-centric. And talking about culture. Culture is actually hugely important, not just for the CISO but for the rest of the security organization as well.

Because you need an organization that has the right kind of staff that can actually talk to developers and be part of secure application development, that can work with infrastructure and operations teams to secure cloud deployments. That could actually work with marketing teams to help them understand privacy implications of how they might be personalizing services and data and ads to consumers. So you need also the security team itself, not just the CISO, the security team itself to be vocal and outspoken, collaborative and willing to insert themselves into core business and IT processes throughout the organization. So they'll talk about culture, they'll talk about staffing, they'll talk about the kind of skills that are required as well. We definitely see some change there.

Laurel: And also the business has to be willing to let security in, to kind of come full circle on this idea, not just with product but with everyone else. So marketing, thinking, again, security first or security at some point. How do you then have this conversation so everyone is a bit educated? You don't have to be an expert in security if you're in marketing, but you have to be willing to listen.

Stephanie: A lot of times CISOs would kind of tell the stories and everything was doom and gloom. I think taking a much more risk-based approach, where you're helping the business understand future risks, helping them understand both probability and impact, and advising them on making the right decisions, like moving from that department of no to more of that consultative role, I think helps. The more you become that consultative subject matter expert, the more I think you can bring along the rest of the organization with you. I think that that's a big help, and it sort of varies by CISO skill set as to how good they are at doing that. I think anytime you can put things in positive business terms as well, that helps.

There was an analyst on my team who wrote this report called “Security for Profit,” and in it he outlined ways that security could potentially be a revenue generator for the company. Again, it could be value-added features that people were willing to pay more for, or it becomes a competitive differentiator in a product or service that you offer. So it could actually contribute to the top line. And then he also outlined all the ways that security can actually save the company money beyond breach avoidance and avoidance of compliance fines.

There are all kinds of ways where, if you do security right, it can actually dramatically improve employee experience and reduce operational costs within the company. Identity is one of the biggest examples, when you think about onboarding an employee and the ability to automate all the ways that you give them access to the systems that they need. Resetting passwords. I mean, there's so much low-hanging fruit where you can make employees' lives easier, but then you're actually really reducing hard costs.

Laurel: Yeah. And that's certainly something you don't think about, but you are certainly frustrated when you have to redo your password and it takes forever and/or you have to go on a different system and blah, blah, blah. But that kind of streamlining is not just from a security perspective, but as you said, it's from everyone's perspective to just make their lives easier, which is what ultimately every employee wants.

Stephanie: Yep.

Laurel: So how do CISOs stay on top of the latest trends? I mean, conferences, those small groups that they talk to?

Stephanie: Yeah, I think they do do their own research, whether it's publications like yours, firms like Forrester, the other big strategy consulting firms as well. They'll often send their staff to a lot of the conferences. And then I do think those peer-networking groups help dramatically as well. But it is hard to stay on top of every single possible trend. So I do think it always helps to have some sort of external advice as well, to give you a heads up on emerging threats, emerging risks, and emerging compliance and regulations that are happening all over the globe.

Laurel: Yeah, and then just like you said, having that peer group to establish trust and some kind of transparency with sharing best practices and just hearing various stories, even if it's secondhand, to kind of get those warnings out to various organizations and people. Speaking of that, other than in these peer groups, is there much cooperation between government and business? Are you seeing more of it, or do people pretty much stay in their lane because there are other conflicts to worry about between businesses and governments?

Stephanie: Yeah. In the US and actually other countries like the UK, if you're considered a critical infrastructure industry, you will need to have close relationships with federal government officials. If you're in critical infrastructure, I mean, there's going to be industry-specific cybersecurity regulations that you have to follow, you know, if you're in energy. I mean, even financial services is considered critical infrastructure. So then you'll have to follow NIST guidelines, as an example. Anybody doing business with the federal government will have to follow NIST.

You don't want to wait to form relationships with the federal government or specific agencies, like the FBI. You don't want to wait until you suspect something or have a breach. Or in a lot of cases, it's the reverse which is, they've detected something, they're alerting you to it. Sometimes, they can't offer you specifics because their hands are tied as part of a larger investigation. So you can actually develop relationships with a lot of the US federal government agencies ahead of time, so that you can share threat intelligence. Or again, should something actually really occur, you already have those pre-existing relationships in place.

Laurel: Yeah, and speaking of something already occurring and preparation plans, are you seeing more companies develop those preparation plans for, again, not if, but when they are hacked or a cyberattack happens and they need to go public with it?

Stephanie: So with incident response, there's sort of the internal incident response, which is sort of all of the processes that you need to detect, then remediate, and then respond. And a lot of the responding is more of what we call a forensic level of responding: determining exactly what happened, remediating it, potentially collecting forensic evidence if you decided that you were actually going to pursue legal action, depending on who it was, afterwards. Then there's the external response, and you really need both. You really need a sophisticated incident response process and initiative within the company, with dedicated experts, particularly if you're a large enterprise.

But I think where companies often really fall down is on external breach response. And again, regulations require that if it's consumer-related, if it affected individuals, you are required to notify them within a specific number of days. In many cases, it's 30 days. Under GDPR in Europe, it's 72 hours or less. And we have seen companies royally botch the external breach response, meaning that they were cagey about offering information to consumers.

I don't want to pick on companies because victim blaming often isn't all that helpful, but I've seen companies kind of blame the consumer, in a way, saying, "Oh, if you had better password hygiene, if you were monitoring your own accounts much more closely, this wouldn't be as big of an impact." No. You need to show empathy with your customers. Put them first. Do everything you can to protect them. Don't be cagey about sharing information because of CYA kinds of concerns. And in some cases, if you do it right, it's an opportunity to not lose their trust, but potentially even to reinforce it and build it up, if you've put them first. But you can really botch it and make the breach so much worse than it needed to be.

Laurel: And that just cost the company even more money.

Stephanie: Exactly.

Laurel: When you look back at 2019, there's a lot to talk about cybersecurity-wise. If we kind of look at three specific areas, first off is just cyberattacks, but very specifically on cities and municipalities. So New Orleans was the most recent, as of the end of the year, that we know of, but it was also on the heels of the State of Louisiana having a cybersecurity attack. We know it's happening across the country. So to ask a very loaded question, why are cities and municipalities being targeted for cyberattacks when they're not necessarily the most well-funded outfits?

Stephanie: Yeah. So that's why, because they're easy targets. So if they've been underfunding their security efforts for years, then they're much easier to penetrate and then ask for a ransom, even if the ransom is small.

Laurel: It's better than nothing.

Stephanie: It's better than nothing. That's actually the consensus of a lot of the team: so many of these local, city, and state governments and municipalities are just such easy targets because they have been underfunded and understaffed for years. And most of the time there is financial motivation, but there are other types of motivation. It could be political, social. If you get to larger state or federal agencies, you might even get into motivations that are geopolitical and even military in nature.

Actually, the City of New Orleans, what was interesting about that is the attackers didn't ask for a ransom. So they used ransomware to disable them. Everything was encrypted, which forced them to rebuild. I think they were replacing tons of computer infrastructure. It can be really difficult to recover from backups. We say that so flippantly, like, "Oh, just recover from your backups." Most backups complete with errors, and the ability to recover from a backup at scale is actually very, very difficult. And who knows when the ransomware was actually introduced? So then you're just reinstalling the ransomware.

Laurel: Interesting.

Stephanie: But yeah. From my understanding, they didn't actually ask for a ransom. So their motivation wasn't financial. So it could've been ...

Laurel: Just disruption.

Stephanie: ... just disruption for the sake of it.

Laurel: To see if they could do it, yeah.

Stephanie: Or interestingly enough, I read this article about how it's forced the city to replace a ton of computer infrastructure, laptops, desktops, server infrastructure. So there's a part of me that's wondering, "Oh, it could be city employees. I know how to get the city to upgrade."

Laurel: Right, right. Force them.

Stephanie: Force them.

Laurel: By ruining everything.

Stephanie: Yeah. So they're easy targets, and the motivations for the attack are much more varied, I think, when it comes to critical infrastructure and then city, state, and local government.

Laurel: And it's not necessarily the case, even when a ransom is asked for, that you ever find out where they're coming from or who they are or if they are foreign state actors.

Stephanie: Yeah, you don't necessarily know.

Laurel: You'll never know. It's just a guess.

Stephanie: Yeah. We actually put out a controversial report this year that said, in some cases, organizations might want to consider paying the ransom. I'll be honest, I think for city, state, and local governments, they might be prohibited from paying the ransom. I don't know. I would have to look into that. But private-sector companies, even though I'm sure the FBI and other law enforcement agencies would prefer that they not do so, in some cases it might actually make sense. And cyber insurers would even say that it might make sense in some cases. And there are actually firms that specialize in helping companies pay the ransom. Sometimes you can actually negotiate for a lower ransom. It's like bartering. They'll act as the go-between between the attackers and the company. Obviously, you're paying them in a cryptocurrency. You're not just transferring cash.

Laurel: Of course.

Stephanie: So they can facilitate that, as well. I mean, if you look at the City of Baltimore, what they ended up spending to recover from the ransomware attack was probably a hundred times more than the actual ransom. I forget the numbers, but the difference was ridiculous.

Laurel: So some advice to cities and municipalities would be to actually look at your systems and try to get them up to date and protected, in some way.

Stephanie: Yeah. Certainly with ransomware, make sure all your systems are up to date, patched. If you look at most successful attacks, external attacks, they're taking advantage of vulnerabilities and other types of software exploits. It's nothing fancy. Everybody always loves to talk about advanced attacks or state-sponsored attacks. The reality is most of these attacks are pretty low budget, but yet still effective.

The other thing is take a close look at your backups. I can't emphasize it enough. People always overlook their backups. It becomes this rote IT process that nobody ever looks twice at, or people demean it and call it not important. It could be more important than ever today if you don't want to pay the ransom.

****

Laurel: Cybersecurity isn't only about stopping the threats you see. It's about stopping the ones you can't see. That's why Microsoft Security employs over 3,500 cybercrime experts and uses AI to help anticipate, identify, and eliminate threats so you can focus on growing your business and Microsoft Security can focus on protecting it. Learn more at microsoft.com/cybersecurity.

****

Laurel: So another interesting topic coming out of 2019 was just general data breaches. In 2019, it did really seem like, every other day, some company or someone was announcing a data breach. And then, according to Risk Based Security, 2019 saw more than seven billion records exposed. So when we get back to CISOs, how are CISOs and company executives really responding to that, if 2019 was sort of this year where we [have seen so many] breaches, in one year?

Stephanie: Yeah. I do think 2019 was finally the year of breach fatigue. I mean, it was even difficult for us to keep up with every breach that hit the news. I do think it helps to put it in perspective. Not every one of these breaches was an attack. A lot of them actually were the result of accidental exposures. So if, for example, you misconfigured cloud storage, that's actually considered a breach, even though there's not necessarily any proof that any kind of third party or external attacker or organization actually misused or abused the data. Just the fact that somebody internally, or oftentimes a security researcher, discovers that all the information was left exposed. That is considered a breach.

But yeah. If you look at breaches themselves, 51% of companies had at least one breach in the past year. And that number is probably higher, because a lot of organizations don't know about it immediately. But then a large percentage of them, actually the majority, are internal: a result of internal incidents, third-party incidents, or just lost or stolen devices. And if you do look at true external breaches, where it was an external party that attacked you and gained access to your sensitive data, getting back to a lot of this being low budget, the top three attack vectors were a direct attack on your application, taking advantage of a software vulnerability, or compromised user credentials.

And the three of those, if you look at it ... I hate to be flippant and say it's easy to solve, but it doesn't necessarily take the most advanced software technology to address. It's secure application development. Getting back to our whole product development and design discussion, it's just making sure that the applications that you develop that are customer-facing are secure by design—that you build security into them from the get-go. That will reduce a huge number of the direct attacks on applications. And then once the applications are deployed in production, again, protecting them with everything from web application firewalls to other types of security measures that are sort of wrapping the application in protections.
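
As one small illustration of "secure by design," a very large share of direct attacks on customer-facing applications are injection attacks, and the defense can be as simple as never concatenating untrusted input into a query. A minimal sketch using Python's built-in sqlite3 module:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT, name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice@example.com', 'Alice')")

user_input = "alice@example.com' OR '1'='1"   # attacker-controlled value

# Vulnerable: untrusted input is concatenated into the SQL string,
# so the attacker's OR clause matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE email = '" + user_input + "'"
).fetchall()

# Secure by design: the parameterized query treats the input purely as data,
# so the injection attempt matches nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE email = ?", (user_input,)
).fetchall()

print(vulnerable)  # [('Alice',)] -- injection succeeded
print(safe)        # []           -- injection neutralized
```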

The other thing is software exploits. So many attackers are just taking advantage of vulnerabilities that haven't been addressed. So having a really strong vulnerability management program is key—not the sexiest topic in the world, vulnerability management, but companies don't actually have a good way of prioritizing vulnerabilities and then addressing them.

Laurel: And that's low-hanging fruit.

Stephanie: Such low-hanging fruit. And then the user credential compromise, multi-factor authentication.

Laurel: Yes. Go to your phone. Type in the password. We've gotten pretty used to it, though.

Stephanie: Right. And so many times with these major breaches ... we used to write this annual lessons-learned report, and it almost got tedious because I just kept giving the same advice over and over again: secure app development, vulnerability management, encrypting the most sensitive data that you have, and then two-factor authentication, and then having a robust detection and incident response program in place. So there's still low-hanging fruit out there. I mean, we talk so much about advanced attacks and the increasing attack surface, and that's true, but again, we still haven't addressed so many of the basics.

Laurel: And those basics, like you said, you've been writing about them for years. Do you think that some of these more notable attacks add color and maybe get people's attention in ways that they're willing to actually dedicate part of an IT budget to vulnerability management?

Stephanie: They do. To your point about multifactor authentication, I think that's now become a foregone conclusion and accepted best practice—you have to have it in place. It took a while. It had to be breach after breach. It had to be the industry saying two-factor, two-factor, two-factor. Even consumer technology companies telling their customers, "Turn on two-factor, turn on two-factor." So I think the more that we kind of repeat some of these best practices, the more that they do finally become ingrained. I'm hoping that the same thing will happen with secure application development.

And then vulnerability management: again, having a way of prioritizing the vulnerabilities that you need to address is really important. Thousands of vulnerabilities are added to the National Vulnerability Database every year, and 30% of them are deemed critical. So that doesn't help. You need a process, internally, of further prioritizing the vulnerabilities based on your own business context, IT context, and your own threat environment to help you patch the systems.
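
A minimal sketch of what that internal prioritization could look like: blend the CVSS severity from the National Vulnerability Database with your own business context, such as whether the asset is internet-facing, holds sensitive data, or has a known exploit. The weights and fields here are illustrative assumptions, not a published Forrester formula.

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    cve_id: str
    cvss: float              # base severity from the National Vulnerability Database (0-10)
    internet_facing: bool    # is the affected asset exposed to the internet?
    sensitive_data: bool     # does the asset store or process sensitive data?
    exploit_available: bool  # is a public exploit known to exist?

def risk_score(v: Vulnerability) -> float:
    """Blend NVD severity with business and threat context (illustrative weights).

    A medium CVSS on an exposed, data-bearing, actively exploited system can
    outrank a critical CVSS on an isolated one.
    """
    score = v.cvss
    if v.internet_facing:
        score *= 1.5
    if v.sensitive_data:
        score *= 1.3
    if v.exploit_available:
        score *= 1.4
    return round(score, 1)

backlog = [
    Vulnerability("CVE-A", cvss=9.8, internet_facing=False, sensitive_data=False, exploit_available=False),
    Vulnerability("CVE-B", cvss=6.5, internet_facing=True, sensitive_data=True, exploit_available=True),
]

# Patch the highest contextual risk first, not just the highest CVSS.
for v in sorted(backlog, key=risk_score, reverse=True):
    print(v.cve_id, risk_score(v))
```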

Laurel: Because otherwise, it's just ...

Stephanie: It would be overwhelming.

Laurel: So overwhelming.

Stephanie: Yeah.

Laurel: I think of banks and the banking industry as one of the industries that kind of forced two-factor authentication down the line. Are you seeing any particular industries doing that with securing applications or building secure apps?

Stephanie: Building secure apps, who do I think might be in the lead? That is a good question. I don't know if any other industry really stands out. We did this state-of-application-security report recently, and it was pretty eye-opening, as well as dismal. It was not a feel-good report. And then we did do an industry-specific version of it, where we filtered the data that we had by government. And the federal government was even worse than the private sector, so that's pretty concerning. I think the industries that maybe concern me the most would be health care and government.

Laurel: And those are two that most people probably use more regularly than almost anything else.

Stephanie: Right.

Laurel: Yeah. The health-care one is not a small concern at all. The idea of hospital data, personal data, but also PII [personally identifiable information] and just getting into systems and machines. Is everything a concern? When everything's a concern, how do you actually prioritize that if you are a health-care institution?

Stephanie: Yeah. I think with health care, for a long time, so much of the emphasis was on complying with HIPAA. Talking US-specific here: in the US, there was so much emphasis on complying with HIPAA that even the CISOs were kind of focused on HIPAA as opposed to true cybersecurity. So historically, we often found that, unfortunately, health care spent less than other industries on security. And again, if they had a CISO, we often found that he or she wasn't reporting high enough in the organization or didn't have the right span of control. That is changing. I think where health care also struggles is, even if they're addressing some of the historical budget issues that they had, it's an industry that's struggling with cost. If you have a dollar to spend on a clinical system versus cybersecurity, it's a really tough choice, particularly in the US, where we spend so much on health care but still have the worst health-care outcomes of all developed nations. That's the real struggle: security does require a decent amount of budget to do well.

Yeah. So with health care, what concerns me is that they're behind other industries in terms of overall maturity, but I think that that's changing. I think the next area that probably concerns me within health care would be medical device security and application security.

Laurel: Yeah, that's a whole different, scary topic, right? Well, so lastly, what I was thinking about from 2019 is this topic that is broad and big, but also really important for most companies that actually produce a physical item or ship things from one place to another, which is supply chain security attacks. If a company's supply chain is so very long and complicated, we're not just talking about securing, yeah, your physical premises or cloud. It's also the product. Where it's shipped from, where it's going, and then also the vendors.

Stephanie: The components.

Laurel: The components and the vendors that you work with.

Stephanie: Yeah. I think a lot of that will start with overall third-party risk. Third-party risk and supply chain risk, I mean, supply chain is sort of a subset of overall third-party risk. In general, I would say third-party risk has been one of the top five challenges that a lot of our CISO clients have talked to us about for the last several years. Again, if you look at the breach data, a good 25% to 30%, somewhere in that range, of breaches are the result of third parties. And a typical large enterprise could have as many as 300 third-party relationships when you add up not just suppliers, but then you think about IT service providers, cloud providers. I once did a business impact analysis where I was inventorying all the third-party relationships for a midsize organization, a thousand employees. And I found like 30 third-party relationships that the IT organization didn't know anything about.

Laurel: Oh, wow.

Stephanie: And so now you take that to a large enterprise. So I think a lot of it will come back to if you can start with the core third-party risk program. One, do you have one? Is security embedded within third-party risk? Do they have veto capabilities over third-party relationships? And it's not just a one-time evaluation of third parties of security posture. It's an ongoing relationship as well where you're periodically reassessing them. There's a lot of requirements to define like service-level agreements around security with third parties. So it all kind of starts with basic third-party risk management. And then I think it will extend into supply chain risk as well.

Laurel: And you'll probably see that with the people or companies really exhibiting best practices. They push that downstream to all of their vendors. So, if you expect to work with us, you'll have to follow these procedures.

Stephanie: Exactly.

Laurel: OK.

Stephanie: Exactly. Yeah. I've wanted to write this report for a while, which is: for long-term business success, act like a breached company. Companies that have actually had mega breaches and kind of survived the initial aftermath, after about five years, actually go on to have business results that will often outperform the S&P 500. And if you go back and you look at a lot of what they've done, the breach forces them to make both really tough business as well as IT and security decisions. So from a business perspective, it'll force them to reprioritize a lot of their strategic initiatives, because the breach ended up costing them $300 million, $500 million, even $1 billion. So it forces them to take a much harder look at their overall business strategy and their business operations, because they just don't have the money to spend on everything. Or in some cases, they have to cut loose from bad decisions that they kept pouring money into.

From an IT perspective, we'll see them implement things that they should've done all along. Third-party risk. I had one company that had a breach tell me that after the breach, you know, you couldn't buy a pencil without …

Laurel: Approval.

Stephanie: Approval. And you don't necessarily want to go to that extreme, but it was clear that they had no third-party risk management beforehand. Or they'll hire the CISO, or they'll change where the CISO reports into, start funding incident response and breach response programs. So the breach actually forces them to be a much more mature and better company. It's just unfortunate that they had to go through the breach to get there.

Laurel: To get there. I guess that's what every company at some point should probably look at: are we actually doing everything that we should be doing, the right way? What is the state of affairs, the state of the company? And take a hard look. But I'm sure that also wouldn't make you very popular if this was just an exercise.

Stephanie: Yeah.

Laurel: If you go through a terrible security breach, then you sort of have an excuse. So we kind of get back to the whole thing where when something terrible happens, now you have an excuse to …

Stephanie: I do think maturity assessments help there. I mean, there are a lot of industry-specific maturity assessments for cybersecurity. At Forrester, we have our own maturity assessments. So we'll often find that clients will use that as a baseline, and then they can go to the board and to their business and IT executives and say, "Look, against industry best practices, here's where we're mature, and here's really where we're weak." And then that also helps them start to create roadmaps and then justify funding for specific initiatives, because you can say it's going to improve our maturity from a 1.5 to a 3, and really, for our industry, we need to be at least a 3 in this area.

So I do think to your point, taking that baseline against industry best practices and pick a maturity assessment, you know—Forrester’s, some other consulting firm’s. But baseline yourself in your security posture, communicate the results, work with business to set realistic goals about where you should be as a business and industry. And then that really drives your roadmap and a lot of your funding. So it gives you like a true north to point to.

Laurel: And it does get you on the road of having security as a differentiator.

Stephanie: Exactly.

Laurel: Yeah. So in 2020 we'll probably see more of that. But there are a few more worrisome additions. We only briefly touched on geopolitics. The end of the year showed us a bit of a bumpy road with Iran specifically. So how much do you think average companies are concerned about geopolitics, or is it just any bad actor coming their way that they need to beef up security against?

Stephanie: We actually had written this report back in, I want to say initially in 2016, and it was for CISOs. And the title of it was “Ignore Geopolitical Risk at Your Peril.” And it was this idea that CISOs really did need to wake up to the cybersecurity implications of geopolitical risk. So, yeah, it's Iran today. It could be another country in the future. Even if you think that, oh, I'm not in government, I'm not in critical infrastructure, private-sector organizations are targeted all the time. Again, in some cases because they're easy targets, or because you do business with the federal government.

So you do have to be concerned about geopolitical risk. And again, for a lot of large enterprises where you are multinational, you have to think about how it impacts your employees, your offices, the third parties that you do business with. So as if CISOs didn't have enough to do, they do have to take geopolitical risk into account. How does it increase the risk? Does it make them more of a target? Actually, interestingly enough, if you look at some of the breaches, you know, Marriott for example, or Anthem, I forget if that was about three years ago, they wouldn't have thought themselves to be the targets of a nation state, but it was their data that they wanted.

Laurel: Of course. And are those kinds of marquee names more at risk, or is it really just, yeah, you have some great data?

Stephanie: I think it was less about the name and more, in those particular instances, about the data that they had. You know, in the case of Marriott, they had incredibly detailed information about the travel habits of government employees and senior officials. I think in some cases, they even had copies of their passport numbers as part of the sensitive data that was compromised.

In the case of Anthem, you have sensitive medical information about individuals. You do have to think about geopolitical risk and how a nation state might be able to use your data.

Laurel: There's definitely a lot there. But what about domestic politics and risk? 2020 is an election year. But other than ballot box security, what other kinds of special interest groups and campaigns may be collecting data that could be targeted?

Stephanie: Yeah, I mean, that's an interesting one. Well, again, the ability to target voters is very powerful. So again, if you're a consumer organization that has very detailed information about individuals, like preferences, likes, habits, it might not necessarily be used for nefarious purposes like identity theft. But again, it could be used by somebody who wants to target and influence those individuals. So I can definitely see, again, if you're a consumer company that has incredibly detailed data about preferences, buying behaviors, attitudes, all of that would definitely be really ripe for targeting.

Laurel: As well as attacks like deepfakes, right?

Stephanie: Yes.

Laurel: So would you say deepfakes are like the new phishing or is it just a flavor of phishing?

Stephanie: Flavor. It's sort of the next evolution of social engineering. As if social engineering wasn't difficult enough to address, now we've got to contend with deepfakes as well. And I feel like with deepfakes, they matured so much more quickly than the industry was expecting. I mean, I remember reading and listening to like podcasts and experts saying like, "Oh, you know, we're still years away from deepfakes that can really fool experts." And I feel like that timeline was way off.

Laurel: Way accelerated.

Stephanie: Way accelerated.

Laurel: So Forrester has this great report called Predictions 2020, and the prediction is that deepfakes will cost businesses more than a quarter of a billion dollars next year. So the technology is speeding up, but so are the impact and harm.

Stephanie: Yeah, correct. And I think just like a typical breach, there are the immediate costs. So think about your typical social engineering attacks. Just recently there was a German company where someone was able to convincingly fake the voice of the CEO, and they convinced another executive within the company to wire something like a quarter of a million dollars to them. So there's an example of your next evolution of business compromise and social engineering attacks. So you can imagine that being even more sophisticated, on a grander scale, but then it's also dealing with the fallout of it: proving that it was a deepfake, dealing with the breach response, how you handle it. The total cost of it, we definitely think, could be significant.

Laurel: Is that something insurance companies are ready to handle or can handle?

Stephanie: Interestingly enough, in that example that I just gave, it's a deepfake, and everything happened over the phone rather than over your computing infrastructure. It would be considered fraud as opposed to an external attack. So I'm not sure your cyber insurance would actually even cover it.

Laurel: Wow.

Stephanie: And I know a lot of companies, as part of their response planning, assume that a good chunk of it will be covered. For a large enterprise, it's not unusual to have a $100 million cyber insurance policy. So if you are banking on being able to use that $100 million, it depends on the nature of exactly what happened. If it's classified as fraud, they might not cover it.

Laurel: What could executives do? Go back to secret passwords and handshakes or...

Stephanie: Yeah, all joking aside, in those wire transfer instances or anything else involving sensitive data, having that second factor of authentication, I think, actually is really important.
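
For readers curious what that second factor is under the hood, here is a minimal, standard-library-only sketch of the time-based one-time password scheme (RFC 6238) that most authenticator apps implement, applied to approving a wire transfer. A real deployment would use a vetted library and hardened enrollment; the shared secret below is just an example value, not a real credential.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, at=None, step=30, digits=6):
    """Time-based one-time password (RFC 6238): HMAC the current 30-second
    time step, then truncate to a short numeric code both sides can compute."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# The server and the employee's authenticator app share this secret once,
# at enrollment; after that, codes are derived from the clock, never transmitted.
shared_secret = "JBSWY3DPEHPK3PXP"  # example base32 value only

def approve_wire_transfer(submitted_code):
    # A wire transfer request is approved only if the out-of-band code matches.
    return hmac.compare_digest(submitted_code, totp(shared_secret))

print(approve_wire_transfer(totp(shared_secret)))  # True: correct second factor
print(approve_wire_transfer("000000"))             # almost certainly False
```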

Laurel: Interesting. Obviously deepfakes, lots of fast-moving technology, everything is moving and becoming so much more sophisticated. So we're not just talking about the good guys having access to all the technology. The bad guys do as well. So what are we looking at in another Forrester prediction here about weaponized artificial intelligence and machine learning? It's a large topic, but with such rapidly advancing technology, and everyone having access to the same tools, is there an arms race going on, or is it just that we all have to kind of look out for each other and just get it done?

Stephanie: I think it's both. I mean, security is a perennial arms race. I will say, I think security teams do need to embrace machine learning and other types of artificial intelligence very quickly, as fast as the bad guys are. I mean, there are huge opportunities to automate a lot of our core processes within security, everything from detection to remediation to the actual response.

Using machine learning gives you better detection capabilities. So imagine if we were able to more quickly detect something, and then so much more of our response was automated. Right now, a lot of what we do is very manual. The detection's manual. If you look at a typical SOC, security teams are flooded with thousands of alerts. Trying to make sense of which alerts are more important than others is difficult. So again, it's kind of the prioritization: understanding immediately which ones are truly nefarious and dangerous is critical. And you could hire as many people as you want. You would never be able to keep up with it.

Laurel: Right.

Stephanie: So you really actually do need technology to do that for you. But then once you know something truly is malicious, again, right now it's a manual process that's kicked off, which is, you would manually need to reset users' passwords. You would manually need to isolate devices from the network. Everything is done manually, and in some cases we still have a lot of steps where we're asking for approval. In the future, all of that would happen automatically. You would need the business to work with you to say, "OK, from now on, if it meets a certain threshold, if we're 90% confident that this is malicious, then security team, all the processes are automated." Everything I just described, like resetting of passwords, isolating of devices, all of that can happen automatically. You don't need to wait for anybody's approval.
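
A minimal sketch of the kind of automated playbook being described: once the business agrees on a confidence threshold, detections above it trigger containment with no human approval, while everything else still goes to an analyst. The threshold and the action functions are placeholders standing in for whatever identity and endpoint tooling an organization actually runs.

```python
AUTO_RESPONSE_THRESHOLD = 0.90  # agreed with the business: above this, act without approval

def reset_password(user_id):
    # Placeholder for a call into the identity provider's admin API.
    print(f"[playbook] password reset forced for {user_id}")

def isolate_device(device_id):
    # Placeholder for a call into the endpoint management / EDR tool.
    print(f"[playbook] {device_id} quarantined from the network")

def open_ticket(alert):
    # Placeholder: anything below the threshold still goes to a human analyst.
    print(f"[playbook] ticket opened for analyst review: {alert['id']}")

def handle_alert(alert):
    """Automated response: if the detection confidence clears the agreed
    threshold, containment runs immediately instead of waiting for approval."""
    if alert["malicious_confidence"] >= AUTO_RESPONSE_THRESHOLD:
        reset_password(alert["user_id"])
        isolate_device(alert["device_id"])
    else:
        open_ticket(alert)

# Example alerts as they might arrive from a detection pipeline.
handle_alert({"id": "A-1", "malicious_confidence": 0.97, "user_id": "jdoe", "device_id": "LAPTOP-42"})
handle_alert({"id": "A-2", "malicious_confidence": 0.55, "user_id": "asmith", "device_id": "LAPTOP-07"})
```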

Laurel: And time is of the essence?

Stephanie: Yeah, we talk about businesses needing to undergo digital transformation to meet new customer expectations and changing business demands. I think security teams need to undergo their own digital transformation, because everything that we do today is so manual and so time-consuming. And if we are going to keep up with cybercriminals taking advantage of these technologies, we need to do it sooner rather than later.

There was this one company that used machine learning algorithms to target individuals with more sophisticated social engineering messages and was able to increase not just the total number of people that they were able to socially engineer, but the success of the clickthroughs exponentially with machine learning and automation technologies. So yeah, we've got to keep up.

Laurel: Would you say there are any other particular trends you're looking at for 2020, or things people should keep an eye out for?

Stephanie: I actually do think third-party risk and supply chain are going to continue to be a major, major problem. And that's actually another area because of the size of the problem. When I talk about an enterprise having 300 third-party relationships—that's just an average. There are large companies with an even bigger number than that, and it's very time-consuming to assess their posture, negotiate SLAs, continue to assess them, keep tabs on them, get logs from them so you understand where your data is. So the third-party risk challenge alone needs its own set of automation tools and supply chain rules. I do think the transformation of the SOC will continue to be an issue in 2020.

Laurel: The security operation center?

Stephanie: The security operation center. As if we didn't already have a skills and staffing shortage. Even if you could hire all the people you want, you can't keep up with the scale and the challenge of the problems. So you've got to adopt automation there as well.

Laurel: Do you see companies adopting automation first perhaps in security where they may have been more hesitant in other parts of the business?

Stephanie: Yes. And for a long time there was a hesitancy to adopt automation, because there was this fear that I might block a legitimate business transaction. And you know, historically, the security team struggled to be seen as a business partner. They were the department of no. And so there was this dreaded fear of potentially blocking any kind of legitimate business transaction. I think that that fear is gone now given all of the breaches that we have. The business is like, "No, you know something's malicious, you block it."

Laurel: Right.

Stephanie: "Or within a certain confidence, you think it's malicious, you block it." So, there's definitely an openness to automation. I will say, there's this phrase like don't automate stupid. You can't just invest in an automation technology if you lack fundamental processes in your SOC. So, you do have to mature your SOC operations and have mature processes and then you can automate them. Otherwise, you're just automating stupid. And in some cases too, that's why a lot of companies are turning to managed services.

The vast majority of security budget is actually spent on services now, whether that's consultative, professional managed services, or security actually delivered as a service from the cloud. It's moved away from on-premise products towards services. And I think part of that reflects the fact that security teams need external help. They have a scale problem, they have an expertise problem, so they are turning more and more to services. And then as the business becomes more and more cloud-based, your security technologies need to become cloud-based as well.

Laurel: And rapidly.

Stephanie: Exactly.

Laurel: Yeah. And that certainly kind of comes back full circle to, don't go it alone. You can't do this all by yourself out in the wilderness.

Stephanie: Definitely not.

Laurel: Well, thank you very much, Stephanie. It was great having you today on the Business Lab.

Stephanie: Thanks for having me. It was great to be here.

Laurel: That was Stephanie Balaouras, vice president and group director of security and risk research as well as infrastructure and operations research at Forrester Research, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review, overlooking the Charles River. Also, thank you to our partner, Microsoft Security.

That's it for this episode of the Business Lab. I'm your host, Laurel Ruma. I'm the director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can also find us in print, on the web, and at dozens of live events each year. For more information about us and the show, please check out our website, TechnologyReview.com. This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you'll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Collective Next. Thanks for listening.

Security threats are everywhere. That's why Microsoft Security has over 3,500 cybercrime experts constantly monitoring for threats to help protect your business. More at Microsoft.com/cybersecurity.