Safety Detectives, a publishing group of cybersecurity experts, privacy researchers, and technical product reviewers, interviewed Tim Mackey, principal security strategist at Synopsys Cybersecurity Research Center (CyRC), to discuss his journey working in cybersecurity, the state of the industry, and how current events like the pandemic are shaping the future of cybersecurity.
I started out as an electrical engineer. I was doing space communications, satellite communications, long-distance, and telco.
Shortly before graduation, one of the major employers where I was living at the time suddenly laid off hundreds of people with qualifications that were significantly more advanced than what I had, which forced me to pivot. I went from the electrical engineering world to software.
One of the first companies that I worked for in the software world was in the business of building control systems for heavy industry—oil refineries, manufacturing plants, food processors. At the time, that software faced a critical challenge as well as a critical opportunity. Most of those plants were running control systems that were largely based on relays—ladder logic.
When software came in, and software-defined things came into that world, there was great concern that the electrical impulses involved could create an environmental hazard—explosions, systems not shutting off when they were supposed to, overflow situations.
My first cybersecurity lesson was that if I didn’t get the code right, and couldn’t prove that the code did what it was supposed to do, then bad things would happen. From there, I went on to co-author A Rootkit for Good, do some deep certificate work, and eventually own product security.
It is the whole problem-solving aspect of it. When you look at how people try to secure things, there’s an old adage that the least secure thing is the one somebody believes is secure simply because they wrote it themselves. The level of expertise required to understand how, for example, certificate communications work is pretty advanced. There’s so much math involved that the average developer simply needs to assume that the library is doing the correct thing and leave the details of how SSL works, or how the certificate revocation process needs to work, to somebody else.
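To make that point concrete, here is a minimal Python sketch (illustrative only; the specifics are not from the interview) of what "leave it to the library" looks like in practice: the standard library's TLS defaults already perform certificate chain validation and hostname checking, and the riskiest thing a developer can do is switch them off.

```python
import ssl
import urllib.request

# Rely on the vetted TLS stack: the default context enables certificate
# chain validation and hostname checking out of the box.
context = ssl.create_default_context()

with urllib.request.urlopen("https://example.com", context=context) as response:
    print(response.status)

# The anti-pattern is overriding what the library already does correctly:
#   context.check_hostname = False
#   context.verify_mode = ssl.CERT_NONE
# That single change throws away the certificate expertise baked into the library.
```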
Synopsys is a portfolio company. We range from being able to design the next generation of chips through lithography all the way through to cybersecurity. What we really have is a set of capabilities that allow our customers to build trust in whatever they’re delivering to the world, whether that’s a piece of hardware or a piece of software, knowing it’s been tested every possible way using technologies that are at the vanguard of their respective disciplines. This way our customers can have confidence that whatever they’re producing, their end customers are going to be satisfied with the outcome.
The big trick on the cybersecurity side of the equation is investment and innovation, so for us, it’s really a question of identifying where the trends are. As an example, earlier this year, U.S. President Biden issued an executive order on cybersecurity. There are a number of elements in that executive order—it’s not a directive that “thou shalt buy these specific tools,” but it explores the problem space of how software, and the trust boundaries associated with software development, are being exploited.
A number of things have come out of the executive order related to NIST, but I’ve likened this executive order and its future impact to that of GDPR, where GDPR was very much around digital privacy and the management of one’s personal identity. The executive order that President Biden signed has a similarly far-reaching impact on cybersecurity.
As an example, when most people look at an individual piece of software they’re using, they have no idea how to properly manage and patch it. They know that it comes from supplier X, and that supplier X should be doing something, but they don’t have a mechanism to determine whether supplier X is doing everything it ought to with respect to issuing patches or providing a secure configuration. They don’t have a mechanism to determine whether any secure development practices were used when the software was authored. They don’t have any confidence, or any mechanism, to determine whether the update mechanism is itself a secure process. As a result, they simply have to trust that whoever’s name is on the box they bought from Amazon is going to take care of things correctly. The executive order effectively says we can’t assume that; as an industry, we should be working towards that level of assurance.
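One way to picture what a verifiable update mechanism involves is a signature check on the downloaded artifact before it is installed. The sketch below is a hypothetical illustration rather than anything described in the interview; the file names and the choice of an RSA key verified with the `cryptography` package are assumptions.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

# Hypothetical file names: the vendor's published public key, the update
# payload, and a detached signature shipped alongside it.
with open("vendor_signing_key.pem", "rb") as f:
    public_key = serialization.load_pem_public_key(f.read())
with open("update.bin", "rb") as f:
    payload = f.read()
with open("update.bin.sig", "rb") as f:
    signature = f.read()

try:
    # Raises InvalidSignature if the payload was not signed by the vendor's key.
    public_key.verify(signature, payload, padding.PKCS1v15(), hashes.SHA256())
    print("Update signature verified; safe to hand off to the installer.")
except InvalidSignature:
    print("Signature check failed; refusing to install the update.")
```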
Within Synopsys, one of the things that we’ve been doing is a whole lot of primary research around the security of open source componentry. We’ve been doing that for the better part of six years, all with the objective of understanding how software is being used and consumed.
The most important one in my book is a set of complacencies. If an organization is not able to produce a comprehensive, up-to-date inventory of the software they’re running, where they got that software from, why they obtained it in the first place, and what role it effectively plays, they’re not really in a position to be able to patch it. There is no end of drivers on end-users’ machines that were probably obtained when someone needed a patch and went to the internet to search for their favorite driver update.
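As a toy illustration of the inventory point (an editorial sketch, not Synopsys tooling), even a single language ecosystem can be enumerated programmatically; a real inventory would aggregate this kind of data across operating systems, containers, and firmware.

```python
from importlib.metadata import distributions

# Enumerate installed Python packages as one small slice of a software inventory.
inventory = sorted(
    (dist.metadata["Name"], dist.version) for dist in distributions()
)
for name, version in inventory:
    print(f"{name}=={version}")
```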
A common enterprise scenario is that someone wants to take a component that’s packaged up in a Docker container and use it because it does what they need it to do. Nobody looks at whether it was the most recent version or whether it included a crypto miner.
The pandemic created a work-from-home paradigm that has stretched the edge of the enterprise. A person’s shared laptop, where they’re doing work activities, is a risk factor that wasn’t present in whatever threat models were created around how employees were going to interact with data. If that edge is now secured using an antivirus solution from vendor X, as opposed to the corporate standard of vendor Y, there’s ambiguity as to whether the enterprise will know if that endpoint protection is sufficient to the task or whether it needs to be augmented in some capacity. If it’s a multi-use environment, that creates further challenges.
Everything associated with the pandemic has been about extending the edge out, like the rise of everything as a service. We’re going to be grappling with this over the course of the next several years in the face of an ever-more brazen cybercriminal enterprise that is engaged in ransomware and trying to force people into paying exorbitant sums in Bitcoin.