The Need for Patching: A Podcast with Fortinet’s Renee Tarun

On this episode of the Early Adopter Research (EAR) Podcast, EAR’s Dan Woods spoke about patching with Renee Tarun, VP of Information Security at Fortinet. She is one of more than 30 contributors to the upcoming book from Fortinet CISO Phil Quade, The Digital Big Bang: The Hard Stuff, the Soft Stuff and the Future of Cybersecurity. The book puts forth a big-history-style explanation of cybersecurity, in which Quade proposes a framework for a truly scientific approach to the field. His ambition is to systematically address the many problems that have arisen because cybersecurity was not properly incorporated into the design of the internet. Tarun is also an expert in patching and in monitoring real-time cybersecurity operations. Woods is a technology analyst and founder of Early Adopter Research, a research publication that focuses on high-value use cases and how to create multi-product platforms to implement them. He served as editor for Quade’s book, which came out in preview at Fortinet’s Accelerate Conference in April 2019 and should be published by John Wiley in August. Their conversation covered:

* 2:10 – The undervalued importance of patching
* 4:05 – Can gamification work for patching?
* 7:00 – The need for libraries to help with patching

Listen or read an edited Q&A of their conversation below:

Woods: On 24 or Mr. Robot, they don’t talk about patching. But leaving systems unpatched is really like leaving your keys in the car with the doors unlocked. Why isn’t there more urgency about this? Why is it important, and why don’t people pay more attention to it?

Tarun: I think there are several reasons why. When you look at the number of patches and vulnerabilities released last year, you’re talking in the thousands, and that number is ever-increasing. That creates quite a conundrum for IT professionals, for whom patching is just one part of their daily jobs, so the sheer volume of patching creates quite a bit of overhead. There is also a cultural mindset of “don’t fix it if it’s not broken.” And in our always-on, interconnected world, the other concern is that even if you do all the testing in the world, there are always certain configurations that may not work well with a patch, so what you intended to do good can ultimately take the system down. Sometimes there are also legacy systems so old that the hardware or software can’t support the patch.

Have you come up with any gamification or communication mechanisms to make the value that’s created by patching more visible?

Absolutely. I’m a firm believer that what gets measured gets done, so we report monthly statistics on where we are with our patching. When you look at some of the latest breaches, it’s not some latest and greatest malware or sophisticated attack; it’s because, like you said, they left the keys in the car with the doors unlocked. Equifax is a very good example. It happened simply because they had an unpatched system.
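Tarun’s “what gets measured gets done” point can be made concrete with a small reporting script. This is a hypothetical sketch, not Fortinet’s actual tooling; the inventory format and team names are illustrative assumptions.

```python
# Hypothetical sketch: compute per-team patch compliance for a monthly report.
# The (team, host, fully_patched) inventory format is an assumption.
from collections import defaultdict

def patch_compliance(inventory):
    """inventory: iterable of (team, host, fully_patched) tuples.
    Returns {team: percent of hosts fully patched}."""
    totals = defaultdict(lambda: [0, 0])  # team -> [patched_count, total_count]
    for team, _host, patched in inventory:
        totals[team][1] += 1
        if patched:
            totals[team][0] += 1
    return {team: round(100 * p / t, 1) for team, (p, t) in totals.items()}

inventory = [
    ("web", "web-01", True),
    ("web", "web-02", False),
    ("db",  "db-01",  True),
]
print(patch_compliance(inventory))  # {'web': 50.0, 'db': 100.0}
```

Publishing a number like this per team each month is one simple way to make the value of patching visible, in the gamification spirit Woods asks about.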

If you start down the road of doing a better job with patching, it seems to me that one of the things you may be able to achieve is what has been called cyberagility. Agility in general, whether DevOps or CI/CD, is all about being confident in making changes so that you can sustain a rapid rate of change without being frightened. How do you start that process, and can patching be the thin end of the wedge for cyberagility?

I think a lot of that actually starts at the beginning of the software development life cycle. It starts with teaching developers and coders good secure coding practices and making sure that from the beginning the software is developed without vulnerabilities, minimizing the number of patches that have to be done. It also means making sure that quality assurance testing focuses not only on functionality but also on ensuring that the security is there.
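One concrete instance of the secure coding practices Tarun describes is using parameterized queries instead of splicing user input into SQL. This minimal sketch uses Python’s standard-library sqlite3; the table and data are illustrative assumptions.

```python
# Minimal sketch of one secure-coding practice: parameterized queries.
# Splicing user input into SQL text invites injection; binding it as a
# parameter does not. Table and data below are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Unsafe: the input becomes part of the SQL statement itself.
unsafe = conn.execute(
    "SELECT role FROM users WHERE name = '" + user_input + "'").fetchall()

# Safe: the driver binds the value as data, not as SQL.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()

print(unsafe)  # [('admin',)] -- the injection matched every row
print(safe)    # []           -- the literal string matches nothing
```

Catching the unsafe pattern in code review or QA is exactly the kind of up-front work that reduces the number of security patches needed later.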

How would CI/CD or DevOps work differently if secure coding practices and patching were incorporated into them?

It would work much better because, again, you’re being proactive rather than reactionary. As someone who used to do development myself, I know it’s always fun to work on new and advanced features. Having to go back and fix old code is never fun, and it can be very mundane for developers.

I’ve heard the idea of increasing visibility into dependencies, so that if you knew a library needed to be patched, you could alert all the teams that needed to rebuild their software, and better documentation of everything would allow you to do a better job.

Absolutely. Most developers use some type of third-party open source library, and oftentimes people use those libraries within their code and then forget about them, not realizing that a vulnerability has been reported in that library. That’s one of the things that was important to our product security incident response team: when a vulnerability is reported in a third-party open source library, they go back to the developers to check whether they are using the affected version.
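The dependency check Tarun describes can be sketched in a few lines: given each project’s declared dependencies and an advisory listing a library’s vulnerable versions, flag the projects that need rebuilding. The manifest format, project names, and advisory data here are all hypothetical.

```python
# Hypothetical sketch: flag projects still depending on a vulnerable
# library version, so their teams can be alerted to rebuild.

def affected_projects(projects, advisory):
    """projects: {project_name: {library: pinned_version}}
    advisory: (library, set_of_vulnerable_versions)
    Returns the names of projects pinned to a vulnerable version."""
    lib, bad_versions = advisory
    return [name for name, deps in projects.items()
            if deps.get(lib) in bad_versions]

projects = {
    "billing":  {"libssl": "1.0.2", "requests": "2.31.0"},
    "frontend": {"requests": "2.19.0"},
}
advisory = ("requests", {"2.19.0", "2.20.0"})
print(affected_projects(projects, advisory))  # ['frontend']
```

In practice this is what software-composition-analysis tools automate, but the core lookup is this simple once dependency versions are recorded somewhere queryable.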

What do vendors need to do better in terms of presenting patches to users? 

When you implement a patch, you’ve essentially got to restart or reboot the system, and that equates to downtime, lost revenue, and lost productivity. So one improvement is being able to deploy patches without having to reboot or restart a system. Another aspect is the all-or-nothing approach to patches. Vendors will often bundle new advanced features in with their security fixes, and some organizations may not want the new features, because they don’t integrate with their systems or they change their processes, but they do want the security patch. Ultimately they’re in a conundrum because it’s all or nothing: they either apply the patch and get the new features, or they don’t patch. A third idea is a rollback feature, similar to what database technologies have: if you apply a patch and get results you weren’t looking for, or it has a negative impact, you need to be able to quickly recover to the previous state.
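The rollback idea Tarun borrows from databases can be sketched as snapshot-then-restore: save the pre-patch state, apply the patch, and revert if a health check fails. This is a toy illustration of the pattern, not any vendor’s mechanism; the state dictionary and check are invented for the example.

```python
# Hypothetical sketch of patch rollback: snapshot state before patching,
# restore the snapshot if the patched system fails its health check.
import copy

def apply_with_rollback(system_state, patch, health_check):
    snapshot = copy.deepcopy(system_state)  # save the pre-patch state
    patched = patch(system_state)           # apply the (illustrative) patch
    if health_check(patched):
        return patched                      # keep the patched state
    return snapshot                         # roll back to the snapshot

state = {"version": "1.0", "services_up": True}
bad_patch = lambda s: {**s, "version": "1.1", "services_up": False}
healthy = lambda s: s["services_up"]

result = apply_with_rollback(state, bad_patch, healthy)
print(result)  # {'version': '1.0', 'services_up': True}
```

The key design point is that the snapshot is taken before any change and the decision to keep or revert is automatic, so recovery is quick rather than a manual scramble.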

From a practical matter, what is the beginning of that road to becoming more scientific about cybersecurity?

A lot of it starts with some basics in the culture. You have to have policies in place, identify the systems that need to be patched in priority order, and have that foundation. Then you have to make sure that the people who have the authority to patch also have the ability to execute.