We could stop the next massive cyberattack with some simple quality controls


WannaCry makes me “want to cry,” because the self-replicating ransomware payload was delivered by a worm that entered through a known vulnerability. A known vulnerability is just what it sounds like: a security flaw that has been identified and publicized, and for which a patch has been issued.

Those whose computers were owned by WannaCry had not patched their systems; the ransomware breached them because they failed to apply the security update.

The door was left unlocked. The bad guys simply walked right in.


If you leave your keys in your car, with the doors unlocked and the car running, don’t be surprised when your car is stolen.

This known vulnerability was the unlocked door: it let the ransomware in and let its self-replicating worm loose inside a network, where it infected every computer it could reach and then jumped to other networks as well.
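A known vulnerability is not a secret; anyone, including the attackers, can look it up in a public catalog. As a rough sketch (the endpoint and response fields below reflect my reading of NIST's public National Vulnerability Database v2.0 API, and are assumptions rather than anything in the record of the attack), a few lines of Python can pull the catalog entry for CVE-2017-0144, the SMBv1 flaw that WannaCry's worm exploited and that Microsoft's MS17-010 update had patched weeks before the outbreak:

    # Sketch: look up a known vulnerability in the National Vulnerability
    # Database (NVD). Endpoint and field names follow NVD's public v2.0
    # REST API as I understand it and may change over time.
    import json
    import urllib.request

    CVE_ID = "CVE-2017-0144"  # the SMBv1 flaw exploited by WannaCry's worm
    url = f"https://services.nvd.nist.gov/rest/json/cves/2.0?cveId={CVE_ID}"

    with urllib.request.urlopen(url, timeout=30) as response:
        data = json.load(response)

    for item in data.get("vulnerabilities", []):
        cve = item["cve"]
        # Print the English-language summary of the flaw.
        summary = next(d["value"] for d in cve["descriptions"] if d["lang"] == "en")
        print(cve["id"], "-", summary)

The point is that the flaw, its severity and its fix were all on the public record well before the attack.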

The problem of known vulnerabilities is far worse than personnel simply failing to update their software with every new patch.

Virtually every piece of software on the market today is sold with known vulnerabilities serious enough to have had patches issued.

But how is a buyer supposed to know whether the software or hardware they just bought ships with known vulnerabilities?

Is there a third party that checks software? No.

Is there any entity, like the Insurance Institute for Highway Safety (the crash-test-dummy guys), that independently and rigorously tests all software sold in the United States and then publishes the results? No.

Are there building codes for software that developers are forced to follow, enforced by inspections, fines, or halting sales of the software? No.

Are any of the supply-chain management systems that exist for food, which allow tainted food to be traced back to its source, in place for technology? No.

Do most manufacturers test their software for known or unknown vulnerabilities before releasing it for sale? No.

Is there any independent testing body that must be used, or that the industry as a whole has agreed should be used, before software is sold? No.

Are software manufacturers liable in any way for selling code riddled with defects and known vulnerabilities? No.

When you sign an End User License Agreement (EULA) upon purchase or download of software, does it shield the manufacturer from any legal action over its defective, vulnerability-ridden code? Yes.

Do EULAs shift liability for a flawed, defective product away from the manufacturer and onto the buyer? Yes.

Companies buying software should mimic the Mayo Clinic’s procurement protocol, which forces vendors to test their own software and share the results before Mayo will even consider buying it.

Companies can demand that software have no known vulnerabilities before they will consider purchasing it, and demand that specific testing tools be used to prove it.
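That kind of demand is easy to operationalize. As a sketch (the component name, version, and the OSV.dev endpoint and field names are my own illustrative assumptions, not part of any vendor's actual protocol), a buyer could require a bill of materials from the vendor and run each listed component through a free, public vulnerability database before signing a purchase order:

    # Sketch: check one component from a vendor's bill of materials against
    # the public OSV.dev vulnerability database before agreeing to buy.
    # The package below is a hypothetical example; the endpoint and field
    # names follow OSV's published API as I understand it.
    import json
    import urllib.request

    query = {
        "package": {"name": "django", "ecosystem": "PyPI"},  # hypothetical component
        "version": "3.2",                                    # hypothetical version
    }

    request = urllib.request.Request(
        "https://api.osv.dev/v1/query",
        data=json.dumps(query).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(request, timeout=30) as response:
        result = json.load(response)

    vulns = result.get("vulns", [])
    if vulns:
        print(f"{len(vulns)} known vulnerabilities -- reject or require patches:")
        for vuln in vulns:
            print(" -", vuln["id"], vuln.get("summary", ""))
    else:
        print("No known vulnerabilities recorded for this component version.")

Scanners that automate checks like this already exist; what is missing is buyers demanding their use.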

Or, companies can demand that any software they purchase be certified under the Cyber Assurance Program that Underwriters Laboratories has created.

Legislation could be introduced and passed requiring that, before the U.S. government even considers a software or hardware vendor for purchase, the vendor prove either that there are no known vulnerabilities in its systems or software or, if there are, disclose where they are and why they do not make the software or hardware vulnerable.

Government software purchasing should also require vendors to patch all known vulnerabilities that surface after the software is purchased, and to do so without compromising the software or any of its capabilities.

If a chief financial officer at a publicly traded company makes a material error or fails to address deficiencies revealed in the annual audit, that CFO is personally liable.

What if chief information officers (CIOs) or chief information security officers (CISOs) were required to undergo an independent security audit every year, with the results presented to the board of directors?

Or, what if the CIO or CISO were held personally liable for breaches?

Presently, there is no will in Congress to implement any one of these ideas. However, there is some promise in President Trump’s executive order on cybersecurity issued this month.

The best provision is the requirement that U.S. government agencies list their vulnerabilities and provide a rationale for why they remain unaddressed. That alone, if the National Command Authority then forces agencies to fix what they report, will be a huge step in the right direction.

Dan Perrin is the founder of the Council to Reduce Known Cyber Vulnerabilities.


The views expressed by contributors are their own and are not the views of The Hill.
