Last week, the congressional spotlight was again on Facebook, as former product manager Frances Haugen testified before a Senate subcommittee that the social media company continues to put profit ahead of user security. Many praised Haugen for her articulate accounting of the company’s practices and for releasing thousands of pages of proprietary Facebook documents to federal regulators, Congress and the press.
Yet in taking a stand, Haugen took on major legal risk. Because protections for technology-sector employees who blow the whistle on digital harms are limited, Facebook has the right, in theory, to sue Haugen for disclosing confidential information.
While Haugen’s public prominence may make her situation unique, the dilemma she faced, as an employee who feared her company was knowingly deploying technologies that harmed real people, is not.
The same Congress that hailed Haugen’s bravery now has the responsibility to act. Congress should pass a whistleblower protection act that creates a lawful channel for private technology-sector employees to disclose digital harms that threaten the public. A “harm” might be something concrete, such as a hardware vulnerability. It might also be something less clear-cut, such as evidence of a pattern of willful misconduct. The act must also shield whistleblowers from employer retaliation.
The absence of such a law is dangerous and anachronistic. Digital vulnerability is personal and public vulnerability: what happens online and over networks does not merely augment our daily lives; it defines, underpins and shapes them. Protecting whistleblowers who disclose evidence of potential malfeasance is therefore a matter of both domestic and national security. We need to incentivize the reporting of potentially major security flaws to ensure that the public good takes precedence over private interest.
As a country, we have long recognized this: it’s why there are whistleblower protection laws for federal employees, and it’s why the Securities and Exchange Commission (SEC) whistleblower program, under which Haugen has filed complaints, offers not only the promise of non-retaliation but also a financial reward. And it’s why government agencies offer anonymous bug bounty programs to incentivize the reporting of technical flaws under specific circumstances.
It’s time to update whistleblower laws to include digital privacy and security concerns. What this will look like in practice is not yet defined; a narrow private-sector law might cover only the disclosure of discrete vulnerabilities in software or hardware, while a more expansive law might cover the type of broader, and arguably less definable, digital harm identified by Haugen. But as the law currently stands, an employee who seeks to report even a specific, known flaw in a product that a company refuses to fix or disclose has no legal protections, even if that flaw is reported directly to certain government agencies.
One possible path forward is to expand the Consumer Product Safety Act (CPSA) to explicitly include a definition of digital harms within its “duty to report” provision, one that could include vulnerabilities in software, hardware, networks or processes, as well as evidence indicating harm to the public good. Importantly, the CPSA covers not only proven harms but also potential ones: issues that create an unreasonable risk of future harm.
If such an expansion were passed, relevant employees would have access to a reporting process and would be protected from employer retaliation, much as the Occupational Safety and Health Administration (OSHA) shields those who complain about workplace safety violations. Such a law should apply clearly to both current and former employees, and it should preclude not only workplace retaliation but also legal action over the breach of non-disclosure agreements or the disclosure of proprietary information made lawfully in the course of the complaint.
Of course, no whistleblower protection regime can give employees carte blanche to disclose company intellectual property. But this concern should not be an obstacle to creating either a new law or a new protected category that enables technology-sector employees to act when their company will not.
Frances Haugen took a risk, one that fear may have dissuaded many other employees from taking. We have no idea how many other “Haugens” are out there, or what information they could share to ensure the security of the United States and its citizens.
It doesn’t have to be this way. We need to create a culture of safety in the U.S. around technological and cybersecurity issues, one in which the onus is on the company to put the protection of consumers and employees ahead of its bottom line. A whistleblower law will help get us there.
Mary Brooks is a resident fellow on the Cybersecurity and Emerging Threats team at the R Street Institute. Follow her on Twitter @Mary_K_Brooks.