The feds have work to do to defend against future cyberattacks
On paper, few organizations in the world should be better equipped to anticipate and repel cyberattacks than the U.S. federal government. The federal civilian workforce numbers 2.7 million employees, more than any private company in the world, and its information technology budget is set to clock in at $95.7 billion in 2018.
However, the vast majority of those resources — more than 75 percent — are devoted to operating and maintaining legacy systems. The Office of Personnel Management data breach demonstrated the vulnerability of these increasingly outdated systems: in June 2015, the government announced that the personal information of more than 21.5 million current and former government employees had been compromised.
Plans are underway to modernize federal IT infrastructure and migrate data to more modern cloud services. Following up on the cybersecurity executive order President Donald Trump handed down in May, the White House’s American Technology Council recommends that data systems transition away from on-site mainframes to shared cloud services accredited through FedRAMP, the governmentwide cloud security authorization program. This transition promises cheaper, scalable services, increased data security, better functionality and the simplicity of consolidating myriad systems into a common platform.
Alas, the history of large-scale federal IT modernization projects suggests this all may be easier said than done. Look no further than the Healthcare.gov debacle, which resulted in cost overruns attributed to poor contractor management. Likewise, the Social Security Administration’s planned revamp of its Disability Case Processing System wasted nearly $300 million on a dysfunctional product. A post-mortem audit of the failed DCPS revamp attributed its failure to poor management and vague plans.
Similarly, the FBI suffered multiple setbacks when it tried to implement the Virtual Case File system in the early 2000s. The project was plagued by management issues and turnover, entrenched contractors and a lack of engineers. A glaring example of this inefficiency was the bureau’s decision to recreate software that could have been purchased on the market with a few modifications. The effort ultimately was abandoned after four years and more than $170 million spent. The replacement “Sentinel” case file system itself cost more than $425 million, suffered multiple delays and eventually was implemented after six years.
Government IT failures like these paved the way for the U.S. Digital Service and 18F, two governmental entities tasked with helping federal agencies execute technical projects. While these efforts represent an innovative approach with a good track record on execution, USDS has been criticized for growing too big, too quickly under the Obama administration, while 18F has been hampered by cost overruns and inefficient staffing allocations. It remains to be seen whether these efforts will produce cost savings for federal agencies or merely add to the bureaucracy surrounding federal IT implementation over the long run. Indeed, public comments submitted to the ATC for the council’s recent report on modernization suggest that unnecessary public-sector duplication of private-sector efforts is a common concern.
Finding money for mammoth IT projects is another issue. While the federal budget for information technology rose to $94 billion in 2017, the amount spent on modernization and enhancement has fallen by more than $7 billion since 2010. According to a 2016 report by the Government Accountability Office, some agencies still use systems with components that are at least 50 years old, including a U.S. Treasury Department database, used to assess taxes and generate tax refunds, that is written in 56-year-old assembly language code. A variety of federal agencies rely on programs written in COBOL, a programming language developed in the 1950s and 1960s. Maintaining such systems is increasingly difficult, given that the language is rarely taught anymore.
Continued use of legacy software and hardware also poses cybersecurity problems. Congressional hearings following the OPM hack pointed to the agency’s failure to shut down or secure 11 legacy computer systems previously identified as insecure. The breach cost taxpayers more than $350 million in recovery, victim notification and identity theft services. Subsequent attacks underscore vulnerabilities common in older systems. The WannaCry ransomware attack in May 2017 spread through devices running older, unsupported versions of Microsoft Windows. The resulting disruption cost an estimated $4 billion and affected computers in 150 countries. Compared to older systems, newer cloud systems can adapt to attacks and are more actively maintained and monitored for vulnerabilities.
Migration of federal IT systems to cloud services could be a boon if done right, or an expensive boondoggle if mishandled. To avoid the latter, the White House should execute the transition openly and transparently, soliciting broad input at each stage. Policymakers should also focus on clearing accreditation and compliance hurdles for federal vendors, including simplifying the procurement and provider-compliance process across federal agencies and reducing the reporting burden on federal IT vendors. For example, the government should allow reciprocity in the FedRAMP cloud accreditation process so that a single application works for all agencies.
Agencies will also need incentives and clear direction to overcome institutional inertia and prioritize spending for new systems. With these considerations in mind, the transition to cloud services may avoid the pitfalls of previous government IT projects.
Anne Hobson (@AnneLHobson) is an associate fellow of the R Street Institute. Danielle Barden is a research assistant at George Mason University.