Congress is about to respond to the Office of Personnel Management cyberattack with a move as ill-considered as it will be ineffective. Rather than focus on pushing both the government and private sector to do obvious things to make their networks more secure, it wants to pass a bill that will put even more data in the hands of agencies that aren’t themselves secure, while immunizing corporations regardless of their own preparedness.
Several years ago, when it came to describing cybersecurity threats, Congress imagined China stealing the intellectual property of the private sector. “[D]angerous economic predators, including nation-states like China, use the Internet to steal valuable information from American companies and unfairly compete with our economy,” former House Intelligence Committee Chair Rep. Mike Rogers wrote in a 2012 U.S. News & World Report op-ed. His preferred solution, the Cyber Intelligence Sharing and Protection Act (CISPA), accordingly emphasized two-way exchange: the government would share information on cyberattacks with the private sector, and companies could hand over any personal information not clearly unrelated to the details of an attack. That second stipulation rightly alarmed privacy advocates, as it potentially gutted a number of legal restrictions on the sharing of digital data.
Rogers, who was reportedly among the millions of current and former government employees whose data was stolen in the OPM attack, may now understand that accelerated information-sharing with the private sector won’t protect America.
The OPM hacks were the largest cyberattacks in U.S. government history: the personal information of 4.2 million government employees and the security clearance materials of 21.5 million employees, contractors, and their families were stolen. The hacks also exposed a number of flaws in Rogers’s thinking. No additional information-sharing would have helped. Not only did the government already have information about a related hack on Anthem, which administers health benefits for federal employees, but OPM itself was closely monitoring that hack. Yet, as a series of audit reports made clear, OPM had a number of weaknesses in its own security, and it didn’t realize it was being attacked via a similar mechanism.
“Until proven otherwise, this is in fact the worst national security disaster this country’s ever experienced.”
It turns out the government wasn’t even taking basic precautions to protect some of the most sensitive data of this country: the security clearance applications, adjudication information, and fingerprints of security-cleared professionals.
Commentators have described the potential impact of the OPM hack in alarming terms, warning that the counterintelligence damage could take decades to repair. Longtime CIA veteran Charles Allen called it “a national security risk unlike any I’ve seen in my 50 years in the intelligence community.” He worried, in particular, about the ability of the presumed culprit, China, to alter security clearance information to make it easier for spies to infiltrate U.S. agencies. Mike Adams, who spent the past 15 years in information security following his retirement from the U.S. Army Special Forces, said in an interview, “Until proven otherwise, this is in fact the worst national security disaster this country’s ever experienced.”
This is a counterintelligence goldmine for China, the same “predator” Rogers envisioned stealing private, not government, data. In this case, though, the country allegedly acquired some of the most valuable information imaginable about our national security personnel.
Congress has blamed the White House. At a hearing on the OPM hack, Cybersecurity Subcommittee Chair Rep. John Ratcliffe claimed, “The White House is essentially calling on federal agencies to do in the next 30 days what they were already required to do.” Unfortunately, Ratcliffe went on to say, “Cybersecurity should not be a sprint exercise; but rather a marathon”—and that’s part of the problem. For years, no one in the U.S. government has treated cybersecurity with the urgency it deserves.
The White House’s 30-day Cybersecurity Sprint required immediate implementation of measures—such as two-factor authentication and increased scrutiny of privileged users—that agencies had been adopting only slowly over the last 13 years. And Congress, too, has failed to use the tools at its disposal. Even intelligence agencies keep getting extensions for implementing the kind of network monitoring that might have prevented the Snowden leaks—or some hacks. More recently, the Senate Appropriations Committee denied OPM’s request for an additional $37 million to speed up its network modernization as part of its response to the hack.
CISA would have prevented neither the OPM hack nor any of the other recent major hacks like Sony or Target.
Beyond the finger-pointing, a few hearings, and demands for the resignation of OPM head Katherine Archuleta, Congress’s initial instinct has been to refocus on what it was already doing: trying to pass the Cybersecurity Information Sharing Act (CISA), the successor to Rogers’s information-sharing bill. But CISA would have prevented neither the OPM hack nor any of the other recent major hacks like Sony or Target. In fact, passing CISA as the response to the OPM hack would be counterproductive—most obviously because the government has far more urgent things to do to protect itself before it takes on a new role in information-sharing.
Moreover, if CISA worked as planned, the private sector would share far more data with the federal government than it currently does—in the wake of evidence that government networks are not secure. Audits show that the Departments of Energy, Treasury, and Commerce have their own cybersecurity vulnerabilities, yet all three would be among the agencies that would immediately receive any data shared with the federal government under CISA. More troubling still, the Department of Homeland Security, which would have the lead role in information-sharing, was attacked last year via the same approach used against OPM—through a security clearance contractor—and has struggled with cybersecurity issues in recent years.
CISA’s approach of offering immunity in exchange for information-sharing may lead to sloppier cybersecurity practices among corporations that aren’t otherwise pressured to improve. Since corporations will gain immunity by sharing their customers’ information, they can’t be sued for their negligence—currently one source of pressure on corporations to improve. As Sen. Ron Wyden said just days after disclosure of the OPM hack, “This particular cyber security bill is largely focused on trying to make it more difficult for individuals to be able to take on corporations.”
Immunizing corporations may make it harder for the government to push companies to improve their security. As Wyden explained, while the bill would let the government use data shared to prosecute crimes, the government couldn’t use it to demand security improvements at those companies. “The bill creates what I consider to be a double standard—really a bizarre double standard in that private information that is shared about individuals can be used for a variety of non-cyber security purposes, including law enforcement action against these individuals,” Wyden said, “but information about the companies supplying that information generally may not be used to police those companies.”
After seeing banks go free for their role in crashing the economy, should we really be offering corporations further immunity for self-reporting?
Financial information-sharing laws illustrate why Wyden is concerned. Under that model, banks and other financial institutions are obligated to report suspicious transactions to the Treasury Department; in return, as under CISA, they receive immunity from civil suits as well as “consideration” in case of sanctions for self-reporting. “Consideration” means that enforcement authorities take a financial institution’s cooperation with the legally mandated disclosures into account when deciding whether to sanction it for any revealed wrongdoing. Perhaps as a result, despite abundant evidence that banks have facilitated crimes—such as money laundering for drug cartels and terrorists—the Department of Justice has not managed to prosecute them. When asked during her confirmation hearing why she had not prosecuted HSBC for facilitating money laundering while presiding over an investigation of the company as U.S. Attorney for the Eastern District of New York, Attorney General Loretta Lynch said there was not sufficient “admissible” evidence to indict, suggesting prosecutors had information they could not use.
All these reasons why CISA would be a foolish response to the OPM hack don’t even begin to get into the privacy concerns. The bill supersedes the Electronic Communications Privacy Act and all other privacy protections, permitting companies to voluntarily share their customers’ data with the government. But according to a new DOJ Inspector General report on the Federal Bureau of Investigation’s cybersecurity efforts, even private-sector partners are hesitant to share information with the FBI out of concern for their customers’ privacy.
There’s some indication that the Senate is waking up to the misplaced priorities CISA represents. Along with five colleagues, Virginia Sen. Mark Warner last week introduced the Federal Information Security Management Reform Act, a bill that would prioritize the government’s own defenses. “The attack on OPM has been a painful illustration of just how behind-the-curve some of our federal agencies have been when it comes to cybersecurity,” he said in a press release. The bill would require all agencies to submit to real-time monitoring and would also permit DHS to take “countermeasures” against intruders (a provision, also included in CISA, that would likely cause additional problems). But like CISA, it supersedes privacy protections to foster information-sharing, including with outside contractors.
Increased attention on the susceptibility of networked cars—heightened by, but not actually precipitated by, the report of a successful remote hack of a Jeep Cherokee—led two other senators, Ed Markey and Richard Blumenthal, to adopt a different approach. They introduced the Security and Privacy in Your Car (SPY Car) Act, which would require privacy disclosures, adequate cybersecurity defenses, and additional reporting from companies making networked cars; it would also require that customers be allowed to opt out of letting those companies collect data from their cars.
“We cannot sustain this model if our software could actually seriously injure or kill us.”
The SPY Car Act adopts a radically different approach to cybersecurity than CISA in that it requires basic defenses from corporations selling networked products. Whereas CISA supersedes privacy protections for consumers like the Electronic Communications Privacy Act, the SPY Car Act would enhance privacy for those using networked cars. Additionally, while CISA gives corporations immunity so long as they share information, SPY Car emphasizes corporate liability and regulatory compliance.
That’s a lot easier to do in the automotive industry, which already faces heavy safety and environmental regulation. Jeffrey Vagle, executive director of the Center for Technology, Innovation and Competition at the University of Pennsylvania Law School, explains that in software, the approach has long been to push liability onto the consumer via terms of service agreements “to encourage innovation by pushing risk away from the manufacturers and onto consumers.” But the car hacks may change that, he says. “We cannot sustain this model if our software could actually seriously injure or kill us.”
At the Aspen Security Forum in July, former Homeland Security Secretary Michael Chertoff largely agreed, admitting that the Internet of Things—the term used to generally refer to objects connected to the Internet—may require certain standards. “It may be that certain things like the ability to remotely patch, or some limitations on the degree of connectivity,” he said, “need to be embedded, and just as we don’t allow adulterated food in the marketplace, or unsafe automobiles or automobiles without airbags, maybe we have to have similar kinds of requirements in the architecture on the device side.”
But if the Internet of Things requires such standards because hacking a car or a pacemaker could kill someone, why don’t Internet applications that might similarly ruin someone’s life require them? If the SPY Car Act gets any traction, it ought to raise real questions about the current model of pushing liability for insecure software onto consumers.
These are weighty questions of the sort Congress should be considering after the nation’s intelligence agents have had their lives and covers endangered as surely as if they’d been in a car crash. Instead, Congress looks poised to take the easy and ineffectual route.
Photo via Matthew Straubmiller/Flickr (CC BY 2.0) | Remix by Max Fleishman