This one matters, too: Carnegie Mellon issues guide to disclosing software vulnerabilities responsibly

"This is not a technical document ... This is about a very human process," says one of the authors from the prestigious Software Engineering Institute at Carnegie Mellon.
The Software Engineering Institute at Carnegie Mellon University in Pittsburgh (Source: SEI)

Over the past year or so, there’s been an explosion of interest in vulnerability disclosure policy — the question of what to do when security researchers find software flaws that need patching lest they be exploited by hackers. Both the Defense Department and the General Services Administration have launched bug bounty programs to reward researchers who responsibly report security flaws they find, and the National Telecommunications and Information Administration’s multistakeholder process published a guide to coordinated vulnerability disclosure, or CVD.

Even the Justice Department has gotten in on the act — putting out a set of legal guidelines for companies and other organizations interested in establishing a vulnerability reporting and fixing process.

So you would think the publication of yet another set of guidance would be anticlimactic and might even be ignored. But you’d be wrong.

The prestigious Software Engineering Institute at Carnegie Mellon University published its Guide to Coordinated Vulnerability Disclosure on Tuesday, and the cybersecurity community is likely to pay close attention. The authors work at the institute’s CERT Coordination Center — celebrated as the place that pioneered the Computer Emergency Response Team model for coordinated vulnerability disclosure in the first place.

“We were joking internally that we should have written this five years ago,” when they would have had the field more or less to themselves, report co-author Art Manion noted.

As it is, the issue of vulnerability disclosure has been propelled to center stage in the past year — and even entered mainstream public consciousness following the WannaCry and NotPetya attacks earlier this year. A debate has been raging in policy circles about how the government should treat newly discovered vulnerabilities known as zero-days, and there is competing data on which to base the conflicting positions.

Bottom line: There has been a lot more attention paid to the issue in recent months than ever before — and there are a lot more players on the field.

As a result, these days a big chunk of the work involved in producing cybersecurity best-practices guidance is making sure it doesn’t contradict or conflict with other advice already out there or in the works.

“There were a number of efforts to write documents like this,” said Manion, citing as examples the NTIA multistakeholder process at the end of last year and the International Organization for Standardization’s standards ISO/IEC 29147 and 30111 — the latter of which is due for redraft next year.

“I was on a working group that wrote one of the NTIA documents,” he said. At the institute, “We invested time in making sure we were aligned with those efforts.” The SEI guide took six to eight months to produce, with the four authors working “in fits and starts,” he said.

Manion acknowledged that “there was some concern about duplication” with other efforts, but said the real issue was avoiding conflicting or contradictory advice.

“There’s bound to be overlap” between different sets of guidance aimed at different audiences, he said. That didn’t matter if the advice was consistent where it overlapped.

Nonetheless, he said, there was something uniquely valuable about the perspective the CERT/CC could offer.

“This is not a technical document,” Manion said; rather, the authors had sought to illuminate the motivational dynamics of different players under a very wide variety of circumstances. “This is about a very human process … What should you do when you find a [software] vulnerability? Who do you tell? What should that person do?”

That chain of disclosure and fixing, he said, from security researcher to manufacturer to patch, is simple in theory. “If you look at the big building blocks — have a [vulnerability reporting] program in place; accept [vulnerability] reports; triage them; …  make [publish and distribute] software fixes — the outline is pretty straightforward.”
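
To make that outline concrete, here is a minimal sketch of those building blocks as a single workflow: receive a report, triage it, develop a fix, publish an advisory. The class and function names are hypothetical illustrations for this article, not anything drawn from the SEI guide itself.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum, auto


class Status(Enum):
    RECEIVED = auto()       # report accepted and acknowledged
    TRIAGED = auto()        # severity assessed, work prioritized
    FIX_DEVELOPED = auto()  # patch built and tested
    PUBLISHED = auto()      # advisory and fix distributed


@dataclass
class VulnReport:
    """A vulnerability report moving through a hypothetical CVD pipeline."""
    reporter: str
    summary: str
    severity: str = "unknown"
    status: Status = Status.RECEIVED
    history: list = field(default_factory=list)

    def log(self, note: str) -> None:
        self.history.append((date.today().isoformat(), note))


def accept(report: VulnReport) -> VulnReport:
    report.log(f"Report received from {report.reporter}; acknowledged.")
    return report


def triage(report: VulnReport, severity: str) -> VulnReport:
    report.severity = severity
    report.status = Status.TRIAGED
    report.log(f"Triaged as {severity}.")
    return report


def develop_fix(report: VulnReport) -> VulnReport:
    report.status = Status.FIX_DEVELOPED
    report.log("Fix developed and tested.")
    return report


def publish(report: VulnReport) -> VulnReport:
    report.status = Status.PUBLISHED
    report.log("Advisory and patch published; reporter credited.")
    return report


if __name__ == "__main__":
    r = accept(VulnReport(reporter="researcher@example.com",
                          summary="Buffer overflow in parser"))
    r = publish(develop_fix(triage(r, severity="high")))
    for when, note in r.history:
        print(when, note)
```

As Manion notes below, the hard part is not this happy path but the exceptions around it.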

Indeed, he said, as software rushes into every corner of our lives and every facet of the manufacturing business, CVD “should become a standard business practice” like accounting.

But in real life, the devil is in the details: “There’s a world of exceptions … Benign things like time differences or the language barrier” can cause misunderstandings about exactly when publication will take place, or how detailed it will be, even when it involves only two parties — the researcher who finds the vulnerability, and the manufacturer who must fix it.

“Different parties are differentially motivated and … what’s best for everyone overall may not always be best for each one individually … A bit of game theory creeps in,” Manion said.

In a blog post about the guide, co-author Allen Householder waxes almost philosophical about the CVD process: “If we have learned anything in nearly three decades of coordinating vulnerability reports at the CERT/CC, it is that there is no single right answer to many of the questions and controversies.”

The guide, he concluded, “is a summary of what we know about a complex social process that surrounds humans trying to make the software and systems they use more secure.”
