Are Failures in Cyber Security an Ethics Lapse?

Media Division | September 19, 2017

Innovation has always outpaced regulation, yet issues of ethics arise in every industry for which that statement holds true. Innovators want to map the human genome, and some would even alter it, which poses a number of ethical issues. On the one hand, what if you could cure a genetic condition? Wouldn’t a parent want their child to have every opportunity for health and wellness? On the other hand, at what point does that become the next level of social engineering, further devaluing individual human differences and treating differences in health as weaknesses, when some of the greatest accomplishments have occurred despite, or perhaps even because of, such presumed setbacks?

Another modern hot topic of regulation is the fully automated self-driving car: presuming such technology arrives, how will it be regulated? If swerving to avoid a collision would save one life but cost another, what would you want the car to do? While perhaps less a matter of life and death, issues of cyber intelligence raise similar questions of ethics. When cyber security fails, is a lapse in ethics behind it?

Some of the Bigger Questions

The recent ransomware attacks (WannaCry, Petya, and NotPetya) brought ethics to the forefront. While the United States was not behind the attacks, some argued the US government was responsible. After all, the attacks relied on weaknesses in Windows first exploited by the NSA for its own spying. The NSA did eventually notify Microsoft, which released a patch. But that was not the only issue; you could make a case for a number of responsible entities and individuals:

  • The NSA (or any other cyber intelligence agency) — They knew about the vulnerabilities. Should they have informed Microsoft sooner?
  • Microsoft (or other software developers) — Could they have done more to ensure everyone was updated against the security weaknesses?
  • Individual companies — Was each organization responsible for ensuring adequate cyber intelligence monitoring, and thus swiftly updating systems?
  • Cryptocurrency — Without the unregulated payment method of Bitcoin, could ransomware really grow to such proportions?
  • End users — Does every individual who uses an internet-connected device, even a shared hospital workstation, bear direct responsibility for updates and protection?

What’s more, if these attacks have died down, why are we even asking such questions?

Protecting the Future with Ethics

It would be positively ludicrous to assume that this wave of viruses was the last. Each year has been called “the biggest year in cyber security history,” because attacks have grown in both frequency and magnitude, year after year. The ease with which information spreads in the digital age means that threat actors can access one another’s methods as easily as you check email on your smartphone.

It would be equally preposterous to assume that the NSA’s bag of tricks has all been seen. Other exploits almost certainly exist and could likewise be leaked and weaponized. All of this finger-pointing and question-asking amounts to one simple truth: it is impossible to fully regulate every possible scenario. Instead, what is needed is a system of ethical agreements and best practices for cyber security across all sectors.

It Starts With You

Ethics, and thus security, starts with you. How many users free-ride on the internet (that is, use internet access without maintaining security updates)? How many people view or otherwise access tools and content without securing the explicit rights to do so?

We live in an era of intellectual property that extends beyond mere physical goods. Security begins with a personal choice to value ethics over information. From there, insist on ethical cyber intelligence monitoring within your own business interactions.

Massive's Media Division publishes timely news and insights based on current events, trends, and actionable cross-industry expertise.