Dark Reading is part of the Informa Tech Division of Informa PLC


Edge Articles

10/30/2019
04:30 PM
Curtis Franklin Jr.

Cybersecurity's 'Moral Imperative'

Cybersecurity professionals often talk about the economic drivers of security. But should the conversation shift to include a moral component? At least one analyst says "yes."

(Image by Photocreo Bednarek, via Adobe Stock)

When an audience files into a keynote session at a computer industry conference, it can be primed to hear many different words. "Moral imperative" is rarely among them. But those are exactly the words that were part of the opening at last week's Gartner Symposium.

Mbula Schoen, senior principal analyst for Gartner, was charged with talking about business' role in a digital society, which she defined as "the sum of all our interactions between human and technology." As part of that responsible business role, she said, companies must invest in a safe digital society while protecting the enterprise.

And to put a fine point on it, she told the audience, "Security is a moral imperative in a digital society." That moral imperative covers the responsibility a company has to society at large, as well as to all of the organization's stakeholders: partners, employees, customers, and shareholders.

But what does that imperative look like when turned into action? Schoen had several examples of issues IT security teams should be looking for in their work. One of the first she talked about was inappropriate use of technology.

Big, splashy examples of inappropriate technology use aren't hard to find. Schoen pointed to the drones that were sighted near England's Gatwick airport, closing it for 33 hours in December 2018. More insidious cases, she pointed out, could be in bias introduced in AI systems.

Researchers have known for years that AI bias is a potential issue. But the problem took on heightened urgency when it was recently shown that some AI models favored white patients over black patients in healthcare treatment decisions. With Gartner data showing that 30% of organizations will use AI to make decisions by 2022, the potential impact of those biases becomes critical.

In another example, Schoen pointed to the increasing collection of personal data for use by businesses. The data is being collected, processed, and stored, often without the understanding of the customer. And each of those steps requires security.

"Finding data to collect isn't hard, but society is skeptical about how it's being used," she explained. As a result, "There is more regulation of privacy than ever before, and less privacy."

ISC(2), the organization behind the CISSP and other cybersecurity certifications, also sees moral and ethical components to cybersecurity.

"I think [morality is] very relevant today. It's about doing the right thing for society," said COO Wesley Simpson in an interview at the ISC(2) Security Congress, in Orlando this week. "For every one of our 145,000 members, it's not just about passing an exam or getting endorsed. The third component is that you have to accept, abide by, and live up to our ethical canons. That gets to the moral obligation of our members."

Simpson pointed out that ISC(2) has revoked, and will continue to revoke, the certifications of members found to have violated moral or ethical standards within cybersecurity.

Back at the Gartner Symposium, Schoen said finding data to collect is easy, but society has become skeptical about how that data is being used and secured. To build greater trust, she said, companies must institute solid information governance and provide greater transparency regarding security and privacy controls.

Finally, Schoen said every organization should institute "three Ds" regarding using and protecting user data:

  • Decide to manage security and risk to protect all stakeholders
  • Design to be a responsible custodian of customer data
  • Drive to identify and build a societal value proposition

Here at the Edge we're curious: How important is the moral component in your cybersecurity work? Is it the driving factor in what you do, or is morality a word best left out of the conversation among cybersecurity pros? Let us know what you think in the Comments section, below.

Related Content:

This free, all-day online conference offers a look at the latest tools, strategies, and best practices for protecting your organization's most sensitive data. Click here for more information and to register.

Curtis Franklin Jr. is Senior Editor at Dark Reading. In this role he focuses on product and technology coverage for the publication. In addition he works on audio and video programming for Dark Reading and contributes to activities at Interop ITX, Black Hat, INsecurity, and ...

Comments
ChrisK92103
User Rank: Author
10/30/2019 | 5:50:05 PM
Efficacy is a moral responsibility
We now live in such a competitive era that I worry we sometimes rush security capabilities and products out the door without adequate QC. It takes proper moral fiber to invest richly and deeply in quality assurance, customer success, and verifying that security technologies are fit for purpose. The company bottom line needs to be as important as these things, and this value makes the company's product more valuable.
J@wn007
User Rank: Strategist
10/31/2019 | 2:39:42 PM
CISSP Ethics
My old ISC2 card used to state that I promised to uphold the "highest ethical standards," not simply the most financially convenient ones.