When an audience files into a keynote session at a computer industry conference, it comes primed to hear many different words. "Moral imperative" is rarely among them. But those are exactly the words that opened last week's Gartner Symposium.
Mbula Schoen, a senior principal analyst at Gartner, was charged with talking about business's role in a digital society, which she defined as "the sum of all our interactions between human and technology." As part of that responsible business role, she said, companies must invest in a safe digital society while protecting the enterprise.
And to put a fine point on it, she told the audience, "Security is a moral imperative in a digital society." That imperative covers the responsibility a company has to society at large, as well as to all of the organization's stakeholders: partners, employees, and customers, as well as shareholders.
But what does that imperative look like when turned into action? Schoen had several examples of issues IT security teams should be looking for in their work. One of the first she talked about was inappropriate use of technology.
Big, splashy examples of inappropriate technology use aren't hard to find. Schoen pointed to the drones sighted near England's Gatwick Airport, closing it for 33 hours in December 2018. More insidious cases, she noted, can involve bias introduced into AI systems.
Researchers have known for years that AI bias is a potential issue. But the impact of bias took on heightened urgency when it was recently shown that some AI models favored white patients over black patients for healthcare treatment. With Gartner data showing that 30% of organizations will use AI to make decisions by 2022, the potential harm from those biases reaches a critical level.
In another example, Schoen pointed to the growing collection of personal data for use by businesses. That data is being collected, processed, and stored, often without the customer's understanding. And each of those steps requires security.
"Finding data to collect isn't hard, but society is skeptical about how it's being used," she explained. As a result, "There is more regulation of privacy than ever before, and less privacy."
ISC(2), the organization behind the CISSP and other cybersecurity certifications, also sees moral and ethical components in cybersecurity.
"I think [morality is] very relevant today. It's about doing the right thing for society," said COO Wesley Simpson in an interview at the ISC(2) Security Congress, in Orlando this week. "For every one of our 145,000 members, it's not just about passing an exam or getting endorsed. The third component is that you have to accept, abide by, and live up to our ethical canons. That gets to the moral obligation of our members."
Simpson pointed out that ISC(2) has revoked, and will continue to revoke, the certifications of members found to have violated moral or ethical standards within cybersecurity.
Back at the Gartner Symposium, Schoen said finding data to collect is easy, but society has become skeptical about how that data is being used and secured. To build greater trust, she said, companies must institute solid information governance and provide more transparency regarding security and privacy controls.
Finally, Schoen said every organization should institute "three Ds" for using and protecting user data:
- Decide to manage security and risk to protect all stakeholders
- Design to be a responsible custodian of customer data
- Drive to identify and build a societal value proposition
Here at the Edge we're curious: How important is the moral component in your cybersecurity work? Is it the driving factor in what you do, or is morality a word best left out of the conversation among cybersecurity pros? Let us know what you think in the Comments section, below.