In recent years, concerns about the erosion of trust and the invasion of privacy have extended into nearly every interaction between customers, organizations, and devices. Lawmakers have continued to respond with new privacy and data protection laws that put pressure on affected industries.
Simultaneously, the positive effects of proper privacy protection are becoming more apparent. A proactive approach to transparency and privacy creates a competitive differentiator for enterprises by fostering increased productivity and sales, improving public image, and enhancing customer trust.
Gartner compiled strategic predictions for the future of privacy, looking at the impact privacy will have on organizations this year and beyond. Security and risk management leaders can use these predictions to avoid pushback, find opportunity, and create value for their organizations.
1. By 2023, organizations that do not excessively monitor remote working employees will experience up to 15% higher productivity than those that do.
Amid the shift to remote work during COVID-19, many employers have stepped up tracking of the activities of employees who work remotely. While employers may have legitimate reasons to monitor employees, such as scanning for security threats, leaders must be mindful of respecting employees' privacy.
Excessive monitoring can erode trust and harm the employer-employee relationship as well as the overall corporate image. Security leaders must ensure monitoring measures strike the appropriate balance between the organization's needs and employees' right to privacy. This will help employers build trust with employees and ensure higher productivity in the long run.
2. By 2023, organizations embedding privacy user experience into customer experience (CX) will enjoy greater trustworthiness and up to 20% more digital revenue than those that don't.
Consumers want to know how their personal data is being used, and they are more trusting of companies that are transparent about data usage. Once customers trust an organization, they are more likely to be loyal, to recommend that company, and to buy more products and services.
Organizations can turn privacy compliance into a revenue generation opportunity by making privacy central to the CX. It is imperative to consistently incorporate transparency and choice into all CX and personalization efforts. The privacy user experience (UX) consists of clear, simple language; full disclosure of the purpose of every interaction and the data processed for it; choice through consent and preference management; and easy access to exercising privacy rights. Ideally, all of this is centralized in a consumer-facing self-service portal. By doing this, organizations can increase their trustworthiness and improve customer satisfaction and loyalty, thereby increasing revenue opportunities.
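To make the consent and preference management component concrete, the sketch below models a per-customer consent record. It is illustrative only: the class name, purposes, and default-deny behavior are assumptions for this example, not a reference to any specific consent-management product or regulation.

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Tracks a customer's consent per processing purpose (illustrative sketch)."""
    customer_id: str
    consents: dict = field(default_factory=dict)  # purpose -> bool

    def grant(self, purpose: str) -> None:
        self.consents[purpose] = True

    def revoke(self, purpose: str) -> None:
        self.consents[purpose] = False

    def is_permitted(self, purpose: str) -> bool:
        # Default to False: if no consent was ever recorded for a
        # purpose, the data must not be processed for that purpose.
        return self.consents.get(purpose, False)

record = ConsentRecord("cust-001")
record.grant("personalization")
record.revoke("third_party_sharing")

print(record.is_permitted("personalization"))      # True
print(record.is_permitted("third_party_sharing"))  # False
print(record.is_permitted("analytics"))            # False: never granted
```

A self-service portal would expose exactly these operations (view, grant, revoke) to the customer directly, rather than burying them in a support workflow.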
3. By 2023, over 20% of organizations will use a data risk assessment (DRA) to identify and manage appropriate privacy controls, despite a lack of guidance from regulators on how to implement it.
Organizations face a changing world filled with an ever-increasing amount of data, which can lead to huge business opportunities when that data is properly used to develop or enhance products and services. However, organizations are simultaneously challenged with navigating an evolving international portfolio of privacy and data protection laws, creating significant business risks if data is used improperly. Guidance from privacy regulators on how to mitigate such risks is often inconsistent or lacks focus.
Companies can use a DRA to identify and analyze potential privacy and data protection risks. The results help assess the effectiveness of existing data security controls and identify gaps or inconsistencies that need further engineering. A DRA can also help address the compliance requirements of global data protection and privacy laws, reducing the risk of accidental disclosures, inappropriate data processing, and other data breaches.
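Since regulators offer little concrete guidance on how to implement a DRA, one common starting point is a simple scoring matrix over data assets. The sketch below is a hypothetical example: the asset names, 1-to-5 scales, scoring formula, and threshold are all assumptions, not a regulator-endorsed methodology.

```python
# Hypothetical DRA scoring sketch: residual risk rises with data
# sensitivity and falls with control coverage. All values illustrative.

ASSETS = [
    # (asset name, sensitivity 1-5, control coverage 1-5)
    ("customer_emails", 4, 4),
    ("payment_records", 5, 2),
    ("web_analytics",   2, 3),
]

def risk_score(sensitivity: int, coverage: int) -> int:
    # Higher sensitivity and weaker controls -> higher residual risk.
    return sensitivity * (6 - coverage)

def gaps(assets, threshold: int = 10):
    # Flag assets whose residual risk exceeds the threshold and
    # therefore need further control engineering.
    return [name for name, s, c in assets if risk_score(s, c) > threshold]

for name, s, c in ASSETS:
    print(f"{name}: residual risk {risk_score(s, c)}")
print("needs further engineering:", gaps(ASSETS))  # ['payment_records']
```

Even this crude matrix makes the output of a DRA actionable: the flagged assets become the prioritized backlog for control remediation.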
4. By year-end 2025, multiple Internet of Behaviors (IoB) systems will elevate the risk of unintended consequences, potentially affecting over half of the world's population.
The pervasiveness of monitoring sensors and Internet of Things devices, combined with the wide availability of massive datasets, enables an unprecedented evaluation of individual "behaviors" both on- and offline. An IoB system aims to capture, analyze, understand, and respond to these behaviors, with the goal of influencing them in return. To do so, it combines multiple sources of intelligence, such as commercial customer data, publicly available citizen data, social media, facial recognition, and location tracking.
These systems could lead to positive outcomes, such as improved public health. For example, during COVID-19, an IoB system could systematically monitor and analyze hand hygiene behavior, use face recognition-based analysis to determine mask usage, and use device- and video-based algorithmic confirmation to monitor social distancing. Through information feedback loops, including inclusion or exclusion decisions, these systems could then help drive behavior adjustment.
However, left uncontrolled, these systems could also produce negative outcomes, such as censorship or truth fabrication. There is therefore an ongoing debate around the position and reliability of algorithms, the ethics behind their decision making, individual rights and freedoms, and the protection of autonomy in IoB systems.
These debates must shape the acceptance parameters for IoB deployments. As IoB systems grow in scale, security leaders must ensure stability and consistency. Establish a framework for privacy, security, ethics, and interconnectivity that all connected entities must subscribe to; this further reduces the risk of unintended consequences.