
1/28/2020 06:40 PM

'Understand What You Believe': Fmr. FBI Agent Unpacks Information Threats

In the past few years, social media has transformed from a communications gold mine to a minefield of disinformation campaigns.

CPX 360 – New Orleans, La. – Social media, initially built to improve global communications, has become a weapon in the hands of cybercriminals launching disinformation campaigns.

Check Point CEO Gil Shwed opened his morning keynote at the CPX 360 conference with the world's growing interconnectedness, using the Olympics to illustrate just how connected we have become. The 1996 Atlanta Olympics generated 171 hours of coverage, he noted. Two decades later, the 2016 summer games in Rio de Janeiro generated 6,755 hours.

"Everything has to do with technology," Shwed said. Ticket reservations, hotel bookings, and viewings of Olympic events are all done online. "Every transaction around an event like the Olympics has to be built using the Internet and using the connected world." Still, the benefit of using the Internet to make the Olympics a more global event is countered by potential threats.

"Protecting all of our systems is a problem that's becoming bigger and bigger," he added.

This sentiment extended into a keynote talk by Clint Watts, former special agent for the FBI and distinguished research fellow at the Foreign Policy Research Institute. He spoke to the rapid growth of social media and its role as both a boon to communications and an evolving attack vector.

"The information landscape has changed," said Watts. People championed social media in 2011, when it was used to anonymously communicate during the Arab Spring. By 2016, only a few years later, those same platforms had entered a new era: "the rise of the trolls," as he put it.

Today's social networks, used by billions of people for legitimate communications, double as a space for the Internet's hecklers to sow mistrust and advance false narratives. "They have a very specific mission," Watts said. Individual trolls tend to tire over time and eventually leave their targets alone; now, however, we see armies of trolls discrediting people and hacking to power influence campaigns.

"In just 10 years, we went from social media and the Internet bringing us all together, to tearing us all apart," he said. Trolls pose a new threat to consumers and businesses alike. Fake news stories about corporations can cause stock prices to rise or fall, depending on the angle.

There are two components to building what Watts calls a "preference bubble," or the way in which users see social media content. One is an algorithm; the other is the reader. Each individual's social feed is uniquely tailored to them, he explained, and nobody else sees the world in that way. This makes it easier to spread false narratives: because the content you see was shaped for you alone, no one else can easily evaluate whether it deserves to be believed.
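The feedback loop Watts describes — an engagement algorithm amplifying whatever the reader already likes — can be sketched in a few lines. This is a hypothetical toy model, not any real platform's ranking system; the `rank_feed` function and the topic names are invented for illustration.

```python
from collections import Counter

def rank_feed(posts, liked_topics):
    """Toy ranking: score each post by how often the user has already
    liked its topic, so familiar topics float to the top of the feed."""
    return sorted(posts, key=lambda p: liked_topics[p["topic"]], reverse=True)

# A small catalog of posts spread evenly across three topics.
posts = [{"topic": t, "id": i}
         for i, t in enumerate(["politics", "sports", "science"] * 3)]

liked = Counter()
liked["politics"] += 1  # a single early "like" on a politics post

# Each round, the user likes whatever ranks first; the feed narrows.
for _ in range(5):
    feed = rank_feed(posts, liked)
    liked[feed[0]["topic"]] += 1

print(liked.most_common(1)[0][0])  # the one early like now dominates the feed
```

One early interaction is enough to lock the simulated feed onto a single topic — the algorithm and the reader together build the bubble, which is Watts' point.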

Why do people believe everything they see? Watts shared a few tactics that attackers commonly use to manipulate victims — for example, provoking emotions such as anger and fear.

"If I can scare you as an audience, you're more likely to believe what comes next," he said. "I can draw in followers by scaring people with news."

The Current and Future State of Disinformation

Disinformation campaigns rely on three biases, Watts continued. The first is confirmation bias: The more you like and follow topics online, the more of them you see. The second, implicit bias, refers to the unconscious attitudes we hold toward other people; as Watts described it, most humans prefer information from people who look and talk like them. This bias motivates attackers to create fake profiles that resemble the people they are working to discredit.

The third bias is status quo bias, an emotional preference for keeping things as they are. It means you may start shaping what you say, and what you share, because you want to stay within your tribe — posting only things your followers agree with and maintaining the status quo.

Watts then described three dynamic changes shaping the information landscape, starting with "clickbait populism," or the promotion of popular content and opinions. The more a person plays to the crowd's preferences, the more they will be promoted. Social media nationalism is another; this is the collective adherence to a digital identity drawn by hashtags and avatars. Then there is the death of expertise, or the belief that anyone on the Internet is just as smart as anyone else, regardless of their experience, training, education, or specialty, he explained.

Over time, the objectives, methods, and actors driving social media manipulation will change. Activists and extremist groups will have fewer resources; lobbyists and the very wealthy will have more. Attackers will need more sophistication to incite fear and provoke conflict, to create alternative information outlets, or to distort reality via pseudoscience and revised histories.

"What if there are whole communities of bots that talk to each other, and you are looking at one person, but it's really nine people talking?" he added. Threats on the horizon include trolling-as-a-service, or disinformation for hire, as well as social media influencers-as-a-weapon, pseudoscience firms, alternative universities, and cross-platform computational propaganda.

How can businesses respond to all this? Watts advised implementing a social media usage policy, security training for employees, and an insider threat program that goes beyond data loss. Organizations should develop and rehearse playbooks to protect their brand and reputation, especially when a disinformation attack is targeting them in real time.

"Speed is essential," he emphasized. "When I see hacking to power influence, the immediate response is to have a meeting. But the longer it goes on, the worse it's going to get."

Watts encouraged citizens to understand why they believe the things they trust online. "Listen more than you speak, read more than you write, watch more than you film," he said. We're most likely to believe the things we see first, those we see most, those that come from sources we trust, and those that don't come accompanied by a rebuttal.


Kelly Sheridan is the Staff Editor at Dark Reading, where she focuses on cybersecurity news and analysis. She is a business technology journalist who previously reported for InformationWeek, where she covered Microsoft, and Insurance & Technology, where she covered financial ...
 
