New research looks at how information on the Internet can be manipulated

11/4/2014 9:19:00 AM ITI Staff

With a tight gubernatorial race in Illinois, voters are being bombarded today with TV ads, yard signs and robocalls. But what if candidates could use their campaign dollars to subtly influence your vote on social media? It might be as simple as paying to promote the status of a friend who just voted or downplaying negative status posts on Facebook.

Those actions might seem innocuous, but they could selectively nudge turnout.

"You can envision the possibility," said Michael Bailey, a new Illinois ECE and CSL faculty member. "Where turnouts are only a couple hundred votes, could one use the sciences of influence and persuasion and a knowledge of Facebook and Google customization and personalization to actually influence the outcome?"

Michael Bailey

Bailey, who focuses his work on security and also holds an appointment with the Information Trust Institute, was recently awarded a $225,000 NSF grant titled "EPICA: Empowering People to Overcome Information Controls and Attacks," which will look at situations like this, where personalized information services on the Internet may be a new feeding ground for attackers to compromise the integrity of input data and affect outputs.

"We are interested in exploring the information ecosystem as a critical resource that needs to be protected in the same way that utilities or the transportation sector need to be protected," Bailey said. "You can't have democracy without free and open access to information."

This interdisciplinary work pulls together political scientists and computer scientists, as well as input from psychologists, from multiple universities. Bailey will be teaming with Georgia Institute of Technology computer scientists Wenke Lee, Nicholas Feamster and Hongyuan Zha, Georgia Institute of Technology political scientist Hans Klein and University of Maryland computer scientist Marshini Chetty. Bailey, as a computer scientist focused on security, will specifically be looking at the creation of systems from an adversarial point of view. The group will also examine the social and economic behaviors that shape how we think about and view information.

One of the challenges Bailey has seen emerge as the majority of people are receiving their news online now, rather than via print or television media, is that each aspect of the information ecosystem — how we create, locate, aggregate and consume news and information — is changing.

"One of the things I worry about most is the idea of systemic influence or persuasion," Bailey said. "What would happen if someone like Google or Microsoft decided to promote a political agenda? How would we know that our search results are unbiased?"

Bailey added that with so much customization and personalization of individual results, it's hard to determine whether your results have been manipulated. It's not enough to simply curtail these features: as a society, we want Google and other search engines to know some things about us, because we get value from personalization and customization.

"However, it's sort of a magic black box. We don't know why it works the way it does, so it's hard to figure out if they've started doing something we don't want them to do," Bailey said. "This information manipulation is what we're going to be looking at."
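One way researchers in this area commonly probe the "black box" is to issue the same query from different user profiles and measure how much the ranked result lists diverge. The sketch below is purely illustrative (it is not the EPICA project's method, and the result lists and function names are hypothetical); it shows two standard overlap measures, one that ignores rank and one that weights agreement near the top of the list more heavily.

```python
def jaccard(results_a, results_b):
    """Set overlap between two result lists, ignoring rank (0 to 1)."""
    a, b = set(results_a), set(results_b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def rank_weighted_overlap(results_a, results_b, depth=10):
    """Average overlap of the top-k prefixes for k = 1..depth.
    Disagreements near the top of the ranking count more."""
    depth = min(depth, len(results_a), len(results_b))
    overlaps = []
    for k in range(1, depth + 1):
        top_a, top_b = set(results_a[:k]), set(results_b[:k])
        overlaps.append(len(top_a & top_b) / k)
    return sum(overlaps) / depth

# Hypothetical results for the same query from two user profiles.
user_1 = ["a.com", "b.com", "c.com", "d.com"]
user_2 = ["a.com", "c.com", "e.com", "d.com"]

print(jaccard(user_1, user_2))                # 0.6
print(rank_weighted_overlap(user_1, user_2))  # ~0.729
```

A large, persistent gap between profiles that should be similar is a signal worth investigating; it doesn't by itself distinguish benign personalization from manipulation, which is exactly the attribution problem the research aims to address.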

By bringing together a variety of security and machine learning techniques, the team will examine the idea of influence and persuasion in a new way. The group will create algorithms, techniques and theories that advance the science of networking and security. Specifically, the research will study the security of personalized services such as Google Search and News and online targeted advertising to identify vulnerabilities, as well as develop countermeasures to prevent various attacks, alert users and incentivize the industry to provide more transparency and protection. They will also develop an evaluation framework to help facilitate the development and adoption of new technologies.

"We want to create tools that help users better understand the impact of customization and personalization and all the elements of the information ecosystem," Bailey said. "We also hope to influence the areas where technology interacts with public policy, law and societal ideas."


This story was published November 4, 2014.