ACTIVITY: How does your information travel in the surveillance society?

Note: this page is a work in progress.

Time: 45-60 mins Group size: 2-10

Overview

This activity is designed to help participants get a "big picture" view of the surveillance society by "connecting the dots" between key players. To do that, we walk participants through some oral histories of the surveillance society (i.e. data stories).

Goals

Background knowledge

We assume that facilitator and participants are familiar with the following:

Supplies

Prepare before this session

Takeaways

Key

The word Do marks something that the facilitator should do.

The words Say and Facilitator mark something that the facilitator should say to the participants.

The word Note marks a note (from us, the writers) to the facilitator.

The word Discussion marks an opening for group discussion.

The word TODO marks a part that we (the writers) haven't finished yet.

Content

Opening (5 mins)

Do: Show this slide:

Facilitator:

You'll never believe all the unexpected places that your personal information goes and the ways it gets used!

Welcome to this live episode of the "Wait Wait, Don't Track Me" podcast. Today, we're going to tell a series of stories of what happens to your personal information and who's using it. These are all true stories taken from the news (or put together from real news stories). Together, we'll piece these stories together into a map of the surveillance society and the ways that information and money flow within it.

The surveillance gap

Say: Often, the people who have their information harvested and used against them are those who were already vulnerable in other ways: for example, members of ethnic, religious, and sexual minorities, those with sensitive medical conditions, women, sex workers, and immigrants.

First, let's start with some common stories about you.

Discussion: How did you get to this room today? What devices did you use and what information did you give up?

One answer: For example, you might have let your Android phone track your location in real-time, which lets you use services like Uber and Google Maps. Or maybe you let a TSA agent scan your luggage and pat you down at the airport.

Facilitator: These are examples of ways that you, your devices, and your actions in physical space get translated to a representation of you as information: your "data body."

Zooming out a lot, we'll go ahead and propose an overall structure of the surveillance society: the corporate, state, hybrid, and individual actors and institutions that want to know about you.

Mapping stories (15-30 mins)

Do: Hand out participant supplies (maps, handouts, and colored pencils).

Note: After each story, discuss what the story is meant to illustrate, what entities to add to the map, and how these entities relate to what's already on the map. Also, the number of stories told can be adapted to the amount of time allotted (about 3 minutes per story).

Facilitator: Now let's tell some other stories about surveillance: some you might not have experienced, some about surveillance under duress, using information you didn't consent to give up, through processes opaque to the people caught up in them. They are all real stories or composites of real stories.

The important things in the stories, which also appear on the map, are highlighted in green on your handout. As we read the stories, circle the entities in green on the map, and draw lines that connect the information that moves between them. Here's an example of how we can do that.

  1. You log on to Facebook and take a personality quiz. It tells you you're an INTJ. Later, you see some pretty good ads for a local congressional candidate, and decide to learn more about them.

    By allowing the quiz site to access your Facebook profile, you give it access to your likes, photos, and friend list. It can profit by selling that data to political data analysis groups, like Cambridge Analytica (Trump) or The Groundwork (Hillary), which can make very accurate predictions about your political preferences based on this data. These groups then suggest to a political campaign which users to advertise to, using Facebook's Custom Audiences feature.

    And here's one way that you might draw how information flows from social media, to Facebook's various platforms, to public political campaigns and private data analytics firms in the United States.

    Do: Show each slide that shows the corresponding flow.

    Let's try it out with the rest of the stories below:

  2. You search for unscented lotion, mineral supplements, and cotton balls on Google, then visit Target's site. They send you mail with a coupon that reads "Congratulations on the baby!" You're still living with your parents, strict Puritans from whom you were trying to keep the pregnancy a secret. They kick you out of the house.

    Target uses predictive algorithms to link seemingly unconnected things to make guesses as to what your next purchases will be. If you've made a purchase at Target, on their website, and/or downloaded their app, they have an entry for you in their customer database. These are both standard marketing procedures, but Target is known for being especially good at it.

  3. You have an Android phone and you go about your daily life. The Feds issue a geofence warrant to Google to get your location data. They suspect you of murder and arrest you based on the data.

    Google holds individuals' location data in a secure database that employees call "Sensorvault." Law enforcement agencies can subpoena this data with a geofence or "dragnet" warrant that specifies a location and time span, and Google provides anonymized device "traces" that match those specifications. Investigators select one or several traces that fit their investigative profile, then compel Google to reveal the personal information attached to those devices.

  4. You're out with your family and you take a bunch of nice photos of your kids, which you upload to Flickr. An IBM researcher scrapes your photo from Flickr and uses it to train a facial recognition model, technology that it sells to the NYPD so it can "search CCTV feeds for people with particular skin tones or hair color."

    IBM used Flickr photos licensed under Creative Commons to create the labelled Diversity in Faces dataset, with the intent of making facial recognition fairer. It makes this dataset available to academic and corporate research institutions. These corporate institutions make and sell algorithms that allow cameras (CCTV, security, body, etc.) to recognize specific traits, such as race or gender.

  5. You live in San Francisco, and you spend a nice day out at the Brainwash cafe. Unbeknownst to you, the camera in the cafe recorded your face. Stanford researchers tap into the camera installed in the cafe and grab all the face images to create a dataset, which is then used in China by pseudo-private research firms to develop surveillance technology for monitoring and persecuting ethnic minorities.

  6. You share a candlelit dinner with your partner at home. At one point, you mention you would like to buy a new cast-iron pan. Your Alexa device, listening for information about your purchases, records the audio of your conversation and sends it offshore for workers to transcribe. The transcription is used to improve Alexa's voice recognition technology.

  7. You're out with your friends downtown. A Bluetooth beacon tracks your location and knows that you went to Dick's Sporting Goods at 2 pm, to the mall at 3 pm, to the bathroom at 4 pm, and that you hesitated in front of a Doc Martens store but didn't go in. The next day, while reading the news, you see a bunch of ads for Doc Martens and decide to go back to the store.

  8. You're out on the roof with your partner at night, enjoying a warm and intimate evening. Nobody can see you here, right? Unbeknownst to you, a man in an NYPD helicopter is using thermal imaging to spy on you.

    Note: we didn't make an answer key for this one. Provide your own!

  9. You are a US citizen on US soil. While writing a paper, you visit the website of an academic collaborator in Britain, who (unbeknownst to you) is alleged to be a terrorist suspect. Your request for the webpage passes through your internet service provider's infrastructure, then through an AT&T peering center, where it is intercepted by an NSA wiretap, copied to Maryland, and caught in the XKEYSCORE software by an NSA analyst. This piece of information just about tips the scales of a predictive algorithm to land you on a no-fly list. You only find out when you try to fly to a conference to give an invited talk and you're stopped at the airport. There is no appeal or recourse for your status on the no-fly list.

  10. Imagine you open Facebook and go to your local for-sale page to list an old guitar. After a week, someone responds with interest and you schedule a time to meet that day. When you drive up, ICE agents are there with printouts of your driver's license. Agents responded to your post after getting your name from the Department of Licensing's Driver and Plate Search.

    Note: we didn't make an answer key for this one. Provide your own!
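Before we zoom out: the geofence query in story 3 reduces to a simple spatial and temporal filter. Here's a toy Python sketch (invented data and field names, not Google's actual Sensorvault schema) of how a warrant's bounding box and time window select anonymized traces:

```python
from dataclasses import dataclass

@dataclass
class Ping:
    device_id: str   # anonymized trace identifier
    lat: float
    lon: float
    t: int           # minutes since midnight, for simplicity

def geofence_matches(pings, lat_min, lat_max, lon_min, lon_max, t_start, t_end):
    """Return the anonymized device IDs seen inside the box during the window."""
    return {
        p.device_id
        for p in pings
        if lat_min <= p.lat <= lat_max
        and lon_min <= p.lon <= lon_max
        and t_start <= p.t <= t_end
    }

pings = [
    Ping("trace-a", 47.61, -122.33, 600),   # inside the box, inside the window
    Ping("trace-b", 47.61, -122.33, 1200),  # inside the box, outside the window
    Ping("trace-c", 40.71, -74.00, 610),    # outside the box
]

print(geofence_matches(pings, 47.60, 47.62, -122.34, -122.32, 570, 630))
# Only "trace-a" matches; a second warrant step would then compel
# the identity behind that trace.
```

The point of the sketch is how little the query needs: no name, no suspicion attached to any one person, just a box and a clock, which is exactly what makes it a dragnet.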

Putting it all together (5 mins)

Note: Here's one way that the group could have mapped all those flows:

Facilitator: Even if it doesn't seem like it, everything is connected. All kinds of organizations want to know things about you like where you've been, who your friends are, and what your personality is like. Once they have this information, organizations often exchange it and put it together, either to profit off consumers or to impose state control on individuals and communities. Apart from organizations, an important part of the surveillance society is individuals (who may be in a position of power) who use tools to surveil others for personal reasons, such as passively watching or actively stalking someone.

The surveillance gap

Say (again): Often, the people who have their information harvested and used against them are those who were already vulnerable in other ways: for example, members of ethnic, religious, and sexual minorities, those with sensitive medical conditions, women, sex workers, and immigrants.

Discussion questions:

One answer: One common thread between these stories is how small, innocent pieces of your information can fly out and, like a boomerang, come back to change your life in ways you might not even notice.

Taking action (5 mins)

Facilitator: From the exercise we just did, it sounds like our devices are leaking information like a sieve! But it doesn't have to be that way. Here is an example of how a piece of well-designed legislation can stop private companies from owning, profiting off of, and sharing public data.

The story of the Cambridge Surveillance Ordinance demonstrates how surveillance oversight laws can stop secret police/Amazon agreements in their tracks.

Do: Show this slide and talk through the text message thread.

You can join in by working to create preemptive legislation to limit information sharing in your city, or by joining the public review process on the Seattle Surveillance Ordinance.

Takeaways (5 mins)

Do: Ask participants if they would be willing to donate their maps, flows, and stories to a communal knowledge bank.

Materials

Facilitator materials

Slides

The slides for the above material can be found here.

Story background

Here is the background and context for each story:

  1. The Facebook story is an entry point to a story most people have at least vaguely heard of: "The Data That Turned the World Upside Down: How Cambridge Analytica used your Facebook data to help the Donald Trump campaign in the 2016 election." The point is to introduce a relationship between private entities and public entities, to introduce the idea of public-private collusion, and to introduce the idea of many small pieces of information being used for a much larger emergent purpose (i.e. that of undermining democracy).

  2. This is the famous "Target knows you're pregnant before your parents do" story. It serves as an entry point to surveillance capitalism and predictive behavior influencing.

  3. The Google story is another pathway to public-private collusion, as well as to data storage/longevity issues and issues of scope creep. NYTimes article

  4. The IBM story is an entry point to the role of facial recognition and AI in surveillance in the US, how surveillance is targeted at marginalized groups of people, and how personal data is collected coercively to fuel private gain. NBC article

  5. The Brainwash story reinforces the above story about facial recognition and targeting minorities, adds the nuance that data can move outside state boundaries, and is an entry point to the advanced Chinese surveillance state.

  6. The Amazon story introduces another corporate actor, Amazon, as well as personal hardware, and the coercive nature of modern AI. Accidental recording, audio review team

  7. This mall story is an entry point to ongoing corporate surveillance in physical space and how it ties into advertising and behavioral microtargeting. It also reinforces how much information a smartphone can leak: Bluetooth beacons and MAC address trackers can ping your phone, identifying your location and identity. This story can lead into an exercise about wifi network inspection. Bluetooth tracking, This trash can is stalking you, EFF's Why Metadata Matters

  8. The helicopter story is taken straight from the New York Times. It introduces the human element of surveillance: misuse of the state surveillance apparatus—ostensibly for your protection—for a human observer's voyeuristic glee, even in a place that should be secret.

  9. The NSA story is an entry point to state surveillance and starts to introduce some of the technology that we'll visit in the walking tour. It also reinforces the relationship between public and private actors, and introduces two main problems with pervasive surveillance, namely scope creep and data retention.

    This story is by necessity a collage of several stories, since we know so little about the inner workings of the NSA and the no-fly list—it's a "matter of national security." Nevertheless, based on the sources below, it seems plausible. This story is an entry point to several broader themes: the policy of the Foreign Intelligence Surveillance Act that the NSA should limit wiretapping to non-US citizens, and its ability to circumvent this policy to spy on US citizens if a foreign national is involved; data fusion; the deep opacity and arbitrary nature of state surveillance and automated decision systems; public-private information sharing; the dragnet nature of surveillance; and the very frustrating and hurtful effects the combination of these factors can have on an individual. It's also a counterpoint to the "nothing to hide, nothing to fear" argument—you have nothing to hide, you would have no problem revealing your list of academic collaborators, and you've done nothing wrong; it's only the "guilt by association" argument that brings the hammer crashing down. We will also see an AT&T peering site on the walking tour. No-fly list predictive assessments, NSA spying on US citizens

  10. This story is taken almost directly from the recent NYT exposé "How ICE Picks Its Targets in the Surveillance Age". It shows how government fusion centers operate, how lots of little pieces of information can be used to pervasively track people, and how such surveillance capabilities are used against vulnerable populations (immigrants in America). The article describes the US domestic surveillance system as follows:

    public records make clear that ICE, like other federal agencies, sucks up terabytes of information from hundreds of disparate computer systems, from state and local governments, from private data brokers and from social networks. It piggybacks on software and sharing agreements originally meant for criminal and counterterrorism investigators, fusing little bits of stray information together into dossiers. The work is regulated by only a set of outdated privacy laws and the limits of the technology.
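The beacon tracking in story 7 is, mechanically, just arithmetic over timestamped sightings. A minimal sketch, assuming invented sighting data (real deployments key on a phone's MAC address or Bluetooth advertising identifier, picked up by beacons installed in stores):

```python
from collections import defaultdict

# Each sighting: (device, place, minute-of-day). The "device" stands in for
# a MAC address or advertising identifier a beacon network would observe.
sightings = [
    ("phone-1", "sporting goods", 120), ("phone-1", "sporting goods", 125),
    ("phone-1", "mall atrium", 180),
    ("phone-1", "shoe store window", 240), ("phone-1", "shoe store window", 243),
]

def dwell_times(sightings):
    """Minutes between first and last sighting of each device at each place."""
    seen = defaultdict(list)
    for device, place, t in sightings:
        seen[(device, place)].append(t)
    return {key: max(ts) - min(ts) for key, ts in seen.items()}

dwell = dwell_times(sightings)
# A 3-minute pause in front of the shoe store, without ever entering,
# is exactly the signal an ad network can resell:
#   dwell[("phone-1", "shoe store window")] == 3
```

Nothing in the data says who "phone-1" is, yet the dwell pattern alone is enough to target the Doc Martens ads in the story; identity can be joined in later from other datasets.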

Poster: blank map (24x36)

Poster: map with flows (24x36)

Participant materials

Handout: mapping oral histories of surveillance

  1. You log on to Facebook and take a personality quiz. It tells you you're an INTJ. Later, you see some pretty good ads for a local congressional candidate, and decide to learn more about them.

    By allowing the quiz site to access your Facebook profile, you give it access to your likes, photos, and friend list. It can profit by selling that data to political data analysis groups, like Cambridge Analytica (Trump) or The Groundwork (Hillary), which can make very accurate predictions about your political preferences based on this data. These groups then suggest to a political campaign which users to advertise to, using Facebook's Custom Audiences feature.

  2. You search for unscented lotion, mineral supplements, and cotton balls on Google, then visit Target's site. They send you mail with a coupon that reads "Congratulations on the baby!" You're still living with your parents, strict Puritans from whom you were trying to keep the pregnancy a secret. They kick you out of the house.

    Target uses predictive algorithms to link seemingly unconnected things to make guesses as to what your next purchases will be. If you've made a purchase at Target, on their website, and/or downloaded their app, they have an entry for you in their customer database. These are both standard marketing procedures, but Target is known for being especially good at it.

  3. You have an Android phone and you go about your daily life. The Feds issue a geofence warrant to Google to get your location data. They suspect you of murder and arrest you based on the data.

    Google holds individuals' location data in a secure database that employees call "Sensorvault." Law enforcement agencies can subpoena this data with a geofence or "dragnet" warrant that specifies a location and time span, and Google provides anonymized device "traces" that match those specifications. Investigators select one or several traces that fit their investigative profile, then compel Google to reveal the personal information attached to those devices.

  4. You're out with your family and you take a bunch of nice photos of your kids, which you upload to Flickr. An IBM researcher scrapes your photo from Flickr and uses it to train a facial recognition model, technology that it sells to the NYPD so it can "search CCTV feeds for people with particular skin tones or hair color."

    IBM used Flickr photos licensed under Creative Commons to create the labelled Diversity in Faces dataset, with the intent of making facial recognition fairer. It makes this dataset available to academic and corporate research institutions. These corporate institutions make and sell algorithms that allow cameras (CCTV, security, body, etc.) to recognize specific traits, such as race or gender.

  5. You live in San Francisco, and you spend a nice day out at the Brainwash cafe. Unbeknownst to you, the camera in the cafe recorded your face. Stanford researchers tap into the camera installed in the cafe and grab all the face images to create a dataset, which is then used in China by pseudo-private research firms to develop surveillance technology for monitoring and persecuting ethnic minorities.

  6. You share a candlelit dinner with your partner at home. At one point, you mention you would like to buy a new cast-iron pan. Your Alexa device, listening for information about your purchases, records the audio of your conversation and sends it offshore for workers to transcribe. The transcription is used to improve Alexa's voice recognition technology.

  7. You're out with your friends downtown. A Bluetooth beacon tracks your location and knows that you went to Dick's Sporting Goods at 2 pm, to the mall at 3 pm, to the bathroom at 4 pm, and that you hesitated in front of a Doc Martens store but didn't go in. The next day, while reading the news, you see a bunch of ads for Doc Martens and decide to go back to the store.

  8. You're out on the roof with your partner at night, enjoying a warm and intimate evening. Nobody can see you here, right? Unbeknownst to you, a man in an NYPD helicopter is using thermal imaging to spy on you.

  9. You are a US citizen on US soil. While writing a paper, you visit the website of an academic collaborator in Britain, who (unbeknownst to you) is alleged to be a terrorist suspect. Your request for the webpage passes through your internet service provider's infrastructure, then through an AT&T peering center, where it is intercepted by an NSA wiretap, copied to Maryland, and caught in the XKEYSCORE software by an NSA analyst. This piece of information just about tips the scales of a predictive algorithm to land you on a no-fly list. You only find out when you try to fly to a conference to give an invited talk and you're stopped at the airport. There is no appeal or recourse for your status on the no-fly list.

  10. Imagine you open Facebook and go to your local for-sale page to list an old guitar. After a week, someone responds with interest and you schedule a time to meet that day. When you drive up, ICE agents are there with printouts of your driver's license. Agents responded to your post after getting your name from the Department of Licensing's Driver and Plate Search.

Handout: blank map (11x17)

Handout: map with flows (11x17)

Credits

Map design by Micah Epstein and the coveillance.org team. This work is licensed under CC BY-SA 2.0. If you use our work, we'd love to hear about it!

Our template, and some of our material, is based on the Our Data Bodies toolkit (PDF).

Parts of the styling were borrowed from the Software Carpentry toolkit.



back to coveillance.org