Privacy is not about keeping things secret; it is about deciding what information to share, with whom, at what time, and in what context.

Jennifer Golbeck is a Professor in the College of Information Studies and Director of the Social Intelligence Lab at the University of Maryland, College Park. She received an AB in Economics and an SB and SM in Computer Science from the University of Chicago, as well as a PhD in Computer Science from the University of Maryland, College Park.

Professor Golbeck began studying social media from the moment it emerged on the web, and she is one of the world’s foremost experts in the field. Her research has influenced industry, government, and the military. She is a pioneer in the field of social data analytics and discovering people’s hidden attributes from their online behavior, and she is a leader in creating human-friendly security and privacy systems. 

Here are my favourite takeaways from Dr. Jennifer Golbeck’s Great Courses class:

  • Research has found that viral videos, fads, rumors, fake news, and pages to like on Facebook all spread in online social networks the same way that diseases spread in offline social networks.

Homophily

  • Sociologists have known for a long time that we are friends with people who are similar to ourselves. They call this homophily, and it’s basically the concept that birds of a feather flock together. If you’re rich, your friends tend to be rich. If you graduated from college, your friends also probably graduated from college. It’s not that all of your friends are the same as you, but rather that your personal attributes are shared by your friends more often than they are shared by the general population.

Most of what you like on social media is social—it’s stories, pages, pictures, and videos that appear in front of you because someone shared them. Rarely is it something you searched for yourself. So what you see depends on your friends.

  • One project followed pregnant women on Twitter and developed a model that could predict whether they would develop postpartum depression. On the day a woman gives birth, the algorithm already knows the outcome—and it is right 80 to 85 percent of the time. It does this using data about the style of people’s writing, their interactions, and the kinds of posts they make.
  • In the US, you do not own your personal data, and companies do not have to tell you what they know about you or how they are using your data. These algorithms are used behind the scenes all the time, often without your knowledge.
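The lectures don’t detail that model, but the kind of writing-style features it relies on can be sketched in a few lines. Everything below (the word list, the feature names, the sample posts) is invented purely for illustration; the actual research model is far more sophisticated.

```python
# Toy sketch: extract simple writing-style features from a list of posts.
# The word list and features here are hypothetical illustrations only.

NEGATIVE_WORDS = {"sad", "alone", "tired", "anxious", "worried"}

def style_features(posts):
    """Compute a few crude style statistics over a user's posts."""
    words = [w.lower().strip(".,!?") for post in posts for w in post.split()]
    total = len(words) or 1
    return {
        # heavy first-person usage is one style signal such models examine
        "first_person_rate": sum(w in {"i", "me", "my"} for w in words) / total,
        "negative_word_rate": sum(w in NEGATIVE_WORDS for w in words) / total,
        "avg_post_length": total / max(len(posts), 1),
    }

posts = ["I feel so tired and alone today.", "My anxiety is bad, I am worried."]
features = style_features(posts)
```

A real model would feed hundreds of such features, plus interaction and posting-frequency data, into a trained classifier.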

Data Harvesting

  • Facebook wants to know details of how people use their platform. Their main goal is to keep people engaged with it. They want people to use Facebook as much as possible and to know what kinds of activities keep people engaged. For example, if you are commenting on your friends’ posts, that is good from Facebook’s perspective because it keeps you on Facebook and encourages your friends to engage. But what if you start to post something and then reconsider, never actually posting it?

Shadow Profiling 

  • Even if you have not created a Facebook profile, Facebook still knows a lot about you. For many people who are not on the site, Facebook creates something called a shadow profile. Basically, it’s very easy to know when a person is missing from a social network. There is a hole where a person should be, and Facebook can easily figure out who that person is that the hole represents.
  • The most obvious information to use is other people’s contact lists. If you have a friend with a social media account who uses the app, that friend has likely given access to his or her contact list. Many platforms ask for this because it allows them to pair you up with other people who are using the app. They do this by downloading a list of your contacts along with all of your contacts’ data—their phone numbers, email addresses, street addresses, and maybe photos—and if a particular platform has another user in the system with the same email address or phone number, it can suggest that you become friends with that person. Essentially, a phone number or an email address is a unique identifier.

So even if you are not on Facebook, most of your friends still are, and you are likely in many of their contact lists. And when your friends give Facebook permission to access their contacts, Facebook retrieves your data from a lot of different people.
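The contact-matching step described above can be sketched in a few lines, assuming a hypothetical platform that indexes its users by email and phone number. All names and data here are made up; real systems normalize and often hash identifiers before matching.

```python
# Simplified sketch of matching uploaded contacts against existing users.
# Contacts with no matching account are exactly the "holes" a shadow
# profile could be built around.

def build_index(users):
    """Index existing users by their unique identifiers."""
    index = {}
    for user in users:
        index[user["email"]] = user["name"]
        index[user["phone"]] = user["name"]
    return index

def match_contacts(uploaded_contacts, index):
    """Split an uploaded contact list into known users and unknowns."""
    known, shadow = [], []
    for contact in uploaded_contacts:
        name = index.get(contact.get("email")) or index.get(contact.get("phone"))
        if name:
            known.append(name)       # basis for "people you may know"
        else:
            shadow.append(contact)   # no account: shadow-profile candidate
    return known, shadow

users = [{"name": "Alice", "email": "alice@example.com", "phone": "+1-555-0100"}]
contacts = [
    {"email": "alice@example.com", "phone": None},
    {"email": "bob@example.com", "phone": "+1-555-0199"},
]
known, shadow = match_contacts(contacts, build_index(users))
```

Note that the unmatched contact still carries a name, number, and address even though its owner never signed up, which is precisely the shadow-profile problem.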

Types of Data to Control 

There are three types of data that you want to be able to control:

  • data that you are explicitly choosing to share,
  • data that is collected about you in the background that you may not know about, and
  • data that is inferred or calculated about you using artificial intelligence.

Facial Recognition

  • Facial recognition algorithms can identify an individual by analyzing the pattern of the person’s facial features. It’s a technology that many large corporations are working on. Facebook has a good facial recognition algorithm. When you upload a picture, it automatically identifies the people who are in that photograph. However, not everyone has access to such a huge database of people’s photos, so there are only a handful of companies with large and accurate facial recognition systems.

The inaccuracy of these algorithms and their potential to create a variety of social problems have led some cities to ban the use of facial recognition technology by government departments, including police agencies.
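The comparison step behind such systems can be sketched as a nearest-neighbor search over face “embeddings.” The vectors and threshold below are invented for illustration; real systems derive embeddings from images with neural networks and tune the threshold carefully.

```python
# Minimal sketch of the matching step in a facial recognition system:
# find the closest known embedding, and reject matches that are too far.
import math

def distance(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, database, threshold=0.6):
    """Return the closest known identity, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, embedding in database.items():
        d = distance(probe, embedding)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist < threshold else None

known_faces = {"Alice": [0.1, 0.9, 0.3], "Bob": [0.8, 0.2, 0.5]}
result = identify([0.12, 0.88, 0.31], known_faces)  # close to Alice's vector
```

The bias problem discussed later enters through the embeddings themselves: if the network produces less distinctive embeddings for some demographic groups, this matching step will be wrong for them more often.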

Tattoo Recognition

  • Beyond facial recognition, technology exists to individually monitor people and their associations in other ways. Consider tattoo recognition. Facial recognition looks at the biometrics of your face to uniquely identify you. Tattoo recognition does a similar thing, scanning an image of the tattoo to distinguish it from any other. 

Advertising Kiosks

  • The Wall Street Journal reported that some shopping malls in South Korea had installed kiosks displaying maps of the mall with lists of the stores, and each kiosk had a set of cameras and a motion detector. When someone came up to look at the map or browse the stores on the screen, those cameras and detectors used a facial recognition–type system to analyze the face of the person using the map.

The malls were not trying to uniquely identify that person but rather to estimate the individual’s gender and age. From there, the kiosk could direct the person to different stores or show him or her ads for other products. A young woman may see different ads than an older man would.

Transparency and Control 

Transparency means being clear about what is being done with users’ data and how their experience is being controlled.

  • Transparency generally involves clearly telling people what is being done with their data, who it is being shared with, and how it is impacting their experience. Ideally, privacy policies would cover this, but they tend to be written either with a lot of legal language that is hard to understand or with a lot of vague language—and sometimes both.

Control

Control means giving people the right to opt in or out or change their preferences. 

  • In this case, people have a choice about whether their data is used in certain ways. Social media platforms tend to give users some control over who can see the information they post. These options may be simple or complex. For example, Twitter and Instagram allow you to essentially have a public or private account; Facebook has much more sophisticated controls, which let you choose individual people who are allowed or prohibited from seeing your posts.
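The per-post controls described above amount to a visibility check along these lines. The field names and precedence rules here are assumptions for the sake of a sketch, not Facebook’s actual model.

```python
# Hypothetical sketch of per-post visibility: a visibility setting plus
# explicit allow and deny lists, with explicit blocks taking precedence.

def can_view(viewer, post):
    """Decide whether a viewer may see a post."""
    if viewer in post.get("denied", set()):
        return False                          # explicit blocks win
    if viewer in post.get("allowed", set()):
        return True                           # explicitly permitted
    return post.get("visibility") == "public" # otherwise, fall back

post = {"visibility": "friends", "allowed": {"alice"}, "denied": {"bob"}}
# alice is explicitly allowed; bob is blocked; anyone else falls back to
# the default visibility, which is not public here.
```

Twitter’s and Instagram’s simpler public/private accounts correspond to keeping only the `visibility` flag and dropping the per-person lists.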

Bias in Algorithms 

  • Facial recognition algorithms can use a picture of your face— whether it’s posted online or captured in a surveillance system—to identify who you are by comparing it to a database of known faces. But such algorithms can be wrong; in fact, they’re wrong in biased ways. 

If you are a white male, these algorithms work very well for you. If you are a member of any other group—such as white women or men of any other race—they are less accurate.

Action Steps to Take Control of Your Personal Data

  • Delete old posts on social media. When less information is available about you, algorithms can discover less about you as well. 
  • Check your privacy preferences. On your phone, turn off apps’ permission to access your location, contacts, and other information, unless it is really critical. To stop background data collection, select the options that prevent apps from running in the background, and if they do not need to contact the internet— as is the case with many games—turn off their rights to use cellular data. 
  • By blocking apps from using data, you are preventing them from being able to send any information about you out to the world. This action may also stop them from downloading and showing ads to you. This might disable some features of the app, but you can decide which option feels best for you for each individual app. 
  • Limit the data you store online, delete frequently, and use good privacy and cybersecurity practices.
  • The only real step you can take to protect yourself is to use good security practices. Two-factor authentication will alert you if anyone tries to get into your accounts, and it makes it harder for people to access them. You can set up a credit freeze or monitoring with a credit bureau, but be sure to read the fine print.

Recommended Books

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil 

All the best in your quest to get better. Don’t Settle: Live with Passion.
