Facebook knows us. Exceptionally well.
Facebook tracks who we talk to, what we talk about, what we like, what we’re interested in. It tracks where we are and what transactions we conduct. Facebook can pick your face out of other people’s pictures and automatically tag you in media. It can even find you in the background of crowd shots (“isn’t it cool that I’ve been tagged in so many pictures?”).
After gathering all this personal data, who does Facebook sell access to? Any buyer who can afford it, including foreign actors, as we saw in the 2016 election. If there's a smidgen of our intimate lives that Facebook can monetize, it will.
Yet here’s the irony: Facebook recently posted continued growth and profit numbers for the last year. It reported that over two billion people use its online properties every day. So it appears that users don’t care that their personal data is sold freely – or more accurately, don’t fully understand the ramifications.
Think about it: Facebook is enabling the subversion of our highly personal social networks for profit and undue political influence. Which raises the questions: Is consumer capitalism, with any meaningful safeguards, still working? Are the likes of Facebook, Google, and the other online giants simply too big to suffer economic penalties for violating the public trust? If so, we are on a slippery slope indeed.
Even if we accept that by sharing personal data we'll receive more targeted advertising, we face a huge challenge. Namely, Facebook's reach, and its potential misuse of personal data, now pose a real threat to our fundamental ideas of individual freedom and liberty.
Harvesting All of Us
It's not that we haven't been warned about the dangers of sharing personal data online. And many of us do take precautions with some of our sensitive data. Yet as a group, we seem to think all those complimentary online services are worth trading away our privacy, bit by bit.
So Facebook (and the other Web giants) accumulate all our personal data points over time. The more data collected in one place, the more value it has for mining. Over time, and in the context of other data points, it becomes Big Data. Through data integration, it's then mixed on the back-end with other data sources that we, as end users, will never be aware of.
Increasingly, identifiable data collection happens in more dimensions than most users ever realize. Some apps now offer "general" surveys or claim to gauge group preferences, but are really harvesting detailed data that tracks us individually.
These apps, we know, analyze "friends of friends" comments to compile data about us. They can even infer our current emotional state from textual analysis or online behavior. It's now possible to estimate how sad or depressed someone might be by analyzing the volume and variety of their online interactions.
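To make that claim concrete, here is a deliberately crude sketch of the idea. The word lists, weights, and threshold are all invented for illustration; real platforms use trained models over vastly richer behavioral signals. The point is only how little it takes to turn posting habits into a mood estimate.

```python
# Toy illustration: estimating mood from word choice plus posting volume.
# Every list and weight below is made up for this sketch, not any
# platform's actual method.

NEGATIVE_WORDS = {"sad", "alone", "tired", "hopeless", "miss"}
POSITIVE_WORDS = {"happy", "great", "excited", "love", "fun"}

def sentiment(post: str) -> int:
    """Crude lexicon score: +1 per positive word, -1 per negative word."""
    words = post.lower().split()
    return (sum(w in POSITIVE_WORDS for w in words)
            - sum(w in NEGATIVE_WORDS for w in words))

def mood_signal(posts: list[str], baseline_posts_per_week: float) -> float:
    """Combine average sentiment with a drop in posting volume.

    A sharply reduced posting rate plus negative word choice is exactly
    the kind of correlation described above: volume and variety of
    interactions, read together."""
    avg_sentiment = sum(sentiment(p) for p in posts) / max(len(posts), 1)
    volume_ratio = len(posts) / max(baseline_posts_per_week, 1)
    return avg_sentiment + (volume_ratio - 1.0)  # negative => possible low mood

recent = ["so tired and alone", "miss the old days"]
print(mood_signal(recent, baseline_posts_per_week=10))  # a negative score
```

A twenty-line script with a hand-made word list already produces a plausible-looking signal; a platform with years of your history and a trained model can do far more.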
Are we comfortable with all of this?
A Cautionary Culture
Let's look at China today. The government is building a huge system to track every individual's social reputation. Why shouldn't good people be recognized and rewarded? But it isn't just a reward system: authorities can use that reputation score as a means of direct influence and control, deciding who gets jobs, travel, and educational opportunities.
The Chinese government can aggregate and mine phone and app activity, recorded personal interactions, and all financial transactions. Every individual can be monitored at a micro level, and everything people do becomes auditable forever.
Now, back to Facebook: Recently there was a "fun" online app in which users were encouraged to submit two pictures of themselves taken ten years apart. Privacy experts suspect this was a thinly disguised excuse to collect a massive amount of labeled training data for facial-recognition algorithms at huge scale. Of course, all of this makes that vast Facebook photo library even more commercially valuable. If you submitted your precious selfies, you helped a machine learn how to erode one more layer of your privacy.
When we compare China with our freedom-oriented Western culture, are we really aiming to get somewhere much different? I fear that platforms like Facebook have taken us many steps down that darker road.
The Machine Is Learning
Much of the data mining we’re talking about is about training recognition algorithms. I’m a big fan of the mathematics of machine learning, but I’m not so sure it can be ethically deployed at scale “for good.” Much has been written about the way machine learning algorithms at scale can be taught prejudices and learn bad behaviors, or used as a pretense and shield for ultimately unethical practices.
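How does an algorithm "learn" a prejudice? Here is a minimal, entirely synthetic illustration: a frequency-counting "classifier" trained on skewed historical decisions simply reproduces the skew. The groups, numbers, and decision rule are invented for this sketch.

```python
# Toy demonstration of a model inheriting bias from its training labels.
# All data here is fabricated; real systems are more complex but the
# failure mode is the same: skewed history in, skewed predictions out.

from collections import defaultdict

def train(examples):
    """examples: list of (group, approved) pairs from past human decisions."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
    for group, approved in examples:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    # Predict approval whenever the historical approval rate exceeds 50%.
    return {g: (a / t) > 0.5 for g, (a, t) in counts.items()}

# Skewed history: group_b applicants were approved far less often.
history = ([("group_a", True)] * 80 + [("group_a", False)] * 20
           + [("group_b", True)] * 30 + [("group_b", False)] * 70)

model = train(history)
print(model)  # {'group_a': True, 'group_b': False} - the bias is now "learned"
```

Nothing in the code is malicious; the model faithfully summarizes its training data. That faithfulness is precisely the problem when the data encodes past prejudice.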
Beyond that, we should be aware that machine learning also forms the basis of much of today's drive towards process automation. Increasingly, intelligent machine-based automation, powered by deep learning and artificial intelligence, will replace the jobs of many lower-skilled workers.
I don't believe in protecting jobs that could otherwise be intelligently automated. But users who aren't careful about "donating" their data might find it used to automate them out of relevance. There could come a time when the companies that own the resulting "intelligence" own everything of value.
Fundamental Trust Issues
There is an implied social contract between people that assumes a basic level of goodness in all people. But too many forget that Facebook is a for-profit company, not a trusted confidante or even a neutral platform. Even if we believe that online privacy is already a lost cause, we’d be wise to remember one thing: not everything we do needs to be exposed and handed outright to commercial entities.
Trust should be hard to earn, and trust in third parties should be constantly re-validated. We need to keep in mind that even passive data sharing is a deliberate trust decision. I'm not suggesting we turn off the Internet, or give up on tech-based networking with our friends and family. But as we used to say back in my Air Force days: "The price of freedom is eternal vigilance."
Facebook may be where your friends are, but it isn’t your friend.