Facebook has obscured its core problem by focusing on details and side effects. Mark Zuckerberg's congressional hearing was meant to provide clarity; in reality, it produced even more confusion. We need to put the core problem on the main stage of debate. It lies at the crossroads of two different fields: technology and sociology. These two perspectives are difficult to reconcile, but if we do not learn to, future incidents will be more stunning and momentous. A slew of new startups built on the same principles as Cambridge Analytica were recently funded by Higher Ground Labs, a political startup accelerator in San Francisco.


In the hearing, Republicans and Democrats grilled Zuckerberg. The hearing did not focus exclusively on Cambridge Analytica, but spanned topics from data privacy to censorship. Cambridge Analytica, however, is a side effect of the real problem. Fixating on it personifies an enemy and steers focus away from the root cause: the sheer effectiveness of applied data.

Many Republicans claim that Facebook censors conservative thought. Ted Cruz focused his line of questioning on multiple Facebook groups and pages that had been banned. He claims they were banned solely for their conservative beliefs, pointing to Facebook's roots in left-leaning Silicon Valley as further evidence.

Many Democrats claim that Facebook enabled Trump's presidency. The Trump campaign hired Cambridge Analytica to use data science to target the highest-priority voters, weaponizing information in a surgically executed campaign.

Both Democrats and Republicans claim that Facebook created an echo chamber. Facebook users are fed news that validates their biases. Some political ideologies radicalized under these circumstances. By isolating each political group from the ideas of the other, the platform let them drift further and further apart. This polarization is evident in our current political discourse.

How Cambridge Analytica Obtained Data

How Cambridge Analytica obtained its data remains shrouded in misconceptions. Mark Zuckerberg reiterated in the hearing that Facebook has never sold user data; rather, the data was collected through Facebook's developer platform. For years, Facebook has allowed external developers to build and publish apps. Facebook games like FarmVille and Mafia Wars are built on the developer platform.

Aleksandr Kogan, a researcher at the University of Cambridge, developed a personality quiz using the developer platform. Users opted in to the quiz and granted the application permission to access the information on their Facebook profiles. A number of users also granted the quiz access to their Facebook messages. Users also granted access to their friends lists, revealing their friends' basic profile information, such as name and location.

Facebook knew about this. It was told that the data would be used for non-commercial purposes, such as academic research, and allowed it. Mr. Kogan claims that Cambridge Analytica later approached him for this data, and that he provided it without any knowledge of how it would subsequently be used. Cambridge Analytica, however, denies using any Facebook data.

Facebook's Business Model

Facebook made over $40 billion in 2017, over 98% of it from businesses paying for ad space. Facebook competes heavily with Google for this money. Historically, Google has been one of the most effective advertising platforms: it made over $111 billion in 2017, with 86% coming directly from ad sales. Even though Facebook may seem to own the social media market, it owns a minority of the ad market.

Businesses must decide which platform to advertise on, and they want each dollar they put in to be as effective as possible. Both companies provide tools to help businesses target specific users, and advertisers pay only when an ad is clicked. Google and Facebook therefore do everything in their power to get you to click on as many ads as possible, and they try to predict your tastes to further this goal.

They've developed intricate algorithms to predict your preferences, and these algorithms are the basis of this whole problem. They ingest huge amounts of user behaviour data, which incentivizes these companies to collect everything, and they form echo chambers as a side effect. Cambridge Analytica developed a similar algorithm but, without the huge quantity of data that Facebook has, it was less effective. They therefore augmented it by scraping the internet for any public user data and buying up whatever private data they could get their hands on.

Data is the New Oil

All the algorithms that power these systems are fueled by data, and the fastest, easiest way to improve the systems is to feed them more data. Companies like Facebook collect data wherever they can. They collect every click made by every user. Even if you log out, they keep track of your clicks; when you log back in, they retroactively take those clicks into account.

Data is being generated and collected at an astonishing rate. The rise of cloud computing and the drop in storage prices have made it economically feasible to store dizzying amounts of data. People generate about 2.5 exabytes of data every day, enough to fill the hard drives of about 5 million computers.
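The arithmetic behind that figure is straightforward. As a rough back-of-the-envelope check (assuming a typical 500 GB consumer hard drive, which is an assumption, not a number from the original claim):

```python
# Back-of-the-envelope check: how many 500 GB drives does 2.5 EB/day fill?
daily_data_bytes = 2.5e18              # 2.5 exabytes generated per day
hard_drive_bytes = 500e9               # assumed 500 GB consumer drive
drives_filled_per_day = daily_data_bytes / hard_drive_bytes
print(f"{drives_filled_per_day:,.0f} drives filled per day")
```
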

The sheer magnitude of data makes it nearly impossible to do anything by hand. It is infeasible for Facebook to hire enough people to tailor each user’s newsfeed. Recommendation algorithms power everything. Therefore, data powers everything. Data is the oil of the tech companies.

Data creates a moat for the business. Even if you designed a search algorithm better than Google’s, you would never achieve similar quality without their data. These companies have effectively protected themselves from direct competitors solely through having more users. Data disparity will continue to widen the gap, until it becomes impossible for anyone to directly compete.

The Echo Chamber

As previously mentioned, the content of your Facebook newsfeed is assembled by recommendation algorithms. These algorithms represent each user by their behaviour, calculate the similarity between users, and recommend items that similar users liked.

For example, if you are a liberal millennial in San Francisco, the system will begin to recommend things that other liberal millennials in San Francisco like. As time passes, the recommendations for similar users become more and more alike. Recommender systems do not take risks; they want to show you content they know you will enjoy.
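A toy sketch makes this mechanic concrete. The snippet below implements the simplest form of user-based collaborative filtering with cosine similarity; the users, items, and engagement scores are invented, and Facebook's real ranking system is vastly more sophisticated, but the core loop is the same: you are shown what people like you already liked.

```python
import math

# Hypothetical engagement scores: user -> {item: score}. All names invented.
ratings = {
    "alice": {"politics_a": 5, "tech_blog": 3, "satire": 4},
    "bob":   {"politics_a": 4, "tech_blog": 2, "local_news": 5},
    "carol": {"sports": 5, "local_news": 4},
}

def cosine_similarity(u, v):
    # Similarity between two users' engagement vectors.
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

def recommend(user, ratings):
    # Rank unseen items by how much similar users engaged with them.
    scores = {}
    for other, their_items in ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(ratings[user], their_items)
        for item, score in their_items.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * score
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice", ratings))  # bob is most similar, so his items rank first
```

Notice that nothing in `recommend` ever surfaces an item no similar user engaged with, which is precisely why the output narrows over time.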

Most users do not realize that their newsfeed is tailored to them; many assume that everyone sees the same news they do. This is not true. The newsfeed validates people's biases. It tricks you into thinking that everyone agrees with you. It hides opposing opinions and shows you everything it knows you already agree with.

This isolation allows groups to radicalize. It pushed people to conform to similar users, splitting the United States into increasingly homogeneous groups with less and less in common. It allowed fake news to wreak havoc by keeping it quarantined among the most susceptible users.

This is our current political ecosystem: ravaged and isolated by flawed recommendations. The Facebook recommender system was designed to change to match user behaviour. In reality, it changed society's behaviour to match its design.

The Solution

Tech companies have incentives to collect as much data as possible: it powers their recommendation systems, which indirectly makes them more money. They cannot self-regulate; they will always do what's in their best long-term interest. We need our policies to catch up and set limits on what companies can collect to improve recommendations. The EU has drawn a line in the sand by passing the General Data Protection Regulation (GDPR), and there are clear improvements to it that can be applied now.

Companies are collecting everything, even information they do not currently use. Data is cheap to collect and as precious as oil when applied. Companies should not be able to collect what they do not need: every piece of data collected should have a clear application that does not contradict other regulation, like the GDPR.

Not all data should be collected. Messages and calls contain the most sensitive content; we dislike systems knowing what we clicked on, but we hate systems analyzing our messages. The GDPR does not require messages to be end-to-end encrypted. It should. With end-to-end encryption, the system can transmit a message but cannot read its contents, so your private messages could not be used for ads.
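To illustrate what end-to-end encryption guarantees, here is a deliberately minimal toy: a one-time-pad XOR cipher standing in for a real scheme such as the Signal protocol. The point is structural, not cryptographic: only the two endpoints hold the key, so the relaying server stores ciphertext it cannot analyze.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # One-time-pad XOR: a toy stand-in for a production cipher.
    # XOR is its own inverse, so this both encrypts and decrypts.
    return bytes(k ^ b for k, b in zip(key, data))

# The two endpoints share a random key; the relaying server never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))   # known only to sender and recipient

ciphertext = xor_cipher(key, message)     # all the server can store or scan
assert ciphertext != message              # unreadable without the key
assert xor_cipher(key, ciphertext) == message  # recipient recovers the text
```

Because the server only ever handles `ciphertext`, nothing in the message body is available to feed an ad-targeting model.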

You should own your own data. Facebook has no strong competition in social media; Google has no strong competition in search. No one can come close to matching their quality without similar magnitudes of data. They assert their dominance via data, creating a new kind of monopoly. You should be able to transfer your data directly to competitors. The GDPR mentions this, but it leaves the implementation details to the companies, and Facebook will not make it easy for a smaller social network to absorb a customer's data. We should pick the highest-priority use cases, like social media and search, and define exactly how the data should be transferred, so that smaller companies can build systems to absorb and use it.
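No such transfer standard exists today, but a portability format could be as simple as a versioned, machine-readable document. Everything below (the field names, the structure) is invented purely for illustration of what a defined format buys you: any competitor can write one importer against it.

```python
import json

# Hypothetical portable social-data export; schema and fields are invented.
export = {
    "schema_version": "1.0",
    "profile": {"name": "Jane Doe", "location": "San Francisco"},
    "friends": ["user:123", "user:456"],
    "posts": [
        {"created": "2018-04-10T12:00:00Z", "text": "Hello, world"},
    ],
}

# A competing network could ingest this file directly.
serialized = json.dumps(export, indent=2)
restored = json.loads(serialized)
assert restored == export  # lossless round trip
```
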

Companies like Facebook do not have a right to your data; it's a privilege. Data can be used to give you a great user experience, as companies like Spotify show. It should be used to enhance your experience, not to sell you more things. Companies have to realize this and start focusing on value-based uses of data. Transferring data will help, but to make this a reality, you also need the ability to clear your data.

Triton leads the charge in transforming personalization. We've reimagined personalization to give each user value and control at every turn, creating loyal users who want to use your site. Companies should not be extorting users; they should be delighting them. We are helping tech companies improve by applying the lessons learned from Facebook.