The Day Big Data & Facebook Betrayed the Internet
For a decade now, Facebook (and social media in general) has been a vital, albeit controversial, invention of the twenty-first century. It has changed our world drastically, and its positives outweigh its negatives — that should be acknowledged. But in light of the latest revelations about it, it may be time we, as informed netizens, view it with suspicion and outrage.
On 17 March, the investigative teams of The New York Times, The Observer and The Guardian, together with a whistleblower named Christopher Wylie, revealed to the world how political parties and data analytics companies got their hands on the personal information of 50 million Facebook users and used it for nefarious and ulterior ends. Cambridge Analytica, a political consulting firm, allegedly did so for two major world events: Brexit and the election of Donald J. Trump as the current US President. The latter put Facebook under heavy scrutiny, as it was revealed that Trump's campaign used the platform to unfairly sway American voters, and that Russia had aided and abetted them in the process.
But the latest reports also revealed Facebook's naiveté and lack of concern for its users as these two events came about. And although the reports did not quite confirm the full extent to which this data shaped those two events, nothing in them was coincidental by any means. This is what transpired.
Around 2015, Aleksandr Kogan, an academic from Cambridge University, got permission from Facebook to use the data of its users for an app he was working on. The app was a type of personality quiz that Facebook users could take, provided they consented to giving it access to their own and their friends' Facebook profiles. This is something we have all done at some point, and it is standard practice for apps and websites in general. It is also how Facebook and its advertisers generate money: the data is used to serve ads specifically targeted at each user.
Although only 270,000 people used the app and took the quiz, Kogan ended up with the personal information of around 50 million users, because the app also harvested data on the "friends" the participants had allowed it to access. Cambridge Analytica — partly owned by Steve Bannon, an associate of Donald Trump who served on his campaign team and who also knew Kogan — got its hands on this data. According to Wylie, a former employee at Cambridge Analytica, Bannon and the firm's CEO, Alexander Nix, used this data to create psychographic profiles (categorizing people by their personalities, likes and dislikes). These profiles proved vital: the firm then used them to create political ads and propaganda on Facebook, tailored just enough to sufficiently influence the users being targeted. Reporters went undercover, posing as clients to Nix, and recorded their exchanges with him (these can be viewed on Channel 4's YouTube page). It must be clarified that using psychographics and demographics for political campaigning is not a crime in itself. How it was done here, however, can be questioned.
When this was revealed, Facebook found itself in a tight spot, as it confirmed the suspicions of those who felt the platform had done little to handle the situation. Its CEO, Mark Zuckerberg, did not respond to the allegations at once. The company did announce, a few days before the actual reports came out, that it had suspended Kogan for violating its terms by passing the data on to third parties. (Kogan assured Facebook that he no longer had any of the data, but Wylie has said that the data was still held at Cambridge Analytica.) The reports, however, revealed something more telling: that Facebook's algorithm and business model were designed in a way that could be exploited by firms like Cambridge Analytica, and that one can hardly use the internet for any purpose beyond basic surfing without a Facebook account — which in turn requires users to let the platform use their data (display pictures; what they have liked, shared and commented on; and the same for their friends).
Eventually, Zuckerberg did speak out on the issue, stating that his team would work on fixes to protect users' data better and writing a lengthy post about it on his Facebook account. But this time, many could tell that his speaking out was nothing more than a PR ploy. For one, Zuckerberg did not explicitly apologize; two, his strategy involved only fixing the bugs, not changing the design of Facebook itself that made it susceptible in the first place (namely, strengthening users' privacy); and three, the company made the grave error of threatening to sue the news organizations that had publicized these revelations.
This naturally led to the #DeleteFacebook movement, with bigwigs like Tesla founder Elon Musk and WhatsApp CEO Jan Koum urging people to remove Facebook from their lives. It has prompted the US government to call Zuckerberg before Congress for questioning on his actions. Facebook lost heavily in financial terms as its stock plummeted. The scandal also tarnished Zuckerberg's reputation as a reliable and responsible CEO, a reputation that may erode further in the coming weeks.
But it has done something much worse: it has cast doubt and mistrust among us toward the website. It has made netizens regret uploading their personal information and pictures in the first place, and granting access to them to others. It has made us realize that corporations hold nothing sacred in the pursuit of profit, nor have any regard for us as consumers. It has shown us how vulnerable we all are, and how big data can be used against us to manipulate us into making horrible decisions. And finally, it has made us realize how difficult escaping the realm of Facebook is (deactivating our accounts does not help, as deactivation is not the same as deletion, and the site still retains our information). Social networking has become so necessary for our education, jobs, news and other vital things that even if we leave it forever, we users will still lose in the long run.