02/07/2021
Big data is the new oil of the digital economy. Like oil, data is a lucrative and fast-growing resource. It allows companies to capture, interpret, and analyze information in order to engage customers and drive business forward. But there's one crucial difference: data isn't a single-use commodity. It can be reused, shared, and exploited, either deliberately or recklessly. This raises the question: how do we approach the ethics of big data?
by students Nalisha Men and Amanda Marques. Nalisha graduated from the Master in Business Analytics & Data Science at IE University. Amanda, born and raised in São Paulo, Brazil, is also an alum of the same program.
IE University recently held an on-campus talk to shed some expert light on the subject.
Talented, informed, and industry-leading panelists shared their personal views and professional experiences regarding ethics and its precarious role in the world of big data.
In case you missed out, here’s the low-down.
What even are ethics?
First, let's get back to basics. What exactly are ethics? Defining big data is relatively straightforward in comparison: it refers to datasets so large and complex that they're practically impossible to process with traditional methods. The topic of ethics, however, is a lot less black and white.
Essentially, it's all about deciding what counts as right and wrong. Ethical theories are applied to topics as varied as animal rights, human cloning, and the environment. But what makes ethics such a gray area is that it's inherently subjective. In a world as diverse as ours, making policies and setting precedents based on ethics is a delicate undertaking.
Big data and ethics can get along
With this in mind, it's imperative that big multinational companies stay sensitive to ethical differences around the world. Chinese customs might not land well in the US, and what's considered the norm in the US could be deemed unthinkable in India. This means businesses are constantly adapting their marketing strategies to specific countries to avoid causing offense. This is where data steps into the equation.
Most of the world's most renowned organizations, including Amazon, Netflix, and Google, have already mastered data analysis as a way to enhance regional personalization. Amazon, for example, blends real-time user data with product information to formulate individualized recommendations for its customers. Unsurprisingly, these recommendations differ radically from country to country.
Data can make it a lot easier and faster to produce tailor-made services across the globe.
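To make the idea concrete, here is a minimal sketch of region-aware recommendations in Python. The toy catalogue, the field names, and the category-counting score are all illustrative assumptions; Amazon's actual systems are, of course, far more sophisticated.

```python
from collections import Counter

# Hypothetical catalogue: each product carries the markets where it may be
# offered and a simple category label (invented data for this example).
CATALOG = [
    {"id": "p1", "category": "books",  "regions": {"US", "IN"}},
    {"id": "p2", "category": "snacks", "regions": {"US"}},
    {"id": "p3", "category": "books",  "regions": {"IN"}},
    {"id": "p4", "category": "tea",    "regions": {"IN", "CN"}},
]

def recommend(history, region, k=2):
    """Rank in-market items by how often their category appears in the
    user's purchase history: a toy content-based score."""
    taste = Counter(item["category"] for item in history)
    seen = {item["id"] for item in history}
    candidates = [
        item for item in CATALOG
        if region in item["regions"] and item["id"] not in seen
    ]
    ranked = sorted(candidates, key=lambda i: taste[i["category"]], reverse=True)
    return [item["id"] for item in ranked[:k]]

# The same purchase history yields different suggestions in each market.
history = [{"id": "p1", "category": "books"}]
print(recommend(history, "US"))  # ['p2']
print(recommend(history, "IN"))  # ['p3', 'p4']
```

The same behavioral signal produces different output per market, which is exactly the regional tailoring described above.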
But surely there’s a catch?
Who does this data really belong to? And when does it stop being our own?
IE's event touched upon this, exploring how far data can be utilized before ethical boundaries are crossed. At a base level, we all stand to benefit from personalized content. Who doesn't love Netflix's recommendation engine? It saves us a whole lot of time trawling through the app in search of our next binge-worthy series. And of course, it provides a huge boost to the company's revenue: over 75% of what users watch on the platform comes from its recommendation system.
However, for these one-of-a-kind recommendations to be formulated, our personal data has to pass through many hands, often ending up halfway across the world, and you can forget about ever getting it back.
The right to be forgotten
You've probably heard of Cambridge Analytica. You might even know a little about the scandal it was involved in, and how it shook the world of data. In a nutshell, in 2018 it came to light that Cambridge Analytica, a political data firm, had harvested the personal data of millions of Facebook users in order to influence voters in the 2016 US presidential election. It remains one of the most notorious misuses of big data to date.
Unsurprisingly, this brought renewed focus to the Right to be Forgotten, now codified as the right to erasure in Article 17 of the EU's General Data Protection Regulation (GDPR). Under the regulation, when personal data is no longer required for its original purpose, or the data subject withdraws their consent, it must be erased without undue delay. As "simple" as that.
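To illustrate what honoring such a request might involve, here is a minimal Python sketch. The in-memory user store, field names, and audit log are hypothetical; a real erasure workflow would also need to reach backups, analytics copies, and third-party processors.

```python
from datetime import datetime, timezone

# Hypothetical in-memory user store (invented data for this example).
users = {
    "u123": {"email": "ana@example.com", "orders": ["o-77"]},
}

# Regulators expect evidence that a request was honored, so keep a
# minimal, non-identifying audit trail.
erasure_log = []

def erase_user(user_id, reason):
    """Honor a right-to-erasure request: delete the record and log the
    event. A sketch of the idea, not legal advice."""
    users.pop(user_id, None)
    erasure_log.append({
        "user_ref": user_id,  # in practice, a pseudonymous reference
        "reason": reason,
        "at": datetime.now(timezone.utc).isoformat(),
    })

erase_user("u123", reason="consent withdrawn")
print(users)        # {}: the record is gone
print(erasure_log)  # one timestamped entry documenting the erasure
```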
The right was first recognized by the EU Court of Justice in its 2014 Google Spain ruling, and since then it has opened up a Pandora's box of controversy. We're talking heated debates over freedom of expression, the right to privacy, and access to information. As Julia Powles, law and technology researcher at the University of Cambridge, puts it:
“There is a public sphere of memory and truth, and there is a private one… Without the freedom to be private, we have precious little freedom at all.”
One small step for data analysts, one giant leap for ethics
So is it possible, in the cutting-edge, data-driven, fast-paced world we live in, to keep our data entirely to ourselves? Probably not. That ship sailed quite a while ago. But it's definitely within our rights, and within reach, to have a bit more privacy.
First of all, companies should only collect data when doing so is consensual, sustainable, and adds value to their business. As Joshua Kanter, Senior Vice President of Marketing at Caesars Entertainment, puts it:
"Before conducting any new analysis, we ask ourselves whether it will bring benefit to customers in addition to the company. If it doesn't, we won't do it."
There's also plenty of room for more transparency in how data is handled. Customer trust is one of the most valuable commodities in today's global economy, and telling customers exactly how their data is collected, processed, and shared is the first step in building a long-lasting, rewarding relationship with them. That, in turn, strengthens brand loyalty and revenue. A sketch of what this can look like in practice follows below.
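As a rough illustration of consent-aware analysis, here is a short Python sketch. The records, the purpose labels, and the opt-in sets are invented for the example; the point is simply that "ask first" can be baked into the pipeline itself.

```python
# Each record carries the purposes its owner explicitly opted in to
# (field names and purpose labels are invented for this example).
records = [
    {"user": "u1", "spend": 120.0, "consents": {"personalization"}},
    {"user": "u2", "spend": 80.0,  "consents": set()},
]

def analyze(records, purpose):
    """Use only records whose owners consented to this purpose, and be
    transparent about how many were excluded."""
    allowed = [r for r in records if purpose in r["consents"]]
    print(f"{purpose}: using {len(allowed)} of {len(records)} records")
    return allowed

subset = analyze(records, "personalization")  # only u1 is included
```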
The bottom line
So, what's in store for ethics in big data? Moving forward, companies need to adopt stricter ethical rules around the way they collect, use, store, and share customer data. Small changes, such as limiting the intake of user information, analyzing it only when necessary, and making intentions known from the get-go, will make a world of difference. After all, there's a fine line between embracing data and taking advantage of it.