The Dark Side Of Big Data
Latanya Sweeney, a Harvard professor, Googled her own name and found, next to her search results, an ad for a background check hinting that she had been arrested. She dug deeper and concluded that so-called black-identifying names were significantly more likely to trigger such ads. She documented her findings in the paper "Discrimination in Online Ad Delivery." Advertisers decide which keywords and other criteria trigger their ads, and Google, like most companies whose primary source of revenue is advertising, does not disclose the details of the algorithms behind its ad offerings. Google denied that AdWords is discriminatory in any way.
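Sweeney's paper leaves open exactly where the bias creeps in: the templates the advertiser registered, or the delivery algorithm optimizing for clicks. Here is a minimal, hypothetical sketch (Google's actual mechanism is undisclosed, and the templates and counts below are made up) of how a click-optimizing ad server could learn such a bias without anyone programming it:

    from collections import defaultdict

    # A hypothetical sketch (not Google's actual, undisclosed mechanism) of
    # how click-optimized ad delivery can learn to pair certain names with
    # certain ad copy, even though no rule about names was ever written.
    TEMPLATES = ["{name}, Arrested?", "We found {name}"]

    # Click/view counts tracked per (name, template) pair, smoothed to
    # avoid division by zero.
    stats = defaultdict(lambda: {"clicks": 1, "views": 2})

    def serve_ad(name):
        # Show whichever template has the higher observed click-through
        # rate for this particular name.
        best = max(TEMPLATES,
                   key=lambda t: stats[(name, t)]["clicks"] / stats[(name, t)]["views"])
        stats[(name, best)]["views"] += 1
        return best.format(name=name)

    def record_click(name, template):
        stats[(name, template)]["clicks"] += 1

    # If searchers click "Arrested?" slightly more often for some names,
    # the optimizer keeps serving it for those names: the bias is learned
    # from user behavior, never explicitly programmed.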
Facebook just announced that it plans to give users more options to provide feedback on which ads are relevant to them and which are not. While on the surface this might sound like a good way to get rid of irrelevant ads and keep marketers as well as users happy, the approach has far more severe consequences than you might think. In the Google AdWords discrimination scenario the algorithm is supposedly blind and has no knowledge of who is searching for what (assuming you're not logged in and there is no cookie effect), but in Facebook's case, ads are targeted based on you as an individual and whatever Facebook might know about you. Algorithms are written by human beings, and knowingly or unknowingly those humans can introduce subtle or blatant discrimination. As marketers, and the companies that serve ads on their behalf, learn more about you as an individual and about your social and professional network, they are a step closer to discriminating against their users, knowingly or unknowingly. There's a fine line between stereotyping and what marketers call "segmentation."
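To see how thin that line is, consider a minimal sketch (the fields, zip codes, and segment below are invented, not anything Facebook or any advertiser has disclosed) of how a facially neutral targeting rule can act as a proxy for a protected attribute:

    # A minimal sketch, with invented fields and data, of how a facially
    # neutral segmentation rule can act as a proxy for a protected attribute.
    users = [
        {"id": 1, "zip": "60619", "likes_luxury_brands": True},
        {"id": 2, "zip": "60611", "likes_luxury_brands": True},
        {"id": 3, "zip": "60611", "likes_luxury_brands": False},
    ]

    def in_premium_segment(user):
        # No protected attribute appears here, but in a segregated city a
        # zip code can be a near-perfect proxy for race or income.
        return user["zip"] == "60611" and user["likes_luxury_brands"]

    shown_the_offer = [u["id"] for u in users if in_premium_segment(u)]
    print(shown_the_offer)  # [2] -- the others never learn an offer existed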
AirBnB crunched their data and concluded that older hosts tend to be more hospitable and younger guests tend to be more generous with their reviews. If this is just for informational purposes, it's interesting. But what if AirBnB uses this information to knowingly or unknowingly discriminate against young hosts and old guests?
A combination of massively parallel computing, sophisticated algorithms that leverage this parallelism, and the ability of those algorithms to learn and adapt in near real time is going to cause many more such issues to surface. As a customer, you simply don't know whether the products or services you are or aren't offered, or the price at which they are offered, reflect discriminatory practices. To complicate this further, in many cases even the companies themselves don't know whether the insights they derive from vast amounts of internal as well as external data are discriminatory. This is the dark side of Big Data.
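Here, for instance, is a deliberately simplified, hypothetical sketch of segment-based dynamic pricing; the segments, multipliers, and profile fields are invented, but the pattern shows why a customer can never tell from the price alone whether a discriminatory rule was involved:

    # A hypothetical sketch of segment-based dynamic pricing; the segments,
    # multipliers, and profile fields are invented for illustration.
    BASE_PRICE = 100.00

    SEGMENT_MULTIPLIER = {
        "price_insensitive": 1.20,   # inferred from device, history, ...
        "bargain_hunter": 0.90,
        "default": 1.00,
    }

    def infer_segment(profile):
        # Real systems learn this from behavioral data; even the company
        # may not know which attributes the model is effectively keying on.
        if profile.get("device") == "high-end-phone":
            return "price_insensitive"
        if profile.get("came_from_coupon_site"):
            return "bargain_hunter"
        return "default"

    def quote(profile):
        # The customer sees only the final number, never the rule behind it.
        return round(BASE_PRICE * SEGMENT_MULTIPLIER[infer_segment(profile)], 2)

    print(quote({"device": "high-end-phone"}))     # 120.0
    print(quote({"came_from_coupon_site": True}))  # 90.0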
The challenge with Big Data is not Big Data itself but what companies could do with your data, combined with any other data, without your explicit understanding of how the algorithms work. To prevent discriminatory practices, we audit employment practices to ensure equal opportunity and audit college admissions to ensure a fair process, but I don't see how anyone is going to audit these algorithms and data practices.
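To be fair, a starting point does exist. Employment audits in the US use the "four-fifths rule" to flag disparate impact, and the same arithmetic could in principle be applied to an algorithm's decisions, assuming an auditor could ever get per-group numbers out of the company. A minimal sketch, with hypothetical numbers:

    # A minimal sketch of a disparate-impact check based on the four-fifths
    # rule from US employment-selection guidelines: a group whose selection
    # rate falls below 80% of the best group's rate gets flagged.
    def disparate_impact(decisions, threshold=0.8):
        """decisions: {group: (selected_count, total_count)}"""
        rates = {g: sel / total for g, (sel, total) in decisions.items()}
        best = max(rates.values())
        return {g: rate / best < threshold for g, rate in rates.items()}

    # Hypothetical numbers: who was shown a favorable offer by an algorithm.
    print(disparate_impact({"group_a": (90, 100), "group_b": (60, 100)}))
    # {'group_a': False, 'group_b': True} -> group_b is flagged

The arithmetic is trivial; the hard part is getting access to the decisions in the first place.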
I have no intention of painting a gloomy picture and blaming technology. Disruptive technology always surfaces socioeconomic issues that either didn't exist before or were not obvious and imminent. Some people get worked up simply because they don't quite understand how the technology works; I still remember politicians trying to blame Gmail for "reading" emails to show ads. I believe Big Data is yet another such disruption that is going to raise similar issues. We should not shy away from them; we should collaboratively work hard to highlight and amplify what these issues might be and address them, rather than blame the technology as evil.
Photo Courtesy: Jonathan Kos-Read