by Bobbin Wages
Harold Weston produces minimal digital exhaust. A simple internet search reveals that he is a clinical associate professor of risk management and insurance at the J. Mack Robinson College of Business, but on a personal level, he is difficult to track. On social media sites like Facebook and Instagram, he doesn’t exist. His cellphone contains no extraneous apps, and its tracking services are disabled. According to Weston, web users’ internet activity is stored in profiles containing every online move they’ve ever made. The data brokers who compile those profiles merge them with public records across the country and sell them to marketing companies and other potential end users – sometimes for sales purposes, sometimes for less benign purposes. To Weston, “big data” can be a euphemism for “creepy Orwellian nightmare.”
To be fair, Weston notes that data mining has its place, particularly when it benefits society. Last spring, the Qatar Computing Research Institute analyzed approximately 3.1 million Arabic tweets to determine who was at greatest risk of joining ISIS. The Pittsburgh Health Data Alliance draws on patients’ medical and health insurance records, DNA data and wearable sensors not only to create personalized healthcare packages but also to detect disease epidemics, prevent outbreaks and improve diagnoses. On February 22, 2015, Terra Seismic predicted, based on satellite data, that a major earthquake would hit the Indonesian island of Sumatra. Nine days later, a 6.4 magnitude earthquake struck. Big data analytics can save lives.
Weston says that big data should be harnessed to advance efficiency, knowledge, science and the public good. “Having less information to make less accurate decisions is bad because that would keep us intentionally dumb and oblivious. We want better tools that can use more information,” he explains. “But we also want to apply practical reason and judgment when it affects people’s integrity and behaviors. And some ethics,” he continues. “Is it right and good for individuals and society that we compile dossiers on everyone?”
Data mining powers online advertising as well. If a woman shops for a wedding dress, Macy’s might beckon her to create a gift registry from her browser’s sidebar within hours. A new cycling enthusiast who Googles road bike models could, unsurprisingly, receive subscription offers from Bicycle Times Magazine in his Facebook feed. In 2012, a Target data analyst predicted a teenager’s pregnancy before she notified her parents; the sudden influx of baby merchandise coupons in the mail ratted her out. Google and public health officials can use search queries about medical symptoms to track diseases and their vectors. However, pharmaceutical companies exploit those same queries to promote drugs that sync with people’s medical histories.
Although the accuracy of digital ads is a little eerie, Weston points out that traditional businesses drew the same conclusions about customers for centuries before the internet existed. For example, the owner of a mom-and-pop bookshop might notify a regular customer of a new title based on the reader’s purchase history and perceived literary preferences. “Small merchants always have used inductive reasoning to build customer profiles and make product recommendations,” Weston says. “It serves the customer better and makes choices easier.”
On the flip side, such narrowly tailored advertising may prevent customers from discovering goods that don’t neatly align with their online personae but would, in reality, complement their lifestyles. “There’s a fine line between facilitating and constraining choices,” Weston says. Additionally, media sites often track readers’ news consumption and reinforce those interests and biases with similar stories.
Data mining gets dicey when it fuels false assumptions or even discrimination. A 30-something could lose a job offer if the employer conducts an online background check and unearths an incriminating photograph of her at a party from more than a decade prior. Travel website Orbitz has admitted to displaying higher hotel prices to Mac users; according to the company’s research, Apple owners spend up to 30 percent more per night on lodging than their PC-carrying peers. More disturbingly, Weston remarks that some companies prey on members of low-income populations or other vulnerable parties by charging them more, not only because of low credit scores but also because of an inability to negotiate. Further, scammers often target the elderly. “Miss Mitchell is a nice old lady and won’t push back,” Weston explains. “The seller knows more about her state of mind than she does herself.”
Currently, no comprehensive U.S. law regulates companies’ data mining practices. Although the Federal Trade Commission has urged Congress to consider legislation that would require data brokers to act with greater transparency, little headway has been made. “People are haunted by a history that no longer is relevant because of links to things no one ever would have found otherwise,” Weston says.
Yet in April 2016, the European Union (EU) adopted the General Data Protection Regulation (GDPR), which grants individuals some control over their personal data. As part of the GDPR, EU residents are entitled to a “right to erasure,” enabling them to request removal of their information from, for example, Google search results. Further, the Charter of Fundamental Rights of the European Union includes a section on the protection of personal data in “light of changes in society, social progress, and scientific and technological developments.”
Aware that they essentially are being watched, some consumers participate in what Weston calls “blowback of data collection” — and alter their behavior accordingly. To look like a health nut, a grocery shopper might purchase kale, tofu and cold-pressed juice with her credit card and frozen pizza, a pack of hot dogs and a bag of Doritos with cash, leaving the former in her refrigerator to rot while gorging on the latter. That falsified data trail could lead her insurance provider to view her as a lower risk and reduce her premium. Morality comes into play as well. Weston offers the example of a cheating spouse who quits using a dating site in an effort to rededicate himself to the marriage. But when the spouse stops logging in to the account, the company displays relentless advertising on the family computer so as not to lose business. In both cases, data collection impairs personal liberty. “It affects our autonomy when we have to stop doing things, or start doing things, to defeat the spying,” Weston says. “It is sort of like a police state, except the police are a private force.”
“Thirty years ago, if someone predicted that a handheld device would be invented and we would be required to keep it on or near us at all times, let it track our locations, record our social habits and personality traits and interests and curiosities and obsessions, create a log of all friends and communications, and then sell all that to companies, well, we might have predicted a revolt,” Weston shrugs.
And yet, most people spend significant free time glued to said device, leaving breadcrumbs for data brokers to follow and predict the next weakness to manipulate.