Mental Health Matters: Why you should be leery of ‘Big Data’
I’m starting to worry. The more I think about this, the more concerned I become. And I’m not talking about a Trump Presidency or the “Anyone but Hillary” Movement.
No, I’m thinking about the rise of “Big Data” and how it’s changing what we think and even how we think.
Remember the dystopian novel “1984,” published nearly 70 years ago, about a future in which everything you thought you knew was no more than propagandized programming by an all-powerful government? If you haven’t read this book by George Orwell, now is the time.
But I digress. “The Rise of Big Data,” a piece in Foreign Affairs, May/June 2013, explains that computer technology allows for the accumulation of massive databases on everyone and everything.
Turn on your computer, look for products, open up “news” feeds, listen to music, and sophisticated algorithms are learning about your interests, beliefs, habits, politics and personal preferences.
Next thing you know, you’re being invited, based on the data generated by those algorithms, to buy products or view stories that reflect your mindset.
You ask, “What’s wrong with that?” After all, now the stories, the products, the news — it’s all easier to come by.
True, but there is a dumbing-down side. The same algorithms used to discern your proclivities, your preferences, even your thoughts and behavior can be tweaked a bit so that the news, stories and products they present become tools commercial interests use to subtly influence, direct or reinforce your views and your emotions.
How does that happen? Will Oremus, writing for Slate on 1/3/16, explains that Facebook’s news feed “can be tweaked to make us happy or sad; it can expose us to new and challenging ideas or insulate us in ideological bubbles.”
Happy or sad? Ideological bubbles? What?
A paper in the Proceedings of the National Academy of Sciences explains that when the news feed is manipulated to increase or reduce positive or negative messages, readers tend to post messages of similar emotional content. Says Slate, 6/28/14, “Social networks can propagate positive and negative feelings.”
So, web companies use sophisticated algorithms to tailor news feeds and search results to your very own personal tastes. You then get exposed to more information that simply confirms what you already think you know.
It’s like Fox News or CNBC on steroids. How neat is that? Everything you think you know is now newly confirmed and reinforced by the news feed you can get on your computer several times a day!
Understand, I’m no computer wizard. It took me a while to figure out that the invitations my computer was sending me were no more than algorithmic elaborations of the data I had previously generated. The computer, having digested my searches, fed me stories it predicted I would “like.”
Clever, right? After all, confirming what you think you already know makes you feel good, makes you feel smart, makes you know it’s the other guy who’s stupid, and makes you want more similar stories.
Perhaps that’s one reason some parents have become increasingly leery of the “free” software provided by schools because it allows access to a child’s generated data, which can be analyzed, stored and sold.
In the article “Data Mining Your Children,” for politico.com, Barmack Nassirian asks, “Do you want private companies to get into your kid’s head and mine the learning process for profit?” He muses, “A model legislative bill might set up a central state database for student records and allow colleges and businesses to browse them in search of potential recruits.”
And it can get worse, much worse. We already live in a world of increasingly polarized beliefs, compromising our ability to empathize and understand those we see as different from ourselves.
Increasingly, we tend to cluster in communities of like-minded people. The resulting echo chamber effect means your preconceived notions, right or wrong, are reiterated and reinforced by people who think like you do.
This is unhealthy. It fosters hostility and self-righteous indignation toward people who are different. “I’m right, you’re wrong” offers no middle ground, no way to derive a new, richer, nuanced and empathic understanding of opposing viewpoints.
A “closed” mind means you can’t think critically, you can’t acknowledge that a different viewpoint may have merit. You haven’t learned how to think, only what to think.
To widen your understanding, to learn to empathize with those whose beliefs are not yours, and to realize that what you think you know is incomplete and imperfect, try this: the next time you open the news on your computer, seek out perspectives different from the ones your computer predicts you’ll like.
Incline Village resident Andrew Whyman, MD, is a clinical and forensic psychiatrist. He can be reached for comment at firstname.lastname@example.org.