Are Your Beliefs Your Own? Algorithms And The Theft Of Thought
The data you generate and hand over today is being processed by vested interests not only to anticipate and influence your future beliefs but to manipulate them.
If you are not a marketer, an advertiser, or an AI ethicist, I am willing to bet the boat that you’ve never heard of these terms: “behavioral twin”, “algorithmic nudge” (or “nudge theory”), and “dark patterns”. As a consumer, though, you need to be acutely aware of what each of those terms means.
At the outset, let me say this: I am not an alarmist or a rabble-rouser. But what you are about to read are facts from the world we live in, with no embellishment. Facts that act as a roadmap, indicating where we are headed.
Prepare to be surprised.
By now, it's common knowledge that in our digital age, exchanging data for convenience has become the norm. Research indicates younger demographics, particularly those below 30, are increasingly comfortable sharing personal information in return for tailored experiences and perceived value.
So it works like this: I publicly share my favorite sneaker brand, the colors I primarily like in such shoes, and the occasions when I love wearing sneakers; not all of this information at once, but on different occasions and from different platforms, including social media. Soon after, the tech company and the advertiser/seller collaboratively stitch together all my information around sneakers and then serve up pop-up offers, or whatever else it may be, to catch my attention and goad me into buying a pair, even though I probably don’t need new sneakers at this point in my life.
Almost all of us have gotten used to such online “behavior” by now, right? Yet just about a decade ago, predictive-analytics-based marketing still drew an “OMG”.
The Illusion of Choice: You Are No Longer In Control
What if the data “they” have isn't just about you today, but the “you” of tomorrow? And what if it's not just about what you might do, but what they will make you do?
A future where your actions are subtly guided, rather than consciously chosen, may be closer than you think. This is the insidious reality of "unforeseen commodification", where companies are now selling access to our future selves, effectively manipulating our choices before we even make them! Unless safeguards are put in place, we may soon lose much of the power to direct our own actions.
I am not an analyst, a psychologist, or some such, but a digital marketer, an AI communicator, and a futurist. Given my four decades of experience in content, marketing, and tech, coupled with my interactions with peers, other experts, and consumers, and my own experience as a consumer, if I were to hazard an “educated guess”, I would estimate that consumer manipulation today stands at level 6 out of 10.
The Illusion of Choice: When Algorithms Know You Better Than You Do
Enter artificial intelligence (AI). Humans take pride in being autonomous and in their ability to make independent decisions. Yet, increasingly, algorithms are predicting our desires and inclinations. Prediction alone might have been fine, but “they” are going beyond that and actively manipulating your choices and mine.
Fueled by AI, the nature of the beast is changing very fast. It's not limited to just showing us relevant ads anymore; it's about actively manipulating our emotions and desires to achieve the organization’s/seller's desired outcome.
Imagine this: Without your conscious involvement, you've been subtly nudged toward a particular lifestyle through curated social media feeds and personalized content. Your browsing history, location data, and even your physiological responses, captured through wearable devices, are analyzed to build a predictive model of your future self. This model isn't just a statistical average; it's a dynamic, evolving representation of your potential choices, desires, and vulnerabilities.
Tech and other companies can then "sell" access to this model to interested retailers, allowing them not only to anticipate your needs but also to manipulate your future behavior. It’s no longer limited to a few big-time retailers just wanting to drive their bottom line, but is spreading to financial institutions and even political parties seeking to subconsciously mold your opinions and point of view.
The ability to reinvent ourselves, to learn from past mistakes, and to evolve is a fundamental aspect of human experience. However, when our future selves are locked into a fixed, data-driven identity, this ability is compromised. The past becomes a permanent record, not a learning experience, and the future becomes a predetermined script, not an open canvas. And now, even that script is being subtly re-written not by you, the scriptwriter, but by hidden (black-boxed) algorithms.
The Illusion of Choice: When Algorithms Start to Nudge You in a Certain Direction
This “commodification” of our future selves creates a disturbing temporal distortion. The present moment becomes less about lived experience and more about a data point feeding into future projections.
Our future is no longer about the unknown. Our choices are no longer driven by our immediate desires or conscious deliberations, but by algorithms that cunningly shape what we think we will want in the future.
…It Goes Beyond Retail
The commodification of our future selves extends beyond marketing and advertising. It's about the financialization of potential, where companies can trade "futures" on our behavior.
Picture this: insurance companies are already using algorithms to judge your risk, based not on your present but on your predicted future. Now, imagine them pressuring you to change your behavior, and treating any deviation from the prediction as a violation of your existing policy, voiding your coverage.
In a more extreme scenario, companies could trade predictions about individual behavior, creating a market for human potential itself. This would transform us from autonomous individuals into commodities, our futures bought and sold like stocks on a market.
In the age of AI, our digital footprints have become more than just a trail of breadcrumbs. They are a treasure trove of information that companies are increasingly using to manipulate our choices.
Behavioral Twins
Take, for example, the relatively new phenomenon of “behavioral (digital) twins”*. These AI avatars are akin to the static “buyer personas” of the pre-AI era (digital marketers and advertisers will know what I am talking about), only more sophisticated, dynamic, and real-time.
(*The term is used loosely here more for the reader’s understanding than anything else.)
I’ve written about digital twins before. Though not really the same, a behavioral twin can be called a “subset” of a digital twin. It is a digital representation of you, created by analyzing your online behavior.
A behavioral twin focuses on modeling and predicting human (or system) behavior rather than merely replicating a physical entity. It aims to understand patterns, decision-making, and interactions, and to simulate and predict behavior based on those data patterns.
In effect, it is a complex model that predicts your future actions, preferences, and even emotional states. Companies use this information to target you with personalized ads and offers, often without your knowledge or consent.
How are Behavioral (Digital) Twins Created?
Your virtual twin is created by collecting data from various sources, including your browsing history, social media activity, and even your physical location. This data is then analyzed to identify patterns and trends in your behavior. Once your behavioral twin is created, it can be used to predict your future actions with surprising accuracy.
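To make this less abstract, here is a minimal sketch, in Python, of how such a twin might be assembled from cross-platform signals. Everything here, from the class name to the source weights, is a hypothetical simplification; real systems ingest far richer data and use far more sophisticated models than simple frequency counts.

```python
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical weights: a deliberate search says more about intent
# than a passive social-media like.
SOURCE_WEIGHTS = {"search": 3, "location": 2, "social": 1}

@dataclass
class BehavioralTwin:
    user_id: str
    interests: Counter = field(default_factory=Counter)

    def ingest(self, source: str, signal: str) -> None:
        # Each observed event (a search, a like, a visited place)
        # nudges the weight of the matching interest upward.
        self.interests[signal] += SOURCE_WEIGHTS.get(source, 1)

    def predict_next_interest(self) -> str | None:
        # Naive prediction: the heaviest interest wins. Production
        # systems would use sequence models, not raw counts.
        if not self.interests:
            return None
        return self.interests.most_common(1)[0][0]

# Signals stitched together from different platforms over time.
twin = BehavioralTwin(user_id="u-123")
for source, signal in [
    ("search", "sneakers"), ("social", "sneakers"),
    ("search", "running"), ("location", "sports-store"),
    ("social", "sneakers"),
]:
    twin.ingest(source, signal)

print(twin.predict_next_interest())  # -> "sneakers"
```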
The Dangers
While behavioral twins can be used to provide personalized experiences, they also pose several dangers. For example, they can be used to manipulate your choices by showing you ads for products and services that you are likely to purchase. This can lead to impulse buying and overspending.
Behavioral twins can also be used to create filter bubbles, which are echo chambers of information that confirm your existing biases. This can make it difficult to see other perspectives and can lead to political polarization.
Beyond Behavioral Twins: The Importance of Context and Agency
While virtual twins play a role in this manipulation process, they are but a part of the overall problem. The focus must always be on the broader context of data commodification. Remember, it's not just about predicting behavior; it's also about manipulating it. It's about selling access to our potential, effectively turning us into commodities in a market for human futures.
Humans As Commodities: The Erosion of Intrinsic Value
The commodification of humans extends beyond simply selling data; it's about reducing individuals to quantifiable units of potential behavior. In this framework:
Intrinsic Value Diminishes: Human worth is no longer defined by inherent qualities like creativity, empathy, or moral character. Instead, value is determined by predictive models, risk assessments, and potential consumer behavior.
Data as Currency: Our digital footprints become our primary assets, traded and exploited for profit. We are reduced to data points, our lives segmented and analyzed for marketability.
Loss of Individuality: The algorithms create a standardized, predictable version of us, stripping away the nuances and complexities that make us unique. We become interchangeable units in a vast data-driven system.
The Rise of Social Scoring: If a company can correctly predict your future actions, that data can be used to score you, and the score can then be sold to other companies (a toy sketch follows below). This is where the true danger lies.
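To see how easily predictions collapse into a single sellable number, consider this deliberately crude sketch. Every behavior category and weight below is invented for illustration; real scoring systems are opaque and far more elaborate.

```python
# A deliberately crude illustration of a "social score": predicted
# behaviors collapsed into one sellable number. Every weight and
# category below is invented for this example.
PREDICTED_BEHAVIOR_WEIGHTS = {
    "on_time_payments": +30,
    "impulse_purchases": -10,
    "late_night_browsing": -5,
    "gym_visits": +10,
}

def social_score(predictions: dict[str, float]) -> float:
    """Weighted sum of predicted behavior probabilities (0.0 to 1.0)."""
    base = 500.0
    for behavior, prob in predictions.items():
        base += PREDICTED_BEHAVIOR_WEIGHTS.get(behavior, 0) * prob
    return base

score = social_score({
    "on_time_payments": 0.9,
    "impulse_purchases": 0.6,
    "late_night_browsing": 0.8,
})
print(round(score, 1))  # 500 + 27 - 6 - 4 = 517.0
```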
Consequences of Deliberate Manipulation: A Case Study
Imagine a cunning marketing agency tasked with selling a new, overpriced, and arguably unnecessary "wellness" product. They possess access to sophisticated predictive models and behavioral manipulation techniques. Here's how they might operate:
Targeted Vulnerability Exploitation: The agency identifies individuals with predicted anxieties or insecurities related to health and well-being.
Algorithmic Nudging and Social Proof: They create highly personalized content, subtly nudging targeted individuals towards the product.
Emotional Manipulation: They leverage psychological triggers, such as fear of missing out (FOMO) or the desire for social acceptance, to create a sense of need.
Pre-emptive Purchase Prompts: The agency utilizes predictive models to anticipate moments of vulnerability, such as periods of stress or loneliness, and delivers targeted ads or offers at those precise times (a simplified sketch follows below).
The result: The targeted individuals, believing they are making autonomous choices, purchase the product, often at a premium price.
The longer-term result: Consumers begin to distrust their own feelings and to rely on the very algorithms they know are trying to manipulate them. This can cause high levels of anxiety and depression.
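To make the mechanics of those pre-emptive prompts concrete, here is a simplified sketch of a vulnerability-triggered ad pipeline. The signal names, window size, and threshold are all assumptions made for illustration, not a description of any real platform.

```python
# Simplified sketch of a "pre-emptive purchase prompt" pipeline:
# watch a stream of mood signals and fire a targeted offer the moment
# a vulnerability window is detected. Thresholds, signal names, and
# the trigger logic are all assumptions made for illustration.
from statistics import mean

MOOD_WINDOW = 7          # days of signal history to compare against
DIP_THRESHOLD = 0.85     # fire when today's mood < 85% of the baseline

def should_prompt(mood_history: list[float], today: float) -> bool:
    if len(mood_history) < MOOD_WINDOW:
        return False     # not enough data to establish a baseline
    baseline = mean(mood_history[-MOOD_WINDOW:])
    return today < baseline * DIP_THRESHOLD

# Seven "normal" days, then a mild dip -- enough to trigger an offer.
history = [0.70, 0.72, 0.68, 0.71, 0.69, 0.73, 0.70]
if should_prompt(history, today=0.55):
    print("deliver 'you deserve a little treat' ad now")
```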
Let’s see how Sdxah, a fictitious character, is “manipulated” into believing there’s something wrong with her mental health by one such unscrupulous firm:
Case Study To Illustrate Subtle But Insidious Manipulation
Background: Meet Sdxah, a 32-year-old marketing professional from London, New York, or New Delhi… take your pick. She faces all the ups and downs of metro life, like most of us. A mother and a career person, she is also known to be occasionally “overwhelmed” by life.
This Thursday, Sdxah was having a slightly "off" week. An otherwise “edgy” individual, she wasn't experiencing overwhelming stress but rather a general feeling of mild unease and a slight dip in her usual energy levels. She was spending a bit more time than usual on social media, seeking a little pick-me-up. Unbeknownst to her, her digital footprint was being meticulously recorded and analyzed.
Data Collection:
Browsing History: For a few months prior, Sdxah had frequently searched for terms like "stress relief," "meditation," "self-care," and "mindfulness." But in her “feeling low” phase this week, she searched for "self-care" in a more casual way, like "relaxing evening routine" or "easy ways to unwind."
Social Media Activity: More passive scrolling, engaging with lighthearted content, but perhaps a few more posts related to "feeling down" or "needing a boost."
Device Data: Her smartwatch showed a slight decrease in activity levels and a few nights of slightly less restful sleep, but nothing alarming.
App Usage: Sdxah used a digital journaling app (the contents of which had already been sold to a third party). In her “current mood”, she continued journaling, but her entries showed less stress and more of a feeling of being unmotivated.
Algorithmic Analysis, Prediction, and Manipulation:
An AI-powered marketing platform had analyzed Sdxah's data and constructed a highly detailed profile earlier, even before the start of “this stressful week”. According to her behavioral digital twin, Sdxah was:
Experiencing significant stress and actively seeking solutions.
Susceptible to marketing messages that emphasized self-care and emotional well-being.
Likely to respond positively to products that offered convenience and a sense of community.
Yet, during the week in question, the AI-powered marketing platform continued to analyze Sdxah’s incoming data, and its “predictions” now shaded into subtle manipulation, nudging Sdxah toward believing things about herself that were not true:
It detected a mild emotional vulnerability, a temporary dip in her usual mood.
It identified her as someone receptive to messages that offered a sense of comfort and ease.
It recognized her interest in self-care, even in a casual context.
The Manipulation (Subtler and More Insidious):
Given her background and psychological profile, the AI picked up on the fact that Sdxah was feeling a little low this week. In the analog world, a movie, a bowl of soup, or even a good cry would have released the pressure. Instead, the AI started sending alerts to “Mindful Moments”, the fictitious wellness company from which Sdxah had previously bought medications and the like, prompting it to swing into action.
Soon, Sdxah gets…
"Relatable" Advertising: The ads for "Mindful Moments" this time around were less focused on extreme stress relief and more on "easy ways to brighten your day" or "simple pleasures." They featured relatable scenarios, like enjoying a cup of tea or taking a relaxing bath.
"Just a Little Treat" Messaging: The language emphasized self-indulgence and gentle pampering, framing the subscription box as a small, affordable luxury.
Social Media "Relatability": Influencers portrayed the subscription box as a way to add a touch of joy to everyday life, rather than a solution to serious stress.
"You Deserve It" Psychology: The marketing subtly reinforced the idea that Sdxah deserved a little treat, capitalizing on her slightly lowered mood.
Journaling App Data, Repurposed: Instead of focusing on stress, the company used the data to highlight that Sdxah was feeling under-motivated, and that the box could help her feel more inspired.
The Purchase:
Sdxah, feeling a little low and drawn to the idea of a gentle pick-me-up, clicks on the offer from “Mindful Moments” and makes the purchase. She sees it as a small, harmless indulgence.
The Aftermath:
The purchase provided a brief moment of enjoyment, but the effect was fleeting.
Sdxah didn’t feel manipulated because the marketing was subtle.
She began to rely on small purchases to boost her mood.
She now trusts the ads, because they seem so relatable.
Analysis:
This case study illustrates not only how a combination of data collection, algorithmic analysis, and psychological manipulation can be used to influence consumer behavior, but also how even mild emotional states can be exploited for commercial gain. The manipulation is stealthier because it is less overt, making it harder for consumers to recognize.
Mass Manipulation: The Potential for Societal Control
The techniques described above can be scaled to manipulate entire populations. This has profound implications for:
Political Manipulation: Political campaigns can use predictive models to target specific demographics with tailored messaging, exploiting their predicted biases and vulnerabilities. This can undermine democratic processes and erode public trust.
Consumerism and Debt: Companies can create a culture of artificial need, encouraging impulsive purchases and fueling consumer debt. This can lead to widespread financial instability and social inequality.
Social Polarization: Algorithms can amplify existing social divisions by creating filter bubbles and echo chambers, reinforcing biases and hindering constructive dialogue.
Public Health: Corporations can manipulate the masses into unhealthy habits through targeted advertising. For example, a junk food company could target people who are predicted to have a higher chance of obesity.
The Erosion of Critical Thinking: If people become used to having their decisions made for them, they will lose the ability to think critically. This would be a disaster for society.
Let me add some context to help readers understand the scale of the issue:
The rise of "surveillance capitalism":
The concept of "surveillance capitalism", coined by Shoshana Zuboff, describes how tech companies collect and analyze vast amounts of personal data to predict and influence consumer behavior. This practice is widespread, and it significantly impacts how consumers make decisions.
This means that many of the online interactions consumers have today are being recorded and then used to influence future interactions.
The prevalence of "dark patterns":
"Dark patterns" are deceptive design elements used on websites and apps to trick users into doing things they didn't intend. These tactics are increasingly common, and they contribute to consumer manipulation.
Dark patterns like countdown timers, hidden information, and subscription traps, to name a few, are now being addressed by some regulatory bodies, showing that they are a clear and present issue.
On July 10, 2024, the International Consumer Protection and Enforcement Network (ICPEN), led by the US Federal Trade Commission, revealed shocking findings: a review of websites and apps showed nearly 76% employed at least one dark pattern, with nearly 67% using multiple deceptive tactics. These practices, classified according to OECD guidelines, highlight the widespread manipulation of consumers.
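As a concrete illustration of the simplest of these tactics, here is a sketch of a fake countdown timer, whose "deadline" quietly resets for every visitor, so the urgency is entirely manufactured. This is a toy reconstruction of the pattern, not code from any real site.

```python
# Sketch of the fake-countdown dark pattern mentioned above: the
# "offer deadline" is computed per request, so every visitor sees
# "only 10 minutes left!" forever. Purely illustrative.
from datetime import datetime, timedelta

def offer_deadline_for(request_time: datetime) -> datetime:
    # An honest timer would be fixed server-side; this one resets
    # on every page load, manufacturing false urgency.
    return request_time + timedelta(minutes=10)

now = datetime.now()
deadline = offer_deadline_for(now)
print(f"Hurry! Offer ends in {(deadline - now).seconds // 60} minutes!")
```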
The power of targeted advertising:
Advanced algorithms allow companies to deliver highly personalized ads that exploit individual vulnerabilities and preferences. This level of targeting can be very effective in influencing consumer choices.
The use of AI in targeted advertising is becoming ever more sophisticated.
The influence of social media:
Social media platforms play a significant role in shaping consumer opinions and behaviors. Algorithms curate content to keep users engaged, which can lead to the spread of misinformation and the reinforcement of existing biases.
It's important to note that this article is a subjective assessment, and the actual level of manipulation may vary depending on individual circumstances. However, there is little doubt that consumers are facing increasing pressures from sophisticated manipulation tactics.
The sheer volume of data collected and the sophistication of AI-driven manipulation techniques are unprecedented.
Many consumers are unaware of the extent to which their behavior is being influenced.
Regulatory efforts to address these issues are still in their early stages.
Awareness of this unseen commodification can eventually erode social trust. When we realize that our choices are being manipulated and our futures are being sold, we may feel a profound sense of alienation and distrust.
This can lead to a society where individuals are constantly monitored and controlled, not by overt force, but by subtle algorithmic nudges. The line between helpful personalization and manipulative control becomes increasingly blurred, and we may find ourselves living in a world where our choices are no longer our own.
Reclaiming Our Future: The Need for Algorithmic Friction and Temporal Privacy Rights
Can this trend be countered, or is it too late? Yes, it can be. Consumers can occasionally and deliberately introduce noise and uncertainty into the data they generate; alternatively, the designers of such AI systems can deliberately build in “algorithmic friction”. Either approach makes it harder to construct accurate predictive models.
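As a sketch of what such friction could look like on the consumer side, here is a toy implementation of randomized response, a long-standing privacy technique: with some probability, a decoy interest is reported instead of the real one. The decoy topics and noise rate are arbitrary choices made for this example.

```python
# A minimal sketch of consumer-side "algorithmic friction": randomized
# response, a classic privacy technique. With probability NOISE_RATE a
# decoy interest is reported instead of the real one, so any profile
# built from the stream is systematically uncertain. Decoy topics are
# invented for this example.
import random

NOISE_RATE = 0.3
DECOY_TOPICS = ["gardening", "chess", "astronomy", "woodworking"]

def report_interest(true_interest: str) -> str:
    if random.random() < NOISE_RATE:
        return random.choice(DECOY_TOPICS)   # inject noise
    return true_interest                     # report truthfully

# Roughly 30% of the reported signals below will be decoys,
# degrading the accuracy of any behavioral twin built from them.
stream = [report_interest("sneakers") for _ in range(10)]
print(stream)
```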
We also need to establish "temporal privacy rights," which would protect individuals' ability to control the use of their data over time. This includes the right to "reset" their data-driven identities and reclaim their future selves.
Furthermore, fostering digital literacy and critical thinking skills is essential. Consumers need to understand how their data is being used and how algorithms are influencing their choices.
We must resist the temptation to surrender our autonomy to algorithms and instead strive to shape our own destinies.
The Need for Ethical Boundaries and Regulatory Oversight
The potential for mass manipulation highlights the urgent need for ethical boundaries and regulatory oversight. This includes:
Transparency and Accountability: Companies must be transparent about how they use data and algorithms to influence consumer behavior.
Data Privacy Regulations: Stronger data privacy regulations are needed to protect individuals from the exploitation of their personal information. There must be international laws that ban the selling of social scores.
Algorithmic Auditing: Independent audits of algorithms are necessary to identify and mitigate potential biases and manipulative practices.
Digital Literacy Education: Public education programs are essential to empower individuals to understand and navigate the complexities of the digital age.
References:
https://determ.com/blog/how-ai-is-changing-the-way-we-predict-consumer-behavior/
https://www.oecd.org/en/blogs/2024/09/six-dark-patterns-used-to-manipulate-you-when-shopping-online.html
https://en.wikipedia.org/wiki/Nudge_theory#:~:text=The%20author%20stresses%20%22Companies%20are,by%20nudging%20them%20into%20desirable
https://6sense.com/platform/predictive-analytics/what-is-predictive-marketing/
https://useinsider.com/predictive-marketing
https://www.ftc.gov/news-events/news/press-releases/2024/07/ftc-icpen-gpen-announce-results-review-use-dark-patterns-affecting-subscription-services-privacy