The use of artificial intelligence in marketing is a controversial topic. In pop culture, advanced technology like artificial intelligence evokes visions of imminent dystopia.
One of the most memorable books I read recently was Dave Eggers’ chilling dystopia The Circle, which was released as a movie last month. The story revolves around Mae Holland, an ambitious young go-getter who joins a growing tech company that aims to complete the “circle” of information sharing, eliminating any notion of privacy. This is a world in which advanced technology is such a ubiquitous and seamless part of everyday life that no one stops to question it.
(While the movie has received mixed reviews, I believe the book is worth reading.)
The perils of technological “progress” drive many of the sci-fi story lines we see in books and movies today. Popular plots imagine drastic evolutions of technologies like artificial intelligence, with implications ranging from the extinction of privacy to outright human enslavement by machines.
But one topic that may represent a much more immediate reality remains largely untouched by the world of pop culture: cultural bias.
There’s plenty to fear about artificial intelligence in a general sense, but what happens when the threat varies along racial, sexual, and cultural lines? Perhaps one of the most disturbing prospects for the future of artificial intelligence in marketing is, ironically, how faithfully it may reflect and reinforce human flaws and biases.
In a paper published in April of this year, researchers at Princeton found that when an AI algorithm is trained on ordinary human language found online, it can acquire the cultural biases embedded in particular patterns of wording. These biases range from harmless preferences for flowers over insects to serious prejudices about race and gender.
For example, using an analysis of word proximity (how often two words appear within 10-word windows of each other) to measure the strength of the association between them, the researchers found that a set of traditionally African American names carried more unpleasant associations than a set of traditionally European American names. The algorithm learned this association with no accompanying signal that the pattern, though observable in the data, is morally wrong.
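To make the proximity idea concrete, here is a toy sketch of a co-occurrence score: it counts how often a target word appears within a 10-word window of a set of attribute words. This is a deliberate simplification for illustration (the corpus and word lists below are invented), not the researchers’ actual method, which used word embeddings trained on far larger corpora.

```python
def cooccurrence_score(corpus_tokens, target, attributes, window=10):
    """Count how often `target` appears within `window` tokens of any
    attribute word. A higher count suggests a stronger learned association."""
    attr = set(attributes)
    score = 0
    for i, tok in enumerate(corpus_tokens):
        if tok != target:
            continue
        lo = max(0, i - window)
        hi = min(len(corpus_tokens), i + window + 1)
        # Count attribute words that fall inside the window around `target`
        score += sum(1 for t in corpus_tokens[lo:hi] if t in attr)
    return score

# Invented toy corpus, purely for demonstration
corpus = "the garden flowers were lovely and pleasant while the wasps were nasty".split()
print(cooccurrence_score(corpus, "flowers", ["lovely", "pleasant"]))  # 2
print(cooccurrence_score(corpus, "wasps", ["nasty"]))                 # 1
```

Scaled up to billions of words, counts like these are exactly how statistical patterns, including prejudiced ones, get baked into a model’s sense of which words “belong” together.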
Arvind Narayanan, assistant professor of computer science at Princeton, said, “We have a situation where these artificial intelligence systems may be perpetuating historical patterns of bias that we might find socially unacceptable and which we might be trying to move away from.”
While many people would assume that artificial intelligence algorithms are objective tools making objective calculations, the fact is these tools are created from and trained on large sets of data (images, text, video, etc.) that currently exist online. This is data that has been created by humans, and thus is data that’s not free from bias.
When AI algorithms and content intersect, we need to be careful about the results. The danger with overuse of artificial intelligence in marketing is that our dominant, biased discourses will remain dominant and biased, especially if we assume an AI tool is taking an objective tack. We also risk relinquishing some of our critical judgment to the presumed intelligence of AI tools.
Image attribution: Tirza van Dijk
Let me start by saying I don’t believe the use of artificial intelligence in marketing is an inherently bad thing. AI algorithms can help make relevant suggestions to customers, recommend content that’s valuable, and help better target messaging to an appropriate audience. Artificial intelligence is far from replacing content creators, but it can free up time spent on mundane tasks that can then be put to better creative use.
But we need to be aware of the limitations in current AI tools. Here are some key principles to follow when crafting your content strategy.
We haven’t reached the point where AI is capable of creating the kind of content that humans do. Where algorithms can be useful is in identifying related topics and offering suggestions for further reading. AI tools can also help you better target your messaging to the right audiences and personalize that messaging effectively. They can even automate personalized responses triggered by certain actions. But leave the carefully crafted wording to the humans.
A human should always be the last point of contact with your content. It’s tempting to assume that the complicated mathematics running the AI portions of your content strategy lives in a mysterious black box above your level of understanding, and that you therefore shouldn’t question it. That’s never the case.
Make sure you own your content from start to finish and understand the general mechanisms by which your AI tools operate. Conduct a regular review of the tools you use and assess whether they’re still serving your company’s goals well. Have someone review all content before it gets published and distributed.
Sometimes, as content creators and marketers, we need to move away from the status quo and push the conversation in a different direction. We all need to recognize that cultural biases exist and actively work to counteract them. Make sure your content properly reflects the diversity of voices you claim to represent and speak to.
Brand voice isn’t a static thing. Let it grow and evolve with the direction of your company, and identify opportunities for positive change in your messaging. Listen to a broader range of input on your content strategy ideas to make better-informed decisions about the language and narrative techniques you’re implementing.
While AI tools can be a useful support to your content marketing strategy, understanding their limitations is paramount to using them effectively. For more insights about how technology is changing content marketing, join us at Forward 2017, the premier brand storytelling conference.
Featured image attribution: Alex Knight