Accountability: The One Thing You Can't Outsource To AI
At the end of the classic 1939 film The Wizard of Oz, Dorothy discovers, to her dismay, that the all-powerful wizard she journeyed to the Emerald City to see is in fact just a man operating a rather rudimentary machine. "Pay no attention to the man behind the curtain!" was his desperate line to prevent her from making this discovery. This scene is an important reminder that no matter how powerful or autonomous technology becomes, it is still man-made and, therefore, man's responsibility. It is imperative that we don't forget about the man behind the curtain.
It is through this lens that financial marketers should view their relationships with artificial intelligence (AI) today. Marketing teams inside fintech companies and banks are forging boldly ahead with plans to use AI to transform and perfect the way they reach, engage, sell to and retain their customers. But as they become more reliant on machine-based intelligence, marketers must also acknowledge their duty to ensure these insights are used appropriately.
With public debate about consumer data access and privacy at an all-time high, and regulators and government bodies such as the Monetary Authority of Singapore and the United Kingdom's House of Lords pushing for the development of ethical standards for AI, financial marketers must acknowledge the role they can play in shaping best practices for data analytics that build trust among their customers and respect for their organizations.
Here are three considerations for marketers looking to shape the responsible, inclusive, transparent use of AI in financial services:
Marketers need to be educators, too.
Recent headlines about the misuse of customer data by tech companies and the increased frequency of large-scale data breaches have banking customers worried about the security and misappropriation of their information. The U.K.’s move to an open banking regime has some consumers worried about their data, too. A survey of U.K. bank account holders by Ipsos revealed that two-thirds were concerned about how their personal financial data might be used.
Amid these feelings of insecurity and skepticism, financial marketers have a prime opportunity to reassure customers about the safety of their data and proactively educate the market about how to protect their data and seek recourse should their information be misused or compromised. The steady stream of step-by-step guides on how to freeze your credit following the 2017 Equifax breach is an example of how financial marketers can earn the trust of their institution's customers.
AI is imperfect like us.
As AI algorithms are built by humans, they too are exposed to our imperfect judgment, errors and implicit biases. While some AI fails are humorous -- such as Facebook's AI chatbots Alice and Bob, which developed their own secret language -- other examples, like the varying accuracy of facial recognition technology based on race, can be far more damaging. To ensure that we are building AI algorithms, data-driven decision-making tools and marketing campaigns that don't inadvertently stereotype or decide unfairly, marketers need to acknowledge the limitations of their AI tools and implement the necessary checks and balances to maintain human accountability for machine-driven decisions.
Personalization needs to come without the creep factor.
Financial marketers today have incredibly powerful technology and data-driven insights at their fingertips that, if used right, can turn hyper-personalized marketing into an exact science that seems like an art form. Marketers have the ability to understand when, where, what and how to best appeal to their customers' needs, but with this, they have a duty to walk the fine line between personalization and intrusion. Beyond abiding by opt-out rules and regulations, marketers can use customer focus groups, surveys and even AI-driven tools to understand customers' behaviors, preferences and limitations.
As marketers increasingly look to data to explain reality and rely on the intelligence of machines to think on their behalf, it is imperative to remember that we built the machines, and that accountability is the one thing we can’t outsource to AI.
This post previously appeared on Forbes.