Clara Durodié explores why ethical AI matters for business growth and profitability.
AI will impact the very fabric of financial services, because every single business process is structurally changed when AI is deployed. Those who understand this are already making provisions to become data-driven companies.
Essentially, these businesses change how they do things: their operating model and the design of their business processes. In fact, many AI projects fail to scale and die at the pilot stage, often because machine automation is forced onto a process designed for humans to deliver.
Therefore, business processes need to be redesigned to enable the human-machine team to work together. This is why ethical AI is all about design: a robust approach to delivering value in a different way from what the industry is used to.
Business leaders should approach AI as a strategic business tool for growth, profitability and improved customer service. AI is an exceptionally powerful tool in any leader’s toolkit.
Even better, leaders should always aim to use only ethical AI, which can be summarised as modern corporate governance, exponential automation and quality business thinking delivered as one package.
I think the question should be “why does our industry need theology degrees?” I know this surprises many.
But first, let me clarify that ethical AI is about design. Ethical AI is not compliance; many incorrectly regard it as “yet more regulations”. Unethical AI, in contrast, automates wrong decisions very quickly and puts businesses at risk. Unethical AI is like a virus: it spreads very quickly and it's hard to contain.
I believe that institutions need to be better at anticipating what the right thing to do is, especially in the absence of specific regulations.
So, ethical AI is grounded in sound business models deployed with technology designed to follow the principles of fairness, privacy, transparency, explainability and accountability.
Recently, there has been a lot of talk about ethical AI. Ethical AI is not new to me; I've been actively using and recommending the principles of ethical AI for many years. Having a strong moral compass is the safest guide there is. This is why I believe that financial services leaders should have degrees in theology rather than economics or finance. Why?
Theology teaches lessons in the history of human values and moral code. You can easily teach a theologian how banking works, but it’s a lot harder to teach a banker how ethics works.
As our industry becomes more technology-driven, it's so important to have leaders with a degree in theology, a strong grasp of ethics and a solid moral compass.
And yet, many organisations don’t get it. As I see it, they continue to blindly recruit people with economics, finance or accounting degrees.
Our industry can always benefit from ethics embedded in how we operate, think and design our work.
We need to commit to always doing what's right and refrain from doing what's comfortable or what doesn't upset a deal, a founder or the status quo.
Ethics doesn’t hinder; rather, ethics enables businesses to flourish. As ESG is becoming more embedded in how we design our business strategy, our moral compass is activated — decision makers are expected to embed ethics in their everyday decisions.
A lack of ethics poses an existential business risk. An unethical decision can wipe out a company. The companies that get this recruit for the future and are likely to be industry leaders.
There are a few types of algorithmic bias. In my book “Decoding AI in Financial Services — Business Implications for Boards and Professionals”, I wrote about different types and how to address them.
Firstly, a joined-up, well-designed data strategy enables a business to address a range of challenges when using AI, from design and production through to deployment and maintenance.
Secondly, it's important that business teams and technology teams work together when designing and deploying AI.
These teams can be territorial, perhaps a function of the historical divide between the front office (business) and the back office, including IT. That divide adds no value in a data-driven business.
Finally, it's crucial that these teams are diverse, made up of people with different backgrounds. Diversity of thought counters groupthink, a core cause of bias.
That bias gets into the data, moves into the code and becomes algorithmic bias in unethical AI, and that is a business risk.
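To make the point concrete, here is a minimal, hypothetical sketch (in Python, using pandas and scikit-learn; the data, column names and the lending scenario are illustrative assumptions, not anything from a real institution) of how a skew in historical decisions can resurface as an approval-rate gap in a model trained on that history:

```python
# Illustrative sketch only: bias in historical data becoming algorithmic bias.
# All data and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Synthetic loan history in which group B applicants were approved
# less often than group A applicants with similar incomes.
history = pd.DataFrame({
    "income":   [30, 45, 60, 75, 30, 45, 60, 75],
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [ 1,   1,   1,   1,   0,   0,   1,   1],
})

# Train a simple model on the biased history, with group as a feature.
X = pd.get_dummies(history[["income", "group"]], columns=["group"])
model = LogisticRegression().fit(X, history["approved"])

# Compare predicted approval rates per group: on skewed history like this,
# the model typically reproduces the historical gap.
history["predicted"] = model.predict(X)
print(history.groupby("group")["predicted"].mean())
```

The point of the sketch is that nothing in the code is "unethical" line by line; the bias arrives with the data and is then automated at scale, which is exactly why diverse teams and a sound data strategy matter before a model ever reaches production.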
For the first time in the history of our sector, we have the opportunity to deliver personalisation at scale. This can eradicate the financial advice gap, for instance, and help people manage and grow their wealth.
When properly designed, deployed and maintained, AI technologies open new possibilities for business growth and profitability. One of these possibilities is personalisation at scale.