At times we have a strained relationship with the law. Sometimes we call for more regulation as if it’s the only answer, as if it might protect us once and for all. At other times we eschew the ‘nanny state’, insisting the law should stay out of our lives.
But what happens when these impulses collide, when we want legal protections for our personal lives? This is where we find ourselves with technology and, more specifically, artificial intelligence (AI).
Our relationship with technology, and the big tech companies behind it, is now deeply personal. There’s a lot at stake. Almost every facet of our lives can be pieced together from our digital footprints, however lightly we intend to tread.
And so, we find ourselves simultaneously asking for protection and privacy, for regulation but also for freedom. We’re enticed to keep giving, interacting with apps and services that seem worth the trade of a little (or a lot) of information about ourselves along the way.
But what system of regulation can we trust? What combination of responses can allay our fears and rein in the predatory players in the market?
The practice of ethics
There are calls for more and better regulations to protect us as citizens or, as we’re most commonly referred to in tech investment pitches, ‘users’. We’ve also seen a proliferation of ethics committees designing codes and ethical frameworks for AI in an attempt to guide businesses to better practices that consider more than just profits.
Rather than despair at the multitude of initiatives and the general nature of principle-led frameworks, we should feel relief: in these initiatives we are seeing the practice of ethics in motion.
Through this process we are getting closer to a better system, and ultimately a safer society. Much like the technology itself, we should expect this system to constantly evolve to meet our needs.
The lag of the law
It takes time to enact laws and pass new regulations. The process can be long, and few are usually pleased with the final words embodied in legislation. There are often complaints of ‘measly’ words and ‘whittled down’ provisions.
Businesses tend to point to the restrictive nature of regulation and the ‘burden’, sometimes losing sight of why the protections were considered necessary in the first place.
As technology and automation transform products and markets, regulations will appear out of touch if we look to them for prescription, if we ask them to tell us exactly what we should do in every situation. They can’t do that, and shouldn’t. Effective regulatory systems still require us to think.
Community expectations and business practices will inevitably jump ahead of our legal system, moving beyond that which is enshrined by a plain reading of the law. And so, we must turn to principles and frameworks, be they voluntary or compulsory, to complement the legal system and take us to higher ground.
Take, for example, the government’s response to the ACCC Digital Platforms Inquiry, issued last week.
The response details a series of reforms to address market power, transparency and fair competition.
The response includes the development of a number of ‘voluntary codes of conduct’ to be overseen by the ACMA; mandatory intervention will be introduced only if progress is unsatisfactory.
The professions of accounting, law and medicine have long held their social contract through a combination of regulation and self-regulatory mechanisms. That combination helps us hold them to account.
It’s now time for the tech sector to prove that it too can be trusted, by engaging with higher-order ethical principles rather than just the letter of the law.
From a cost to a way of doing business
Complying with laws and voluntary regulations is commonly viewed as a cost to business, taking from the bottom line.
However, there is an opportunity for a new way of thinking. Adopting higher standards, through ethical codes and voluntary certifications, can help businesses prove they are trustworthy.
Often there’s a race to the bottom, to minimal compliance. However, those who choose to connect with the higher-order principles of ethical frameworks can position themselves to make a positive contribution to society and, in turn, win the trust and confidence of customers and the community.
There is an opportunity to stand out from competitors who fight and resist regulation, reluctantly meeting minimum legislative requirements and nothing more.
Collaborating for the greater good
Engaging with the development and trialling of ethical frameworks is an important step in ensuring these frameworks are practical and can be applied, that they shape what a business actually does and how it affects us.
The United Nations’ Sustainable Development Goal 17 calls for ‘Partnerships for the Goals’, acknowledging that we must work together to address the biggest issues of our time. Our world is connected like never before, and neither business nor government can work in isolation. Instead, we must cooperate and partner to find solutions that will work and, ultimately, to create a system that we can trust.
Some have claimed that ethical frameworks are merely aspirational, but do we want the opposite? We should aspire to have a responsible business community that considers the ethics of its impact. It’s not the aspiration on which we should focus our criticism, but rather those who refuse to participate and fight what is hopefully now inevitable: a more comprehensive system with which we can hold businesses, and those who manage them, to account.
Clare Payne is a former employment lawyer and founder of the Banking and Finance Oath (The BFO). She holds the role of EY Fellow for Trust and Ethics and is an Honorary Fellow of The University of Melbourne.
Christina Larkin is a Director at EY and leads the Digital Trust Assurance practice. She is an expert panel member for the IEEE’s Ethical Certification Program for AI and Autonomous Systems.