After years of lobbying for better data management in large corporations, consumer rights advocacy group CHOICE made a vow: to uphold, within its own organisation, the same standards it was calling for in others.
This ‘practice what you lobby for’ mentality was ethically motivated, explained Chief Data & Technology Officer, Ashwin Sridhar – much like CHOICE’s recent campaign for companies to be more transparent when collecting biometric information, which led to three major retailers abandoning facial recognition technology.
“We wanted to, even in the smallest way possible, be a role model for how organisations should behave when collecting and using consumer data,” said Mr Sridhar ahead of the Australian Privacy Forum, hosted by Informa Connect.
“Our work in identifying data misuse in others led us to self-reflect on our own practice. While we have always been legally compliant, the law isn’t sophisticated enough to tell organisations how to behave with every possible application of data use. So, it is up to organisations to interrogate their own conduct and ensure they are responsible and fair.”
Mr Sridhar also highlighted the misconception that unethical data use is always the result of bad intent. Sometimes, it is the result of companies not stopping to consider the broader impact of their data-driven decisions, he said.
“The three major retailers uncovered by our campaign work did not have any malicious intent when collecting consumer biometrics. They were addressing a genuine business issue – i.e. regular shoplifting.
“However, biometric information is incredibly sensitive to collect and retain and, in our view, the retailers had not been forthright enough about the fact they were using it – so there were issues of consent. In fact, one retailer only made the disclosure on their website, so it is questionable whether someone entering the store would have seen that.”
While CHOICE admits that it is only starting out on its own journey towards exemplary data usage, Mr Sridhar encourages other organisations to follow in its footsteps. While irresponsible data use might slip through legal loopholes, it could cause untold reputational damage if exposed by consumers or advocacy groups, he explained.
“Consumer trust is at an all-time low and is an increasingly important purchasing criterion. It is vital organisations uphold consumer trust – and ethical data management is a key component of that,” he said.
The statistics agree: a recent survey found that 88 percent of consumers value trust in businesses, and 71 percent have already switched brands to find a company they prefer.
Being a trusted company is not about prohibiting the use of data, Mr Sridhar stressed, but ensuring data is used appropriately.
“I am not saying all data collection is evil and must be discontinued. But businesses owe it to their customers to be more transparent and refrain from using data in ways that have the potential to cause harm or drive unintended consequences,” he said.
Whole-of-business effort
The first step in CHOICE’s own quest towards more ethical data management was recognising that data is a whole-of-business responsibility.
“You need absolutely everyone to be on board with your ethical mandate,” Mr Sridhar said.
“If not, there might be one person that makes one decision about data usage – and that decision could jeopardise everything.
“Say your organisation is monitoring customer behaviour online, for marketing and analytics purposes. Then later, someone else in the organisation decides to use a cut of that data to selectively promote a sale – only making the offer available to women or elderly people, for example. That raises ethical questions and is potentially discriminatory.”
Getting clear on personalisation goals
As a subscription-driven, not-for-profit organisation, CHOICE derives information from members who pay an annual membership fee. It uses this data to model members’ propensity to renew their membership, which allows it to personalise marketing communications accordingly.
However, the organisation draws a firm line on how these data points may be used.
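CHOICE has not published the details of its propensity model, but the general idea is often implemented as a simple classifier over engagement signals. As an illustrative sketch only – the features, data, and function names below are invented for the example – a minimal propensity-to-renew model can be built as a tiny logistic regression:

```python
import math

# Illustrative only: a toy propensity-to-renew model as a logistic
# regression trained by gradient descent on invented engagement features.

def train_logistic(rows, labels, lr=0.1, epochs=2000):
    """Fit weights w and bias b so sigmoid(w.x + b) approximates P(renew)."""
    n = len(rows[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))       # predicted renewal probability
            g = p - y                            # gradient of the log-loss
            b -= lr * g
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
    return w, b

def propensity(w, b, x):
    """Score a member: probability they will renew, given their features."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical member features: [logins per month, articles read per month],
# scaled to 0..1. Label 1 = renewed last year, 0 = lapsed.
members = [[0.1, 0.0], [0.2, 0.1], [0.9, 0.8], [1.0, 0.9]]
renewed = [0, 0, 1, 1]

w, b = train_logistic(members, renewed)
engaged = propensity(w, b, [0.95, 0.85])   # highly engaged member
lapsing = propensity(w, b, [0.15, 0.05])   # disengaged member
print(engaged, lapsing)
```

A score like this would then drive which renewal reminders or offers a member sees – which is exactly where the ethical questions Mr Sridhar raises come in: the same score could be used fairly (timely reminders) or unfairly (withholding offers from certain groups).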
“As a digital native, I love personalisation – and frankly, don’t want to deal with a service that isn’t personalised. However, it is important that businesses offer personalisation on a fair, non-discriminatory basis,” Mr Sridhar said.
To ensure personalisation is fair, Mr Sridhar recommends companies include it in their code of conduct and itemise behaviours they will and won’t tolerate.
“At CHOICE, we have discussed personalisation at an executive level to ensure we aren’t making decisions along the way that have unintended, harmful consequences. We established a code of conduct specifically for our use of machine learning and AI and created some ‘principles of ethical use’.
“This was ratified by the executive team to make sure we are completely devoted to it as an organisation. We then presented it to all staff and gave them a checklist to ensure they are adequately guided when making future decisions.”
Dealing with complexities
Mr Sridhar acknowledges that ethical boundaries are often blurred, particularly when there is a trade-off between consumer outcomes.
“The lines are not always clear. That is why it is so important to have high-level conversations right from the start and ensure everyone is in agreement about what the business as a whole finds acceptable – not just from the business’ standpoint, but with the customer in mind.
“I recommend every organisation that collects data for any reason firms up their code of conduct now – because it is much better to do so when the waters are calm, rather than in the aftermath of a crisis,” he concluded.
Hear more expert tips from Ashwin Sridhar at the Australian Privacy Forum hosted by Informa Connect on February 16, 2023.
Joining Mr Sridhar on the stage are the NSW Privacy Commissioner, Samantha Gavel, and Morag Bond, Executive Manager, Legal, MarComms and Research at the eSafety Commissioner.
The event will be held at the Radisson Blu Plaza Sydney.
Learn more and register.