Research commissioned by Kordia suggests over half of employed Gen Z Kiwis are using generative AI, despite limited understanding of its risks.
An independent survey by Perceptive, commissioned by the communications infrastructure company, found 53 per cent of Gen Z respondents said they had used generative AI tools like ChatGPT or DALL-E, with more than a quarter saying they used them for work.
Only one in five were aware of cyber security or privacy risks associated with AI use.
Alastair Miller, principal consultant at Kordia-owned Aura Information Security, said despite AI’s potential, there were numerous issues to consider about generative AI tools.
“Generative AI can boost productivity, but it can also introduce risks around privacy and information security – especially if your employees are entering sensitive or private information into a public AI, like ChatGPT,” he said.
One of the most important things for leaders to understand was the difference between public and private generative AI tools.
High-profile cases demonstrated this. At Samsung, for instance, employees inadvertently leaked source code and internal meeting minutes when using the public ChatGPT tool to summarise information.
“Once data is entered into a public AI tool, it becomes part of a pool of training data – and there’s no telling who might be able to access it," Miller said.
"That’s why businesses need to ensure generative AI is used appropriately, and any sensitive data or personal information is kept away from these tools.”
Gen Z were already widely using the technology, so older generations of business leaders needed to upskill and become AI-literate to ensure younger members of the workforce understood AI risks and acceptable use, he said.
Survey results indicated New Zealanders of all ages were unaware of some of the issues associated with the use of generative AI.
When asked what they believed to be the main risks of generative AI, only one in five were concerned that generative AI could produce biased outputs, and only one in three were concerned that it could produce inaccurate information or be used for misinformation campaigns.
Miller said one of the drawbacks of generative AI was its propensity for hallucinations.
“Like a clever, imaginative child, there are instances where a generative AI model will create content that either contradicts the source or creates factually incorrect outputs under the appearance of fact.”
One US law firm, for instance, faced sanctions after using ChatGPT to search for legal precedents supporting a client’s case, which resulted in fabricated material being submitted to the court.
Miller said there were data sets that should never be entrusted to public generative AI tools.
“Financial information or commercially sensitive material should never be exposed to public AI tools like ChatGPT," he said. "If it were leaked, you could run the risk of losing your competitive edge, or even breaching market regulations.”
The same went for any data belonging to customers, or personal information like health records, credentials or contact details.
Private AI tools might be fine to use with sensitive data, but Miller still urged caution.
“Even a private AI tool should go through a security assessment before you entrust your company data to it," he said. "This will ensure that there are protective controls in place, so you can reap any benefits without the repercussions.”
However, despite the risks, businesses shouldn’t shy away from AI.
“It’s a great instrument for innovation and productivity, and many of the major software companies are weaving AI into our everyday business technology, for example Microsoft 365 Copilot or Bing search," Miller said.
Rather than blindly adopting AI, it was worth defining the value or outcome you want to achieve to help implement it effectively and safely.
“Every business should look to create an AI policy that outlines how the company should strategically use AI and provide guidelines for employee usage," Miller said.
"Our research suggests only 12 per cent of businesses have such policies in place, so there’s room for improvement there.”