When it comes to generative artificial intelligence (genAI), businesses across a variety of vertical industries have jumped into deployments over the past year or so for fear of being left behind.
More than two-thirds of business leaders have now rolled out genAI tools for their workforce. But employee knowledge or training on large language models and genAI tools remains a top barrier to proper implementation, according to an October Harris Poll.
The findings indicate that while genAI is being rolled out broadly, it's not being used effectively, and getting a return on investment (ROI) is proving even more elusive.
“I was really surprised to see 72% conducted employee training on [AI] concepts and employee knowledge was still a barrier,” said Carm Taglienti, chief data officer at technology consultancy Insight Enterprises. "So that means a lot of people were trained, but they really didn’t learn anything."
The Harris Poll, conducted on behalf of Insight Enterprises, found that a majority of business leaders have been tasked with helping their company define the ROI from genAI. However, only 15% consider the costs of implementation, including technical debt due to outdated infrastructure, initial financial investments, and ongoing maintenance costs.
Rick Villars, group vice president for research firm IDC, said his analysts have seen that enterprises are “again and again” increasing budgets for genAI and other forms of AI, including building out infrastructure, services, and software platforms.
“The one thing they’re not increasing at the same level is the investment in training and upskilling their own teams,” Villars said. “That’s about the IT teams and the subject-matter experts. But it’s also just training their employees on better AI behavior and practices so they can protect their information.”
Training workers to utilise AI and big data (huge amounts of structured and unstructured information) ranks third among company skills-training priorities over the next five years, and it will be prioritised by 42% of companies, according to a survey by the World Economic Forum.
The questions companies need to ask include what kind of employees should be part of an AI implementation team and how much (and what kind of) training is needed to realise all of AI’s potential benefits.
Who should be on your AI team?
Building an AI team is an evolving process, just as generative AI itself is steadily evolving — even week to week.
“First, it’s crucial to understand what the organisation wants to do with AI,” said Corey Hynes, executive chair and founder at IT training company Skillable. "Second, there must be an appetite for innovation and dedication to it, and a strategy — don’t embark on AI efforts without due investment and thought. Once you understand the purpose and goal, then you look for the right team.”
Some top roles include:
- A data scientist who can simplify and navigate complex datasets and whose insights provide useful information as models are built.
- An AI software engineer who owns the design, integration, and deployment of machine-learning models in production systems.
- An AI officer or leader who guides AI initiatives and ensures alignment between strategy and execution.
- An AI security officer to handle the unique challenges, such as ensuring adherence to regulations, data transparency, and internal vulnerabilities that come with AI models and algorithms, including adversarial attacks, model bias, and data poisoning.
- Prompt engineers who can craft and improve text queries or instructions (prompts) in large language models (LLMs) to get the best possible answers from genAI tools.
- Legal consultants to advise IT teams to help ensure organisations abide by regulations and laws.
LLMs are the deep-learning algorithms — neural networks — most often characterised by their massive storehouses of information. LLMs can have millions, billions, and even trillions of parameters or variables and are essentially next-word generators. Guiding them to produce the most appropriate response for a given query is the job of a prompt engineer. (Prompt engineering is also one of the fastest-growing career skills for technical and non-technical professionals alike.)
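The "next-word generator" idea can be shown with a toy sketch: given the raw scores (logits) a model assigns to candidate next tokens, a softmax turns them into probabilities and greedy decoding picks the most likely one. The words and scores below are invented for illustration; this is not any particular model's code.

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution that sums to 1."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores a model might assign to candidate next words
# after the prompt "The cat sat on the".
candidates = ["mat", "roof", "keyboard", "moon"]
logits = [4.0, 2.5, 1.0, -1.0]

probs = softmax(logits)
# Greedy decoding: take the highest-probability token.
next_word = candidates[probs.index(max(probs))]
print(next_word)  # → mat
```

Real LLMs repeat this step token by token over vocabularies of tens of thousands of entries, often sampling from the distribution rather than always taking the top choice.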
These [LLMs] are like spastic parrots. They will react to what they’re fed. If you feed them a bunch of garbage, you will get a bunch of garbage back. – Arcadia CTO Nick Stepro
“Think of it as the process of interacting with a machine to get it to produce the results you’d like,” said Sameer Maskey, a Columbia University AI professor and CEO of Fusemachines, an AI consultancy.
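In practice, much of that interaction amounts to structuring the text sent to the model: a role instruction, a few worked examples, relevant context, and finally the user's question. A minimal sketch follows; the function name and prompt format are illustrative assumptions, not taken from any specific tool.

```python
def build_prompt(instruction, examples, context, question):
    """Assemble a structured prompt for an LLM from reusable parts."""
    parts = [f"Instruction: {instruction}"]
    for q, a in examples:  # few-shot examples steer tone and format
        parts.append(f"Q: {q}\nA: {a}")
    if context:
        parts.append(f"Context: {context}")
    parts.append(f"Q: {question}\nA:")  # trailing "A:" cues the model to answer
    return "\n\n".join(parts)

prompt = build_prompt(
    instruction="Answer in one short sentence.",
    examples=[("What is an LLM?",
               "A neural network trained to predict the next token.")],
    context="The reader is a non-technical business user.",
    question="What does a prompt engineer do?",
)
```

Iterating on pieces like the instruction and the examples, and measuring which variants produce better answers, is the day-to-day work the role describes.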
Avivah Litan, a distinguished vice president analyst at research firm Gartner, said prompt engineering will eventually be folded into application engineering and software developer career streams. “It will be a required skill for the future, but it will not be a separate career stream,” she said.
Cloud providers are also expected to launch prompt-engineering services, according to Forrester Research, meaning that in some cases the task can be outsourced.
“In 2024, all of the hyperscalers will announce prompt engineering," Forrester said in a recent report. "However, enterprise adoption will be limited. Due to incomplete contextual data and limited experience in natural language and prompt engineering among data scientists, the cloud provider’s first-gen prompt engineering services will not suffice to address the tailored fine-tuning needs."
Some of the top learning platforms for IT skills include online training providers such as Udemy, Coursera, and Codecademy; they are often the best places to turn for upskilling or reskilling employees, according to Erick Brethenoux, an adjunct professor at the Illinois Institute of Technology.
In 2023, Coursera offered access to more than 35 courses or projects specific to genAI, resulting in 570,500 enrollments. And just last week, it announced the launch of its Generative AI Academy, which is designed to equip executives and their employees with the skills needed to create in a genAI-driven workplace.
Coursera last year also launched courses specific to prompt engineering and has enrolled more than 170,000 students in them, according to a spokesperson. Coursera’s programs include Prompt Engineering for ChatGPT from Vanderbilt University; ChatGPT Prompt Engineering for Developers from DeepLearning.AI; Prompt Engineering for Web Developers from Scrimba; and AI Foundations: Prompt Engineering with ChatGPT from Arizona State University.
While LLMs such as OpenAI's GPT-4, Google's LaMDA, or Meta's BART (hosted on Hugging Face) are pre-trained on massive amounts of information, prompt engineering allows genAI tools to be tailored for a specific industry or even organisational use.
Over time, massive, amorphous LLMs such as GPT-4 are expected to give way to smaller models that are less compute intensive and more domain specific, allowing more compact LLMs to gain traction in any number of vertical industries.
Security remains a top concern for business leaders when it comes to genAI, with 38% citing it as a top barrier in that Harris Poll. That's more than double the percentage who consider the costs of implementation as a hurdle for their organisation.
Mike Peterson, vice president of Infrastructure and Cloud Services at Blue Shield of California, said before ever experimenting with genAI, his organisation first developed an AI governance team. That team, comprising members of the IT, legal, and HR groups, is now developing frameworks and safeguards to help ensure AI does no harm when it goes live.
Blue Shield is also upskilling its employees, mostly through self-paced learning, but not in the most traditional of ways.
“Generally, I try to put IT teams on big projects,” Peterson said. “I find 70% to 80% of knowledge retention in learning happens by doing. We also hire to fill our AI skills needs.... In the AI space, though, talent is harder because finding someone with years of experience is impossible.”
Sarah Danzl, chief marketing officer at Skillable, agreed, and said allowing workers to simply use genAI with little to no hands-on training is like letting someone who’d only read an instruction manual fly a plane.
“Hands-on training drives skill mastery,” she said. “Learning new, competitive skills is the most important way IT professionals can future-proof themselves. It’s also the path forward for companies seeking to become future-ready.”
How to build a genAI team
Building a genAI team requires a holistic approach, according to Jayaprakash Nair, head of Machine Learning, AI and Visualisation at Altimetrik, a digital engineering services provider. To reduce the risk of failure, organisations should begin by setting the foundation for quality data, establishing "a single source of truth" strategy, and defining business objectives.
Building a team that includes diverse roles such as data scientists, machine learning engineers, data engineers, domain experts, project managers, and ethicists/legal advisors is also critical, he said.
“Each role will contribute unique expertise and perspectives, which is essential for effective and responsible implementation,” Nair said. "Management must work to foster collaboration among these roles, help align each function with business goals, and also incorporate ethical and legal guidance to ensure that projects adhere to industry guidelines and regulations."
For example, a data scientist develops AI models that will extract insights and predict trends. A machine learning engineer will then take the AI models developed by data scientists and scale them for production use, focusing on algorithm optimisation and deployment. Data engineers next build and maintain the infrastructure and pipelines that allow for efficient and secure data collection, storage, and access — all of them necessary for AI operations. And domain experts provide industry-specific knowledge required to guide the development and solutions.
Project managers oversee and coordinate between different roles, managing resources and timelines, and ensuring that the project meets its objectives and business goals. Meanwhile, legal advisors ensure that projects comply with relevant laws and regulations.
Generally, I try to put IT teams on big projects. I find 70% to 80% of knowledge retention in learning happens by doing. -- Mike Peterson, VP of Infrastructure and Cloud Services at Blue Shield of California.
Also important: emerging technology IT workers, who are seen as essential in support of CIO and CTO objectives, according to a recent survey by Skillable. AI for IT Operations (AIOps) ranked in the top five of those technologies.
The top three emerging technologies or knowledge areas crucial for AI team members are:
- AI TRiSM (AI trust, risk, and security management);
- Composite AI (the combination of different AI techniques to achieve the best result);
- Generative AI.
According to Gartner, AI TRiSM has emerged as the backbone for organisations to navigate the challenges presented by genAI. Failure to effectively upskill current workforces on best practices means falling behind on genAI implementation.
Not surprisingly, companies are now looking to hire people with AI skills — and job seekers who have those skills are highlighting them.
A recent study by Oxford University's Internet Institute found that employees with AI skills can earn salaries as much as 40% higher than peers who don't have them. The study also found that combining AI skills with a wide range of other skills was the most valuable to organisations.
Job listing site Upwork last fall released its study on the skillsets of job seekers accepting jobs, as well as the search and hiring behaviors of companies pursuing genAI projects. It found that about half of hiring managers plan to hire more independent talent and 49% plan to hire more full-time employees — both because of genAI deployment plans.
It's also important to look for people who like learning new technology, have a good business sense, and understand how the technology can benefit the company.
At his own organisation, Insight Enterprises, Taglienti said 10% to 20% of his staff must learn prompt engineering. Others need to know how to use AI content-generation tools and coding assistants such as Microsoft Copilot or Amazon's CodeWhisperer.
“You’re interacting with AI, but that doesn’t require you to have a ton of knowledge to use it,” he said. “The democratisation of this technology enabled us to move away from having to teach someone Python, infrastructure, or cloud skills. I have the ability to make this relatively simple in terms of onboarding some of the basics for experimenting or implementing ideas that could benefit the business.
“So we look for people who are not only subject matter experts but also people who understand business transformation," Taglienti continued, "and they understand adoption and adaptation from a cultural standpoint of a business because that’s what the power of the technology allows you to do."
Getting business users onboard
On the business side of the house, it’s usually not difficult to inspire employees to learn any new technology when that technology has clear business benefits. Nick Stepro, CTO at Arcadia, which sells a health data platform, said it's business users more than technical users who'll get "supercharged" about genAI.
"Four or five years from now, I don’t know what a mid-tier level software engineer looks like," Stepro said. "If you’re a business user, though, you now have access to an immense amount of power in a natural conversational way that you didn’t have before."
The message is clear: genAI and the power of big data will transform healthcare and other vertical industries. And it's the business users who'll understand what data will be needed to feed LLMs to ensure the outputs they produce are valuable.
"If you don’t have a data strategy under the hood that can scale, and that can leverage a lot of structured data collated with unstructured data with the appropriate metadata tagging, then you’re in a tough spot," Stepro said. "These things [LLMs] are like spastic parrots. They will react to what they’re fed. If you feed them a bunch of garbage, you will get a bunch of garbage back."
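Stepro's point about collating unstructured text with metadata tagging can be sketched as a simple pre-processing step: wrap each text chunk with its structured source metadata before it is handed to an LLM or retrieval pipeline. The helper below is a hypothetical illustration; the field names and header format are assumptions, not a real product's schema.

```python
def tag_chunk(text, source, record_type, date):
    """Prefix an unstructured text chunk with a structured metadata header
    so downstream retrieval and prompting can filter or cite by source."""
    header = f"[source={source} type={record_type} date={date}]"
    return f"{header}\n{text.strip()}"

# Example: an unstructured clinical note paired with its structured metadata.
chunk = tag_chunk(
    "  Patient reports improved mobility after physical therapy. ",
    source="clinic_notes",
    record_type="progress_note",
    date="2024-01-15",
)
```

Consistent tagging like this is what lets a data pipeline feed an LLM curated, traceable context instead of the "garbage" Stepro warns about.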
There are myriad uses for genAI in business environments that can enhance efficiency, decision-making, and overall customer experiences through the use of chatbots, data analysis tools, document intelligence, and content creation. For example, marketers can use it to create advertising material. Physicians can use genAI to peruse patient records and get a synopsis of a patient's history and even treatment recommendations.
“As compared to AI or classical machine learning, the ability to use generative AI is easy. So, the barrier is fairly low,” Taglienti said. “You can start using AI immediately without a lot of training. The challenge in our world, and this is true for our customers, is operationalisation. So, you can provide people with a chatbot, but in order to get production-level consistency you need to control the processes. You have to monitor the behavior of the usage.”
If genAI is being used by the business side to create marketing campaigns or derive sales data, how can those users be sure the tools are giving them the kinds of responses they should expect, Taglienti asked.
“How am I measuring consistency and the performance of the model?" he said. "We work with customers to help them understand usage and operational performance. That’s something organisations need to look at, just like any other IT deployment."
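One simple way to quantify the consistency Taglienti describes is to send the same prompt repeatedly and measure how often the responses agree. The sketch below uses exact string matching on simulated responses for illustration; a production monitor would more likely compare semantic similarity, and the sample data is invented.

```python
from collections import Counter

def consistency_score(responses):
    """Fraction of responses matching the most common answer.
    1.0 means the model answered identically every time."""
    if not responses:
        return 0.0
    most_common_count = Counter(responses).most_common(1)[0][1]
    return most_common_count / len(responses)

# Simulated answers to the same prompt asked five times.
samples = ["Q3 revenue rose 8%", "Q3 revenue rose 8%",
           "Q3 revenue rose 8%", "Revenue grew 8% in Q3",
           "Q3 revenue rose 8%"]
score = consistency_score(samples)
print(score)  # → 0.8
```

Tracking a metric like this over time, alongside latency and cost, gives operations teams the same kind of baseline they would expect from any other IT deployment.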