News Warner

The biggest barrier to AI adoption in the business world isn’t tech – it’s user confidence

  • The biggest barrier to AI adoption in the business world isn’t technology itself, but rather user confidence and a lack of belief in one’s ability to use it effectively.
  • Technological self-efficacy, a person’s belief in their ability to use technology effectively, plays a significant role in determining behavior. When that belief is low, it can lead to hesitation and avoidance of new technologies like AI.
  • Organizations should invest in job-specific and user-centered training that addresses employees’ specific needs and roles, rather than providing one-size-fits-all training programs.
  • The generational divide also affects AI adoption, with younger workers (Gen Z and millennials) being more confident using technology than older generations (Gen X and boomers). Training should take into account the backgrounds and comfort levels of employees to build trust in AI.
  • Effective AI training involves a combination of mastery experiences, vicarious experiences, verbal persuasion, and physiological and emotional states. Organizations can use cohort-based trainings with feedback loops, customized content, and engaging formats like prompting parties to help employees build confidence and skill in using AI.

Believe in your own decision-making. Feodora Chiosea/Getty Images Plus

The Little Engine That Could wasn’t the most powerful train, but she believed in herself. The story goes that, as she set off to climb a steep mountain, she repeated: “I think I can, I think I can.”

That simple phrase from a children’s story still holds a lesson for today’s business world – especially when it comes to artificial intelligence.

AI is no longer a distant promise out of science fiction. It’s here and already beginning to transform industries. But despite the hundreds of billions of dollars spent on developing AI models and platforms, adoption remains slow for many employees, with a recent Pew Research Center survey finding that 63% of U.S. workers use AI minimally or not at all in their jobs.

The reason? It can often come down to what researchers call technological self-efficacy, or, put simply, a person’s belief in their ability to use technology effectively.

In my research on this topic, I found that many people who avoid using new technology aren’t truly against it – instead, they just don’t feel equipped to use it in their specific jobs. So rather than risk getting it wrong, they choose to keep their distance.

And that’s where many organizations derail. They focus on building the engine, but don’t fully fuel the confidence that workers need to get it moving.

What self-efficacy has to do with AI

Albert Bandura, the psychologist who developed the theory of self-efficacy, noted that skill alone doesn’t determine people’s behavior. What matters more is a person’s belief in their ability to use that skill effectively.

In my study of teachers in 1:1 technology environments – classrooms where each student is equipped with a digital device like a laptop or tablet – this was clear. I found that even teachers with access to powerful digital tools don’t always feel confident using them. And when they lack confidence, they may avoid the technology or use it in limited, superficial ways.

The same holds true in today’s AI-equipped workplace. Leaders may be quick to roll out new tools and want fast results. But employees may hesitate, wondering how it applies to their roles, whether they’ll use it correctly, or if they’ll appear less competent – or even unethical – for relying on it.

Beneath that hesitation may also be the all-too-familiar fear of one day being replaced by technology.

Going back to train analogies, think of John Henry, the 19th-century folk hero. As the story goes, Henry was a railroad worker who was famous for his strength. When a steam-powered machine threatened to replace him, he raced it – and won. But the victory came at a cost: He collapsed and died shortly afterward.

Henry’s story is a lesson in how resisting new technology through sheer willpower can be self-defeating. Rather than leaving some employees feeling like they have to outmuscle or outperform AI, organizations should invest in helping them understand how to work with it – so they don’t feel like they need to work against it.

Relevant and role-specific training

Many organizations do offer training related to using AI. But these programs are often too broad, covering topics like how to log into different programs, what the interfaces look like, or what AI “generally” can do.

In 2025, with AI tools ranging from conversational chatbots and content creation platforms to advanced data analytics and workflow automation programs, that’s not enough.

In my study, participants consistently said they benefited most from training that was “district-specific,” meaning tailored to the devices, software and situations they faced daily with their specific subject areas and grade levels.

Translation for the corporate world? Training needs to be job-specific and user-centered – not one-size-fits-all.

The generational divide

It’s not exactly shocking: Younger workers tend to feel more confident using technology than older ones. Gen Z and millennials are digital natives – they’ve grown up with digital technologies as part of their daily lives.

Gen X and boomers, on the other hand, often had to adapt to using digital technologies mid-career. As a result, they may feel less capable and be more likely to dismiss AI and its possibilities. And if their few forays into AI are frustrating or lead to mistakes, that first impression is likely to stick.

When generative AI tools were first launched commercially, they were more likely to hallucinate and confidently spit out incorrect information. Remember when Google demoed its Bard AI tool in 2023 and its factual error led to its parent company losing US$100 billion in market value? Or when an attorney made headlines for citing fabricated cases courtesy of ChatGPT?

Moments like those likely reinforced skepticism – especially among workers already unsure about AI’s reliability. But the technology has already come a long way in a relatively short period of time.

The solution for those who may be slower to embrace AI isn’t to push them harder, but to coach them with their backgrounds in mind.

What effective AI training looks like

Bandura identified four key sources that shape a person’s belief in their ability to succeed:

  1. Mastery experiences, or personal success

  2. Vicarious experiences, or seeing others in similar positions succeed

  3. Verbal persuasion, or positive feedback

  4. Physiological and emotional states, or someone’s mood, energy, anxiety and so forth

In my research on educators, I saw how these concepts made a difference, and the same approach can apply to AI in the corporate world – or in virtually any environment in which a person needs to build self-efficacy.

In the workplace, this could be accomplished with cohort-based trainings that include feedback loops – regular communication between leaders and employees about growth, improvement and more – along with content that can be customized to employees’ needs and roles. Organizations can also experiment with engaging formats like PricewaterhouseCoopers’ prompting parties, which provide low-stakes opportunities for employees to build confidence and try new AI programs.

In “Pokémon Go,” it’s possible to level up by stacking lots of small, low-stakes wins and gaining experience points along the way. Workplaces could approach AI training the same way, giving employees frequent, simple opportunities tied to their actual work to steadily build confidence and skill.

The curriculum doesn’t have to be revolutionary. It just needs to follow these principles and not fall victim to death by PowerPoint, or end up being generic training that isn’t applicable to specific roles in the workplace.

As organizations continue to invest heavily in developing and accessing AI technologies, it’s also essential that they invest in the people who will use them. AI might change what the workforce looks like, but there’s still going to be a workforce. And when people are well trained, AI can make both them and the outfits they work for significantly more effective.

The Conversation

Greg Edwards does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
