New Gartner Study Places AI's Biggest Problem in Production, Not Talent Acquisition

The latest survey shows that only about half of AI models make it into production.


Here at the AI, Data and Analytics Network, our recent enterprise surveys have consistently found talent acquisition to be one of the main concerns cited for both this year and the next.

However, new research by Gartner suggests that the challenges facing enterprise AI may be even bleaker than we thought: the survey reported that, on average, only 54% of AI models move from pilot to production.

Notably, 72% of those surveyed stated that they had the talent they needed, which raises the question: why can't enterprises get their AI projects over the line?

Speaking to VentureBeat, Frances Karamouzis, distinguished VP analyst at Gartner, said: "The biggest surprise was the sheer number of organizations that reported having thousands of AI models deployed coupled with the fact that only 54% make it into production, and many [indicating] they have not aligned to business value." In Karamouzis' view, this lack of discipline in aligning AI models to business value exposes organizations to risks around AI trust, security, and proper implementation.

The report also shed light on enterprise security concerns, with Erick Brethenoux, VP analyst at Gartner, writing in the release: "Organizations' AI security concerns are often misplaced, given that most AI breaches are caused by insiders. While attack detection and prevention are important, AI security efforts should equally focus on minimizing human risk." By contrast, half of organizations surveyed said they were worried about competitors or even partners as security risks.

The Gartner study also found that 40% of organizations have thousands of AI models deployed, a volume that complicates governance as well as tracking the value and return on investment of AI.

Help us help you benchmark your enterprise by filling in our short survey about the state of AI and enterprise now.
