AI Success in Procurement Is Mostly Invisible: The “Iceberg” You Can’t Skip by Alex Karichensky


Alex Karichensky is the CEO at Digicode Europe, a consulting and custom software development firm operating across international markets. With deep expertise in procurement and supply chain digital transformation, he partners with enterprise organizations to modernize sourcing, contract lifecycle management, and operational processes through scalable, execution-focused technology programs.

Most procurement AI conversations start with the visible part: a model that extracts invoice fields, an assistant that drafts supplier emails, a tool that predicts risk. Those are real capabilities, but they’re not the whole story. When teams struggle to scale AI beyond a pilot, it’s rarely because the algorithm was “bad.” It’s usually because the foundation underneath wasn’t ready.

What people see above the waterline (algorithms) is the smallest part of success. The larger share sits below: technology and integration, and, biggest of all, people, process, and governance.

The 10%: algorithms are the easy part

Algorithms are visible, demo-friendly, and measurable. You can show invoice extraction accuracy or classification precision. You can compare vendors. You can run tests.

But in procurement, algorithms are only valuable when they can act on real operational data and produce outputs the organization is willing to trust. A model that performs well in a lab can still fail in production if it can’t access the right data, or if nobody believes the result enough to change a decision.

The hidden trap: “pilot success” without operational impact

Many AI pilots succeed because the team selects clean data, chooses narrow scenarios, and adds extra human review. Then, when you deploy across categories, suppliers, and regions, the workload increases, and performance becomes inconsistent. The lesson isn’t that AI can’t work; it’s that AI needs an operating model.

The 20%: technology and integration make AI usable

In procurement, useful context is distributed. It lives in ERP, supplier portals, contract repositories, ticketing tools, email, and spend analytics. An AI tool that only sees one system will behave like a clever assistant with blind spots. Integration is what turns AI into a decision partner.

What “good integration” looks like in practice

Good integration is not just API connectivity. It is:

  • consistent identifiers across systems;
  • clean master data (no duplicates or outdated records);
  • clear data ownership (who maintains which fields);
  • near-real-time updates for key workflows;
  • secure access control so sensitive data is protected by default.

If these fundamentals are shaky, teams spend more time reconciling data than acting on insights. That’s when AI becomes a distraction instead of leverage.
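To make the fundamentals above concrete, here is a minimal sketch of an automated master-data hygiene check. The `SupplierRecord` shape, field names, and staleness cutoff are illustrative assumptions, not a real schema; the point is that duplicates and stale records can be caught programmatically before any AI tool consumes the data.

```python
from dataclasses import dataclass

@dataclass
class SupplierRecord:
    supplier_id: str      # identifier shared across ERP, portal, and contracts
    name: str
    tax_id: str
    last_verified: str    # ISO date of the last master-data review

def find_master_data_issues(records, stale_before="2024-01-01"):
    """Flag duplicates (same tax ID under different supplier IDs) and stale records.
    Field names and the staleness cutoff are hypothetical examples."""
    issues = []
    seen_tax_ids = {}
    for r in records:
        # Two supplier IDs sharing one tax ID usually means a duplicate entry.
        if r.tax_id in seen_tax_ids and seen_tax_ids[r.tax_id] != r.supplier_id:
            issues.append(("duplicate", r.supplier_id, seen_tax_ids[r.tax_id]))
        else:
            seen_tax_ids.setdefault(r.tax_id, r.supplier_id)
        # Records not reviewed since the cutoff are flagged as stale.
        if r.last_verified < stale_before:
            issues.append(("stale", r.supplier_id, r.last_verified))
    return issues
```

A nightly run of a check like this is cheap, and it keeps the reconciliation burden off the teams who are supposed to be acting on insights.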

The 70%: people, process, and governance decide whether AI is trusted

Procurement is a control function. It touches financial integrity, supplier risk, compliance, and reputation. Trust is the real currency.

People: roles, incentives, and adoption

AI changes how work is done, not just how fast. If buyers fear that automation may remove judgment from their role, they will avoid it. If legal teams don’t trust AI summaries, they will demand manual review. If finance sees exceptions handled inconsistently, they will tighten controls.

Adoption improves when roles are explicit:

  • who reviews agent recommendations;
  • who approves high-impact actions;
  • who owns exception resolution;
  • who is accountable when the agent flags a risk.

People also need “permission” to use the tools. Teams adopt faster when leadership frames AI as support for better decisions, not as surveillance.

Process: define the decision points, not just the steps

Procurement processes are often documented as steps. AI needs decision points:

  • Is this spend allowed without sourcing?
  • Does this contract permit a price change?
  • Is this supplier compliant for this region?
  • Should we escalate a delivery delay today or monitor?

When decision points are defined, you can decide where an agent helps, where it advises, and where it must hand off. Without decision clarity, AI either becomes overconfident or overly cautious — both reduce trust.
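One way to picture decision clarity is a routing function that maps the decision points above to an agent mode. The function name, inputs, and thresholds below are hypothetical, a sketch of the idea rather than a prescribed design:

```python
def route_purchase_request(amount, contract_allows_price_change,
                           supplier_compliant, sourcing_threshold=10_000):
    """Map decision points to one of three agent modes:
    'auto' (agent acts), 'advise' (agent recommends, a human decides),
    'handoff' (the agent steps aside entirely).
    All thresholds here are illustrative assumptions."""
    if not supplier_compliant:
        return "handoff"   # compliance questions always go to a human
    if amount >= sourcing_threshold:
        return "advise"    # sourcing required: agent drafts, buyer decides
    if not contract_allows_price_change:
        return "advise"    # contract terms unclear: recommend, don't execute
    return "auto"          # low-value, in-contract, compliant: agent may act
```

Once the decision points are explicit like this, arguing about where the thresholds sit is a governance conversation, not a model-tuning exercise.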

Governance: guardrails that prevent chaos

Governance isn’t about slowing teams down. It’s about creating a safe way to move faster. Practical governance includes:

  • Policy alignment (recommendations map to your real rules);
  • Permissioning (draft vs. execute by role and category);
  • Audit trails (what data was used, who approved, why);
  • Escalation rules (bank detail changes, high spend, sanctions flags);
  • Monitoring (drift, bias, and performance changes over time).

Procurement teams don’t need AI to be perfect. They need it to be accountable.
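The escalation rules and audit trail above can be sketched as declarative configuration rather than code buried in a model. The rule names, predicates, and approver roles below are hypothetical examples, assuming actions arrive as simple dictionaries:

```python
# Each rule: (rule name, predicate over the proposed action, required approver).
# Names, thresholds, and roles are illustrative assumptions.
ESCALATION_RULES = [
    ("bank_detail_change", lambda a: a.get("changes_bank_details"), "treasury"),
    ("high_spend",         lambda a: a.get("amount", 0) > 100_000,  "cpo"),
    ("sanctions_flag",     lambda a: a.get("sanctions_hit"),        "compliance"),
]

def evaluate_action(action):
    """Return the approvals an action needs, plus an audit record of
    which rules fired -- what data was used, who must approve, and why."""
    fired = [name for name, pred, _ in ESCALATION_RULES if pred(action)]
    approvals = [role for name, pred, role in ESCALATION_RULES if pred(action)]
    return {
        "action": action,
        "rules_fired": fired,
        "approvals_required": approvals,
    }
```

Keeping the rules in one inspectable list is what makes the system accountable: auditors can read the guardrails, and monitoring can log exactly which rule fired for every escalation.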

A realistic readiness checklist for procurement leaders

Data and systems

  • Do we have a reliable supplier master?
  • Are contracts searchable and linked to suppliers and categories?
  • Can we connect ERP, portal, and contract data without manual export?
  • Are we comfortable with the security and access model?

Operating model

  • Who owns exceptions, and how are they resolved today?
  • Which tasks are high-volume and repetitive vs. judgment-based?
  • What decisions can be standardized with clear thresholds?

Governance

  • What actions require approval, and at what thresholds?
  • How will we audit and explain decisions to stakeholders?
  • Who monitors performance and addresses failures?

If you can’t answer these questions, that’s not a blocker; it’s a roadmap.

The temptation is to buy the algorithm and hope the rest follows. In procurement, it rarely works that way. Algorithms are the tip. Integration makes AI usable. People, process, and governance make it trusted. Invest below the waterline, and AI becomes a durable capability the business relies on.
