Case Study · March 2026 · 11 min read

SAP Clean Core Migration: Ontology Extraction in Practice

By Sid, Founder at Vyuh

SAP Clean Core migration is one of the hardest problems in enterprise IT. Thousands of custom ABAP objects accumulated over decades. Complex dependencies nobody fully understands. Manual assessments that take months and still miss things.

We connected Vyuh to a live SAP system and let it do what it does: extract the ontology, build the capability graph, and deploy intelligence on top. Here's what happened.


The Problem: Decades of Custom Code

Every SAP system accumulates custom code. Z-programs, custom transactions, modified standard objects, bespoke Fiori apps, custom OData services. Over time, this creates an opaque landscape that nobody fully understands.

Clean Core migration — moving to standard SAP functionality and pushing customizations to the BTP extension layer — requires understanding every custom object: what it does, what depends on it, whether standard alternatives exist, and what breaks if you touch it.

Traditional approaches fail in predictable ways:

  • Manual assessment takes 3-6 months and produces stale spreadsheets
  • Static analysis tools see syntax but miss semantics
  • Consultants bring frameworks but lack system-specific context
  • Nobody can answer "what happens if I remove this object?" with confidence

What Vyuh Extracts

When Vyuh connects to an SAP system, it doesn't just list objects. It extracts the ontology — the structure of what exists, how things relate, and what they mean.

The Extraction Layer

From a single SAP connection, Vyuh automatically catalogs:

  • Custom objects — Z-programs, function modules, classes, includes, with full dependency graphs
  • Fiori applications — UI5 apps, their OData bindings, navigation targets, and tile configurations
  • OData services — entity sets, associations, function imports, and which Fiori apps consume them
  • Data flows — how data moves between entities, which custom code touches which tables, upstream and downstream dependencies
  • Standard coverage — where SAP standard functionality already exists as a replacement for custom code

This isn't a flat inventory. It's a semantic graph — each object is typed, connected to its dependencies, and annotated with its role in the system.
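To make the idea concrete, here is a minimal sketch of what such a typed graph could look like. The class names, object types, and relation labels (`binds_to`, `reads_from`, etc.) are illustrative assumptions — Vyuh's actual internal representation is not public.

```python
from dataclasses import dataclass, field
from enum import Enum

class ObjectType(Enum):
    # Hypothetical object categories mirroring the bullet list above
    Z_PROGRAM = "z_program"
    FIORI_APP = "fiori_app"
    ODATA_SERVICE = "odata_service"
    TABLE = "table"

@dataclass
class Node:
    name: str
    type: ObjectType
    annotations: dict = field(default_factory=dict)  # the object's role in the system

@dataclass
class Edge:
    source: str
    target: str
    relation: str  # e.g. "calls", "binds_to", "reads_from"

@dataclass
class OntologyGraph:
    nodes: dict = field(default_factory=dict)  # name -> Node
    edges: list = field(default_factory=list)

    def add(self, node: Node) -> None:
        self.nodes[node.name] = node

    def connect(self, source: str, target: str, relation: str) -> None:
        self.edges.append(Edge(source, target, relation))

    def dependencies_of(self, name: str) -> list:
        # Direct (one-hop) dependencies of an object
        return [e.target for e in self.edges if e.source == name]
```

The point of the structure is that every query an agent might ask ("what does this app bind to?") reduces to a traversal over typed nodes and labeled edges, not a text search over ABAP source.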

From Ontology to Capabilities

The extracted ontology becomes a capability layer. Each capability is something an AI agent can discover, understand, and use:

Capability: assess_custom_object
Inputs:
  object_name: string (e.g., "Z1_INVOICE_UI_V3")
  assessment_type: enum [clean_core, complexity, dependency]
Returns:
  classification: retire | refactor | replatform | retain
  confidence: float
  dependencies: object[]
  standard_alternative: string | null
  migration_effort: estimate

The agent doesn't need to know about ABAP, SAP tables, or system internals. It sees typed capabilities with validated inputs and structured outputs.
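As a sketch of what "typed capabilities with validated inputs" means in practice, the spec above could be exposed to an agent roughly like this. The function body is a stub — the real capability queries the extracted graph — and the validation rule (Z-namespace only) is an illustrative assumption.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class AssessmentType(Enum):
    CLEAN_CORE = "clean_core"
    COMPLEXITY = "complexity"
    DEPENDENCY = "dependency"

class Classification(Enum):
    RETIRE = "retire"
    REFACTOR = "refactor"
    REPLATFORM = "replatform"
    RETAIN = "retain"

@dataclass
class Assessment:
    # Structured output matching the capability spec above
    classification: Classification
    confidence: float
    dependencies: list
    standard_alternative: Optional[str]
    migration_effort: str

def assess_custom_object(object_name: str,
                         assessment_type: AssessmentType) -> Assessment:
    # Input validation happens before the agent's request reaches the system
    if not object_name.startswith("Z"):
        raise ValueError("only custom (Z-namespace) objects can be assessed")
    # Stub result; a real implementation reasons over the extracted ontology
    return Assessment(
        classification=Classification.REPLATFORM,
        confidence=0.82,
        dependencies=[],
        standard_alternative=None,
        migration_effort="medium",
    )
```

Because the inputs are validated and the outputs are typed, a malformed or out-of-scope request fails at the capability boundary rather than producing a plausible-sounding wrong answer.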


The Migration Advisor

On top of the extracted ontology, we deployed an AI migration advisor. This isn't a chatbot with a prompt — it's an agent operating within the bounded world of capabilities Vyuh defined.

The advisor can:

  • Assess any custom object against Clean Core principles
  • Trace dependency chains to identify migration blast radius
  • Propose migration sequences that respect dependencies
  • Generate executive summaries with effort estimates
  • Answer specific questions grounded in actual system data

Critically, every answer is grounded in the extracted ontology. When the advisor says "Z1_INVOICE_UI_V3 has 12 downstream dependencies," that's not a hallucination — it's a fact from the capability graph.
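"Migration blast radius" is just transitive closure over the dependency edges. A minimal sketch, assuming the graph is available as an adjacency map from each object to the objects that depend on it (the real capability graph is richer):

```python
from collections import deque

def blast_radius(dependents: dict, start: str) -> set:
    """All transitive downstream dependencies of `start`, via BFS.

    `dependents` maps an object name to the names that directly
    depend on it. The returned set is everything that could break
    if `start` is removed or changed.
    """
    seen: set = set()
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for dep in dependents.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen
```

Because the answer is a set computed from extracted edges, the dependency count the advisor reports is reproducible, not generated.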


What We Learned

Ontology extraction is the unlock

The advisor is only as good as the structure it operates on. Without the ontology — the typed, connected, semantic understanding of the system — it would just be another chatbot guessing about SAP. The extraction layer is where the value is created. The intelligence layer is where it's delivered.

Closed world matters for enterprise

Enterprise customers don't want AI that might be right. They want AI that operates within known boundaries. The capability layer gives the advisor a bounded action space — it can only assess objects that exist, trace dependencies that are real, and propose migrations based on actual system state.

Grounding eliminates the hallucination problem

The number one objection to AI in enterprise: "How do I know it's not making things up?" When every response is traceable to extracted system data, that objection dissolves. The advisor doesn't generate knowledge — it reasons over structure that Vyuh extracted.

The system grows without agent rewrites

As we extracted more of the SAP landscape — adding Fiori apps, then OData services, then data flows — the advisor automatically became more capable. It could answer questions it couldn't before, not because we changed its code, but because the capability graph grew.
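One way this works mechanically is capability discovery: the agent enumerates the registry at runtime instead of hard-coding what it can do. The registry and decorator below are an illustrative sketch, not Vyuh's actual API.

```python
REGISTRY: dict = {}

def capability(name: str):
    """Register a function as a discoverable capability."""
    def register(fn):
        REGISTRY[name] = fn
        return fn
    return register

@capability("assess_custom_object")
def assess(object_name: str) -> dict:
    # Stub: the real capability consults the extracted ontology
    return {"object": object_name, "classification": "replatform"}

# A later extraction pass adds OData knowledge; the agent's code
# is untouched, but discover() now returns one more capability.
@capability("trace_odata_bindings")
def trace(service: str) -> dict:
    return {"service": service, "consumers": []}

def discover() -> list:
    """What the agent sees: the current capability surface."""
    return sorted(REGISTRY)
```

Under this pattern, "the advisor became more capable" means the extraction layer registered new entries — no agent rewrite required.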


The Bigger Pattern

SAP Clean Core is one instance of a general pattern:

  1. Connect to a complex enterprise system
  2. Extract the ontology — what exists, how it connects, what it means
  3. Build a governed capability layer from the extracted structure
  4. Deploy AI that operates within that bounded world

The system doesn't matter — SAP, Salesforce, internal APIs, legacy databases. The pattern is the same: extract structure, govern capabilities, deploy intelligence.

That's what Vyuh does. The SAP migration advisor is just the first proof point.


Try It

The SAP Explorer is live. You can see the extracted ontology, ask the migration advisor questions, and explore the capability graph yourself:

Open the SAP Explorer →