The SAP ERP integration with Akeneo PIM package is designed to automate the bulk (batch) export of product data from SAP S/4HANA to Akeneo PIM.
This package is optimized to handle large volumes of data, whether for an initial catalog setup or for periodic mass synchronizations.
In addition to the core package, an Akeneo PIM Data Model Generator package is available so you can easily retrieve your PIM structure and simplify the message-mapping work.
Quick Overview of the iFlow
Below is a high-level summary of the iFlows implemented for Material Master data synchronization. Each iFlow corresponds to a logical data area or process step.
| iFlow Name | Purpose |
|---|---|
| Import in Batch Data from SAP ERP to Akeneo PIM | The entry-point iFlow, where the data collection logic must be implemented; collected data is pushed to the orchestration iFlow |
| Mapping SAP ERP fields to Akeneo PIM attributes | Maps your source fields to your target fields; the step where the source payload is transformed into the target structure |
| Orchestrate Data in Batch | Called by the data collection iFlow; manages the remaining steps (PIM token, mapping, search, PIM API) |
Overall process
- The entry point is the data collection iFlow Import in Batch Data from SAP ERP to Akeneo PIM, where the integrator adapts the iFlow to the customer's intended logic and retrieves the product master data to synchronize to Akeneo PIM.
- Next, the data mapping and transformation are conducted. Because the mapping is specific to each customer, this configuration is expected to be done in the Message mapping S/4HANA to Akeneo PIM.
- Once data is collected and mapped, it is sent to the orchestration iFlow Orchestrate Data in Batch, which manages product synchronization as a process: it handles the PIM token, checks whether each product is a creation or an update, and finally calls the PIM API to push the product master data. This iFlow calls the search and mapping steps.
Prerequisites
Before configuring the flows, ensure the following conditions are met:
- PIM Configuration: A connection must be created in Akeneo to obtain the Client ID and Secret credentials.
- Network Security: SAP Integration Suite IPs must be authorized on the Akeneo side, and the Cloud Connector must be configured for the SAP OData API.
- Keystore: Import the Akeneo root certificate if necessary.
Security material and main configuration setup
To connect both systems, create the required connection settings in the Monitor section of the Integration Suite. We recommend creating:
| Credential needed | Content |
|---|---|
| S4HANA_CONNECTIVITY | User & Password (to connect to the SAP OData service) |
| AKENEO_BASIC_AUTH | User & Password |
| AKENEO_CLIENT_SECRET | Client ID & Client Secret |
Once created, these security materials must be referenced in the package. To do so, update the security material aliases in the various iFlows of the package.
Information
The iFlow parameters are linked: once a value is registered in one iFlow, it is replicated to the others. However, we advise checking all of the parameters to make sure they are up to date.
Configurable Parameters
Import in Batch Data from SAP ERP to Akeneo PIM iFlow
Before deploying the iFlow, ensure the following parameters are correctly filled in your SAP CPI configuration.
Receiver
For the R_S4HANA Receiver
| Parameters | Description |
|---|---|
| SAP_Address | URL of the SAP S/4HANA instance (OData API) |
| Credential Name (SAP_CredentialName) | Security alias for SAP access (Basic Auth) |

In this example, the security material is called "SAPTechnicalUser" and SAP_Address is the base URL of the S/4HANA instance.
More
In the All Parameters tab
| Parameters | Description |
|---|---|
| Batch_MaxElements | Maximum number of products retrieved per batch in SAP (e.g., 100 products) |
| Batch_MaxParallelProcess | Number of parallel threads for splitting |
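As a rough illustration of how Batch_MaxElements drives the segmentation step: the package does this with a Groovy script inside CPI, so the Python sketch below only mirrors the chunking logic, and all names in it are illustrative.

```python
def split_into_batches(products, batch_max_elements=100):
    """Split a product list into fixed-size chunks, mirroring the role
    of the Batch_MaxElements parameter in the iFlow."""
    return [products[i:i + batch_max_elements]
            for i in range(0, len(products), batch_max_elements)]

# 250 products with Batch_MaxElements = 100 -> 3 batches (100, 100, 50)
batches = split_into_batches([f"MAT-{n:04d}" for n in range(250)], 100)
```

Batch_MaxParallelProcess then controls how many of these chunks CPI processes concurrently.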

Orchestrate Data in Batch iFlow
Before deploying the iFlow, ensure the following parameters are correctly filled in your SAP CPI configuration.
Sender
For the SND Sender and the JMS adapter
| Parameters | Description |
|---|---|
| Akeneo retry interval definition | Retry can be adjusted to fit your business need |

Receiver
For the R_PIM Receiver and the first HTTP adapter
| Parameters | Description |
|---|---|
| PIM_HTTP_Adress | Base URL of the Akeneo PIM instance |

For the R_PIM Receiver and the second HTTP adapter
| Parameters | Description |
|---|---|
| PIM_HTTP_Adress | Base URL of the Akeneo PIM instance |
| Credential Name (PIM_Credential_Name_OAuth2) | Security alias for the Akeneo OAuth Client ID / Secret |

For the R_CPI Receiver and the first JMS adapter
| Parameters | Description |
|---|---|
| AkeneoRetry | Retry retention and expiration can be adjusted to fit your business need |

More
In the All Parameters tab
| Parameters | Description |
|---|---|
| PIM_BatchMaxProducts | Maximum number of products sent per batch to Akeneo (e.g., 10 products) |
| PIM_CREDENTIAL_NAME | Security alias for the Akeneo OAuth Client ID / Secret |

In the Content Modify tab
| Parameters | Description |
|---|---|
| Name [ID]: CM_SetPIM_Credential_Name_Basic[…] | In Source Value, the security alias for the Akeneo OAuth Client ID / Secret |

Technical Workflow
The global synchronization cycle follows these technical steps:
- Data Collection: The flow extracts data via an OData GET call on API_PRODUCT_SRV/A_Product. It is recommended to use a filter on the modification date to process only deltas.
- Technical Segmentation: A Groovy script fragments the global XML message into fixed-size batches (defined by Batch_MaxElements).
- Business Mapping: Each batch is transformed to translate technical SAP fields into PIM attribute codes.
- Orchestration & Sending: The flow manages the OAuth2 token lifecycle (caching and renewal) before executing PATCH calls to the Akeneo /api/rest/v1/products endpoint.
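The token caching and renewal idea can be sketched as follows. The package handles this internally in CPI; this Python class is only a stand-in to show the principle, and the fetch_token callable (which would call Akeneo's token endpoint) is an assumption of the sketch.

```python
import time

class TokenCache:
    """Cache an OAuth2 access token and renew it shortly before expiry,
    so each batch does not trigger a new token request."""
    def __init__(self, fetch_token, safety_margin=60):
        self.fetch_token = fetch_token      # callable returning (token, expires_in_seconds)
        self.safety_margin = safety_margin  # renew this many seconds before expiry
        self._token = None
        self._expires_at = 0.0

    def get(self):
        # Fetch a fresh token only when none is cached or expiry is close.
        if self._token is None or time.time() >= self._expires_at - self.safety_margin:
            self._token, expires_in = self.fetch_token()
            self._expires_at = time.time() + expires_in
        return self._token
```

Repeated calls reuse the cached token until it nears expiry, which is the behavior the orchestration iFlow provides for the Akeneo API calls.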
SAP Data Filtering
Data retrieval on the SAP side takes place exclusively in the first iFlow: Import in Batch Data from SAP ERP to Akeneo PIM. To optimize performance and target specific products, you can apply filters directly at the source.
Where to apply the filter?
- Open the iFlow in edit mode.
- Locate the Request Reply step (usually at the beginning of the flow).
- Click on the OData Adapter (the connection arrow) reaching out to the SAP S/4HANA system.
- Go to the Processing tab.
Configuring Query Options ($filter)
In the Query Options field, you can define your filtering criteria to restrict the data set:
- By Date: To retrieve only products created or modified since a specific date: $filter=LastChangeDateTime ge datetimeoffset'2024-01-01T00:00:00Z'
- By Product Type: To filter for a specific category (e.g., Finished Products): $filter=ProductType eq 'FERT'
Best Practice
Always use a $select statement alongside your filter to retrieve only the specific fields required by Akeneo. This reduces XML payload size and significantly improves iFlow execution speed.
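Put together, a Query Options value combining a delta filter with a $select looks like the string below. The field list is illustrative (adjust it to the attributes your mapping actually uses); the Python snippet only assembles the string you would paste into the adapter's Query Options field.

```python
# Illustrative delta filter plus field restriction for the OData adapter.
filter_expr = "LastChangeDateTime ge datetimeoffset'2024-01-01T00:00:00Z'"
select_expr = "Product,ProductType,BaseUnit,LastChangeDateTime"  # example fields only
query_options = f"$filter={filter_expr}&$select={select_expr}"
```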
Mapping between SAP ERP and Akeneo PIM structure
To conduct the mapping, the SAP S/4HANA structure and the Akeneo PIM structure both need to be present in the message mapping step of the iFlow Mapping SAP ERP fields to Akeneo PIM attributes. Generate the Akeneo PIM structure file with the Akeneo PIM Data Model Generator package, then import it into the message mapping of the package.
Generate the OpenAPI structure files
The steps are:
- Generate Akeneo Structure: In the Akeneo PIM Data Model Generator package, configure the R_PIM HTTP receiver to use the appropriate PIM_HTTP_Query (e.g., family=t-shirts). Run the iFlow.
- Download and Save: The output will be a JSON or XML file representing your Akeneo structure. Download this file and save it locally.
- Load into Mapping iFlow: Go to the Mapping SAP ERP fields to Akeneo PIM attributes iFlow.
- Open Message Mapping: Click on the MM_PRODUCT_TO_PIM block.
- Import Target Structure: On the right-hand side, import the Akeneo structure file you just saved.
The integrator will also need to manually obtain or generate the SAP S/4HANA structure to use as the source file in the mapping step.

Conduct the mapping
The flow is all set! You can now manually perform the mapping and transformation between the two systems.

Management and update of existing products
The package automatically handles the creation of new products and the update of products already present in Akeneo PIM based on the PATCH product endpoint. This mechanism requires no manual configuration from the user.
The "Upsert" Principle (Update or Insert)
The integration utilizes the PATCH method on the Akeneo API. This operating mode, known as an "Upsert," allows the flow to be intelligent:
- If the product does not exist, Akeneo automatically creates it using the data transmitted from SAP.
- If the product already exists, Akeneo updates only the attributes sent by SAP. All other data already present in the PIM (manual enrichments, marketing descriptions, media) remains intact and is never overwritten.
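To make the upsert concrete, here is a sketch of the kind of newline-delimited JSON body that Akeneo's bulk PATCH /api/rest/v1/products endpoint consumes (the product identifiers and attribute values below are invented). Sending only the SAP-owned attributes is what keeps existing PIM enrichments intact.

```python
import json

def build_bulk_patch_body(products):
    """Build a newline-delimited JSON body: one product object per line,
    keyed by its identifier (SKU). Akeneo updates only the attributes
    present in each object; everything else in the PIM is left untouched."""
    return "\n".join(json.dumps(p) for p in products)

body = build_bulk_patch_body([
    {"identifier": "MAT-0001",
     "values": {"name": [{"locale": None, "scope": None, "data": "Bolt M8"}]}},
    {"identifier": "MAT-0002",
     "values": {"name": [{"locale": None, "scope": None, "data": "Nut M8"}]}},
])
```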
Identification via SKU
The distinction between a creation and an update is based exclusively on the SKU (Product Identifier):
- The field mapped from SAP to the Akeneo identifier (SKU) serves as the unique key
- During the data transfer, Akeneo checks if this SKU already exists in its database:
- Match found: The existing product is updated
- No match found: A new product is created
Resilience and Performance
The flow is designed to process products in batches (Bulk mode) to optimize performance. In the event of high load on the Akeneo API (429 error), the package includes an automatic Retry logic to ensure that no updates are lost, even during high-volume data transfers.
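The retry behavior can be sketched as a simple exponential backoff on HTTP 429 responses. This is an illustrative stand-in for the package's built-in JMS-based retry, not its actual implementation; `send_batch` is a hypothetical callable returning an HTTP status code.

```python
import time

def send_with_retry(send_batch, max_retries=5, base_delay=2.0, sleep=time.sleep):
    """Retry a batch send on HTTP 429 (throttling) with exponential backoff.
    Returns the first non-429 status, or raises after max_retries attempts."""
    for attempt in range(max_retries + 1):
        status = send_batch()
        if status != 429:
            return status
        if attempt < max_retries:
            sleep(base_delay * (2 ** attempt))  # wait longer after each 429
    raise RuntimeError("still throttled after %d retries" % max_retries)
```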