Sending Data from OE to Celonis

See how you can transfer data from OE to Celonis using dedicated extractors.


Digital Processes capture information in events throughout their lifecycle. This event data can be imported into Celonis with a Data Extraction job. To send the data, you need an extractor that brings data from the Orchestration Engine into Celonis. For OE, you can customize the extractor so that it shares exactly the data you want to send to Celonis.

Prerequisite

To extract data from OE to Celonis, you first have to create a Data Pool in your Celonis account. Celonis uses Data Pools to collect and cluster data when setting up an integration flow. For details, see the documentation.

Extracting data from OE digital processes

Creating a data connection

You can create your own data extractor based on an Emporix template. The steps below show how to work with the extractor template and how to use it for your custom configuration.

Start by creating a connection in your Celonis Data Pool that links your OE as the Data Source:

  1. In your Celonis account, go to Data Integration -> Your Data Pool -> Data Connections and choose Add Data Connection.

  2. Choose Connect to Data Source.

  3. Go to the Custom section and choose Build custom connection -> Import from file.

  4. Enter a Name for your custom extractor. Optionally, you can also add a Description.

  5. Download the Digital Process Event Log Extractor (Digital-Process-Event-Log-Extractor.json) and upload it as a JSON file in the Celonis extractor builder.

  6. Choose Save and Continue.

  7. Check the parameters that should be used for the connection and continue to the next step. The parameters were configured automatically from the uploaded extractor JSON file.

  8. Check the authentication method. For Emporix, it's the OAuth2 (Client Credentials) method with the following endpoint and type details:

    • Get Token Endpoint: {Connection.AUTH_URL}/oauth/token

    • Grant Type: client_credentials

    Don't modify this configuration. Continue to the next step.

  9. Click the process context endpoint to check and customize its configuration.

    The response visible in the endpoint configuration is the part you have to customize. In the JSON file, enter all the tenant and process context information that your custom connection needs to return the proper response.

    Make sure context$processid is checked as a Primary Key. Without context$processid, the child tables cannot be linked back to the parent.

  10. Choose Finish to save your custom connection configuration.

  11. Go back to the Data Connections list and choose Add Data Connection -> Connect to Data Source -> Custom. Your newly created connection is listed there, under Custom connections.

  12. Choose your connection and check its configuration details to make sure all the authorization details, such as the Client ID and Client Secret, are in place. Save your changes.
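The OAuth2 (Client Credentials) exchange the extractor performs against {Connection.AUTH_URL}/oauth/token can be sketched in Python. This is illustrative only: the URL and credential values are placeholders, and only the endpoint path and grant type come from the configuration above.

```python
# Sketch of the token request the extractor sends. AUTH_URL, CLIENT_ID and
# CLIENT_SECRET are placeholders for your tenant's values.
import urllib.parse
import urllib.request


def build_token_request(auth_url: str, client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the POST to {auth_url}/oauth/token with grant_type=client_credentials."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(
        f"{auth_url}/oauth/token",
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )
```

Sending the request (for example with urllib.request.urlopen) returns a JSON body containing the access token that authorizes the extraction calls.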

Customizing data jobs

With the connection in place, you can adjust the data that is imported into your Celonis account.

  1. To start with the data job configuration, go to Data Pools -> Your Data Pool -> Data Jobs and choose Add Data Job. Add a name for the Data Job and choose your Data Connection.

  2. In the Custom Data Job, choose +Add Extraction and select all the available tables: account, digital_process, event_type, process_context.

Now you can add filters to limit the loaded context and its content. See the examples below for reference:

Getting data from the finished digital processes only

To get data only from finished digital process runs (and exclude data from digital processes that are still running), add a filter that fetches data only from digital processes with the finished status.

  1. Go to Data Pools -> Your Data Pool -> Data Jobs -> Extractions -> Your Extraction -> process_context.

  2. In the Filter Statement section, add status = 'finished'. This field is optional, but it makes sure you only get data from completed processes.

Getting data from digital processes finished in a specified time frame

To get data only from finished digital process runs and make sure only changed data is loaded, use a filter on the finished status and the modification time.

  1. Go to Data Pools -> Your Data Pool -> Data Jobs -> process_context.

  2. Create a new parameter for the modification time - <%=max_modified_time%>.

    Go to Your Extraction -> Parameters tab, choose New Parameter, and provide the following values in the fields:

    • type: dynamic

    • table: process_context

    • column: metadata$updatedAt

    • operation type: FIND_MAX

    • data type: date

    • default date: any date from the past

  3. Go to process_context and, in the Delta Filter Statement section, add status = 'finished' AND updatedSince = <%=max_modified_time%>.
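The delta filter above can be pictured as client-side logic: keep only finished contexts modified after the last high-water mark, then advance the mark for the next run. This is an illustrative Python sketch, not Celonis code; the dictionary keys mirror the status and metadata$updatedAt columns used above.

```python
# Illustrative sketch of the delta filter
#   status = 'finished' AND updatedSince = <%=max_modified_time%>
# Timestamps are ISO-8601 strings, which compare correctly as plain strings.
def delta_load(contexts, max_modified_time):
    """Return the finished contexts updated after max_modified_time,
    plus the new high-water mark for the next run."""
    fresh = [
        c for c in contexts
        if c["status"] == "finished" and c["metadata$updatedAt"] > max_modified_time
    ]
    new_mark = max((c["metadata$updatedAt"] for c in fresh), default=max_modified_time)
    return fresh, new_mark
```

Running contexts in the running state, or finished contexts older than the mark, are skipped, which is exactly what keeps repeated loads incremental.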

Limiting data load to a period from a specified date

If you want to limit the date from which you load the post action feedback data, use the createdSince filter.

  1. In Celonis, go to Data Pools -> Your Data Pool -> Data Jobs -> Your Data Job -> Extractions and choose Your Extraction task.

  2. Open the process_context configuration.

  3. Go to Time Filter and configure the filter to customize the creation period.

Setting up process context scope to live or draft digital process instances

You can customize the process context extractor to include live, draft, or both live and draft digital process instances.

  1. Go to Data Pools -> Your Data Pool -> Data Jobs -> Your Data Job -> Extractions and choose the extraction task.

  2. Go to Parameters and choose New parameter.

  3. Set up the new parameter to include the draft, live, or draft and live scopes. For example:

    • Placeholder: includeScopes

    • Name: Include Scopes

    • Description: you can add your text about the parameter here for future reference

    • Type: private

    • Data type: text

    • Values: you can decide if you want to limit the data to live or draft only

      • draft includes the instances of draft digital process versions

      • live includes the instances that are published and active

      • live,draft includes both live and draft instances

After creating the parameter, you have to create a filter for it in the process context.

  1. In the Data Jobs -> Your Data Job -> Extractions -> Table Configuration choose process_context.

  2. Go to the Additional Filter section and add filters for the included scopes:

    • Filter Statement - includeScopes = '<%=includeScopes%>'

    • Delta Filter Statement - updatedSince = <%=max_modified_time%> and includeScopes = '<%=includeScopes%>'. This example combines both the time and scope filters.
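The effect of the scope filter can be sketched in Python. This is illustrative only: the scope field name on the extracted records is an assumption, while the comma-separated value format (live,draft) follows the parameter values listed above.

```python
# Illustrative sketch of includeScopes = '<%=includeScopes%>': a comma-separated
# value such as "live,draft" selects which instance scopes are extracted.
# The "scope" field name on the records is assumed for illustration.
def filter_by_scope(contexts, include_scopes):
    wanted = {s.strip() for s in include_scopes.split(",")}
    return [c for c in contexts if c["scope"] in wanted]
```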

Enabling activity log

To make it possible to generate an activity log in Celonis:

  1. Go to Data Pools -> Your Extractor -> Data Jobs -> Transformations -> Extract Event Log and use the script below.

The prepared script creates an activity log table based on the name of the digital process and supports delta loads. It creates a single activity for each event that is triggered in a digital process.

CREATE TABLE IF NOT EXISTS activity_log(
  digital_process_name varchar(256), 
  event_name varchar(256), 
  event_key  varchar(256) PRIMARY KEY, 
  event_id varchar(26),
  case_id varchar(256), 
  occurred_at TIMESTAMP, 
  read_at TIMESTAMP, 
  sorting BIGINT);

DROP TABLE IF EXISTS activity_log_new;
CREATE TABLE activity_log_new(
  digital_process_name varchar(256), 
  event_name varchar(256), 
  event_key  varchar(256),
  event_id varchar(26) PRIMARY KEY, 
  case_id varchar(256), 
  occurred_at TIMESTAMP, 
  read_at TIMESTAMP, 
  sorting BIGINT);

INSERT INTO activity_log_new
SELECT "digital_process"."name" AS "digital_process_name",
       "event_type"."name" AS "event_name",
       "event_type"."key" AS "event_key",
       "process_context_event"."id" AS "event_id",
       "process_context"."context$processid" AS "case_id",
       "process_context_event"."occurredAt" AS "occurred_at",
       "process_context_event"."readAt" AS "read_at",
       TIMESTAMPDIFF(MILLISECOND, '1970-01-01 00:00:00', "process_context_event"."readAt") AS "sorting"
FROM "digital_process", "process_context", "process_context_event", "event_type"
WHERE "process_context"."context$executionTemplateID" = "digital_process"."id"
  AND "process_context_event"."process_context$context$processid" = "process_context"."context$processid"
  AND "event_type"."key" = "process_context_event"."eventType";

MERGE INTO activity_log 
USING activity_log_new 
ON activity_log_new.event_id = activity_log.event_id
WHEN NOT MATCHED THEN 
  INSERT (digital_process_name, event_name, event_key, event_id, case_id, occurred_at, read_at, sorting) 
  VALUES (activity_log_new.digital_process_name, 
          activity_log_new.event_name, 
          activity_log_new.event_key, 
          activity_log_new.event_id,
          activity_log_new.case_id, 
          activity_log_new.occurred_at, 
          activity_log_new.read_at,
          activity_log_new.sorting);

DROP TABLE IF EXISTS activity_log_new;
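The sorting column produced by TIMESTAMPDIFF(MILLISECOND, '1970-01-01 00:00:00', readAt) is simply the readAt timestamp expressed as milliseconds since the Unix epoch. In Python terms:

```python
# Python equivalent of the script's sorting expression:
# TIMESTAMPDIFF(MILLISECOND, '1970-01-01 00:00:00', "readAt")
from datetime import datetime, timezone


def sorting_value(read_at: datetime) -> int:
    """Milliseconds between the Unix epoch and read_at."""
    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    return int((read_at - epoch).total_seconds() * 1000)
```

Because the value grows monotonically with readAt, it gives Celonis a stable numeric ordering for events that share the same timestamp granularity.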

  2. Set up the Data Model to establish relations between all of the extractor's components.

    • Go to Your Extractor -> Data Model and choose Add Data Model.

    • Select all the items to be added to the Activity Table and choose Next.

    • In the Activity Table setup, configure the activity columns. Choose activity_log, select the following columns, and confirm with Finish:

      • Case ID - the digital process ID

      • Activity name - the event_name

      • Timestamp - occurred_at

      • Sorting - for a better data display

      • Read_at - the point when an event is saved, later used for sorting

  3. Set up all the Foreign Keys for your activity table. For example, set the ID key for the Account:

    • Choose New foreign key in the account settings.

    • Connect the ID field from the Dimension table to the digital_process id in the Fact table.

Mandatory relations:

  • Link Process Context (Dimension) with Activity Log (Fact) by linking activity_log.case_id and process_context.context$processid.

  • Link Digital Process (Dimension) with Process Context (Fact) by linking digital_process.id and process_context.context$executionTemplateID.

Recommended relations:

  • Use the Event Log as the activity table.

  • Link Process Context (Dimension) with the additional tables (Fact) using process_context.context$processid to load any additional data as part of your post action feedback event.

Optional relations:

  • Link Digital Process (Dimension) with Digital Process Triggers (Fact) by linking digital_process.id and digital_process$trigger.digital_process_id.

  • If you want OE multi-tenant separation of data, link Account (Dimension) with Digital Process (Fact) by linking account.id and digital_process.tenant.
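As a minimal illustration of the mandatory relations, the following sqlite3 sketch joins the activity log back to its digital process through the process context keys named above. The table contents are invented sample data; only the column names come from the extractor schema.

```python
# Minimal sqlite3 sketch of the mandatory relations:
#   activity_log.case_id -> process_context."context$processid"
#   process_context."context$executionTemplateID" -> digital_process.id
# Table contents are invented sample data for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE digital_process (id TEXT, name TEXT);
CREATE TABLE process_context ("context$processid" TEXT, "context$executionTemplateID" TEXT);
CREATE TABLE activity_log (case_id TEXT, event_name TEXT);
INSERT INTO digital_process VALUES ('dp-1', 'Order Approval');
INSERT INTO process_context VALUES ('ctx-1', 'dp-1');
INSERT INTO activity_log VALUES ('ctx-1', 'process.started');
""")
rows = conn.execute("""
    SELECT dp.name, al.event_name
    FROM activity_log al
    JOIN process_context pc ON al.case_id = pc."context$processid"
    JOIN digital_process dp ON pc."context$executionTemplateID" = dp.id
""").fetchall()
# Each activity row is now linked back to its digital process name.
```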

To see an example of a configured data model, check the sample Data Model below:

  • Process Context: the central component; it belongs to one digital process

  • Digital Process: can have many process contexts and many digital process trigger events, but only one account

  • Event Log: has a 1:1 relationship with the process context

  • Account: represents the OE tenant; it can have many digital processes

Executing a data load

Execute a data load based on your configuration and created connection.

  1. Go to Data Pools -> Data Jobs and choose Execute Data Job.

  2. Choose Execute Selection.

The job starts and you can already follow the process logs.

When the process is finished, you can check the log details.

You can find the Client ID and Client Secret in the Emporix Developer Portal under Manage API Keys.

For more details and instructions on how the Celonis data model and transformation look, see the documentation.

To learn more about the configuration of Data Models in Celonis, see the documentation.
