Get familiar with OE step by step by going through the training trails.
Emporix Orchestration Engine provides Hello World exercises designed to acquaint you with the realm of value streams and the OE environment. As a brief overview of OE, we construct a value stream from the ground up, allowing you to grasp the concepts and familiarize yourself with the available tools.
Before starting with the trails, take a look at some basic information about how the processes in OE are built - see Value Stream Components guide.
Then, go through the trails in the following order:
OE uses Make's functionality for configuration of value streams. Learn more about the tool.
As OE is closely integrated with Make (formerly Integromat), it's important to get to know the application. OE's value streams use Make scenarios to perform actions in other systems. It's essential to understand the way the scenarios behave and how they can be combined within a process.
To learn more about Make, you can check the following documentation:
Learn what the process of provisioning OE looks like.
After receiving a provisioning request, Emporix prepares an OE tenant for you. When the tenant is ready, you'll receive an email notification to activate your account. Then, you can access the Emporix Management Dashboard and Developer Portal.
Management Dashboard is where you access OE.
Developer Portal is where you manage your account, tenant or API Keys.
If you want to place a provisioning request, contact our team.
Learn how you can configure your OE tenant.
Configuring your OE tenant involves setting up the relevant access controls and connections. This can be done in the Admin section of OE.
To access the Admin section, log in to the Management Dashboard and go to OE -> Admin.
In the Admin section, you can find your Make settings, Tenant ID and configuration tabs related to Datasources and KPIs & Analytics.
For details on how to manage and use the configurations in the Admin section, take a look at:
Test out the value stream before publishing.
When you create or edit a value stream, you have two possibilities as the next actions: test or publish.
Test - allows you to test the draft version of the process, which means all the triggers and steps that you constructed in the value stream can be checked before you publish them
Publish - creates a production version of the value stream
To test the value stream before publishing, choose the Test button.
You can test the value stream in two ways:
Reuse a Make scenario in another value stream.
You can clone a scenario prepared in Make to use it in your OE value stream. Go to the Make application and choose the Clone action for the selected scenario. You can copy all types of scenarios, either the ones used as triggers or the ones prepared for other process steps.
When you clone a scenario in Make, a Clone pop-up window is displayed where you should specify the name for the scenario's copy and select, or create, a new webhook to trigger the copy of the scenario.
Each Make scenario that's invoked by the OE process engine starts with a cloud event that is set as a trigger. This requires a unique webhook so that the value stream engine routes the request to the proper scenario. When you add the scenario to the value stream, it sends requests to this webhook and not to the webhook of the original cloned scenario. This way, the new cloned scenario gets invoked.
Additionally, in the Administration module of Management Dashboard, you can set up your settings for the tenant users and their permissions. For details, see Users and roles documentation.

Copy the curl request sample - you can use it for sending a request and checking if it starts the process (for instance in Postman)
Test directly in the UI - when the process is not published yet, you can test it directly from the UI, using the existing payload example or adding your own. If the test is successful, you get a notification in the same window.

Orchestration Engine is available as a module in the Management Dashboard.
OE is available from the Emporix Management Dashboard. The dashboard is where you perform your administration and configuration tasks. It’s also where you check your data and statistics.
As a customer, you receive your tenant details with credentials to log in to Emporix Management Dashboard when the provisioning process is complete.
To access OE:
Use the link provided to you in the account activation email and log in to the dashboard with your email address and password.
After you log in to the Emporix Management Dashboard, you can see the welcome page. Go to Orchestration using the navigation menu on the left side of the application. If you are also an Emporix Commerce Engine (CE) customer, you can see all the CE modules in the navigation menu. If you use OE alone, the CE modules are not visible and you have access only to the Orchestration and Administration modules, depending on your assigned role.
If you wonder why you need OE, take a look at the example use cases where OE provides tailored solutions.
Let’s take a look at some examples for wholesale, distribution and discrete manufacturers. Remember, Emporix OE is focused on enhancing both business outcomes and customer experience. The main use cases that you might use OE for are:
Payment assistance: Customer payment options are auto-restricted for buyers with a poor payment track record, encouraging the customer to pay on time while protecting the seller.
Stock assistance: Overstocked products are automatically promoted to ensure working capital efficiency.
Order assistance: Customers are proactively provided with recommended substitutions for products which cannot be fulfilled on time.
Sales assistance: Sales reps are supported with intelligent recommendations for alternative products or suppliers to meet the promise of on-time delivery.
There are many more use cases for particular situations:
A value stream can react to an action (event) happening in internal or external systems.
External events work as triggers for value streams to run or wake up sleeping processes. These events, such as notifications regarding stock levels, quote requests, and many others, are transmitted as authenticated webhooks to a pre-configured Orchestration Engine event-receiver endpoint. When receiving these notifications, the system interprets them and initiates, or wakes, the relevant value streams accordingly. This approach enables real-time adaptation and decision-making, allowing businesses to efficiently manage resources and respond to changing conditions in their operational environment, creating business agility.
OE is schema-less; it utilizes the CloudEvents specification for both receiving and sending events. This framework enables OE to seamlessly gather event information from designated endpoints responsible for transmitting such data. By adhering to this specification, OE ensures efficient and standardized communication, facilitating seamless integration with various systems and endpoints.
For details about events in OE, check the following documentation:
Emporix related:
Orchestration Engine optimizes digitalization processes in your company. Get familiar with the Emporix solution.
Emporix Orchestration Engine is the digitalization layer of the Emporix Commerce Orchestration Platform. It empowers businesses to monitor real-time events across systems, make data-driven decisions, and automate business processes dynamically, without heavy development. By letting you define workflows triggered by events, such as inventory shortfalls, customer behavior, or supplier signals, you can orchestrate actions (such as emails, pricing changes, or stock adjustments) across your stack immediately. OE gives you real agility, as it bridges the gap between what's happening in operations and how the front office (sales, marketing, service) responds. Orchestration Engine enables rapid response to real business conditions, reduced manual labor, more efficient operations, and a more adaptive way to run commerce end to end.
Create integration between OE and your Celonis instance.
OE can gather data retrieved from Celonis and use value streams to improve business processes, which are reflected in Key Performance Indicators. Data retrieved from Celonis is visible in the KPIs & Analytics dashboard.
To configure the connection with Celonis, you need the environment URL and Celonis token.
You can get the URL from Celonis EMS application.
Copy the URL from your Celonis account.
In OE, go to Management Dashboard -> OE -> Admin.
In the Settings tab -> Celonis section paste the URL.
Save your changes.
If you don't have the environment URL, contact the Celonis support team.
In your Celonis account, go to Admin & Settings -> Applications and choose Add New Application -> Application Key.
Enter a key name so that you can identify the key in the future.
Copy the token.
Be sure to save the token safely, as it will not be displayed again.
In OE, go to Management Dashboard -> OE -> Admin.
In the Settings tab -> Celonis section paste the token.
Save your changes.
With the URL and token settings done, you can configure your KPIs & Analytics dashboard. See the documentation.
Prepare the Make instance.
By default, there is always one team created for every Make organization. All users initially belong to this default team. However, you can create additional teams to logically separate Make scenarios and users according to the needs of the business. You can configure OE to point to any Make team you want.
To set the Make Team for a tenant:
Go to OE → Admin → Settings.
Add the name of your Make team.
To join Make and be a member of the Make team, users have to be assigned to the user group that grants permission to access Make.
Learn how to import and export value streams between your tenants.
The Import and Export feature is useful when you want to move your processes from one tenant to another, for example from your test environment to production.
To export a value stream:
Go to the Management Dashboard, and choose the three dots icon next to the value stream you want to export.
Choose Export Process. The value stream gets automatically exported and downloaded as a .json file.
To import a value stream:
Go to the Management Dashboard and choose the Import button at the top right, above the list of processes.
Upload the .json file containing your value stream. You can either drag and drop the file or select it using the standard file browser. If you selected the wrong file, choose Cancel to choose another one. To abandon the entire operation, choose Discard.
To start the import, choose Import Process. The value stream gets automatically added to your dashboard list, with the Last published date showing the import date.
The value streams are imported along with:
Events - the events that you use in your value stream are imported and added to events registry in the tenant
Make scenarios - all the scenarios that are used in your process are imported in the same configurations
Subflows - when you have subflows configured in your process, the linked value stream is imported as well
Conditions - the process is imported together with conditions applied at the steps levels
Currently, the following can't be exported or imported with a value stream:
Cloud function connectors - this is going through an improvement process and will be updated.
If any of your Make scenarios include connections that don’t exist in the target tenant, you need to recreate those connections. For example, if you have a Gmail connection in your test tenant, you need to set it up again in the production tenant.
Learn how to work with events in the Orchestration Engine.
If you use dots in event names, for example tenant.created, you need to correctly reference such event types to ensure seamless integration between Emporix Orchestration Engine and Make or other third-party platforms.
You may need to reference such event types in the following situations:
When using the OE API process run search endpoint.
When using the search process run module in Make.
In other places in Make, when referencing the event type as an attribute of a payload.
To reference an event type with a dot in its name, enclose the event type name in backticks (backquotes) ``:
When using the OE API process run search endpoint, ensure that you enclose the event type name in backticks.
For example:
When working with Make, enclose the event type name in backticks when using the search process run module or referencing the event type as an attribute of a payload.
For example:
In the search process run module:
When referencing the event type as an attribute of a payload:
Using backticks to enclose event type names that contain dots ensures compatibility with Make and other third-party platforms. By following this convention when referencing these event types, you can maintain seamless integration and prevent potential issues during the processing of your workflows.
Orchestration Engine is a powerful tool that helps you optimize and digitalize the company processes.
Emporix Orchestration Engine (OE) enables companies to dynamically optimize their business processes based on real-time, end-to-end process intelligence. It is revolutionary in that it orchestrates people and systems across the business, and even outside of it, based on process intelligence, making it the first process-context-aware solution in the commerce space. This means that whenever you receive signals from your business, or insights about customers' or suppliers' behavior, you can react immediately to improve your business operations.
Emporix OE helps close the gap between the operations of your business and the sales, marketing, and communication channels that extend them out to the customer in the front office. It covers the need to obtain information, work with it, and use it for business improvements. It makes it possible to create data-driven operations with real-time business agility.
Emporix OE orchestrates actions for systems to optimize key outcomes for the business, based on events occurring across your organization. OE is a framework that enables orchestration, planning, and running of actions combined in value streams. They are typically a combination of:
a trigger event that makes the value stream start
a series of steps that are performed after the trigger
Emporix OE also provides a user interface (the Management Dashboard UI) that allows you to design and manage value streams easily and quickly. It enables business analysts working in-house, at consultancies, or at solution providers to build them without the need for development expertise or resources.
Signals from your business to Emporix OE trigger automations and optimization for various business scenarios, such as:
anticipating stockouts
reducing return rates
controlling advertising spend
It triggers an almost instantaneous response to events defined by business rules, proposing optimizations to the line of business, and automatically orchestrating changes with the underlying tools (such as email, marketing, commerce and orders), significantly reducing time to action and manual labor.
Moreover, OE is integrated with CE to support a wide range of commerce-driven business processes.
See how versioning is handled in value streams.
The versioning of value streams is based on a roll-forward approach. This means that with every change, a new version of the process is created. If you want to get back to a previous version, you can do so, but this actually creates the next version of the process.
When you trigger a value stream and it starts to run, you have an instance of that process. If the process is at version 1, that instance is an instance of version 1. Supposing you have 200 running instances at version 1 and you modify your process and save it as version 2, the next instance of that process that you run is a version 2 instance. This does not change the 200 running instances at version 1.
It is possible to have many instances of many different versions all running at the same time. When a new version is created, it doesn't stop the previous ones from running.
Additionally, using the Show Versions function, you can check what the current value stream version is, when it was modified, and by whom. You can also select a previous version of the value stream and go back to it.
Choosing to revert to an older version of the value stream changes the process back to the older configuration, but with a new version number.
View all the events happening in the system in relation to digital processes.
Event Registry is a list of all the events that were configured in your tenant for your OE value stream. The events are the triggers that initiate the processes and make them run.
Events are identified by the ce-type attribute, which indicates that the event has occurred. By aligning the Event Name (e.g. order.created) configured in OE with the ce-type name, you ensure that a value stream accurately receives and processes the event information.
Event Registry lists all the events related to Commerce Engine. The events reflect the actions and processes happening in the Commerce Engine system, which you can use as triggers for the value stream without the need to configure them manually. The pre-configured commerce events facilitate digitalizing the commerce-related workflow processes.
To create a new event:
In OE, go to Events -> Event Registry.
Choose Create New Event.
Specify the Display and Event names.
Display name - it's the name that is visible in the UI when you create a value stream, for example: Order Created.
Event name - it's the name of the registered event, for example: emporix/ce/order.created.
See how OE's value streams integrate with Celonis through Make scenarios, enabling signal-based triggers and action feedback for seamless data exchange.
OE's value stream together with Make scenarios allow for seamless interaction with Celonis' knowledge models and their Intelligence API. You can use the data and signals received from Celonis as triggers for your business value streams. Then, when the processes finish, you can send back a response to Celonis using action feedback.
To configure the data flow between OE and Celonis see the topics:
Value streams are referred to as digital processes in the data flow examples.
See how to proceed with an invalid Make scenario.
If a scenario that you prepared in Make is not valid for your value stream in OE, check whether the following settings are in place:
Start trigger must end with a Start New Orchestration module. Check Events as Triggers.
Mid trigger must have a notification event - Completion Event module. Check Setting a Trigger to Wake Up a Paused Value Stream.
A scenario in a process step must start with the Trigger Event module and have the Completion Event module.
Currently, OE doesn't support scenarios without the Completion Event module. If the module is missing from scenarios used as mid triggers or process steps, OE filters them out and they are not available for you to use.
See how to fix an invalid value stream.
If you created a new value stream but the validation shows that the process is invalid, check the following possible reasons:
Proper configuration and reference to Make scenarios
JSON document syntax errors
It's possible that you have a wrong reference to a Make scenario that was either deleted or modified.
Check whether your value stream references a scenario that has been deleted. If it does, you should see a red error with a Scenario deleted notification message at the location (value stream step) where the scenario was used.
To resolve this error, remove the reference to the deleted scenario from your value stream step.
If a Make scenario was modified, it may no longer meet the requirements of its intended context. In such a situation, the OE UI doesn't show it in the list of scenarios available for your value stream. You're not able to use the scenario in your process steps.
This occurs when:
The scenario no longer sends output event information through the Start New Orchestration module
The scenario no longer sends output event information through the Completion Event module
The scenario no longer receives incoming calls through the Trigger Event Make module
To resolve the errors, make sure the modified Make scenario meets the requirements of the intended context, and check that all the input and output modules are configured properly, with the correct events setup.
Because value streams are designed using the OE UI, syntactic errors don't usually occur. However, if your value stream doesn't work, and you have checked all the OE and Make settings but the errors still occur, you may need to review the JSON document to ensure it adheres to the proper structure and formatting.
Configure an action event that resumes a paused value stream.
A value stream that has already started may need to wait for an intermediate event before continuing with the next steps. While waiting, the value stream goes into a dormant, or sleeping, state.
To wake up a sleeping process instance, you need to post an event with a ce-instanceid (Instance ID) header and the ce-type header that matches the event type that had been configured for the process as a trigger.
ce-instanceid routes to the correct instance
ce-type routes to the trigger
The Instance ID is the ID of a running instance of a value stream. Every time a trigger event is received for an OE value stream, OE generates a new instance with an Instance ID. If the value stream is triggered 100 times, OE creates 100 unique Instances, each with their own ID.
To find the Instance ID header in Make, go to your scenario and check the completion summary of the process that was already run. In the Emporix Orchestration window you can check the Instance ID and copy the value.
Enter the copied value for the ce-instanceid header in your request.
If you send the request with the ce-instanceid and ce-type headers, OE saves the payload and then passes it through to the next Make scenarios that are configured as next steps in your OE value stream. The event type is used as a key. The payload grows with every event that is generated.
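For illustration, such a wake-up request could look like the following sketch. The endpoint URL and all values are placeholders, not documented values: the ce-type must match the mid trigger configured in the value stream, and the ce-instanceid is the value copied from Make as described above. Depending on your endpoint configuration, the request may also need the x-emporix-hmac signature header.
# A sketch of waking a sleeping value stream instance; all values are placeholders.
# ce-instanceid routes the event to the waiting instance; ce-type matches the mid trigger.
curl -X POST "https://<your-event-receiver-endpoint>" \
  -H "Content-Type: application/json" \
  -H "ce-specversion: 1.0" \
  -H "ce-id: 7a1c3e5f-2b4d-4c6e-8f0a-1b2c3d4e5f6a" \
  -H "ce-source: my-erp-system" \
  -H "ce-type: order.stock.checked" \
  -H "ce-instanceid: <instance-id-copied-from-make>" \
  -d '{"orderId": "EON1075", "inStock": true}'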
It is also possible to wake up a value stream with a Make scenario that ends with a Completion Event module configured to send notifications about the completed scenario to OE. If such a Make scenario is part of your value stream steps, it causes the process to wake up and move to the next process steps.
Learn how to get notified about errors.
You can configure your Make organization to receive different kinds of notifications related to your scenarios: warnings, errors or deactivations. You can either receive the notifications by email or check them inside the Make application under Notifications.
To set up your Make email preferences:
Log in to your organization in Make.
Go to My Team -> Team -> Notification Options.
Choose the notification that you want to receive by email.
When it comes to notifications sent from Make, there's no distinction between a failed process and an incomplete process. This is because an incomplete process run is a failed process scenario with errors or warnings. The difference between the two is that with an incomplete process, the scenario is not shut down immediately; its state is saved when the process fails.
We recommend enabling all the notifications in Make, to be sure not to miss any issues.
If everything is correct but you still don't receive the emails, check your spam folder or contact your administrator to check if the messages are not blocked on your side.
Currently, OE itself does not send any error notifications. However, you can check the logs for existing issues.
Get insights into the processes running in the system in the Value Streams Dashboard.
Emporix Orchestration Engine orchestrates business behaviour for systems to optimize key outcomes based on events. Value streams consist of a trigger that makes the process start and of steps that need to be performed after the trigger takes place.
All value streams are executed top-down and run independently from each other.
To access the page, in the navigation menu go to OE → Value Stream.
The page shows a list of all the value streams that are created in your tenant. You can manage the value streams by creating new ones, deleting those which are no longer needed, and activating or deactivating the ones which already exist.
Learn how to establish a connection between OE and Make.
Make modules are components that are used for creating scenarios in Make.
The connection between OE (Orchestration) and the modules in Make is normally created automatically. However, if you add a module in Make when creating a scenario and the connection is not established, you can configure it manually:
Add an Orchestration module in Make, for example a Trigger Event. An Emporix Orchestration
Learn how to modify a value stream in OE.
After the value stream is published, what you see after clicking on it is the latest released version. To open the draft mode for a published value stream, you have two options:
Choose the Draft icon in the dashboard. This displays a window with basic information about the person who edited the draft and the date when the last changes were made. Go to draft opens the latest version of the draft.
Open the selected value stream and then choose the Edit button.
Risk Management
Credit Risk - cancel orders from known bad actors or put the orders on hold for review.
Fulfillment Risk - alert supplier management team of orders delayed by backorders or slow drop ship vendors.
Stockouts
Automatically hide out-of-stock products, then automatically reverse the process when the item is back in stock.
Intelligent Replenishment
Monitor product sales and automate inventory and on-time reordering of low-stock products by combining sales statistics with supplier on-time delivery (OTD) statistics to avoid stock out.
Next Best Option
Recommend products from suppliers/drop ship vendors with better on-time delivery to reduce backorders and accelerate revenue.
Sustainability
Reduce waste by avoiding excessive stock, promote stock that will expire, reduce return rates and recommend more sustainable shipping options. Report on your carbon footprint.
Return Optimization
Understand customers' propensity to return products. Engage them proactively to reduce return rates, for example by testing different strategies, such as punishment and reward.
Advertising Spend
Control advertising spend by automatically suspending/reactivating advertising campaigns based on inventory level and demand.
What is OE?
Get to know the concepts of the Emporix Orchestration Engine.
Management Dashboard
Find out how to access OE and configure your tenant in MD.
Value Streams Dashboard
Find out what the OE value streams are and how they work.
Troubleshooting
Get help and find answers.






Rulestore - the rules set up for your value stream are reflected in the new tenant
Forms - forms used by the value stream are copied to the new tenant as well




`tenant.created`

{
  "eventType": "`tenant.created`"
}

{
  "eventType": "`tenant.created`"
}

{
  "eventTypeAttribute": "`tenant.created`.attribute"
}

Active: order the list by active or inactive value streams.
Name: alphabetical order; you can also search for a specific process by typing its name in the search field.
Last Published: you can sort the processes by the date they were published, or search for a process that was published on a specific date (using the search field).
Running, Failed, Completed and Stopped instances: check how many times the process ran, failed, completed, or stopped. For example, every single time a new order is placed and the “new order” scenario is triggered, it counts as one instance.
Statuses: represent the validation of the value stream blueprints; they are not related to the running instances.
Pending - process configuration is being validated.
Invalid - the process is not configured properly and cannot run.
Validated - the process was configured successfully. The validated status covers both active and inactive states.
Deleting a Value Stream: removes a value stream from the list of available processes. Every time you delete a process, you see a popup window with a request to confirm the action. Deleting a value stream is possible only when there are no running instances of the process.
To open the connection configuration, choose ADD in the connections section. The Create a connection window is displayed.
Enter the Event Source ID and the related Secret. You can copy and paste the ID and the secret from the Emporix Management Dashboard. To find the Event Source ID and the Secret, go to OE -> Admin -> Settings.
Choose the environment with which you want to establish the connection.
Enter the Client ID and the client Secret. You can copy and paste the ID and the secret from Emporix Developer Portal. To find the Client ID and the Secret, go to Developer Portal → OE API Keys.
Choose SAVE to create the connection.
There's a confirmation popup to decide whether you want to load the latest draft version, or overwrite it.
No, load last draft - loads the most recent draft saved for that value stream, which could be for example a draft you’ve been working on over the past few days.
Yes, overwrite it - takes the last published version (the one you are viewing) and makes it a new draft, overwriting and erasing the last saved draft. You start from scratch.
It’s possible for a value stream to be opened in multiple tabs or by different users simultaneously. In such cases, if a user attempts to edit a draft that is already being modified by someone else or in another tab, a warning message appears. This ensures the user is aware they may overwrite changes made by others. The most recent draft is always the last one saved, regardless of how many tabs are open.
If you choose to overwrite the value stream, you become the owner of the draft, and the original editor can no longer make changes. If you cancel and opt not to overwrite, you remain in view mode, allowing the other user or the original editor to continue making changes to the draft.
Every step of the value stream can be edited separately.
For triggers you can:
Choose Edit to select a different scenario or event from the list.
Duplicate to copy the trigger; the duplication automatically creates a next step which is exactly the same.
Delete to remove a trigger. However, it is not possible to remove the first trigger because it’s a mandatory step to start a process.
For process steps you can:
Use the drag and drop (the hamburger icon) to reorder Make scenarios in your process steps.
Choose Edit Scenario to apply changes to the scenario in Make.
Choose Edit to select a different scenario from the list.
Choose Duplicate to copy the step; the duplication automatically creates the next step in the value stream, exactly the same as the original one.
Choose Delete to remove a step.

The storage of incomplete processes is not enabled by default. To enable it, go to Your scenario -> Scenario Settings.
If the storage of incomplete processes is not enabled, the scenario stops and all instances of process contexts are blocked. With this setup, the overall outcome is that you have a queue of process context instances, which are blocked at a specific scenario step, until the scenario is manually reactivated.
If the storage is enabled, you can check the records of the incomplete processes at any time in the Incomplete Processes folder. You can use the record to reprocess the data. The incomplete processes need to be resolved to unblock the specific process context instance.
The consecutive errors feature helps to separate sporadic issues from scenarios that are fundamentally broken. Using the advanced scenario settings, you can customize the number of failed runs before a scenario gets deactivated.



Get insights and metrics about value streams in KPI Dashboard.
The KPIs & Analytics dashboard gathers high-level business information using KPIs from Celonis and the business outcomes of your running value streams.
The dashboard is divided into two sections:
KPIs - data showing KPIs retrieved from Celonis EMS.
Value Stream Effectiveness - graphs showing the business outcomes of your running value stream.
To start with the configuration of the KPIs and Analytics dashboard, configure your connection with Celonis. See the documentation.
The KPIs & Analytics dashboard can be configured to your needs and the data that is most important for you to see and monitor. To start the configuration of the dashboard, go to OE -> Admin -> KPIs & Analytics.
To configure business KPIs:
In the Business KPIs section, you can configure up to 4 KPI tiles that display your selected data.
To set up a specific KPI, open its edit mode by choosing the edit icon.
Enter the Display name for the data that should be displayed in the tile.
To configure the process effectiveness visualizations:
In the Process Effectiveness section, you can configure up to 4 graphs that can be used in the KPIs & Analytics dashboard to view the results of the running executions and their effect on business outcomes.
Enter the Display name for the data graph.
Choose the related value stream.
Complete the following KPI data sources configuration:
Celonis Knowledge Model
Task inbox serves as a communication point for getting information from external parties.
Many value streams require an interaction with an internal user or an external party, such as a supplier or customer. This is often referred to as human-in-the-loop. For example, in one step of a value stream you send a form to your supplier asking for some information, and in another step you use and process the information provided by the supplier's form submission, or pass it to an internal user to review and approve.
In the first step, a workflow task is created for the supplier and assigned to a magic link. To consolidate all the tasks assigned to a single recipient, and to aggregate email notifications related to different value streams, you can point the external party to their Task Inbox, which lists all the tasks assigned to them. You can achieve that by generating an access link that you can share with the relevant person.
Internally, employees using the Management Dashboard can easily access their Task Inbox, which is integrated within the dashboard, promoting easier access and increased efficiency.
For internal users, the Task Inbox is conveniently integrated within the Management Dashboard, allowing employees to access their tasks more easily and work more efficiently. By eliminating the need to switch to external systems, users can manage all their tasks directly from one centralized location. The Task Inbox is accessible via the top panel, providing a seamless experience within the dashboard.
The Task Inbox is also available externally for users who do not have access to the Management Dashboard. The external users can access their Task Inbox with access links generated for them in Management Dashboard.
To generate an access link:
Go to the Admin module, and open the Access links tab.
Choose Create access link.
Select the recipient from the list and pick the link expiry date.
Choose Save to create an individual link. The link ID number appears after successful creation.
The Task Inbox is available outside the Management Dashboard, so you can share the link with an external user. Each user gets a unique access link that allows them to view their inbox without authenticating or logging in to the Management Dashboard first.
For each access link, you can assign an expiry date upon creation. For security reasons, you might want to withdraw access earlier, for instance when a user leaves the company.
Go to the Admin module, and open the Access links tab.
Pick the email of the user whose Task Inbox access you want to revoke and choose the Trash icon.
From this point on, the person is no longer able to access their external Task Inbox, and you can see the revoked status.
Learn how you can use Cloud Events in OE.
OE uses the CloudEvents specification when receiving or sending events. OE uses it to receive information about an event from the related endpoint that is responsible for passing the event information.
OE uses CloudEvents specifically with the HTTP Protocol Binding and Binary Content Mode; currently, only application/json is supported as the content type for the data payload.
To set up OE to work with cloud events:
Make sure the authentication works properly and that the relationship between the endpoint you use for receiving events and OE is correctly established.
To create the relationship, provide the source and secret values that are configured for your event-receiver endpoint and used as a signature in every request that is sent.
To run a value stream, start a trigger that is specifically configured to set off the process. The trigger for the first step must specify values for the following headers, which are CloudEvents attributes adopted by OE:
ce-source - source of the environment sending the cloud event, you can enter any source here but it should clearly identify the application that is sending the event
ce-type - type of the cloud event
The event type that is configured as a starting point cannot have the ce-instanceid (OE Instance ID) defined.
If you have a ce-instanceid header defined, the start trigger ignores it, as the header is used only for restarting an existing value stream instance. Moreover, ce-instanceid is bound to one value stream; you cannot have the same ID defined and used in triggers for another value stream.
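As a sketch only, a start-trigger request might look like this. The endpoint URL, header values, and payload are illustrative placeholders, not documented values; use your tenant's pre-configured event-receiver endpoint and an event type registered for your value stream.
# A sketch of a start-trigger request; all values are placeholders.
# Note: no ce-instanceid header - a start trigger must not define one.
curl -X POST "https://<your-event-receiver-endpoint>" \
  -H "Content-Type: application/json" \
  -H "ce-specversion: 1.0" \
  -H "ce-id: 2f9d6e84-7c1b-4a2e-9d3f-5b6a7c8d9e0f" \
  -H "ce-source: my-inventory-system" \
  -H "ce-type: order.created" \
  -H "x-emporix-hmac: <signature-of-the-request-body>" \
  -d '{"orderId": "EON1075"}'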
Running an OE value stream allows the orchestration of multiple Make scenarios. Every Make scenario is composed of various modules that gather data one after another. The first module is always an input for the second module, which creates output for the next one, and so on. While a process instance is running, it can be configured to wait for another event before moving to the next step. In that case, the process goes into a sleeping state and waits for the next event type.
To wake up a paused process with an event, and make it move to the next step, you need to send an event that wakes up the trigger. You have to have a unique ID (ce-id) for the event and the ce-instanceid (OE instance ID) defined. OE instance ID routes the event to the proper sleeping process instance that is waiting.
You can send many events using this field, and they are all routed back to the proper value stream instance.
Track and monitor operations happening in OE.
Event logs are records that track important actions and incidents in a system or network, like errors or user activities. They help to monitor the system by providing details on when, where, and what happened.
In the OE Event Log main view, you can check records of all the events that were received for your value streams. You can search the results by the Request ID for which the event occurred. You can also see when the data was received and whether the call was successful.
Request ID - it's an identifier of the signal sent to OE to start a value stream, for example: req_NkMdGRM1paSsrootCUhf. You can find the Request ID in the debug mode in the Webhook logs of the trigger:
To see more detailed logs, select the event you're interested in.
Then you can see a popup window with details including:
ID - it's the ce-id that is sent in a request, it represents the unique identifier for the event instance.
Source - information about where the event occurred, for example: emporix/oe.
Type - the event name; it can be checked in the Event Registry, for example: order.stock.checked.
The log details also allow you to get all of the event information in cURL format; use Copy as CURL for that.
Get familiar with Value Streams and how to configure them.
The Emporix Orchestration Engine orchestrates business behaviour for systems to optimize key outcomes based on events. OE provides a framework that enables the orchestration and execution of actions combined in Value Streams.
The value streams consist of an event that makes the process start, and of the next steps that need to be performed after the event takes place. Emporix OE provides a user interface, the Management Dashboard, that allows you to design and manage value streams with ease.
To start working with the value streams, make sure you have all the prerequisites and configurations completed as described in the following sections of our documentation:
To open the value stream view in Management Dashboard, simply click on the selected value stream that you wish to open.
The view shows every step of the value stream, starting with a trigger and then proceeding with the subsequent process steps or mid-triggers. This is where you can begin working on your new value stream or edit an existing one. In each value stream you view, you can also perform the following actions:
Edit - opens the draft mode of a value stream
Show versions - you can view all the versions of your value stream and switch to a previous one. To see how versioning of value streams works in detail, see the versioning documentation.
Go to Debugger - this allows you to move directly to the debug mode to check all the logs of your value stream instances. For details, see the debugger documentation.
Data transferred through the system is kept for a specific period of time. See what the binding policy is.
Learn how you can generate logs.
In case of UI issues, you can contact the Emporix Orchestration Engine support team at [email protected]. The team may ask you to provide a HAR (HTTP Archive format) file with logs generated from your browser. Generating a HAR file helps us identify the issues. The HAR file is stored in JSON format and contains information about the web browser's interactions with a web server.
You can generate a HAR file from any browser; here are some examples.
To maintain privacy and ensure no sensitive data is stored, it's best to create your HAR file in your browser's private mode.
To create a HAR file in Google Chrome:
Open the Google Chrome browser and go to the page which you want to record.
To open the debug pane, press the F12 key on your keyboard, or choose the three-dots icon on the right of the toolbar and go to More tools -> Developer tools.
Choose the Network tab.
Ensure that Chrome is recording; you should see a confirmation message in the tab and a red button indicating that a recording is already in progress. If not, choose the Record button to start recording.
To create a HAR file in Safari:
Open the Safari browser and go to the page which you want to record.
In the Develop menu, choose Show Web Inspector. If you don't see the Develop menu, you can enable it in your browser's advanced settings.
Go to the Network tab.
Reproduce the issue on the page where it occurred.
To create a HAR file in Microsoft Edge:
Open the Microsoft Edge browser and go to the page which you want to record.
To open the debug pane, press the F12 button on your keyboard, or go to Tools -> Developer Options -> Network.
Clear the session history by pressing the button indicated by three lines and a red "X".
Ensure that Edge is recording; you should see a confirmation message in the tab and a red button indicating that a recording is already in progress. If not, choose the Record button to start recording.
To create a HAR file in Firefox:
Open the Firefox browser and go to the page which you want to record.
To open the debug pane, press the F12 button on your keyboard, or go to Tools -> Developer.
Go to the Network tab and choose Persist Logs. You can find the Persist Logs option under the gear icon, in Network Settings.
See the details of OE firewall configuration.
If you are experiencing any access issues when using Orchestration Engine, make sure your network is not blocking HTTP traffic to the following Emporix, Celonis, and Make domains and subdomains:
Emporix: emporix.io
Emporix API:
Emporix Developer Portal:
Emporix Management Dashboard :
Emporix Forms UI :
Emporix SSO:
Emporix Documentation:
Emporix Inbound Webhooks:
Emporix Outbound Events: - required for DCP customers only
Emporix Corporate Website:
To start working with , make sure to also allowlist its IP: 34.128.182.253.
Celonis:
If you work with Celonis, make sure the Emporix IPs are allowlisted, as documented by the company.
Emporix Make Environment:
Make:
Integromat: - additional access is needed for any Make Apps using OIDC Authorization Protocol
To allowlist the connection to and from Make in your internal IT system firewalls, also use these IP addresses:
Outgoing = 63.176.242.240, 35.159.75.87, 3.72.162.92
Incoming = 63.176.80.5, 3.73.150.249
Create and store business rules that govern the value streams.
The Rulestore is a repository for creating and managing static business rules to be used in value streams. They are not active until referenced by a value stream. The business rules can be reused multiple times, in different value streams. A business rule can be changed without changing the value streams that reference it; they are completely independent. Business rules can be created by users with the OE Rule Editor and OE Admin roles.
You can filter the rules by ID and name. Additionally, you can also see which value streams are actively referencing each rule.
If you choose the Go to Value Stream button, the value stream opens in a separate tab. Business rules cannot be deleted if any value stream is using the rule.
See what you can get with the Orchestration Engine.
Emporix OE leverages EMS insights around the order management process, and orchestrates and triggers actions in other systems across your business. You can immediately begin to improve the experience of your customers and directly impact your revenue. OE delivers real-time responses to signals from the back office, resulting in faster time to action for the business and the ability to positively affect the experience for the customer and the business at just the right moment.
In commerce, optimizing outcomes for these two priorities is challenging given the commerce process’ highly dynamic and decentralized nature. Front-office problem solving for customer satisfaction and customer experience doesn't typically take into account back office processes and problems; the two are typically separated, creating unnecessary friction.
Secondly, it is impossible for organizations to react to all developments with the intelligence, speed and scale required in order to maximize both revenue and the quality of customer experience.
A value stream is built of triggers and steps. Familiarize yourself with how they can work together.
The Orchestration Engine value streams are made up of a combination of triggers and process steps. A value stream must have at least one trigger.
Triggers are used to start new instances of a process, or resume an existing one that was paused. There are two types of triggers: events and scenarios.
Set conditions when to run a value stream and its steps.
For every trigger and step of your value stream, you can add a condition that is used by the process context to decide whether a particular action should be processed or not. You can set up a filter or a rule that serves as conditional logic for the step. All conditions are evaluated by the Decision Engine to affect the process.
Whenever you have conditional logic applied to a trigger or a step, you can check the condition in the debugger and evaluate the outcome of its execution or non-execution.
To be able to set a filter or a decision, you first have to have a trigger or a process step created. Then, the +Add logic button appears.
Process context plays a significant role in managing value streams. Learn the details.
The process context is an important part of the Orchestration Engine system as it holds all the external information regarding the processing of a specific value stream instance.
A process context is a JSON object that consists of three primary sections:
Context: contains all the event-types received during the value stream run, their associated payloads (if any), and the initial payload when the value stream is started.
Metadata: provides additional information about the process context, such as the tenant identifier.
Build and use a datastore to keep your data for easy reuse.
You can use a datastore for use cases like listing active users, managing workspace settings, lookup tables, and others. Data from a datastore can be fetched by its namespace, using a dedicated Make module or through the API (see the sketch after the list below). The key specifications for datastore interaction include:
authentication through Magic Links
support for CSV data files
text search support
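As a purely hypothetical sketch of the API option (the host, path, query parameter, and auth header below are illustrative placeholders, not the documented Emporix API), fetching records from a datastore by its namespace could look like this:
# Hypothetical example - the endpoint path and parameters are placeholders.
curl "https://<emporix-api-host>/datastore/<tenant>/<namespace>/records?q=active" \
  -H "Authorization: Bearer <access-token>"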
The steps below demonstrate an example of how to use the datastore functionality and test it with an end-to-end flow:
You’ll typically see pain points such as:
a mess of digital commerce/marketing/back-end tools
fragmented and disconnected end-to-end digital sales processes
slow-moving stock, tying up working capital
fast-moving stock, creating frequent stock out
low on-time delivery rates from suppliers/drop-ship vendors
high manual labor to introduce, improve or run end-to-end digital commerce business processes
high order fulfillment costs, long order-to-cash cycles, high return rates
Emporix OE is a modern solution orchestrating front-office business processes based on Celonis process intelligence and the insights that it provides. This approach to business solutions is unique and efficient. It makes it possible to directly tie business KPIs and outcomes to customer interactions, and to improve those interactions with the data flowing through the business.
Including OE in the management of your organization's processes adds business value at many levels. Our value streams can improve different aspects of your business at the same time, for example:
reducing customer acquisition cost
improving operational efficiency
growing revenue and working capital
lowering labour costs
Here are some specific examples of business aspects and the customer value that you can address with OE:
Promote excess stock
Reduced working capital - excess stock is identified and sold to reduce the bound capital.
Avoid the promotion/selling of products with low stock or long lead times.
Reduced CAC - customers who click on ads of products no longer available incur cost without buying.
Shorter order to cash cycle - orders with no delays do not cause delays in payments.
Prevention of lost revenue - avoids customers leaving to procure product from an alternative vendor.
Minimize cancellation rates by providing intelligent delivery times driven by supplier performance.
Increase customer satisfaction - customers are closely engaged increasing their LTV and orders are delivered on-time, leading to higher overall customer satisfaction.
Reduced CAC - expensive win-back campaigns are avoided.
Increased revenue - more accurate delivery times allow sellers to manage customer expectations and reduce order cancellations, increasing the revenue retained.
Automated Re-Ordering - remind customers to buy on time, based on their consumption pattern and current order lead times.
Increase customer satisfaction - customers are closely engaged increasing their LTV and orders are delivered on-time, leading to higher overall customer satisfaction.
Increased recurring revenue - create a recurring revenue stream without subscriptions.
Self Service Return - automate and improve the cost effectiveness of your return process.
Reduced cost of handling for returns - make automated decisions on how to handle and how much to charge for a return.
Reduced return cycle times - improve customer satisfaction whilst reducing the cost and shortening the time of handling a return.
Supplier Promotions - promote products with kick backs/marketing funds from suppliers or manufacturers.
Increased or diversified revenue - establish or improve an additional revenue stream.
Reduced procurement costs - push products to achieve a different discount tier when buying from a supplier.
Emporix OE is designed as a low-code solution, making it possible to deliver automated solutions fast, without needing developers. The platform can still be used by developers as well as by business or marketing teams.
By requiring fewer technical skills, OE improves time to market and business delivery. To use the solution, you don't need multiple engineering teams focusing on integration and automation; let OE do the work.
| Item | Limit / Retention | Description |
| --- | --- | --- |
| Outbound emails | No limit | Make scenarios do not impose restrictions on the quantity of outbound emails sent. However, keep in mind that some limitations may be set by your setup or infrastructure, such as infrastructure/throughput availability, third-party API limits, or a limit related to the account used for a specific application. |
| Webhook events | 7 days | The duration within which Emporix stores event data sent to the Event-Receiver Endpoint. All events are stored, including unprocessed ones and those failing authentication. |
| Process context | No TTL for the lifetime of the tenant | The duration within which you can access data stored inside a process context. Data is deleted when the corresponding value stream is deleted. |
| Submission data | No TTL for the lifetime of the tenant | The duration within which you can access submitted form data. Data is deleted when the corresponding magic link is deleted. |
| User audit logs | No TTL | The duration for which Emporix maintains audit logging information. |
| Make scenario history log storage | 60 days | The duration within which you can access detailed logs for each execution in the Make scenario history. |
| Make scenario audit log storage | 12 months | The time during which the platform retains audit logs for Make scenarios. |
| Maximum Make scenario execution time | 60 min | The maximum time for which a single Make scenario can run. If the elapsed time surpasses this limit, the Make scenario results in a failure with a timeout error. |
| Webhook queue size | 10k executions | Incoming webhooks are accumulated in a webhook queue. The queue may grow when the corresponding Make scenario, responsible for handling the webhooks, is either disabled (for example, due to an error) or processes the webhooks at a rate slower than their incoming pace. |
ce-specversion - version of the cloud event specification (for example, 1.0)
ce-id - unique ID of the cloud event; the combination of ce-id and ce-source should be unique
x-emporix-hmac - the HMAC signature, which is the payload of the cloud event body signed with the OE webhook secret
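For example, the signature could be computed as in the sketch below. The exact algorithm and encoding are not specified on this page, so HMAC-SHA256 with hex output is an assumption; confirm what your event-receiver endpoint expects.
# Hypothetical sketch: HMAC-SHA256 and hex encoding are assumptions.
SECRET='<your-oe-webhook-secret>'
BODY='{"orderId": "EON1075"}'
# Sign exactly the bytes of the request body you are going to send.
printf '%s' "$BODY" | openssl dgst -sha256 -hmac "$SECRET"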
Link the value stream that is configured to work with the KPIs data.
Complete the following KPI data sources configuration:
Celonis Knowledge Model
Celonis Record
Celonis Record Field
Preview format
To save your configuration, choose Apply.
Celonis Record
Celonis Record Field
Choose Apply to save your configuration.










Choose Preserve Log.
Clear any existing logs by choosing Clear network log.
Reproduce the issue on the page where it occurred.
Stop the recording by choosing the Stop recording (red square) button.
Right-click anywhere in the list of network requests and select Save all as HAR with content.
Save the file to your computer.
Attach the HAR file to the OE support case.
In the Name column, right-click the file where the issue occurred and then choose Export HAR.
Save the file to your computer.
Attach the HAR file to the OE support case.
Reproduce the issue on the page where it occurred.
Stop the recording by choosing the Stop recording (red square) button.
Right-click anywhere in the list of network requests and select Save all as HAR with content.
Save the file to your computer.
Attach the HAR file to the OE support case.
In the Network tab, right-click the file where the issue occurred and then choose Save as HAR.
Save the file to your computer.
Attach the HAR file to the OE support case.
















Time - timestamp showing when the event occurred, for example: 2024-04-29T13:48:13.992+00:00
Instance ID - ID of the value stream instance where the event occurred, for example: 01HQRB60P23FQ5DV8X8SPH4NED-01H7YP07GX9TFBK2S0RRRT11WY-01HWN1SXGG90FP1M6HWWZ9Y5JR.
Scenario - the name of the Make scenario for which the event is configured and where it occurred, for example: Send Quote (1343). The copy button copies the scenario link.
When you click the scenario link in the logs, it opens the scenario in Make.
Scenario Run ID - the ID of a specific scenario execution where logs are stored, for example: abb589c4c9cd4cdbbe428f949d3d980a.
When you click on the run ID link, it opens the logs of a specific scenario execution in Make.
HMAC - details of the keyed-Hash Message Authentication Code secret.
ce-specversion - the version of the CloudEvents specification with which the event is compliant, for example: 1.0.
User-agent - Emporix Orchestration/production
Payload - event payload details, for example:
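A payload here could look like the order.found payload from the process context example below:

{
  "orderId": "EON1075"
}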



Status: represents the current status of the process context.
Here's an example of a process context in JSON format.
The context section holds all the event types received during the value stream run and the payload associated with each of them, if any. It also contains the initial payload when the value stream is started. The following keys are present in the context section:
executionTemplateID: The unique identifier for the value stream.
executionTemplateVersionID: The unique identifier for the specific version of the value stream.
instanceid: The unique identifier for the instance of the value stream run.
processid: A unique identifier for the workflow. It combines the tenant ID, value stream name, value stream version ID, and an additional unique identifier.
Event types are keys within the context object that represent specific events that occurred during the value stream run. Each event type may have an associated payload.
In the example provided, product_assigned_discount_category is an event type with the payload:

{
  "discount_category": "vip"
}

This means that when the value stream run received a product_assigned_discount_category event, the event carried a payload with discount_category set to vip.
The metadata section provides additional information about the process context. The example contains the tenantkey, which represents the unique identifier for the tenant associated with the process context. You can also see the createdAt and updatedAt fields with timestamps, showing when the process context for a given value stream instance was created and updated.
The status section indicates the current status of the process context. The valid statuses for the process context are Started and Finished. In the provided example, the status is finished, meaning the value stream run has been completed. Once the value stream run is completed and the process context status is finished, it is no longer possible to write to this process context.
The process context is a critical component that enables tracking and managing the processing of value stream versions. By understanding its structure and the information it holds, developers and users can effectively interact with it and monitor it.
"root":{
"globalAvailabilty":[
0:{
"id":"15536937"
"globalStock":40
"orderedAmount":1
}
1:{
"id":"20635992"
"globalStock":12
"orderedAmount":1
}
]
} {
"metadata": {
"tenant": "01H7YP07GX9TFRK4S0RRGZ11WT",
"createdAt": "2023-10-16T10:13:42.322Z",
"updatedAt": "2023-10-16T10:14:09.354Z"
},
"context": {
"executionTemplateID": "01HCD6ZYTDSBZ6KS1GNQYX529N",
"executionTemplateVersionID": "01HCJ2VRN61N7J5WQSYQ9Z1JEZ",
"instanceid": "691a4c2c-595b-4b3c-9cf3-0471f6e074c8",
"metadata": {
"externalID": "",
"scenarioid": ""
},
"order.found": {
"orderId": "EON1075"
},
"order.spotbuy.pending": {},
"order.spotbuy.received": {
"contact": {
"companyId": "",
"companyName": "",
"id": "",
"language": "en",
"name": "",
"value": ""
},
"contacts": [],
"forms": [
{
"metadata": {
"formId": "01HAPCT5KS0T7METVDM4GBM0MC",
"formName": "PO Form"
},
"payload": {
"businessNeed": "Spot Buy",
"expectedDeliveryDate": "2023-10-18T09:00:00.000Z",
"item": "30610403",
"itemCost": 112.79,
"orderDate": "2023-10-16T10:13:42.183Z",
"purchaseOrderNumber": "EON1075-30610403",
"purchaseSum": 360.928,
"quantity": 4,
"requesterName": "Emporix",
"requestorEmail": "[email protected]",
"select": "pending"
}
}
],
"magicLinkId": "01HCVZM8QK3RNNJ14FG1WBBT92",
"submission": 1
},
"processid": "01HCJ2VRN61N7J5WQSYQ9Z1JEZ-01H7YP07GX9TFBK2S0RRGZ11WY-01HCVZM98AP21QG53MEBRE9XVM"
},
"status": "finished"
}

There are two options when creating a business rule - simple and advanced mode:
Simple - this is a wizard interface that lets you create rules without needing any technical understanding of the underlying expression language
Advanced - lets you create a rule using a comprehensive expression language called JsonLogic
It's possible to add and edit the same rule in both modes as long as the expression can be represented in the simple mode. If you create a more complex, advanced expression, it cannot be viewed in the simple mode.
In the Management Dashboard, go to OE -> Rulestore.
Choose the Create Rule button. Simple mode opens by default.
First, the Name block is activated in the wizard. Enter the name for the rule in the Rule Name field; you can see it displayed in the block. This name is important because it is used as the variable name when the rule is referenced by a value stream.
Add a Description for the rule. Use this to help you select the correct rule to reference in your value stream.
Choose a Unit for the rule. Units can be of different formats - boolean, string, number, date or list.
Enter the rule Expression. You can choose from a list of predefined expressions depending on the selected unit - for instance, whether the value of a number should be less than, greater than, or equal to a specified value.
In the Management Dashboard, go to OE -> Rulestore.
Choose the Create Rule button.
Go to Advanced mode and enter a name for the new rule.
Add a Description for the rule. Make sure it reflects the rule's behavior so that you don't mistake it for a different rule when you reference it in your value stream.
Enter a rule expression. The rule should be created in a JsonLogic format.
A JsonLogic expression for the above order rule:

{">": [{"var": "order"}, 1000]}
If you add a JsonLogic expression to create a rule, a syntax validation takes place. If the expression is not a valid JsonLogic format, you get an error message and the rule cannot be created.
You can define a variety of rules for your business purposes; they can deal with orders, work processes, effectiveness, and many other areas.
As an example, you can define a rule that works as a condition that filters return reasons. In this rule, the only reasons to approve a return are a damaged product, wrong price, or a wrong product.
You can set up a rule that will work for products delivered only after a specified date and time.
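For illustration, these two rules could be expressed in JsonLogic roughly as follows - a sketch in which the variable names returnReason and deliveredAt are made up for the example:

// return approved only for selected reasons
{"in": [{"var": "returnReason"}, ["damaged product", "wrong price", "wrong product"]]}

// product delivered only after the specified date and time (ISO 8601 strings compare lexicographically)
{">": [{"var": "deliveredAt"}, "2024-01-01T00:00:00Z"]}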

The value streams can be started by external events. These events can be sent to a configured OE event-receiver endpoint, for instance about stock levels.
The list of configured events is available in the Event Registry section of the Events module. You can check the list of all the events that were created for your tenant and filter them by a display name, or by an event name.
You can use Make scenarios as triggers for your value stream.
Choose a scenario that is already created and visible in the list, use Create New Scenario to create a new trigger scenario, or Go to Scenarios to check for scenarios that are available publicly or for your team only. The Create New Scenario and Go to Scenarios actions open a new tab with the Make application. For example, you might use a Make scenario to pre-process data before initiating the value stream and passing the data through.
Timer event serves as a flexible mechanism to schedule and control execution of a value stream. It supports a variety of trigger configurations to manage diverse timing and scheduling requirements. Timer events can work both as starting triggers, when they initialize a value stream, and as the wake-up triggers placed between the value stream steps.
Process steps are made of Make scenarios, Cloud Functions, or other value streams that can be combined to create one automated end-to-end process.
You can use one or more Make scenarios in a process step. If you use multiple scenarios, the process runs the scenarios sequentially according to the process plan that is prepared in a value stream, from top to bottom.
You can choose a Make scenario that is already created in Make and visible in the list in the process step. If you don't want to select an existing scenario, choose Create New Scenario to create a new process step scenario, or Go to Scenarios to check for scenarios that are available publicly or for your team only. The Create New Scenario and Go to Scenarios actions open a new tab with the Make application.
If you need to create a more complex value stream, you can embed one value stream into another by adding it as one of the process steps. All the value streams are separate processes, but they can also work together in a parent-child dependency. This is useful if you want to address a larger business use case. In such a situation, you can embed processes within one another without the need to run them separately.
A Cloud Function is a small, event-driven piece of code that runs in the cloud without requiring server management. It's designed to execute specific tasks in response to triggers like HTTP requests, file uploads, or database changes. They can be used for various purposes, such as processing real-time data, automating workflows, handling API requests, or responding to IoT events.
When used as a process step in a value stream, Cloud Functions let you pass data from the process context to the selected Cloud Function for processing; when the step finishes, the data comes back into the process context. Proper authentication between OE and the Cloud Function service may be required. The execution of the value stream calls the relevant Cloud Functions defined in the process steps. The supported Cloud Functions include Azure Functions, AWS Lambdas, and Google Cloud Functions.
Filters allow a step to only act on a selected event type. It is possible to add more than one filter for the same step. If you set filters for specific event types within a step, it means that the value stream step will run only if the conditions of the filter are met. If they are not met, the process step is skipped. Filters use the event types that are configured for your OE tenant.
To open and configure the filter settings window:
Choose Add Logic in the chosen process step.
This opens a window with the Filter tab visible.
Choose an Event Type from the drop-down list.
The list shows event types that are configured for the tenant and available for the scenario context. It's possible to select multiple event types at one time.
Choose the IN or AND event matching:
IN - the condition passes if the event type matches at least one of the selected events from the list. The behavior is based on matching any of these events.
AND - the condition passes only if the event type matches all the events selected in the list; the events accumulate across steps. The behavior is based on matching every event in the list.
For example:
If you select IN with welcome_pack_completion_event and registration_event selected, the filter passes if either one happens. If you select AND with the same list, the filter only passes once both events happen.
Decisions evaluate whether an action should be taken or not. They allow setting up a condition for the step that is evaluated with true / false logic.
To open the decision window choose Add Logic in a chosen process step and go to the Decision tab.
For the decisions, we use the JSON logic expression language and extend it with custom operators. The JSON logic expression can be either a direct expression placed in the field, or it can refer to a rule created in the Rulestore.
Custom operators:
businessRule - References a rule defined in the rulestore.
The first expected value is the ID of the rule that you want to use; the others are the variables that should be available to the referenced business rule, given as key-value pairs.
Rulestore:
// ID: "01ARZ3NDEKTSV4RRFFQ69G5FAV"
{"==": [{"var": "a"}, {"var": "b"}]}
Logic:
{"businessRule": ["01ARZ3NDEKTSV4RRFFQ69G5FAV", ["a", 1], ["b", {"var": "n"}]]}
Data:
{"n": 1}
Result:
true
context - This is used to access a dynamic value inside the process context in a way similar to the var operator. If access to a dotted key is needed, the dotted key should be quoted with backquotes (`).
Logic:
{"context": "`key.with.dots`.boolean"}
Process Context:
{"key.with.dots": {"boolean": true}}
Result:
true
This operator is valid within the value stream logic. Since the process context does not exist within the Rulestore, the necessary variables must be included as parameters.
Extended behavior:
var - Inside a value stream, the data for the expression is the process context nested under the context parent key.
For example, if the process context is:
{"event": {"key": "value"}}
Then the data for the JSON logic expression would be:
{"context": {"event": {"key": "value"}}}
Dotted keys must be backquoted to work:
Logic:
{"var": "context.`key.with.dots`.boolean"}
Process Context:
{"key.with.dots": {"boolean": true}}
Result:
true
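To see how such expressions evaluate, here is a minimal sketch using the open-source json-logic-js package; note that OE's custom context and businessRule operators are OE extensions and are not part of the base library:

// npm install json-logic-js
const jsonLogic = require("json-logic-js");

// inside a value stream, the process context is exposed under the "context" parent key
const data = { context: { event: { key: "value" } } };

// standard JsonLogic "var" access with dot notation
console.log(jsonLogic.apply({ var: "context.event.key" }, data)); // -> "value"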
Start with populating a datastore through the Set a Datastore from a CSV Upload module in Make. The supported file format for the data is CSV. To pull the CSV file, you can create a scenario in Make with relevant modules.
For example, for fetching the file you can use the HTTP module and then link it to the Set a Datastore from a CSV Upload module.
Add a URL to your file in the HTTP module.
Configure the Set a Datastore from a CSV Upload:
Add a name for the Namespace - it's used for organizing and separating data and binds the data extracted from the CSV file, for example: active_users.
Enter the name of the Key Column - it's the key for each imported entity, for example: id.
Select the HTTP - Get a file option.
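For illustration, a CSV for the active_users namespace could look like this sketch - the columns other than the id key column are made up:

id,firstName,lastName
1001,Nathan,Smith
1002,Anna,Jones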
A common use case is to use datastores to populate select boxes in forms - create a form to access the values from the datastore.
Go to Management Dashboard -> OE -> Forms and choose Create form.
Choose the Dropdown list component and go to the Data tab.
Datastores are authenticated with Magic Links. Add the Create Form Magic Link ID module to a Make scenario and, in Included Forms, select the form that you prepared for the datastore search. The module creates a link to the form that is then used in the List Values in a Datastore module. You can add that module in the same scenario or create a new one just for searching the data.
Add the List Values in a Datastore module.
Enter the Namespace that was set up in the Set a Datastore from a CSV Upload, in our example it was active_users.
Choose the Magic Link ID from the Create Form Magic Link ID module.
Run the scenario. If the run is successful, you can see the search results in bundles, in the List Values in a Datastore module output. See the returned demo data:
The search behaves as a text match: if the first name is the same as the last name, or part of it, the search returns results for both.
Whenever you want to update the stored data, simply upload a new file with the changes. After a new upload, the old data is overwritten with the new data. To remove all the data, upload an empty file.
The whole example can be viewed in the following flow, but it can also be separated into uploading data to the datastore (with the Get a File and Set a Datastore from a CSV Upload modules) and later searching for data within the configured datastore (with the Create Form Magic Link and List Values in a Datastore modules).
Get inspired by this example of a value stream to see how digital solutions can improve your company's workflows and eliminate bottlenecks.
The Shift High Stock Product value stream was created for demonstration purposes. In this example, Orchestration Engine uses the Emporix Commerce Engine API to perform some of the actions, but it can work with any other systems. There's no dependency between Emporix OE and Emporix CE platforms.
The value stream is an example designed to meet the demand for maintaining optimal stock levels. It is prepared to identify and prioritize products with a high stock level, ensuring their successful promotion and preventing any issues with overstock.
The value stream consists of a trigger and subsequent process steps that end in promoting a product and sending a post action feedback to Celonis.
Solve errors with the Value Streams Debugger - it helps you solve problems with value streams on your own.
{">": [{"var": "order"},1000]}// ID: "01ARZ3NDEKTSV4RRFFQ69G5FAV"
{"==": [{"var": "a"}, {"var": "b"}]}{"businessRule": ["01ARZ3NDEKTSV4RRFFQ69G5FAV", ["a", 1], ["b", {"var": "n"}]]}{"event": {"key": "value"}}{"context": {"event": {"key": "value"}}}{"var": "context.`key.with.dots`.boolean"}
Add the URL: https://api.emporix.io/.../data-store/active_users/values
Add the Request Headers:
First - key: X-Magic-Link, value: _MAGIC_LINK_ID_.
Second - key: X-Tenant, value: _TENANT_ID_.
Add the Search Query Name - it's used for refining the results, for example: search.
Leave the Search Threshold at the 0.3 value.
Choose the search type - for example, you want to search for users called Nathan.
Add a limit for your search - for example, you want to limit the number of active users called Nathan to 250 results.
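Outside of Make, the same datastore search could be called directly over HTTP. A minimal sketch, assuming the full API path matches the elided URL above and that the search query name and limit are passed as query parameters:

// hypothetical values - replace with your tenant's real data
const MAGIC_LINK_ID = "_MAGIC_LINK_ID_"; // from the Create Form Magic Link ID module
const TENANT_ID = "_TENANT_ID_";
// the exact path segments are elided in this guide - fill in the real ones
const url = "https://api.emporix.io/.../data-store/active_users/values?search=Nathan&limit=250";

fetch(url, {
  headers: {
    "X-Magic-Link": MAGIC_LINK_ID,
    "X-Tenant": TENANT_ID,
  },
})
  .then((res) => res.json())
  .then((values) => console.log(values));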



High Stock Trigger is a mandatory part of the value stream as it sets off the rest of the steps. The trigger is based on an event that checks availability of a product and a condition related to its stock number. In this case, if the stock number exceeds 100, a new orchestration starts.
The value stream has seven process steps that work one after another to increase selling of a product that is overstocked. Each process step is a separate Make scenario that has its own trigger, actions and a completion event.
This step is designed to boost the product in the Search Engine's search ranking so that it appears at the top of the list. It's a Make scenario that begins with a trigger — an event received from the value stream's trigger, indicating that the orchestration should start. The modules initially gather information about a product, such as its product ID, name, and description — all the details entered into the system for that specific product. Following this, a variable is set to boost the product's popularity. An API call is then made to the Search Engine, and a completion event is sent, indicating that the operation has finished successfully, and the scenario is completed. This ends with a product_boosted event type.
This step assigns the product to a Discount category in the Commerce or Pricing Engine. It starts with a trigger after the previous step is completed, searches for a category with a Discount name, gets the details of the product and assigns it to the Discount category. This ends with a product_assigned_discount_category event, that shows the scenario has finished successfully.
This step shows the possibilities of complex B2B pricing updates - in this case, adding a discount price for the product to a price list. It uses the Pricing Engine and includes a discount variable, which is used for the price mapping. When the scenario is triggered, it searches for a product and its price. It then adds the discount price to the price list and either ends directly with the discount_pricelist_created event, or first updates the price in the price list and then ends with the discount_pricelist_created event.
This step creates a coupon for selected customers. The scenario looks for a Discount category and an order with the specific product ID. The order details include information about the product and the customer who created the order. When the data about the product, customer, and order is gathered, the coupon is created. The scenario is designed to run multiple times, so it can end in two ways: with the customer_coupon_created event when a new coupon is created, or with the customer_coupon_exists event when the coupon already exists and is valid for the customer.
This scenario creates a quote for customers. It first gets the order details and searches for existing quotes based on productId, customerId, and siteCode. If there's no existing quote, the scenario creates one, checking the eligibility of the product. It then changes the quote's status to open, which means the quote is ready for the user to approve. The scenario aggregates the data to JSON and finishes with the customers_quoted event.
This scenario uses the CE Assets API to create a new image of the product with a promotion label. It makes it the primary image of the product.
This step is a scenario created for publishing a promotional banner related to the product. When it's triggered, it uses an asset from Contentful that had been designed for promotions for Emporix. If the asset has been already published, then the publishing is ignored. But if the asset has not been published yet, the scenario generates the banner and ends with the promotional_banners_generated event.
Update Celonis case is the last step of the value stream. It sends process context data to Celonis for further processing. In this example it gets the productId, startStockLevel and quotes.

Check the reason why it stopped.
See at which exact step the value stream instance failed.
If you are using a complex value stream that involves subflows and embeds other value streams, you have to debug each child value stream separately from the parent one.
To access the Value Streams Debugger:
Go to the OE module in Management Dashboard and open the Value Streams listing.
Choose the value stream that you want to check and click the Magnifier icon next to it.
You can also access the debug mode directly from the value stream by choosing the Go to Debugger option.
This shows you a list of all the process instances for the selected value stream. The debugger works for two separate views of your value stream instances:
Test - whenever you run a test for your value stream, it creates a new instance that can later be checked in the debugger under the Test toggle
Published - this view shows all the instances of the production value stream
Instance ID - you can use the Instance ID or a request ID to filter the results and to find the specific instance that you want to check
Request ID is a webhook ID
Instance ID is an ID of the value stream instance
To find the request ID and Instance ID, go to your scenario in Make, open the history tab and choose the instance you want to check. The IDs are visible when you choose to view the scenario's module details.
Start time and end time of the value stream instance run.
Status of the instance - you can check if a process instance was finished, is running, failed or was terminated.
Finished - an instance of a value stream that ended, it means all the steps of the instance were successfully completed.
Running - an instance that is in progress, it can be processing some action at the moment, but it can also be in a paused state, waiting for a trigger event.
Failed - an instance of the value stream that was not completed and could not complete all the steps.
Terminated - an instance that was running but was stopped by a user.
Version - the value stream version of the instance. To check how versioning works in OE, see the documentation.
To open the debug mode, click on the value stream instance.
The debug mode is a read-only page that shows each step of the value stream instance with detailed process logs. The logs allow you to see data flowing through the system up to the point where it stopped because of an error. The errors shown include those passed back by other systems, which may have contributed to the failure.
For the triggers, you can check the logs of the event that took place.
For process steps with Make scenarios, you can check the input and output logs, as well as the information about completion events.
You can use the debugger to move directly to a scenario that is linked to one process step. You don't have to search for the scenario in Make, it opens automatically from the value stream instance after you click on the Go to Scenario Logs button.
If there's an instance of a value stream that has stopped at one step, but the step shows a running status, you can navigate directly to this instance's Make scenario to address the problem immediately.
For every part of an instance the following statuses are possible:
Triggers - waiting/received
Process steps - running/finished
Filtered - for the steps where filters are applied
Pending - only at the beginning, during the initialization
If value streams are stopped, they can be stuck either at the value stream level or at the running instance level.
A stuck value stream means that the cause is at a Make scenario step and all of the value stream's instances are queued at the same place - at the same scenario step.
A stuck value stream instance is failing at a value stream step while showing the running status. The cause can be an error in a Make scenario that is part of the value stream step, or an incomplete process.
To see why a value stream instance is failing, you can go through the following steps:
Check the state of the process step in the debug mode and then move directly to the Make scenario logs to see the details.
Check the scenario history in Make.
Go to Incomplete Processes (executions) to see the list of failed processes.
Open the details view, to see the scenario record and the reason for interruption.
The debugger mode allows replaying a single step of a value stream instance. You can use this feature whenever you want to test a Make scenario added in a step, directly in the OE view. This helps you test separate scenarios without the need to run the entire process. The replay feature can be used for all steps that are built with Make scenarios and are in a running or finished status. However, it cannot be used for triggers.
Whenever you replay a step, its logs are updated with the latest results, and the run is visible in Make's scenario history.
Additionally, in the debugger logs for the particular step, you can see a replayid in the Input data. The ID helps in identifying the replay run when needed. The outcome of the replayed step is a part of the whole process context.
Replaying a single step of a value stream does not create a new running value stream instance.
If you need to stop your running value stream instance, you can do that with the terminate button. As a safety precaution, you have to copy the instance's name and enter it in the confirmation before the system terminates the instance.
A value stream may require collecting information from users. Use Form.io to collect data and build customized forms.
Using Form.io, OE allows you to build highly customizable forms. The platform offers extensive customization options, making it possible to control various aspects of a form's appearance and behavior. You can streamline data collection processes by creating forms that perfectly align with your specific needs and guidelines.
The forms page is available in the Management Dashboard, under OE. On this page you can:
see all the forms that were created in your tenant
create a new form
create a form link
duplicate a form
remove a form
import and export multiple forms
Moreover, you can filter the forms by name, ID, or the language in which they were prepared.
To start working with the forms builder, you should get familiar with the basic features and the components that are used there. To learn about the solution, see the documentation.
To create a new form:
Go to Management Dashboard -> OE -> Forms.
In the forms page, choose the Create Form button.
Add the name for your new form.
Choose the components that you want to have in the form. You can select among the available component types.
Add the components by using the drag and drop feature, which allows for putting the components at any place, before or after other components.
For every component that you add, there's a detailed view with the configuration settings for this specific field in the form. You can customize your component with different values, data or conditions.
For more details, see the documentation.
To preview the form after you've added all the components, choose the Preview Form button.
When your form is ready and you want to save it, choose the Save Form button. The form will appear in the forms page list.
Forms can be customized to incorporate dynamic data, allowing for the creation of personalized forms tailored to specific recipients. See the Coffee Preferences example below, to learn how to set up such a form.
Go to Forms and choose to create a new form.
Add a dropdown menu component and call it, for example, Coffee Preferences.
In the Coffee Preferences component, go to the Data tab and select Custom as the Data Source Type.
Add Custom Values - they are needed to return the value options. For example: values = data._magicLinkCustomAttributes.coffeeType[0].options
Add Text Field components that will also work as additional custom attributes to choose in a form. For example, milk_type and sugar_amount.
In the milk_type component go to Data tab and add a JavaScript code under Custom Default Value. For example, value = data._magicLinkCustomAttributes.defaults[0].milkType.
Save the form.
After creating the components, you can check the preview of the created form:
Forms are referenced by the module that allows you to configure custom form attributes. Using the forms builder and the module together, you can add dynamic, personalized behavior for each recipient. For more information, see the documentation.
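For illustration, the JavaScript expressions in the example above assume the magic link carries a custom-attributes object shaped roughly like the following sketch - all values here are made up:

// hypothetical _magicLinkCustomAttributes payload matching the form expressions above
const _magicLinkCustomAttributes = {
  coffeeType: [
    { options: [{ label: "Espresso", value: "espresso" }, { label: "Filter", value: "filter" }] }
  ],
  defaults: [
    { milkType: "oat" }
  ]
};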
To edit an existing form, click on it in the forms list. The edit mode is opened straight away and you can apply your changes.
If you want to reorder or edit the selected components, you use the inline components settings and adjust the component to your needs.
For more details, see the documentation.
To duplicate an existing form, choose the Duplicate icon.
After you duplicate the form, it's added to the forms list and displayed with the suffix "copy" in the name. You can then rename the form.
To delete a form, choose the Delete icon for the selected form. You have to confirm the deletion of the form by entering its name in the confirmation window.
The import/export feature makes it easy to move multiple forms from one tenant to another.
To export multiple forms from your tenant:
Go to OE -> Forms and choose Import/Export button.
Select the forms that you want to export and choose Download. This downloads separate files in a JSON format with all the selected forms.
Go to the tenant where you wish to import the forms. In OE -> Forms choose Import/Export button.
Choose import and add the previously downloaded JSON file.
Choose Upload. This imports all the forms that were previously exported. When the process completes, you get a notification message about successful upload.
Forms localization makes it possible to translate your forms from the base language to other languages. You can create forms with multiple languages, with the same structure as in the original language but with corresponding fields translated.
The translations should always be prepared before the link with the form is sent to the contact.
Go to OE -> Forms.
Open the form that you want to translate.
To start the translation, choose the current language that you use and then click the Download button. This downloads a myData.json file with the names of the fields. You can also use Download Template, which downloads a .json file without the selected language. For example:

{
  "Text Field": "",
  "Number": "",
  "Password": "",
  "Radio": ""
}
Open the downloaded .json file and enter the translated text.
Example of an English to German translation:

{
  "Text Field": "Textfeld",
  "Number": "Nummer",
  "Password": "Passwort",
  "Radio": "Radio"
}
Save the file.
Choose Upload Translations to upload the translated .json file back to the form in Management Dashboard.
Choose the translation language and add the .json file.
Choose Save.
Result: Your form is now available in two languages: English (in this case it's the default) and German. Both of the versions are visible in the form preview.
When the form is translated, there's a relevant lang parameter visible in the URL of the form. The parameter can also be used in configuration of a mail module in a Make scenario responsible for sending the forms.
Establish a connection between OE and Cloud Events.
Orchestration Engine uses CloudEvents with the HTTP Protocol Binding and Binary Content Mode; currently, only application/json is supported as the content type for the data payload.
The CloudEvents should be used if you want third party systems to send trigger events to OE. If you want to use triggers directly from Celonis, see the documentation.
CloudEvents is a specification for describing event data in a common way. It provides a structured format for event data and includes essential metadata such as the event type, source, and timestamp. HTTP Protocol Binding in CloudEvents defines how to use HTTP for transporting event data, and Binary Content Mode specifies how to encode the event in a binary format within the HTTP message.
Manage OE users and permissions.
To prepare the users to work with OE and Make, you need to configure their accounts and set the correct authorizations. The users have to be added to the tenant and assigned to proper user groups with sets of access rights for the roles they will perform.
To achieve this, open the Users and Groups dashboard, which allows you to manage the users’ data in general, both for Commerce Engine (CE) and for Orchestration Engine (OE). Using the dashboard, you can add, edit and delete users' data. You can also manage users by filtering or sorting by users’ first name, last name, e-mail address, department, or status.
The status types are:
Green - the user is active
Grey - the user account is locked
Learn how to initiate and manage value streams in Emporix's Orchestration Engine (OE) using Celonis signals as triggers.
The Celonis signals and events can be used as triggers for value streams. To retrieve data from Celonis in your value stream, you can use Make scenarios as triggers, with included Start New Orchestration module or the Celonis Intelligence API modules.
The modules below should be used only if you want to have Celonis signals as triggers for your value streams. If you want third party systems to send trigger events to OE, use the CloudEvents described in documentation.
The Start New Orchestration module is available as part of , but it can also be used from your Celonis EMS Action Flows. In the latter case, you can trigger value streams as part of your EMS process mining setup. For example, you can use this module to start a value stream based on certain triggered signals.
{"n": 1}true{"context": "`key.with.dots`.boolean"}{"key.with.dots": {"boolean": true}}{"key.with.dots": {"boolean": true}}truetrue
In OE, an event-receiver serves as an endpoint specifically designed to receive external events, which are then used for initializing or resuming value streams. These events - for example, the placement of a new order - are configured and transmitted to the receiver, subsequently triggering the corresponding processes within OE.
To ensure that OE can receive events, the initial step is to establish the connection and verify that authentication functions properly. This connection requires specific OE tenant data, including the endpoint's source and secret values. These values are then utilized, for instance, in the Postman setup for development to ensure that requests are directed to the correct OE tenant.
Once the connection is established and authenticated, you can start with creating a relevant event within OE. This sequential process ensures smooth communication and accurate event handling between systems.
To establish the connection between the event-receiver endpoint and OE:
Go to OE → Admin → Settings.
Use the Source and Secret values for postman environment configuration.
OE:
Postman:
Source - the attribute that defines the endpoint that is used for sending an event to OE; it's unique for every tenant.
If the endpoint is: https://signals.emporix.io/e/{{src}}, your src value is the value as in the source field in OE.
Secret - an HMAC (keyed-Hash Message Authentication Code) secret. An HMAC is an authentication technique that uses a hash function and a secret key. You can use it to sign a request with a shared secret, verifying that the data is correct and authentic. In this configuration, the HMAC secret is the value used to compute the x-emporix-hmac header.
The header attributes are defined by the CloudEvents specification and are mandatory for sending the requests:
ce-source - Represents the source of the event, providing information about where the event originated. The ce-source attribute typically contains a URI identifying the event producer or the endpoint that generated the event.
ce-type - Represents the type of the event, providing information about the kind of event that occurred. The ce-type attribute typically contains a string value that identifies the specific type or category of the event, such as order.created or payment.processed. The ce-type values correspond to the events registered for your OE tenant.
ce-specversion - Represents the version of the CloudEvents specification with which the event is compliant. The ce-specversion attribute typically contains a string value indicating the version of the CloudEvents specification being used, such as "1.0" or "1.1".
ce-id - Represents the unique identifier for the event instance. The ce-id attribute typically contains a string value that serves as a globally unique identifier (GUID) or a Universally Unique Identifier (UUID) for the event.
The signature has to be added to the request that is sent. You can use the following sample to see how to generate the signature.
The request.data is the payload.
OE is schemaless - the payload can be empty or in any valid JSON format.
The secret is the one that you need to copy from the Management Dashboard admin settings.
Here's an example of a request to generate a signature:
Once the connection setup is complete, you can proceed to create a new event. Ensure that the name of the event you create corresponds to the event used in Postman's POST request.
Utilize the ce-type attribute to recognize events and provide information indicating that the event has occurred. By aligning the event configured in OE with the ce-type name, you ensure that information about events is accurately received and processed. This alignment facilitates seamless communication and data handling between OE and Postman.
To create a new event:
Go to OE -> Admin -> Events.
Choose Create New Event.
Specify the Display and Event names.
Display name is the name that is visible in the UI when you create an execution template, for example: Order Created.
Event name is the name of the registered event, for example: emporix/dcp/order.created.
Save your changes.
User - an employee using the Emporix Management Dashboard.
User Group - a group of users that share common characteristics, such as performing a similar job. A user group defines access controls for its users.
Role - a combination of predefined permissions that allow users to perform some actions on resources within the system. You can apply a role to a user group.
Permission - a mechanism for limiting what actions a user belonging to a role can perform on specific resources.
Access controls - a combination of roles and resources. For example, a user with a manage access control on product resources can view, create, delete, and edit product entities.
Resource or Entity - the object type within the Emporix Management Dashboard.
Action - the ability to perform an action on entities of specific type.
This diagram shows a high-level view of the relationships between users, groups, and roles:
To create a user of Management Dashboard:
In Management Dashboard, go to the Administration module -> Users and Roles.
Click Create New User and fill in all the fields. They're all mandatory.
Choose Save to add your new user to the users list. The user automatically receives an email invitation to join the tenant.
If you decide to stop adding the new user, you can use the Discard option. It clears all the fields and removes the data you’d entered.
It's also possible to add multiple users to your tenant at one time. You can do that through the Developer Portal using the CSV import users feature. For more information, see the documentation.
If the user already had an active account, or is an existing user of a different tenant, they are visible as an active user right away, without the provisioning status.
To allow your user to access the Management Dashboard, you need to set up the correct access controls. To do this, assign the users to the right user groups. Every user group can be assigned roles with associated permissions. When you assign a user to a group, you give them the permissions that the roles have.
The default groups for OE are:

OE Viewer - the users in the viewers group have read access only; they cannot modify anything.
OE access: read
Make roles: Member

OE Editor - the users in the editors group can edit OE value streams and Make scenarios.
OE access: read and edit
Make roles: Make application developer (integromat.app_developer), Make scenario editor (integromat.scenario_edit)

OE Manager - the users in this group can conduct development tasks, such as creating applications in Make.
OE access: read, edit and manage
Make roles: Make application developer (integromat.app_developer), Make scenario editor (integromat.scenario_edit)

OE Admin - the users in this group can conduct administration tasks, such as adding users to the tenant or creating applications in Make.
OE access: read, edit, manage and admin
Make roles: Make owner, Make scenario editor (integromat.scenario_edit)
We recommend using only the OE user groups that are provided by default. Still, it's possible to create custom user groups.
The groups are created in the Groups tab. You need to provide a group name with a description, plus set the relevant access controls.
In the Administration module, go to Users and Groups.
Go to the Groups tab and click the Create New Group.
In General section, provide the group's name and description.
For roles specific to OE, choose the Standard role and select Orchestration Engine from the drop-down menu. You can then select one of the predefined access rights - Viewer, Editor, Manager or Administrator.
Set up the group access rights in the Management Dashboard Settings section. For sole OE groups, choose OE in Access to define permissions for OE-specific resources:
Read access selected: a user is able to see entities of a specific type
Edit access selected: a user is able to see and edit entities of a specific type
Manage access selected: a user is able to see, edit, create, and delete entities of a specific type
Administrate access selected: a user is able to do all available actions on the entities of a specific type
none selected: a user is not able to see entities of a specific type
Use the checkboxes to define the access rights only for the particular types the group is supposed to have access to. If a group is not supposed to manage or even see a particular entity in the Management Dashboard, don't select any permission.
The access controls are correlated: selecting one of the available access controls for a specific entity automatically selects the same access control for related entities. This behavior ensures that users get the same access to the related resources. To disable existing access rights for an entity, uncheck the checkbox.
Confirm with Save.
When creating a new group, or editing an existing one, you can add the group’s users right away in the Members tab.
To see the permissions assigned to a specific group, open a particular group in Administration -> Users and Groups and check the Management Dashboard Settings.
Check the diagram below to see the details of the relationship between user groups, roles, access controls and permissions in OE:
Prerequisites:
The user must be a part of one or more OE user groups.
The user must have at least one of the following roles assigned within the OE user group: integromat.scenario_edit or integromat.app_developer.
To start with the synchronization:
Choose the users that you want to have access to Make and assign them to the right groups as defined in the prerequisites section. The system automatically scans all OE users' permissions and identifies those users who have the integromat.scenario_edit or integromat.app_developer roles.
Add the selected users to your Make organization (tenant). Once the eligible users are identified, the system automatically adds them to the list of authorized users of their corresponding Make organization as member users.
When you add a user to your Make organization, they receive an email notification with instructions on how to access and set up their account.
When any changes are made to the groups an OE user belongs to, the synchronization process is triggered to ensure that the user's access to the Make organization is up to date. Users receive a new email invitation whenever their group membership is altered.
If a user's group membership is modified and the user no longer has one of the two specified roles assigned (Integromat Scenario Editor or Integromat App Developer), the user still retains access to the Make organization. In this case, an administrator should manually remove the user's entry from the Make organization to maintain a consistent and secure environment.
The Resume Event module is used as a wake-up trigger so that an Action Flow can resume a value stream instance. To resume a value stream from outside of OE, you must provide the value stream instance ID. In Make, you can obtain the instance ID through the extractor and/or store it as an augmented attribute on a record.
The Celonis Intelligence API is a software interface that offers regulated access to Celonis Intelligence, enabling its integration with third-party tools and platforms.
To start using the Celonis Intelligence API, it has to be enabled on the Celonis side. Contact the Celonis team for enablement support.
If you work with Celonis, make sure the Emporix IPs are allowlisted with the IP-based Access Restrictions as documented by the company. You can find the information about Emporix IPs under Firewall Allowlisting.
The module makes it possible to link Make scenarios with triggers from Celonis Knowledge Models. To configure the module, you need to have a trigger set up in your Celonis account for the specific knowledge model you want to use. Only then, you can configure the Record Trigger Event module in Make.
The steps below are used for demonstration purposes to show how a Celonis Knowledge Model trigger can be set up and then used in OE, but for exact guidance and details on the solution, see the Celonis Triggers documentation.
To set up the trigger in your Celonis account:
Go to Studio -> your Space.
Choose your configured Knowledge Model and go to Triggers.
Add a name for the trigger, select the Record that should be monitored by the trigger, and apply Filters to define the record items.
Save the trigger configuration.
To configure the Record Trigger Event module in Make:
In your Make scenario, select the Record Trigger Event as the Celonis Intelligence API module.
Choose to add a new webhook in the module configuration.
Add the name for the subscription. We recommend using a unique name, as it later helps in identifying the correct data on the Celonis side, in the Knowledge Model. See the images below for how the Subscription Name in the Make module is reflected in the Celonis Knowledge Model trigger subscribers.
Choose the Knowledge Model from which you want to gather data, and for which you had previously configured the trigger.
After choosing the Knowledge Model, you can select the trigger from the list of available triggers, for example: new overstock items. The name in the module's drop-down list reflects the name that is visible in the Triggers list in the Celonis Knowledge Model.
Save the module configuration and then the scenario.
When the connection is configured, you can see it as a new subscription in your trigger details, in the Celonis account. When you open the subscribers info, you can see your trigger subscription logs.
To see an example of how the trigger works after receiving a new data load:
In your Celonis account go to Data Pools -> your Data Source -> Data Connections and choose Add Data Connection.
Choose Upload Files and select the file you want to upload.
The file is visible under Files; choose Import to upload it.
Go to Data Pools again -> your Data Source -> Data Jobs and choose Execute Data Job.
Select Full Load and choose Execute Selection.
When the data load job is executed, the Make scenario with the configured Record Trigger Event is started. You can later check the module logs in the scenario runs history.
Whenever you want to remove the subscription to the trigger, you can do so by deleting the webhook in your Make account.
The module enables you to retrieve data related to specific KPIs from a Celonis Knowledge Model. To set it up, first establish a connection to your Celonis account, then select a Knowledge Model from the available options linked to your account. Once a model is chosen, you can specify the KPI from which you want to extract the data.
The module is configured to make an API call to the Celonis Intelligence API to fetch data from Celonis Knowledge Model.
The module allows you to choose the Celonis Knowledge Model from which you want to fetch data. The results are returned in bundles. You can configure the data list using the following fields:
Fields items - use them to select which specifics you want to gather from the data model; the items might include attributes such as company codes, shipping methods, or vendors.
Filters - you can choose different types of filters: select the filter from a drop-down list of items, or add a custom filter expression, for example PLANT eq "P05".
Sort - define the way the data is displayed on the list. If you want to use multiple fields for sorting, you can put them in an expression with a comma separated list of fields, for example sort=+field1,-field2,field3.
Limit - use it to limit the amount of data fetched from the knowledge model; you can set the maximum number of returned bundles. The default page size is 50.
Depending on your module configuration, the data fetched from the knowledge model differs in the output. See the data that is returned with the example mapping setup:
For the Knowledge Model Record Data you can use Breaks as a flow control. Each break lets you define automatic executions and a number of attempts.
{
"id": "0c29dar5-13er-4766-82f8-91e87a45ghy6",
"name": "devenv",
"values": [
{
"key": "src",
"value": "src_w34a44rft567",
"type": "default",
"enabled": true
},
{
"key": "secret",
"value": "<replace with secret from OE->Admin->Events in Management Dashboard>",
"type": "secret",
"enabled": true
},
{
"key": "hmac",
"value": "",
"type": "any",
"enabled": true
}
],
"_postman_variable_scope": "environment",
"_postman_exported_at": "2023-04-19T07:26:28.294Z",
"_postman_exported_using": "Postman/10.10.4"
}
{
"info": {
"_postman_id": "d5ce7aff-7f71-4261-b369-7cd67813e878",
"name": "OE Hello World",
"schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
"_exporter_id": "12223042"
},
"item": [
{
"name": "say-hello",
"event": [
{
"listen": "prerequest",
"script": {
"exec": [
"var sha256digest = CryptoJS.HmacSHA256(request.data, postman.getEnvironmentVariable(\"secret\"));",
"var base64sha256 = CryptoJS.enc.Base64.stringify( sha256digest);",
"postman.setEnvironmentVariable(\"hmac\", base64sha256);"
],
"type": "text/javascript"
}
}
],
"request": {
"method": "POST",
"header": [
{
"key": "ce-source",
"value": "postman",
"type": "text"
},
{
"key": "ce-type",
"value": "hello_world",
"type": "text"
},
{
"key": "ce-specversion",
"value": "1.0",
"type": "text"
},
{
"key": "ce-id",
"value": "{{$guid}}",
"type": "text"
},
{
"key": "x-emporix-hmac",
"value": "{{hmac}}",
"type": "text"
}
],
"body": {
"mode": "raw",
"raw": "{\n \"hello\" : \"world\"\n}",
"options": {
"raw": {
"language": "json"
}
}
},
"url": {
"raw": "https://signals.emporix.io/e/{{src}}",
"protocol": "https",
"host": [
"signals",
"emporix",
"io"
],
"path": [
"e",
"{{src}}"
]
}
},
"response": []
}
]
}

var sha256digest = CryptoJS.HmacSHA256(request.data, "the-secret-from-management-dashboard");
var base64sha256 = CryptoJS.enc.Base64.stringify( sha256digest);
postman.setEnvironmentVariable("hmac", base64sha256);
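Outside of Postman, the same signature can be produced with Node's built-in crypto module. A minimal sketch, assuming the secret and source values from OE -> Admin -> Settings are exported as environment variables and that the payload is the raw request body:

// Node.js 18+ equivalent of the Postman pre-request script above
const crypto = require("crypto");

// the raw JSON body must be signed exactly as it is sent
const body = JSON.stringify({ hello: "world" });
const secret = process.env.OE_SECRET; // the secret from the Management Dashboard
const hmac = crypto.createHmac("sha256", secret).update(body).digest("base64");

// send the event with the CloudEvents headers described in this guide
fetch(`https://signals.emporix.io/e/${process.env.OE_SRC}`, {
  method: "POST",
  headers: {
    "ce-source": "postman",
    "ce-type": "hello_world",
    "ce-specversion": "1.0",
    "ce-id": crypto.randomUUID(),
    "x-emporix-hmac": hmac,
    "content-type": "application/json",
  },
  body,
}).then((res) => console.log(res.status));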





Learn how to create a value stream in OE.
Creating a value stream happens in the edit mode. When working in edit mode, you can work on one draft version of the value stream; it's not possible to save multiple draft versions.
Whenever you work on the draft and make changes, they are autosaved, overwriting the previous draft.
The draft of each value stream is visible together with the active value stream in the dashboard. Whenever you click on a value stream that was not published yet, it opens in the edit mode for the process. In this case, when a process was not published yet, the Last Published date refers to the time when it was created and autosaved for the first time.
To start with a value stream:
In the Management Dashboard, go to OE -> Value Streams and choose Create Value Stream.
Enter a name for the new value stream and choose Create. Without entering the name, you cannot move to the next steps. There are no restrictions on using special characters in names, and you can create separate value streams with the same name.
Select the trigger that you want to use in the process. You can choose between an event and a Make scenario:
Choose an event from the list of events that are created in your tenant. If you have a large number of events, you can narrow the list down by the event's name using the search field.
To set up a Make scenario as a trigger, you can use the following options:
Choose an existing scenario directly from the list of available scenarios. To edit the scenario, you can use the Edit Scenario action and move directly to the edit mode of the specific scenario in Make.
Choose Go to Scenarios to create a new triggering scenario in Make.
To learn about Make's scenarios, check out the documentation.
Choose Go to Templates to use a Make template that is already available publicly or for the team.
To learn about Make's scenario templates, check out the documentation.
The scenario / process must have at least one occurrence of the Start New Orchestration module; otherwise, it is filtered out from the list of available scenarios / processes, or disabled if it's not configured properly. The module is required because every trigger starts a new OE instance. If the module has not been added, OE doesn't receive any notifications about the new triggers.
You can use a timer event for setting up a trigger that starts a value stream on a specified schedule, and for setting up how long the process should be paused before it wakes up again.
You can set up the timer trigger in three ways, as a context variable, single event, or a repeating event.
Use Context Variable: use it for mid-process triggers. The timer uses a variable that is stored in the process context of a value stream; the trigger fires when a Make scenario or a Cloud Function provides the point in time at which the value stream should wake up.
The variable should be configured in the .<event_name>.<content_variable_name> format, for example: .sleeping_until.until, where the wake-up trigger is the until property in the sleeping_until event. The value has to be a cron expression or an ISO 8601 compliant date.
When writing the expression pay special attention to the format - remember to include the dot (.) before the event name and the content variable name.
The context variable can reference a time until which the process context should be paused; this can be configured together with a Make scenario. For example:
The context variable is referenced as .sleeping_until.sleep_until using the .<event_name>.<content_variable_name> expression format, where:
sleeping_until is the event name, selected as Event Type in the Completion Event Make module
and sleep_until is the content variable name, referenced to by a JSON String in the Completion Event Make module
To set the variable in Make, you can use the Set Variable module.
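For illustration, the Completion Event for the example above could carry a payload like the following sketch - the timestamp is an arbitrary ISO 8601 example; with the trigger configured as .sleeping_until.sleep_until, the value stream would wake up at that time:

{
  "sleep_until": "2024-05-01T08:00:00Z"
}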
Single Event: choose a specific date and time when an event should trigger a value stream
Repeating Event: define recurring schedules using precise timing rules, such as every Tuesday, every 30 minutes or every month at specified time.
When you select a trigger, you can see that the user interface provides confirmation that the selection was saved. You can collapse the section using the caret icon in the step's heading line, or simply go to the next step by choosing the plus icon.
After choosing the trigger, select the process step that you want to perform. Each process step can use a single or multiple Make scenarios, call a child value stream, or call a Cloud Function.
To add a process step with Make scenarios, go to the Scenarios tab.
To add a process step with another value stream included, go to the Subflows tab.
To add a Cloud Function as a process step go to the Cloud Functions tab.
You can choose a Make scenario that is already created in Make and visible in the list in the process step. If you don't want to select an existing scenario, choose Create New Scenario to create a new process step scenario, or Go to Scenarios to check for scenarios that are prepared to be used publicly or available for your team only. The Create New Scenario and Go to Scenarios actions open a new tab with the Make application.
After going to Make, ensure you are working in the right organization. The name should reflect your tenant name in OE. To check that the organization is the correct one, hover over Make's sidebar; the name is visible at the top. Then, you can either click directly on the template to start working with it, or choose the Start guided setup option to continue. Every Make scenario used in a process step must start with a Trigger Event and have the Completion Event module.
If you click on any of Make's sidebar nodes, the template closes and you are redirected to the selected section in Make. It's not possible to get back to the template.
Additionally, ensure that the scenario is active. The scenario toggle should be switched from OFF to ON. To learn about Make's scenarios, check out the documentation.
Each process step can consist of multiple Make scenarios that can be rearranged at any time using drag and drop (the hamburger icon).
In the Subflows tab, you can choose a value stream that you'd like to include in your parent value stream. By default, value streams are all separate processes, but they can also work together in a parent-child relationship. This is useful if you want to create a more complex process to address a larger business use case. In such a situation, you can embed processes within one another, without the need to run them separately.
Value streams that have an invalid or pending status are not visible in the subflows list.
In the Cloud Functions tab, you can choose one of the already enabled Cloud Functions or create a connection to a new one.
To start creating a connection to a Cloud Function, choose New Cloud Function. This opens a window where you can create Cloud Function groups and structure them in a parent-child relationship. To create a single Cloud Function connection, first create a group. Once the group is created, move to the Definition tab, where you can specify the Cloud Function configuration.
You can create multiple Cloud Function connections within a group. Groups allow aggregation of multiple Cloud Function connections under a single set of authentication credentials.
In the Definition tab, provide the following setup for a Cloud Function connection:
Display Name - Cloud Function name
HTTP Method - GET, HEAD, POST, PUT, DELETE, CONNECT, OPTIONS
Invocation URL - the URL used to call the Cloud Function created in Azure, AWS, or GCP.
Cloud Provider - choose between the predefined providers: Azure, AWS, or GCP. The providers offer the infrastructure and services that enable the function to execute in response to specific events or requests. Selecting one is necessary to get the relevant authentication options.
Cloud Functions allow for dynamic, context-aware processing where the body of the function call is determined by applying a JQ filter to the context. The result of this filtering extracts the key body, which becomes the function's payload. Other keys in the result are mapped to replacement variables and can replace placeholders in the CF definition.
Replacement variables are defined in the CF's URL using the syntax {{.varName}}. When the JQ filter is applied, any value under the key varName is substituted into the corresponding placeholder in the URL. This system enables flexible customization of both the function payload and its endpoint through context-specific values.
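As an illustration, a minimal JQ filter sketch with hypothetical event and key names. Applied to the context, it would yield the function payload under body and a urlParam replacement variable for an Invocation URL such as https://example.com/orders/{{.urlParam}}:
# assumes the context holds an order_received event with id and items fields
{ body: { items: .order_received.items }, urlParam: .order_received.id }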
After you create the connection to the Cloud Function and add it to the process step, you can open the edit mode of the step, which shows three input fields:
Run Context - loads the last run context
JQ Expression
Curl to the Cloud Function
For a value stream that has already run, you can see the context data included, for example:
The three text fields are visible only in edit mode; when you check a value stream in view mode, you can see only the Input field.
When you select a process step, the user interface confirms that your selection has been successfully saved, just as it does for triggers. You can then collapse the section, proceed to the next step, or complete the scenario creation using these two steps.
The value streams can be created using many combinations, for example:
A trigger with one process step
A trigger with many process steps
A trigger followed by a process step, another trigger and more steps
If you use a Make scenario as a mid trigger, make sure it ends with a "notify event" module that is configured to send notifications about the completed scenario to OE. You need to use the OE Instance ID, oe-instanceid, to wake up a value stream. For details, see .
When your value stream is ready, choose Publish to save your configuration.
Your value stream is created but is not active yet.
To activate the value stream, use the toggle that changes the inactive state to active. You can do that straight away in the edit mode, or from the processes list.
When your value stream is ready, it runs every time the specified trigger event is received. Each run of a value stream creates a new instance.
You can check your value stream instances in the . The debugger helps in troubleshooting and reviewing the whole process flow, as well as single process steps.
See how you can transfer form magic links from OE to Celonis using dedicated extractors.
You can manage retrieving data from Emporix OE to track which forms were sent to suppliers or contacts using the Magic Links extractor. The extractor allows for precise data retrieval or comprehensive tracking of all sent forms. This ensures accurate visibility into communication history with your suppliers and contacts.
To extract the data from OE to Celonis, you have to create a Data Pool in your Celonis account. Celonis uses Data Pools to collect and cluster data information to set up an integration flow. To learn about the details, see the documentation.
Authentication mode - you can create Cloud Functions with or without an authentication method. If you select Azure as the cloud provider, you can choose between the following authentication options:
None
Basic - a username and password are used when making a request to a server; it's part of the HTTP protocol.
OAuth2 - allows third-party applications limited access to the data through an authorization server.
Function Key - an authentication mode used only in Microsoft Azure Functions. Access to an Azure Function is secured with a shared key, known as a function key. When you make a request to the function, you must include this key to authenticate and gain access. The Authentication Mode options in OE correspond to the Cloud Provider configurations.
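For illustration, a hedged curl sketch of calling an Azure Function with a function key - the URL and key are placeholders; Azure also accepts the key in an x-functions-key header instead of the code query parameter:
# the code query parameter carries the function key that authenticates the request
curl --location 'https://my-app.azurewebsites.net/api/my-function?code={FUNCTION_KEY}' \
--header 'Content-Type: application/json' \
--data '{}'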
You can create your own data extractor based on an Emporix template. The steps below show how to work with the extractor template and how to use it for your custom configuration.
Start with creating a connection to your Celonis Data Pool that links your OE as the Data Source:
In your Celonis account, go to Data Integration -> Your Data Pool -> Data Connections and choose Add Data Connection.
Choose Connect to Data Source.
Go to the Custom section and choose Build custom connection -> Import from file.
Enter the Name for your custom extractor. Optionally, you can also add a Description for it.
Download the Magic Link Extractor and then upload it as a JSON file in your Celonis extractor builder:
Choose Save and Continue.
Check the parameters that should be used for the connection and then continue to the next step. The parameters were configured automatically with the uploaded extractor JSON file.
Check the authentication method; for Emporix it's the OAuth2 (Client Credentials) method with the following endpoint and type details:
Get Token Endpoint: {Connection.AUTH_URL}/oauth/token
Grant Type: client_credentials
Don't modify this configuration. Continue to the next step to finish the process.
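For reference, a token request against this endpoint typically looks like the hedged sketch below - the values in braces are placeholders for your connection details, and the exact parameter encoding may differ:
# standard OAuth2 client credentials grant
curl --location '{AUTH_URL}/oauth/token' \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode 'grant_type=client_credentials' \
--data-urlencode 'client_id={CLIENT_ID}' \
--data-urlencode 'client_secret={CLIENT_SECRET}'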
At the end, you can see the endpoint defined for this extractor: Get Magic Links.
Having configured the data connection, you now need to establish the connection with the Emporix system. Go to the Data Connections list again and choose Add Data Connection -> Connect to Data Source -> Custom. The data connection that you created with the magic links extractor is visible there, under Custom connections.
Choose your connection and check its configuration to make sure all the authorization details like Client ID or Client Secret are added.
When the connection to Emporix is established, you can now configure the responses that you get. In the endpoints configuration, choose the Get Magic Links endpoint.
Use the Configure response using samples from your source system option to generate the example from the source system.
Expand the field -> Choose Use existing Data Source -> Choose your data source -> Choose Build Response. This autogenerates the response configuration, which you can then view in the JSON below.
You can check that the JSON was updated with magic links configuration, for example:
Make sure the id is checked as a Primary Key.
Choose Finish to save your connection configuration.
To start with the data job configuration, in your Data Pool go to Data Jobs and choose Add Data Job. Add a name for the Data Job and choose the Data Connection you created for the magic links.
For example, a data job named Load Magic Links with the Magic Link Extractor connection related to it.
After creating the data job, you can add the forms extractions.
Choose Add in the Extractions section, add a Name for the extraction and save it.
Choose Add Extraction in the Table Configuration.
Select magic_link from the list of available tables. Save the selection.
As a result, you should see the form table added to the extraction.
Go back to the Data Jobs view. Choose Execute Data Job. This starts a new task and populates a schema that is later visible in the Transformations. To check logs after a job execution, go to the Logs tab and select the execution task that you want to view. You can see all the details related to the executed job.
Go back to the Data Jobs view and choose Add twice in the Transformations section to create two transformations:
the view magic links transformation - with select * from "magic_link" as the transformation statement
Set up the Data Model to establish relations between all of the extractor's components.
Go to Your Extractor -> Data Model and choose Add Data Model.
Select magic_link related items for the table and choose Next.
You can skip the activity table configuration.
Set up all the Foreign Keys for your activity table. For example, set the ID key for the magic_link.
Choose New foreign key in the form settings.
Set up the relations:
Link magic_link (Dimension) with magic_link_submissions (Fact) by linking id and magic_link_id.
Link magic_link_submissions (Dimension) with form_submission (Fact) by linking magic_link_id and magicLinkId.
Link form_submission (Dimension) with form_submission$data$visiting (Fact) by linking id and form_submission_id.
To see an example of a configured data model, check the sample Data Model below:
After setting up the relations, go to Data loads tab and choose Load Data Model.
When the load is finished you can check the details of the currently loaded data model.
Execute a data load based on your form submission configuration and created connection.
Go to Data Pools -> Data Jobs and choose Execute Data Job. You can select to execute a Delta or a Full Load. Delta loads only the part of data that changed since the last upload.
Choose Execute Selection. The job starts and you can already see the process logs.
When the process is finished, you can check the logs details.
Use the Enable creation date filter to set the date from which the data should be loaded. This sets the timestamp for the initial load and is used later in the delta loads as well.
To set up the initial load, use the createdAtGte filtering.
Go to Data Jobs -> Your extraction task -> Table Configuration.
Enable the Creation date filter and select the createdAtGte filter.
Set up the custom date and save your configuration.
Use the Enable change date filter to configure your delta loads. You can set a date from which the updated data is fetched with the updatedAtGte filtering.
You can also configure the executions to make the updated at column visible and to upload data only from the specified time.
Go to Data Jobs -> Your extraction task -> Table Configuration.
Enable the Change date filter.
Create a new parameter for the modified time - <%=max_modified_time%>.
Go to Your Extraction -> Parameters tab, choose New Parameter, and provide the following values in the fields:
type: dynamic
table: magic_link
column: updatedAt
operation type: FIND_MAX
data type: date
default date: any date from the past
Use the Delta filter statement to define the parameter settings - updatedAtGte = <%=max_modified_time%>.
OE Make modules bring specific functionality and are responsible for specific actions. Get familiar with the OE-specific modules available in Make.
See how you can transfer data from OE to Celonis using dedicated extractors.
Value streams capture information in events throughout their lifecycle. The information captured in the events can be imported into Celonis using a Data Extraction job. To send the data, you need an extractor that brings data from Orchestration Engine to Celonis. For OE, you can customize the extractor so that it shares the data that you want to send to the Celonis system.
To extract the data from OE to Celonis, you have to create a Data Pool in your Celonis account. Celonis uses Data Pools to collect and cluster data information to set up an integration flow. To learn about the details, see the documentation.
{"urlParam": "urlVal", "qParam": "qVal", "body": {"key":"value"}}curl --location '{{Cloud Function URL}}'\
--header 'Content-Type: application/json' \
--data '{}'
the create submission table transformation - with the following statement:
The process modules are responsible for interactions between OE and Make. The modules make the two applications talk to each other. They are used as triggers, mid triggers, updates of the process context, or for notifications when a scenario is complete.
The starter modules are placed at the beginning of a process step Make scenario. The modules receive all the data from the process context that was gathered from the running value stream instance up to the particular step. This data can be used by the scenario as input to adapt its behaviour. Starter modules also send the instance ID to the value stream that is running the scenario step.
Trigger Event
The module starts a scenario of a process step, which is a part of a value stream. The module works as a trigger for the step when it receives information that the previous step was completed. It passes the process context from the previous steps to the scenario where it's placed.
New Form Trigger
The module works as a scenario trigger when a new Form is issued to the customer.
All the starter modules can be parts of Make scenarios that serve as value stream triggers. To learn more about triggers check the following guides:
The completion modules are responsible for sending notification to OE when a Make scenario is finished. They also send event data that is saved in the process context of the running value stream instance. For this, they need the instance ID from the starter module, and the Event that is an outcome of running the scenario step.
Completion Event
The module is used for sending notification about a completed Make scenario.
Start New Orchestration
The module can be used as a trigger for a new value stream instance. Using the module, you can connect with any other application. When such a connection is configured, any signals or actions from the connected app work as triggers for the orchestration.
List Event Types
The module provides a list of Event Types that can be sent to the Orchestration Engine.
Create Event Type
Creates an Event Type in the Orchestration Engine.
OE is integrated with Form.io, which provides a solution for building a variety of forms out of different components. Using the forms Make modules prepared by Emporix, you can include forms in your Make scenarios.
Create Form Magic Link
The module creates a link with a designed form, or multiple forms combined into one. Within the module, you can select the form that you had prepared in your OE tenant, in Management Dashboard. The created magic link can be used to access the form through the web.
The magic link recipients are assigned to a task related to the value stream step. You can use the Task Inbox to aggregate all the tasks related to an individual magic link recipient and to reduce the number of notifications related to incomplete tasks. To create an individual access link to the Task Inbox and share it, see the Task Inbox and Tasks documentation.
The Create Form Magic Link module allows you to configure custom form attributes, which you can use in the Form Builder to add dynamic, personalized behavior for each recipient of the form. For example, you can present a personalized list of coffee options.
You can create custom attributes either in the Forms view or using the magic link API. Then, you can reference them in the Form Builder to adjust the behavior of the form.
In the Create Form Magic Link Make module, specify values for the form attributes:
Choose the form in which you have created the custom attributes - in this example it's Coffee Preferences.
In Custom JSON Attributes, Item 1, add coffeeType as the Attribute Name and specify the value that should be visible in the form. The value should be given in JSON format. For example:
Save the module configuration and the scenario.
After the scenario runs and the magic link to a form is created, you can see the form attributes and values provided in Make module as the possible answers that can be submitted.
To see the example of how to configure the form in OE, see Example - Coffee Preferences form.
Create a Form Magic Link from a Previous Submission
The module creates a link to a form that had been submitted earlier. The link is pre-populated with data received from one user and can be sent to a different user with another unique magic link. The module includes the source submission field (Form Submission IDs), which contains the previous submission ID.
Get a Form Magic Link
The module is used to access the data from a form that was submitted, or saved as a draft, along with the specific schema of the submitted form. The module receives data about forms, and it returns the following information:
the forms definition (schema) that is linked to the magic link
data that has been saved up to the specific point in time
metadata about the submission
Update Magic Link Custom Attributes
The module is used to update any custom attributes used in a form magic link.
List Form Magic Links
The module provides a list of all the form Magic Links.
Delete a Form Magic Link
The module removes a Magic Link along with any associated form submissions.
Get a Form
The module loads a form definition by using its ID.
Submit Form
The module is used together with the Create Form Magic Link to generate a pre-populated form, which can be sent to a user. Using the module settings, you can map the fields between a form created in the OE Management Dashboard and the module in your Make scenario. If you choose to map the fields, all the values are automatically added.
Submit Raw Form Data
The module is used with Create Form Magic Link to generate a form. However, unlike the Submit Form module, it uses a raw data payload in a JSON format. You can use the mapping option, but it only maps the form's ID from the Create Form Magic Link module used in the scenario. You have to add the JSON payload manually.
Using this module to create a form, you don't have an option to have it as a draft version.
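As an illustration, a hypothetical raw JSON payload for this module - the field names are invented and must match the components of the form definition you use:
{
  "data" : {
    "sentiment" : "happy",
    "comment" : "All good"
  }
}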
Forms modules for draft submissions
Submit Form Draft - the module makes it possible to have a draft form version before the submission. With this module, the form is autosaved as a draft but not submitted yet - no submission event is sent. The recipient can update the form before they decide to submit. The module uses a dynamic form definition instead of raw JSON data.
Submit Raw Form Data Draft - similarly to Submit Form Draft, the module makes it possible to have a draft form version before the submission. The form is autosaved as a draft but not submitted yet - no submission event is sent. The recipient can update the form before they decide to submit. However, this module uses raw JSON data instead of a dynamic form definition.
Extract Form Submission from Process Context
The module is used for extracting the submitted form data from the whole process context created for a value stream instance.
List Form Submissions
The module provides the submitted data from the specified form.
The task list is designed for external users to make it easier for them to monitor and complete all their assigned tasks. The list gathers all the separate form links for specific tasks and can be accessed by the users in a Task Inbox, outside of the Management Dashboard. The Make modules related to the feature are:
Create an Access Link
Creates an Access Link (link to a Task Inbox) for a specific user email address.
Get an Access Link
Retrieves the Task Inbox link using the given link ID.
List Access Links
Provides a list of all the access links within the Task Inbox. You can filter the list by a specified query, limit the max number of links or sort by different attributes.
Revoke an Access Link
Removes the Task Inbox access link; this can be used for various security reasons.
Process Context is a JSON object that contains information about an overall value stream. To check the information how process context is structured in general, see the Process Context document.
In Make scenarios, the data for the process context is sent from the Completion Event module when a scenario is completed. The data is then passed through the subsequent scenarios and process steps. However, OE can also use process context modules to search for process contexts and to update the context in the middle of a scenario run.
Search Process Contexts
The module is used to find process contexts related to a specific tenant. It uses criteria such as value streams, event fields and processing state.
List Process Contexts
The module lists process contexts that can be filtered by value stream and its versions, as well as the status and create and update dates.
Get Process Context
The module securely extracts process context of a given value stream using its instance ID. It first retrieves the current state of the process context and then outputs the relevant information.
When used by Celonis action flows, it enables secure access and extraction of information from a value stream, eliminating the need for insecure methods like chaining webhooks. This ensures that data retrieval happens within a protected framework, enhancing the security of interactions between the value stream and external systems.
Delete a Value Stream Instance
The module removes the specified value stream instance along with any associated resources, including Event Logs and Magic Links.
Terminate a Value Stream Instance
The module stops a value stream instance.
List a Process Context Event Log
The module chronologically lists all the events generated by a value stream instance. Process context logs gathered with the module can be filtered or sorted using various settings:
Updated Since - gathers event data from the specified date and time.
Query Filter - uses specified queries to collect partial data, for example: eventType:(order.spotbuy.sent,order.stock.checked).
Sort - orders the results by given criteria; the default sorting is updatedSince.asc.
Limit - the maximum number of returned results.
The output data in the module includes IDs related to the tenant and the related process context, as well as the Occurred At and Received At information.
Occurred At - the value of the ce-time CloudEvent header or, if that value is not present, the same value as Received At. The ce-time header can be used by the merchant to indicate when the event occurred on their end, which may not always align with the time they sent the event to OE. For example, if a return was initiated on Tuesday in an ERP but is only sent to OE on Wednesday, ce-time can carry Tuesday's timestamp rather than presuming the return was initiated only when OE received the event.
Received At - the time when OE received and registered the event. This may differ from the time when the event was sent because OE queues events.
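As a hypothetical sketch, the two timestamps on a single event entry from the return example above could look as follows (field names are illustrative):
{
  "eventType" : "return.initiated",
  "occurredAt" : "2025-01-07T10:15:00Z",
  "receivedAt" : "2025-01-08T08:02:31Z"
}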
Update Process Context
The module is used if you want to save the process context information during the run of the specified Make scenario, before it is completed.
In comparison to having the context information available at the end of the Make scenario process, the Update Process Context module makes it possible to store context data before going to the next step in a scenario itself, not only between the value stream steps. This data is intended for use in scenarios allowing you to adapt a scenario's behaviour according to the context and outcomes of previously run scenarios.
If you want to save the process context information during a running scenario, add the Update Process Context module to your scenario, and in the module settings add the contextID, select the Event Type, and use the mapping for the JSON payload.
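For illustration, a hedged sketch of the JSON payload mapped in the module - assuming a hypothetical event type named stock_checked whose content has two fields; following the .<event_name>.<content_variable_name> convention, the values would then be reachable in the context as .stock_checked.available and .stock_checked.warehouse:
{
  "available" : true,
  "warehouse" : "DE-01"
}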
Set Variable from Process Context
The module is used for mapping process context data.
Using the context data gathered throughout a running scenario requires mapping this context data while setting up new scenarios in the same value stream. You can do the mapping using the Set Variable from Process Context module that works with events fields in a type safe way. The mapping is based on selecting only the whole context object and the fields that you want to assign.
In the module's configuration, you can select an event field that you want to use in the specific value stream and assign its output to a Make Variable. You can also specify whether the module should throw an error if the event type's value isn't present. Both primitive and complex types can be assigned.
The process context object mapping is supported for the Data object sent through the Trigger Event module, and the Context object from Search Process Contexts or Get Process Context modules.
To make the Set Variable from Process Context module work, you need to select the value stream and an example of the value stream instance that had run in the past. The value stream and instance ID allow you to select the Event Type field that you want to extract.
Set Multiple Variables from Process Context
It is recommended for type-safe assignment of process context variables. The module extracts event fields from the process context, either passed into a scenario or loaded from the API, and assigns their values to variables. It serves as a more user-friendly alternative to mapping context fields using notation.
Resume Event
The module is used as a wake up trigger for a paused value stream instance.
These modules are designed for uploading, storing, and browsing data. They work with forms for data storage and retrieval, making it easy to manage and access information. The datastore can be used for various purposes, such as listing active users, managing workspace settings, or creating lookup tables.
Set a Datastore from a CSV Upload
The module is configured for creating and populating a datastore with structured data. To upload a datastore, you first create a namespace to organize the data. Then, you define a unique key column for each row, ensuring data integrity. The module supports CSV file uploads using an HTTPS mapping, allowing flexibility in data sourcing.
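For example, a minimal CSV of the kind the module could ingest - the columns are hypothetical, with id acting as the unique key column for each row:
id,name,active
u-001,Alice,true
u-002,Bob,false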
List Values in a Datastore
This module enables listing values from the datastore. Access to the data is controlled through a unique Magic Link ID, which serves as the single entry point to retrieve stored values. Make sure the Magic Link you use in the module is active and valid.
Get Account
The module is used to fetch information about your OE tenant data needed for the Make scenario.
Using OE and Make modules you can retrieve data from Celonis Knowledge models for your value streams. To see how the data flow between OE and Celonis works and how to use the Celonis Intelligence API modules, check the Data Flow between OE and Celonis documentation.
The CE-related modules are responsible for various aspects of commerce processes. They are organized into the following categories:
Artificial Intelligence
Catalog, Product and Price
Users and Permissions
Company and Customers
Rewards and Promotions
Quote Management
Checkout
Order and Order Fulfillment
Media Management
You can create your own data extractor based on an Emporix template. The steps below show how to work with the extractor template and how to use it for your custom configuration.
Start with creating a connection to your Celonis Data Pool that links your OE as the Data Source:
In your Celonis account, go to Data Integration -> Your Data Pool -> Data Connections and choose Add Data Connection.
Choose Connect to Data Source.
Go to the Custom section and choose Build custom connection -> Import from file.
Enter the Name for your custom extractor. Optionally, you can also add a Description for it.
Download the Value Stream Event Log Extractor and then upload it as a JSON file in your Celonis extractor builder:
Choose Save and Continue.
Check the parameters that should be used for the connection and then continue to the next step. The parameters were configured automatically with the uploaded extractor JSON file.
Check the authentication method, for Emporix it's the OAuth2 (Client Credentials) method with the following endpoint and type details:
Get Token Endpoint: {Connection.AUTH_URL}/oauth/token
Grant Type: client_credentials
Don't modify this configuration. Continue to the next step.
Click on the process context endpoint to check and customize its configuration.
The response that is visible in the endpoint configuration, is the part that you have to customize. In the JSON file, enter all the tenant and process context data information that is needed for your custom connection and for getting the proper response.
Make sure the context$processid is checked as a Primary Key. Without the context$processid it is not possible to link the child tables back to the parent.
Choose Finish to save your custom connection configuration.
Go to the Data Connections list again and choose Add Data Connection -> Connect to Data Source -> Custom. Your newly created connection is visible there, under Custom connections.
Choose your connection and check its configuration details to make sure all the authorization details like Client ID or Client Secret are added. Save your changes.
You can find the Client ID and Client Secret in Emporix Developer Portal - .
Having the connection, you can also adjust the data that will be imported to your Celonis account.
To start with the data job configuration, go to Data Pools -> Your Data Pool -> Data Jobs and choose Add Data Job. Add a name for the Data Job and choose your Data Connection.
In the Custom Data Job, choose +Add Extraction and select all the available tables: account, digital_process, event_type, process_context.
Now you can add additional filters to set some limitations on the loaded context and its content; see the examples below that you can use for reference:
To get the data only from the finished value stream runs (and to exclude gathering the data of the value streams that are running), you can add a filter that fetches data only from the value streams with the finished status.
Go to Data Pools -> Your Data Pool -> Data Jobs -> Extractions -> Your Extraction -> process_context.
In the Filter Statement section, add status = 'finished'. This is an optional field, but it ensures that you only get data from the completed processes.
To get the data only from the finished value stream runs and to make sure only the changed data is loaded, you can use a filter for the finished status and modified time.
Go to Data Pools -> Your Data Pool -> Data Jobs -> process_context.
Create a new parameter for the modified time - <%=max_modified_time%>.
Go to Your Extraction -> Parameters tab, choose New Parameter, and provide the following values in the fields:
type: dynamic
table: process_context
column: metadata$updatedAt
operation type: FIND_MAX
data type: date
default date: any date from the past
Go to process_context and in the Delta Filter Statement section add status = 'finished' AND updatedSince = <%=max_modified_time%>.
If you want to limit the date from which you load the post action feedback data, use the createdSince filter.
In Celonis, go to the Data Pools -> Data Processing section and choose Data Jobs.
Go to Data Pools -> Your Data Pool -> Data Jobs -> Your Data Job -> Extractions and choose Your Extraction task.
Open the process_context configuration.
Go to Time Filter and configure the filter to customize the creation period.
You can customize the process context extractor to include live, draft, or live and draft value stream instances.
Go to Data Pools -> Your Data Pools -> Data Jobs -> Choose Your Data Job -> Extractions and choose the extraction task.
Go to Parameters and choose New parameter.
Set up the new parameter to include draft, live or draft and live scopes. You can check the example:
Placeholder: includeScopes
Name: Include Scopes
Description: you can add your text about the parameter here for future reference
Type: private
Data type: text
Values: you can decide if you want to limit the data to live or draft only
draft includes the instances of draft value stream versions
live includes the instances that are published and active
live,draft includes both live and draft instances
After creating the parameter you have to create a filter for it in the process context.
In the Data Jobs -> Your Data Job -> Extractions -> Table Configuration choose process_context.
Go to Additional Filter section and add filters for the Included Scopes:
Filter Statement - includeScopes = '<%=includeScopes%>'
Delta Filter Statement - updatedSince = <%=max_modified_time%> and includeScopes = '<%=includeScopes%>' - this example includes setups for both time and scope modifications.
To make it possible to generate an activity log in Celonis, go to Data Pools -> Your Extractor -> Data Jobs -> Transformations -> Extract Event Log and use the script below.
The prepared script creates an activity log table based on the name of the value stream and supports delta loads. The script creates a single activity for each event that is triggered in a value stream.
Set up the Data Model to establish relations between all of the extractor's components.
Go to Your Extractor -> Data Model and choose Add Data Model.
Select all the items to be added in the Activity Table and choose Next.
In the Activity Table setup, you can configure all the activity columns. Choose the activity_log, select the following in the columns, and then confirm with Finish.
Case ID - value stream ID
Activity name - the event_name
Time stamp - for occurred at
Sorting - for a better data display
Set up all the Foreign Keys for your activity table. For example, set the ID key for the Account.
Choose New foreign key in the account settings.
Connect the ID field from a Dimension table to a digital_process id in a Fact table.
Mandatory relations:
Link Process Context (Dimension) with Activity Log (Fact) by linking activity_log.case_id and process_context.context$processid.
Link Value Stream (Dimension) with Process Context (Fact) by linking digital_process.id and process_context.context$executionTemplateID.
Recommended relations:
Use Event Log as activity table
Link Process Context (Dimension) with the additional tables (Fact) using process_context.context$processid to load any additional data as a part of your post action feedback event
Optional relations:
Link Value Stream (Dimension) with Value Stream Triggers (Fact) by linking digital_process.id and digital_process$trigger.digital_process_id.
If you want OE multi-tenant separation of data, link Account (Dimension) with Value Stream (Fact) by linking account.id and digital_process.tenant.
To see an example of a configured data model, check the sample Data Model below:
Process Context: the central component; it belongs to one value stream
Value Stream: it can have many process contexts and many value stream trigger events, but only one account
Event Log: it has a 1:1 relationship with the process context
Account: represents the OE tenant; it can have many value streams
Execute a data load based on your configuration and created connection.
Go to Data Pools -> Data Jobs and choose Execute Data Job. See the example:
Choose Execute Selection.
The job starts and you can already see the process logs.
When the process is finished, you can check the logs details.
{
"id" : "",
"name" : "",
"expirationDate" : "",
"stylesheet" : "",
"submissionEventType" : "",
"canBeSubmittedOnlyOnce" : false,
"instanceId" : "",
"scheduleId" : "",
"createdAt" : "",
"updatedAt" : "",
"recipient" : {
"company" : {
"id" : "",
"name" : ""
},
"contacts" : [
{
"id" : "",
"name" : "",
"value" : "",
"language" : ""
}
]
},
"mixins" : {
"topicsNames" : "",
"component" : {
"escalated" : "",
"name" : ""
},
"internalContact" : {
"email" : "",
"name" : "",
"preferredLanguage" : ""
},
"internalContact2" : {
"email" : "",
"name" : "",
"preferredLanguage" : ""
}
},
"forms" : [
{
"id" : ""
}
],
"submissions" : [
{
"submittedAt" : ""
}
],
"customAttributes" : [
{
"name" : "",
"values" : [
""
]
}
]
}

-- target table: one row per magic link with its submission count
create table if not exists magic_link_submissions (
magic_link_id varchar2(26),
num_submissions integer,
PRIMARY KEY(magic_link_id)
);
DROP TABLE IF EXISTS magic_link_submissions_new;
create table if not exists magic_link_submissions_new (
magic_link_id varchar2(26),
num_submissions integer,
PRIMARY KEY(magic_link_id)
);
INSERT into magic_link_submissions_new
select ml.id, count(submissions."submittedAt")
from magic_link as ml left outer join "magic_link$submissions" as submissions on submissions."magic_link_id" = ml.id
group by ml.id;
merge into magic_link_submissions
using magic_link_submissions_new
on magic_link_submissions.magic_link_id = magic_link_submissions_new.magic_link_id
WHEN NOT MATCHED THEN
insert (magic_link_id, num_submissions)
VALUES (magic_link_submissions_new.magic_link_id, magic_link_submissions_new.num_submissions);

{
"options" : [
{ "id": "americano", "label" : "Americano" },
{ "id" : "flat white", "label" : "Flat White" },
{ "id" : "latte", "label" : "Latte" },
{ "id" : "black" , "label" : "Black" }
]
}

-- activity log table: one activity row per event triggered in a value stream
CREATE TABLE IF NOT EXISTS activity_log(
digital_process_name varchar(256),
event_name varchar(256),
event_key varchar(256) PRIMARY KEY,
event_id varchar(26),
case_id varchar(256),
occurred_at TIMESTAMP,
read_at TIMESTAMP,
sorting BIGINT);
DROP TABLE IF EXISTS activity_log_new;
CREATE TABLE activity_log_new(
digital_process_name varchar(256),
event_name varchar(256),
event_key varchar(256),
event_id varchar(26) PRIMARY KEY,
case_id varchar(256),
occurred_at TIMESTAMP,
read_at TIMESTAMP,
sorting BIGINT);
INSERT INTO activity_log_new
SELECT "digital_process"."name" as "digital_process_name",
"event_type"."name" as "event_name",
"event_type"."key" as "event_key",
"process_context_event"."id" as "event_id",
"process_context"."context$processid" as "case_id",
"process_context_event"."occurredAt" as "occurred_at",
"process_context_event"."readAt" as "read_at",
TIMESTAMPDIFF(MILLISECOND, '1970-01-01 00:00:00', "process_context_event"."readAt") as "sorting"
FROM "digital_process", "process_context", "process_context_event", "event_type"
where "process_context"."context$executionTemplateID" = "digital_process"."id"
and "process_context_event"."process_context$context$processid" = "process_context"."context$processid"
and "event_type"."key" = "process_context_event"."eventType" ;
MERGE INTO activity_log
USING activity_log_new
ON activity_log_new.event_id = activity_log.event_id
WHEN NOT MATCHED THEN
INSERT (digital_process_name, event_name, event_key, event_id, case_id, occurred_at, read_at, sorting)
VALUES (activity_log_new.digital_process_name,
activity_log_new.event_name,
activity_log_new.event_key,
activity_log_new.event_id,
activity_log_new.case_id,
activity_log_new.occurred_at,
activity_log_new.read_at,
activity_log_new.sorting);
DROP TABLE IF EXISTS activity_log_new;
read_at - the point in time when an event is saved; it is later used for sorting
See how you can transfer form submission data from OE to Celonis using dedicated extractors.
You can manage importing form submission data into Celonis through the Forms Extractor with different kinds of filtering. You can filter by form IDs for precise data retrieval, or employ the extractor to handle submissions from all form types.
To extract the data from OE to Celonis, you have to create a Data Pool in your Celonis account. Celonis uses Data Pools to collect and cluster data information to set up an integration flow. To learn about the details, see the Celonis Data Pools documentation.
You can create your own data extractor based on an Emporix template. The steps below show how to work with the extractor template and how to use it for your custom configuration.
Start with creating a connection to your Celonis Data Pool that links your OE as the Data Source:
In your Celonis account, go to Data Integration -> Your Data Pool -> Data Connections and choose Add Data Connection.
Choose Connect to Data Source.
Go to the Custom section and choose Build custom connection -> Import from file.
Enter the Name for your custom extractor. Optionally, you can also add a Description for it.
Download the Forms Extractor and then upload it as a JSON file in your Celonis extractor builder:
Choose Save and Continue.
Check the parameters that should be used for the connection and then continue to the next step. The parameters were configured automatically with the uploaded extractor JSON file.
Check the authentication method; for Emporix it's the OAuth2 (Client Credentials) method with the following endpoint and type details:
Get Token Endpoint: {Connection.AUTH_URL}/oauth/token
Grant Type: client_credentials
Don't modify this configuration. Continue to the next step to finish the process.
At the end you can see the Defined Endpoints.
Having configured the data connection, you now need to establish the connection with the Emporix system. Go to the Data Connections list again and choose Add Data Connection -> Connect to Data Source -> Custom. The data connection you created with the forms extractor is visible there, under Custom connections.
Choose your connection and check its configuration details to make sure all the authorization details like Client ID or Client Secret are added. Additionally, provide the form slug information to configure the right data filtering. Save your changes.
You can find the Client ID and Client Secret in Emporix Developer Portal - .
The slug details can be checked in two ways:
within the Management Dashboard, in the Forms view
with an API call, for example by calling the Get a Form Make module
When the connection to Emporix is established, you can now configure the responses that you get. In the endpoints configuration, choose the Form endpoint to check and customize its configuration.
The response configuration that is visible in the endpoint settings can be generated automatically.
Use the Configure response using samples from your source system option to generate the example from the source system. Expand the field -> Choose Use existing Data Source -> Choose your data source -> Choose Build Response. This autogenerates the response configuration, which you can then see in the JSON response.
Similarly as for the Forms endpoint, choose the Form Submission endpoint to check and customize its configuration.
The response configuration that is visible in the endpoint settings can be generated automatically.
Make sure the id is checked as a Primary Key.
Use the Configure response using samples from your source system option to generate the example from the source system. Expand the field -> Choose Use existing Data Source -> Choose your data source -> Choose Build Response.
This autogenerates the response configuration. You can check that the JSON was updated with submission data configuration, for example:
Choose Finish to save your connection configuration.
With the forms extractor, you can configure the data jobs to either upload data related to form submissions only, or to form submissions with magic links.
To start with the data job configuration, in your Data Pool go to Data Jobs and choose Add Data Job. Add a name for the Data Job and choose the Data Connection you created for the forms.
For example, a data job named Forms extraction with a related orbit-control extraction.
After creating the data job, you can add the forms extractions.
Choose Add in the Extractions section, add a Name for the extraction and save it.
Choose Add Extraction in the Table Configuration.
Select form from the list of available tables. Save the selection.
As a result, you should see the form table added to the extraction.
Go to Extraction Settings, choose option B as the load configuration and save it.
Go back to the Data Jobs view. Choose Execute Data Job. This starts a new task and populates a schema that is later visible in the Transformations. To check logs after a job execution, go to Logs tab and select the execution task that you want to view. You can see all the details related to the executed job.
Go back to the Data Jobs view and choose Add in the Transformations section. Enter a name for the transformation and save it. The schema is automatically uploaded.
You can now add Transformation Statements and execute them. For example: select * from "form_submission$data$visiting" or select * from "form_submission".
Whenever a transformation is executed you can see the Output of the job.
The forms extractor includes a magic links extractor, which enhances its functionality. Using the extractor, you can extract form submissions and also track who has been sent a form, regardless of whether they have completed or submitted it. By specifying the form type, the extractor loads the relevant magic links, providing visibility into who has received the form, not just the responses they have submitted.
To have magic links data included in the extraction:
Choose Add in the Extractions section, add a Name for the extraction and save it.
Choose Add Extraction in the Table Configuration.
Select form from the list of available tables. Save the selection.
As a result, you should see the form table added to the extraction.
Go to Extraction Settings, choose option B as the load configuration and save it.
Go back to the Data Jobs view. Choose Execute Data Job. This starts a new task and populates a schema that is later visible in the Transformations. To check logs after a job execution, go to Logs tab and select the execution task that you want to view. You can see all the details related to the executed job.
Go back to the Data Jobs view and choose Add in the Transformations section. Enter a name for the transformation and save it. The schema is automatically uploaded.
You can now add a Transformation Statement and execute it:
With the forms extractor, you can configure the data model to either upload data related to form submissions only, or to form submissions with magic links.
Set up the Data Model to establish relations between all of the extractor's components.
Go to Your Extractor -> Data Model and choose Add Data Model.
Select forms related items for the table and choose Next.
You can skip the activity table configuration.
Set up all the Foreign Keys for your activity table.
For example, set the ID key for the form:
Choose New foreign key in the form settings.
Connect the id field from a Dimension table to a form$id in a Fact table.
The next relations to set up are:
Link form (Dimension table) with form_submission (Fact table) by linking id and form$id.
Link form (Dimension table) with form$customAttributes (Fact table) by linking id and form_id.
To see an example of a configured data model, check the sample Data Model below:
After setting up the relations, go to Data loads tab and choose Load Data Model.
When the load is finished you can check the details of the currently loaded data model.
Set up the Data Model to establish relations between all of the extractor's components including magic links data.
Go to Your Extractor -> Data Model and choose Add Data Model.
Select the items related to forms and magic links for the table and choose Next.
You can skip the activity table configuration.
Set up all the Foreign Keys for your activity table.
For example, set the ID key for the form:
Choose New foreign key in the form settings.
Connect the id field from a Dimension table to a form$id in a Fact table.
The next relations to set up are:
Link form (Dimension table) with form_submission$data$visiting (Fact table) by linking id and form_submission_id.
Link form (Dimension table) with form_submission (Fact table) by linking id and form$id.
To see an example of a configured data model, check the sample Data Model below:
After setting up the relations, go to Data loads tab and choose Load Data Model.
When the load is finished you can check the details of the currently loaded data model.
Execute a data load based on your form submission configuration and created connection.
Go to Data Pools -> Data Jobs and choose Execute Data Job. You can select to execute a Delta or a Full Load. Delta loads only the part of data that changed since the last upload.
Choose Execute Selection. The job starts and you can already see the process logs.
When the process is finished, you can check the logs details.
To enable delta loads for submissions, use the submittedAtGte filtering. You can configure the executions to make the submitted at column visible and to upload data only from the specified time.
Go to Data Jobs -> Your extraction task -> Table Configuration.
Enable the creation date filter.
Create a new parameter for the modified time - <%=max_modified_time%>.
Go to Your Extraction -> Parameters tab, choose New Parameter, and provide the following values in the fields:
type: dynamic
table: form_submission
Use the Delta filter statement to define the parameter settings, for example: submittedAtGte = <%=max_modified_time%>.
Similar to form submissions, you can use the Enable change date filter to configure your delta loads for magic links. You can set a date from which the updated data is fetched with the updatedAtGte filtering.
You can also configure the executions to make the updated at column visible and to upload data only from the specified time.
Go to Data Jobs -> Your extraction task -> Table Configuration.
Create a new parameter for the modified time - <%=magic_link_max_modified_time%>.
Go to Your Extraction -> Parameters tab, choose New Parameter, and provide the following values in the fields:
By default, the forms extractor uploads data from the latest submitted version of a form associated with a given magic link. If users might submit the same form more than once, and you want to upload data covering all of its submitted versions, you can override the default behavior by changing the latest parameter settings.
To change the default setting:
Go to the Defined Endpoints and choose the Form Submission endpoint to open its configuration.
Go to the Configure Request section and expand Request parameters.
Choose to edit the latest parameter and change the value from true to false. By doing this, you make sure that not only the latest submission of a form is uploaded, but all the versions of it submitted by a specific user using a single magic link.
Go to Data Jobs and execute a task. Afterwards, you can view the execution logs with the updated setup and the number of loaded records.
Link form$customAttributes (Dimension) with form$customAttributes$values (Fact) by linking form_id and form$customAttributes_form_id.
Link magic_link (Dimension table) with form_submission (Fact table) by linking id and magicLinkId.
Link magic_link (Dimension table) with magic_link_submissions (Fact table) by linking id and magic_link_id.
column: submittedAt
operation type: FIND_MAX
data type: date
default date: any date from the past
table: magic_link
column: updatedAt
operation type: FIND_MAX
data type: date
default date: any date from the past
Build upon and improve the created value stream.
Having the first Hello World value stream set up, we can now add conditional logic to the way it works. The condition includes sending an additional email notification, but only if a sender provides a specific response to a form. To use conditional logic and add a condition step with a decision rule in the Hello World value stream, we will first prepare a new Make scenario and then create a business rule in the OE Rulestore.
Similar to the , we will create a scenario named Send Outreach on Mixed Feedback that sends another email to forms recipients.
To build the scenario:
"data" : {
"visiting" : [
0
],
"select" : 0,
"starship" : 0,
"requestPermissionToLand" : false,
"shipCategory" : "",
"doomed" : false,
"_magicLinkCustomAttributes" : {
"planet" : [
{
"name" : "",
"residents" : [
{
"id" : "",
"name" : ""
}
]
}
]
}
}
create table if not exists magic_link_submissions (
magic_link_id varchar2(26),
num_submissions integer,
PRIMARY KEY(magic_link_id)
);
DROP TABLE IF EXISTS magic_link_submissions_new;
create table if not exists magic_link_submissions_new (
magic_link_id varchar2(26),
num_submissions integer,
PRIMARY KEY(magic_link_id)
);
INSERT into magic_link_submissions_new
select ml.id, count(submissions."submittedAt")
from magic_link as ml left outer join "magic_link$submissions" as submissions on submissions."magic_link_id" = ml.id
group by ml.id;
merge into magic_link_submissions
using magic_link_submissions_new
on magic_link_submissions.magic_link_id = magic_link_submissions_new.magic_link_id
WHEN NOT MATCHED THEN
insert (magic_link_id, num_submissions)
VALUES (magic_link_submissions_new.magic_link_id, magic_link_submissions_new.num_submissions);
Go to OE -> Events -> Event Registry and create a new hello_outreach_sent event named Hello World Outreach Sent.
Go to Value Streams and open the Hello World process that you created in the first trail.
Add a new process step and choose to Create New Scenario. This directs you to a Make scenario template.
As a first step of the Make scenario, establish the connection in the Trigger Event module.
Add the Set Variable from Process Context module and configure it with the following details:
Assign the Data from the Trigger Event.
Select the Hello World value stream.
Use the Instance ID as the Example Instance. You can select the ID from the drop-down list; the ID at the top is the one from the latest instance.
Add the second Set Variable from Process Context module and configure it with the following details:
Assign the Data from the Trigger Event.
Select the Hello World value stream.
Use the latest Instance ID as the Example Instance. You can select the ID from the drop-down list. The ID at the top is the one from the latest instance.
Add the Gmail (Send an Email) module. We use Gmail as the example email provider, as in the Setting up a First Value Stream trail, but it can be any other provider.
In the Completion Event, choose the Hello World Outreach Sent event as the Event Type.
Save the scenario and choose Run once. The scenario should look like this:
Go to OE and open the Hello World value stream. Add a new process step with the new scenario; the Send Outreach on Mixed Feedback scenario is now visible in the available scenarios list. Choose it as the process step.
Choose Publish and then ensure that the value stream is active.
Ensure that the Send a Survey on Hello World and Send Outreach on Mixed Feedback scenarios in Make are active as well.
To test, run a new instance of the value stream by sending a new trigger event. You can send the same event using the Postman request or Celonis action flows that you used in the Setting up a First Value Stream trail.
After sending the event, you first get an email requesting you to submit a response to the created form. Then, after submitting the response, you get a second email with a Thank You message.
Up to this point, the Send Outreach on Mixed Feedback scenario was invoked by any response submitted by the form recipient. Now, we will limit the process so the scenario is invoked only on indifferent or sad responses. To do that, we first need to create a business rule and then apply it in the Hello World value stream as a decision on a process step.
Emporix Orchestration Engine provides a Rulestore that lets you manage business rules outside of value streams and Make scenarios. The rules are expressed with JsonLogic. Using JsonLogic, we will create a rule that identifies sentiment responses that need further outreach.
To create a new rule:
In Management Dashboard, go to OE -> Rulestore and choose Create rule.
Choose if you want to create the rule in Simple or Advanced mode.
If you choose the Simple mode, add the following values in the blocks:
Name: Sentiment Requiring Outreach
Description: for example - measure how positive recipients are
Unit: list
Expression: in
Value: indifferent, sad
If you choose the Advanced mode, add the following:
Name: Sentiment Requiring Outreach
Rule Expression: add a JsonLogic expression that evaluates to true if a variable called sentiment_requiring_outreach contains indifferent or sad (see the JsonLogic snippet at the end of this section)
Save the rule. It's now visible in the Rulestore list.
Copy the rule's ID; it will be used in the next steps.
With the business rule created in the Rulestore, we can now use it as a condition on a process step in the Hello World value stream.
To apply the business rule:
Go to the Hello World value stream and in the Send Outreach on Mixed Feedback process step choose Add logic.
Go to the Decision tab. Here, you can see the JSON Logic Expression field, where you insert the business rule that you prepared for the process step.
Enter the following expression, with your business rule ID:
The businessRule is a JsonLogic operator that refers back to the rule managed in the Rulestore.
The first part is the rule ID, which you copy from the Rulestore and paste into the expression.
The second part maps the rule's variables to the relevant fields in your process context. Here, we are saying that the sentiment_requiring_outreach variable takes its value from the hello_feedback_provided.forms.0.payload.sentiment field of the context event.
Choose OK to save the expression as the process step decision.
Publish the new version of the value stream and ensure it's active. Now, only the responses with indifferent or sad values result in running the Send Outreach on Mixed Feedback step.
However, the value stream completes regardless of the response that is submitted in the form.
Send the event request again with Postman or Celonis action flows. As a result, you receive the form submission request again.
In the first form submission, provide happy as the answer. This DOES NOT execute the Send Outreach on Mixed Feedback process step.
If you provide indifferent or sad as the response, the Send Outreach on Mixed Feedback step IS invoked and you receive another email with the Thank You form and the next-steps information that you included in the text.
You can also use the debugger mode to verify that the value stream works properly. Go to the Hello World value stream and choose the magnifier icon to open the debug mode.
If the response was happy, you can see that the step was filtered.
If the response was indifferent or sad, the step was invoked.
It is also possible to use events themselves as conditions on process steps. Multiple events can be assigned to a step or group of steps, and both IN (OR) and AND operators are supported. This is the preferred option when testing merely for the presence of an event is enough.
{
  "businessRule": [
    "01K18FK7DHTM53T9M5GJYM57GA",
    [
      "sentiment_requiring_outreach",
      {
        "context": "hello_feedback_provided.forms.0.payload.sentiment"
      }
    ]
  ]
}
Choose Hello World Feedback Sent as an Event Type.
Choose email from the Field drop-down list.
Type email in the Variable Name field.
Choose Hello World Feedback Provided as an Event Type.
Choose forms.0.payload.firstName in the Field drop-down list.
Type firstName in the Variable Name field.
Choose Yes for Raise Error when Null.
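Conceptually, each Set Variable from Process Context configuration above reads one field from an event in the process context, binds it to a named variable, and optionally fails when the field is null. A hedged Python sketch of that behavior (the context shape and helper are illustrative assumptions, not the exact OE format):

def set_variable(context, event_type, field_path, raise_on_null=False):
    # Walk a dotted path such as forms.0.payload.firstName; numeric parts index lists.
    value = context[event_type]
    for part in field_path.split("."):
        value = value[int(part)] if part.isdigit() else value.get(part)
        if value is None:
            break
    if value is None and raise_on_null:
        raise ValueError(f"{field_path} is null on {event_type}")
    return value

context = {
    "hello_feedback_sent": {"email": "recipient@example.com"},
    "hello_feedback_provided": {"forms": [{"payload": {"firstName": "Ada"}}]},
}

email = set_variable(context, "hello_feedback_sent", "email")
first_name = set_variable(context, "hello_feedback_provided",
                          "forms.0.payload.firstName", raise_on_null=True)
print(email, first_name)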
{
  "in": [
    {
      "var": "sentiment_requiring_outreach"
    },
    [
      "indifferent",
      "sad"
    ]
  ]
}
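To see how this expression behaves, you can evaluate it locally with the community json-logic package for Python (pip install json-logic); this only illustrates the rule's logic and is not how the OE Rulestore executes it:

from json_logic import jsonLogic  # community package: pip install json-logic

rule = {"in": [{"var": "sentiment_requiring_outreach"}, ["indifferent", "sad"]]}

# In OE, the businessRule mapping fills sentiment_requiring_outreach from
# hello_feedback_provided.forms.0.payload.sentiment before evaluation.
print(jsonLogic(rule, {"sentiment_requiring_outreach": "happy"}))  # False -> step filtered
print(jsonLogic(rule, {"sentiment_requiring_outreach": "sad"}))    # True  -> step invoked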

Start with building a first digital process.
By going through this guide, you will create a value stream that responds to a Hello World event from a sender. The response is a feedback form that is first sent to the sender with a request for completion. In this case, the sender can be a customer whom we are asking for feedback. The value stream ends once the sender submits the feedback form. To successfully complete the trail, make sure to go through all the sections from top to bottom.
A value stream involves an initiating event followed by subsequent steps. External events act as triggers for a value stream to run; they are transmitted to an Orchestration Engine event-receiver endpoint. The subsequent process steps are a chain of actions, following one after another, that together complete the process.
To start building a value stream, follow these steps:
Create a new Hello World event, with hello_world as the event name, which works as its technical ID.
To create the event, go to OE -> Events -> Event Registry. For details, see the dedicated guides.
Create a new value stream with the hello_world event as a trigger.
To create the value stream, go to OE -> Value Streams and choose Create value stream.
Add Hello World as a name for your process.
Choose the Hello World event as a trigger for the process.
Choose Publish.
To move to the next steps, the data needs to be processed first. To do so, send some data to OE to complete the event connection. You can do this using Postman or Celonis Action Flows.
Get the Webhook Target URL and HMAC Secret and send the Hello World event following the instructions in the authentication and configuration guide. The guide includes a Postman collection that also shows you how to authenticate the request by generating the HMAC signature.
There, you can check all the details that must be included in the request.
Send the POST request with {"hello" : "world"} in the body content. The endpoint responds with a 200 OK status and a payload confirming successful receipt. Here's what the payload looks like:
If you receive 401, verify that your secret setup is correct.
If you receive any other status, verify that the webhook source URL is correct.
Note down your request_id; it may be useful later to search for the process in OE.
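If you prefer a script to Postman, the request can be reproduced in a few lines of Python. This is a hedged sketch: the URL and secret are placeholders, and the signature header name and hex encoding are assumptions for illustration; the authentication and configuration guide remains the source of truth for the exact signing scheme.

import hashlib
import hmac
import json

import requests

webhook_url = "https://<event-receiver-host>/<webhook-target-path>"  # Webhook Target URL (placeholder)
hmac_secret = b"<your-hmac-secret>"  # HMAC Secret from OE (placeholder)

body = json.dumps({"hello": "world"})
# Sign the raw body with HMAC-SHA256; header name and encoding are assumptions.
signature = hmac.new(hmac_secret, body.encode("utf-8"), hashlib.sha256).hexdigest()

response = requests.post(
    webhook_url,
    data=body,
    headers={"Content-Type": "application/json", "X-Signature": signature},
)
print(response.status_code)  # expect 200
print(response.json().get("request_id"))  # note this down for the debugger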
If you are a Celonis customer and want to trigger a value stream from EMS, you can do this using the Action Flow module. There's a public OE module that enables you to start value streams.
Add the OE module to your Action Flow.
Set up the connection to include the source, secret, and OE API client and secret, following the authentication and configuration instructions.
Select the “Hello World” event type.
To confirm successful receipt, the endpoint responds with a 200 OK status and a confirmation payload. Here's what the payload looks like:
If you receive 401, verify that your secret setup is correct.
If you receive any other status, verify that the source value is correct.
Note down your request_id; it may be useful later to search for the process in OE.
Check if your payload was retrieved and whether it activated a new value stream. You can verify this using the debugger.
Go to Value Streams and open the debug mode of your Hello World value stream. You can do this from the main view by clicking the magnifier icon, or by opening the process and choosing the Go to Debugger option. If the request worked, you should see the details of the instance run.
To confirm the data was received as expected, open the instance and check the logs.
After creating and validating the trigger event, we can now add a process step in the Hello World value stream. The process step includes a form prepared for the sender to add a feedback message.
To create the form, you can use the Form builder that is integrated with OE. For details, see the Form builder documentation.
Create a new form with Feedback as a name.
Add a new dropdown component and label it as Sentiment in the Display tab.
Go to the Data tab and configure the component to have three Data Source Values:
happy
sad
indifferent
Go to the Validation tab and flag the dropdown field as Required.
Save the configuration.
Add two separate text fields:
first with a First Name label - set it as required in the Validation tab
second with a Last Name label
Save the form. You can also check its preview to see if all the fields are added.
Update the event that you sent previously with new data: a replyTo field with an email address as the value.
Send the updated event to OE. You can use Postman or Celonis Action Flow as in the previous step.
Go to debugger to check if the data was properly received.
Copy the Instance ID, we will use it in the next steps.
Go to your Hello World value stream and add a new process step.
In the new process step, choose the Create New Scenario option.
This opens a new tab and redirects you to a scenario step template wizard in Make, with a starting module and completion event.
After going to Make, ensure you are working in the right organization; the name should reflect your tenant name in OE. To check that the organization is the correct one, hover over Make's sidebar; the name is visible at the top. Then, you can either click directly on the template to start working with it, or choose the Start guided setup option to continue.
If you click on any of Make's sidebar nodes, it switches off the template and redirects you to the selected section in Make. It's not possible to get back to the template.
Add a name for the scenario; in this example it's "Send a Survey on Hello World". Save the scenario to make sure the name is kept.
Add a webhook name in the Trigger Event module.
There's no validation requirement for the name, but we recommend using the same name for the webhook as for the scenario. Without the webhook, the value stream is marked as invalid in OE.
The webhook connection is established automatically, but you may occasionally need to set it up manually. In that case, choose Add next to the connection field in the Webhook module and provide the Connection Name, Tenant ID, and Event-Receiver Secret.
The tenant ID is the first part of the value stream ID. You can find it in two places:
In the OE -> Admin -> Settings tab, the tenant ID is displayed next to the Make Team name.
In the URL of your value stream instance.
The template shows that the last Make scenario step should be a completion event. To send the completion event, we need to create a new event in OE Management Dashboard. Go to OE -> Events -> Event Registry and create a new event with the following data:
Display name: Hello World Feedback Sent
Event name: hello_feedback_sent
Go back to your scenario editor in Make -> Completion Event module and select the connection.
Optional: If you don't have the connection established yet, configure it in the module.
Choose Add next to the Connection field.
Provide the necessary data for the Event-Receiver Endpoint and Environment.
The data for the endpoint comes from the Admin settings in OE; the data for the environment is the one you can find in the Developer Portal under your API keys - COP API.
Select the newly created event as an Event Type. If you don't see the event, try to refresh the list.
Choose Save. This creates the scenario; you can now add a relevant name for it, for example Send a Survey on Hello World.
Get the Make scenario to access the replyTo field. For this, you can use the Set Variable from Process Context module to extract the replyTo email address that we sent in the payload and assign it to a variable called email.
Add the Set Variable from Process Context module and place it between the Trigger Event and Completion Event modules.
Assign the Data from the Trigger Event.
Select the Hello World value stream.
Use the Instance ID from the debugger (see step 4) as the Example Instance. You can select the ID from the drop-down list. The ID at the top is the one from the latest instance.
Modify the Completion Event module payload to include the email. In the Payload field, select Map as JSON String and add the JSON string. The email should point to the email variable from the Set Variable from Process Context module. For example: { "email" : "{{4.email}}"} - if the reference is to email from module 4.
If you don't see the email variable in the list, try running the scenario. Even if it doesn't start, the run trigger itself updates the variables from the previous modules.
Save the scenario.
Go back to your Hello World value stream in OE and add the Send a Survey on Hello World scenario as the next process step.
Choose Publish and then activate the new version of the Hello World value stream.
Check if your value stream is activated and if it has a valid status.
An Invalid status is usually caused by a missing webhook on the trigger module, or by an improperly set up completion module.
OE revalidates each value stream with every new publishing.
To verify that your scenario works properly, do the following checks:
Go back to your scenario editor in Make. If you closed the tab with the scenario in Make, you can open it directly from the process step in the Hello World value stream.
To verify the step before activating the scenario, use the Run once option.
You should see a confirmation about a successful scenario load that is waiting for data.
Make sure there are no errors in your request, then resend the previous request using Postman or Celonis Action Flow (as in Step 2 of the Creating a skeleton scenario process step section) and check if the scenario works as expected. You should see the operation details in the Set Variable from Process Context module. After you send the request, the scenario initializes and you then get a confirmation about scenario completion.
Optionally, go to debugger and verify if the new instance has logs including the new process step that you added. If there are any errors in your Make scenario, you can navigate directly to the specific scenario logs in Make from the process step visible in the debugger.
In this step, we will use the Feedback form that we created earlier and email it to the replyTo email address. We will pre-populate one of the fields in this form. The recipient will be able to use a Magic Link to open the form. We will then change the value stream to wait until the Feedback form has been submitted (in this case, saved as a draft) before completing.
Go to OE -> Events -> Event Registry and create a new event with the following values:
Display name: Hello World Feedback Provided
Event name: hello_feedback_provided
The event captures the feedback submitted by the recipient. All the submitted form data is enclosed inside this event.
Go to your Send a Survey on Hello World scenario in Make.
Add the Create Form Magic Link module and place it after the Set Variable from Process Context module.
Set up the module configuration:
Select the Feedback form
Choose No for Assign Contact - we use the email address sent in the event
Select the Hello_feedback_provided event as the submission event type. This is what ensures that the expected event is raised when the form is submitted.
Leave Submitted Only Once
Add the Submit Form Draft module and place it after the Create Form Magic Link module.
Set up the module configuration:
Map the Magic Link ID from the previous step
Select the Feedback form to reveal the fields available for pre-population. The fields can be left empty.
The form has to be emailed to the recipient. To do this, send a message containing the magic link to the recipient. You can use any supported mail provider for this. The example below shows the Gmail module with a linked account.
Use the OE form UI to review and submit the form. In the link, match the MAGIC_LINK_ID and the TENANT_ID. The TENANT_ID is specific to the tenant for which you're creating the scenario; it can be fetched from the Trigger Event.
The whole Make scenario should look like this:
Go back to your Hello World value stream in OE and add the Hello World Feedback Provided event as a trigger, after the Send a Survey on Hello World process step. This trigger is necessary to complete the value stream; it receives the information that a form was submitted by the user.
Publish the value stream and ensure it's activated again. Send the previous event one more time from Postman or Celonis Action flow to trigger the process. The email with the survey link is sent to the specified email address.
Submit the reply to the form.
After submitting the reply, you can check if the last step of your Hello World value stream (Hello World Feedback Provided) was finished.
If the Hello World value stream trail did not work for you, see the troubleshooting hints below:
"I got the email, submitted the form, but the process did not complete."
Check if you mapped the value stream ID in the Create Form Magic Link module
Make sure you selected the Hello_feedback_provided event as the submission event type in the Create Form Magic Link module
"I created the process step with the
Send { “hello” : “world” } as the payload.
Trigger the action flow.
Enter email in the Variable Name field.
In the Value Stream Instance ID field, select the Value Stream Instance ID item from the list of items available for the Trigger Event module. This ensures the correct process is resumed when the form is submitted. The Instance ID is the ID you can find in the debugger, but here we need to map the relevant field from the trigger.
Name and Language fields can be left empty.
Choose Save to save your module configuration.
Check if there were any errors in the Make scenario sending the email
Check your junk folder for the email
Check your mail provider's logs
{
  "status": "SUCCESS",
  "message": "Request successfully handled. Request ID: req_M4XnFkQm2aPUTGqiteyN",
  "request_id": "req_M4XnFkQm2aPUTGqiteyN"
}

{
  "status": "SUCCESS",
  "message": "Request successfully handled. Request ID: req_M4XnFkQm2aPUTGqiteyN",
  "request_id": "req_M4XnFkQm2aPUTGqiteyN"
}
{
  "hello" : "world",
  "replyTo" : "[email protected]"
}