
erwin Data Transformation - User Guide


Active Configurations


If you want the scheduler to execute jobs at the scheduled time, activate the configuration by selecting its check box. Once activated, the check box changes color from yellow to green.

Please note that a configuration does not need to be active in order to schedule its jobs.

The user may schedule any of the jobs, but the jobs will not be executed if, at execution time, the configuration is not marked as “active”.



The “All the configurations” configuration is always active.

Define Job Sequences


The user may define job sequences, arranging adapters and workflows in a real logical flow; from the available adapters and their related workflows, the user chooses which items belong in the Job Sequence, and in which order:


Use the right/left arrows to include or exclude an item, and the up/down arrows to change the order.

The user may include more than one occurrence of an adapter, with different workflows, backup jobs, and custom jobs.

You can schedule a job using the standard timing parameters: the job will execute all the adapters belonging to the job sequence definition and all of the selected workflows, in the configured order.

Please take into consideration that if a workflow included in a sequence is set inactive in the workflow configuration, it still belongs to the sequence, but it appears “grayed out” in the sequence definition window and its execution is skipped when the sequence runs.
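The skip rule above can be sketched in a few lines of Python. This is an illustrative sketch only: erwin DT does not expose its job sequencer as an API, and the `Workflow` and `run_sequence` names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Workflow:
    name: str
    active: bool = True  # the "active" flag from the workflow configuration

def run_sequence(sequence):
    """Run workflows in order, skipping any marked inactive."""
    executed = []
    for wf in sequence:
        if not wf.active:
            # Inactive workflows stay in the sequence ("grayed out")
            # but are skipped at execution time.
            continue
        executed.append(wf.name)
    return executed

jobs = [Workflow("export"), Workflow("transform", active=False), Workflow("load")]
print(run_sequence(jobs))  # ['export', 'load']
```

The inactive workflow is not removed from the sequence definition; it is simply passed over when the sequence runs.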



Take into account the following internal behavior when different workflows use source objects exported from the same model: all the requests are aggregated, both in terms of object types and in terms of filters (which attributes/associations are to be exported, and the filters defined on data).

So, if the objects to be exported all at once typically number more than a few thousand, or the filters conflict, it is recommended to replicate the adapter execution, redistributing workflows to make each adapter request more efficient and avoid conflicts.

This is not mandatory, and no single rule for workflow aggregation can be given, as it depends heavily on the size and type of the required export.
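The aggregation behavior described above can be pictured as merging per-workflow export requests into a single request per model. This is a hypothetical sketch of that behavior, not an erwin DT API; the `aggregate_requests` function and the request dictionaries are illustrative assumptions.

```python
def aggregate_requests(requests):
    """Merge per-workflow export requests into one combined request per model."""
    merged = {}
    for req in requests:
        agg = merged.setdefault(req["model"],
                                {"object_types": set(), "filters": set()})
        # Object types and data filters from every workflow sharing the model
        # end up in the same adapter request.
        agg["object_types"] |= set(req["object_types"])
        agg["filters"] |= set(req["filters"])
    return merged

reqs = [
    {"model": "CRM", "object_types": ["Entity"], "filters": ["status='approved'"]},
    {"model": "CRM", "object_types": ["Attribute"], "filters": ["owner='DBA'"]},
]
merged = aggregate_requests(reqs)
# One combined request for model "CRM" now carries both object types and both
# filters; potentially conflicting filters like these are why splitting the
# adapter execution across sequences can help.
```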

Sometimes, if the DT Windows Services are forced to stop, or in case of a severe internal error, sequences may enter an inconsistent state that prevents other jobs from being executed. In that case the administrator may use the “Reset sequences” item in the “Tools” menu (see Utilities).

Multi Source Jobs


When the user defines more than one source in a workflow configuration (a workflow with multiple sources), the system creates a special job named “MultiSource Job”, shown in the Job Sequencer and Job Scheduler sections of that workflow configuration:

You can schedule it using the standard timing parameters: the job will execute all the adapters belonging to the source definition of the workflow, which will then trigger the workflow execution itself.

Workflows that have any of those adapters as a single source, or in a different multi-source definition, will not be triggered.
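The trigger rule above can be summarized as: a workflow fires only when the set of completed adapters matches its own source definition exactly. This is a hedged sketch of that rule under that interpretation of the text; `should_trigger` and the adapter names are hypothetical, not part of erwin DT.

```python
def should_trigger(workflow_sources, completed_adapters):
    """A workflow is triggered only by its own exact multi-source definition."""
    return set(workflow_sources) == set(completed_adapters)

# Workflow A is defined on sources {"adp1", "adp2"}; workflow B on {"adp1"} alone.
print(should_trigger({"adp1", "adp2"}, {"adp1", "adp2"}))  # True: exact match
print(should_trigger({"adp1"}, {"adp1", "adp2"}))          # False: different source set
```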

Custom jobs


If needed, users can configure a custom job to run any external executable that can be launched from a command line, using the “Add” button at the bottom of the job list, available under “All the configurations”.

To configure a custom job, besides the job name and description, the user must provide the environment information: the IP of the server on which the execution is to be launched, and the username and password of a Windows account (as specified in the Windows domain) that has permission to log in to the server and run the executable.

The user must also provide the physical path, including the file name, of the executable on the server, and any optional parameters required by the executable, enclosed in quotes if needed, exactly as they would be written on a command line.

The “Wait For Return” option forces the launching process to wait for the external process to end and collect its exit code. In this case it is mandatory to define a timeout in minutes, to prevent DT from hanging if the executable does not exit within the given time.

When “Wait For Return” is not checked, the user may define a number of minutes that DT will wait before scheduling the next job, if the custom job is included in a job sequence.

After saving the configuration, the user can test the execution of the job:

In case of error, DT will provide the return code of the process execution.
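The “Wait For Return” semantics can be illustrated with Python's `subprocess` module. This is a sketch of the behavior described above, not DT's actual implementation; the `run_custom_job` function and its arguments are assumptions made for illustration.

```python
import subprocess
import sys

def run_custom_job(command, wait_for_return=True, timeout_minutes=1):
    """Launch an external executable, optionally waiting for its exit code."""
    if not wait_for_return:
        # Fire and forget: start the process and let the scheduler move on.
        subprocess.Popen(command)
        return None
    try:
        # Block until the process exits, or give up after the timeout
        # so the scheduler does not hang on a stuck executable.
        result = subprocess.run(command, timeout=timeout_minutes * 60)
        return result.returncode  # the code reported back in case of error
    except subprocess.TimeoutExpired:
        return None

# Placeholder command: a process that exits with code 3.
code = run_custom_job([sys.executable, "-c", "raise SystemExit(3)"])
print(code)  # 3
```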

You can schedule it using the standard timing parameters: the job will launch the external executable.
