
erwin Data Transformation 9.2.3 - User Guide


Running a DB Adapter on Event with Parameters



When you configure a DB adapter to execute on event with a trigger file, you can add parameters that drive the query result set.

Suppose you want to configure a DB Query Adapter named “MY ADAPTER” that extracts the detail of only the Applications that have been updated or inserted in a table named “MY_TABLE”. The Applications are selected by their ID.

The SQL query to do this would look like:

SELECT * from MY_TABLE where APPLICATION_ID in (131, 156, 653)

When triggering DT on event, you may want this adapter to process only the applications that were updated or inserted in the DB table. The IDs of the updated or inserted Applications can be obtained and passed to the adapter using a parameter in the SQL query, for example:

SELECT * from MY_TABLE where APPLICATION_ID in (<%APP_ID_LIST%>)

In the SQL query, <%APP_ID_LIST%> is the parameter name; it matches the name of the parameter in the XML trigger file.

Please note the syntax required by DT: parameter names must be enclosed by the prefix “<%” and the suffix “%>”, as in <%PARAM_NAME%>.

The parameter values must be written into the XML trigger file each time the SQL trigger executes on the table where the Application detail is updated or inserted.

The XML trigger file must contain the following:


      <PARAMETER name="APP_ID_LIST" value="345, 436"/>


The parameter name is hard-coded into the SQL trigger, and the values are the IDs of the changed Applications, which are written directly into the XML trigger file by the SQL trigger. These values can be different each time the database trigger executes.

When the DT database query is executed, the values are substituted for the parameter name, as follows:

SELECT * from MY_TABLE where APPLICATION_ID in (345, 436)
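The substitution step can be illustrated with a short Python sketch. This is not DT's actual implementation; the `adapter` attribute on `CC_TRIGGER` and the function name are assumptions for illustration:

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical trigger file content, mirroring the example above.
# The "adapter" attribute name is an assumption for illustration.
trigger_xml = (
    '<CC_TRIGGER adapter="MY ADAPTER">'
    '<PARAMETER name="APP_ID_LIST" value="345, 436"/>'
    '</CC_TRIGGER>'
)

def substitute_parameters(sql: str, trigger_xml: str) -> str:
    """Replace each <%NAME%> token in the SQL with the matching
    PARAMETER value from the trigger file; unknown tokens are
    left unchanged."""
    root = ET.fromstring(trigger_xml)
    params = {p.get("name"): p.get("value")
              for p in root.iter("PARAMETER")}
    return re.sub(r"<%(\w+)%>",
                  lambda m: params.get(m.group(1), m.group(0)),
                  sql)

sql = "SELECT * from MY_TABLE where APPLICATION_ID in (<%APP_ID_LIST%>)"
print(substitute_parameters(sql, trigger_xml))
# SELECT * from MY_TABLE where APPLICATION_ID in (345, 436)
```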


Parameters included in the trigger file can also be used in workflow transformations, wherever fixed values are accepted.


Special parameters include:

<%SYSDATE%>: populates a fixed-value column, or a filter value, with the system date and time (UTC format); if the value <%SYSDATE%> is entered, DT replaces the parameter with the datetime value at execution time.

<%BLANK%>: in the Replace Text transformation, when you want it to be applied to an empty value.
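The two special parameters can be sketched as follows. This is illustrative only, not DT's code, and the exact datetime format DT emits is an assumption (ISO 8601 UTC is used here):

```python
from datetime import datetime, timezone

def resolve_special(value: str) -> str:
    """Illustrative handling of DT's special parameters:
    <%SYSDATE%> becomes the current UTC date and time,
    <%BLANK%> becomes an empty string; anything else is
    returned unchanged."""
    if value == "<%SYSDATE%>":
        return datetime.now(timezone.utc).isoformat()
    if value == "<%BLANK%>":
        return ""
    return value

print(resolve_special("<%BLANK%>"))    # prints an empty line
print(resolve_special("<%SYSDATE%>"))  # current UTC datetime
```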


Running a Sequence on event



In this version, you can define job sequences not only on a schedule basis but as a real logical flow: you choose among the available adapters and related workflows, which ones to include, and in which order they run in the Job Sequence:



The job sequences can be launched on demand, on schedule, or on event in a similar way to adapters.

A sample of the required trigger file can be obtained by pressing the “Trigger sample” button:

<?xml version="1.0" encoding="UTF-8"?>

<CC_TRIGGER sequence="MySequenceJob">

      <PARAMETER name="MY_CATEGORY_PARAM" value="sample_value" />

</CC_TRIGGER>
The sample contains all the parameters included in the referenced adapter/workflow definitions (where “sample_value” must be replaced by the actual “MY_CATEGORY_PARAM” value when producing the real trigger file).
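Producing the real trigger file can be sketched in Python. The function name and the output file name are placeholders (the actual file name must follow the naming convention stated below), and this is not part of DT itself:

```python
import xml.etree.ElementTree as ET

def write_trigger_file(path: str, sequence: str, params: dict) -> None:
    """Write a CC_TRIGGER file for a job sequence, following the
    structure of the 'Trigger sample' shown above, with actual
    parameter values in place of the sample ones."""
    root = ET.Element("CC_TRIGGER", sequence=sequence)
    for name, value in params.items():
        ET.SubElement(root, "PARAMETER", name=name, value=value)
    ET.ElementTree(root).write(path, encoding="UTF-8",
                               xml_declaration=True)

# Placeholder file name; the real name must follow DT's naming convention.
write_trigger_file("MySequenceJob_trigger.xml", "MySequenceJob",
                   {"MY_CATEGORY_PARAM": "actual_value"})
```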

The trigger files must comply with the naming convention:


Running DT from Cloud Platform


Collector4Cloud provides EA Agile/EA Agile V3 users with the capability to interact with the on-premise DT:

View available DT flows / job scheduler

View a workflow definition, in terms of source/target Adapters and configuration

View a Job Sequence definition, in terms of included workflows and custom jobs

Run a specific flow or job sequence on demand or by schedule, following the execution status

View the data managed by the flow to the target system / people

Check whether the flow exited with no data

Check whether there were errors, and view the logs if so

Have a complete, updated view of the daily operations log


Both erwin EA Agile and erwin EA Agile V3 are supported, depending on the specific configuration (see Cloud Environment). The Listener Windows service enables the on-premise application to work with the cloud:

Job (workflows and sequences) definitions are pushed to cloud

Job requests are pulled from cloud

one-time requests – drag and drop to the “Requested” status Kanban

on schedule – setting the datetime field “Schedule Job for”





Please note that repetitive schedules can only be configured from the on-premise application.


Users can monitor the execution of jobs through different views, fields, and linked reports:

“Last execution Dataset” for workflows:


“Execution Progress” for Job Sequences (with links to workflows dataset), continuously updated:

Notification emails – can be requested from the cloud by setting the “Notify Result to” field on workflows/job sequences





Other summary views show more detailed information about last run, including last execution datetime and link to output file (if any) or error file (in case of errors):

A dedicated logs view enables cloud users to be aware of DT operations, from the request through intermediate steps to execution, with an XLSX report updated in near real time:




Users can also reset requests:

one-time requests – drag and drop to the “Unassigned” status Kanban

on schedule – clear the datetime field “Schedule Job for” if the job has not started; drag and drop to the “Unassigned” status Kanban if it is running



All requests can be deleted through the “RESET CLOUD REQUESTS” job from the cloud, or the “Tools → Reset Cloud Requests” command from the on-premise UI.


Log Viewer


The log window lists all the most important actions performed by DT.

For each row, you can see:

CONFIGURATION – the name of the configuration (available only if linked to the action)

DATE – the timestamp of the action

USER – the user who owns the action

OPERATION – what has been done

DESCRIPTION – a short description of the action

Filters (one for each column) can be used to see a subset of the whole Log.

The selected filters are applied by pressing the corresponding button.

Clicking a column header sorts the rows alphabetically.

If a “Note” refers to a file or folder, double-clicking the note text opens it.
