
erwin Data Transformation 9.2.5 - User Guide


Running an Adapter on Event

In a real-life environment, it can be important to acquire external data on a scheduled basis or in an "on event" mode, particularly when a DB Adapter is used to read information from a database. It is quite easy to implement, for example, a stored procedure that drops a text file containing the trigger information for DT to start a given adapter.

The trigger file is detected by DT when dropped into one of the following folders:

any of the DT adapter folders (including, but not necessarily, the folder of the specific adapter)

the folder "C:\ProgramData\erwin\Data Transformation\workflows"

The triggering feature described here applies to any adapter type.

The trigger files must be compliant with the naming convention:

CCTRG_ADAPTER_YYYYMMDD_HHMMSS.xml

A sample file illustrating the schema is detailed below (see the next section for the <PARAMETERS> tag).

Note that you can choose whether to execute all the (active) workflows defined against that adapter as a source, or only a sub-list of your choice:

 

<?xml version="1.0" encoding="UTF-8"?>
<CC_TRIGGER>
    <PARAMETERS>
        <PARAMETER name="APP_ID_LIST" value="2,5,20"/>
    </PARAMETERS>
    <ADAPTER name="APP CATALOGUE APPS">
        <DELAY HH="0" MIN="0" SEC="30" />
        <WORKFLOWS all="false">
            <WORKFLOW name="CCWORKFLOW_1">
                <DELAY SEC="0" MIN="0" HH="0"/>
            </WORKFLOW>
        </WORKFLOWS>
    </ADAPTER>
</CC_TRIGGER>
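As an illustration, such a trigger file could be produced programmatically, for example by the stored procedure mentioned above. The following Python sketch (a hypothetical helper, not part of DT) builds the XML body and a file name matching the naming convention:

```python
from datetime import datetime
from xml.etree.ElementTree import Element, SubElement, tostring

def build_adapter_trigger(adapter_name, workflow_names, params=None):
    """Build the XML body of an adapter trigger file (illustrative sketch)."""
    root = Element("CC_TRIGGER")
    if params:
        parameters = SubElement(root, "PARAMETERS")
        for name, value in params.items():
            SubElement(parameters, "PARAMETER", name=name, value=value)
    adapter = SubElement(root, "ADAPTER", name=adapter_name)
    # Optional delay before the adapter starts (values taken from the sample above)
    SubElement(adapter, "DELAY", HH="0", MIN="0", SEC="30")
    workflows = SubElement(adapter, "WORKFLOWS", all="false")
    for wf_name in workflow_names:
        wf = SubElement(workflows, "WORKFLOW", name=wf_name)
        SubElement(wf, "DELAY", SEC="0", MIN="0", HH="0")
    return tostring(root, encoding="unicode")

def adapter_trigger_file_name(now=None):
    """File name following the CCTRG_ADAPTER_YYYYMMDD_HHMMSS.xml convention."""
    return (now or datetime.now()).strftime("CCTRG_ADAPTER_%Y%m%d_%H%M%S.xml")
```

The resulting string would then be written to one of the watched folders listed above under the generated file name.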

Running a DB Adapter on Event with Parameters

 

When configuring a trigger file to execute a DB adapter on event, it is possible to add parameters that drive the query result set.

Suppose that you want to configure a DB Query Adapter named "MY ADAPTER" that extracts the details of only the Applications that have been updated or inserted in a table named "MY_TABLE". The Applications are selected based on their ID.

The SQL query to do this would look like:

SELECT * from MY_TABLE where APPLICATION_ID in (131, 156, 653)

When triggering DT on event, you may want this adapter to process only the applications that were updated or inserted in the DB table. The IDs of the updated or inserted Applications can be obtained and passed to the adapter using parameters in the SQL query, for example:

SELECT * from MY_TABLE where APPLICATION_ID in (<%APP_ID_LIST%>)

In the SQL query, <%APP_ID_LIST%> is a placeholder that is resolved using the parameter of the same name in the XML trigger file.

Note the syntax required by DT: parameter names must be enclosed by the prefix "<%" and the suffix "%>", as in <%PARAM_NAME%>.

The values of the parameter must be written into the XML trigger file each time the SQL trigger fires on the table where the Application details are updated or inserted.

The XML trigger file must contain the following:

<PARAMETERS>

      <PARAMETER name="APP_ID_LIST" value="345, 436"/>

</PARAMETERS>

The parameter name is hard-coded into the SQL trigger, and the values are the IDs of the changed Applications, which are written directly into the XML trigger file by the SQL trigger. These values can differ each time the database trigger is executed.

When the DT database query is executed, the values are substituted for the parameter name, as follows:

SELECT * from MY_TABLE where APPLICATION_ID in (345, 436)
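The substitution DT performs can be modeled in a few lines of Python (an illustrative sketch, not DT's actual implementation):

```python
import re

def substitute_params(sql, params):
    """Replace <%NAME%> placeholders in a query with values taken
    from the trigger file's PARAMETER entries (illustrative model)."""
    def repl(match):
        return params[match.group(1)]
    return re.sub(r"<%(\w+)%>", repl, sql)

query = "SELECT * from MY_TABLE where APPLICATION_ID in (<%APP_ID_LIST%>)"
resolved = substitute_params(query, {"APP_ID_LIST": "345, 436"})
# resolved == "SELECT * from MY_TABLE where APPLICATION_ID in (345, 436)"
```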

 

Parameters included in the trigger file can also be used in workflow transformations, wherever fixed values are accepted.

 

Special parameters include:

<%SYSDATE%>: can be used to populate a fixed-value column, or a filter value, with the system date and time (UTC format); if the value <%SYSDATE%> is entered, DT replaces the parameter with the date-time value at execution time.

<%BLANK%>: in a Replace Text transformation, when you want it to be applied to an empty value.
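A rough model of how these special values could be resolved (the exact date-time format DT emits is an assumption here; ISO 8601 UTC is used for illustration):

```python
from datetime import datetime, timezone

def resolve_special(value):
    """Resolve DT's special parameter values (illustrative sketch;
    the ISO 8601 output format is an assumption, not DT's documented format)."""
    if value == "<%SYSDATE%>":
        # Current system date and time in UTC
        return datetime.now(timezone.utc).isoformat()
    if value == "<%BLANK%>":
        # Matches/produces an empty value, e.g. in Replace Text
        return ""
    return value
```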

     

Running a Sequence on Event

 

In this version, you can define job sequences not only on a scheduled-time basis, but also as a real logical flow; that is, you can choose among the available adapters and their related workflows, which ones to include, and in which order they run in the Job Sequence:

 

 

The job sequences can be launched on demand, on schedule, or on event in a similar way to adapters.

A sample of the required trigger file can be obtained by clicking the "Trigger sample" button:

<?xml version="1.0" encoding="UTF-8"?>
<CC_TRIGGER sequence="MySequenceJob">
    <PARAMETERS>
        <PARAMETER name="MY_CATEGORY_PARAM" value="sample_value" />
    </PARAMETERS>
</CC_TRIGGER>

The sample contains all the parameters defined in the included adapter/workflow definitions ("sample_value" must be replaced by the actual "MY_CATEGORY_PARAM" value when producing the real trigger file).

The trigger files must be compliant with the naming convention:

CCTRG_SEQUENCE_YYYYMMDD_HHMMSS.xml
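As a sketch, a sequence trigger file matching this convention could be generated as follows (illustrative Python, not part of DT):

```python
from datetime import datetime

def sequence_trigger(sequence_name, params, now=None):
    """Build the file name and XML body of a sequence trigger file
    (illustrative sketch of the format shown above)."""
    stamp = (now or datetime.now()).strftime("%Y%m%d_%H%M%S")
    file_name = f"CCTRG_SEQUENCE_{stamp}.xml"
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        f'<CC_TRIGGER sequence="{sequence_name}">',
        '    <PARAMETERS>',
    ]
    for name, value in params.items():
        lines.append(f'        <PARAMETER name="{name}" value="{value}" />')
    lines += ['    </PARAMETERS>', '</CC_TRIGGER>']
    return file_name, "\n".join(lines)
```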

Running DT from Cloud Platform

Collector4Cloud provides EA Agile/EA Agile V3 users with the capability to interact with on-premise DT:

View the available DT flows/job scheduler

View the workflow definition, in terms of source/target Adapters and configuration

View the Job Sequence definition, in terms of included workflows and custom jobs

Run a specific flow or job sequence on demand or on schedule, following the execution status

View the data delivered by the flow to the target system/people

Check whether the flow exited with no data

Check whether there were errors, and view the logs if so

Have a complete, up-to-date view of the daily operations log

 

Both erwin EA Agile and erwin EA Agile V3 are supported, depending on the specific configuration (see Cloud Environment). The Listener Windows service enables the on-premise application to work with the cloud:

Job definitions (workflows and sequences) are pushed to the cloud

Job requests are pulled from the cloud

one-time requests – drag and drop to the "Requested" status on the Kanban board

on schedule – by setting the datetime field "Schedule Job for"

Note that repetitive schedules can only be configured from the on-premise application.

 

Users can monitor the execution of jobs through different views, fields, and linked reports:

“Last execution Dataset” for workflows:

 

“Execution Progress” for Job Sequences (with links to workflows dataset), continuously updated:

Notification emails – can be requested from the cloud by setting the "Notify Result to" field on workflows/job sequences

Other summary views show more detailed information about last run, including last execution datetime and link to output file (if any) or error file (in case of errors):

A dedicated logs view enables cloud users to follow DT operations, from the request through intermediate steps to execution, with an XLSX report updated in near real time:

Users can also reset requests:

one-time requests – drag and drop to the "Unassigned" status on the Kanban board

on schedule – by clearing the datetime field "Schedule Job for", if not started; or with drag and drop to the "Unassigned" status on the Kanban board, if running

 

 

All requests can be deleted through the "RESET CLOUD REQUESTS" job from the cloud, or the "Tools > Reset Cloud Requests" command from the on-premise UI.

         
