Multi Source Jobs
When a workflow configuration defines more than one source (a multi-source workflow), the system creates a special job named “MultiSource Job”, which is shown in the Job Sequencer and Job Scheduler sections of that workflow configuration:
You can schedule it using the standard timing parameters: the job results in the execution of all the adapters belonging to the source definition of the workflow, which will then trigger the workflow execution itself.
Workflows that have any of those adapters as a single source, or in a different multi-source definition, will not be triggered.
If needed, users can configure a Custom Job to run any external executable that can be launched from a command line, using the “Add” button at the bottom of the job list, available with “All the configurations”.
To configure a Custom Job, besides the Job Name and Description, the user must provide the environment information: the IP address of the server on which the executable has to be launched, and the username and password of a Windows domain account that is allowed to log in to that server and run the executable.
The user must also provide the physical path of the executable on the server, including the file name, and any optional parameters it requires, enclosed in quotes if needed, exactly as they would be written on a command line.
The “Wait For Return” option forces the launching process to wait for the external process to end and collect its exit code. In this case it is mandatory to define a timeout in minutes, so that DT does not hang if the executable fails to exit within the given time.
When “Wait For Return” is not checked, the user may define a number of minutes that DT will wait before scheduling the next job, if the Custom Job is included in a job sequence.
After saving the configuration, the user can test the execution of the job:
In case of error, DT will provide the return code of the process execution.
You can schedule it using the standard timing parameters: the job will result in the launch of the external executable.
Running DT on Event
Running an Adapter on Event
In a real-life environment, it can be important to acquire external data not only on a scheduled basis but also in an “on event” mode, in particular when a DB Adapter is used to read information from a database. It is quite easy, for example, to implement a stored procedure that drops a text file containing the trigger information for DT to start a given adapter.
The trigger file is detected by DT if dropped into one of the following folders:
• any of the DT adapter folders (including, but not necessarily, the folder of the specific adapter)
• the folder “C:\ProgramData\erwin\Data Transformation\workflows”
The described triggering feature applies to any other adapter type.
The trigger files must be compliant with the naming convention:
and the sample content describing the schema is detailed below (see the next paragraph for the <PARAMETERS> tag):
Please note that you can even choose whether to execute all the (active) workflows that have that adapter as a source, or only a sub-list of your choice:
<?xml version="1.0" encoding="UTF-8"?>
<ADAPTER name="APP CATALOGUE APPS">
    <DELAY HH="0" MIN="0" SEC="30"/>
    <PARAMETER name="APP_ID_LIST" value="2,5,20"/>
</ADAPTER>
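A trigger file with this content can also be generated programmatically. The sketch below uses the Python standard library; the element names and attribute values mirror the sample above, while nesting the <PARAMETER> element directly under <ADAPTER> is an assumption based on this excerpt, not a confirmed schema:

```python
# Sketch: building the sample trigger content with xml.etree.ElementTree.
# Nesting of <PARAMETER> under <ADAPTER> is assumed from the excerpt.
import xml.etree.ElementTree as ET

root = ET.Element("ADAPTER", name="APP CATALOGUE APPS")
ET.SubElement(root, "DELAY", HH="0", MIN="0", SEC="30")
ET.SubElement(root, "PARAMETER", name="APP_ID_LIST", value="2,5,20")

# Serialize with an XML declaration, as in the sample file.
trigger_xml = ET.tostring(root, encoding="UTF-8", xml_declaration=True)
```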