

Scenario 2

Installing additional Replicator engines on your other WFEs, such as WFE2 at http://wfe2, increases your thread count and therefore the number of actions Replicator can manage simultaneously. For more information on threads and parallel processing, see the Parallel Processing and Threads sections below.

Scenario 3

If you are not using a load balancer, or do not want to set up your replication system to work through a load balancer, then you can configure replication to work with a single, specific web front-end at all times. In this case, the target Replicator is configured to communicate directly with the server where Replicator is installed, WFE1 in our example. WFE1 is the only front-end that services Replicator requests, so the IIS virtual directories on WFE2 and WFE3 are neither needed nor used. A load balancer can still be used to redirect normal web traffic coming to http://wfe.
 

The disadvantage of this configuration is that all Replicator requests are limited to a single WFE. This places additional strain on that WFE and can affect the performance of replication, or of clients accessing that server.

Multiple Replicator Engines

With Metalogix Replicator Enterprise Edition, you can install Replicator on multiple front-end servers in the same farm. The benefits of doing so are:

·Instead of one server managing the load of creating and applying events, the load can be distributed over multiple web front-end servers.

·If the front-end server where the Replicator services are running becomes unavailable because of an outage, Replicator continues to create and apply packages using the Replicator services on the other web front-end servers.


NOTE: Replicator is designed to continue capturing events as they occur, even if the front-end server running the services is not available. These events will merely be captured and placed in a queue until the server is back up and able to replicate them.
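
The following minimal Python sketch illustrates this capture-and-queue behavior conceptually. It is not Replicator's implementation; the event strings and the service_available flag are purely hypothetical stand-ins for captured events and the availability of a processing engine.

```python
import queue
import threading
import time

# Illustrative sketch only -- not Replicator's actual implementation.
# It models the idea that captured events are queued while the
# processing service is unavailable and drained once it returns.

event_queue = queue.Queue()            # captured events waiting to be replicated
service_available = threading.Event()  # set when a processing engine is reachable

def capture(event):
    """Capture an event as it occurs; capture never depends on the service."""
    event_queue.put(event)

def process_outbound():
    """Drain the queue only while a processing service is available."""
    while True:
        service_available.wait()       # block until an engine comes back up
        event = event_queue.get()      # events leave the queue in capture order
        print(f"replicating {event}")
        event_queue.task_done()

threading.Thread(target=process_outbound, daemon=True).start()

# Events captured during an outage simply accumulate in the queue...
capture("add list 'Tasks'")
capture("add item to 'Tasks'")
time.sleep(0.5)                        # service still down, queue holds 2 events

# ...and are replicated as soon as the service is available again.
service_available.set()
event_queue.join()
```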

 

Parallel Processing

Parallel processing uses multiple threads to perform multiple computing actions simultaneously. The number of threads available determines the number of actions that can run at the same time.

Parallel processing handles tasks simultaneously. However, it is important to note that some tasks must occur in sequence. For example, within a site, all events are processed in the order they are captured on the outbound side and in the order they are received on the inbound side, so a single thread is used within a site to perform this processing. When a list is added, it must be processed before any item is added to that list, so that the list exists on the target to receive the item. The same thread therefore processes the addition of the list and, once that is complete, the addition of the item to the list. Because these two operations cannot be performed simultaneously, they do not require more than one thread.
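
The following minimal Python sketch illustrates this idea: events for a given site run in capture order on a single thread, while different sites can be processed in parallel across a thread pool. It is purely illustrative and not Replicator's code; the site URLs and event strings are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch only -- hypothetical sites and events.
# Events for one site run in capture order on a single thread,
# while different sites are processed in parallel across the pool.

site_events = {
    "http://wfe/sites/hr":      ["add list 'Policies'", "add item to 'Policies'"],
    "http://wfe/sites/finance": ["add list 'Budgets'",  "add item to 'Budgets'"],
}

def process_site(site, events):
    # One thread handles this site, so ordering constraints hold:
    # the list is created before any item is added to it.
    for event in events:
        print(f"{site}: {event}")

# The pool size stands in for the available thread count; adding
# Replicator engines on more WFEs is what raises that count.
with ThreadPoolExecutor(max_workers=4) as pool:
    for site, events in site_events.items():
        pool.submit(process_site, site, events)
```

In this sketch, raising max_workers corresponds to increasing the available thread count, which is the effect of installing Replicator engines on additional web front-end servers.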
