
Archive Shuttle 11.2 - Planning Guide

Planning for migrations to UNC

If you plan to migrate data to UNC, install the Storage Import Module before beginning.

You’ll use the Storage Import and Storage Provider target when configuring the workflow policy, mapping wizard, etc.

While setting up the migration, use the Storage Import Module tab of the System Configuration page to, for example, set item/archive parallelism, set up conversion of MSG to EML, fail items permanently on specified errors, etc.

Planning component installation

Where to set up Archive Shuttle components

The core Archive Shuttle Components include the following:

·Archive Shuttle Core Web Interface

·Archive Shuttle Core Web Services

·Archive Shuttle Core Service

It is recommended to install these components on the same server. For a production environment, this must be a dedicated server (physical or virtual). They can, however, be installed on separate servers.

See the chapter Hardware requirements for Archive Shuttle Core Server for more information about prerequisites.

Archive Shuttle also has the following Modules:

·Archive Shuttle Active Directory Collector Module

·Archive Shuttle Enterprise Vault Collector Module

·Archive Shuttle Enterprise Vault Export Module

·Archive Shuttle Enterprise Vault Import Module

·Archive Shuttle Enterprise Vault Provisioning Module

·Archive Shuttle Enterprise Vault Shortcut Processing Module

·Archive Shuttle Exchange Import Module

·Archive Shuttle Office 365 Module

·Archive Shuttle Native Format Import Module

·Archive Shuttle EAS Module

·Archive Shuttle Metalogix Module

·Archive Shuttle PST Export Module

·Archive Shuttle Dell Archive Manager Module

·Archive Shuttle SourceOne Module

Depending on the migration scenario, some or all of these modules have to be installed.

The following table provides a guideline on where to install each module, how many instances are needed (Number of Modules), and relevant installation notes:

Active Directory Collector Module

One per Active Directory forest

Needs to be installed on a server in the Active Directory forest from which data is to be collected. The account that it runs under must have privileges to read Active Directory and its objects; a normal user account is sufficient.

Enterprise Vault Collector Module

Minimum: One per source Enterprise Vault environment

Recommended: One per Enterprise Vault Server hosting a Vault Store.

Note: If performing an EV to EV migration, this module is also needed in the target environment to gather site settings (HTTP/HTTPS) used when running ‘Fix Shortcuts’.

There needs to be at least one Enterprise Vault Collector Module installed in the source Enterprise Vault environment. It is recommended to install this component near the SQL Server that hosts the Vault Store databases. Care should be taken when additional Enterprise Vault Collector Modules are installed: performance will not necessarily increase, because the limiting factor is usually the time it takes to retrieve data from the Vault Store databases.

Enterprise Vault Provisioning Module

One per source Enterprise Vault environment, and one per target Enterprise Vault environment.

There needs to be one Enterprise Vault Provisioning Module per Enterprise Vault environment, i.e. per Enterprise Vault Directory Database. This module uses EVPM. It is recommended to install this component on the least busy Enterprise Vault server.

Enterprise Vault Export Module

Minimum: One per source Enterprise Vault environment

Recommended: One per Enterprise Vault Server hosting a Vault Store.

There needs to be at least one Enterprise Vault Export Module installed in the source Enterprise Vault environment. It is recommended to install the Enterprise Vault Export Module on each Enterprise Vault server that hosts Vault Store partition data in the source environment. This module requires the Enterprise Vault API. If this module is to be installed on a non-Enterprise Vault server, then the Enterprise Vault API Runtime needs to be installed.

Enterprise Vault Import Module

Note: This module is only required if the migration scenario includes migration to an Enterprise Vault environment.

Minimum: One per target Enterprise Vault environment.

Recommended: One per target Enterprise Vault Server hosting a Vault Store.

There needs to be at least one Enterprise Vault Import Module per target Enterprise Vault environment, i.e., Enterprise Vault Directory. It is recommended to install an Enterprise Vault Import Module on each Enterprise Vault Server where the Vault Store to import to is hosted. This module requires the Enterprise Vault API. If this module is to be installed on a non-Enterprise Vault server, then the Enterprise Vault API Runtime needs to be installed.

Shortcut Processing Module

Minimum: One per target environment.

There needs to be at least one Shortcut Processing Module per target environment.

This module must be installed on a Server with Microsoft Outlook 2007/2010/2013 (32 bit).

Note: An optional setting in the System Configuration can switch this module to use EWS rather than MAPI (however, installing the module still requires Outlook).

Exchange Import Module

Note: This module is only required if the migration scenario includes migration to an Exchange environment.

Minimum: One per target Exchange environment.

There needs to be at least one Exchange Import Module per target Exchange environment. This module requires Microsoft Outlook to be installed. It is not supported to install the module on an Exchange Server. It is recommended to install this module on a dedicated physical or virtual machine.

To improve performance, multiple Exchange Import Modules can be installed. It is recommended to start with one module and add more modules (up to one per Exchange database) if required.

Office 365 Module

One per environment

There needs to be an Office 365 Module in order to migrate containers to Office 365. This module collects data about Office 365 mailboxes, migrates data into the target container, and post-processes the ingested data to remove Enterprise Vault shortcuts and pending items.

The module connects to Office 365 using the credentials specified in the Credential Editor, which is installed by the module.

Native Format Import Module

One per environment

One Native Format Import Module is required in order to take extracted data and create PST files. This module splits PST files at predetermined sizes and stores them, named according to policy, in a UNC-accessible staging area that is defined per link.

This module must be installed on a Server with Microsoft Outlook 2007/2010/2013 (32 bit).
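The size-based PST rollover described above can be sketched as follows. This is an illustrative sketch only: the 2 GB limit and the file-name pattern are hypothetical stand-ins for values that Archive Shuttle actually takes from the naming policy and link configuration.

```python
def assign_to_psts(item_sizes_mb, max_pst_mb=2048, name_fmt="archive_{n:03d}.pst"):
    """Greedily pack items into PST files, starting a new PST whenever
    adding the next item would exceed the size limit.

    Returns a list of (pst_name, [item sizes]) tuples. The limit and the
    name format are illustrative assumptions, not Archive Shuttle defaults.
    """
    psts, current, cur_size = [], [], 0
    for size in item_sizes_mb:
        if current and cur_size + size > max_pst_mb:
            # Current PST is full: close it and start a new one.
            psts.append((name_fmt.format(n=len(psts) + 1), current))
            current, cur_size = [], 0
        current.append(size)
        cur_size += size
    if current:
        psts.append((name_fmt.format(n=len(psts) + 1), current))
    return psts
```

For example, with a 2048 MB limit, items of 1000, 1000, and 500 MB end up in two PST files: the first holds the two 1000 MB items, and the 500 MB item rolls over into a second file.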

Proofpoint Module

One per environment

One Proofpoint Module is required in order to take extracted data and prepare/construct the data required for Proofpoint ingestion.

This module must be installed on a Server with Microsoft Outlook 64-bit. This module also requires a 64-bit JRE. Additional information is available in the Installation Overview.

EAS Zantaz Module

One per environment

One EAS Module is required on a machine that can communicate with an EAS IIS server that has access to all the required Document Repositories.

Metalogix Module

One per environment

One Metalogix Module is required on a machine that can communicate with a Metalogix server that has access to all the required Document Repositories.

PST Export Module

One per environment

One PST Export Module is required; it can process a number of UNC paths to extract data from the PST files located there.

Dell Archive Manager Module

One per environment

One Dell Archive Manager module is required for the source environment.

SourceOne Module

One per environment

One SourceOne module is required for the source environment.

Planning for the Archive Shuttle databases

Archive Shuttle uses the following Databases:

·The Archive Shuttle Directory database. There is just one of these; it hosts all configuration and non-item based metadata.

·The Archive Shuttle Item database(s). There is one of these for each source Link (e.g. one per Vault Store). These databases do not have to be on the same SQL Server as the Archive Shuttle Directory database. Each Item Database can be on a separate SQL Server, if required.

Microsoft SQL Server must be installed and set up before you install Archive Shuttle. The collation requirement for the SQL Server installation must be case-insensitive, accent-sensitive (CI, AS); case-sensitive and accent-insensitive installations are not supported.
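As a rough pre-installation sanity check, the sensitivity flags embedded in a SQL Server collation name can be inspected programmatically. This is a minimal sketch, not an official validator; it simply looks for the CI (case-insensitive) and AS (accent-sensitive) tokens that appear in collation names such as SQL_Latin1_General_CP1_CI_AS.

```python
def collation_supported(collation: str) -> bool:
    """Return True if a SQL Server collation name is case-insensitive (CI)
    and accent-sensitive (AS), as Archive Shuttle requires.

    Collation names encode sensitivity as underscore-separated tokens,
    e.g. SQL_Latin1_General_CP1_CI_AS.
    """
    parts = collation.upper().split("_")
    return "CI" in parts and "AS" in parts
```

For example, `collation_supported("SQL_Latin1_General_CP1_CI_AS")` returns True, while a case-sensitive or accent-insensitive collation such as `Latin1_General_CS_AI` returns False.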

Microsoft SQL Server must be on a dedicated server, either physical or virtual. It is not supported to have it on the same server as the Archive Shuttle Core server. It is not supported to have the Microsoft SQL Server shared with any other software, for production use.

Before installing the Archive Shuttle Core Components, make sure the account that will be used has “dbcreator” rights in Microsoft SQL Server.

info

NOTE: SQL is only relevant within your environment when installing Core (see Quadrotech Archive Shuttle Compatibility Guide for more). If you are installing within the cloud, Quadrotech maintains SQL for you.

SQL versions

To see which versions of SQL Server are supported, go to the Quadrotech Archive Shuttle Compatibility Guide.

info

NOTE: Having the latest service pack installed is recommended.

SQL editions

Although the Enterprise Edition of Microsoft SQL Server is recommended, the Standard Edition may be used if the SQL instance has the recommended (not minimum) resources for the size of migration you are performing. Plan for additional time to accommodate the regularly required offline maintenance.

Planning export/import storage

Sizing the staging location

The Archive Shuttle Export and Import Modules require a location to store the extracted data. Sizing of this location should be carefully considered.

Consider the following options when sizing the storage area:

·Local and Network Attached Storage is supported.

·Modifying the export/import schedules for the required modules can mean that the data to be migrated flows through the storage area with only a small amount being present at any time.

·Modifying the container mappings and ‘batching’ the migration will also lead to a smaller amount of data residing in the export/import location.

info

NOTE: Export will stop if the free space on the staging area drops to 20 GB or lower.

It is recommended to allow between 50 and 100 GB per source link. For example, with Enterprise Vault that would be 50 to 100 GB per source Vault Store. Other sources may vary.

info

NOTE: If the migration will send the data to PST files, additional space may be required in the staging area since, by default, temporary PSTs are created in the staging area before they’re moved to the PST Output Path. The temporary location can be changed, or the space needed per source link should be increased to at least double the recommendations above.
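The sizing guidance above can be expressed as a simple estimate. The function below is an illustrative sketch: the per-link figure defaults to the upper 100 GB recommendation, and the doubling factor for PST targets reflects the note about temporary PSTs in the staging area.

```python
def recommended_staging_gb(source_links: int,
                           pst_target: bool = False,
                           per_link_gb: int = 100) -> int:
    """Estimate staging space: 50-100 GB per source link (100 GB used by
    default), at least doubled when the migration targets PST files."""
    total = source_links * per_link_gb
    # Temporary PSTs are created in the staging area before being moved
    # to the PST Output Path, so double the allowance for PST targets.
    return total * 2 if pst_target else total
```

For a migration with three source Vault Stores, this suggests roughly 300 GB of staging space, or 600 GB if the target is PST files.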

The System Health dashboard provides an administrator with an overview of storage and other health parameters which may affect the migration being undertaken. The following information is displayed:

Item

Description

Module Health

Displays information relating to all modules in the Archive Shuttle environment.

Unmapped Retentions

Any retention categories that are not mapped to a target environment are displayed here.

Link Health

Detailed information relating to each of the links is displayed here. The links can be filtered so that only particular types of link are shown (e.g., Enterprise Vault, Exchange).

This part of the dashboard also shows whether an Enterprise Vault link is in backup mode. This affects any ingests to that link (they are suspended while the link is in backup mode).

Free space on a staging area location is color coded as shown in the following table:

Highlighting

Reason

No highlighting

Free space is above 100 GB

Yellow

Free space is below 100 GB

Red

Free space is below 50 GB
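The free-space thresholds above, together with the 20 GB export cut-off, can be sketched as a small helper. The behaviour at exactly 100 GB or 50 GB is an assumption, since the table only states "above" and "below":

```python
def staging_health(free_gb: float):
    """Map staging-area free space to the dashboard highlight colour and
    whether export may continue (export stops at 20 GB or lower)."""
    if free_gb < 50:
        colour = "red"
    elif free_gb < 100:       # exact-boundary handling is assumed
        colour = "yellow"
    else:
        colour = "none"
    return colour, free_gb > 20
```

For example, 80 GB free yields a yellow highlight with export still running, while 15 GB free is highlighted red and export is stopped.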

In addition, the System Health page gives an overview of any modules which are deactivated or otherwise not running. This is shown at the top left of the screen in the Admin Interface.

The Link information at the bottom of the page also highlights the ‘Used’ space on each link. Used space is defined as the amount of data that has been exported, but not yet imported. This too is color coded for ease of identifying a potential problem, as follows:

Highlighting

Reason

No highlighting

Used space is within normal parameters

Yellow

Used space is between 75% and 90% of the maximum allowed

Red

Used space is above 90% of the maximum allowed

If the used space reaches 100% on a link, exporting of data is stopped until data has been imported, even if free disk space is still available. This is further illustrated as follows:

[Screenshots: SizingtheStaging1, SizingtheStaging2]
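The used-space thresholds can be sketched in the same way; the treatment of the exact 75% and 90% boundaries is an assumption, since the table gives only ranges:

```python
def used_space_health(used_gb: float, max_gb: float):
    """Map used staging space on a link (exported but not yet imported
    data) to the highlight colour, and report whether export may continue
    (export stops once usage reaches 100% of the allowed maximum)."""
    pct = used_gb / max_gb * 100
    if pct > 90:
        colour = "red"
    elif pct >= 75:           # exact-boundary handling is assumed
        colour = "yellow"
    else:
        colour = "none"
    return colour, pct < 100
```

For example, a link at 80% of its allowed maximum is highlighted yellow with export still running, while a link at 100% is highlighted red and export is stopped until imports free up space.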
