Content Matrix 9.10 - Performance Optimization Guide


Configuring Content Matrix for Use with the Import Pipeline

 

Using Azure Private Containers

With Azure private (paid) containers, you use your own Azure Storage location. This involves setting up an Azure Blob Storage account and configuring Content Matrix to work with it. Microsoft recommends using private containers to achieve faster migration performance.

For instructions on creating a storage account to manage Azure private containers, refer to the article Create a storage account - Azure Storage | Microsoft Learn. After the storage account has been created, you will also need to create an access key and copy the connection string to use when configuring Content Matrix migrations.
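If you prefer to script the storage account setup, the following is a minimal Azure PowerShell sketch. The resource group name, account name, and region are placeholders, and it assumes the Az module is installed and you have signed in with Connect-AzAccount:

# Create a resource group and a general-purpose storage account (names and region are examples only)
New-AzResourceGroup -Name "cm-migration-rg" -Location "eastus"
New-AzStorageAccount -ResourceGroupName "cm-migration-rg" -Name "cmmigrationstore" -Location "eastus" -SkuName "Standard_LRS"

# Retrieve the first access key and build the connection string in the format Content Matrix expects
$key = (Get-AzStorageAccountKey -ResourceGroupName "cm-migration-rg" -Name "cmmigrationstore")[0].Value
"DefaultEndpointsProtocol=https;AccountName=cmmigrationstore;AccountKey=$key"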

To specify a temporary storage location for processing of items and files:

1. On the machine(s) where you have installed Content Matrix SharePoint Edition, open the file EnvironmentSettings.xml, which can be found in one of the following locations:

·If you are required to be a local administrator on the machine where Content Matrix is installed: C:\ProgramData\Metalogix

OR

·If you are not required to be a local administrator on the machine where Content Matrix is installed: C:\Users\<username>\AppData\Roaming\Metalogix\Common.

2. Find or add the parameter UploadManagerLocalTemporaryStorageLocation and set it to a location on your hard drive where processing of items and files can take place. It is important to have at least 40 GB of free disk space for this processing (a quick way to check this is shown after the example below). The example below uses C:\AzureAPI as the processing location:

<XmlableEntry>

<Key>UploadManagerLocalTemporaryStorageLocation</Key>

<Value>C:\AzureAPI</Value>

</XmlableEntry>
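If you want to verify that the drive hosting this location has at least 40 GB free before starting a migration, the following is a minimal PowerShell sketch (C:\AzureAPI matches the example above; substitute your own path):

# Check free space on the drive that hosts the temporary storage location
$tempPath = "C:\AzureAPI"
$drive = Get-PSDrive -Name ([System.IO.Path]::GetPathRoot($tempPath).TrimEnd(':\'))
if ($drive.Free -lt 40GB) { Write-Warning "Less than 40 GB free on drive $($drive.Name); choose another location." }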

To configure a migration to use Azure Private Containers:

On the List Content Options tab, select the Use Import Pipeline option, select Azure Private Containers, and enter your Azure Storage connection string in the following format:

DefaultEndpointsProtocol=https;AccountName={your account name};AccountKey={your account key}

Azure Private Containers Connection String

You may optionally select Encrypt Azure Jobs for extra security, at the cost of some performance.

Removing Private Containers and Queues Created by Content Matrix from Your Azure Storage Account

After a particularly large migration using the Import Pipeline, you may want to remove private containers and queues created by Content Matrix to free up space for more migration actions. Using Azure PowerShell, you can simultaneously remove all containers or queues that were created on the same date.  A prerequisite to removing multiple containers and queues is that Azure PowerShell must be installed and configured. Refer to the Microsoft Azure PowerShell documentation for installation instructions and general information about Azure PowerShell Cmdlets.
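If Azure PowerShell is not yet installed, the following is a minimal sketch for installing the Az module from the PowerShell Gallery and signing in (assumes PowerShell 5.1 or later with internet access):

# Install the Az module for the current user and sign in to Azure
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force
Connect-AzAccount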

Naming Conventions

Beginning in version 9.3, Content Matrix uses naming conventions that allow you to easily identify containers and queues created by Content Matrix during migration using the Import Pipeline.  

File Type: Blob Container (per SharePoint list)
Naming Convention: YYYYMMDD-<List Internal Name>-<GUID>-blob
EXAMPLE: 20200929-shreddocumentt-f0da9f001d3b450cb36ad9e9ecae219f-blob

File Type: Metadata Container (per Import Pipeline job)
Naming Convention: YYYYMMDD-<List Internal Name>-<GUID>-<Batch number in XXX format>
EXAMPLE: 20200929-shreddocumentt-f0da9f001d3b450cb36ad9e9ecae219f-001

File Type: Reporting Queue (per Import Pipeline job)
Naming Convention: YYYYMMDD-<List Internal Name>-<GUID>-<Batch number in XXX format>
EXAMPLE: 20200929-shreddocumentt-f0da9f001d3b450cb36ad9e9ecae219f-001

To remove items from your Azure blob storage account using Azure PowerShell:

1. Ensure there are no active migration actions running that use the Azure storage where the items are stored.

2. Open Azure PowerShell.

3. Create a storage context using the New-AzStorageContext cmdlet. Your storage account name and account key are required parameters. Note that these are different from your Client ID and Application Secret.

$context = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey

4. Use the appropriate command below, depending on what you want to remove.

To remove all blob and metadata containers created on a specific date:

Get-AzStorageContainer -Prefix "<YYYYMMDD>" -Context <context> | Remove-AzStorageContainer -Force

EXAMPLE:

Get-AzStorageContainer -Prefix "20200929" -Context $context | Remove-AzStorageContainer -Force

To remove all reporting queues created on a specific date:

Get-AzStorageQueue -Prefix "<YYYYMMDD>" -Context <context> | Remove-AzStorageQueue -Force

EXAMPLE:

Get-AzStorageQueue -Prefix "20200929" -Context $context | Remove-AzStorageQueue -Force
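Before piping to the Remove cmdlets, you may want to review what would be deleted. The following is a minimal dry-run sketch (assumes the $context created in step 3 and the 20200929 date prefix from the examples above):

# List matching containers and queues without deleting anything
Get-AzStorageContainer -Prefix "20200929" -Context $context | Select-Object Name, LastModified
Get-AzStorageQueue -Prefix "20200929" -Context $context | Select-Object Name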

Using SPO Provided Azure Containers

The SPO provided (free) containers are a set of Azure storage containers associated with a user's tenant. These containers are provided at no cost by Microsoft and reside within the infrastructure that hosts the user's tenancy. Content can only be uploaded to this location when it is encrypted, making encryption a requirement.

This option requires no setup from the user beyond selecting the appropriate checkboxes/toggles within Content Matrix.

To use SPO provided Azure containers, make sure the Use Import Pipeline and SPO Provided Azure Containers options are selected.

NOTE:  The Encrypt Azure Jobs option is disabled in this case because Content Matrix always encrypts jobs that use SPO Provided Containers.

List Options

 
