
Commit c3fe251

Updated docs for Cosmos Table Data Migration

1 parent 8030652 commit c3fe251

File tree

1 file changed: +52 -11 lines


persistence/azure-table/migration-from-azure-storage-table-to-cosmos-table.md

Lines changed: 52 additions & 11 deletions
@@ -3,30 +3,49 @@ title: Migration from Azure Storage Table to Azure Cosmos DB Table API
 component: ASP
 related:
 - persistence/azure-table
-reviewed: 2024-02-01
+reviewed: 2025-01-24
 ---
 
 > [!WARNING]
 > The endpoint being migrated must be offline while migrating saga data. The saga data must use secondary indexes (introduced in Azure Table Persistence 2.x) or be stored with Azure Table Persistence version 3 or higher for this upgrade guide to succeed. The migration scenario described assumes that only the data of one saga type is stored per table.
 
 ## Import data
 
-> [!NOTE]
-> At the time of writing this guidance, the Data migration tool incorrectly projected columns and would crash with `NullReferenceException`. The [pull request](https://github.com/Azure/azure-documentdb-datamigrationtool/pull/126) has been merged, but it is not yet confirmed when the tool will be released. If required, build the latest master branch of the tool.
+The saga data can be imported into Cosmos DB Table API using the [Azure Cosmos DB Desktop Data Migration Tool](https://github.com/azurecosmosdb/data-migration-desktop-tool) provided by Microsoft. This tool is built on .NET 6, is cross-platform, and replaces the legacy `dt.exe` tool.
 
-The saga data can be imported into Cosmos DB Table API using the [Data migration tool](https://docs.microsoft.com/en-us/azure/cosmos-db/import-data#Install) provided by Microsoft. The import tool provides both [a UI and a command line](https://docs.microsoft.com/en-us/azure/cosmos-db/import-data#AzureTableSource) option. The general command looks like the following:
+### Download the tool
 
-```
-dt.exe /s:AzureTable /s.ConnectionString:"<AzureTableStorageConnectionString>" /s.Table:<SagaTableName> /s.InternalFields:All /s.Projection:"<SagaProperties>;Originator;OriginalMessageId;NServiceBus_2ndIndexKey;SagaId" /t:TableAPIBulk /t.ConnectionString:"<AzureCosmosTableApiConnectionString>" /t.TableName:<SagaTableName> /ErrorLog:errors.csv /ErrorDetails:All /OverwriteErrorLog:true
+Download the latest version of the Azure Cosmos DB Desktop Data Migration Tool from the [GitHub releases page](https://github.com/azurecosmosdb/data-migration-desktop-tool/releases).
+
+### Configure the migration
+
+Create a `migrationsettings.json` file in the tool's directory with the following structure:
+
+```json
+{
+  "Source": "AzureTableAPI",
+  "Sink": "AzureTableAPI",
+  "SourceSettings": {
+    "ConnectionString": "<AzureTableStorageConnectionString>",
+    "Table": "<SagaTableName>"
+  },
+  "SinkSettings": {
+    "ConnectionString": "<AzureCosmosTableApiConnectionString>",
+    "Table": "<SagaTableName>"
+  },
+  "Operations": []
+}
 ```
 
 ### Parameters
 
-`<AzureTableStorageConnectionString>`: The Azure Table Storage (source) connection string<br/>
-`<AzureCosmosTableApiConnectionString>`: The Azure Cosmos DB Table API (destination) connection string.<br/>
-`<SagaProperties>`: A semicolon-separated list of all saga properties that need to be projected (e.g. `OrderId;OrderDescription;OrderState`). Make sure to keep `Originator;OriginalMessageId;NServiceBus_2ndIndexKey;SagaId` since those are standard columns that always need to be projected when available.<br/>
+`<AzureTableStorageConnectionString>`: The Azure Table Storage (source) connection string. It can be found in the Azure Portal under the storage account's **Access keys** section.<br/>
+`<AzureCosmosTableApiConnectionString>`: The Azure Cosmos DB Table API (destination) connection string. It can be found in the Azure Portal under the Cosmos DB account's **Connection String** or **Keys** section. Ensure the connection string includes the `TableEndpoint` parameter pointing to the Cosmos DB Table API account.<br/>
 `<SagaTableName>`: The name of the saga data table (e.g. `OrderSagaData`).<br/>
 
+> [!NOTE]
+> The migration tool automatically migrates all columns from the source table, including all saga properties and the standard NServiceBus columns (`Originator`, `OriginalMessageId`, `NServiceBus_2ndIndexKey`, `SagaId`). No explicit column projection is required.
+
 ### Example
 
 For example, to import a single saga data table called `OrderSagaData` with the saga data type:
@@ -43,12 +62,34 @@ public class OrderSagaData : IContainSagaData
 }
 ```
 
-the following command can be used:
+the following `migrationsettings.json` can be used:
 
+```json
+{
+  "Source": "AzureTableAPI",
+  "Sink": "AzureTableAPI",
+  "SourceSettings": {
+    "ConnectionString": "DefaultEndpointsProtocol=https;AccountName=MyStorageAccount;AccountKey=<key>;EndpointSuffix=core.windows.net",
+    "Table": "OrderSagaData"
+  },
+  "SinkSettings": {
+    "ConnectionString": "DefaultEndpointsProtocol=https;AccountName=MyCosmosAccount;AccountKey=<key>;TableEndpoint=https://MyCosmosAccount.table.cosmos.azure.com:443/",
+    "Table": "OrderSagaData"
+  },
+  "Operations": []
+}
 ```
-dt.exe /s:AzureTable /s.ConnectionString:"<AzureTableStorageConnectionString>" /s.Table:OrderSagaData /s.InternalFields:All /s.Projection:"OrderId;OrderDescription;OrderState;Originator;OriginalMessageId;NServiceBus_2ndIndexKey;SagaId" /t:TableAPIBulk /t.ConnectionString:"<AzureCosmosTableApiConnectionString>" /t.TableName:OrderSagaData /ErrorLog:errors.csv /ErrorDetails:All /OverwriteErrorLog:true
+
+### Run the migration
+
+Execute the migration tool from the command line:
+
+```bash
+dmt.exe
+```
 
+The tool automatically reads the `migrationsettings.json` file from the current directory and begins the migration, copying all data from the source table to the destination table while preserving all columns and their values.
+
 ## Data inspection
 
 Due to the [limited types](https://docs.microsoft.com/en-us/rest/api/storageservices/understanding-the-table-service-data-model#property-types) supported by Azure Storage Tables, some types are stored in the table by the [Azure Table persister](/persistence/azure-table) as serialized JSON strings. The data can and should be inspected for quality both before and after the import. The migrated endpoint and all migrated saga types should be thoroughly tested before moving into production to ensure the migration is correct.
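The before-and-after inspection the diff calls for can be partly scripted. A minimal sketch, assuming the `azure-data-tables` Python package; the `diff_entities` helper is illustrative, and the connection strings are placeholders as in the guide:

```python
def diff_entities(source_rows, target_rows, key=("PartitionKey", "RowKey")):
    """Compare two iterables of dict-like table entities.

    Returns (missing, changed): entity keys present in the source but
    absent from the target, and keys whose property values differ.
    """
    index = {tuple(e[k] for k in key): dict(e) for e in target_rows}
    missing, changed = [], []
    for e in source_rows:
        k = tuple(e[x] for x in key)
        if k not in index:
            missing.append(k)
        elif index[k] != dict(e):
            changed.append(k)
    return missing, changed


if __name__ == "__main__":
    # Requires: pip install azure-data-tables
    # Placeholder connection strings, as in the migration guide above.
    from azure.data.tables import TableClient

    source = TableClient.from_connection_string(
        "<AzureTableStorageConnectionString>", table_name="OrderSagaData")
    target = TableClient.from_connection_string(
        "<AzureCosmosTableApiConnectionString>", table_name="OrderSagaData")
    missing, changed = diff_entities(source.list_entities(), target.list_entities())
    print(f"missing: {missing}, changed: {changed}")
```

Note that service-managed properties such as `Timestamp` may legitimately differ between the two accounts, so a production check would exclude them from the comparison.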

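Since `dmt.exe` reads `migrationsettings.json` from the current directory, a malformed or still-templated settings file is an easy mistake. A minimal sanity-check sketch, assuming the key names shown in the configuration above; the `settings_problems` helper is hypothetical and not part of the migration tool:

```python
import json


def settings_problems(settings: dict) -> list[str]:
    """Return human-readable problems with a migrationsettings dict.

    An empty list means the shape matches the structure used in the guide.
    """
    problems = []
    for key in ("Source", "Sink", "SourceSettings", "SinkSettings"):
        if key not in settings:
            problems.append(f"missing key: {key}")
    for section in ("SourceSettings", "SinkSettings"):
        for field in ("ConnectionString", "Table"):
            value = settings.get(section, {}).get(field, "")
            # Values like "<SagaTableName>" are unfilled placeholders.
            if not value or value.startswith("<"):
                problems.append(f"{section}.{field} is empty or still a placeholder")
    return problems


if __name__ == "__main__":
    with open("migrationsettings.json") as f:
        print(settings_problems(json.load(f)))
```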