persistence/azure-table/migration-from-azure-storage-table-to-cosmos-table.md
52 additions & 11 deletions
@@ -3,30 +3,49 @@ title: Migration from Azure Storage Table to Azure Cosmos DB Table API
component: ASP
related:
- persistence/azure-table
-reviewed: 2024-02-01
+reviewed: 2025-01-24
---

> [!WARNING]
> The endpoint being migrated must be offline while migrating saga data. The saga data must be using secondary indexes (introduced in Azure Table Persistence 2.x) or be stored with Azure Table Persistence Version 3 or higher for this upgrade guide to succeed. The migration scenario described assumes that only saga data of a single saga type is stored per table.

## Import data

-> [!NOTE]
-> At the time of writing this guidance, the Data migration tool incorrectly projected columns and would therefore crash with a `NullReferenceException`. The [pull request](https://github.com/Azure/azure-documentdb-datamigrationtool/pull/126) has been merged, but it is not yet confirmed when the tool will be released. If required, build the latest master branch of the tool.

+The saga data can be imported into Cosmos DB Table API using the [Azure Cosmos DB Desktop Data Migration Tool](https://github.com/azurecosmosdb/data-migration-desktop-tool) provided by Microsoft. This tool is built on .NET 6, is cross-platform, and replaces the legacy `dt.exe` tool.

-The saga data can be imported into Cosmos DB Table API using the [Data migration tool](https://docs.microsoft.com/en-us/azure/cosmos-db/import-data#Install) provided by Microsoft. The import tool provides both [a UI and a command-line](https://docs.microsoft.com/en-us/azure/cosmos-db/import-data#AzureTableSource) option. The general command looks like the following:

+Download the latest version of the Azure Cosmos DB Desktop Data Migration Tool from the [GitHub releases page](https://github.com/azurecosmosdb/data-migration-desktop-tool/releases).
+### Configure the migration
+
+Create a `migrationsettings.json` file in the tool's directory with the following structure:

-`<AzureTableStorageConnectionString>`: The Azure Table Storage (source) connection string<br/>
-`<AzureCosmosTableApiConnectionString>`: The Azure Cosmos DB Table API (destination) connection string.<br/>
-`<SagaProperties>`: A semicolon-separated list of all saga properties that need to be projected (e.g. `OrderId;OrderDescription;OrderState`). Make sure to leave `Originator;OriginalMessageId;NServiceBus_2ndIndexKey;SagaId` since those are standard columns that always need to be projected in case they are available.<br/>
+`<AzureTableStorageConnectionString>`: The Azure Table Storage (source) connection string. This can be found in the Azure Portal under your Storage Account's **Access keys** or **Connection string** section.<br/>
+`<AzureCosmosTableApiConnectionString>`: The Azure Cosmos DB Table API (destination) connection string. This can be found in the Azure Portal under your Cosmos DB account's **Connection String** or **Keys** section. Ensure the connection string includes the `TableEndpoint` parameter pointing to your Cosmos DB Table API account.<br/>
`<SagaTableName>`: The name of the saga data table (e.g. `OrderSagaData`).<br/>

+> [!NOTE]
+> The migration tool automatically migrates all columns from the source table, including all saga properties and the standard NServiceBus columns (`Originator`, `OriginalMessageId`, `NServiceBus_2ndIndexKey`, `SagaId`). No explicit column projection is required.
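The JSON structure referenced above is not included in this diff excerpt. As an illustrative sketch only — the `AzureTableAPI` source/sink names and the `SourceSettings`/`SinkSettings` keys come from the migration tool's general configuration layout and are not confirmed by this excerpt, so verify them against the tool's documentation — a table-to-table configuration using the placeholders described above might look like:

```json
{
  "Source": "AzureTableAPI",
  "Sink": "AzureTableAPI",
  "SourceSettings": {
    "ConnectionString": "<AzureTableStorageConnectionString>",
    "Table": "<SagaTableName>"
  },
  "SinkSettings": {
    "ConnectionString": "<AzureCosmosTableApiConnectionString>",
    "Table": "<SagaTableName>"
  }
}
```

Source and sink use the same table name so that the saga table is copied one-to-one into the Cosmos DB Table API account.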
### Example

For example, to import a single saga data table called `OrderSagaData` with the saga data type:
@@ -43,12 +62,34 @@ public class OrderSagaData : IContainSagaData
}
```
-the following command can be used:
+the following `migrationsettings.json` can be used:
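The example file itself is not shown in this excerpt. Assuming the same `AzureTableAPI` source/sink layout sketched earlier (again, an assumption to verify against the tool's documentation), a configuration for the `OrderSagaData` table would presumably resemble:

```json
{
  "Source": "AzureTableAPI",
  "Sink": "AzureTableAPI",
  "SourceSettings": {
    "ConnectionString": "<AzureTableStorageConnectionString>",
    "Table": "OrderSagaData"
  },
  "SinkSettings": {
    "ConnectionString": "<AzureCosmosTableApiConnectionString>",
    "Table": "OrderSagaData"
  }
}
```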

+The tool automatically reads the `migrationsettings.json` file from the current directory and begins the migration. All data is migrated from the source table to the destination table, preserving all columns and their values.
## Data inspection

Due to the [limited types](https://docs.microsoft.com/en-us/rest/api/storageservices/understanding-the-table-service-data-model#property-types) supported by Azure Storage Tables, some types are stored in the table by the [Azure Table persister](/persistence/azure-table) as serialized JSON strings. The data can and should be inspected for quality both before and after the import. The migrated endpoint and all migrated saga types should be thoroughly tested before moving to production to ensure the migration is correct.
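As a purely hypothetical illustration (the property names below are invented, and the row is shown as JSON only for readability — the actual column layout depends on the saga definition and persister version), a saga with a scalar `OrderId` property and a collection property `OrderItems` could have the collection stored as an escaped JSON string in a single column, which is the kind of value worth spot-checking before and after the import:

```json
{
  "PartitionKey": "<partition key>",
  "RowKey": "<row key>",
  "OrderId": "<order id>",
  "OrderItems": "[{\"Sku\":\"ABC-1\",\"Quantity\":2}]"
}
```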