diff --git a/TOC-tidb-cloud-premium.md b/TOC-tidb-cloud-premium.md
index a1080a1bb844d..79fd1e831d488 100644
--- a/TOC-tidb-cloud-premium.md
+++ b/TOC-tidb-cloud-premium.md
@@ -133,6 +133,7 @@
- [Connect via Private Endpoint with AWS](/tidb-cloud/premium/connect-to-premium-via-aws-private-endpoint.md)
- [Connect via Private Endpoint with Alibaba Cloud](/tidb-cloud/premium/connect-to-premium-via-alibaba-cloud-private-endpoint.md)
- [Back Up and Restore TiDB Cloud Data](/tidb-cloud/premium/backup-and-restore-premium.md)
+ - [Export Data from {{{ .premium }}}](/tidb-cloud/premium/premium-export.md)
- Use an HTAP Cluster with TiFlash
- [TiFlash Overview](/tiflash/tiflash-overview.md)
- [Create TiFlash Replicas](/tiflash/create-tiflash-replicas.md)
diff --git a/tidb-cloud/configure-external-storage-access.md b/tidb-cloud/configure-external-storage-access.md
index 3796e472fbd11..8b3c5b4a4e87f 100644
--- a/tidb-cloud/configure-external-storage-access.md
+++ b/tidb-cloud/configure-external-storage-access.md
@@ -22,7 +22,7 @@ If you need to configure these external storages for a TiDB Cloud Dedicated clus
## Configure Amazon S3 access
-To allow a TiDB Cloud cluster or instance to access the source data in your Amazon S3 bucket, configure the bucket access for the cluster or instance using either of the following methods:
+To allow a TiDB Cloud cluster or instance to access your Amazon S3 bucket, configure the bucket access for the cluster or instance using either of the following methods:
- [Use a Role ARN](#configure-amazon-s3-access-using-a-role-arn): use a Role ARN to access your Amazon S3 bucket.
- [Use an AWS access key](#configure-amazon-s3-access-using-an-aws-access-key): use the access key of an IAM user to access your Amazon S3 bucket.
@@ -35,11 +35,11 @@ It is recommended that you use [AWS CloudFormation](https://docs.aws.amazon.com/
>
> Role ARN access to Amazon S3 is only supported for clusters or instances with AWS as the cloud provider. If you use a different cloud provider, use an AWS access key instead. For more information, see [Configure Amazon S3 access using an AWS access key](#configure-amazon-s3-access-using-an-aws-access-key).
-1. Open the **Import** page for your target cluster or instance.
+1. Open the **Import** or **Export** page for your target cluster or instance.
    1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**Clusters**](https://tidbcloud.com/project/clusters) page of your project, or navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page.
-    2. Click the name of your target cluster or instance to go to its overview page, and then click **Data** > **Import** in the left navigation pane.
+    2. Click the name of your target cluster or instance to go to its overview page, and then click **Data** > **Import** or **Data** > **Export** in the left navigation pane.
2. Open the **Add New ARN** dialog.
@@ -51,10 +51,23 @@ It is recommended that you use [AWS CloudFormation](https://docs.aws.amazon.com/
- If you want to export data to Amazon S3, open the **Add New ARN** dialog as follows:
-    1. Click **Export data to...** > **Amazon S3**. If your cluster or instance has neither imported nor exported any data before, click **Click here to export data to...** > **Amazon S3** at the bottom of the page.
+
+    - If you want to export data to Amazon S3 from a TiDB Cloud cluster:
+
+ 1. Click **Export data to...** > **Amazon S3**. If your cluster has neither imported nor exported any data before, click **Click here to export data to...** > **Amazon S3** at the bottom of the page.
2. Fill in the **Folder URI** field.
3. Choose **AWS Role ARN** and click **Click here to create new one with AWS CloudFormation**.
+
+    - If you want to export data to Amazon S3 from a {{{ .premium }}} instance:
+
+ 1. Click **Export Data**.
+ 2. Choose **Amazon S3** in **Target Connection**.
+ 3. Fill in the **Folder URI** field.
+ 4. Choose **AWS Role ARN** and click **Click here to create new one with AWS CloudFormation**.
+
3. Create a role ARN with an AWS CloudFormation template.
1. In the **Add New ARN** dialog, click **AWS Console with CloudFormation Template**.
@@ -80,7 +93,7 @@ If you have any trouble creating a role ARN with AWS CloudFormation, you can tak
1. Sign in to the [AWS Management Console](https://console.aws.amazon.com/) and open the [Amazon S3 console](https://console.aws.amazon.com/s3/).
- 2. In the **Buckets** list, choose the name of your bucket with the source data, and then click **Copy ARN** to get your S3 bucket ARN (for example, `arn:aws:s3:::tidb-cloud-source-data`). Take a note of the bucket ARN for later use.
+ 2. In the **Buckets** list, choose the name of your target bucket, and then click **Copy ARN** to get your S3 bucket ARN (for example, `arn:aws:s3:::tidb-cloud-source-data`). Take a note of the bucket ARN for later use.

@@ -107,7 +120,7 @@ If you have any trouble creating a role ARN with AWS CloudFormation, you can tak
"s3:GetObjectVersion",
"s3:PutObject"
],
-                "Resource": "<your-s3-bucket-arn>/<source-data-directory>/*"
+                "Resource": "<your-s3-bucket-arn>/<data-directory>/*"
},
{
"Sid": "VisualEditor1",
@@ -123,10 +136,10 @@ If you have any trouble creating a role ARN with AWS CloudFormation, you can tak
In the policy text field, replace the following configurations with your own values.
-    - `"Resource": "<your-s3-bucket-arn>/<source-data-directory>/*"`. For example:
+    - `"Resource": "<your-s3-bucket-arn>/<data-directory>/*"`, where `<data-directory>` is the target directory for exported data or the source directory for imported data. For example:
- - If your source data is stored in the root directory of the `tidb-cloud-source-data` bucket, use `"Resource": "arn:aws:s3:::tidb-cloud-source-data/*"`.
- - If your source data is stored in the `mydata` directory of the bucket, use `"Resource": "arn:aws:s3:::tidb-cloud-source-data/mydata/*"`.
+ - If your data for import or export is in the root directory of the `tidb-cloud-source-data` bucket, use `"Resource": "arn:aws:s3:::tidb-cloud-source-data/*"`.
+ - If your data for import or export is in the `mydata` directory of the bucket, use `"Resource": "arn:aws:s3:::tidb-cloud-source-data/mydata/*"`.
Make sure that `/*` is added to the end of the directory so TiDB Cloud can access all files in this directory.
@@ -221,7 +234,7 @@ Take the following steps to configure a service account key:
-
+
## Configure Azure Blob Storage access
@@ -231,17 +244,36 @@ You can create a SAS token either using an [Azure ARM template](https://learn.mi
To create a SAS token using an Azure ARM template, take the following steps:
-1. Open the **Import** page for your target cluster.
+1. Open the **Import** or **Export** page for your target cluster or instance.
- 1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**Clusters**](https://tidbcloud.com/project/clusters) page of your project.
+    1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**Clusters**](https://tidbcloud.com/project/clusters) page of your project, or navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page.
- 2. Click the name of your target cluster to go to its overview page, and then click **Data** > **Import** in the left navigation pane.
+    2. Click the name of your target cluster or instance to go to its overview page, and then click **Data** > **Import** or **Data** > **Export** in the left navigation pane.
2. Open the **Generate New SAS Token via ARM Template Deployment** dialog.
- 1. Click **Export data to...** > **Azure Blob Storage**. If your cluster has neither imported nor exported any data before, click **Click here to export data to...** > **Azure Blob Storage** at the bottom of the page.
+ - If you want to import data from Azure Blob Storage:
+
+ 1. Click **Import from Azure Blob Storage**.
+ 2. Fill in the **Folder URI** field.
+ 3. In the **SAS Token** field, click **Click here to create a new one with Azure ARM template**.
+
+ - If you want to export data to Azure Blob Storage:
+
+ 1. Click **Export data to...** > **Azure Blob Storage**. If your cluster has neither imported nor exported any data before, click **Click here to export data to...** > **Azure Blob Storage** at the bottom of the page.
+ 2. Scroll down to the **Azure Blob Storage Settings** area, and then click **Click here to create a new one with Azure ARM template** under the SAS Token field.
+
+    - If you want to export data to Azure Blob Storage from a {{{ .premium }}} instance:
+
+ 1. Click **Export Data**.
+ 2. Choose **Azure Blob Storage** in **Target Connection**.
+ 3. Click **Click here to create a new one with Azure ARM template** under the SAS Token field.
- 2. Scroll down to the **Azure Blob Storage Settings** area, and then click **Click here to create a new one with Azure ARM template** under the SAS Token field.
+
3. Create a SAS token with the Azure ARM template.
diff --git a/tidb-cloud/premium/premium-export.md b/tidb-cloud/premium/premium-export.md
new file mode 100644
index 0000000000000..57748c3c3b669
--- /dev/null
+++ b/tidb-cloud/premium/premium-export.md
@@ -0,0 +1,184 @@
+---
+title: Export Data from {{{ .premium }}}
+summary: Learn how to export data from {{{ .premium }}} instances.
+---
+
+# Export Data from {{{ .premium }}}
+
+TiDB Cloud enables you to export data from a {{{ .premium }}} instance to an external storage service. You can use the exported data for backup, migration, data analysis, or other purposes.
+
+While you can also export data using tools such as [mysqldump](https://dev.mysql.com/doc/refman/8.0/en/mysqldump.html) and TiDB [Dumpling](https://docs.pingcap.com/tidb/dev/dumpling-overview), the export feature provided by TiDB Cloud offers a more convenient and efficient way to export data from a {{{ .premium }}} instance. It brings the following benefits:
+
+- Convenience: the export service provides a simple and easy-to-use way to export data from a {{{ .premium }}} instance, eliminating the need for additional tools or resources.
+- Isolation: the export service uses separate computing resources, ensuring isolation from the resources used by your online services.
+- Consistency: the export service exports a consistent snapshot of your data without taking locks, so it does not affect your online services.
+
+> **Note:**
+>
+> The maximum export size is 100 GiB. Exports larger than this limit might fail. To export more data or request a higher export speed, contact [TiDB Cloud Support](/tidb-cloud/tidb-cloud-support.md).
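For comparison with the self-managed route mentioned above, a Dumpling export might look like the following. This is a hedged sketch: the host, user, password, and output directory are placeholders, and you should check `tiup dumpling --help` for the flags available in your Dumpling version.

```shell
# Export the mydb database as gzip-compressed CSV files using Dumpling.
# -h/-u/-p values are placeholders; -P 4000 is the default TiDB port.
# --filter limits the export to mydb; -o sets the local output directory.
tiup dumpling -h <host> -P 4000 -u <user> -p '<password>' \
  --filetype csv --filter 'mydb.*' --compress gzip -o ./dumpling-export
```

Unlike the built-in export service, this runs on your own machine and consumes your own compute and network resources.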
+
+## Export locations
+
+You can export data to the following external storage locations:
+
+- [Amazon S3](https://aws.amazon.com/s3/)
+- [Azure Blob Storage](https://azure.microsoft.com/en-us/services/storage/blobs/)
+- [Alibaba Cloud Object Storage Service (OSS)](https://www.alibabacloud.com/product/oss)
+
+### Amazon S3
+
+To export data to Amazon S3, you need to provide the following information:
+
+- URI: `s3://<bucket-name>/<folder-path>/`
+- One of the following access credentials:
+ - [An access key](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html): make sure the access key has the `s3:PutObject` permission.
+ - [A role ARN](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html): make sure the role ARN (Amazon Resource Name) has the `s3:PutObject` permission. Note that only {{{ .premium }}} instances hosted on AWS support the role ARN.
+
+For more information, see [Configure External Storage Access](/tidb-cloud/configure-external-storage-access.md#configure-amazon-s3-access).
+
+### Azure Blob Storage
+
+To export data to Azure Blob Storage, you need to provide the following information:
+
+- URI: `azure://<account-name>.blob.core.windows.net/<container-name>/<folder-path>/` or `https://<account-name>.blob.core.windows.net/<container-name>/<folder-path>/`
+- Access credential: a [shared access signature (SAS) token](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview) for your Azure Blob Storage container. Make sure the SAS token has the `Read` and `Write` permissions on the `Container` and `Object` resources.
+
+For more information, see [Configure External Storage Access](/tidb-cloud/configure-external-storage-access.md#configure-azure-blob-storage-access).
+
+### Alibaba Cloud OSS
+
+To export data to Alibaba Cloud OSS, you need to provide the following information:
+
+- URI: `oss://<bucket-name>/<folder-path>/`
+- Access credential: An [AccessKey pair](https://www.alibabacloud.com/help/en/ram/user-guide/create-an-accesskey-pair) for your Alibaba Cloud account. Make sure the AccessKey pair has the `oss:PutObject` and `oss:GetBucketInfo` permissions.
+
+For more information, see [Configure External Storage Access](/tidb-cloud/configure-external-storage-access.md#configure-alibaba-cloud-object-storage-service-oss-access).
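As a quick sanity check before submitting an export task, the folder URI shapes above can be validated with a small script. This is an illustrative helper, not part of TiDB Cloud; it only checks the URI shape (including the trailing slash shown in the formats above), not whether the bucket or credentials actually exist.

```python
import re

# Folder URI shapes from the sections above (trailing slash included).
FOLDER_URI_PATTERNS = {
    "Amazon S3": re.compile(r"^s3://[^/]+/.+/$"),
    "Azure Blob Storage": re.compile(
        r"^(azure|https)://[^/.]+\.blob\.core\.windows\.net/[^/]+/.+/$"
    ),
    "Alibaba Cloud OSS": re.compile(r"^oss://[^/]+/.+/$"),
}

def match_folder_uri(uri: str):
    """Return the storage provider name whose URI shape matches, or None."""
    for provider, pattern in FOLDER_URI_PATTERNS.items():
        if pattern.match(uri):
            return provider
    return None
```

For example, `match_folder_uri("s3://my-bucket/exports/")` returns `"Amazon S3"`, while a URI without the trailing slash matches nothing.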
+
+## Export options
+
+### Data filtering
+
+In the TiDB Cloud console, you can select which databases and tables to export.
+
+### Data formats
+
+You can export data in the following formats:
+
+- `SQL`: export data in SQL format.
+- `CSV`: export data in CSV format. You can specify the following options:
+    - `delimiter`: specify the character used to quote field values in the exported data. The default delimiter is `"`.
+ - `separator`: specify the character used to separate fields in the exported data. The default separator is `,`.
+ - `header`: specify whether to include a header row in the exported data. The default value is `true`.
+ - `null-value`: specify the string that represents a NULL value in the exported data. The default value is `\N`.
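How these options interact can be illustrated with a small, hypothetical rendering function. This is not the exporter's actual code, and the real quoting and escaping details may differ; it only demonstrates the roles of `separator`, `delimiter`, and `null-value`.

```python
def render_csv_row(values, separator=",", delimiter='"', null_value=r"\N"):
    """Render one row the way the default CSV options above describe:
    fields joined by `separator`, strings wrapped in `delimiter`,
    and NULLs written as `null_value`."""
    fields = []
    for value in values:
        if value is None:
            fields.append(null_value)
        elif isinstance(value, str):
            # Escape embedded delimiters by doubling them, CSV-style.
            fields.append(delimiter + value.replace(delimiter, delimiter * 2) + delimiter)
        else:
            fields.append(str(value))
    return separator.join(fields)
```

For example, `render_csv_row([1, "Alice", None])` produces `1,"Alice",\N`.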
+
+The schema and data are exported according to the following naming conventions:
+
+| Item            | Not compressed                  | Compressed                                         |
+|-----------------|---------------------------------|----------------------------------------------------|
+| Database schema | `{database}-schema-create.sql`  | `{database}-schema-create.sql.{compression-type}`  |
+| Table schema    | `{database}.{table}-schema.sql` | `{database}.{table}-schema.sql.{compression-type}` |
+| Data (CSV)      | `{database}.{table}.{0001}.csv` | `{database}.{table}.{0001}.csv.{compression-type}` |
+| Data (SQL)      | `{database}.{table}.{0001}.sql` | `{database}.{table}.{0001}.sql.{compression-type}` |
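The naming conventions in the table can be expressed programmatically, which is handy when locating exported files downstream. The helper below is illustrative only: it appends the compression algorithm name verbatim as `{compression-type}`, while the actual service may use conventional short suffixes (for example, `.gz` rather than `.gzip`).

```python
def exported_file_names(database: str, table: str, fmt: str = "csv",
                        compression: str = "gzip", part: int = 1) -> dict:
    """Build the file names per the naming-convention table above."""
    suffix = "" if compression == "none" else f".{compression}"
    return {
        "database_schema": f"{database}-schema-create.sql{suffix}",
        "table_schema": f"{database}.{table}-schema.sql{suffix}",
        "data": f"{database}.{table}.{part:04d}.{fmt}{suffix}",
    }
```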
+
+### Data compression
+
+You can compress the exported CSV and SQL data using the following algorithms:
+
+- `gzip` (default): compress the exported data with `gzip`.
+- `snappy`: compress the exported data with `snappy`.
+- `zstd`: compress the exported data with `zstd`.
+- `none`: do not compress the exported data.
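Because `gzip` is the default, downstream consumers typically need to decompress the files before use. A minimal Python sketch of reading one gzip-compressed CSV part follows; the file content here is made-up sample data standing in for an exported part such as `mydb.t1.0001.csv.gzip`.

```python
import csv
import gzip
import io

# Stand-in for the bytes of a gzip-compressed exported CSV part.
compressed = gzip.compress(b'1,"Alice",\\N\n2,"Bob",\\N\n')

# Decompress and parse; "\N" is the default NULL marker described above.
with gzip.open(io.BytesIO(compressed), mode="rt", newline="") as f:
    rows = [[None if field == r"\N" else field for field in row]
            for row in csv.reader(f)]
```

In practice you would open the downloaded file directly with `gzip.open(path, "rt")` instead of wrapping in-memory bytes.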
+
+## Examples
+
+### Export data to Amazon S3
+
+1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page.
+
+ > **Tip:**
+ >
+ > You can use the combo box in the upper-left corner to switch between organizations, projects, and instances.
+
+2. Click the name of your target {{{ .premium }}} instance to go to its overview page, and then click **Data** > **Export** in the left navigation pane.
+
+3. On the **Export** page, click **Export Data** in the upper-right corner. Then configure the following settings:
+
+ - **Task Name**: enter a name for the export task. The default value is `SNAPSHOT_{snapshot_time}`.
+ - **Source Connection**: enter **Username** and **Password** of your {{{ .premium }}} instance, and then click **Test Connection** to verify the credentials.
+ - **Target Connection**:
+ - **Storage Provider**: choose Amazon S3.
+        - **Folder URI**: enter the URI of the Amazon S3 folder in the `s3://<bucket-name>/<folder-path>/` format.
+ - **Bucket Access**: choose one of the following access credentials and then fill in the credential information:
+ - **AWS Role ARN**: enter the role ARN that has the permission to access the bucket. It is recommended to create the role ARN with AWS CloudFormation. For more information, see [Configure External Storage Access](/tidb-cloud/configure-external-storage-access.md#configure-amazon-s3-access).
+ - **AWS Access Key**: enter the access key ID and access key secret that have the permission to access the bucket.
+ - **Exported Data**: choose the databases or tables you want to export.
+ - **Data Format**: choose **SQL** or **CSV**.
+ - **Compression**: choose **Gzip**, **Snappy**, **Zstd**, or **None**.
+
+4. Click **Export**.
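Once the task completes, you can verify that the files landed in the bucket with the AWS CLI. The bucket and folder names below are placeholders, and the command assumes your local AWS credentials can list the bucket.

```shell
# List the exported files; requires credentials with s3:ListBucket.
aws s3 ls s3://<bucket-name>/<folder-path>/ --recursive --human-readable
```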
+
+### Export data to Azure Blob Storage
+
+1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page.
+
+ > **Tip:**
+ >
+ > You can use the combo box in the upper-left corner to switch between organizations, projects, and instances.
+
+2. Click the name of your target {{{ .premium }}} instance to go to its overview page, and then click **Data** > **Export** in the left navigation pane.
+
+3. On the **Export** page, click **Export Data** in the upper-right corner. Then configure the following settings:
+
+ - **Task Name**: enter a name for the export task. The default value is `SNAPSHOT_{snapshot_time}`.
+ - **Source Connection**: enter **Username** and **Password** of your {{{ .premium }}} instance, and then click **Test Connection** to verify the credentials.
+ - **Target Connection**:
+ - **Storage Provider**: choose Azure Blob Storage.
+        - **Folder URI**: enter the URI of the Azure Blob Storage folder in the `azure://<account-name>.blob.core.windows.net/<container-name>/<folder-path>/` format.
+ - **SAS Token**: enter the SAS token that has the permission to access the container. It is recommended to create a SAS token with the [Azure ARM template](https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/). For more information, see [Configure External Storage Access](/tidb-cloud/configure-external-storage-access.md#configure-azure-blob-storage-access).
+ - **Exported Data**: choose the databases or tables you want to export.
+ - **Data Format**: choose **SQL** or **CSV**.
+ - **Compression**: choose **Gzip**, **Snappy**, **Zstd**, or **None**.
+
+4. Click **Export**.
+
+### Export data to Alibaba Cloud OSS
+
+1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page.
+
+ > **Tip:**
+ >
+ > You can use the combo box in the upper-left corner to switch between organizations, projects, and instances.
+
+2. Click the name of your target {{{ .premium }}} instance to go to its overview page, and then click **Data** > **Export** in the left navigation pane.
+
+3. On the **Export** page, click **Export Data** in the upper-right corner. Then configure the following settings:
+
+ - **Task Name**: enter a name for the export task. The default value is `SNAPSHOT_{snapshot_time}`.
+ - **Source Connection**: enter **Username** and **Password** of your {{{ .premium }}} instance, and then click **Test Connection** to verify the credentials.
+ - **Target Connection**:
+ - **Storage Provider**: choose Alibaba Cloud OSS.
+        - **Folder URI**: enter the Alibaba Cloud OSS URI where you want to export the data, in the `oss://<bucket-name>/<folder-path>/` format.
+ - **AccessKey ID** and **AccessKey Secret**: enter the AccessKey ID and AccessKey Secret that have the permission to access the bucket.
+ - **Exported Data**: choose the databases or tables you want to export.
+ - **Data Format**: choose **SQL** or **CSV**.
+ - **Compression**: choose **Gzip**, **Snappy**, **Zstd**, or **None**.
+
+4. Click **Export**.
+
+### Cancel an export task
+
+To cancel an ongoing export task, take the following steps:
+
+1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**TiDB Instances**](https://tidbcloud.com/tidbs) page.
+
+ > **Tip:**
+ >
+ > You can use the combo box in the upper-left corner to switch between organizations, projects, and instances.
+
+2. Click the name of your target {{{ .premium }}} instance to go to its overview page, and then click **Data** > **Export** in the left navigation pane.
+
+3. On the **Export** page, view the export task list.
+
+4. Choose the export task you want to cancel, and then click **Action**.
+
+5. Choose **Cancel** in the drop-down list. Note that you can only cancel an export task that is in the **Running** status.