# s3_move>: Move files in Amazon S3

The **s3_move>** operator moves files within Amazon S3 by copying them to the destination and then deleting the source.

```yaml
+move_file:
  s3_move>:
  source: source-bucket/source-key
  destination: destination-bucket/destination-key
```

## Secrets

If you are not sure how to set secrets, refer to [Managing Workflow Secret](https://docs.treasuredata.com/articles/pd/about-workflow-secret-management).

* **aws.s3.region, aws.region**

  An optional explicit AWS Region in which to access S3. Default is us-east-1.

* **aws.s3.access_key_id, aws.access_key_id**

  The AWS Access Key ID to use when accessing S3. When using `credential_provider: assume_role`, this is not required.

* **aws.s3.secret_access_key, aws.secret_access_key**

  The AWS Secret Access Key to use when accessing S3. When using `credential_provider: assume_role`, this is not required.

## Options

* **source**: `SOURCE_BUCKET/SOURCE_KEY`

  Path to the source file in Amazon S3 to move from. Use either this parameter or the combination of `source_bucket` and `source_key`.

  Examples:

  ```yaml
  source: source-bucket/my-data.gz
  ```

  ```yaml
  source: source-bucket/file/in/a/directory
  ```

* **source_bucket**: SOURCE_BUCKET

  The S3 bucket where the source file is located. Can be used together with the `source_key` parameter; a combined example is shown after this list.

* **source_key**: SOURCE_KEY

  The S3 key of the source file. Can be used together with the `source_bucket` parameter.

* **destination**: `DESTINATION_BUCKET/DESTINATION_KEY`

  Path to the destination file in Amazon S3 to move to. Use either this parameter or the combination of `destination_bucket` and `destination_key`.

  Examples:

  ```yaml
  destination: destination-bucket/my-data-moved.gz
  ```

  ```yaml
  destination: destination-bucket/file/in/another/directory
  ```

* **destination_bucket**: DESTINATION_BUCKET

  The S3 bucket where the destination file will be created. Can be used together with the `destination_key` parameter.

* **destination_key**: DESTINATION_KEY

  The S3 key of the destination file. Can be used together with the `destination_bucket` parameter.

* **recursive**: BOOLEAN

  Move all objects with the specified prefix recursively. Default is false.

  Example:

  ```yaml
  +move_directory:
    s3_move>:
    source: source-bucket/my-directory/
    destination: destination-bucket/backup/
    recursive: true
  ```

* **objects_per_iteration**: NUMBER

  Maximum number of objects to move per iteration when using recursive mode. Must be between 1 and 1000. Default is 1000.

* **region**: REGION

  An optional explicit AWS Region in which to access S3. This may also be specified using the `aws.s3.region` secret. Default is us-east-1.

* **path_style_access**: BOOLEAN

  An optional flag to control whether to use path-style or virtual hosted-style access when accessing S3. Default is false.

* **log_each_object**: BOOLEAN

  Whether to log each object being moved. Default is true.

* **credential_provider**: NAME

  The credential provider to use for AWS authentication. Supported values are `access_key` (default) and `assume_role`.

  Example:

  ```yaml
  +move_file_with_assume_role:
    s3_move>:
    source: source-bucket/source-key
    destination: destination-bucket/destination-key
    credential_provider: assume_role
    assume_role_authentication_id: ${auth_id}
  ```

* **assume_role_authentication_id**: NUMBER

  The authentication ID for assume role when using `credential_provider: assume_role`. This corresponds to the `Amazon S3 Import Integration v2` configuration. How to get the authentication ID is described in [Reusing the existing Authentication](https://docs.treasuredata.com/articles/#!int/amazon-s3-import-integration-v2/a/AmazonS3ImportIntegrationv2-reuseAuthenticationReusingtheexistingAuthentication).
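The split `source_bucket`/`source_key` and `destination_bucket`/`destination_key` form has no dedicated example above, so here is a minimal sketch. It is illustrative only: the task name, bucket names, keys, and region value are placeholders, and it simply combines options documented in the list above.

```yaml
+move_file_split_params:
  s3_move>:
  # Same move expressed with separate bucket/key parameters
  source_bucket: source-bucket
  source_key: my-data.gz
  destination_bucket: destination-bucket
  destination_key: archive/my-data.gz
  # Optional explicit region (placeholder value)
  region: us-west-2
```

This should be equivalent to `source: source-bucket/my-data.gz` with `destination: destination-bucket/archive/my-data.gz`.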
## Notes

* When moving folders recursively, you cannot move a folder into itself or into one of its subfolders. For example, you cannot move `my-folder/` to `my-folder/backup/`.

* The move operation works by copying files first, then deleting the originals. If something goes wrong during the process, some files may already have been copied to the destination but not yet deleted from the source.
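Because the move is copy-then-delete, one way to cope with an interrupted run is to let the task retry until it succeeds. The sketch below is an assumption, not documented operator behavior: it uses Digdag's standard `_retry` option and assumes that re-running the recursive move over the same prefix is acceptable in your workflow.

```yaml
# Illustrative only: retry the whole move up to 3 times if the
# copy-then-delete sequence is interrupted. Task name, paths, and
# retry count are placeholders.
+move_directory_with_retry:
  _retry: 3
  s3_move>:
  source: source-bucket/my-directory/
  destination: destination-bucket/backup/
  recursive: true
```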