
How to Automatically Upload Local Files to S3

Before reading…

  • Target Audience: This article is written for developers with beginner to intermediate skills. They are familiar with the tenets of software development and PowerShell. They are new to Amazon Web Services (AWS) and the Simple Storage Service (S3).
  • Scenario: Assuming the programmer is new to AWS, the document uses a conversational tone to walk them through an end-to-end scenario. It gets them started with the various Amazon Web Services and provides step-by-step details for authoring the PowerShell script.
  • Sources: I wrote 100% of the content, including samples and PowerShell code, without an editor providing input. The information was derived from various Amazon documentation, pulled together to be a cohesive set of end-to-end instructions.

…and here's the sample.


The ability to script important tasks allows IT professionals to be efficient and effective in their work. Backing up important files to the cloud or triggering the upload of new files to your site are critical functions that can be scripted to improve efficiency and make the processes resilient.

In this document, you'll learn how to upload files to Amazon Simple Storage Service (S3) using PowerShell. You'll create an Amazon Web Services (AWS) account that will allow you to manage AWS users and groups via Identity and Access Management (IAM) as well as work with S3 buckets and folders. After installing the AWS Tools for Windows PowerShell, you'll configure the prerequisites for connecting to AWS via PowerShell. And finally, you'll use PowerShell to recursively iterate through directories to upload files to S3.

If you're new to Amazon S3 and need to start from scratch, this is a beginning-to-end walkthrough of how to upload your files to Amazon Simple Storage Service (S3) using PowerShell.

A breakdown of the procedure (detailed steps are further below):

  1. Create an Amazon account to access Amazon Web Services (AWS)

    a. You can try most of the AWS services for free for a year.

  2. Create a user and group via Amazon Identity and Access Management (IAM) to perform the backup/upload

  3. Create a bucket in Amazon Simple Storage Service (S3) to hold your files

  4. Install AWS Tools for Windows PowerShell, which contains the modules needed to access AWS

  5. Open PowerShell and configure prerequisite settings

  6. Write a PowerShell script that copies files from your local computer to the Amazon S3 bucket you previously created

    a. The script will use the credentials of the backup user created.

    b. The script will be a PowerShell framework script to get you started.

This tutorial will get you up and running with Amazon Simple Storage Service (S3) and a PowerShell script that uploads files. Once this framework script is in place, you can add to the logic, making it as specific or complex as necessary for your situation.

1. Create an Amazon root account

Amazon has made this simple. Navigate to Amazon AWS and select "Create a Free Account." Follow the steps outlined.

If you already have an account with Amazon for consumer purchases, you can use this for a single logon identity. I recommend using Multi-Factor Authentication with your AWS Root account to provide an additional layer of security.

To kick the tires on AWS without hitting your pocketbook, select the free-for-a-year option and get some experience under your belt.

2. Create a user and group

To access the AWS services, create a group and user account via IAM. Keep these guidelines in mind:

  • Follow IAM's Best Practices for user and group management
  • Create a group to backup/upload files with the appropriate permissions
  • Create a user for accessing S3 during the backup/upload from PowerShell

You can find detailed instructions for how to create an account and group in the video titled "Getting Started with AWS Identity and Access Management" at IAM's Getting Started. The following is a synopsis.

  1. Login to the AWS Management Console with your Root account or an Administrator account (if you created a separate one)
    • Navigate to Amazon AWS -> select My Account in the top right -> select AWS Management Console from the drop-down list
  2. Create a group with access to S3 buckets called S3BackupOperators
    • Set permissions for this group to AmazonS3FullAccess so that members have the necessary access and permissions to perform backups/uploads.
  3. Create a user for accessing the S3 buckets called backupOperator
    • Add this user to the S3BackupOperators group, from which they will inherit the permissions they need to write files to the S3 buckets.
  4. Generate Access Keys for the user
    • In order to access the Amazon S3 buckets via PowerShell, your IAM user (backupOperator) will need both an Access Key ID and a Secret Access Key (to specify in the PowerShell script for authentication).
    • To create an access key, log into the AWS Management Console and:
      1. Select Users
      2. Select the user name previously created
      3. Scroll down and select Create Access Key
      4. Be sure to write down or save the Access Keys created
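Once you have the keys, one optional way to avoid pasting them into every script is to store them as a local credential profile using the AWS Tools. The sketch below uses the example keys from this article and a hypothetical profile name; note that older releases of the AWS Tools name this cmdlet Set-AWSCredentials (plural) instead.

```powershell
# Store the backupOperator keys in a local credential profile
# (profile name is a hypothetical example).
Set-AWSCredential -AccessKey "EXAMPLEDXLAW52MZCGIA" `
                  -SecretKey "examplekfLK2c8NCFjlhhjxvYBxJwPkli1HosK4F" `
                  -StoreAs "backupOperator"
```

Later cmdlet calls can then pass -ProfileName backupOperator instead of raw keys.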

3. Create a bucket in S3

You will need an Amazon S3 bucket to hold your files, which is analogous to a directory/folder on your local computer. More information can be found at Working with Amazon S3 Buckets.

Follow the instructions at Create a Bucket and name it something relevant, such as Backups.

Note: Because the AmazonS3FullAccess policy was applied to the S3BackupOperators group, members of that group have add/delete permissions to your S3 buckets. If you would like to further restrict access for that group to only the Backups bucket, review the documentation at Managing Access Permissions to Your Amazon S3 Resources.
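As a rough sketch of what such a scoped policy could look like (an illustrative example only, not a tested configuration from this article; adjust the bucket name and actions to your needs), a custom IAM policy attached to the S3BackupOperators group might limit S3 actions to the Backups bucket:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListAllMyBuckets", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": ["arn:aws:s3:::Backups", "arn:aws:s3:::Backups/*"]
    }
  ]
}
```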

4. Install AWS Tools for Windows PowerShell

Amazon has written a PowerShell module that allows you to interact with Amazon Web Services remotely via PowerShell scripts. Download AWS Tools for Windows PowerShell to your Windows PC and follow the installation instructions.

For further details, read Setting up the AWS Tools for Windows PowerShell.

Note: You will need to enable script execution each time you open the PowerShell prompt, which requires administrator privileges. How to do this is detailed further below.

5. Open PowerShell and configure prerequisite settings

There are a few caveats for using PowerShell with AWS.

Select which PowerShell prompt to use

There are two PowerShell prompts you can use, with slightly different requirements to get started.

  1. Windows PowerShell, which comes with Windows by default
  2. Windows PowerShell for AWS, which is installed with AWS Tools for Windows PowerShell

For both PowerShell prompts, you will need to enable script execution, as outlined in Setting up the AWS Tools for Windows PowerShell. Be sure to open the prompt as an administrator and then run Set-ExecutionPolicy RemoteSigned.

Enable script execution for PowerShell

Import AWSPowerShell module

You might need to import the AWSPowerShell module, depending on which PowerShell prompt you use.

When using the Windows PowerShell for AWS prompt, the AWSPowerShell module is automatically imported and the Initialize-AWSDefaults cmdlet run for you, allowing you to begin working with the AWS PowerShell Cmdlets immediately.

When using the default Windows PowerShell prompt, you will need to manually import the AWSPowerShell module and run the Initialize-AWSDefaults cmdlet using the following commands.

    PS C:\> Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
    PS C:\> Initialize-AWSDefaults

Import the AWSPowerShell module

6. PowerShell script to upload/backup files to Amazon S3

The previous steps prepared you to copy files from your computer to Amazon S3 using PowerShell.

The full script will be shown at the end of this document. Meanwhile, let's step through the sections of the script.

6.A. Prepare constant variables

First, set the constant variables that will be used in the script.

  • $accessKeyID: The Access Key ID for the backupOperator user (example value: EXAMPLEDXLAW52MZCGIA)
  • $secretAccessKey: The Secret Access Key for the backupOperator user (example value: examplekfLK2c8NCFjlhhjxvYBxJwPkli1HosK4F)
  • $config: An AmazonS3Config object to hold configuration options, such as RegionEndpoint and ServiceURL. These depend on your AWS account settings. In this example, the account uses the US-WEST-2 region endpoint: RegionEndpoint = us-west-2, ServiceURL = https://s3-us-west-2.amazonaws.com/

Note: To find out which Region your account is using, login to the AWS Management Console and note the Region specified in the URL. The following example uses US-WEST-2: https://us-west-2.console.aws.amazon.com/console/home?nc2=h_m_mc&region=us-west-2#
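With the AWS Tools module already loaded (step 4), you can also list the available region endpoints directly from PowerShell instead of inspecting the Console URL; the Get-AWSRegion cmdlet is part of the AWSPowerShell module:

```powershell
# List the region endpoints the AWS Tools know about;
# the Region column holds values such as us-west-2.
Get-AWSRegion
```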

Commands to set the user's Key variables (using the Access Keys previously generated) and to instantiate the AmazonS3Config object to hold the RegionEndpoint and ServiceURL values:

    $accessKeyID = "EXAMPLEDXLAW52MZCGIA"
    $secretAccessKey = "examplekfLK2c8NCFjlhhjxvYBxJwPkli1HosK4F"
    $config = New-Object Amazon.S3.AmazonS3Config
    $config.RegionEndpoint = [Amazon.RegionEndpoint]::"us-west-2"
    $config.ServiceURL = "https://s3-us-west-2.amazonaws.com/"

6.B. Create AmazonS3Client object

In this step, you will instantiate an AmazonS3Client object. It will use the permissions and access keys granted to the backupOperator user, based on the variables created in the previous step. The AmazonS3Client object also requires the previously created AmazonS3Config object to interact with your Amazon S3 buckets.

Instantiate the AmazonS3Client object:

    $client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($accessKeyID, $secretAccessKey, $config)

6.C. Upload/backup files via PowerShell

With PowerShell, you have several options for uploading your files.

  • Copy specific files from a single folder or multiple directories.
  • Copy all files in a directory without including subdirectories.
  • Copy a directory and its subdirectories, by iterating through the subdirectories with PowerShell.

Copy specific files

To copy specific files, use the following commands.

    Write-S3Object -BucketName Backups -File "C:\Documents\Business\FinancialReports.xlsx" -Key "/Documents/Business/FinancialReports.xlsx"
    Write-S3Object -BucketName Backups -File "C:\Pictures\Business Logos\logo.jpg" -Key "/Pictures/Business Logos/logo.jpg"

Note: See the Write-S3Object Cmdlet documentation for more information.

Copy all files in a directory

To copy all the files in a directory, use the following command.

    Write-S3Object -BucketName Backups -Folder "C:\Pictures\Family" -KeyPrefix "/Pictures/Family"

Note: See the Write-S3Object Cmdlet documentation for more information.

Copy a directory and subdirectories

To copy a directory and its subdirectories, use the following function to iterate through the subdirectories recursively.

    function RecurseFolders([string]$path) {
      $fc = New-Object -com Scripting.FileSystemObject
      $folder = $fc.GetFolder($path)

      # Iterate through subfolders
      foreach ($i in $folder.SubFolders) {
        $thisFolder = $i.Path

        # Transform the local directory path to notation compatible with S3 Buckets and Folders
        # 1. Trim off the drive letter and colon from the start of the Path
        $s3Path = $thisFolder.ToString()
        $s3Path = $s3Path.SubString(2)

        # 2. Replace back-slashes with forward-slashes
        # Escape the back-slash special character with a back-slash so that it reads it literally, like so: "\\"
        $s3Path = $s3Path -replace "\\", "/"

        # Upload directory to S3
        Write-S3Object -BucketName Backups -Folder $thisFolder -KeyPrefix $s3Path
      }

      # If subfolders exist in the current folder, then iterate through them too
      foreach ($i in $folder.SubFolders) {
        RecurseFolders($i.Path)
      }
    }

6.D. Full PowerShell script

The following is a full PowerShell script that will backup/upload a directory (including all subdirectories) from your local computer to an Amazon S3 Bucket.

    # Constants
    $sourceDrive = "C:\"
    $sourceFolder = "ImportantFiles"
    $sourcePath = $sourceDrive + $sourceFolder
    $s3Bucket = "Backups"
    $s3Folder = "Archive"

    # Constants – Amazon S3 Credentials
    $accessKeyID = "EXAMPLEDXLAW52MZCGIA"
    $secretAccessKey = "examplekfLK2c8NCFjlhhjxvYBxJwPkli1HosK4F"

    # Constants – Amazon S3 Configuration
    $config = New-Object Amazon.S3.AmazonS3Config
    $config.RegionEndpoint = [Amazon.RegionEndpoint]::"us-west-2"
    $config.ServiceURL = "https://s3-us-west-2.amazonaws.com/"

    # Instantiate the AmazonS3Client object
    $client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($accessKeyID, $secretAccessKey, $config)

    # FUNCTION – Iterate through subfolders and upload files to S3
    function RecurseFolders([string]$path) {
      $fc = New-Object -com Scripting.FileSystemObject
      $folder = $fc.GetFolder($path)

      foreach ($i in $folder.SubFolders) {
        $thisFolder = $i.Path

        # Transform the local directory path to notation compatible with S3 Buckets and Folders
        # 1. Trim off the drive letter and colon from the start of the Path
        $s3Path = $thisFolder.ToString()
        $s3Path = $s3Path.SubString(2)

        # 2. Replace back-slashes with forward-slashes
        # Escape the back-slash special character with a back-slash so that it reads it literally, like so: "\\"
        $s3Path = $s3Path -replace "\\", "/"
        $s3Path = "/" + $s3Folder + $s3Path

        # Upload directory to S3
        Write-S3Object -BucketName $s3Bucket -Folder $thisFolder -KeyPrefix $s3Path
      }

      # If subfolders exist in the current folder, then iterate through them too
      foreach ($i in $folder.SubFolders) {
        RecurseFolders($i.Path)
      }
    }

    # Upload root directory files to S3
    $s3Path = "/" + $s3Folder + "/" + $sourceFolder
    Write-S3Object -BucketName $s3Bucket -Folder $sourcePath -KeyPrefix $s3Path

    # Upload subdirectories to S3
    RecurseFolders($sourcePath)

Summary

As stated at the start of this article, the purpose of this tutorial is to get you up and running from scratch with Amazon Simple Storage Service (S3) and a PowerShell script that uploads files.

Through this process you:

  1. Created an Amazon AWS account
  2. Learned how to create and manage AWS users and groups via IAM
  3. Learned how to manage permissions for a group, scoping them based on their role
  4. Created an S3 bucket and configured permissions through an IAM group
  5. Experienced how to configure AWS Tools for Windows PowerShell
  6. Discovered the prerequisites for connecting to AWS via PowerShell
  7. Learned how to use the Write-S3Object cmdlet to upload files to S3 buckets
  8. Saw how to iterate through folders and subfolders recursively with PowerShell
  9. Saw how to change the local folder path format to work with the S3 folder format

With this framework in place, you can expand the script to do a number of other things, such as run once per day with a scheduled task (automating to improve efficiency), delete old folders/files from S3 as they're no longer used (reducing your S3 storage cost), or only upload new/changed files to S3 (reducing your data transfer cost).
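As one sketch of the "only upload new/changed files" idea (illustrative, not a tested part of this tutorial): it reuses the $sourcePath, $s3Bucket, and $s3Folder constants and the path transform from the full script above, and assumes the script runs from a daily scheduled task, so one day is used as the cutoff.

```powershell
# Upload only files modified since the cutoff (here, the last 24 hours).
$cutoff = (Get-Date).AddDays(-1)

Get-ChildItem -Path $sourcePath -Recurse -File |
  Where-Object { $_.LastWriteTime -gt $cutoff } |
  ForEach-Object {
    # Same transform as the full script: drop "C:" and flip the back-slashes
    $key = "/" + $s3Folder + ($_.FullName.Substring(2) -replace "\\", "/")
    Write-S3Object -BucketName $s3Bucket -File $_.FullName -Key $key
  }
```

A scheduled task could then invoke the script with powershell.exe -File on whatever cadence matches the cutoff.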

Resources

  • Amazon Web Services (AWS)
  • Identity and Access Management (IAM):
    • Getting Started
    • IAM's Best Practices
    • Root Account Credentials vs. IAM User Credentials
    • User Guide for Identity and Access Management (IAM)
  • AWS PowerShell:
    • AWS Tools for Windows PowerShell
    • Setting up the AWS Tools for Windows PowerShell
    • Write-S3Object Cmdlet
  • Amazon S3:
    • Create a Bucket
    • Working with Amazon S3 Buckets
    • Managing Access Permissions to Your Amazon S3 Resources
    • AmazonS3Config Object
    • AmazonS3Client Object
    • Getting Started with Amazon Simple Storage Service (S3)


Source: https://todhilton.com/technicalwriting/upload-backup-your-files-to-amazon-s3-with-powershell/
