
Integrating AWS S3 and Windows PowerShell to Download and Rename Files


Nowadays, many companies are migrating their data to cloud storage rather than keeping it on physical servers. Utilizing the cloud has become an increasingly popular solution over the past few years. The advantages of cloud storage over physical storage include cost-effectiveness, always-on availability, increased security, increased mobility, and more.

One popular cloud platform is AWS (Amazon Web Services), provided by Amazon. AWS offers multiple cloud solutions for the varying needs of businesses. Its cloud storage service, S3, “provides object storage through a web service interface. Amazon S3 uses the same scalable storage infrastructure that Amazon.com uses to run its global e-commerce network.”

Many organizations connect their existing information systems to AWS S3 for storing data, archiving data, or even further integrating with other information systems (e.g., ERP data -> AWS S3 -> OneStream).

Windows PowerShell is a Windows command-line shell built on its own scripting language. PowerShell is useful for a variety of tasks, including object manipulation, which we will explore further below.

Importing AWS Tools for PowerShell

  1. Open PowerShell by right-clicking it and choosing “Run as Administrator”
  2. Verify that script execution is permitted by running the following command:

Get-ExecutionPolicy

  3. If script execution is Restricted, run the following (RemoteSigned is a less permissive alternative if your security policy requires it):

Set-ExecutionPolicy -ExecutionPolicy Unrestricted

  4. Import the AWS Tools for PowerShell with the following command:

Import-Module AWSPowerShell
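To confirm the module imported correctly, you can check its version. This is an optional sanity check; `Get-AWSPowerShellVersion` is part of the AWSPowerShell module imported above:

```powershell
# Prints the installed AWS Tools for PowerShell version information;
# if this errors, the Import-Module step did not succeed
Get-AWSPowerShellVersion
```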

Connecting to AWS S3 using PowerShell

  1. Run the following command with your AccessKey and SecretKey to connect to your S3 storage. You can name the profile whatever you like.

Set-AWSCredential -AccessKey AKIA0123456787EXAMPLE -SecretKey wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY -StoreAs MyNewProfile

  2. From then on, you no longer have to specify your AccessKey and SecretKey each time; you can simply run:

Set-AWSCredential -ProfileName MyNewProfile
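If you want to verify that the profile was saved, you can list the stored credential profiles. A quick check, assuming the AWSPowerShell module is loaded:

```powershell
# Lists all stored credential profiles; MyNewProfile should appear in the output
Get-AWSCredential -ListProfileDetail
```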

Downloading and Renaming Files from AWS S3 using PowerShell

  1. Define the bucket you would like to download the files from:

$bucket = 'exampleBucket'

  2. Define the folder within the bucket you would like to download the files from. In my case, it is the Subfolder1 folder inside the Folder1 folder. The trailing slash selects all the files within that folder:

$objects = Get-S3Object -BucketName $bucket -KeyPrefix 'Folder1/Subfolder1/'

  3. Define the local path where you would like the files downloaded:

$localPath = 'C:\Users\omar.abuzaher\Desktop\Blog'

  4. Run a loop that pulls the files and file names. Use the Substring function so that only the file name, as it appears in S3, is kept. The start position is the length of the KeyPrefix: 'Folder1/Subfolder1/' is 19 characters long, so the file name begins at index 19.

foreach ($object in $objects) {

$fileName = $object.Key.Substring(19)

  5. Run a nested loop to change the file names. I prepended the static value 'NEW' to the original file name. You can add anything to the file name here; it can even be dynamic (a date, a substring of the original file name, etc.).

foreach($file in $fileName) {

$localFileName = ('NEW' + $file)

$localFilePath = Join-Path $localPath $localFileName

Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath }

}

The completed code looks like this:

$bucket = 'exampleBucket'
$objects = Get-S3Object -BucketName $bucket -KeyPrefix 'Folder1/Subfolder1/'
$localPath = 'C:\Users\omar.abuzaher\Desktop\Blog'
foreach ($object in $objects) {
    $fileName = $object.Key.Substring(19)
    foreach ($file in $fileName) {
        $localFileName = ('NEW' + $file)
        $localFilePath = Join-Path $localPath $localFileName
        Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath
    }
}
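One refinement worth considering: hard-coding the substring start position (19) breaks silently if the folder path ever changes. As a sketch, assuming the same example bucket and prefix as above, the offset can be derived from the prefix itself:

```powershell
# Sketch: same download-and-rename logic, but the substring offset is
# computed from the prefix length rather than hard-coded as 19
$bucket = 'exampleBucket'
$prefix = 'Folder1/Subfolder1/'
$localPath = 'C:\Users\omar.abuzaher\Desktop\Blog'
$objects = Get-S3Object -BucketName $bucket -KeyPrefix $prefix

foreach ($object in $objects) {
    # Strip the prefix to recover just the file name as stored in S3
    $fileName = $object.Key.Substring($prefix.Length)
    $localFilePath = Join-Path $localPath ('NEW' + $fileName)
    Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath
}
```

This version also drops the inner foreach, since $fileName is a single string and the nested loop only ever runs once per object.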

 

About the Author

Omar is a OneStream XF consultant with 2+ years of software implementation experience. A strong desire to solve business challenges with tech-based solutions allows Omar to blend finance with technology to deliver excellence.
