We have Azure Blob Storage Accounts with 100s of containers. The file structure is something like below:
```
container_01
|-- somemedia.jpg
|-- anothermedia.jpg

container_02
|-- secondcontainersmedia.jpg
|-- andSoOn
|-- AndSoOnAndSoOn
```
My client wants to download all of the containers to local storage so that, if necessary, they can be re-uploaded to Azure. After doing some research I found this blog post. Updating the script from there to suit my needs (switching from AzureRM to Az and substituting my own connection string and local path), I came up with the following script for downloading the files.
```powershell
$destination_path = 'C:\Storage Dump Test'
$connection_string = '[Insert Connection String]'
$storage_account = New-AzStorageContext -ConnectionString $connection_string

$containers = Get-AzStorageContainer -Context $storage_account

Write-Host 'Starting Storage Dump...'

foreach ($container in $containers)
{
    Write-Host -NoNewline "Processing: $($container.Name)..."

    $container_path = Join-Path $destination_path $container.Name
    if (!(Test-Path -Path $container_path))
    {
        New-Item -ItemType Directory -Path $container_path | Out-Null
    }

    $blobs = Get-AzStorageBlob -Container $container.Name -Context $storage_account

    Write-Host -NoNewline ' Downloading files...'

    foreach ($blob in $blobs)
    {
        $fileNameCheck = Join-Path $container_path $blob.Name
        if (!(Test-Path $fileNameCheck))
        {
            Get-AzStorageBlobContent `
                -Container $container.Name -Blob $blob.Name -Destination $container_path `
                -Context $storage_account
        }
    }

    Write-Host ' Done.'
}

Write-Host 'Download complete.'
```
So now I have a directory on my local storage with hundreds of folders containing media items. I need to create a PS script (or find some other way) to do the opposite: take all the folders in that directory, create containers named after the folders, and upload the items within each folder to the matching container.
How should I start going about this?
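For reference, based on mirroring the download script above, I imagine the reverse would look roughly like the sketch below. This is untested and assumes the Az module cmdlets `New-AzStorageContainer` and `Set-AzStorageBlobContent` handle container creation and blob upload; `$source_path` is just a placeholder for the dump directory.

```powershell
$source_path = 'C:\Storage Dump Test'
$connection_string = '[Insert Connection String]'
$storage_account = New-AzStorageContext -ConnectionString $connection_string

Write-Host 'Starting upload...'

foreach ($folder in (Get-ChildItem -Path $source_path -Directory))
{
    # Container names must be lowercase (3-63 chars, letters/digits/hyphens),
    # so normalize the folder name before using it as a container name
    $container_name = $folder.Name.ToLower()
    Write-Host "Processing: $container_name..."

    # Create the container only if it doesn't already exist
    if (!(Get-AzStorageContainer -Name $container_name -Context $storage_account `
            -ErrorAction SilentlyContinue))
    {
        New-AzStorageContainer -Name $container_name -Context $storage_account
    }

    # Upload every file in the folder to the container,
    # overwriting any blob that already exists with the same name
    foreach ($file in (Get-ChildItem -Path $folder.FullName -File))
    {
        Set-AzStorageBlobContent -File $file.FullName -Container $container_name `
            -Blob $file.Name -Context $storage_account -Force
    }
}

Write-Host 'Upload complete.'
```

Is this roughly the right approach, or is there a better way (e.g. AzCopy)?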