
We have Azure Blob Storage Accounts with 100s of containers. The file structure is something like below:

container_01
 |--somemedia.jpg
 |--anothermedia.jpg

container_02
 |--secondcontainersmedia.jpg
 |--andSoOn
 |--AndSoOnAndSoOn

My client wants to download all of the containers to local storage so that, if necessary, they can be re-uploaded to Azure. After doing some research I found this blog post. Updating the script from there to suit my needs (just switching from AzureRM to Az and plugging in my own connection string and local path), I came up with the following script for downloading the files.

$destination_path = 'C:\Storage Dump Test'
$connection_string = '[Insert Connection String]'

$storage_account = New-AzStorageContext -ConnectionString $connection_string

$containers = Get-AzStorageContainer -Context $storage_account

Write-Host 'Starting Storage Dump...'

foreach ($container in $containers)
{
    Write-Host -NoNewline "Processing: $($container.Name)..."

    $container_path = $destination_path + '\' + $container.Name

    if(!(Test-Path -Path $container_path ))
    {
        New-Item -ItemType Directory -Path $container_path | Out-Null   # suppress New-Item's output so the progress line stays tidy
    }

    $blobs = Get-AzStorageBlob -Container $container.Name -Context $storage_account

    Write-Host -NoNewline ' Downloading files...'    

    foreach ($blob in $blobs)
    {        
        $fileNameCheck = $container_path + '\' + $blob.Name
        
        if(!(Test-Path $fileNameCheck ))
        {
            Get-AzStorageBlobContent `
            -Container $container.Name -Blob $blob.Name -Destination $container_path `
            -Context $storage_account
        }            
    }  

    Write-Host ' Done.'
}

Write-Host 'Download complete.'

So now I have a directory on my local storage with hundreds of folders containing media items. I need to create a PS script (or find some other way) to do the opposite: take all the folders in that directory, create containers using the names of the folders, and upload the items within each folder to the appropriate container.

How should I start going about this?

1 Answer


You'd have a lot more success, and far more quickly, using azcopy instead of the Azure cmdlets. To copy:

azcopy copy '<local-file-path>' 'https://<storage-account-name>.<blob|dfs>.core.windows.net/<container-name>/<blob-name>'
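For example, to upload one of the dumped folders back into a container of the same name (hypothetical account name and SAS placeholder; a SAS with write permission or an azcopy login session is assumed):

azcopy copy 'C:\Storage Dump Test\container_01\*' 'https://mystorageaccount.blob.core.windows.net/container_01?<SAS>' --recursive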

It can also create containers:

azcopy make 'https://mystorageaccount.blob.core.windows.net/mycontainer'

azcopy can download an entire container without you having to specify each file. Use --recursive

See: https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10

  • Is there a way, using azcopy, to do this for hundreds of folders and their contents without having to do it manually? Commented Oct 23, 2020 at 16:01
  • Absolutely! If you give azcopy the name of a container, it will download all of its contents too if you use the --recursive parameter.
    – x0n
    Commented Oct 23, 2020 at 16:43
  • Read here: learn.microsoft.com/en-us/azure/storage/common/…
    – x0n
    Commented Oct 23, 2020 at 16:44
  • I looked through the documentation and maybe I'm being dense but I am not seeing a way to upload multiple folders as containers. If I have local folders 0000 - 0999, each with contents, how do I upload those to Azure preserving that same structure (ie local folder 0000 becomes container 0000, etc). Commented Oct 23, 2020 at 17:43
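A rough sketch of how the two azcopy commands above could be combined for hundreds of folders (untested; assumes azcopy is on PATH, the local folder names are already valid container names, and $sas holds a SAS token with create and write permissions):

$destination_path = 'C:\Storage Dump Test'
$account_url = 'https://mystorageaccount.blob.core.windows.net'
$sas = '<SAS token>'

foreach ($folder in Get-ChildItem -Path $destination_path -Directory)
{
    # create a container named after the local folder, then upload its contents
    azcopy make "$account_url/$($folder.Name)?$sas"
    azcopy copy "$($folder.FullName)\*" "$account_url/$($folder.Name)?$sas" --recursive
}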
