
Office 365 – How To Import PSTs From Azure Storage Via PowerShell

It’s been over a year since Microsoft started previewing the “PST Import Service”, which allows administrators to import PSTs into Exchange Online mailboxes. The blog post I wrote about the service back in May 2015 (“Office 365 – Using the New PST Import Service“) has been incredibly popular, which tells me there’s significant interest in this service.
For the most part, the PST Import Service works well, but it has a few restrictions; specifically, it offers minimal flexibility in its import options. While the “New-MailboxImportRequest” cmdlet is used behind the scenes, the PST Import Service doesn’t expose many of that cmdlet’s parameters, such as the ability to exclude certain folders.
Below is a more flexible way to handle PST imports using PowerShell…

PST Import Service

The service provided by Microsoft is probably where you should start; it will handle most scenarios. The process is well documented and not difficult to execute. Additionally, one advantage of this approach is that Microsoft provides the Azure Blob Storage you upload your PSTs to. However, as mentioned above, all you can do is import the PST and provide a destination; you can’t use all the parameters natively available to the “New-MailboxImportRequest” cmdlet.
Note: Something to be aware of, which is somewhat hidden in the documentation, is that using the PST Import Service puts the mailbox on “Retention Hold” indefinitely. The idea is that if you have Retention Policies that delete or archive data older than X days, you don’t want those policies processing the newly imported data. So if you choose to use the PST Import Service, make sure to take the mailbox off Retention Hold when appropriate.
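If you do go the PST Import Service route, the hold can be checked and cleared afterward with a couple of Exchange Online commands; a quick sketch (the mailbox address below is just a placeholder):

```powershell
# Find mailboxes currently on Retention Hold
Get-Mailbox -ResultSize Unlimited |
    Where-Object { $_.RetentionHoldEnabled } |
    Select-Object DisplayName, RetentionHoldEnabled

# Take a mailbox off Retention Hold once the imported data has been processed
Set-Mailbox -Identity "test.user@iwitl.com" -RetentionHoldEnabled $false
```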

Advantages of Using PowerShell

You may find that you don’t want to import everything in a PST. Calendar entries in particular can be a problem; they can create a storm of calendar alarms that no “Dismiss All” seems to fix. The “BadItemLimit” and “ConflictResolutionOption” parameters may also be of use; note that the PST Import Service defaults “BadItemLimit” to “Unlimited” instead of the cmdlet’s normal default of “0”.
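As a rough sketch of what those parameters look like in use (the mailbox address, PST URI, and $token value here are placeholders; provisioning the storage and generating the token are covered later in this post):

```powershell
# Hypothetical import that skips the Calendar folder, tolerates 25 bad items,
# and uses ConflictResolutionOption to control which copy wins when an item
# already exists in the target mailbox
New-MailboxImportRequest -Mailbox "test.user@iwitl.com" `
    -ExcludeFolders "#Calendar#/*" `
    -BadItemLimit 25 `
    -ConflictResolutionOption KeepSourceItem `
    -AzureBlobStorageAccountUri "https://iwitl.blob.core.windows.net/pstupload/test.pst" `
    -AzureSharedAccessSignatureToken $token
```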

Prerequisites

While Microsoft provides Azure Blob Storage with the PST Import Service, it appears that the SAS token they provide only allows write and list access to the folder. You can upload PSTs to the folder using the token, but you cannot download or read them. In order to use PowerShell for our imports, we’ll need to provision our own Azure Blob Storage with a SAS token that can read the uploaded files. There is, of course, a cost for this storage, but it’s minimal; a month of 1 TB of storage is less than $30 USD, and you won’t even need the data for that long. Check out the pricing matrix or pricing calculator to estimate your costs. You can also sign up for a free trial that gives you $200 in credits, which is about 8 TB of data.

Setting Up Azure Blob Storage

Once you have an Azure account, you’ll want to install the Azure PowerShell module. You can then provision a storage account and a storage container for the PSTs.
The commands below create a storage account called “iwitl” in the “West US” region using an Azure subscription called “Visual Studio Professional with MSDN”; you will need to select your own unique name and enter your own subscription name (“Get-AzureSubscription | Select SubscriptionName”). Also, the SAS token is set to expire 2016-12-31; you will want to adjust this accordingly.

# Sign in and select the subscription that will host the storage account
Add-AzureAccount
$sta = "iwitl"        # storage account name (must be globally unique)
$stc = "pstupload"    # container name
$loc = "West US"      # Azure region
$sub = "Visual Studio Professional with MSDN"
$exp = "2016-12-31"   # SAS token expiry date
Select-AzureSubscription -SubscriptionName $sub

# Create the storage account and make it the current default
New-AzureStorageAccount -StorageAccountName $sta -Location $loc -Type "Standard_LRS"
Set-AzureSubscription -CurrentStorageAccountName $sta -SubscriptionName $sub

# Create a private container and generate a read/write/list SAS token for it
New-AzureStorageContainer -Name $stc -Permission Off
$token = New-AzureStorageContainerSASToken -Name $stc -Permission rwl -ExpiryTime $exp

# Output the values needed later for the upload and import steps
Write-Host "`nStorage Account URI`n-------------------`nhttps://$sta.blob.core.windows.net/$stc`n`n"`
"SAS Token`n---------`n$token`n`n"`
"SAS URI`n-------`nhttps://$sta.blob.core.windows.net/$stc$token`n"

After provisioning, the script outputs the “Storage Account URI”, “SAS Token”, and “SAS URI” values; save these, as they’ll be needed for the upload and import steps.

You now have your storage provisioned and can test connectivity. Download the “Azure Storage Explorer” utility; you should be able to connect successfully using the “SAS URI” value from the PowerShell output.
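If you’d rather verify connectivity from PowerShell instead, one approach (reusing the $sta, $stc, and $token variables from the provisioning script above) is to build a storage context from the SAS token and list the container:

```powershell
# Build a storage context from the SAS token generated earlier
$ctx = New-AzureStorageContext -StorageAccountName $sta -SasToken $token

# A successful listing confirms the token grants at least list access
Get-AzureStorageBlob -Container $stc -Context $ctx | Select-Object Name, Length
```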

Uploading PSTs into Azure Blob Storage

There are two options for getting the files into Azure Blob Storage. You can use the “AzCopy” tool to perform the same command-line upload process the PST Import Service uses. Alternatively, if you prefer a GUI, you can use the recently installed “Azure Storage Explorer” utility to upload the PSTs.
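For the command-line route, the AzCopy syntax looks roughly like this (the source folder, log path, and truncated SAS token are placeholders for your own values; run it from the folder where AzCopy is installed):

```powershell
# Upload every PST in C:\PSTs to the container created earlier
AzCopy.exe /Source:"C:\PSTs" `
    /Dest:"https://iwitl.blob.core.windows.net/pstupload" `
    /DestSAS:"?sv=..." `
    /V:"C:\PSTs\upload.log" /Y
```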

How to Import PSTs Using PowerShell

Now that the PSTs are uploaded and you have your SAS token, you can use the “New-MailboxImportRequest” cmdlet in Exchange Online to kick off the import.
The following command imports the specified PST into a folder called “Imported PSTs”, excludes the Calendar from the import and allows for up to 25 bad items:

New-MailboxImportRequest -Name "mailbox_import_testuser" `
    -Mailbox "test.user@iwitl.com" `
    -TargetRootFolder "Imported PSTs" `
    -ExcludeFolders "#Calendar#/*" `
    -BadItemLimit 25 `
    -AzureBlobStorageAccountUri "https://iwitl.blob.core.windows.net/pstupload/test.pst" `
    -AzureSharedAccessSignatureToken "?sv=2015-04-05&sr=c&sig=Lo6%2B6IeNTSn%2Bbro5Xyytmu65PB0%2FyvGioHrKCFfzfLo%3D&se=2016-12-31T05%3A00%3A00Z&sp=rwl"

The “AzureBlobStorageAccountUri” value is the “Storage Account URI” from the previous PowerShell output with the PST file name appended, and the “AzureSharedAccessSignatureToken” value is the “SAS Token” from that same output.

Once the import is started, you can use the “Get-MailboxImportRequest” and “Get-MailboxImportRequestStatistics” cmdlets to monitor its progress.
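For example, something along these lines gives a quick view of progress:

```powershell
# List all import requests and their overall status
Get-MailboxImportRequest | Select-Object Name, Status

# Drill into percent complete and item counts for in-flight imports
Get-MailboxImportRequest -Status InProgress |
    Get-MailboxImportRequestStatistics |
    Select-Object Name, StatusDetail, PercentComplete, ItemsTransferred
```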
Once the process completes, don’t forget to remove your PSTs from the Azure storage so you don’t continue paying for them.
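Since the provisioning script set the current storage account via “Set-AzureSubscription”, cleanup can be as simple as the following (the blob and container names match the earlier examples):

```powershell
# Remove an individual PST from the container
Remove-AzureStorageBlob -Container "pstupload" -Blob "test.pst"

# Or delete the entire container once every import has completed
Remove-AzureStorageContainer -Name "pstupload"
```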
Tip: If you have Retention Policies in place, you may want to use the “Set-Mailbox -RetentionHoldEnabled $True” command to pause those policies before importing the PST. You will need to use “Set-Mailbox -RetentionHoldEnabled $False” in the future to apply the policies again.

Summary

The PST Import Service is a convenient way to import PSTs into Exchange Online mailboxes. However, this service has some limitations and may not meet the needs of every import project. Azure Blob Storage and PowerShell can be used to import PSTs in a similar manner but with the full flexibility of all the cmdlet parameters available.
Did you find this article helpful?
Leave a comment below or follow me on Twitter (@JoePalarchio) for additional posts and information on Office 365.
Looking to do some more reading on Office 365?
Catch up on my past articles here: Joe Palarchio.

Thoughts on “Office 365 – How To Import PSTs From Azure Storage Via PowerShell”

  1. Is the PowerShell cmdlet New-MailboxImportRequest generally available or is it only available in preview?

  2. It should be generally available; however, you likely need to add the “Mailbox Import Export” role before you see it.

  3. Hi Joe,
    I have verified that my -AzureBlobStorageAccountUri is correct; however, I am receiving a (404) Not Found error. I have verified the path is correct in Azure Storage Explorer.
    Unable to open PST file ‘https://#############.blob.core.windows.net/ingestiondata/test01/test3@contoso.com/test3.pst’. Error details: The remote server returned an error: (404) Not Found.
    Any ideas on what I can check?
    Thanks

  4. Hi Joe, when mapping the PST via CSV, is there a way to import the contents of the PST directly to the inbox, as opposed to plopping a folder into the tree? Basically, I’m wondering if a cloud-born mailbox (in other words, an empty mailbox object in EXO) can be targeted with this method or the standard (preferably) MSFT PST import service? Having an issue with a couple of mailboxes and this is looking like my only route. Thanks!

  5. I have noticed that the logging info is very limited. We can see that an item was skipped or is too large, but we cannot see what the item is (i.e. its details). Is there a workaround for this?
