blog.natfan.io

Rants and ravings from a techy brit.
(Now hosted on DigitalOcean!)

Importing PSTs

Posted 5 months ago.

Today I had the pleasure of importing Outlook PSTs into Exchange Online. I'd never had to do it before, so I decided to look at the documentation. 🦆 me, there's a lot of it! "Okay", I think to myself, "I'll be able to handle this on my own and check up with the docs when I run into issues." Boy, how wrong I was...

Step 0: Permissions

The first issue I ran into was permissions. Apparently, my administrative account didn't have the "Mailbox Import Export" role added to it, so my boss and I spent 10 minutes faffing around with that. When we found the right checkbox and the permission finally kicked in, I figured I'd be good to go.

Step I: Naming (the only easy bit)

To start with, you need to name the import. "That's fine, I'll just call it <initial>-<account-to-migrate-username>," I murmur, and plug that in. Wow, it worked. That was easy.

Step II: Import the Data (there is no God)

First, I need to select whether I want to "Upload [my] data" or "Ship hard drives to one of our physical locations". For that second one: wat. I guess I could see it being useful in cases of hundreds of GBs of data, but I just want to test this with a 27MB PST, so I'm going to go with the first one, please.

After this it asks me to read the guide that I already had open (because my permissions were lacking and it wasn't obvious which one I needed to use). Then it gives me a "SAS URL" to upload to. It also requires that I download the "azcopy" tool. WTF? Why? I just want to upload a PST, gorram it! Fine, I download and install the azcopy tool and run it. It would have been nice if it were a PowerShell module, since it's literally just a command-line tool, but I understand why a standalone installer might be easier. For reference, this is the (slightly redacted) command that I entered:

azcopy /Source:\\file.share.local.domain\exchange-archive$\PSTs\PSTThatIWantToMigratePlease.pst /Dest:"https://uuid.blob.core.windows.net/ingestiondata?sv=2015-04-05&sr=c&si=IngestionSasForAzCopy201809040932135982&sig=SIGNATURE&se=2020-01-01T12%3A21%3A52Z" /V:\\file.share.local.domain\exchange-archive$\PSTs\PSTThatIWantToMigratePlease.log /Y

Oh, what's that? I can't hear you over the following error:

[2019-12-02 12:25:36][ERROR] Error parsing destination location
"https://uuid.blob.core.windows.net/ingestiondata?sv=..."
Transfer from a file to a directory path is not supported.
Please update /Dest with an absolute path to a file.
For more details, please type "AzCopy /?:Dest" or use verbose option /V.

Does that make any sense to you? Because it doesn't to me... "Transfer from a file to a directory path is not supported"? So I'm not allowed to transfer files to directories? Isn't that the entire point of this tool? Well, one quick conflab with a co-worker later (cheers gman!) and we figured out it was because I'd pointed /Source at a specific PST when azcopy wanted a directory. So one quick

mkdir PSTThatIWantToMigratePlease
mv PSTs\PSTThatIWantToMigratePlease.pst PSTThatIWantToMigratePlease

later, and I could run the same command as above with the Source path modified to remove the .pst file extension and I was good to go for the uploading side of things.
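For the record, the working version of the command ended up looking roughly like this. Same redacted values as before, so it's a sketch rather than something copy-paste ready; the only change is that /Source now points at the directory:

```shell
azcopy /Source:\\file.share.local.domain\exchange-archive$\PSTs\PSTThatIWantToMigratePlease /Dest:"https://uuid.blob.core.windows.net/ingestiondata?sv=2015-04-05&sr=c&si=IngestionSasForAzCopy201809040932135982&sig=SIGNATURE&se=2020-01-01T12%3A21%3A52Z" /V:\\file.share.local.domain\exchange-archive$\PSTs\PSTThatIWantToMigratePlease.log /Y
```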

Step III: Check your work (am I still in high school?)

So the total output of the command I ran in the last step wasn't super helpful in terms of telling me what specifically worked, even with the /V flag on...

Finished 1 of total 1 file(s).
[2019-12-02 14:17:04] Transfer summary:
-----------------
Total files transferred: 1
Transfer successfully:   1
Transfer skipped:        0
Transfer failed:         0
Elapsed time:            00.00:00:01

Yeah, that don't help me much. Especially with the short time-frame, I figured something had gone wrong. So I followed the next steps in the guide and downloaded the Azure Storage Explorer, or as I'd learn to start calling it, the "Azure StoRAGE Explorer". First off, the entire app is written in Node.js, which might seem like a strange thing to bitch about, and it is, and maybe Node.js is the best tool for the job in this case, but it still sort of annoys me when everything I see is written in what is basically a copy of Chrome now. I like my RAM, thanks. (VS Code might be the only exception here...) Actually, the Storage Explorer wasn't too bad, except if you try to delete something it fails and says it couldn't delete the thing you wanted to delete because it doesn't exist. Except it clearly does exist, so why is it failing?!

Anywho, I installed the app, right-clicked on 'Storage Accounts', plugged in my SAS URL and I was in. "Hey, this isn't so bad", I thought to myself. I found the PST and confirmed that its size in Azure Blob storage matched the local copy; I then decided to try to view the properties of the file. "FORBIDDEN" is the error message I received for that, though, so I guess it wasn't for me. Oh well.

Step IV: CSV File Mapping (why the hell isn't this just a web-based GUI?)

Awesome so the data has been uploaded. What's next on the checklist?

4. Prepare the mapping file.

Mapping file? Why can't I just say in the web-based GUI where I want the PST to go? Fine, whatever. Let's take a look at the example mapping file. (I've removed any columns that are never filled, but trust me, the real file is worse than what I've shown below.)

Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder
Exchange,,annb.pst,annb@contoso.com,FALSE,/
Exchange,,annb_archive.pst,annb@contoso.com,TRUE,
Exchange,,donh.pst,donh@contoso.com,FALSE,/
Exchange,,donh_archive.pst,donh@contoso.com,TRUE,
Exchange,PSTFiles,pilarp.pst,pilarp@contoso.com,FALSE,/
Exchange,PSTFiles,pilarp_archive.pst,pilarp@contoso.com,TRUE,/ImportedPst
Exchange,PSTFiles,tonyk.pst,tonyk@contoso.com,FALSE,
Exchange,PSTFiles,tonyk_archive.pst,tonyk@contoso.com,TRUE,/ImportedPst
Exchange,PSTFiles,zrinkam.pst,zrinkam@contoso.com,FALSE,
Exchange,PSTFiles,zrinkam_archive.pst,zrinkam@contoso.com,TRUE,/ImportedPst

What in the everloving 🦆? That might be the single worst thing I've ever seen. Here it is in a table for your viewing pleasure:

Workload  FilePath  Name                 Mailbox              IsArchive  TargetRootFolder
Exchange            annb.pst             annb@contoso.com     FALSE      /
Exchange            annb_archive.pst     annb@contoso.com     TRUE
Exchange            donh.pst             donh@contoso.com     FALSE      /
Exchange            donh_archive.pst     donh@contoso.com     TRUE
Exchange  PSTFiles  pilarp.pst           pilarp@contoso.com   FALSE      /
Exchange  PSTFiles  pilarp_archive.pst   pilarp@contoso.com   TRUE       /ImportedPst
Exchange  PSTFiles  tonyk.pst            tonyk@contoso.com    FALSE
Exchange  PSTFiles  tonyk_archive.pst    tonyk@contoso.com    TRUE       /ImportedPst
Exchange  PSTFiles  zrinkam.pst          zrinkam@contoso.com  FALSE
Exchange  PSTFiles  zrinkam_archive.pst  zrinkam@contoso.com  TRUE       /ImportedPst

So I have to make a god-damn CSV in order to tell Office365 where to put the PST that I've made? Why is this SO DIFFICULT?!

I removed all the example stuff and just added the following:

Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder
Exchange,,username.pst,GUID,FALSE,/Imported
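If you've got more than a couple of PSTs to map, a quick loop beats hand-editing the CSV. Here's a minimal sketch; the usernames, domain, and target folder below are all hypothetical, so swap in your own. It assumes each PST is named <username>.pst and sits in the root of the ingestion container (hence the empty FilePath column):

```shell
# Generate a PST import mapping file instead of hand-editing CSV.
# "alice" and "bob", example.com, and /Imported are placeholders.
printf '%s\n' 'Workload,FilePath,Name,Mailbox,IsArchive,TargetRootFolder' > mapping.csv
for user in alice bob; do
  printf '%s\n' "Exchange,,${user}.pst,${user}@example.com,FALSE,/Imported" >> mapping.csv
done
cat mapping.csv
```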

Then, I ticked the boxes to say that "I'm done uploading my files" and "I have access to the mapping file", then pressed Next and uploaded the mapping file. I also had to explicitly "Validate" the mapping file, which it should just do on upload to be honest. I then "Saved" the job and had to wait for it to perform analysis...

Step V: Microsoft Valet Services (I guess analysis is a good thing?)

Once that's done you need to wait for the system to validate the files and actually run the import. The validation itself doesn't take that long, to be fair, but I feel it should all happen while I'm still in the UI rather than being shoved onto a queue. Anywho, once that's done it lets you press a "Start" button to process the job, instead of just running the job by itself. For some reason Microsoft's engineers thought it'd be a good idea to let an import job sit forgotten for months because you deleted that generic-looking status update email. Ugh.

Once you actually click the button to start the process, it's slow. I mean really slow. It took one of my jobs about two and a half hours to import 17.6MB of data. WTF? Why it's so slow I'll never understand.

Closing Thoughts

Basically, I hated most of this process. Actually, that's misleading. I disliked it, but admired it. From a technical standpoint, it's genius. Uploading PSTs to blob storage, then parsing CSV files to tell the system where to go: it's incredible. However, the actual implementation sucks. Why the whole process can't just be done in the backend is beyond me... Oh well, that's Microsoft for you!