I have a build script that it would be very useful to configure to dump some files into Azure blob storage, so that they can be picked up by my Azure web role.
My preferred plan was to find some way of mounting the blob storage on my build server as a mapped drive and simply using Robocopy to copy the files. This would involve the least amount of friction, as I already deploy some files like this to other web servers using WebDrive.
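For illustration, if the blob store were mounted as (say) drive Z:, the build step would reduce to a single Robocopy call; the drive letter and paths below are placeholders:

    robocopy C:\build\output Z:\content /MIR /R:2 /W:5

/MIR mirrors the source tree into the destination (including deleting files that no longer exist in the source), and /R and /W cap the retries and the wait between them so a flaky connection does not hang the build.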
I found a piece of software that will allow me to do this: http://www.gladinet.com/
However, on further investigation I found that it needs port 80 to run, and freeing that up would take some hairy hacking on the server.
So, is there another piece of software I could use, or some other approach I have not considered, such as deploying the files to a local folder that automatically syncs with blob storage?
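(As an aside on that last option: a one-shot folder-to-container sync can also be scripted with Microsoft's AzCopy tool. This is a sketch assuming AzCopy v10; the account name, container, and SAS token are placeholders:

    azcopy sync "C:\build\output" "https://mystorageaccount.blob.core.windows.net/content?<SAS-token>" --recursive

)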
Update in response to @David Makogon
I am using http://waacceleratorumbraco.codeplex.com/ , which performs two-way synchronization between blob storage and the web roles. I tested this with http://cloudberrylab.com/ : I can manually deploy files to blob storage and they are correctly deployed to the web roles. I also did the reverse, updating files on the web roles, which were then synced back to blob storage, and I was subsequently able to edit/download them from blob storage.
What I'm really looking for is a way to automate this, so that there is no manual step of copying the files across. In the meantime, I am investigating PowerShell solutions.
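As a starting point, here is a minimal PowerShell sketch of the upload step, assuming the Az.Storage module is installed; the account name, key variable, container name, and local path are all placeholders:

    # Build a storage context from the account name and key (placeholders).
    $src = "C:\build\output"
    $ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $env:STORAGE_KEY

    # Upload every file under the build output folder, keeping relative
    # paths as blob names (blob names use forward slashes).
    Get-ChildItem $src -Recurse -File | ForEach-Object {
        $blobName = $_.FullName.Substring($src.Length + 1).Replace('\', '/')
        Set-AzStorageBlobContent -File $_.FullName -Container "content" -Blob $blobName -Context $ctx -Force
    }

The -Force switch overwrites any blob that already exists, which is usually what you want in a repeatable build step.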
Tim Saunders