Is it possible to mount blob storage on my local computer for deployment purposes?

I have a build script that it would be very useful to set up to dump some files into Azure blob storage so that they can be picked up by my Azure web role.

My preferred plan was to find some way of mounting blob storage on my build server as a mapped drive and simply using Robocopy to copy the files over. This would involve the least amount of friction, as I already deploy some files like this to other web servers using WebDrive.
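Roughly what I had in mind is something like the following (the paths and the Z: drive letter are just placeholders for wherever the blob store ends up being mounted):

    robocopy "C:\build\output" "Z:\deploy" /MIR /R:2 /W:5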

I found a piece of software that will allow me to do this: http://www.gladinet.com/

However, on further investigation I found that it needed port 80 to run without some hairy hacking on the server.

So, is there another piece of software I could use, or perhaps some other way I haven't considered, such as deploying the files to a local folder that automatically syncs with blob storage?

Update in response to @David Makogon

I am using http://waacceleratorumbraco.codeplex.com/ , which performs two-way synchronization between blob storage and the web roles. I have tested this with http://cloudberrylab.com/ and I can deploy files manually to the blob and they are correctly deployed to the web roles. I have also done the reverse and updated files in the web roles, which were then synced back to the blob, and I subsequently edited/downloaded them from blob storage.

What I'm really looking for is a way to automate the local side of things so that I don't have a manual step to copy a bunch of files. In the meantime I am investigating PowerShell solutions.
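For example, something along these lines is what I'm experimenting with. This is only a sketch: it assumes storage cmdlets along the lines of the Windows Azure PowerShell module (New-AzureStorageContext / Set-AzureStorageBlobContent), and the account name, key, folder and container names are placeholders:

    # Sketch only: assumes the Windows Azure PowerShell storage cmdlets are installed;
    # account name, key and container are placeholders.
    $ctx = New-AzureStorageContext -StorageAccountName "mybuilds" -StorageAccountKey "<storage-key>"

    # Push every file under the build output folder into the 'deploy' container,
    # using the relative path as the blob name and overwriting existing blobs.
    $out = "C:\build\output"
    Get-ChildItem $out -Recurse -File | ForEach-Object {
        $blobName = $_.FullName.Substring($out.Length + 1) -replace '\\', '/'
        Set-AzureStorageBlobContent -File $_.FullName -Container "deploy" -Blob $blobName -Context $ctx -Force
    }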

3 answers

I know this is an old post, but in case anyone else comes across it... the answer is now yes. I have been working on a CodePlex project to do just that (all source code is available).

http://azuredrive.codeplex.com/



If you are comfortable using PowerShell in your build process, you could use the Cerebrata cmdlets to upload the files. If that doesn't work for you, you could write something custom (but that sounds a bit more involved).
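I can't reproduce the exact Cerebrata syntax from memory, but as a rough illustration, the upload step can be wrapped in a script and invoked from the build like this (the script name and its parameter are hypothetical):

    # Hypothetical wrapper: Upload-ToBlob.ps1 would call whichever upload cmdlet you choose.
    powershell.exe -NoProfile -ExecutionPolicy Bypass -File ".\Upload-ToBlob.ps1" -SourceDir "C:\build\output"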


You cannot mount a Cloud Drive from anywhere other than a Windows Azure compute instance (so not from a local build machine, for example).

Having said that: even if you could mount a Cloud Drive from your build machine, your compute instances would also need access to it, and there can only be one writer. If your compute instances only need read-only access, you would need to create a snapshot after uploading new files.

This really doesn't sound like a good idea, though. As knightpfhor suggested, the Cerebrata cmdlets provide this functionality (take a look at Import-File). They let you push individual files into your blobs. You could optimize things by pushing a single ZIP file to a blob, and then using a technique similar to the one Nate Totten describes in his multi-tenant sample to detect new zip files and expand them into local storage. Nate describes this on his blog.
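As a very rough sketch of that zip-based approach (the paths and names are invented, Compress-Archive needs PowerShell 5+, and the same upload could equally be done with the Cerebrata cmdlets or the REST API):

    # Bundle the build output into one timestamped zip and push it as a single blob;
    # the web roles then only need to watch for new *.zip blobs and unpack them locally.
    $zip = "C:\build\site-$(Get-Date -Format yyyyMMddHHmmss).zip"
    Compress-Archive -Path "C:\build\output\*" -DestinationPath $zip

    $ctx = New-AzureStorageContext -StorageAccountName "mybuilds" -StorageAccountKey "<storage-key>"
    Set-AzureStorageBlobContent -File $zip -Container "deploy" -Blob (Split-Path $zip -Leaf) -Context $ctx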

Oh, and if you don't want to use the Cerebrata cmdlets, you can upload blobs directly using the Windows Azure Storage REST API (although the cmdlets are really easy to use and integrate nicely with PowerShell).
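If you go the REST route, the simplest sketch is a raw Put Blob call. Here I'm assuming a pre-generated SAS token (so the request doesn't have to compute the SharedKey signature itself) and placeholder account, container and file names:

    # Put Blob via the storage REST API; the SAS token, account and paths are placeholders.
    $sas  = "?sv=2015-04-05&sr=c&sp=w&se=2016-01-01&sig=<generated-signature>"
    $file = "C:\build\site.zip"
    $uri  = "https://mybuilds.blob.core.windows.net/deploy/site.zip$sas"

    Invoke-RestMethod -Method Put -Uri $uri -InFile $file -Headers @{
        "x-ms-blob-type" = "BlockBlob"   # required by Put Blob
        "x-ms-version"   = "2015-04-05"
    }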

