Windows Azure PowerShell: Copying a File to a Virtual Machine

I am trying to use Windows Azure PowerShell to copy a zip file to a virtual machine. I managed to connect to the VM by following the documentation.

But I can't find a tutorial for uploading / copying / transferring a zip file to a VM disk, say, onto drive C.

Can someone point me to a tutorial, or share any idea of how I can copy it?

+7
powershell azure azure-virtual-machine
5 answers

You cannot use PowerShell to copy a file directly to the operating system disk of a virtual machine (or to any of the disks attached to it). There is no API for communicating directly with virtual machines; you would need to build your own service for that.

You can use PowerShell to upload a file to blob storage with Set-AzureStorageBlobContent . At that point, you can notify your application running on the virtual machine (perhaps with a queue message?) that there is a file waiting to be processed. The processing can be as simple as copying the file down to a local VM disk.
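A minimal sketch of both sides, assuming the classic Azure PowerShell storage cmdlets; the storage account, container, and file names below are made up for illustration:

 # On your workstation: upload the zip to blob storage
 # (account, key, container, and file names are hypothetical)
 $context = New-AzureStorageContext -StorageAccountName "mystorageacct" `
                                    -StorageAccountKey $storageAccountKey
 Set-AzureStorageBlobContent -File "C:\dev\package.zip" `
                             -Container "uploads" `
                             -Blob "package.zip" `
                             -Context $context

 # On the VM (e.g. from the script your application runs): pull the blob down to a local disk
 Get-AzureStorageBlobContent -Blob "package.zip" `
                             -Container "uploads" `
                             -Destination "C:\incoming\package.zip" `
                             -Context $context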

+2

Here is the approach I described here. It involves:

  • Creating and mounting an empty local VHD.
  • Copying the files to the new VHD and dismounting it.
  • Uploading the VHD to Azure blob storage.
  • Attaching that VHD to your virtual machine.

Here is an example:

 #Create and mount a new local VHD
 $volume = New-VHD -Path test.vhd -SizeBytes 50MB | `
     Mount-VHD -PassThru | `
     Initialize-Disk -PartitionStyle MBR -Confirm:$false -PassThru | `
     New-Partition -UseMaximumSize -AssignDriveLetter -MbrType IFS | `
     Format-Volume -NewFileSystemLabel "VHD" -Confirm:$false

 #Copy my files
 Copy-Item C:\dev\boxstarter "$($volume.DriveLetter):\" -Recurse
 Dismount-VHD test.vhd

 #Upload the VHD to Azure
 Add-AzureVhd -Destination http://mystorageacct.blob.core.windows.net/vhdstore/test.vhd `
     -LocalFilePath test.vhd

 #Mount the VHD to my VM
 Get-AzureVM MyCloudService MyVMName | `
     Add-AzureDataDisk -ImportFrom `
         -MediaLocation "http://mystorageacct.blob.core.windows.net/vhdstore/test.vhd" `
         -DiskLabel "boxstarter" -LUN 0 | `
     Update-AzureVM
+5

Here is the code I took from some PowerShell examples and modified. It works over a session created with New-PSSession ; a handy wrapper for creating the session is included below as well. Finally, I also needed to send a whole folder, so that function is included too.

An example of using the functions together:

 # open remote session
 $session = Get-Session -uri $uri -credentials $credential

 # copy installer to VM
 Write-Verbose "Checking if file $installerDest needs to be uploaded"
 Send-File -Source $installerSrc -Destination $installerDest -Session $session -onlyCopyNew $true

 <#
 .SYNOPSIS
   Returns a session given the URL
 .DESCRIPTION
   http://michaelcollier.wordpress.com/2013/06/23/using-remote-powershell-with-windows-azure-vms/
 #>
 function Get-Session($uri, $credentials) {
     for($retry = 0; $retry -le 5; $retry++) {
         try {
             $session = New-PSSession -ComputerName $uri[0].DnsSafeHost -Credential $credentials -Port $uri[0].Port -UseSSL
             if ($session -ne $null) {
                 return $session
             }

             Write-Output "Unable to create a PowerShell session . . . sleeping and trying again in 30 seconds."
             Start-Sleep -Seconds 30
         }
         catch {
             Write-Output "Unable to create a PowerShell session . . . sleeping and trying again in 30 seconds."
             Start-Sleep -Seconds 30
         }
     }
 }

 <#
 .SYNOPSIS
   Sends a file to a remote session.
   NOTE: will delete the destination before uploading
 .EXAMPLE
   $remoteSession = New-PSSession -ConnectionUri $remoteWinRmUri.AbsoluteUri -Credential $credential
   Send-File -Source "c:\temp\myappdata.xml" -Destination "c:\temp\myappdata.xml" $remoteSession

   Copy the required files to the remote server
   $remoteSession = New-PSSession -ConnectionUri $frontEndwinRmUri.AbsoluteUri -Credential $credential
   $sourcePath = "$PSScriptRoot\$remoteScriptFileName"
   $remoteScriptFilePath = "$remoteScriptsDirectory\$remoteScriptFileName"
   Send-File $sourcePath $remoteScriptFilePath $remoteSession

   $answerFileName = Split-Path -Leaf $WebPIApplicationAnswerFile
   $answerFilePath = "$remoteScriptsDirectory\$answerFileName"
   Send-File $WebPIApplicationAnswerFile $answerFilePath $remoteSession
   Remove-PSSession -InstanceId $remoteSession.InstanceId
 #>
 function Send-File {
     param (
         ## The path on the local computer
         [Parameter(Mandatory = $true)]
         [string] $Source,

         ## The target path on the remote computer
         [Parameter(Mandatory = $true)]
         [string] $Destination,

         ## The session that represents the remote computer
         [Parameter(Mandatory = $true)]
         [System.Management.Automation.Runspaces.PSSession] $Session,

         ## should we quit if the file already exists?
         [bool] $onlyCopyNew = $false
     )

     $remoteScript = {
         param ($destination, $bytes)

         # Convert the destination path to a full filesystem path (to support relative paths)
         $Destination = $ExecutionContext.SessionState.`
             Path.GetUnresolvedProviderPathFromPSPath($Destination)

         # Write the content to the new file
         $file = [IO.File]::Open($Destination, "OpenOrCreate")
         $null = $file.Seek(0, "End")
         $null = $file.Write($bytes, 0, $bytes.Length)
         $file.Close()
     }

     # Get the source file, and then start reading its content
     $sourceFile = Get-Item $Source

     # Delete the previously-existing file if it exists
     $abort = Invoke-Command -Session $Session {
         param ([String] $dest, [bool] $onlyCopyNew)

         if (Test-Path $dest) {
             if ($onlyCopyNew -eq $true) {
                 return $true
             }
             Remove-Item $dest
         }

         $destinationDirectory = Split-Path -Path $dest -Parent
         if (!(Test-Path $destinationDirectory)) {
             New-Item -ItemType Directory -Force -Path $destinationDirectory
         }

         return $false
     } -ArgumentList $Destination, $onlyCopyNew

     if ($abort -eq $true) {
         Write-Host 'Ignored file transfer - already exists'
         return
     }

     # Now break it into chunks to stream
     Write-Progress -Activity "Sending $Source" -Status "Preparing file"
     $streamSize = 1MB
     $position = 0
     $rawBytes = New-Object byte[] $streamSize
     $file = [IO.File]::OpenRead($sourceFile.FullName)

     while (($read = $file.Read($rawBytes, 0, $streamSize)) -gt 0) {
         Write-Progress -Activity "Writing $Destination" -Status "Sending file" `
             -PercentComplete ($position / $sourceFile.Length * 100)

         # Ensure that our array is the same size as what we read from disk
         if ($read -ne $rawBytes.Length) {
             [Array]::Resize( [ref] $rawBytes, $read)
         }

         # And send that array to the remote system
         Invoke-Command -Session $session $remoteScript -ArgumentList $destination, $rawBytes

         # Ensure that our array is the same size as what we read from disk
         if ($rawBytes.Length -ne $streamSize) {
             [Array]::Resize( [ref] $rawBytes, $streamSize)
         }

         [GC]::Collect()
         $position += $read
     }

     $file.Close()

     # Show the result
     Invoke-Command -Session $session { Get-Item $args[0] } -ArgumentList $Destination
 }

 <#
 .SYNOPSIS
   Sends all files in a folder to a remote session.
   NOTE: will delete any destination files before uploading
 .EXAMPLE
   $remoteSession = New-PSSession -ConnectionUri $remoteWinRmUri.AbsoluteUri -Credential $credential
   Send-Folder -Source 'c:\temp\' -Destination 'c:\temp\' $remoteSession
 #>
 function Send-Folder {
     param (
         ## The path on the local computer
         [Parameter(Mandatory = $true)]
         [string] $Source,

         ## The target path on the remote computer
         [Parameter(Mandatory = $true)]
         [string] $Destination,

         ## The session that represents the remote computer
         # [Parameter(Mandatory = $true)]
         [System.Management.Automation.Runspaces.PSSession] $Session,

         ## should we quit if files already exist?
         [bool] $onlyCopyNew = $false
     )

     foreach ($item in Get-ChildItem $Source) {
         if (Test-Path $item.FullName -PathType Container) {
             Send-Folder $item.FullName "$Destination\$item" $Session $onlyCopyNew
         }
         else {
             Send-File -Source $item.FullName -Destination "$destination\$item" -Session $Session -onlyCopyNew $onlyCopyNew
         }
     }
 }
+4

Using AzCopy:

  1. Upload the file to storage:

     .\AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/myfolder/ /DestKey:key /Pattern:abc.txt

  2. Log on to the remote virtual machine.

  3. Download the single file onto the VM:

     .\AzCopy /Source:https://myaccount.file.core.windows.net/myfileshare/myfolder/ /Dest:C:\myfolder /SourceKey:key /Pattern:abc.txt
+1

Another solution is to use the Custom Script Extension.
Using a custom script extension lets you copy a file to a virtual machine even if the virtual machine has no public IP address (private network only), so you do not need to configure WinRM or anything else.

I have used custom script extensions in the past for post-deployment tasks, such as installing an application on a virtual machine or a virtual machine scale set. Basically, you upload the files to blob storage, and the custom script extension downloads those files onto the virtual machine.

I created a test-container container on my blob storage account and uploaded two files:

  • deploy.ps1 : the script executed on the virtual machine.
  • test.txt : a text file containing "Hello world from VM".
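
If you want to script that upload step as well, here is a minimal sketch, assuming the ARM-era storage cmdlets and the same resource group, storage account, and container names used in the script further below (any other upload method works just as well):

 # Sketch: upload the two files to the container
 $storageAccountKeys = (Get-AzureRmStorageAccountKey -ResourceGroupName "resourcegroupname" -Name "storageaccountname").Value
 $context = New-AzureStorageContext -StorageAccountName "storageaccountname" -StorageAccountKey $storageAccountKeys[0]
 Set-AzureStorageBlobContent -File .\deploy.ps1 -Container "test-container" -Blob "deploy.ps1" -Context $context -Force
 Set-AzureStorageBlobContent -File .\test.txt -Container "test-container" -Blob "test.txt" -Context $context -Force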

Here is the code for the deploy.ps1 file:

 Param(
     [string] [Parameter(Mandatory=$true)] $filename,
     [string] [Parameter(Mandatory=$true)] $destinationPath
 )

 # Getting the full path of the downloaded file
 $filePath = $PSScriptRoot + "\" + $filename

 Write-Host "Checking the destination folder..." -Verbose
 if(!(Test-Path $destinationPath -Verbose)){
     Write-Host "Creating the destination folder..." -Verbose
     New-Item -ItemType directory -Path $destinationPath -Force -Verbose
 }

 Copy-Item $filePath -Destination $destinationPath -Force -Verbose

Here is the code for adding a custom script extension to a virtual machine.

 Login-AzureRmAccount

 $resourceGroupName = "resourcegroupname"
 $storageAccountName = "storageaccountname"
 $containerName = "test-container"
 $location = "Australia East"
 $vmName = "TestVM"
 $extensionName = "copy-file-to-vm"
 $filename = "test.txt"
 $deploymentScript = "deploy.ps1"
 $destinationPath = "C:\MyTempFolder\"

 $storageAccountKeys = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName).Value
 $storageAccountKey = $storageAccountKeys[0]

 Set-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName `
     -VMName $vmName `
     -Name $extensionName `
     -Location $location `
     -TypeHandlerVersion "1.9" `
     -StorageAccountName $storageAccountName `
     -StorageAccountKey $storageAccountKey `
     -ContainerName $containerName `
     -FileName $deploymentScript, $filename `
     -Run $deploymentScript `
     -Argument "$filename $destinationPath" `
     -ForceRerun "1"

You can remove the extension after copying the file:

 Remove-AzureRmVMCustomScriptExtension -ResourceGroupName $resourceGroupName -VMName $vmName -Name $extensionName -Force 

In my scenario, I have a Logic App that runs every time a new file is added to the container. The Logic App calls a runbook (an Azure Automation account is required) that adds the custom script extension and then removes it.
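
For reference, a hypothetical skeleton of such a runbook; the Run As connection name and the parameter are assumptions, and the extension calls themselves are the same ones shown above:

 # Hypothetical runbook skeleton: authenticate with the Automation Run As connection,
 # then add and remove the custom script extension exactly as shown above.
 param (
     [Parameter(Mandatory = $true)] [string] $filename
 )

 $connection = Get-AutomationConnection -Name "AzureRunAsConnection"
 Add-AzureRmAccount -ServicePrincipal `
     -TenantId $connection.TenantId `
     -ApplicationId $connection.ApplicationId `
     -CertificateThumbprint $connection.CertificateThumbprint

 # Set-AzureRmVMCustomScriptExtension ...    (as above)
 # Remove-AzureRmVMCustomScriptExtension ... (as above)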

0
