PowerShell: how do you read and write the same file in one pipeline?

I would like to be able to enter quick, simple commands that modify files in place. For example:

    # prettify an XML file
    format-xml foo | out-file foo

This will not work, because the pipeline is designed to stream objects one at a time. The downstream cmdlet takes a write lock on the file as soon as the upstream cmdlet processes the first line of input, which prevents the upstream cmdlet from reading the rest of the file.

There are many possible workarounds: writing to a temporary file, performing the operation in several separate pipelines (storing intermediate results in variables, as sketched below), and so on. But I believe this is a common enough task that someone has developed a quick, shell-friendly shortcut for it.
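For concreteness, a minimal sketch of the variable-based workaround (assuming format-xml is the same cmdlet as in the example above, e.g. PSCX's Format-Xml, and foo is the file being prettified):

    # Buffer the formatted output in a variable first, then write it
    # back in a second pipeline once the file is no longer being read.
    $pretty = format-xml foo
    $pretty | out-file foo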

I tried this:

    function Buffer-Object {
        [CmdletBinding()]
        param (
            [parameter(Mandatory=$True, ValueFromPipeline=$True)]
            [psobject] $InputObject
        )
        # Collect every pipeline object into an in-memory list,
        # then emit the whole buffer at the end.
        begin   { $buf = New-Object System.Collections.Generic.List[psobject] }
        process { $buf.Add($InputObject) }
        end     { $buf }
    }

    format-xml foo | buffer-object | out-file foo

In some situations it works fine. Given a short alias and included in a general distribution such as PSCX, it would be "good enough" for quick interactive tasks. Unfortunately, some cmdlets (including out-file) appear to acquire the lock in their Begin {} block rather than in Process {}, so it does not solve this specific example.
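As an aside, a common PowerShell idiom for getting around that Begin-block behaviour is to force the upstream command to run to completion before the pipeline starts; a minimal sketch, again assuming the format-xml cmdlet from the example:

    # The parentheses evaluate the command completely (and release the
    # file) before out-file opens it for writing; this is the same idea
    # as the classic (Get-Content file) | Set-Content file pattern.
    (format-xml foo) | out-file foo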

Other ideas?

file-io powershell pipeline
1 answer

As far as I remember (I can't test right now), you can read the whole file into memory using the namespace variable notation (${} with a drive-qualified path):

    ${c:file1.txt} = ${c:file1.txt} -replace "a","o"
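A usage sketch of the same mechanism with a standard cmdlet (file1.txt is the hypothetical file from the answer): the right-hand side of the assignment is evaluated completely before the variable provider writes the result back, so the file is never being read and written at the same time.

    # ${c:file1.txt} reads the file's lines; piping them through
    # Sort-Object and assigning back writes the sorted lines to the
    # same file in a single statement.
    ${c:file1.txt} = ${c:file1.txt} | sort-object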
