@poshcodebear
Last active September 24, 2015 00:18
sg-september-2015.ps1
(Add-Content -Value 'MACHINENAME,OSVERSION' -Path .\Output.csv -Encoding Ascii -PassThru) | Get-WmiObject Win32_OperatingSystem -ComputerName (Import-Csv -Path .\Input.csv).MACHINENAME -ErrorAction Ignore | foreach {Add-Content -Value "$($_.PSComputerName),$($_.Caption)" -Path .\Output.csv -Encoding Ascii}
@poshcodebear
Author

My entry for September's Scripting Games puzzle. It took me a bit to get it working the way I wanted: I tried to eliminate all semicolons and curly braces, and I almost made it, ending up with just one pair of curly braces and no semicolons in the entire string.

Initially, I went the more direct route of using Select-Object with dynamically generated properties and then Export-Csv, which formats everything perfectly and sets the encoding correctly. But that left me with 2 semicolons and 4 curly brace pairs, which I couldn't eliminate with this method while still fulfilling the basic requirements (specifically, the header names; if I could have ignored the header names, this would have been easy).
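
That first attempt was something along these lines (a rough sketch rather than the exact command; the two calculated properties are where the 2 semicolons and 4 curly brace pairs come from):

# Rough sketch of the Select-Object / Export-Csv approach (2 semicolons, 4 curly brace pairs)
Get-WmiObject Win32_OperatingSystem -ComputerName (Import-Csv -Path .\Input.csv).MACHINENAME -ErrorAction Ignore | Select-Object @{Name='MACHINENAME';Expression={$_.PSComputerName}}, @{Name='OSVERSION';Expression={$_.Caption}} | Export-Csv -Path .\Output.csv -NoTypeInformation -Encoding ASCII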

Then I decided to try manually building the file with Out-File: first creating the file with the header names, then breaking the pipeline with a semicolon to append the rest to the end. Using a foreach loop to append the data to the file, where I had previously used Select-Object, reduced both the semicolon and curly brace counts. That brought me down to one curly brace pair and no semicolons for writing the output, but I had to add a semicolon at the beginning, which felt like cheating, so I wanted to find a way to eliminate that one as well.
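
That intermediate version looked roughly like this (again, a reconstruction, not the exact text):

# Approximate Out-File version: one semicolon up front, one curly brace pair in the foreach
'MACHINENAME,OSVERSION' | Out-File -FilePath .\Output.csv -Encoding ascii; Get-WmiObject Win32_OperatingSystem -ComputerName (Import-Csv -Path .\Input.csv).MACHINENAME -ErrorAction Ignore | foreach {"$($_.PSComputerName),$($_.Caption)" | Out-File -FilePath .\Output.csv -Encoding ascii -Append}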

I almost gave up on this last part.

The challenge was that I wanted to write the header to the file and then have the pipeline continue from there without breaking it. The problem is that Out-File does not put anything onto the pipeline, so the pipe would terminate without ever executing Get-WmiObject. I needed whatever wrote the header to the file to also put something on the pipeline. The natural choice seemed to be Tee-Object, so I started there. It sort of worked: it did put something on the pipeline, which tricked Get-WmiObject into executing, though it also caused Get-WmiObject to throw an error; telling it to ignore the error, however, let the rest of the pipeline execute exactly as I hoped it would.
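
The Tee-Object attempt was more or less this (the appending foreach tail is the same as before):

# Approximate Tee-Object attempt: the header string also flows into Get-WmiObject,
# which is what trips the error that -ErrorAction Ignore suppresses
'MACHINENAME,OSVERSION' | Tee-Object -FilePath .\Output.csv | Get-WmiObject Win32_OperatingSystem -ComputerName (Import-Csv -Path .\Input.csv).MACHINENAME -ErrorAction Ignore | foreach {"$($_.PSComputerName),$($_.Caption)" | Out-File -FilePath .\Output.csv -Encoding ascii -Append}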

Unfortunately, there were two problems with Tee-Object. First, there's no way to set the encoding, and the default encoding made the CSV file it put out completely useless. Second, it didn't write to the file until the pipeline completed, so it would either overwrite the actual output if I didn't specify append, or add the header at the bottom if I did.

Without the right encoding, I wasn't going to be able to proceed, so I set out to fix that first. Some digging turned up Add-Content, which has two parameters that were exactly what I was looking for: Encoding and PassThru. The latter told me it would put whatever I wrote back onto the pipeline to trick Get-WmiObject into running. This produced usable results, though the header still ended up at the bottom.
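
A quick way to see what -PassThru does is to run the first command on its own (the file name here is just a throwaway for demonstration):

# -PassThru emits the value that was just written, so something ends up on the pipeline
Add-Content -Value 'MACHINENAME,OSVERSION' -Path .\PassThruDemo.csv -Encoding Ascii -PassThru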

To fix the header, I decided to try forcing the pipeline to finish the first command before proceeding by surrounding it with parentheses. I honestly didn't expect that to work, but to my delight it did: it produced a properly formatted CSV file with the header in the right place, encoded so that both Excel and PowerShell could read it correctly.

The one downside to this method is that it will not create a new output CSV if one already exists; instead it appends to the existing file, leaving another copy of the header embedded in the middle. That said, ensuring a new file was not one of the requirements, and you typically wouldn't force these limitations in production anyway, so I'm calling this one good.
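
If a fresh file did matter, the simplest workaround would be clearing out any old copy before the one-liner runs (as a separate statement, so it doesn't count against the puzzle constraints):

# Delete a leftover Output.csv if present; Ignore suppresses the error when it doesn't exist
Remove-Item -Path .\Output.csv -ErrorAction Ignore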
