Hello reader. One thing I’ve learned the hard way over the last year is the pain of writing scripts for large-scale environments. PowerShell often makes it painfully simple to write a script, but I’ve found that sometimes that script can run far too long, especially if there’s an unforeseen error.
In the past I’ve had scripts that might run 24-48 hours. I always try to optimize these scripts as much as possible, but it’s not always so easy depending on the number of items I’m working against.
One way to combat this is to break up a script so that I get incremental reporting. This is usually done with loops that write out to a .CSV file after completing each vCenter Server.
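The loop-and-append pattern looks roughly like this. A minimal sketch in Python rather than PowerShell, where `audit_server` is a hypothetical stand-in for the real per-vCenter work:

```python
import csv
import os

def audit_server(server):
    # Hypothetical stand-in for the real per-vCenter query.
    return [{"Server": server, "Host": server + "-host1", "Status": "OK"}]

def incremental_report(servers, out_dir):
    # Write one CSV per server so partial results survive a long run
    # that dies partway through.
    for server in servers:
        rows = audit_server(server)
        path = os.path.join(out_dir, server + ".csv")
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)
```

If the script dies on server 40 of 50, the first 39 reports are already on disk.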
Since I do this quite often I wanted to create a function I could add to the end of my script that would then combine all of the .CSVs without duplicating the header information.
The function requires a -SourceFolder that contains your .CSV files.
Use -Filter to filter which .CSV files are grabbed.
Skip the -ExportFileName parameter and the results will display only in the console.
Combine-CSV -SourceFolder "C:\Temp" -Filter "vc*.csv" -ExportFileName "EVC-Audit-Combined.csv"
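The full function lives on PoshCode; the core idea (keep the first file’s header row, drop it from every later file) can be sketched like this in Python. The parameter names mirror the PowerShell ones, but the structure is my assumption, not the original implementation:

```python
import glob
import os

def combine_csv(source_folder, pattern="*.csv", export_file=None):
    # Concatenate matching CSVs, keeping only the first file's header.
    lines = []
    paths = sorted(glob.glob(os.path.join(source_folder, pattern)))
    for i, path in enumerate(paths):
        with open(path) as f:
            file_lines = f.read().splitlines()
        # Keep the header only from the first file; skip it elsewhere.
        lines.extend(file_lines if i == 0 else file_lines[1:])
    combined = "\n".join(lines) + "\n"
    if export_file:
        with open(export_file, "w") as f:
            f.write(combined)
    return combined
```

This assumes every file shares the same header row, which holds when the files were all produced by the same reporting loop.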
Download the full function from PoshCode
I’m always game for optimizations and improvements so feel free to add them to the comments.
Hope you find this useful. Thanks for reading.
Update: A sharp reader, Hemanth.D (@SqlChow), pointed out to me today (2013-02-19) that you could use the following command to combine the .csv files:
copy /A *.csv EVC-Audit-Combined.csv
He is absolutely correct that this will combine the files. The only thing it lacks is stripping the header of each individual CSV file. Nonetheless I plan to store this one in my pocket for future use. I can see it as useful when combining lists.
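To see the difference, here is a rough Python illustration of what byte-level concatenation (which is essentially what copy /A does) leaves behind: one header row per input file, instead of a single header for the combined output.

```python
def naive_concat(paths):
    # Byte-for-byte concatenation, roughly what `copy /A` does: every
    # input file's header row survives into the output.
    parts = []
    for p in paths:
        with open(p) as f:
            parts.append(f.read())
    return "".join(parts)
```

For plain lists without a header row, though, this simple concatenation is exactly what you want.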
Note: this command needs to be run in a Command Prompt (cmd.exe) rather than PowerShell, since PowerShell uses ‘copy’ as an alias for Copy-Item.
Link: Info about the copy command: