PowerShell: Logging in JSON format
When writing scripts for clients, it's important to generate good logs. It's a common situation: a script works perfectly in test, but in production something is askew.
Being able to look back at the process and step through after the fact is essential.
Usually my logs are written in CSV format. CSV is great for parsing, but it's awful to read in a text file. Because of this, I decided to modify my log function to write in JSON format instead.
The function is intended to be run as part of a script. As tasks complete in the script, they call this function to provide data. This data is then written to the file.
Each time the function runs, it has to reopen the log file and pull the existing data. This could create a performance impact in very large scripts. Still, because accurate logs are required for debugging, it is necessary to get that data onto a non-volatile storage medium rather than hold it in RAM.
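The original function isn't shown here, but a minimal sketch of the approach described above might look like the following. All names (Write-JsonLog, -KeyName, the log path) are my illustration, not the author's code:

```powershell
# Hypothetical sketch of a JSON log function; names are illustrative.
function Write-JsonLog {
    param(
        [Parameter(Mandatory = $true)]
        [string]$KeyName,      # top-level key, e.g. an AD group name

        [string]$Result,       # outcome of the task
        [string]$Message,      # extra detail, e.g. an exception message
        [string]$LogFile = 'C:\Logs\script.json'   # assumed path
    )

    # Reopen the log and pull the existing data on every call, so
    # entries land on disk (non-volatile) rather than sitting in RAM.
    $log = @{}
    if (Test-Path $LogFile) {
        (Get-Content $LogFile -Raw | ConvertFrom-Json).PSObject.Properties |
            ForEach-Object { $log[$_.Name] = $_.Value }
    }

    # Null values are written as the empty string "".
    $log[$KeyName] = @{
        Result  = if ($null -eq $Result)  { '' } else { $Result }
        Message = if ($null -eq $Message) { '' } else { $Message }
    }

    $log | ConvertTo-Json -Depth 5 | Set-Content $LogFile
}
```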
For large scripts, you may need to consider multiple log files, or implement log levels to control what gets logged.
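One possible way to gate logging by level (again, an illustrative sketch rather than anything from the original script):

```powershell
# Illustrative log-level gate; names and numbering are assumptions.
$script:MinLogLevel = 1   # 0 = Debug, 1 = Info, 2 = Error

function Test-ShouldLog {
    param([int]$Level)
    # Only entries at or above the configured minimum are written.
    return $Level -ge $script:MinLogLevel
}
```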
Named parameters were used for script readability.
I think I could do better with naming the function, but that's what stuck at 2 AM.
My most recent use for the function was working with AD groups, so I wanted to use the group name as the top-level key name. Since it is the key value, it is a mandatory parameter.
Values which are null are written to the log file as the empty string “”.
The function and its parameters can be modified easily, making it adaptable to other tasks.
Output and Parsing
Below is a sample of the output. You will notice that the exception message uses escaped characters; this is part of the ConvertTo-Json cmdlet. When using ConvertFrom-Json, these are decoded back to the original characters.
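The author's actual output isn't reproduced here, but an entry of this shape (group name, error message, and all) is purely illustrative of what ConvertTo-Json's escaping looks like:

```powershell
# Illustrative log entry, not the actual output from the article.
# Note the escaped apostrophes (\u0027) produced by ConvertTo-Json.
$json = @'
{
  "Finance Admins": {
    "Result": "Error",
    "Message": "Cannot find an object with identity: \u0027Finance Admins\u0027"
  }
}
'@

# ConvertFrom-Json decodes the escapes back to the original characters:
($json | ConvertFrom-Json).'Finance Admins'.Message
```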
Using unique key values as we have presents an interesting challenge for parsing. If we wanted to find groups which had an error, we need to search by the nested key 'Result'.
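One way to do that search (a sketch assuming the log structure described above; the file path and the $x/ErrorGroups names are my assumptions) is to walk the top-level properties of the parsed object:

```powershell
# Hypothetical parsing sketch; path and variable names are assumptions.
$log = Get-Content 'C:\Logs\script.json' -Raw | ConvertFrom-Json

# Walk every top-level key and keep the ones whose nested
# 'Result' value is 'Error'.
$x = [PSCustomObject]@{
    ErrorGroups = $log.PSObject.Properties |
        Where-Object { $_.Value.Result -eq 'Error' } |
        ForEach-Object { $_.Name }
}
```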
And you can use $x.ErrorGroups to see the groups which had errors.