Export SCCM task sequence variables with PowerShell

Whenever I start with a new technology or project, I try to gather as much information as possible to get an idea of what I'm working with.  I've recently started working with SCCM task sequences, and similar to MDT, there are built-in task sequence steps that cover the basic tasks that most system administrators need to perform.  However, as environments become more complex and we are asked to do more and to do it efficiently, we sometimes need to get deep into the weeds and pull out a hidden gem.

There is a trove of data required to perform actions in a task sequence that is hidden from view. SCCM uses this data to determine which servers to talk to, where packages are located, which step is currently running, and much more.  If we want to be able to use it, we need to know what is there. There are blog posts from years ago explaining how to export this information using VBScript, but this is 2017 and we deserve a PowerShell way to do it!  This code will export all task sequence variables, including those defined on collections and devices.



With the line below, we create a new PowerShell object based on the SCCM task sequence COM object and store it in $TSEnv.
$TSEnv = New-Object -ComObject Microsoft.SMS.TSEnvironment
There are a few interesting methods to explore on this object, but for now we will focus on two of them:

  1. GetVariables()
  2. Value()
The first method allows us to query the currently running task sequence for the names of all available task sequence variables.  The second method returns the value of the variable specified inside the parentheses.  By using both of them together in a loop, we can return a list of every variable and its respective value and store it in a variable.  We then use Out-File to write the variables to a text file on the system drive (X: if run in WinPE).
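Put together, a minimal sketch of that loop looks like the following (the variable names are mine; the method names come from the Microsoft.SMS.TSEnvironment COM object):

```powershell
# Minimal sketch of the export loop described above; GetVariables() and
# Value() are methods on the Microsoft.SMS.TSEnvironment COM object.
$TSEnv = New-Object -ComObject Microsoft.SMS.TSEnvironment
$Output = foreach ($Variable in $TSEnv.GetVariables()) {
    # Build a key=value line for each task sequence variable
    '{0}={1}' -f $Variable, $TSEnv.Value($Variable)
}
$Output | Out-File -FilePath "$ENV:SystemDrive\Windows\Temp\TSVariables.txt"
```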

Now we need to add the script to a task sequence and run it.  There are a few ways to invoke PowerShell code in a task sequence:
  1. A Run PowerShell Script step, providing a script name that exists within a package
  2. A Run Command Line step, invoking PowerShell.exe with the -File parameter
  3. A Run Command Line step, invoking PowerShell.exe with the -Command parameter
There are plenty of guides on how to use items 1 and 2 above, so let's play with the third.  To do this, we need to concatenate all of the commands so that they run in one PowerShell instance.  This can be done with semicolons, which separate statements so that PowerShell runs each one in turn before exiting.  When done, it looks like this:
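Reconstructing that one-liner from the description above, it could look something like this (a sketch; quoting may need adjustment for your environment, and single quotes are used inside the command to avoid escaping issues):

```powershell
# One-liner form for a Run Command Line step: all statements joined with
# semicolons so they run in a single PowerShell instance.
powershell.exe -Command "$TSEnv = New-Object -ComObject Microsoft.SMS.TSEnvironment; $TSEnv.GetVariables() | ForEach-Object { $_ + '=' + $TSEnv.Value($_) } | Out-File $ENV:SystemDrive\Windows\Temp\TSVariables.txt"
```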


We can then paste this code into a Run Command Line step in the task sequence and deploy it to an SCCM client.  



After the task sequence completes, you will find a file in $ENV:SystemDrive\Windows\Temp named TSVariables.txt that contains a key=value listing of the variables.  One word of caution: this exports ALL task sequence variables, including sensitive and masked variables such as the SCCM client push credentials (noted by variable names starting with "_SMSTSReserved").  If you want to avoid exporting these, you can update the GetVariables code to match the code below.
$Vars = $TSEnv.GetVariables() | Where-Object {$_ -notmatch 'Password' -and $_ -notmatch 'Reserved'}
Note that the conditions must be joined with -and (with -or, every variable would pass the filter, since no name matches both patterns).  You can add as many -and clauses as needed to filter sensitive information.  In a later post we'll go through some of the interesting variables and what they can be used for.

Quick Hits: Set-CMSite and Set-AdaptivaServer

It's eclipse star-gazing time, so the blog post this weekend comes to you from the road and features two functions that I have used extensively when working in my lab environment for SCCM and Adaptiva.

Set-CMSite

This function requires local administrator rights on the device and switches the site that your SCCM agent is connected to.  After execution, you can use my Get-CMLog function to follow along with the logs while your machine connects to the new infrastructure and starts performing registration activities.
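Under the hood, switching the assigned site boils down to a call like the following (a hedged sketch, not the full function; 'LAB' is a placeholder site code, and SMS_Client is the SCCM client SDK's WMI class in the root\ccm namespace):

```powershell
# Hedged sketch: reassign the SCCM client to a new site via the SMS_Client
# WMI class (requires local administrator rights, as noted above).
$SMSClient = [wmiclass]'root\ccm:SMS_Client'
$SMSClient.SetAssignedSite('LAB')   # 'LAB' is a placeholder site code
```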

Set-CMSite on Github



Set-AdaptivaServer

If you have any low-bandwidth locations being served by your SCCM infrastructure, you may have looked into ways to easily provide content to those locations.  One option is an Alternate Content Provider that caches content locally and uses peer-to-peer storage.  One that I have been testing is Adaptiva OneSite.  With multiple environments, having an easy way to switch a client from one infrastructure to another, similar to Set-CMSite above, is crucial.

Set-AdaptivaServer on Github

Note: To maintain cached content at the branch site, you must use the built-in SCCM migration tool to mirror content to the new SCCM infrastructure and also publish the content to the new Adaptiva server.  Otherwise, the content will fail a hash check and will need to be downloaded across the WAN again.


Visualizing objects in the PowerShell console

Last week we talked about how to enhance time-based PowerShell objects by adding a duration. This provides useful metrics, but humans are visual by nature and it would help even more if we could visualize the numbers.  As with any task, I started my search for console visualizations by googling to see if anyone else had written something I could use.  I came across a blog post from Jeff Hicks in 2013 that showcased his PowerShell console graphing tool.  The console graphs were close to what I wanted, but in testing I found that they dropped all of the object's other properties, so I couldn't retain important and relevant data.

I set about modifying the function to my liking and ended up with the following changes:
  • Removed color-based conditional formatting for readability and ease-of-use
  • Modified the output to be object-based 
  • Added ability to specify columns to keep
The first modification ended up being more of a usability issue for me. My use-cases did not require changing the colors of the graph, and removing this functionality reduces the complexity of the script and the number of mandatory properties.  The leaner script also made my next tasks much easier.

In the process of updating the script, it became clear that meeting my requirement of retaining properties would also require returning objects instead of writing to the host.  After editing the code to maintain the objects passed into the function, it was simple enough to convert the Write-Host of the bars in the chart to instead add a new property with the bar as a string.

Once all of that code was modified, I quickly realized that the more properties we specified, and the wider they were, the less space we had for charting.  That's the exact opposite of my original problem; now there's TOO much data!  By implementing my original requirement of specifying which columns to keep, we restrict the data so that we can provide more helpful charts.

With the function complete, we can do fun things like chart the top 10 memory hogs:
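In the spirit of that output, here is an inline sketch of the bar-as-property idea (not the full Out-ConsoleGraph function): scale each value against the maximum and attach the bar as a string property so it survives the pipeline.

```powershell
# Sketch: top 10 processes by working set, with a text bar added as a
# calculated property instead of being written to the host.
$Top = Get-Process | Sort-Object -Property WS -Descending | Select-Object -First 10
$Max = ($Top | Measure-Object -Property WS -Maximum).Maximum
$Top | Select-Object -Property Name, WS, @{
    Name       = 'Graph'
    # Repeat a block character proportionally to the value (40-column max)
    Expression = { ([string][char]9608) * [math]::Round(($_.WS / $Max) * 40) }
}
```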


And we can also get an idea of how many commands that PowerShell modules contain:



The Out-ConsoleGraph function is available on Github.



Enhancing time-based objects with duration

Often when troubleshooting we come across log files that have tons of data.  It can be hard to digest properly, so we look for tools that can break it down and provide some insight into what is happening.  Last week, I shared a blog post on parsing System Center logs with PowerShell.  If we use that function to convert the log lines into objects, we can now manipulate them and add calculated fields to provide even more data to assist in troubleshooting.

One of the important angles when troubleshooting is "How long did each task in the process take?". To answer this question, we need to take a look at each line in the logs and compare it to the following one.  With PowerShell, the pipeline processes each object one at a time, so how do we accomplish this with code?

The method that I came up with was to include a stutter-step in the pipeline.  Capture the first log line in a variable and hold onto it until the pipeline moves to the next entry.  We can then calculate the difference in the date/time fields and add it as a new property.  The second entry in the log then overwrites the first as the stored variable, and we continue until the last entry in the log.  As there is nothing to compare the last entry to, we simply set its duration to a dash.
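A minimal sketch of that stutter-step follows (the full function is on Github; the parameter names here are mine, while UTCTime and TotalMilliseconds match the post):

```powershell
# Sketch of the stutter-step: hold each object until its successor arrives,
# compute the time difference, and emit the held object with a new property.
function Add-Duration {
    param(
        [Parameter(ValueFromPipeline)] $InputObject,
        [string]$Property = 'UTCTime',
        [string]$Unit = 'TotalMilliseconds'
    )
    begin { $Previous = $null }
    process {
        if ($Previous) {
            # Duration = next entry's timestamp minus the held entry's timestamp
            $Duration = (New-TimeSpan -Start $Previous.$Property -End $InputObject.$Property).$Unit
            $Previous | Add-Member -NotePropertyName $Unit -NotePropertyValue $Duration -PassThru
        }
        $Previous = $InputObject
    }
    end {
        if ($Previous) {
            # Last entry has nothing to compare to, so it gets a dash
            $Previous | Add-Member -NotePropertyName $Unit -NotePropertyValue '-' -PassThru
        }
    }
}
```

Usage would look like `Get-CMLog smsts.log | Add-Duration`.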



As you can see above, the Add-Duration function reads in the output of Get-CMLog, selects the UTCTime property for comparison, and adds the results to a property called TotalMilliseconds.

Depending on the duration of your logs, you may want to use one of the other options for duration such as Days, Hours, Minutes, or Seconds.  This is useful when troubleshooting long-running tasks such as OSD, or even when you just want to figure out what is taking a long time in the process so that you can track, trend, and reduce it.

And this function is not just for System Center logs.  It should work on any time-based object as long as you specify the property containing a [DateTime] field that New-TimeSpan can parse.

The full code is available below and on Github.

SCCM Log Parser

Anyone who has ever worked with SCCM loves and adores CMTrace.exe for its ability to parse the System Center logs. Countless headaches have been averted in its name.  Still, it leaves quite a bit to be desired.  To address some of the shortcomings, Microsoft has released CMLogViewer.exe.  This new tool supports quick filters, merging log files, and advanced filters.  Even with the added features, it still lacks the flexibility that PowerShell can offer.

I started my adventure to do more by searching online for anything that met my needs:
  • Read logs in the SCCM format
  • Parse a single log or multiple logs (must provide filename as a property)
  • Provide either local or UTC timestamps for global troubleshooting
  • Accepts logs directly or via pipeline

After not finding anything suitable, I set about writing my own.  Thankfully the fields in the log file are self-explanatory as key=value pairs and we can use basic regex to capture them.
  • Message
  • Time
  • Date
  • Component
  • Context
  • Type
  • Thread
  • File

To satisfy the local and UTC timestamps requirement, I had to add some additional regex capture groups that were then passed to the [DateTime]::ParseExact .NET method.  I then added the fields into a [PSCustomObject].  The regex and object were then wrapped in a foreach loop that processes log lines read in via Get-Content.  Since I also had a requirement of passing in multiple files and showing which line came from which log, I captured the current file name via Split-Path and also added that as a property to the object.  And to support the ability to specify logs through the pipeline, a Process block was added and then a loop to read in each file.
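A compressed sketch of that approach follows (the full function is on Github); the regex follows the CMTrace line format (`<![LOG[...]LOG]!><time="..." date="..." ...>`), and the timezone handling here is a simplification of what the capture groups feed into ParseExact:

```powershell
# Sketch of the parsing loop: regex capture groups for each field, a
# [PSCustomObject] per line, Split-Path for the file name, and a Process
# block so files can arrive via the pipeline.
function Get-CMLog {
    param(
        [Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)]
        [Alias('FullName')]
        [string[]]$Path
    )
    process {
        foreach ($LogFile in $Path) {
            $FileName = Split-Path -Path $LogFile -Leaf
            foreach ($Line in (Get-Content -Path $LogFile)) {
                if ($Line -match '<!\[LOG\[(?<Message>.*)\]LOG\]!><time="(?<Time>[\d:\.]+)(?<TZ>[+-]\d+)" date="(?<Date>[\d-]+)" component="(?<Component>.*?)" context="(?<Context>.*?)" type="(?<Type>.*?)" thread="(?<Thread>.*?)" file="(?<File>.*?)">') {
                    $LocalTime = [DateTime]::ParseExact("$($Matches.Date) $($Matches.Time)", 'MM-dd-yyyy HH:mm:ss.fff', $null)
                    [PSCustomObject]@{
                        # The captured bias is treated as minutes to add to
                        # reach UTC (simplified offset handling)
                        UTCTime   = $LocalTime.AddMinutes([int]$Matches.TZ)
                        LocalTime = $LocalTime
                        FileName  = $FileName
                        Component = $Matches.Component
                        Context   = $Matches.Context
                        Type      = $Matches.Type
                        Thread    = $Matches.Thread
                        File      = $Matches.File
                        Message   = $Matches.Message
                    }
                }
            }
        }
    }
}
```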

Meeting these requirements allows us to do fun stuff like the following:

Show me the log entries in SMSTS.log
   Get-CMLog smsts.log

Show me all log entries for CM logs and sort them by date
   Get-ChildItem -Path C:\Windows\CCM\Logs | Get-CMLog | Sort-Object UTCTime

Show me any log entry with "Error" in the message text
   Get-CMLog -Path .\SMSTS.log | ?{$_.Message -match 'Error'}

Show me all log entries for logs that contain an error
   Get-ChildItem -Path C:\Windows\CCM\Logs | Select-String -Pattern 'error' | Select -Unique Path | Get-CMLog

The full code is available below and on Github.