A Guide to Subject Matter Expert Tickets

In the intricate ecosystem of IT support, the quality of communication in ticket submissions can significantly influence the efficiency of problem resolution. Imagine walking into a dense forest, each tree representing a different issue or ticket awaiting resolution. Just as a seasoned guide can navigate these woods with ease, providing clear paths and descriptions, a Subject Matter Expert (SME) in IT can illuminate the way to swift solutions with well-crafted tickets.

The Spectrum of Ticket Details

Venture into the thicket of daily IT support tickets, and you’ll encounter a wide array of communication styles. On one end, there are tickets like faint, barely noticeable trails – vague, minimal details offered by users unsure of what information is pertinent. Bob from manufacturing, for example, might simply state, “My computer won’t turn on,” leaving the path to resolution obscured by underbrush.

Contrastingly, tickets from more technically adept users, like Jan from accounting, are akin to well-trodden paths through the forest, marked by signs and clear directions. Jan not only mentions reinserting cables and attempting to power on her computer but also notes the absence of the usual boot-up text, laying breadcrumbs for IT support to follow towards a solution.

Crafting a Map to Resolution

Subject Matter Experts (SMEs) stand as the rangers of this forest, armed with the knowledge and tools to guide others through even the densest undergrowth. Here’s how they can effectively chart the course:

  • Know Your Audience: Just as a ranger alters their guidance based on the experience of the hikers, SMEs should tailor their ticket submissions to the technical level of the IT support team. This ensures that the instructions are neither too complex for general support staff nor too simplistic for specialists.
  • Use a Structured Format: A structured ticket is like a map, offering a clear overview of the terrain at a glance. By organizing the issue, steps taken, and potential solutions logically, SMEs create a guide that others can follow easily, avoiding unnecessary detours (see the sample ticket after this list).
  • Prioritize Clarity and Brevity: In the dense forest of IT issues, clarity acts as a beacon, guiding the support team directly to the heart of the problem. SMEs should aim to illuminate the path with precise, concise language, ensuring no one gets lost in unnecessary details.
  • Offer Potential Solutions: Suggesting solutions or workarounds is akin to marking potential paths on a map. While not all may lead directly to the destination, they provide starting points, accelerating the journey towards resolution.
  • Include Visuals When Necessary: Sometimes, the most effective way to describe a landscape is through visuals. Diagrams, screenshots, and videos can serve as snapshots of the issue, offering immediate context and understanding.
  • Encourage Open Communication: Ending a ticket with an invitation for questions is like leaving a trail of markers for others to follow, ensuring that if the path becomes unclear, further guidance is just a call away.
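
For example, a structured SME ticket might look something like this (the names and details are invented purely for illustration):

Subject: Intermittent "paper jam" error on Accounting printer (no jam present)

Issue: The laser printer in Accounting reports a paper jam roughly every ten prints; no paper is actually jammed.
Steps Taken: Power-cycled the printer, reseated tray 2, updated the driver to the latest version.
Impact: Four users cannot print invoices reliably.
Suspected Cause: Worn pickup roller or a faulty jam sensor.
Attachments: Photo of the control-panel error.
Questions welcome - happy to provide more detail.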

Navigating the Forest Together

In the realm of IT, “Subject Matter Expert Tickets” are more than just requests for assistance; they’re opportunities for SMEs to lead by example, demonstrating how detailed, well-structured communication can streamline the resolution process. It’s about creating a collaborative environment where every ticket, like a trail in the forest, is clearly marked and navigable, leading to a more efficient, effective IT support system.

By adopting these strategies, SMEs not only enhance their own credibility but also contribute to a culture of clarity and cooperation, ensuring that the vast forest of IT support is a little easier for everyone to navigate.

Resolving KB5034439 error

While installing the KB5034439 update, I received error 0x80070643. Google failed me over and over: every post I found talked about using DISM commands to repair the update, and none of them resolved the issue. Finally, Microsoft published a useful article about the update. It stated that the update will fail if your recovery partition has less than 250 MB of free space. Well, my recovery partition was only 500 MB, with only 83 MB free. I will go over how to find that information. In the end, resolving the KB5034439 error was as simple as expanding the recovery partition.
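
For reference, the commonly suggested repair commands (which did not help in my case) looked like this:

DISM /Online /Cleanup-Image /ScanHealth
DISM /Online /Cleanup-Image /RestoreHealth
sfc /scannow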

Finding the Recovery Partition Size

To find the recovery partition size, I used a simple PowerShell script. The idea behind the script is to grab the drives and partitions and do some simple math. Of course, the original came from Super User; all I did was tweak it a little to fit my needs.

$disksObject = @()
Get-WmiObject Win32_Volume -Filter "DriveType='3'" | ForEach-Object {
    $VolObj = $_
    $ParObj = Get-Partition | Where-Object { $_.AccessPaths -contains $VolObj.DeviceID }
    if ( $ParObj ) {
        $disksObject += [pscustomobject][ordered]@{
            DiskID = $([string]$($ParObj.DiskNumber) + "-" + [string]$($ParObj.PartitionNumber)) -as [string]
            Mountpoint = $VolObj.Name
            Letter = $VolObj.DriveLetter
            Label = $VolObj.Label
            FileSystem = $VolObj.FileSystem
            'Capacity(MB)' = ([Math]::Round(($VolObj.Capacity / 1MB),2))
            'FreeSpace(MB)' = ([Math]::Round(($VolObj.FreeSpace / 1MB),2))
            'Free(%)' = ([Math]::Round(((($VolObj.FreeSpace / 1MB)/($VolObj.Capacity / 1MB)) * 100),0))
        }
    }
}
$disksObject | Sort-Object DiskID | Format-Table -AutoSize

What this script does is grab the fixed volumes on the machine (DriveType 3), which excludes detachable drives like USB sticks. Then we loop through each volume and grab the partition whose access paths include that volume's device ID. From there we just pull the data out and do basic math. Finally, we display the information. The important part of the output is the row with the Recovery label: if its free space is less than 250 MB, we have some work to do.
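
For context, the Recovery row of the output on my server looked roughly like this (the DiskID and mount point will differ on your machine; the sizes are the real numbers from my case):

DiskID Mountpoint       Letter Label    FileSystem Capacity(MB) FreeSpace(MB) Free(%)
------ ----------       ------ -----    ---------- ------------ ------------- -------
0-4    \\?\Volume{...}\        Recovery NTFS       500          83            17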

Clean up the Recovery Partition

The first thing I tried was using cleanmgr to clean up the disk. Running it as an administrator gives you the full set of options. Inside the Disk Cleanup tool, select everything you can; then, in the "More Options" tab, you can also clean up "System Restore and Shadow Copies". After doing these items, run the script again and see if you have enough space. In my case I did not: cleaning up did not resolve the KB5034439 error.
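
If you don't want to hunt through the Start menu, a quick way to launch Disk Cleanup elevated from PowerShell:

Start-Process cleanmgr -Verb RunAs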

Growing Recovery

So the first thing I had to do was go into the partition manager on my server. The recovery partition, in my case, was at the end of the drive, and the partition next to it was thankfully my main partition. I shrank my main partition by 1 GB. That was the easy part. Now the hard part: I had to rebuild the recovery partition inside that freed-up space. Here are the steps.
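
If you prefer to do the shrink in PowerShell instead of the GUI, the built-in Storage module can handle it. A minimal sketch, assuming C: is the partition next to Recovery; verify your layout with Get-Partition before resizing anything:

# Shrink C: by 1 GB to leave room for the rebuilt recovery partition.
$part = Get-Partition -DriveLetter C
Resize-Partition -DriveLetter C -Size ($part.Size - 1GB)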

  1. Start CMD as administrator.
  2. Run reagentc /disable to disable the recovery environment.
  3. Run diskpart.
  4. Run list disk to find your disk.
  5. Run select disk # to choose the disk you wish to edit.
  6. Run list partition to see your partitions. We want the Recovery partition.
  7. Run select partition #. In my case, partition 4 was the recovery partition.
  8. Run delete partition override. This deletes the partition. If you don't have the right one selected, get your backups out.
  9. Run list partition to confirm the partition is gone.
  10. Now, inside your partition manager, click Action > Refresh.
  11. Select the free space and choose New Simple Volume.
  12. In the Assign Drive Letter or Path window, select the "Do not assign a drive letter or drive path" radio button and click Next.
  13. In the Format Partition window, change the volume label to Recovery and click Next. This creates the new partition.
  14. Navigate back to your command prompt running DiskPart.
  15. Run list partition.
  16. Run select partition # to select your new partition.

The next command depends on the type of disk you are working with. The first list disk output shows an asterisk in the Gpt column if the disk is a GPT disk.
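
If you would rather confirm the partition style from PowerShell than read DiskPart's table, this one-liner does it:

Get-Disk | Select-Object Number, FriendlyName, PartitionStyle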

GPT Disk

  1. Run set id=de94bba4-06d1-4d40-a16a-bfd50179d6ac
  2. Run gpt attributes=0x8000000000000001

MBR Disk

  1. Run set id=27.
  2. Run exit to exit DiskPart.
  3. Run reagentc /enable to re-enable the recovery partition.
    • This command moves the .wim file that the disable step created in c:\windows\system32\recovery back into the recovery partition.
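
To verify the recovery environment is healthy again, check its status; the output should show "Windows RE status: Enabled" with a location pointing at the new partition:

reagentc /info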

Finally, run your Windows updates. Resolving the KB5034439 error is a little scary if you have a more complex server. Thankfully, it wasn't that complex on my end. You will have to adapt this approach to match your setup.

Clear Google Cache with PowerShell

Yesterday I had to clear out a few users' Google Chrome caches. It was a little aggravating going computer by computer. We were doing this because a core web app had recently been updated; the application left traces of itself in the Chrome cache, and that caused all kinds of problems. So, for the last few machines, I looked for a way to do it with PowerShell. Lo and behold, you can clear the Google cache with PowerShell.

The Script

We are starting with the script itself because its core is wrapped around a remote-execution template I use. I will cover the template later down the road.

Function Clear-SHDGoogleCache {
    param (
        [parameter(
            ValueFromPipeline = $True,
            ValueFromPipelineByPropertyName = $True,    
            HelpMessage = "Hostname of target computer", 
            Mandatory = $true)][alias('ComputerName')][String[]]$Computer,
        [Parameter(
            HelpMessage = "Allows for custom Credential.")][string[]]$username,
        [Parameter(
            HelpMessage = "Allows for custom Credential.")][System.Management.Automation.PSCredential]$Credential
    )
    begin {
        if ($null -eq $username) { $username = "*" }
    }
    process {
        foreach ($PC in $Computer) {
            foreach ($user in $username) {
                $Parameters = @{
                    Computername = $PC
                    ScriptBlock  = {
                        if ($using:user -ne "*") {
                            # $using: carries the local $user variable into the remote session.
                            if (Test-Path "C:\Users\$using:user") {
                                Remove-Item "c:\users\$using:user\appdata\local\google\chrome\user data\default\cache\*" -Recurse -Force -ErrorAction SilentlyContinue
                                Remove-Item "c:\users\$using:user\appdata\local\google\chrome\user data\default\code cache\js\*" -Recurse -Force -ErrorAction SilentlyContinue
                                Remove-Item "c:\users\$using:user\appdata\local\google\chrome\user data\default\media cache\*" -Recurse -Force -ErrorAction SilentlyContinue
                                Remove-Item "c:\users\$using:user\appdata\local\google\chrome\user data\Default\Service Worker\CacheStorage\*" -Recurse -Force -ErrorAction SilentlyContinue
                                Remove-Item "c:\users\$using:user\appdata\local\google\chrome\user data\Default\Service Worker\ScriptCache\*" -Recurse -Force -ErrorAction SilentlyContinue
                            }
                            else {
                                Write-Error "$using:user is not present."
                            }
                        }
                        }
                        else {
                            Remove-Item "c:\users\*\appdata\local\google\chrome\user data\default\cache\*" -Recurse -Force -ErrorAction SilentlyContinue 
                            Remove-Item "c:\users\*\appdata\local\google\chrome\user data\default\code cache\js\*" -Recurse -Force -ErrorAction SilentlyContinue 
                            Remove-Item "c:\users\*\appdata\local\google\chrome\user data\default\media cache\*" -Recurse -Force -ErrorAction SilentlyContinue 
                            Remove-Item "c:\users\*\appdata\local\google\chrome\user data\Default\Service Worker\CacheStorage\*" -Recurse -Force -ErrorAction SilentlyContinue 
                            Remove-Item "c:\users\*\appdata\local\google\chrome\user data\Default\Service Worker\ScriptCache\*" -Recurse -Force -ErrorAction SilentlyContinue   
                        }
                    }
                    Asjob        = $true
                    JobName      = $PC
                }
                if ($PSBoundParameters.ContainsKey('Credential')) {
                    # $Parameters is a hashtable used for splatting, so add a key rather than a member.
                    $Parameters.Credential = $Credential
                }
                if (Test-Connection -ComputerName $PC -Count 1 -Quiet) {
                    try {
                        Invoke-Command @Parameters
                    }
                    catch {
                        Write-Warning "$PC Invoke Command Failed"
                    }
                }
                else {
                    Write-Warning -Message "$PC is offline"
                }
            }
        }
    }
    end {}
}

The Breakdown

Let's break down the script and see what is needed and how. The first thing you will notice is that both the computer and the username parameters are lists of strings ([String[]]). This means we loop through each one, so you can target a single user on multiple machines, multiple users on a single machine, or both. The second thing I want to point out is the credentials: if you are not running in an admin context, you can deploy the script with your admin creds or with the local admin creds.

The username is not a required parameter. Why? The simple answer is that if you don't supply a username, the script clears every user's Google Chrome cache. Notice in the begin block: if $username is null, we set it to *. Later we ask whether the user is not equal to *; if so, we target that user, and if not, the * wildcard goes through all the users at once. Also notice that when we process a named user, we test whether the user's profile exists. If it doesn't, we throw an error; if it does, we do our work.

if ($null -eq $username) { $username = "*" }
if ($user -ne "*") { do the user
    if (Test-Path C:\Users\$user) { it's there, let's go for it } else { error, Will Robinson }
} else { do everyone }

The Core

At the core of this script is the Remove-Item block. We go through each user's data folder and clear out the different types of cache: the default cache plus the code, media, storage, and script caches. Each of these folders has folders inside it, so we need -Recurse. We want to force it, and we don't care about the errors, as some cache files will not delete while Chrome is active. Could I have added a step to kill Chrome? Yes, but why? If the end user is working in Chrome, that would be disruptive and force them to restart the browser. Let's look at the code.

Remove-Item "c:\users\$user\appdata\local\google\chrome\user data\default\cache\*" -Recurse -Force -ErrorAction SilentlyContinue
Remove-Item "c:\users\$user\appdata\local\google\chrome\user data\default\code cache\js\*" -Recurse -Force -ErrorAction SilentlyContinue
Remove-Item "c:\users\$user\appdata\local\google\chrome\user data\default\media cache\*" -Recurse -Force -ErrorAction SilentlyContinue
Remove-Item "c:\users\$user\appdata\local\google\chrome\user data\Default\Service Worker\CacheStorage\*" -Recurse -Force -ErrorAction SilentlyContinue
Remove-Item "c:\users\$user\appdata\local\google\chrome\user data\Default\Service Worker\ScriptCache\*" -Recurse -Force -ErrorAction SilentlyContinue

That's all the code you will need. If you chose not to supply a username, $user becomes *, which covers every profile, including Default. If you have something like SCCM, PDQ, Intune, Ninja, Pulse, etc., just use this part of the code with * instead of a username. This will clear the cache as needed.
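
For convenience, here is that wildcard version pulled out as a standalone sketch for RMM deployment (run it in a context that can reach every profile, such as SYSTEM):

# Clears Chrome caches for every profile on the machine.
# Errors are suppressed because files locked by a running Chrome won't delete.
$paths = @(
    'c:\users\*\appdata\local\google\chrome\user data\default\cache\*'
    'c:\users\*\appdata\local\google\chrome\user data\default\code cache\js\*'
    'c:\users\*\appdata\local\google\chrome\user data\default\media cache\*'
    'c:\users\*\appdata\local\google\chrome\user data\Default\Service Worker\CacheStorage\*'
    'c:\users\*\appdata\local\google\chrome\user data\Default\Service Worker\ScriptCache\*'
)
foreach ($path in $paths) {
    Remove-Item $path -Recurse -Force -ErrorAction SilentlyContinue
}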

We close it all up and send it as a job to the machine in question. This way we are not stuck waiting on each computer; it speeds things up. With PowerShell 7, you could also run the loop in parallel across as many computers as you like, which would speed this up even more.
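
As a sketch of the PowerShell 7 approach (the cache-clearing logic is elided here; -ThrottleLimit caps how many computers run at once):

# Requires PowerShell 7+ for ForEach-Object -Parallel.
$Computer | ForEach-Object -Parallel {
    Invoke-Command -ComputerName $_ -ScriptBlock {
        # ... same Remove-Item cache-clearing lines as above ...
    }
} -ThrottleLimit 5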


Slow ADUC on VPN

As in my last post, I have been in IT for many years. At every place I have worked, even the MSP, I have seen Active Directory Users and Computers (ADUC) take a really long time to load, and it is often very slow while on VPN. I was finally challenged to find out why.

Reasons

Apparently, there are hundreds of reasons for it being slow. I have seen it slow on GlobalConnect, OpenVPN, Cisco's AnyConnect, WatchGuard, and more. Ultimately, the issue is with how ADUC communicates via DNS.

Yes, it’s a DNS problem.

The solutions for a slow ADUC on VPN

Point to the server’s IP instead of the DNS name.

If you right-click ADUC in the Start menu, you can open Properties. From there, you can append /server="<Your Server's IP Address>" to the target, and this should resolve the issue. The load time went from 5 minutes to 10 seconds. I'm not entirely sure of the underlying cause, but this fix worked well.
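
The same switch should also work when launching ADUC from a Run prompt or terminal; the address below is just a placeholder for your domain controller's IP:

dsa.msc /server="10.0.0.5"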

A registry fix

Here is a registry fix that seems to work on some machines. I tested it on Windows 10 and 11. I was unable to test it on multiple network stacks, just my pfSense and Untangle stacks, so let me know if these keys work for you.

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
"EnablePMTUDiscovery"=dword:00000000
"EnablePMTUBHDetect"=dword:00000000
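
If you would rather push the same values with PowerShell than import a .reg file, a minimal sketch:

# Same two TCP/IP parameters as the registry snippet above.
$key = 'HKLM:\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters'
Set-ItemProperty -Path $key -Name 'EnablePMTUDiscovery' -Value 0 -Type DWord
Set-ItemProperty -Path $key -Name 'EnablePMTUBHDetect' -Value 0 -Type DWord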

Disable IPv6

If your network doesn't need IPv6, sometimes disabling IPv6 will resolve these issues.
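
One way to do that across all adapters (a sketch using the built-in NetAdapter module; it is reversible with Enable-NetAdapterBinding, but test before rolling it out):

# Unbind IPv6 from every network adapter.
Disable-NetAdapterBinding -Name '*' -ComponentID 'ms_tcpip6'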

In theory, these solutions should resolve a slow ADUC on VPN. However, in some cases, they will not.


Multi-Account Containers

I have been in IT for a little over 10 years and have tried various browsers and plugins/extensions. Some are extremely useful, and some are not. I abandoned Firefox for a while because it was not compatible with software I was required to use. Recently I have returned to Firefox because of Multi-Account Containers.

Firefox has a unique extension that no other browser offers: Multi-Account Containers. What it does is allow you to open each tab in a container of its own.

What is Multi-Account Containers?

The extension provides containers, and each container holds all of its own cached items. For example, if you log into O365 in one container, you can log into a different O365 tenant in another container. Unlike incognito mode, you can still work with items that need to cache on your computer, like Exchange Online.

If you are in the MSP world, this is a game-changer. You can have a container for each of your clients and work solely out of that container for that client. For in-house IT, it allows you to test as a normal user versus an IT admin. Even in your home life, the added layer of separation helps with your banking and personal items; this way Facebook doesn't leak into your bank account's cache.

My favorite feature

When Firefox starts, you get a screen full of tabs for previously opened or most-visited sites. You can right-click each one and open it in a different container. I can do this with links, and even with the + for a new tab. I can dedicate a tab just for my company and a tab just for personal use. This way my O365 doesn't affect a client's O365.

And yes, this beast is only available on Firefox and Firefox offshoots. So, long live Firefox!

As always, if you have any questions feel free to ask.


LAPS Password With PowerShell

A few of my clients use something called LAPS. LAPS changes the local administrator password on a computer and stores it in Active Directory. Since I don't dive deep into these clients' computers often, I needed a way to type the first few letters of the computer in question and pull up its LAPS password. Basically, I needed a list of computer names available from the command itself. This is fully possible with dynamic parameters. So, today we will be grabbing the LAPS password with PowerShell.

Where Does the LAPS Password Live?

Most companies that set up LAPS do so with Active Directory. By default, Active Directory saves the password in an attribute called "ms-Mcs-AdmPwd", and LAPS stores the expiration date in "ms-Mcs-AdmPwdExpirationTime". Thus, all you have to do is call Get-ADComputer and pull out the information.

Get-ADComputer -Filter { name -like $Computer } -Properties name, 'ms-Mcs-AdmPwd', 'ms-Mcs-AdmPwdExpirationTime' | Select-Object name, 'ms-Mcs-AdmPwd', 'ms-Mcs-AdmPwdExpirationTime'

Now, "ms-Mcs-AdmPwdExpirationTime" is stored as a file time and needs to be parsed into something more readable. We can use the [datetime]::FromFileTime() method to do this.

Get-ADComputer -Filter { name -like $Computer } -Properties name, 'ms-Mcs-AdmPwd', 'ms-Mcs-AdmPwdExpirationTime' | Select-Object Name, @{l = "AdminPassword"; e = { $_."ms-Mcs-AdmPwd" } }, @{l = "AdminPasswordExpireTime"; e = { [datetime]::FromFileTime(($_."ms-Mcs-AdmPwdExpirationTime")) } }

There you have it; that's how you get the LAPS password. But I want to take this one step further. I don't know all the computer names, and I want that information at my fingertips while I type the command. I want to type something like Get-LapsPassword -ComputerName <tab through populated names>, and bam, it gives me the password when I hit enter. That's where dynamic parameters come in.

Adding Dynamic Computer Name Parameters

In a previous article, we went over how to make a dynamic parameter. To help refresh memories, I'll build a single dynamic parameter and show how it can be done with this function.

The first thing we need to do is create our scaffold. This scaffold gives us the blocks where the dynamic parameter is defined and consumed. It is more of a road map than anything else, but it is required for dynamic parameters.

function Get-LapsPassword {
    [cmdletbinding()]
    Param()
    DynamicParam {}
    Begin {}
    Process {}
    End {}
}

The first part of our dynamic parameter is naming it; we store the name in a template variable. From there, we create a new object: a System.Collections.ObjectModel.Collection of System.Attribute, i.e., an attribute collection. Then we make a ParameterAttribute object to add to that collection later.

Building Out Objects

$paramTemplate = 'ComputerName' 
$AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
$ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute

The ParameterAttribute object is where we add flags like Mandatory and Position. We add them by setting properties directly on the ParameterAttribute object. A fun little fact: you can tab through and see what other properties are available on this object. Things like the help message, the value from the pipeline, and more are hidden here. Today we only care about Mandatory and Position.

$ParameterAttribute.Mandatory = $true
$ParameterAttribute.Position = 1

After we build out our ParameterAttribute object, we need to add it to the attribute collection we made at the start. We do this by using the .Add() method.

$AttributeCollection.Add($ParameterAttribute)

Now we need to create another object: the runtime parameter dictionary, which is what PowerShell will look through at bind time. This is a System.Management.Automation.RuntimeDefinedParameterDictionary. Say that ten times fast…

More Objects

$RuntimeParameterDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary

Now we need to make our ValidateSet. We will create an array of device names using the Get-ADComputer command: we push (Get-ADComputer -Filter { enabled -eq "true" }).name into a variable, giving us a list of active computers. Notice that we strip out all other information by using the .name property.

$ParameterValidateSet = (Get-ADComputer -Filter { enabled -eq "true" -and OperatingSystem -Like '*Windows*' -and OperatingSystem -notlike "*Server*" }).name

Next, we need to create another object: a System.Management.Automation.ValidateSetAttribute. We feed this object our $ParameterValidateSet array.

$ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($ParameterValidateSet)

Afterward, it’s time to feed the Validate Set attribute to the attribute collection from the beginning. We can accomplish this by using the “.add()” method.

$AttributeCollection.Add($ValidateSetAttribute)

Next, it's time to bring our attribute collection into the command line by making the runtime parameter. Once again, a new object: this time a RuntimeDefinedParameter. Like the last object, we feed our data directly into it: the parameter's name, its type (a string in this case), and the attribute collection.

$RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($paramTemplate, [string], $AttributeCollection)

Afterward, we take the above parameter and place it into our dictionary with the .Add() method. We need the parameter name from the template and the runtime parameter.

$RuntimeParameterDictionary.Add($paramTemplate, $RuntimeParameter) 

Finally, at the end of the DynamicParam block, we return our dictionary.

return $RuntimeParameterDictionary

Beginning

We are almost done. It's time to bring the dynamic parameter into the function and make it usable. We do this in the Begin block: we shove the PSBoundParameters value for our template name into a variable.

$MemberName = $PSBoundParameters[$paramTemplate]

Then, from there, we call $MemberName in our Get-ADComputer command, as shown below.
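
In this function, that lookup is the one line of the Process block (shown in full in the script in the next section):

$ComputerInfo = Get-ADComputer -Filter { name -like $MemberName } -Properties *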

The Script

It's that time: time to put it all together so you can copy and paste it into your toolbox. It's time to grab the LAPS password with PowerShell.

function Get-LapsPassword {
    [cmdletbinding()]
    Param()
    DynamicParam {
        # Need dynamic parameters for Template, Storage, Project Type
        # Set the dynamic parameters' name
        $paramTemplate = 'ComputerName' 
        # Create the collection of attributes
        $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
        # Create and set the parameters' attributes
        $ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
        $ParameterAttribute.Mandatory = $true
        $ParameterAttribute.Position = 1
        # Add the attributes to the attributes collection
        $AttributeCollection.Add($ParameterAttribute)
        # Create the dictionary 
        $RuntimeParameterDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
        # Generate and set the ValidateSet
        $ParameterValidateSet = (Get-ADComputer -Filter { enabled -eq "true" }).name
        $ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($ParameterValidateSet)
        # Add the ValidateSet to the attributes collection
        $AttributeCollection.Add($ValidateSetAttribute) 
        # Create and return the dynamic parameter
        $RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($paramTemplate, [string], $AttributeCollection)
        $RuntimeParameterDictionary.Add($paramTemplate, $RuntimeParameter) 
        return $RuntimeParameterDictionary
    } # end DynamicParam
    BEGIN {
        $MemberName = $PSBoundParameters[$paramTemplate]
    } # end BEGIN
    Process {
        $ComputerInfo = Get-ADComputer -Filter { name -like $MemberName } -Properties * 
    }
    End {
        $ComputerInfo | select-object Name, @{l = "AdminPassword"; e = { $_."ms-Mcs-AdmPwd" } }, @{l = "AdminPasswordExpireTime"; e = { [datetime]::FromFileTime(($_."ms-Mcs-AdmPwdExpirationTime")) } }
    }
}
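
Usage looks like this; after typing -ComputerName you can tab through the names the ValidateSet pulled from AD (SRV-01 below is a made-up example name):

# Tab-complete through your domain's computer names:
Get-LapsPassword -ComputerName SRV-01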
