Resource – SHD Get ACL

Ever need to combine Get-ChildItem and Get-Acl while pulling only the access information and users? Well, here we are. I hope you can put it to good use.

function Get-SHDACL {
    [cmdletbinding()]
    param (
        [parameter(Mandatory = $true)][string]$Path,
        [string]$Filter,
        [switch]$Recurse,
        [switch]$Directory,
        [switch]$File
    )
    begin {
        if ($PSBoundParameters.ContainsKey("filter")) {
            if ($File) {
                if ($Directory) {
                    if ($Recurse) {
                        $SubPath = Get-ChildItem -Path $Path -Recurse -Directory -File -Filter $Filter
                    } else {
                        $SubPath = Get-ChildItem -Path $Path -Directory -File -Filter $Filter
                    }
                } else {
                    if ($Recurse) {
                        $SubPath = Get-ChildItem -Path $Path -Recurse -File -Filter $Filter
                    } else {
                        $SubPath = Get-ChildItem -Path $Path -File -Filter $Filter
                    }
                }
            } else {
                if ($Directory) {
                    if ($Recurse) {
                        $SubPath = Get-ChildItem -Path $Path -Recurse -Directory -filter $Filter
                    } else {
                        $SubPath = Get-ChildItem -Path $Path -Directory -Filter $Filter
                    }
                } else {
                    if ($Recurse) {
                        $SubPath = Get-ChildItem -Path $Path -Recurse -Filter $Filter
                    } else {
                        $SubPath = Get-ChildItem -Path $Path -Filter $Filter
                    }
                }
            }
        } else {
            if ($File) {
                if ($Directory) {
                    if ($Recurse) {
                        $SubPath = Get-ChildItem -Path $Path -Recurse -Directory -File
                    } else {
                        $SubPath = Get-ChildItem -Path $Path -Directory -File
                    }
                } else {
                    if ($Recurse) {
                        $SubPath = Get-ChildItem -Path $Path -Recurse -File
                    } else {
                        $SubPath = Get-ChildItem -Path $Path -File
                    }
                }
            } else {
                if ($Directory) {
                    if ($Recurse) {
                        $SubPath = Get-ChildItem -Path $Path -Recurse -Directory
                    } else {
                        $SubPath = Get-ChildItem -Path $Path -Directory
                    }
                } else {
                    if ($Recurse) {
                        $SubPath = Get-ChildItem -Path $Path -Recurse
                    } else {
                        $SubPath = Get-ChildItem -Path $Path
                    }
                }
            }
        }
    }
    Process {
        
        foreach ($Sub in $SubPath) {
            
            $ACLinfo = Get-Acl -Path $sub.FullName
            $Return += foreach ($ACL in $ACLinfo.Access) {
                [pscustomobject]@{
                    Path             = $Sub.FullName
                    FileSystemRights = $ACL.FileSystemRights
                    ID               = $ACL.IdentityReference
                    Type             = $ACL.AccessControlType
                    Inherited        = $ACL.IsInherited
                }
            }
        }
    }
    end {
        $Return
    }
}
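
A quick usage sketch (the share path below is made up, not from the original post): pull the access entries for every folder under a share and review them in a table.

#Hypothetical example path; adjust to your environment.
Get-SHDACL -Path '\\FileServer\Shared' -Recurse -Directory |
    Format-Table Path, ID, FileSystemRights, Type, Inherited -AutoSize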

Exchange Online – Mailbox Size Audit

Here is a little powerhouse script I wrote to audit mailbox sizes. The recent focus is to see whose mailboxes are about to exceed their limits and whether the deleted items folder is the culprit. I can't show you all of the scripts, but I can show you part of it. This part pulls the data down in such a way that you can view the mailbox sizes, who has the largest mailboxes, which folder is the largest in those mailboxes, and what their deleted items folder is sitting at. Let us take a look at the script itself.

The Script

function Get-SHDEXOMailboxSizeAudit {
    [cmdletbinding()]
    param (
        [pscredential]$Credential
    )
    Begin {
        
        #Installs required modules
        Write-Verbose "Installing required modules"
        if (!(Get-InstalledModule ExchangeOnlineManagement)) { Install-Module ExchangeOnlineManagement }

        Write-Verbose "Checking and importing required modules"
        # Starts importing required modules
        if (!(Get-Command Connect-ExchangeOnline)) { Import-Module ExchangeOnlineManagement }

    }
    process {
        
        #Connecting to Exchange Online. Grabbing credentials.
        if (!($PSBoundParameters.ContainsKey('Credential'))) {
            $Credential = (Get-Credential) 
        }
        Connect-ExchangeOnline -credential $Credential
 
        #Grabs all the mailboxes at once. As this report will look at all the mailboxes. 
        $Mailboxes = Get-Mailbox

        #Starts looping through each mailbox
        For ($M = 0; $M -lt $Mailboxes.count; $M++) {

            Write-Verbose "Gathering Info on: $($Mailboxes[$M].UserPrincipalName)"

            #Grabs the mailbox stats.
            $Stats = Get-EXOMailboxStatistics $mailboxes[$M].UserPrincipalName
            $FolderStats = Get-EXOMailboxFolderStatistics $mailboxes[$M].UserPrincipalName

            #Starts looping through those folders to get their sizes. 
            $MainFolderStats = foreach ($Folder in $FolderStats) {
                $FolderSize = [math]::Round((($Folder.FolderSize.split('(')[1].split(' ')[0] -replace ',', '') / 1gb), 2)
                [pscustomobject]@{
                    Name = $Folder.name
                    Size = $FolderSize
                }
            }

            #Adds this information to the mailbox object
            $Mailboxes[$M] | add-member -Name "Statistics" -MemberType NoteProperty -Value $Stats
            $Mailboxes[$M] | add-member -Name "FolderStats" -MemberType NoteProperty -Value $MainFolderStats
        }

        #Creates a return value. 
        $Return = foreach ($Mail in $Mailboxes) {

            #Starts looking at mailboxes that are not the discovery mailbox.
            if (!($mail.UserPrincipalName -like "DiscoverySearchMailbox*")) {

                #Grabs the deleted folder as that is a folder we want to see in this report. 
                $Deleted = $Mail.FolderStats | where-object { $_.name -like "Deleted Items" }

                #Grabs the largest folder. If  it's not the deleted folder, then we might want to increase their mailbox sizes.
                $LargestFolder = $Mail.FolderStats | Sort-Object Size -Descending | select-object -first 1 

                #Doing some math on a string. The string format (# bytes). Thus, we work the string to get the bytes. Divide and round. 
                $Size = [math]::Round(($Mail.Statistics.TotalItemSize.value.tostring().split('(')[1].split(' ')[0].replace(',', '') / 1gb), 2)

                #Grabs the mailboxes percentage.
                $DeletedToMailboxPercent = [math]::Round((($Deleted.Size / $size) * 100), 0)

                #Outputs the data to the return value. 
                [pscustomobject]@{
                    DisplayName             = $Mail.Displayname
                    UserPrincipalName       = $Mail.UserPrincipalName
                    MailboxSize             = $Size
                    LargestFolder           = $LargestFolder.Name
                    LargestFolderSize       = $LargestFolder.Size
                    DeletedItemSize         = $Deleted.Size
                    DeletedToMailboxPercent = $DeletedToMailboxPercent
                }
            }
        }
        
        #Disconnects exchange
        Disconnect-ExchangeOnline -confirm:$false > $null
    }
    End { 
        $Return | sort-object MailboxSize -Descending 
    }
}

The breakdown

First, this script is designed to work on PowerShell 7. It will not work on PowerShell 5.

The first part we come to is the [pscredential] parameter. Notice it's not mandatory, and notice there's no pipelining either. I have found that piped PSCredentials tend to behave poorly. So, it's simple: bam, wham, done.

Begin

Inside our begin block, we have the module setup. We check the installed modules for ExchangeOnlineManagement; if it's there, we leave it alone, and if it's not, we install it. Importing works the same way: if the module is already imported, we do nothing; if it's not, we import it.

Process

Next, we grab credentials if need be and connect to Exchange. We use the Get-Credential command to grab the credentials, then the Connect-ExchangeOnline command to connect to Exchange Online.

Once we are connected, the magic can start. This is the sweetness of this script. The first step is to grab all of the mailboxes at once with Get-Mailbox. Then we start a loop, and not just any loop, a for loop. Wait! David, WHY A FOR LOOP? It's simple: we want to add information to each mailbox by its index, so we need to be able to call the index. We use Get-EXOMailboxStatistics with the UserPrincipalName of the index we are looking at, and we do the same thing with Get-EXOMailboxFolderStatistics. These give us some clear stats we can use later on in the script.

Now we loop through the folder stats we just collected and math ourselves some bytes into GB. The output of Get-EXOMailboxFolderStatistics looks like "24 MB (#### bytes)", so we need to work the string to get the ####. I use a split to do this: I split the string at the '(', which puts the bytes in element 1. Then we split again by the space, and the bytes land in element 0. Then we replace the ',' with nothing. Now we divide all of that by 1GB to convert it to gigabytes and drop it into $MainFolderStats. Finally, we add all of that information onto the mailbox object using Add-Member, adding it as a NoteProperty.
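
To make the string surgery concrete, here is the same split/replace/divide applied to a made-up folder-size string (the value below is invented for illustration):

#Hypothetical example value in the same "size (bytes)" format the cmdlet returns.
$FolderSizeString = '1.2 GB (1,288,490,189 bytes)'
$Bytes = $FolderSizeString.Split('(')[1].Split(' ')[0] -replace ',', ''
$SizeGB = [math]::Round(([double]$Bytes / 1GB), 2)  #Returns 1.2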

Now that we have prepped our data, it's time to sort it. We start the loop once more, but this time it's a simple foreach loop, as we don't need the array index. We first grab the deleted items folder. Then we grab the largest folder using Sort-Object on the Size property of the FolderStats note property we made in the last step. Then we do some math like before to get the overall mailbox size. Finally, we grab the deleted-items percentage with a little more math; this time it's percentage math. Now that we have all of this useful information, we use our pscustomobject and put it all together.

Then we disconnect using the Disconnect-ExchangeOnline command.

End

Finally, we display the return information inside our end block, sorting the objects by mailbox size in descending order.

SHD – Quickbook Search

This past week I needed to find all of the QuickBooks files on a computer without accessing QuickBooks itself. The core of the script is a simple Get-ChildItem command looped for each logical disk on the machine, looking for one of the four main QuickBooks extensions. The four extensions are as follows:

Name                   Extension
Company Files          qbw
Backup Files           qbb
Portable Files         qbm
Bank Statement Files   qbo

The Script

function Search-QuickbookFiles {
    [cmdletbinding()]
    Param (
        [parameter(Mandatory = $True)][Validateset("Company Files", "Backup Files", "Portable Files", "Bank Statement Files")][string]$FileType
    )
    if ($FileType -like "Company Files") { $EXT = "QBW" }
    if ($FileType -like "Backup Files") { $EXT = "QBB" }
    if ($FileType -like "Portable Files") { $EXT = "QBM" }
    if ($FileType -like "Bank Statement Files") { $EXT = "QBO" }

    $Disks = Get-CimInstance -ClassName win32_logicaldisk 
    $QB = foreach ($Disk in $Disks) {
        Get-ChildItem -Path "$($Disk.DeviceID)\" -Filter "*.$EXT" -Recurse | Select-Object FullName,Length,LastWriteTime
    }
    $QB
}
$DateTime = (Get-Date).tostring("yyyy-MM-dd_hh-mm-ss")
if (!(Test-Path C:\temp)) {mkdir c:\temp}
Search-QuickbookFiles -FileType 'Company Files' | Export-Csv "c:\temp\$($Env:COMPUTERNAME)_CompanyFiles_$($DateTime).CSV"
Search-QuickbookFiles -FileType 'Backup Files' | Export-Csv "c:\temp\$($Env:COMPUTERNAME)_BackupFiles_$($DateTime).CSV"
Search-QuickbookFiles -FileType 'Portable Files' | Export-Csv "c:\temp\$($Env:COMPUTERNAME)_PortableFiles_$($DateTime).CSV"
Search-QuickbookFiles -FileType 'Bank Statement Files' | Export-Csv "c:\temp\$($Env:COMPUTERNAME)_BankStatementFiles_$($DateTime).CSV"

The Breakdown

Since I want this to be a little easier to use, I broke it down by file type with a ValidateSet parameter. This way you can choose which extension you want. Then I go through each file type with an if statement that matches it up to its extension.
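
If you would rather avoid the four separate if statements, a lookup table does the same mapping. A minimal sketch, not part of the original function:

#Hypothetical alternative: map the friendly names to their extensions in one hashtable.
$ExtensionMap = @{
    'Company Files'        = 'QBW'
    'Backup Files'         = 'QBB'
    'Portable Files'       = 'QBM'
    'Bank Statement Files' = 'QBO'
}
$EXT = $ExtensionMap[$FileType]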

Next, we get the disks using Get-CimInstance -ClassName win32_logicaldisk. This grabs all the mapped drives, local drives, and anything else that has a drive letter.

Now we loop through those disks and search the root of each drive for any files with the extension we chose. We select FullName, as this gives us the full path. I also like having the file size and last write time to determine whether the file is still relevant. Once we finish the loop, we display the information.

Improvements

I can add remote computers to this setup.
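
One way that could look, assuming WinRM is enabled and the function is already defined in the local session (the computer names below are made up):

#Hypothetical sketch: run the search on remote machines over PowerShell remoting.
Invoke-Command -ComputerName 'PC01', 'PC02' -ScriptBlock ${function:Search-QuickbookFiles} -ArgumentList 'Company Files' |
    Export-Csv "c:\temp\RemoteCompanyFiles_$($DateTime).CSV" -NoTypeInformation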

That's it. If you have any questions, feel free to reach out.

SHD Resource – User to Groups

This little guy is a simple dynamic parameter resource for you all. Take a look at my previous blog post about how these parameters work; they are totally worth adding to your scripts. This script is simple: it uses Add-ADGroupMember (and Remove-ADGroupMember) along with dynamic parameters to help guard against mistakes.

The Script

function Set-SHDADGroupMemebers {
    [cmdletbinding()]
    param (
        [parameter(mandatory=$true)][validateset('Add','Remove')][string]$Action,
        [pscredential]$Credential,
        [switch]$Output
    )
    DynamicParam {
        
        
        # Set the dynamic parameters' name
        $ParamName_portgroup = 'Group'
        # Create the collection of attributes
        $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
        # Create and set the parameters' attributes
        $ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
        $ParameterAttribute.Mandatory = $true
        # Add the attributes to the attributes collection
        $AttributeCollection.Add($ParameterAttribute) 
        # Create the dictionary 
        $RuntimeParameterDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
        # Generate and set the ValidateSet 
        $arrSet = (Get-ADGroup -Filter *).name
        $ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($arrSet)    
        # Add the ValidateSet to the attributes collection
        $AttributeCollection.Add($ValidateSetAttribute)
        # Create and return the dynamic parameter
        $RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($ParamName_portgroup, [string], $AttributeCollection)
        $RuntimeParameterDictionary.Add($ParamName_portgroup, $RuntimeParameter)

        
        # Set the dynamic parameters' name
        $ParamName_datastore = 'Username'
        # Create the collection of attributes
        $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
        # Create and set the parameters' attributes
        $ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
        $ParameterAttribute.Mandatory = $true
        # Add the attributes to the attributes collection
        $AttributeCollection.Add($ParameterAttribute)  
        # Generate and set the ValidateSet 
        $arrSet = (Get-ADUser -Filter *).name
        $ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($arrSet)
        # Add the ValidateSet to the attributes collection
        $AttributeCollection.Add($ValidateSetAttribute)
        # Create and return the dynamic parameter
        $RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($ParamName_datastore, [string], $AttributeCollection)
        $RuntimeParameterDictionary.Add($ParamName_datastore, $RuntimeParameter)
        return $RuntimeParameterDictionary
    }

    begin{
        $Group = $PsBoundParameters[$ParamName_portgroup]
        $username = $PsBoundParameters[$ParamName_datastore] 
    }
    process {
        if ($PSBoundParameters.ContainsKey('Credential')) {
            if ($Action -like "Add") {
                Add-ADGroupMember -Identity $group -Members $username -Credential $Credential
            } elseif ($Action -like "Remove") {
                Remove-ADGroupMember -Identity $group -Members $username -Credential $Credential
            } else {
                Get-ADGroupMember -Identity $group -Credential $Credential
            }
        } else {
            if ($Action -like "Add") {
                Add-ADGroupMember -Identity $group -Members $username
            } elseif ($Action -like "Remove") {
                Remove-ADGroupMember -Identity $group -Members $username
            } else {
                Get-ADGroupMember -Identity $group
            }
        }
    }
    end {
        if ($Output) {
            if ($PSBoundParameters.ContainsKey('Credential')) {
                Get-ADGroupMember -Identity $Group -Credential $Credential
            } else {
                Get-ADGroupMember -Identity $group
            }
        }
    }
}

Example 1

Set-SHDADGroupMemebers -Action Add -Group Administrators -Username Adam.Long -Output

Adds Adam.Long to the Administrators group, then outputs all the users inside that group.

Example 2

Set-SHDADGroupMemebers -Action Remove -Group Administrators -Username Adam.Long

Removes Adam.Long from the Administrators group without outputting any additional information.

Upgrade Windows – Dell

I have a love-hate relationship with Dell. When it comes to Windows 10 upgrades, I really don't like them. How do you mass-push a Windows 10 upgrade to clients without breaking machines? As many of you know, the recent 20H2 update has broken many different Dell models. A quick way to guard against this is to compare the model against Dell's online list of tested systems. This script isn't that big, and it is designed to be deployed to the end user's machine and run as the system account or an admin account.

The Script

$DellSite = Invoke-WebRequest -Uri "https://www.dell.com/support/kbdoc/en-us/000180684/dell-computers-tested-for-windows-10-october-2020-update-and-previous-versions-of-windows-10" -DisableKeepAlive
$Dellraw = $DellSite.Rawcontent.split("`r")
$CPInfo = Get-ComputerInfo
if ($Dellraw | select-string $CPInfo.csmodel) {
    if (!(Test-Path "$($env:SystemDrive)\Temp\Win10Upgrade")) {New-Item "$($env:SystemDrive)\Temp\Win10Upgrade" -Type Directory}
    $DateTime = (Get-date).ToString("yyyy-MM-dd_hh-mm-ss")
    $webClient = New-Object System.Net.WebClient
    $url = 'https://go.microsoft.com/fwlink/?LinkID=799445'
    $file = "$($env:SystemDrive)\Temp\win10upgrade\Win10Update_$DateTime.exe"
    $webClient.DownloadFile($url, $file)
    Start-Process -FilePath $file -ArgumentList '/auto Upgrade /quiet /noreboot'
} else {
    Write-Error "$($env:COMPUTERNAME) is a $($CPInfo.CsModel) and is not on the approved list found at: https://www.dell.com/support/kbdoc/en-us/000180684/dell-computers-tested-for-windows-10-october-2020-update-and-previous-versions-of-windows-10"
}

The Breakdown

I'm glad you decided to stay for the breakdown. This breakdown isn't going to take long. The first element is the Invoke-WebRequest: we capture the website with the required information (link). Then we split the raw content by the carriage return.

$DellSite = Invoke-WebRequest -Uri "https://www.dell.com/support/kbdoc/en-us/000180684/dell-computers-tested-for-windows-10-october-2020-update-and-previous-versions-of-windows-10" -DisableKeepAlive
$Dellraw = $DellSite.Rawcontent.split("`r")

Now our web data is ready to pull from, and we need information from the computer itself. Most systems these days have the Get-ComputerInfo command, which pulls the system information for the computer. Next, we ask a simple if-then question: if $DellRaw has the model number, download and install the upgrade; if not, let us know. Basically, we need a bouncer at this point. We use $CPInfo.CsModel, as this is the model number.

$CPInfo = Get-ComputerInfo
if ($Dellraw | select-string $CPInfo.csmodel) {
    #Install the upgrade
} else {
    #Warning the installer program that it's not on the list. 
}

The Download and Install

We first ask if the folder is there; if it isn't, we create it using the New-Item command. Then we create a datetime stamp. Like in previous blogs, we use the .tostring() method to format the output. Then we declare the URL and file path. Next, we download the file; the full script above uses a System.Net.WebClient, and the snippet below shows the equivalent Invoke-WebRequest with the -OutFile flag. Finally, we start the install with the correct flags. In this case, we want the install to be an upgrade that is silent and doesn't force a restart, because it could be the middle of the day when this thing finishes up. To start the installer, we use the Start-Process command.

if (!(Test-Path "$($env:SystemDrive)\Temp\Win10Upgrade")) {New-Item "$($env:SystemDrive)\Temp\Win10Upgrade" -Type Directory}
$DateTime = (Get-date).ToString("yyyy-MM-dd_hh-mm-ss")
$url = 'https://go.microsoft.com/fwlink/?LinkID=799445'
$file = "$($env:SystemDrive)\Temp\win10upgrade\Win10Update_$DateTime.exe"
Invoke-WebRequest -Uri $url -OutFile $file
Start-Process -FilePath $file -ArgumentList '/auto Upgrade /quiet /noreboot'

If the computer model is not on the list, we need to do a Write-Error. It's best to include the computer name, the model, and the website the data is pulled from. The Write-Error output is collected by most standard deployment software.

Write-Error "$($env:COMPUTERNAME) is a $($CPInfo.CsModel) and is not on the approved list found at: https://www.dell.com/support/kbdoc/en-us/000180684/dell-computers-tested-for-windows-10-october-2020-update-and-previous-versions-of-windows-10"

I hope this helps. Y’all have a great day now you hear.

Citrix Workspace Installer Script

I don't like working with Citrix Receiver. It drives me crazy: one version doesn't work with another, and so on and so forth. Then finding the one you need is a pain. Thankfully, Workspace is a little better about this. Many of my clients have recently updated their back end so the new Workspace will work for them; it only took a while. So, I built a script that automatically downloads the newest version and installs it accordingly. It wasn't until later that I realized someone else had already done this, but the one I made is a little better, as it doesn't run into the conflict of pulling the version number, at least in my humble opinion. This time we will start off with the script for us lazy admins. If you want to learn how it works, keep reading.

The Script

IF (!(Test-Path c:\temp)){New-Item -Path c:\ -Name Temp -ItemType "directory"} 
IF (!(Test-Path c:\temp\Citrix)) {New-Item -Path c:\temp -Name Citrix -ItemType "directory"} 
$StartTime = (Get-Date).tostring("yyyy-MM-dd_hh-mm-ss")
$Logname = "C:\temp\Citrix\Install_$StartTime.log"
$DownloadFullPath = "C:\temp\Citrix\Installer_$StartTime.exe"
"Log: $($startTime): Started" > $Logname
try {
    $CitrixPage = Invoke-WebRequest -UseBasicParsing -Uri ("https://www.citrix.com/downloads/workspace-app/windows/workspace-app-for-windows-latest.html") -SessionVariable websession
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Accessed" >> $Logname
} catch {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Failed to access" >> $Logname
    Write-Error "Site Error: Site not accessible"
    Break
}
if ($CitrixPage.StatusCode -eq 200) {
    $DownloadLink = $CitrixPage.Links | Where-Object {$_.rel -like "*CitrixWorkspaceApp.exe*"}
    $URL = "Https:$($DownloadLink.rel)"
} else {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Site Status Code $($CitrixPage.StatusCode)" >> $Logname
    Write-Error "Site Error: Status Code $($CitrixPage.StatusCode)"
    Break
}
try {
    Invoke-WebRequest -Uri $URL -OutFile $DownloadFullPath
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Download $URL to $DownloadFullPath" >> $Logname
} catch {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Failed to download $URL to $DownloadFullPath" >> $Logname
    Write-Error "Site Error: Download Failure"
    Break
}
try {
    $Install = Start-Process -FilePath $DownloadFullPath -ArgumentList '/silent /forceinstall /AutoUpdateCheck=disabled /noreboot' -PassThru -ErrorAction Stop
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Install: $($LogTime): Installing $DownloadFullPath" >> $Logname
} catch {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Install: $($LogTime): $DownloadFullPath Failed to Install" >> $Logname
    Write-Error "Install Error"
    Break
}
$LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
"Sleep: Sleep for 420 Seconds for install" >> $Logname
Start-Sleep -Seconds 420
$LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
"Sleep: Stop Sleep" >> $Logname
$Programs = Get-CimInstance -ClassName win32_product
$Citrix = $Programs | where-object {$_.name -like "Citrix*Workspace*Browser"}
if ($null -ne $Citrix) {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Check: $($LogTime): $($Citrix.Caption) - $($Citrix.Version) Installed on: $($Citrix.Installdate.tostring())" >> $Logname
    Remove-Item -Path $DownloadFullPath -Force
} else {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Check: $($LogTime): Install Failed" >> $Logname
    Write-Error "Install: Install not complete."
    Break
}

The Break Down

Let's break this guy down. The first part tests whether the c:\temp folder exists; if it doesn't, we create it. Then we test whether the Citrix folder exists and, if it doesn't, once again, we create it. We do this with the Test-Path and New-Item cmdlets.

IF (!(Test-Path c:\temp)){New-Item -Path c:\ -Name Temp -ItemType "directory"} 
IF (!(Test-Path c:\temp\Citrix)) {New-Item -Path c:\temp -Name Citrix -ItemType "directory"} 

Now that we have the folders we will be using, we move on to creating the first log entry. We want this log to have a timestamp that matches the downloaded installer, so we need to get the datetime first. While doing that, we create the filename of the log and the filename of the download path; this makes them easier to work with later in the script. We do this with the Get-Date cmdlet. Normally, Get-Date outputs an object, which isn't very useful in a file name since its default string form contains forbidden characters. (Not forbidden like Slifer the Sky Dragon.) A translation is required, and we do that with the .tostring() method. Notice the way we format it:

  • y = year
  • M = Month
  • d = day
  • h = hour
  • m = minute
  • s = seconds

$StartTime = (Get-Date).tostring("yyyy-MM-dd_hh-mm-ss")

We then use the $StartTime variable inside the log name and the download path name; this is done with a string that has the variable inside of it. Next we create the log. We do this with a simple >, which means out and create, while >> means out and append. Notice in the example below that we write $($StartTime); we do this because the next character is a colon. Inside PowerShell you can do things like $Global:Var, which tells the shell to keep that variable available to other scopes, so the colon is a special character inside a variable reference. That is why we wrap the start time variable inside $(); PowerShell only expands what is inside the $(). Finally, take note of the > $Logname. We will be using $Logname more inside this script, which is why we created the variable.

$Logname = "C:\temp\Citrix\Install_$StartTime.log"
$DownloadFullPath = "C:\temp\Citrix\Installer_$StartTime.exe"
"Log: $($startTime): Started" > $Logname

Now we have the start of the log; it's time to get the installer. In the past, we would just grab the download link and add it to a download script. However, Citrix recently changed how their downloads work and tacked on an additional piece of code. Everything past the __gda__ is a generated token they add to stop direct downloading. However, we have PowerShell on our side.

https://downloads.citrix.com/19176/CitrixWorkspaceApp.exe?__gda__=1615916903_06373f7510a0edd3a06ef41c13dbe8a7

The first thing we want to do is set up a try/catch. This way we can catch errors and log them, and we can break the script with an error message that is actually useful, so if you are deploying with something like Continuum or PDQ, your error message makes sense. Inside the try, we want to get the webpage itself and then log that we grabbed it. The cmdlet to get the website is Invoke-WebRequest. In the example below, I am using -UseBasicParsing because it's more compatible across websites and systems; my goal is to launch this thing to 100+ machines. The -Uri is the website itself, and finally we use -SessionVariable to keep a web session. This allows us to grab data more easily, especially when it's auto-generated, as in this case.

$CitrixPage = Invoke-WebRequest -UseBasicParsing -Uri ("https://www.citrix.com/downloads/workspace-app/windows/workspace-app-for-windows-latest.html") -SessionVariable websession

After we grab the website, we have to log the event. We do the same thing we did with $StartTime and place it in the file we created a few moments ago.

$LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
"Site: $($LogTime): Accessed" >> $Logname

If these commands fail for whatever reason (the website is down, the internet is blocked, anything), we need to know that the site can't be reached. This is why we have a log. We create the same kind of $LogTime entry, but this time we also add a Write-Error and a break command. The Write-Error command sends an error to the deployment software so we know what's going on, and the break command stops the script at that point so it doesn't continue.

$LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
"Site: $($LogTime): Failed to access" >> $Logname
Write-Error "Site Error: Site not accessible"
Break

Let's put them together inside the try/catch so you can see what it looks like.

try {
    $CitrixPage = Invoke-WebRequest -UseBasicParsing -Uri ("https://www.citrix.com/downloads/workspace-app/windows/workspace-app-for-windows-latest.html") -SessionVariable websession -DisableKeepAlive
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Accessed" >> $Logname
} catch {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Failed to access" >> $Logname
    Write-Error "Site Error: Site not accessible"
    Break
}

Now we have the website itself inside a variable; it's time to find what we need. The $CitrixPage variable contains different elements, and each can give you information. The RawContent is just like it sounds: the raw content of the page. The status code tells you whether the site is up and what condition it is in. In this case, we will be looking at the links and the status code. We check whether the site returned a good status of 200; if it didn't, we don't want to fight that battle, so we log and break like before. If it did, we want to take apart the links and find the one that contains the exe we need. We do this with a Where-Object cmdlet, searching the .rel property for *CitrixWorkspaceApp.exe*. Because the .links collection sometimes produces incomplete links, we have to build the final one ourselves; that is the second step, wrapping the outcome inside an https: string.

if ($CitrixPage.statuscode -eq 200) {
    $DownloadLink = $CitrixPage.Links | Where-Object {$_.rel -like "*CitrixWorkspaceApp.exe*"}
    $URL = "Https:$($DownloadLink.rel)"
} else {
    "Site: $($LogTime): Site Status Code $($CitrixPage.StatusCode)" >> $Logname
    Write-Error "Site Error: Status Code $($CitrixPage.StatusCode)"
    Break
}

Now that we have the custom URL for the download, we need to download the file itself. Remember the $DownloadFullPath we created a while ago? It's time to use it. We will be using Invoke-WebRequest once again, this time with the -OutFile parameter, which tells Invoke-WebRequest to save the file from the provided URL to the path we give it. Of course, we want to wrap all of this inside a try/catch so we can log correctly and break as needed.

try {
    Invoke-WebRequest -Uri $URL -OutFile $DownloadFullPath
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Download $URL to $DownloadFullPath" >> $Logname
} catch {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Site: $($LogTime): Failed to download $URL to $DownloadFullPath" >> $Logname
    Write-Error "Site Error: Download Failure"
    Break
}

Now we have the installer to work with. The filename has the same date/time stamp as the log file, so we can compare them if the script doesn't finish up correctly. Next, we start another try/catch to install the program. The command we use is Start-Process, with $DownloadFullPath as the file to run. We want this thing to be quiet and to overwrite anything else there: Citrix, if given /forceinstall, will force the install by uninstalling the previous version. Finally, we tell it not to reboot with /noreboot. Once we get past the arguments, we want to keep the information from this process, so we add the -PassThru flag; this lets us store the process object in a variable in case we want it later. The final part of the command is -ErrorAction. We want this thing to stop if it hits an error so we know something is broken. Then we log and catch accordingly, like above.

try {
    $Install = Start-Process -FilePath $DownloadFullPath -ArgumentList '/silent /forceinstall /AutoUpdateCheck=disabled /noreboot' -PassThru -ErrorAction Stop
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Install: $($LogTime): Installing $DownloadFullPath" >> $Logname
} catch {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Install: $($LogTime): $DownloadFullPath Failed to Install" >> $Logname
    Write-Error "Install Error"
    Break
}

We are almost done! This program takes an average of 5 minutes on older machines to install, so we sleep for 7 minutes. To do this we use the Start-Sleep command and set -Seconds to 420. We also make sure we log this information.

$LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
"Sleep: Sleep for 420 Seconds for install" >> $Logname
Start-Sleep -Seconds 420
$LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
"Sleep: Stop Sleep" >> $Logname

As we are not in a hurry, we use the Get-CimInstance command to get the installed products and sort through them to find the Citrix Workspace Browser entry using the Where-Object cmdlet.

$Programs = Get-CimInstance -ClassName win32_product
$Citrix = $Programs | where-object {$_.name -like "Citrix*Workspace*Browser"}

Finally, we check whether the install was successful. This is done with a simple $null -ne $something. We put $null on the left side of the comparison on purpose: if the value on the right ever turns out to be a collection, the comparison still behaves the way we expect. If $Citrix contains something, we log that the install was successful and remove the installer. If $Citrix is $null, we log the failure and error out once again.

if ($null -ne $Citrix) {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Check: $($LogTime): $($Citrix.Caption) - $($Citrix.Version) Installed on: $($Citrix.Installdate.tostring())" >> $Logname
    Remove-Item -Path $DownloadFullPath -Force
} else {
    $LogTime = (Get-date).tostring("yyyy:MM:dd-hh:mm:ss")
    "Check: $($LogTime): Install Failed" >> $Logname
    Write-Error "Install: Install not complete."
    break
}

Improvements

With all good scripts, there is always room for improvement. The most glaring issue is the fixed wait for the install. This should really be a loop checking for a file or a registry key: if the file or key is not present, wait another 30 seconds and check again. This would speed up the process, since some computers install faster than others.
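
Here is a minimal sketch of that polling idea. The marker file path and the try limit are assumptions for illustration, not part of the original script:

#Hypothetical marker file; swap in whatever file or registry key proves the install finished.
$Marker = "${env:ProgramFiles(x86)}\Citrix\ICA Client\wfica32.exe"
$Tries = 0
while (!(Test-Path $Marker) -and ($Tries -lt 14)) {
    #14 tries x 30 seconds roughly matches the original 7-minute wait.
    Start-Sleep -Seconds 30
    $Tries++
    "Sleep: Waited $($Tries * 30) seconds for install" >> $Logname
}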

The second improvement is the Get-CimInstance call, because querying win32_product is a slow command. We can speed this up by once again targeting a file or a registry key instead. That way we can prove it was installed without the 30-to-60-second wait for Get-CimInstance to do its thing.
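
A sketch of the registry approach, checking the standard uninstall keys instead of win32_product (the display-name pattern is an assumption):

#Look for a Citrix Workspace entry in the uninstall keys.
$UninstallKeys = @(
    'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
    'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
)
$Citrix = Get-ItemProperty $UninstallKeys -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -like 'Citrix Workspace*' } |
    Select-Object DisplayName, DisplayVersion, InstallDate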

As always, if you have any questions, feel free to ask.