The first thing we do is set up the path we want to use. Then we test whether that path exists, and if it doesn't, we create it. I'm using Temp in this case because I will be deploying this to 2000+ machines. We will remove the installer afterward, but I want the Temp folder to remain for future deployments.
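A minimal sketch of that folder check (the exact staging path here is an assumption; adjust it to your environment):

```powershell
# Hypothetical staging path under the system drive's Temp folder.
$Path = "$env:SystemDrive\Temp\GimpInstall"

# Create the folder only if it doesn't already exist.
if (!(Test-Path $Path)) {
    New-Item -Path $Path -ItemType Directory | Out-Null
}
```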
Next, we grab the URL we want to work with. This is GIMP's official download portal. By default, the portal lists files oldest to newest when you pull from it with PowerShell.
Then we use Invoke-WebRequest to grab the website, as we did in a previous post, and from there we grab all of the links. In this case, since it's a repo, they are all download links except for two. We only want the exes in the list, so we use a Where-Object to find those. Then we select the last one, as it is the newest version.
Now we need to build our URL and our path. This is some string manipulation. Notice the $($Something.Something) in this code: when you deal with an array or object property inside a string and want to grab a sub-item, you need to call it out with the $() subexpression syntax.
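Putting those steps together, here is a hedged sketch. The portal URL is an assumption based on GIMP's public download mirror, and the local path is illustrative:

```powershell
# Assumed portal URL; swap in whichever GIMP repo directory you target.
$URL  = 'https://download.gimp.org/mirror/pub/gimp/v2.10/windows/'
$Site = Invoke-WebRequest -Uri $URL -UseBasicParsing

# Keep only the .exe links; the last one is the newest build.
$Installer = $Site.Links | Where-Object { $_.href -like '*.exe' } | Select-Object -Last 1

# Build the full download URL and local path using the $() subexpression syntax.
$DownloadURL  = "$URL$($Installer.href)"
$DownloadName = "$env:SystemDrive\Temp\GimpInstall\$($Installer.href)"
Invoke-WebRequest -Uri $DownloadURL -OutFile $DownloadName
```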
Now we want to uninstall the previous version of GIMP. Since GIMP doesn't show up in Win32_Product, we go after it manually in the file system. Newer GIMP builds live inside Program Files &gt; GIMP 2, so we check whether that folder exists with a Test-Path. If it does, we check whether GIMP is running, then kill it with fire… ok, not fire, but -Force. GIMP is good about shipping an uninstaller inside the file system, so we will use that. It's located at GIMP 2 &gt; uninst &gt; unins000.exe and can be triggered with a /verysilent parameter to keep it quiet. We do this with a Start-Process, using the -Wait flag to wait for the uninstall to finish.
Then we start the install of the new GIMP with Start-Process again, using the download name we made earlier, an argument list of /verysilent /norestart /allusers, and -Wait.
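The uninstall-then-install sequence could look roughly like this. The paths and process name are assumptions matching the description above, and $DownloadName is the installer path built earlier:

```powershell
# Assumed install location for GIMP 2.x.
$GimpDir = "$env:ProgramFiles\GIMP 2"

if (Test-Path $GimpDir) {
    # Stop GIMP if it's running... with force.
    Get-Process gimp* -ErrorAction SilentlyContinue | Stop-Process -Force
    # Run GIMP's bundled uninstaller silently and wait for it to finish.
    Start-Process "$GimpDir\uninst\unins000.exe" -ArgumentList '/verysilent' -Wait
}

# Install the new version we downloaded earlier.
Start-Process $DownloadName -ArgumentList '/verysilent /norestart /allusers' -Wait
```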
Here is a little powerhouse script I wrote to audit mailbox sizes. The recent focus is to see whose mailboxes are about to go over their limit and whether the Deleted Items folder is the culprit. I can't show you all of the scripts, but I can show you part of it. This part pulls the data down in such a way that you can view the mailbox sizes, who has the largest mailboxes, which folder is the largest in those mailboxes, and what their Deleted Items folder is sitting at. Let us take a look at the script itself.
The Script
function Get-SHDEXOMailboxSizeAudit {
[cmdletbinding()]
param (
[pscredential]$Credential
)
Begin {
#Installs required modules
Write-Verbose "Installing required modules"
if (!(Get-InstalledModule ExchangeOnlineManagement -ErrorAction SilentlyContinue)) { Install-Module ExchangeOnlineManagement }
Write-Verbose "Checking and importing required modules"
# Starts importing required modules
if (!(Get-Command Connect-ExchangeOnline -ErrorAction SilentlyContinue)) { Import-Module ExchangeOnlineManagement }
}
process {
#Connecting to exchange. Grabbing credentials.
if (!($PSBoundParameters.ContainsKey('Credential'))) {
$Credential = (Get-Credential)
}
Connect-ExchangeOnline -credential $Credential
#Grabs all the mailboxes at once. As this report will look at all the mailboxes.
$Mailboxes = Get-Mailbox -ResultSize Unlimited
#Starts looping through each mailbox
For ($M = 0; $M -lt $Mailboxes.count; $M++) {
Write-Verbose "Gathering Info on: $($Mailboxes[$M].UserPrincipalName)"
#Grabs the mailbox stats.
$Stats = Get-EXOMailboxStatistics $mailboxes[$M].UserPrincipalName
$FolderStats = Get-EXOMailboxFolderStatistics $mailboxes[$M].UserPrincipalName
#Starts looping through those folders to get their sizes.
$MainFolderStats = foreach ($Folder in $FolderStats) {
$FolderSize = [math]::Round((($Folder.FolderSize.split('(')[1].split(' ')[0] -replace ',', '') / 1gb), 2)
[pscustomobject]@{
Name = $Folder.name
Size = $FolderSize
}
}
#Adds this information to the mailbox object
$Mailboxes[$M] | add-member -Name "Statistics" -MemberType NoteProperty -Value $Stats
$Mailboxes[$M] | add-member -Name "FolderStats" -MemberType NoteProperty -Value $MainFolderStats
}
#Creates a return value.
$Return = foreach ($Mail in $Mailboxes) {
#Starts looking at mailboxes that are not the discovery mailbox.
if (!($mail.UserPrincipalName -like "DiscoverySearchMailbox*")) {
#Grabs the deleted folder as that is a folder we want to see in this report.
$Deleted = $Mail.FolderStats | where-object { $_.name -like "Deleted Items" }
#Grabs the largest folder. If it's not the deleted folder, then we might want to increase their mailbox sizes.
$LargestFolder = $Mail.FolderStats | Sort-Object Size -Descending | select-object -first 1
#Doing some math on a string. The string format (# bytes). Thus, we work the string to get the bytes. Divide and round.
$Size = [math]::Round(($Mail.Statistics.TotalItemSize.value.tostring().split('(')[1].split(' ')[0].replace(',', '') / 1gb), 2)
#Grabs the mailboxes percentage.
$DeletedToMailboxPercent = [math]::Round((($Deleted.Size / $size) * 100), 0)
#Outputs the data to the return value.
[pscustomobject]@{
DisplayName = $Mail.Displayname
UserPrincipalName = $Mail.UserPrincipalName
MailboxSize = $Size
LargestFolder = $LargestFolder.Name
LargestFolderSize = $LargestFolder.Size
DeletedItemSize = $Deleted.Size
DeletedToMailboxPercent = $DeletedToMailboxPercent
}
}
}
#Disconnects exchange
Disconnect-ExchangeOnline -confirm:$false > $null
}
End {
$Return | sort-object MailboxSize -Descending
}
}
The breakdown
First, this script is designed to work on PowerShell 7. It will not work on PowerShell 5.
The first part we come to is the [pscredential] parameter. Notice it's not mandatory, and notice there's no pipelining either. I have found that piped-in PSCredential objects tend to behave very poorly. So it's simple: bam, wham, done.
Begin
Inside our Begin block, we have the module setup. We check the installed modules for ExchangeOnlineManagement: if it's there, we skip it; if it's not, we install it. Importing works the same way: if the module is already imported, we do nothing; if it's not, we import it.
Process
Next, we grab credentials if need be and connect to Exchange. We use the Get-Credential command to grab the credentials, then Connect-ExchangeOnline to connect to Exchange Online.
Once we are connected, the magic can start. This is the sweetness of this script. The first step is to grab all of the mailboxes at once with Get-Mailbox. Then we start a loop, and not just any loop: a for loop. Wait! David, WHY A FOR LOOP? It's simple: we want to add information to each element, so we need to be able to call the index. We use Get-EXOMailboxStatistics with the UserPrincipalName of the index we are looking at, and do the same with Get-EXOMailboxFolderStatistics. These give us some clear stats we can use later in the script.

Now we loop through the folder stats we just collected and math ourselves some bytes into GB. The output of Get-EXOMailboxFolderStatistics looks like "24 MB (#### bytes)", so we need to filter that out and get the ####. I use a split to do this: I split the string at the '(', which puts the bytes in element 1. Then we split again by the space, so the bytes land in element 0. Then we replace the ',' with nothing and divide the result by 1GB to convert it to gigabytes. All of that drops into $MainFolderStats. Finally, we add that information onto the mailbox variable we created before this looping madness, using Add-Member with a NoteProperty.
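To see the byte math in isolation, here is the split-and-divide step on a sample FolderSize string (the value is made up for illustration):

```powershell
# Example FolderSize string in the "display (bytes)" format the cmdlet returns.
$FolderSize = '24 MB (25,165,824 bytes)'

# Split at '(' so the byte count sits in element 1, then split on the space,
# strip the commas, and convert bytes to GB rounded to two decimals.
$Bytes  = $FolderSize.Split('(')[1].Split(' ')[0] -replace ',', ''
$SizeGB = [math]::Round(($Bytes / 1GB), 2)
$SizeGB   # 0.02
```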
Now that we have prepped our data, it's time to sort it. We start the loop once more, but this time it's a simple foreach loop, as we don't need the array index. We first grab the Deleted Items folder. Then we grab the largest folder, using Sort-Object on the Size property of the FolderStats note property we made in the last step. Then we do some math, like before, to get the overall mailbox size. Finally, we grab the deleted-items percentage with a little more math; this time it's percentage math. Now that we have all of this useful information, we use our pscustomobject and put it all together.
Then we disconnect using the Disconnect-ExchangeOnline command.
End
Finally, we display the return information inside our End block, sorting the objects by mailbox size in descending order.
This past week I needed to find all of the QuickBooks files on a computer without accessing QuickBooks itself. The core of the script is a simple Get-ChildItem command, looped for each logical disk on the machine, looking for one of the four main QuickBooks file extensions.
Since I want this to be a little easier to use, I broke it down by file type with a ValidateSet parameter. This way you can choose which extension you want. Then I go through each extension with an if statement that matches it up.
Next, we get the disks using Get-CimInstance -ClassName Win32_LogicalDisk. This grabs all the mapped drives, local drives, and anything else that has a drive letter.
Now we loop through those disks and search the root of each drive for any files with the extension we chose. We select the FullName, as this gives us the full path. I also like having the file size and last write time to determine whether the file is still valid. Once we finish the loop, we display the information.
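Since the script itself didn't make it into this post, here is a minimal sketch of the approach described above. The function name and the extension list are assumptions; swap in whichever QuickBooks extensions you target:

```powershell
function Find-QuickBooksFile {
    [cmdletbinding()]
    param (
        # Common QuickBooks extensions (an assumption; adjust to your environment).
        [ValidateSet('qbw', 'qbb', 'qbm', 'qbx')]
        [string]$Extension = 'qbw'
    )
    # Grab everything with a drive letter: local, mapped, removable.
    $Disks = Get-CimInstance -ClassName Win32_LogicalDisk
    foreach ($Disk in $Disks) {
        # Search each drive root recursively for matching files.
        Get-ChildItem -Path "$($Disk.DeviceID)\" -Filter "*.$Extension" -Recurse -ErrorAction SilentlyContinue |
            Select-Object FullName, Length, LastWriteTime
    }
}
```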
Improvements
I can add remote computers to this setup.
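One way that improvement could look, sketched with Invoke-Command; the computer names and the wrapped scriptblock are hypothetical, not part of the original script:

```powershell
# Hypothetical extension: run the same search on remote machines.
$Computers = 'PC01', 'PC02'
Invoke-Command -ComputerName $Computers -ScriptBlock {
    Get-CimInstance -ClassName Win32_LogicalDisk | ForEach-Object {
        Get-ChildItem -Path "$($_.DeviceID)\" -Filter '*.qbw' -Recurse -ErrorAction SilentlyContinue |
            Select-Object FullName, Length, LastWriteTime
    }
}
```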
That's it. If you have any questions, feel free to reach out.
This little guy is a simple dynamic parameter resource for you all. Take a look at my previous blog post about how these parameters work; they are totally worth adding to your scripts. This script is simple: it uses Add-ADGroupMember and dynamic parameters to help guard against mistakes.
The Script
function Set-SHDADGroupMembers {
[cmdletbinding()]
param (
[parameter(mandatory=$true)][validateset('Add','Remove')][string]$Action,
[pscredential]$Credential,
[switch]$Output
)
DynamicParam {
# Set the dynamic parameters' name
$ParamName_portgroup = 'Group'
# Create the collection of attributes
$AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
# Create and set the parameters' attributes
$ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
$ParameterAttribute.Mandatory = $true
# Add the attributes to the attributes collection
$AttributeCollection.Add($ParameterAttribute)
# Create the dictionary
$RuntimeParameterDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
# Generate and set the ValidateSet
$arrSet = (Get-ADGroup -Filter *).name
$ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($arrSet)
# Add the ValidateSet to the attributes collection
$AttributeCollection.Add($ValidateSetAttribute)
# Create and return the dynamic parameter
$RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($ParamName_portgroup, [string], $AttributeCollection)
$RuntimeParameterDictionary.Add($ParamName_portgroup, $RuntimeParameter)
# Set the dynamic parameters' name
$ParamName_datastore = 'Username'
# Create the collection of attributes
$AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
# Create and set the parameters' attributes
$ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
$ParameterAttribute.Mandatory = $true
# Add the attributes to the attributes collection
$AttributeCollection.Add($ParameterAttribute)
# Generate and set the ValidateSet
$arrSet = (Get-ADUser -Filter *).name
$ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($arrSet)
# Add the ValidateSet to the attributes collection
$AttributeCollection.Add($ValidateSetAttribute)
# Create and return the dynamic parameter
$RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($ParamName_datastore, [string], $AttributeCollection)
$RuntimeParameterDictionary.Add($ParamName_datastore, $RuntimeParameter)
return $RuntimeParameterDictionary
}
begin{
$Group = $PsBoundParameters[$ParamName_portgroup]
$username = $PsBoundParameters[$ParamName_datastore]
}
process {
if ($PSBoundParameters.ContainsKey('Credential')) {
if ($Action -like "Add") {
Add-ADGroupMember -Identity $group -Members $username -Credential $Credential
} elseif ($Action -like "Remove") {
Remove-ADGroupMember -Identity $group -Members $username -Credential $Credential
} else {
Get-ADGroupMember -Identity $group -Credential $Credential
}
} else {
if ($Action -like "Add") {
Add-ADGroupMember -Identity $group -Members $username
} elseif ($Action -like "Remove") {
Remove-ADGroupMember -Identity $group -Members $username
} else {
Get-ADGroupMember -Identity $group
}
}
}
end {
if ($Output) {
if ($PSBoundParameters.ContainsKey('Credential')) {
Get-ADGroupMember -Identity $Group -Credential $Credential
} else {
Get-ADGroupMember -Identity $group
}
}
}
}
I have a love-hate relationship with Dell. When it comes to Windows 10 upgrades, I really don't like them. How do you mass-push a Windows 10 upgrade to clients without breaking them? As many of you know, the recent 20H2 update has broken many different Dells. A quick way to guard against this is by comparing the model to the online list. This script isn't that big, and it is designed to be deployed on the end user's machine as a system service or admin account.
The Script
$DellSite = Invoke-WebRequest -Uri "https://www.dell.com/support/kbdoc/en-us/000180684/dell-computers-tested-for-windows-10-october-2020-update-and-previous-versions-of-windows-10" -DisableKeepAlive
$Dellraw = $DellSite.RawContent.Split("`r")
$CPInfo = Get-ComputerInfo
if ($Dellraw | select-string $CPInfo.csmodel) {
if (!(Test-Path "$($env:SystemDrive)\Temp\Win10Upgrade")) { New-Item "$($env:SystemDrive)\Temp\Win10Upgrade" -ItemType Directory }
$DateTime = (Get-date).ToString("yyyy-MM-dd_hh-mm-ss")
$webClient = New-Object System.Net.WebClient
$url = 'https://go.microsoft.com/fwlink/?LinkID=799445'
$file = "$($env:SystemDrive)\Temp\win10upgrade\Win10Update_$DateTime.exe"
$webClient.DownloadFile($url, $file)
Start-Process -FilePath $file -ArgumentList '/auto Upgrade /quiet /noreboot'
} else {
Write-Error "$($env:COMPUTERNAME) is a $($CPInfo.CsModel) and is not on the approved list found at: https://www.dell.com/support/kbdoc/en-us/000180684/dell-computers-tested-for-windows-10-october-2020-update-and-previous-versions-of-windows-10"
}
The Breakdown
I'm glad you decided to stay for the breakdown. It isn't going to take long. The first element is the Invoke-WebRequest: we capture the website with the required information, then split the raw content by the carriage return.
Now our web data is ready to pull from. Next, we need information from the computer itself. Most systems these days have the Get-ComputerInfo command, which pulls the system info on a computer. Then we ask a simple if-then question: if $DellRaw has the model number, download and install the upgrade; if not, let us know. Basically, we need a bouncer at this point. We use $CPInfo.CsModel, as this is the model number.
$CPInfo = Get-ComputerInfo
if ($Dellraw | select-string $CPInfo.csmodel) {
#Install the upgrade
} else {
#Warning the installer program that it's not on the list.
}
The Download and Install
We first ask if the file folder is there; if it isn't, we create it using the New-Item command. Then we create a datetime stamp; like in previous blogs, we use the .ToString() method to format the output. Then we declare the URL and file path. Next, we create a System.Net.WebClient and download the file with its DownloadFile method. Finally, we start the install with the correct flags. In this case, we want the install to be an upgrade that is silent and doesn't force a restart, because it could be the middle of the day when this thing finishes up. To start the installer, we use Start-Process.
If the computer model is not on the list, we need to do a Write-Error. It's best to state the computer name, the model name, and the website the data is pulled from. The Write-Error output is collected by the standard deployment software.
Write-Error "$($env:COMPUTERNAME) is a $($CPInfo.CsModel) and is not on the approved list found at: https://www.dell.com/support/kbdoc/en-us/000180684/dell-computers-tested-for-windows-10-october-2020-update-and-previous-versions-of-windows-10"
I hope this helps. Y'all have a great day now, you hear.