by David | Oct 28, 2022 | Deployments, Information Technology, PowerShell
There have been a few times when I needed to enable Remote Desktop Protocol on remote computers. So, I built a simple but powerful tool to help me do just that. It uses the Invoke-Command cmdlet to enable RDP. So, let's dig in.
The Script – Enable RDP on a Remote Computer
function Enable-SHDComputerRDP {
<#
.SYNOPSIS
Enables target computer's RDP
.DESCRIPTION
Enables target Computer's RDP
.PARAMETER Computername
[String[]] - Target Computers you wish to enable RDP on.
.PARAMETER Credential
Optional credentials switch that allows you to use another credential.
.EXAMPLE
Enable-SHDComputerRDP -computername <computer1>,<computer2> -Credential (Get-credential)
Enables RDP on computer1 and on computer 2 using the supplied credentials.
.EXAMPLE
Enable-SHDComputerRDP -computername <computer1>,<computer2>
Enables RDP on computer1 and on computer 2 using the current credentials.
.OUTPUTS
[None]
.NOTES
Author: David Bolding
.LINK
https://therandomadmin.com
#>
[cmdletbinding()]
param (
[Parameter(
ValueFromPipeline = $True,
ValueFromPipelineByPropertyName = $True,
HelpMessage = "Provide the target hostname",
Mandatory = $true)][Alias('Hostname', 'cn')][String[]]$Computername,
[Parameter(HelpMessage = "Allows for custom Credential.")][System.Management.Automation.PSCredential]$Credential
)
$parameters = @{
ComputerName = $ComputerName
ScriptBlock = {
Enable-NetFirewallRule -DisplayGroup 'Remote Desktop'
Set-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\' -Name "fDenyTSConnections" -Value 0
Set-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp\' -Name "UserAuthentication" -Value 1
}
}
if ($PSBoundParameters.ContainsKey('Credential')) { $parameters += @{Credential = $Credential } }
Invoke-Command @parameters
}
The breakdown
Comments/Documentation
The first part of this tool is the in-house documentation, also known as comment-based help. Here is where you give an overview, description, parameters, examples, and more. Running Get-Help against the function will produce the information above. On a personal level, I like adding the type inside the parameter descriptions. I also like putting the author and date inside the Notes field.
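Once the function is loaded, that documentation is available straight from the console; -Full and -Examples are standard Get-Help switches.
# View the full comment-based help, including parameters and examples
Get-Help Enable-SHDComputerRDP -Full
# Or just the examples
Get-Help Enable-SHDComputerRDP -Examples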
Parameters
We are using two parameters: a ComputerName parameter and a Credential parameter. The ComputerName parameter carries a few parameter flags. The first is ValueFromPipeline, which allows us to pipe data to the function. The next is ValueFromPipelineByPropertyName, which allows us to bind objects that carry a "ComputerName" property. Thus we can pull from an Excel export or a list of computer names. Next, we have the HelpMessage, which is just what it sounds like: a small help message that can be useful to the end user. Finally, we have the Mandatory flag. As this command depends on that input, we make it mandatory. The next item on ComputerName is the Alias. This allows us to use other names; in this example, Hostname or CN. This is just a little something that helps the end user. Finally, we have the type. It is a list of strings, which means we can target more than one computer at a time.
The next parameter is the Credential parameter. This one is unique. The only flag we have here is the HelpMessage. The type is a little different: it is a System.Management.Automation.PSCredential. And yes, it's complex. The simple rundown is: pass the output of Get-Credential here. This function is designed to be automated with this feature. If you are using a domain admin account, you may not need it. However, if you are working on computers in a different domain and don't have rights there, you can use this parameter.
param (
[Parameter(
ValueFromPipeline = $True,
ValueFromPipelineByPropertyName = $True,
HelpMessage = "Provide the target hostname",
Mandatory = $true)][Alias('Hostname', 'cn')][String[]]$Computername,
[Parameter(HelpMessage = "Allows for custom Credential.")][System.Management.Automation.PSCredential]$Credential
)
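For example, because Computername is a string array, one hedged way to feed it a batch of machines from a CSV export (assuming a hypothetical computers.csv with a ComputerName column):
# Pull the ComputerName column out of the CSV and pass it as one string array
$targets = (Import-Csv -Path .\computers.csv).ComputerName
Enable-SHDComputerRDP -Computername $targets -Credential (Get-Credential)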
The Script Block
Now we need to create the script block that will be used inside Invoke-Command. We are going to build out a splat. We build splats with @{}. The information goes inside. When we push a splat into a command, we add each parameter that command requires. Here we are adding the computer name and the script block. The ComputerName parameter of Invoke-Command takes a list of strings, so we can drop our $Computername straight into ComputerName. Yeah, that's not confusing. The script block is where the action is.
$parameters = @{
ComputerName = $ComputerName
ScriptBlock = {
#Do something
}
}
Now for the script block itself. The first thing we need to do is enable the Remote Desktop firewall rules. This allows Remote Desktop traffic through the firewall.
Enable-NetFirewallRule -DisplayGroup 'Remote Desktop'
Next, we need to set the registry keys. The first key clears fDenyTSConnections so connections are no longer denied. The second enables the UserAuthentication key, which requires Network Level Authentication.
Set-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\' -Name "fDenyTSConnections" -Value 0
Set-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp\' -Name "UserAuthentication" -Value 1
Adding Credentials
Now that we have created the parameters, it's time to add credentials when needed. We do this by asking whether the Credential parameter was supplied. This is done through the $PSBoundParameters variable. We use its ContainsKey method to see if Credential was set. If it was, we add the credentials to the splat.
if ($PSBoundParameters.ContainsKey('Credential')) { $parameters += @{Credential = $Credential } }
Finally, we invoke the command. We splat the parameters by passing @parameters instead of $parameters.
Invoke-Command @parameters
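As a quick sanity check (my addition, not part of the function), you could read the same settings back from a target; the computer name here is made up:
# Confirm RDP is on: fDenyTSConnections should be 0 and the firewall rules enabled
Invoke-Command -ComputerName 'PC01' -ScriptBlock {
Get-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\' -Name 'fDenyTSConnections'
Get-NetFirewallRule -DisplayGroup 'Remote Desktop' | Select-Object DisplayName, Enabled
}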
And that’s how you can quickly Enable RDP on a Remote Computer using PowerShell within your domain.
by David | Oct 21, 2022 | Help Desk, Information Technology, PowerShell
During my in-house days, one of the things I had to do constantly was clear people's print jobs. So I learned to clear print jobs with PowerShell to make my life easier. It surely did. With PowerShell, I could remotely clear the print jobs, since most of my machines were on the primary domain. All you need to know is the server and the printer's name. The server could be the computer in question if it's a local printer, or it could be the print server; wherever the queue is being held.
The Script
function Clear-SHDPrintJobs {
[cmdletbinding()]
param (
[parameter(HelpMessage = "Target Printer", Mandatory = $true)][alias('Name', 'Printername')][String[]]$name,
[parameter(HelpMessage = "Computer with the printer attached.", Mandatory = $true)][alias('Computername', 'Computer')][string[]]$PrintServer
)
foreach ($Print in $PrintServer) {
foreach ($N in $name) {
$Printers = Get-Printer -ComputerName $Print -Name "*$N*"
Foreach ($Printer in $Printers) {
$Printer | Get-PrintJob | Remove-PrintJob
}
}
}
}
The Breakdown
It's time to break down this code. The parameters are set to mandatory because you need this information. Notice that both parameters are lists of strings. This means you can match the same printer name across multiple print servers and trigger this command against all of them.
Parameters
We are using two parameters. The first is the name of the printer. This parameter is mandatory. We are using the Alias attribute here as well. Basically, it gives different names to make the command easier to work with; the options are Name and Printername. It is also a list of strings, so we can import an array if need be. I'll go over why that's important later. Finally, we have a help message, which can help the user figure out what they need to provide.
[parameter(HelpMessage = "Target Printer", Mandatory = $true)][alias('Name', 'Printername')][String[]]$name
The next parameter is like the first. It is a mandatory parameter with alias options; the options are Computername and Computer. I named it PrintServer because I dealt with print servers most of the time. Once again, it is a list of strings so you can target multiple machines.
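A hedged usage sketch, with made-up server and printer names:
# Clear every job on the FrontDesk printers held on two print servers
Clear-SHDPrintJobs -Name 'FrontDesk' -PrintServer 'PrintSrv01','PrintSrv02'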
Foreach Loops
Next, we look at our function. This is where we clear print jobs with PowerShell. There are three loops in total. The first foreach loop cycles through the print server list. For each print server, we enter the next loop, which cycles through the printer names.
foreach ($Print in $PrintServer) {
foreach ($N in $name) {
#Do something
}
}
Inside the name loop, we use the Get-Printer command. We target the print server and ask for any printer whose name contains the name we requested. Thus, if you use a *, you can clear out all the print jobs from that device. This is a very powerful option.
$Printers = Get-Printer -ComputerName $Print -Name "*$N*"
After gathering the printers from the server, we start another loop, one iteration for each printer we got back from that server. We pipe each printer to Get-PrintJob, which lists its print jobs. Then we pipe that output into Remove-PrintJob, which clears the jobs.
$Printer | Get-PrintJob | Remove-PrintJob
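If you want to see what would be cleared before pulling the trigger, a small hedged preview (not part of the function) could look like this:
# List the queued jobs on a printer before removing anything
Get-Printer -ComputerName 'PrintSrv01' -Name '*FrontDesk*' | Get-PrintJob | Select-Object PrinterName, DocumentName, UserName, JobStatus, SubmittedTime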
That's it for this function. It's a great little tool that will change how you clear print jobs on buggy print systems.
Conclusion
In conclusion, I have used this function a few hundred times in my day. It was built for a domain environment; it does not work for cloud print options.
by David | Sep 23, 2022 | Azure, Exchange, Information Technology, PowerShell, Resources
While reading Reddit, I found a common thread: people need a quick way to do a SharePoint file audit. I have a PowerShell function for this in my toolbox. This tool heavily uses the Search-UnifiedAuditLog cmdlet. The most common items I tend to audit are file modifications and deletions. This function covers deleted, modified, moved, renamed, downloaded, uploaded, accessed, synced, malware detection, restored from trash, locked, and finally unlocked. Search-UnifiedAuditLog is an Exchange Online command at the time of this writing, so you need to connect to Exchange Online. In this function, I am using the switch statement, and I will follow that structure for the breakdown. Let's first jump in with the function.
The Function
function Invoke-SharePointFileAudit {
[cmdletbinding()]
param (
[Parameter(Mandatory = $true)][validateset("Deleted", "Modified", "Moved", "Renamed", "Downloaded", "Uploaded", "Synced", "Accessed", "MalwareDetected", "Restored", "Locked", "unLocked")][string]$Type,
[parameter(Mandatory = $false)][switch]$KeepAlive,
[switch]$SharePointOnline,
[switch]$OneDrive,
[Nullable[DateTime]]$StartDate,
[Nullable[DateTime]]$EndDate,
[string]$Outfile,
[int]$ResultSize = 5000
)
Begin {
$Module = Get-Module ExchangeOnlineManagement -ListAvailable
if ($Module.count -eq 0) {Install-Module ExchangeOnlineManagement -Repository PSGallery -AllowClobber -Force}
$getsessions = Get-PSSession | Select-Object -Property State, Name
$isconnected = (@($getsessions) -like '@{State=Opened; Name=ExchangeOnlineInternalSession*').Count -gt 0
if (-not $isconnected) {
try {
Connect-ExchangeOnline
}
catch {
throw "Exchange Online connection failed. Ending."
}
}
#Auto Generates Start and Finish dates
if ($Null -eq $StartDate) { $StartDate = ((Get-Date).AddDays(-89)).Date }
if ($Null -eq $EndDate) { $EndDate = (Get-Date).Date }
#Tests if end date is before start date.
if ($EndDate -lt $StartDate) { $StartDate = ((Get-Date).AddDays(-89)).Date }
if ($EndDate -gt (Get-Date).Date) { $EndDate = (Get-Date).Date }
}
Process {
switch ($Type) {
"Deleted" {
$DeletedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileDeleted,FileDeletedFirstStageRecycleBin,FileDeletedSecondStageRecycleBin,FileVersionsAllDeleted,FileRecycled" -SessionId deleted -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($DeletedRecord in $DeletedRecords) {
$JSONInfo = $DeletedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $DeletedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Modified" {
$ModifiedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileModified,FileModifiedExtended" -SessionId Modified -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($ModifiedRecord in $ModifiedRecords) {
$JSONInfo = $ModifiedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $ModifiedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Moved" {
$MovedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileMoved" -SessionId Moved -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($MovedRecord in $MovedRecords) {
$JSONInfo = $MovedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $MovedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
DestinationRelativeURL = $JSONInfo.DestinationRelativeURL
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Renamed" {
$RenamedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileRenamed" -SessionId Renamed -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($RenamedRecord in $RenamedRecords) {
$JSONInfo = $RenamedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $RenamedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
DestinationFileName = $JSONInfo.DestinationFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Downloaded" {
$DownloadedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileDownloaded" -SessionId Downloaded -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($DownloadedRecord in $DownloadedRecords) {
$JSONInfo = $DownloadedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $DownloadedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Uploaded" {
$UploadedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileUploaded" -SessionId Uploaded -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($UploadedRecord in $UploadedRecords) {
$JSONInfo = $UploadedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $UploadedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Synced" {
$SyncedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileSyncDownloadedFull,FileSyncUploadedFull" -SessionId Synced -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($SyncedRecord in $SyncedRecords) {
$JSONInfo = $SyncedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $SyncedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Accessed" {
$AccessedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileAccessed,FileAccessedExtended" -SessionId Accessed -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($AccessedRecord in $AccessedRecords) {
$JSONInfo = $AccessedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $AccessedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"MalwareDetected" {
$MalewareRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileMalwareDetected" -SessionId MalewareRecords -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($MalewareRecord in $MalewareRecords) {
$JSONInfo = $MalewareRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $MalewareRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Restored" {
$RestoredRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileRestored" -SessionId RestoredRecords -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($RestoredRecord in $RestoredRecords) {
$JSONInfo = $RestoredRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $RestoredRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Locked" {
$LockedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "LockRecord" -SessionId Locked -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($LockedRecord in $LockedRecords) {
$JSONInfo = $LockedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $LockedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"unLocked" {
$unLockedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "UnlockRecord" -SessionId UnlockRecord -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($unLockedRecord in $unLockedRecords) {
$JSONInfo = $unLockedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $unLockedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
}
}
end {
if ((-not ($SharePointOnline -or $OneDrive)) -or ($SharePointOnline -and $OneDrive)) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | Export-Csv ./$Outfile.CSV
}
else {
$Return
}
}
elseif ($SharePointOnline) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | where-object { $_.workload -like "SharePoint" } | Export-Csv ./$Outfile.CSV
}
else {
$Return | where-object { $_.workload -like "SharePoint" }
}
}
elseif ($OneDrive) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | where-object { $_.workload -like "OneDrive" } | Export-Csv ./$Outfile.CSV
}
else {
$Return | where-object { $_.workload -like "OneDrive" }
}
}
if (!($KeepAlive)) {
Disconnect-ExchangeOnline -Confirm:$false -InformationAction Ignore -ErrorAction SilentlyContinue
}
}
}
The Breakdown of Share Point File Audit
I'm glad you came to the breakdown. It means you want to know how the code works, and that you truly care about learning. Thank you. This code repeats itself a few times in different ways, so I will call out the differences but not re-explain what's the same after the first time. The first section is our parameters.
Parameters
We have 8 parameters, and only one of them is mandatory. First, we have the Type parameter. This mandatory parameter has a validate set that lets you pick which audit to run from the following list.
- Deleted
- Modified
- Moved
- Renamed
- Downloaded
- Uploaded
- Synced
- Accessed
- MalwareDetected
- Restored
- Locked
- UnLocked
Afterward, we have KeepAlive. This lets us run the command multiple times without signing back into the system; if you want to keep your session alive, flip that flag. Next, we have two more switches. The first, SharePointOnline, pulls only items touched in SharePoint itself; the second, OneDrive, does the same for OneDrive. They are named accordingly. After that, we have a start date and an end date. These values are nullable; basically, you don't need them. The Outfile parameter asks for just the name of the file; we use "./" to save it wherever you run the command from. Finally, we have the result size. The maximum number of results per call is 5000, but you can make this number smaller.
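A quick hedged example of calling it, with a made-up output name (the function writes ./DeletedFiles.CSV in whatever directory you run it from):
# Audit the last ~90 days of deletions across SharePoint and OneDrive and export to CSV
Invoke-SharePointFileAudit -Type Deleted -Outfile DeletedFiles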
Begin
In our begin section, we want to test the Exchange Online Management Module. Secondly, we want to validate exchange connectivity. After that, we want to gather the date information for the start and end dates. Let’s take a look at the exchange part first.
$Module = Get-Module ExchangeOnlineManagement -ListAvailable
The Get-Module command works with PowerShell 5.1. However, I have seen PowerShell flake out with this command and fail to pull the information. I am going to assume your PowerShell is up to date with your current version.
if ($Module.count -eq 0) {
Install-Module ExchangeOnlineManagement -Repository PSGallery -AllowClobber -Force
}
Afterward, we want to install the exchange online management module if we don’t detect the module. We are using the count to see how many objects are inside our module variable. If it’s 0, it’s time to install. We install it from the PSGallery.
$getsessions = Get-PSSession | Select-Object -Property State, Name
$isconnected = (@($getsessions) -like '@{State=Opened; Name=ExchangeOnlineInternalSession*').Count -gt 0
Now, we test for an existing Exchange connection. We use Get-PSSession to review the current sessions. Next, we check whether the number of opened sessions named "ExchangeOnlineInternalSession" is greater than zero. $isconnected ends up holding true or false.
if (-not $isconnected) {
try {
Connect-ExchangeOnline
} catch {
throw "Exchange Online connection failed. Ending."
}
}
Afterward, we act on that test. If we are not connected, we try to connect. If that throws an error, we end the function and let the user know. We are not using a credential object to authenticate because MFA should always be a thing.
#Auto Generates Start and Finish dates
if ($Null -eq $StartDate) { $StartDate = ((Get-Date).AddDays(-89)).Date }
if ($Null -eq $EndDate) { $EndDate = (Get-Date).Date }
#Tests if end date is before start date.
if ($EndDate -lt $StartDate) { $StartDate = ((Get-Date).AddDays(-89)).Date }
if ($EndDate -gt (Get-Date).Date) { $EndDate = (Get-Date).Date }
Afterward, we need to get the dates right. If the start date is null, we pull back 89 days, which keeps us just inside the audit log's 90-day window. We do the same with the end date: if it's null, we grab today's date. Now, to prevent errors, we sanity-check both dates. The end date can't be before the start date, and the end date can't be later than the current date. We use if statements to resolve both cases.
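If you want a tighter window than the defaults, a hedged example scoping the audit to the last 30 days:
# Explicit window; the Begin block still corrects anything outside today / 89 days back
Invoke-SharePointFileAudit -Type Modified -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date)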
Process
We begin the Process block by looking directly at our Type parameter with a switch statement. The switch lets us run the right commands for whichever Type was chosen. Let's look at one of the switch branches.
$DeletedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileDeleted,FileDeletedFirstStageRecycleBin,FileDeletedSecondStageRecycleBin,FileVersionsAllDeleted,FileRecycled" -SessionId deleted -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($DeletedRecord in $DeletedRecords) {
$JSONInfo = $DeletedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $DeletedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
The data that Search-UnifiedAuditLog produces contains a property called "AuditData". This property holds almost every piece of information you will need. The difference between each Type is the Operations filter and the session ID. The operations target the required logs; this creates the backbone of the SharePoint file audit. The list below shows which operations I am using. Once we gather the records, we need to pull out the AuditData, which comes back in JSON format. We start by looping over the records with a foreach loop. Then we take the AuditData and pipe it into ConvertFrom-Json. Next, we create our PSCustomObject. Other than Moved, the output of the other log types contains almost the same information. See the script for the details, and the inspection sketch after the operations list below.
Operation Filters
- Deleted
  - FileDeleted
  - FileDeletedFirstStageRecycleBin
  - FileDeletedSecondStageRecycleBin
  - FileVersionsAllDeleted
  - FileRecycled
- Modified
  - FileModified
  - FileModifiedExtended
- Moved
  - FileMoved
- Renamed
  - FileRenamed
- Downloaded
  - FileDownloaded
- Uploaded
  - FileUploaded
- Synced
  - FileSyncDownloadedFull
  - FileSyncUploadedFull
- Accessed
  - FileAccessed
  - FileAccessedExtended
- MalwareDetected
  - FileMalwareDetected
- Restored
  - FileRestored
- Locked
  - LockRecord
- UnLocked
  - UnlockRecord
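If you want to see what AuditData actually looks like for one of these operations before deciding which fields to keep, here is a small hedged inspection snippet (run it in an existing Exchange Online session):
# Pull one deleted-file record and expand its JSON payload to see every available field
$sample = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-7) -EndDate (Get-Date) -Operations "FileDeleted" -ResultSize 1
$sample.AuditData | ConvertFrom-Json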
End
Finally, it’s time for the end block. This is where we will present the data we have gathered. Firstly, we need to determine if the SharePoint or Onedrives were flipped or not.
if ((-not ($SharePointOnline -or $OneDrive)) -or ($SharePointOnline -and $OneDrive)) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | Export-Csv ./$Outfile.CSV
} else {
$Return
}
}
Here we check whether neither flag was set or both flags were set. Then we check if the user gave us a filename. If they did, we export our report to a CSV file wherever we are executing the function from. However, if the user didn't give us a filename, we just output all the results.
elseif ($SharePointOnline) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | where-object { $_.workload -like "SharePoint" } | Export-Csv ./$Outfile.CSV
}
else {
$Return | where-object { $_.workload -like "SharePoint" }
}
}
elseif ($OneDrive) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | where-object { $_.workload -like "OneDrive" } | Export-Csv ./$Outfile.CSV
}
else {
$Return | where-object { $_.workload -like "OneDrive" }
}
}
if (!($KeepAlive)) {
Disconnect-ExchangeOnline -Confirm:$false -InformationAction Ignore -ErrorAction SilentlyContinue
}
Now, if the user selected one or the other, we present just that workload's information by filtering with Where-Object. As before, we check whether the user provided an Outfile. Finally, we check whether KeepAlive was set; if it wasn't, we disconnect from Exchange Online.
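Putting those switches together, a hedged example run with a made-up file name:
# SharePoint-only download activity, written to ./SPDownloads.CSV, session left open for the next query
Invoke-SharePointFileAudit -Type Downloaded -SharePointOnline -Outfile SPDownloads -KeepAlive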
Conclusion
In conclusion, auditing shouldn't be difficult. We can quickly pull the info we need. I hope you enjoy this powerful little tool.
by David | Jan 19, 2022 | Information Technology, PowerShell
Do you need to find old snapshots on a Hyper-V server? It's super easy. So, today we will go through how to get some basic information that allows us to make judgment calls.
The Script – Find Old Snapshots
$Date = (Get-Date).AddDays(-7)
$Vms = Get-VM | where-object { $_.state -like "Running" }
$Return = foreach ($VM in $Vms) {
$SnapShots = $VM | Get-VMSnapshot
foreach ($SnapShot in $SnapShots) {
if ($snapshot.creationTime -lt $date) {
[pscustomobject]@{
SnapShotName = $SnapShot.name
SnapShotCreationDate = $SnapShot.CreationTime
VirtualMachine = $SnapShot.VmName
Host = $SnapShot.ComputerName
}
}
}
}
$Return
The Breakdown
The first part of the script is setting the age requirement. In this case, we want to know about anything older than 7 days. So we use the Get-Date command, add -7 days, and that gives us the date to compare against.
$Date = (Get-Date).AddDays(-7)
In this case, we only want the running machines. The reason I want running machines is that powered-off machines might be in a decommissioning process or off for other reasons. So we look at the state of each VM to see if it's "Running". We do this with a Where-Object.
$Vms = Get-VM | where-object { $_.state -like "Running" }
Now that we have the running VMs, we want to get each one's snapshots and compare each snapshot against that date. The information we want is the name of the snapshot, the snapshot's creation date, the VM, and the host name. So we start a foreach loop. Inside the loop, we use an if statement to check whether the creation time is earlier than the date we created earlier. From there, we create a PSCustomObject and pull out the information we want.
$Return = foreach ($VM in $Vms) {
$SnapShots = $VM | Get-VMSnapshot
foreach ($SnapShot in $SnapShots) {
if ($snapshot.creationTime -lt $date) {
[pscustomobject]@{
SnapShotName = $SnapShot.name
SnapShotCreationDate = $SnapShot.CreationTime
VirtualMachine = $SnapShot.VmName
Host = $SnapShot.ComputerName
}
}
}
}
Then finally we output the $Return value. We can export this to a CSV and drop it into a file share. I personally do this with a Nextcloud instance; you can read more about that here. Another option is to email the report using the Microsoft Graph API or an SMTP email system. Finally, if you have confidence in your findings, you can remove the old snapshots.
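A minimal hedged sketch of that export step, with a made-up share path:
# Drop the report on a file share for review
$Return | Export-Csv -Path '\\fileserver\reports\OldSnapshots.csv' -NoTypeInformation
# Once reviewed, the same output tells you which snapshots to remove, for example:
# Get-VM -Name <VirtualMachine> | Get-VMSnapshot -Name <SnapShotName> | Remove-VMSnapshot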
Conclusion
Running this script and combining it with the file drop and a few other pieces of automation changed how I worked with multiple clients. It was a good cleanup process and saved my clients much-needed storage space. Let me know how you use this code on your systems.
by David | Sep 9, 2021 | Information Technology, PowerShell, Resources
Recently, Send-MailMessage was put to rest, and with good reason: it failed at securing the emails. It could send items in plain text rather than over an SSL-encrypted connection. That's fine for internal, nothing-fancy traffic, but bad if you want to send data out into the world. So, I built an alternative that uses the System.Net.Mail classes built into Windows. Let's send mail with PowerShell.
The Script
function Send-SHDMailMessage {
<#
.SYNOPSIS
Sends an email using custom SMTP settings.
.DESCRIPTION
Sends an email using custom SMTP settings.
.PARAMETER From
As string. The email address the message will come from.
Example: "David Bolding <admin@example.com>"
.PARAMETER To
As string. The email address to send to.
Example: "Admin@example.com"
.PARAMETER SMTPUsername
As string. The Username for the SMTP service you will be using.
.PARAMETER SMTPPassword
As string. The Password in plain text for the smtp service you will be using.
.PARAMETER SMTPServer
As string. The server name of the SMTP service you will be using.
.PARAMETER SMTPPort
As string. The Server Port for the SMTP service you will be using.
.PARAMETER Subject
As string. The subject line of the email.
.PARAMETER Body
As string. The body of the email as a string. Body takes precedence over BodyAsHtml.
.PARAMETER BodyAsHTML
As string. The body of the email in html format.
.PARAMETER PlainText
Sends the email in plain text and not ssl.
.PARAMETER Attachment
As array of strings. A list of full file names for attachments. Example: "c:\temp\log.log"
.PARAMETER CC
Email address for a carbon copy to this email.
.PARAMETER BCC
Email address for a blind carbon copy of this email.
.PARAMETER Priority
A validate set for High, Normal, Low. By default it will send emails out with Normal.
.EXAMPLE
$Message = @{
from = "HR <HumanResources@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "Test"
Attachment = "C:\temp\JobOffer1.pdf","C:\temp\JobOffer2.pdf"
BodyAsHtml = @"
<html>
<body>
<center><h1>Congratulations</h1></center>
<hr>
<p>Attached are the job offers we discussed on the phone.</p>
<br>
Thank you,<br><br>
Human Resources
</body>
</html>
"@
}
Send-SHDMailMessage @Message
Sends an email using the required information with two attachments.
.EXAMPLE
$Message = @{
from = "HR <HumanResources@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "Test"
BodyAsHtml = @"
<html>
<body>
<center><h1>Sorry, Not Sorry</h1></center>
<hr>
<p>Sorry you didn't get the job. Maybe next time show up with clothing on.</p>
<br>
Thank you,<br><br>
Human Resources
</body>
</html>
"@
}
Send-SHDMailMessage @Message
This will send out an email without any attachments.
.EXAMPLE
$Message = @{
from = "HR <HumanResources@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "Test"
Body = "Your Hired"
}
Send-SHDMailMessage @Message
Sends out a message using just a simple text in the body.
.EXAMPLE
$Message = @{
from = "HR <HumanResources@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "Test"
Attachment = "C:\temp\JobOffer1.pdf","C:\temp\JobOffer2.pdf"
}
Send-SHDMailMessage @Message
This will send out an email that is blank with attached items.
.EXAMPLE
$Message = @{
from = "Notify <Notify@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "$SomethingWrong"
PlainText = $true
}
Send-SHDMailMessage @Message
This will send out an unsecured email in plain text.
.EXAMPLE
$Message = @{
from = "Notify <Notify@example.com>"
To = "IT@example.com"
SMTPUsername = "SMTPUser"
SMTPPassword = "SMTPPassword"
SMTPServer = "mail.example.com"
SMTPPort = "2525"
Subject = "Server Down"
CC = "ITManagers@Example.com"
BCC = "CFO@Example.com"
PlainText = $True
Priority = "High"
BodyAsHTML = @"
<html>
<body>
<center><h1>SERVER DOWN!</h1></center>
</body>
</html>
"@
}
Send-SHDMailMessage @Message
.OUTPUTS
no Output.
.NOTES
Author: David Bolding
Date: 09/8/2021
.LINK
#>
[cmdletbinding()]
param (
[parameter(Mandatory = $true)][String]$From,
[parameter(Mandatory = $true)][String]$To,
[parameter(Mandatory = $true)][String]$SMTPUsername,
[parameter(Mandatory = $true)][String]$SMTPPassword,
[parameter(Mandatory = $true)][String]$SMTPServer,
[parameter(Mandatory = $true)][String]$SMTPPort,
[parameter(Mandatory = $true)][String]$Subject,
[Switch]$PlainText,
[string]$Body,
[String]$BodyAsHTML,
[String[]]$Attachment,
[string]$CC,
[string]$BCC,
[Validateset("High","Low","Normal")][String]$Priority
)
# Server Info
$SmtpServer = $SmtpServer
$SmtpPort = $SmtpPort
# Creates the message object
$Message = New-Object System.Net.Mail.MailMessage $From, $To
If ($PSBoundParameters.ContainsKey("CC")) {
$Message.CC.Add($CC)
}
If ($PSBoundParameters.ContainsKey("BCC")) {
$Message.Bcc.Add($BCC)
}
If ($PSBoundParameters.ContainsKey("Priority")) {
$Message.Priority = $Priority
} else {
$Message.Priority = "Normal"
}
# Builds the message parts
$Message.Subject = $Subject
if ($PSBoundParameters.ContainsKey("Body")) {
$Message.IsBodyHTML = $false
$Message.Body = $Body
}
elseif ($PSBoundParameters.ContainsKey("BodyAsHTML")) {
$Message.IsBodyHTML = $true
$Message.Body = $BodyAsHTML
}
else {
$Message.IsBodyHTML = $false
$Message.Body = ""
}
if ($PSBoundParameters.ContainsKey('Attachment')) {
foreach ($attach in $Attachment) {
$message.Attachments.Add("$Attach")
}
}
# Construct the SMTP client object, credentials, and send
$Smtp = New-Object Net.Mail.SmtpClient($SmtpServer, $SmtpPort)
if ($PlainText) {
$Smtp.EnableSsl = $false
}
else {
$Smtp.EnableSsl = $true
}
$Smtp.Credentials = New-Object System.Net.NetworkCredential($SMTPUsername, $SMTPPassword)
$Smtp.Send($Message)
#Closes the message object and the smtp object.
$message.Dispose()
$Smtp.Dispose()
}
Examples
$Message = @{
from = "HR <HumanResources@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "Job Offers"
Attachment = "C:\temp\JobOffer1.pdf","C:\temp\JobOffer2.pdf"
BodyAsHtml = @"
<html>
<body>
<center><h1>Congratulations</h1></center>
<hr>
<p>Attached are the job offers we discussed on the phone.</p>
<br>
Thank you,<br><br>
Human Resources
</body>
</html>
"@
}
Send-SHDMailMessage @Message
In this example, I am using the SMTP2Go service to send a job offer letter to a new employee. It contains the Attachment flag with two offers; each attachment is separated with a comma since this is a list of strings. The body is HTML, supplied through the BodyAsHTML flag, with some custom formatting to make it look somewhat nice.
$Message = @{
from = "HR <HumanResources@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "Thank you"
BodyAsHtml = @"
<html>
<body>
<center><h1>Sorry, Not Sorry</h1></center>
<hr>
<p>Sorry you didn't get the job. Maybe next time show up with clothing on.</p>
<br>
Thank you,<br><br>
Human Resources
</body>
</html>
"@
}
Send-SHDMailMessage @Message
In this example, we are once again using SMTP2Go, this time to send a rejection letter. No attachments are present on this email. The BodyAsHTML string is set with a nice custom HTML page.
$Message = @{
from = "Notify <Notify@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "Server Down"
Body = "XYZ Server Is Down"
}
Send-SHDMailMessage @Message
In this example, we are sending out a notification email using a different sender than before. We are still using SMTP2Go, but you can use any server with whatever username and password you like. The body is a basic string with no HTML formatting.
$Message = @{
from = "Notify <Notify@Example.com>"
To = "Somebody@Example.com"
SMTPUsername = "SMTP2GoUsername"
SMTPPassword = "SMTP2GoPassword"
SMTPServer = "mail.smtp2go.com"
SMTPPort = "2525"
Subject = "$SomethingWrong"
PlainText = $true
}
Send-SHDMailMessage @Message
In this example, we are sending a notification email without a body. We are using a custom variable for the subject line. We are also sending it without SSL encryption, since some legacy systems don't understand SSL.
$Message = @{
from = "Notify <Notify@example.com>"
To = "IT@example.com"
SMTPUsername = "SMTPUser"
SMTPPassword = "SMTPPassword"
SMTPServer = "mail.example.com"
SMTPPort = "2525"
Subject = "Server Down"
CC = "ITManagers@Example.com"
BCC = "CFO@Example.com"
Priority = "High"
BodyAsHTML = @"
<html>
<body>
<center><h1>SERVER DOWN!</h1></center>
</body>
</html>
"@
}
Send-SHDMailMessage @Message
In this example, we are sending a carbon copy to the IT managers and a blind carbon copy to the CFO, with high priority.
Notes
- The Body and BodyAsHTML parameters can conflict with each other. If you pass both, the Body wins by default. If you pass neither, the email is sent with an empty body of "".
- This script requires you to have an SMTP service like SMTP2Go. See the sketch after this list for one way to keep the credentials out of your scripts.
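Here is one hedged way to keep the SMTP credentials out of the script itself: read them from environment variables at call time. SHD_SMTP_USER and SHD_SMTP_PASS are made-up names; set them however your environment manages secrets.
$Message = @{
from = "Notify <Notify@example.com>"
To = "IT@example.com"
SMTPUsername = $env:SHD_SMTP_USER
SMTPPassword = $env:SHD_SMTP_PASS
SMTPServer = "mail.example.com"
SMTPPort = "2525"
Subject = "Nightly report"
Body = "The nightly job completed."
}
Send-SHDMailMessage @Message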
Conclusion
And that, my friends, is how you send mail with PowerShell.