by David | Oct 28, 2022 | Deployments, Information Technology, PowerShell
There have been a few times where I needed to enable Remote Desktop Protocol on remote computers. So, I built a simple but powerful tool to help with just this. It uses the Invoke-Command cmdlet to enable RDP. So, let's dig in.
The Script – Enable RDP on a Remote Computer
function Enable-SHDComputerRDP {
<#
.SYNOPSIS
Enables target computer's RDP
.DESCRIPTION
Enables target computer's RDP
.PARAMETER Computername
[String[]] - Target Computers you wish to enable RDP on.
.PARAMETER Credential
Optional credential parameter that allows you to run the command as another user.
.EXAMPLE
Enable-SHDComputerRDP -computername <computer1>,<computer2> -Credential (Get-credential)
Enables RDP on computer1 and on computer 2 using the supplied credentials.
.EXAMPLE
Enable-SHDComputerRDP -computername <computer1>,<computer2>
Enables RDP on computer1 and on computer 2 using the current credentials.
.OUTPUTS
[None]
.NOTES
Author: David Bolding
.LINK
https://therandomadmin.com
#>
[cmdletbinding()]
param (
[Parameter(
ValueFromPipeline = $True,
ValueFromPipelineByPropertyName = $True,
HelpMessage = "Provide the target hostname",
Mandatory = $true)][Alias('Hostname', 'cn')][String[]]$Computername,
[Parameter(HelpMessage = "Allows for custom Credential.")][System.Management.Automation.PSCredential]$Credential
)
$parameters = @{
ComputerName = $ComputerName
ScriptBlock = {
Enable-NetFirewallRule -DisplayGroup 'Remote Desktop'
Set-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\' -Name "fDenyTSConnections" -Value 0
Set-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp\' -Name "UserAuthentication" -Value 1
}
}
if ($PSBoundParameters.ContainsKey('Credential')) { $parameters += @{Credential = $Credential } }
Invoke-Command @parameters
}
The breakdown
Comments/Documentation
The first part of this tool is the comment-based help. Here is where you give an overview, description, parameters, examples, and more. Running Get-Help against the function produces the information above. On a personal level, I like adding the type inside the parameter descriptions. I also like putting the author and date inside the Notes field.
Parameters
We are using two parameters: a ComputerName parameter and a Credential parameter. The ComputerName parameter carries a few parameter attributes. The first is ValueFromPipeline, which allows us to pipe data to the function. The next is ValueFromPipelineByPropertyName, which binds an incoming object's ComputerName property; thus we can pull from an Excel export or a list of computer names. Next, we have the HelpMessage, which is just what it sounds like: a small help message that can be useful to the end user. Finally, we have the Mandatory flag. As this command depends on that input, we make it mandatory. The next item on ComputerName is the Alias attribute, which lets us use other names for the parameter; in this example, Hostname or CN. This is just a little something that helps the end user. Finally, we have the type. It is a list of strings, which means we can target more than one computer at a time.
The next parameter is the Credential parameter. This one is unique. The only attribute we have here is the HelpMessage. The type is a little different: System.Management.Automation.PSCredential. And yes, it's complex. The simple rundown is: pass the output of Get-Credential here. This function is designed to be automated with this feature. If you are using a domain admin account, you may not need it. However, if you are working on computers in a different domain and don't have rights, you can use this parameter.
param (
[Parameter(
ValueFromPipeline = $True,
ValueFromPipelineByPropertyName = $True,
HelpMessage = "Provide the target hostname",
Mandatory = $true)][Alias('Hostname', 'cn')][String[]]$Computername,
[Parameter(HelpMessage = "Allows for custom Credential.")][System.Management.Automation.PSCredential]$Credential
)
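To illustrate the pipeline support described above, here are two hypothetical calls; the file names are placeholders, and the targets must be reachable with admin rights:

```powershell
# Pipe a plain list of hostnames (one per line) to the function.
Get-Content -Path .\computers.txt | Enable-SHDComputerRDP

# Or pipe objects that carry a ComputerName property, such as a CSV export;
# these bind through ValueFromPipelineByPropertyName.
Import-Csv -Path .\computers.csv | Enable-SHDComputerRDP -Credential (Get-Credential)
```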
The Script Block
Now we need to create the script block that will be used inside Invoke-Command. We are going to build out a splat. We build splats with @{}. A splat is a hashtable of parameter names and values that gets handed to a command in one go, so each key we add must match a parameter on that command. Here we are adding the ComputerName and ScriptBlock keys. The ComputerName parameter of Invoke-Command is a list of strings, so we can drop our Computername straight into ComputerName. Yeah, that's not confusing. The script block is where the action is.
$parameters = @{
ComputerName = $ComputerName
ScriptBlock = {
#Do something
}
}
The first thing we need to do inside the script block is enable the Remote Desktop firewall rules. This allows Remote Desktop traffic through the Windows Firewall.
Enable-NetFirewallRule -DisplayGroup 'Remote Desktop'
Next, we need to set the registry keys. The first key, fDenyTSConnections, is set to 0 so Terminal Services connections are no longer denied. Then we set the UserAuthentication key to 1 to require Network Level Authentication.
Set-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\' -Name "fDenyTSConnections" -Value 0
Set-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Terminal Server\WinStations\RDP-Tcp\' -Name "UserAuthentication" -Value 1
Adding Credentials
Now that we have created the parameters, it's time to add credentials when needed. We do this by asking whether the Credential parameter was supplied. This is done through the $PSBoundParameters variable: we call its ContainsKey method to see if Credential is set. If it is, we add the credentials to the splat.
if ($PSBoundParameters.ContainsKey('Credential')) { $parameters += @{Credential = $Credential } }
Finally, we invoke the command. Note that we splat with @parameters instead of passing $parameters as a single argument.
Invoke-Command @parameters
And that’s how you can quickly Enable RDP on a Remote Computer using PowerShell within your domain.
by David | Oct 21, 2022 | Help Desk, Information Technology, PowerShell
During my in-house days, one of the things I had to do constantly was clear people's print jobs. So I learned to clear print jobs with PowerShell to make my life easier. It surely did. With PowerShell I could remotely clear print jobs, as most of my machines were on the primary domain. All you need to know is the server and the printer's name. The server could be the computer in question if it's a local printer, or it could be the print server; wherever the queue is being held.
The Script
function Clear-SHDPrintJobs {
[cmdletbinding()]
param (
[parameter(HelpMessage = "Target Printer", Mandatory = $true)][alias('Name', 'Printername')][String[]]$name,
[parameter(HelpMessage = "Computer with the printer attached.", Mandatory = $true)][alias('Computername', 'Computer')][string[]]$PrintServer
)
foreach ($Print in $PrintServer) {
foreach ($N in $name) {
$Printers = Get-Printer -ComputerName $Print -Name "*$N*"
Foreach ($Printer in $Printers) {
$Printer | Get-PrintJob | Remove-PrintJob
}
}
}
}
The Breakdown
It's time to break down this code. The parameters are set to mandatory because the function needs this information. Notice that both parameters are lists of strings. This means you can have the same printer name on multiple print servers and still trigger this command once.
Parameters
We are using two parameters. The first is the name of the printer. This parameter is mandatory. We are also using the Alias attribute here; it gives alternate names that make the command easier to work with. The options are Name and Printername. It is also a list of strings, so we can import an array if need be. I'll go over why that's important later. Finally, we have a help message, which can help the user figure out what they need to provide.
[parameter(HelpMessage = "Target Printer", Mandatory = $true)][alias('Name', 'Printername')][String[]]$name
The next parameter is like the first. It is a Mandatory parameter with alias options. The options are “ComputerName” and “Computer”. I set the name as “PrintServer” because I dealt with print servers most of the time. Once again we have a list of strings for multiple machines.
Foreach Loops
Next, we look at our function. This is where we clear print jobs with PowerShell. There are three loops in total. The first foreach loop cycles through the print server list, so for each print server, we enter the next loop, which cycles through the printer names.
foreach ($Print in $PrintServer) {
foreach ($N in $name) {
#Do something
}
}
Inside the name loop, we use the Get-Printer command. We target the print server and ask for any printer whose name contains the string we requested. Thus, if you pass a *, you can clear out all the print jobs on that device. This is a very powerful option.
$Printers = Get-Printer -ComputerName $Print -Name "*$N*"
After gathering the printers from the server, we start another loop, one iteration for each printer we found. We pipe each printer to Get-PrintJob, which lists its print jobs, then pipe that output into Remove-PrintJob, which clears them.
$Printer | Get-PrintJob | Remove-PrintJob
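Putting it together, here are a couple of hypothetical calls; the server and printer names are placeholders for your environment:

```powershell
# Clear every job on a single named queue held on a print server.
Clear-SHDPrintJobs -Name 'FrontOffice-HP' -PrintServer 'PrintSrv01'

# Wildcard matching: clear all queues on two servers at once.
Clear-SHDPrintJobs -Name '*' -PrintServer 'PrintSrv01', 'PrintSrv02'
```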
That’s it for this function. It’s a great little tool that will change how you clear print jobs of buggy print systems.
Conclusion
In conclusion, I have used this function a few hundred times in my day. It was built for a domain environment; it does not work for cloud print options.
by David | Oct 14, 2022 | Information Technology, PowerShell
Anyone who has been in IT long enough has performed a ping test: a simple ping of an IP address. Most of us have used "Ping 8.8.8.8 >> c:\pingtest.txt" to see how many times something failed. But did you know you can do the same thing with PowerShell? It's also much cleaner and easier to understand. Unlike plain ping, we can see when an outage was, not just that there was one. This makes finding the matching logs easier.
The Script – Ping Test with PowerShell
function Test-Ping {
<#
.SYNOPSIS
Runs a repeated ping test and records the results.
.DESCRIPTION
Runs a repeated ping test for a set amount of time, optionally appending each result to a CSV file.
.PARAMETER SecondsToRun
[Int] How long the script will run. This will add the number of seconds to the current time. Then we will loop based off that input. Default is 24 hours
.PARAMETER Outfile
[string] The full path of where the CSV will go.
.PARAMETER SecondsToSleep
[int] How many second will be between each test. Default is 1
.PARAMETER IPaddress
[ipaddress] The IP address that we want to test. Default is 8.8.8.8
.PARAMETER Show
[switch]Displays the output of the ping.
.EXAMPLE
Test-Ping -SecondsToSleep 1 -IPaddress 8.8.4.4 -SecondsToRun 24 -Outfile c:\temp\google1.csv -show
Pings 8.8.4.4 for 24 seconds at a one second interval. It appends the ping log to c:\temp\google1.csv and shows the output
DateTime IPaddress Latency Status
-------- --------- ------- ------
2022-09-27_08_47_01 8.8.4.4 21 Success
2022-09-27_08_47_02 8.8.4.4 11 Success
2022-09-27_08_47_03 8.8.4.4 10 Success
2022-09-27_08_47_04 8.8.4.4 10 Success
2022-09-27_08_47_05 8.8.4.4 10 Success
2022-09-27_08_47_06 8.8.4.4 10 Success
2022-09-27_08_47_07 8.8.4.4 10 Success
.EXAMPLE
Test-Ping -SecondsToSleep 1 -IPaddress 8.8.4.4 -SecondsToRun 7
DateTime IPaddress Latency Status
-------- --------- ------- ------
2022-09-27_08_48_56 8.8.4.4 11 Success
2022-09-27_08_48_57 8.8.4.4 10 Success
2022-09-27_08_48_58 8.8.4.4 10 Success
2022-09-27_08_48_59 8.8.4.4 12 Success
2022-09-27_08_49_00 8.8.4.4 10 Success
2022-09-27_08_49_01 8.8.4.4 10 Success
2022-09-27_08_49_02 8.8.4.4 11 Success
If you choose not to request any output, the function displays the results so there is always an output.
.OUTPUTS
PowerShell Custom Object
.NOTES
Author: David Bolding
Date: 09/27/2022
.LINK
https://therandomadmin.com
#>
[cmdletbinding()]
param (
[int]$SecondsToRun,
[string]$Outfile,
[int]$SecondsToSleep,
[ipaddress]$IPaddress,
[switch]$Show
)
#Determines if SecondsToRun was declared.
#If it wasn't, add 24 hours. If it was, add those seconds.
if ($PSBoundParameters.ContainsKey('SecondsToRun')) {
$Time = (Get-date).AddSeconds($SecondsToRun)
}
else {
$Time = (Get-date).AddSeconds(86400)
}
#While current time is less than or equal to the added seconds.
while ((get-date) -le $time) {
#if IPaddress exists, we ping that ip address with test-connection with a count 1
#if not, we ping google.
if ($PSBoundParameters.ContainsKey('IPaddress')) {
$Test = Test-Connection $IPaddress -count 1
}
else {
$Test = Test-Connection 8.8.8.8 -count 1
}
#Grabs current datetime in a readable string
$DateTime = (Get-date).ToString("yyyy-MM-dd_hh_mm_ss")
#Creates object
$Results = [pscustomobject][ordered]@{
DateTime = $DateTime
IPaddress = $test.address.IPAddressToString
Latency = $test.Latency
Status = $test.Status
}
#If outfile exists, drop the csv there and append
#if not, display
if ($PSBoundParameters.ContainsKey('Outfile')) {
if ($Show) {
$Results | Export-Csv -Path $Outfile -Append -NoClobber -NoTypeInformation
$Results
}
else {
$Results | Export-Csv -Path $Outfile -Append -NoClobber -NoTypeInformation
}
}
else {
#If no output is selected, we force an output with the else.
$Results
}
#if secondstosleep exists, we sleep by that many, if not, we sleep for 1 second.
if ($PSBoundParameters.ContainsKey('SecondsToSleep')) {
start-sleep -Seconds $SecondsToSleep
}
else {
Start-Sleep -Seconds 1
}
}
}
The Breakdown
This function is designed to be quick. The idea is to quickly type Test-Ping and off to the races we go. You can set each parameter accordingly. We are going to work with 5 parameters. Firstly, "SecondsToRun" is how long you want this script to run. Next, we have the "Outfile" parameter, a full file path string; for example, c:\temp\pingtest.csv. Afterward, we have "SecondsToSleep", which is how long to wait between tests. Next, we take an IP address using "IPaddress". Finally, we have the Show switch: if you select the CSV output but also want to see the results on screen, you can add the -Show flag.
if ($PSBoundParameters.ContainsKey('SecondsToRun')) {
$Time = (Get-date).AddSeconds($SecondsToRun)
} else {
$Time = (Get-date).AddSeconds(86400)
}
The first part of the script uses $PSBoundParameters. This little guy lets you review what the end user passed into the function. In this case, we are looking for the key SecondsToRun. If we have this key, we take the time that many seconds from now and place it into a variable. If the key isn't present, we create a $Time variable 24 hours from now. Another way to go about this is to default the parameter's value in the param block.
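The default-value alternative mentioned above would look something like this sketch, which removes the need for the ContainsKey check entirely:

```powershell
param (
    # Defaulting in the param block: 86400 seconds (24 hours) unless the caller overrides it.
    [int]$SecondsToRun = 86400
)
# No branching needed; the parameter always holds a usable value.
$Time = (Get-Date).AddSeconds($SecondsToRun)
```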
The Loop
while ((get-date) -le $time) { #Do Something }
Afterward, we enter the loop. The idea behind the loop is that the current time must be less than or equal to the $time variable we made in the last step.
if ($PSBoundParameters.ContainsKey('IPaddress')) {
$Test = Test-Connection $IPaddress -count 1
} else {
$Test = Test-Connection 8.8.8.8 -count 1
}
Like before, we use $PSBoundParameters, this time looking for IPaddress. If an IP address was stated, we run the Test-Connection command against it; if not, we use Google's 8.8.8.8. As before, you could default the parameter instead to achieve the same result. It depends on your goals.
#Grabs current datetime in a readable string
$DateTime = (Get-date).ToString("yyyy-MM-dd_hh_mm_ss")
#Creates object
$Results = [pscustomobject][ordered]@{
DateTime = $DateTime
IPaddress = $test.address.IPAddressToString
Latency = $test.Latency
Status = $test.Status
}
Next, we grab the current date and create our PowerShell custom object. This is where it differs from plain ping: useful data. We get a date and time, the latency, and the status. As this is a PowerShell custom object, we can export this data, sort it, and more.
#If outfile exists, drop the csv there and append
#if not, display
if ($PSBoundParameters.ContainsKey('Outfile')) {
if ($Show) {
$Results | Export-Csv -Path $Outfile -Append -NoClobber -NoTypeInformation
$Results
} else {
$Results | Export-Csv -Path $Outfile -Append -NoClobber -NoTypeInformation
}
} else {
#If no output is selected, we force an output with the else.
$Results
}
What we do here is check whether the user asked to export the data. If so, we dump it to the CSV; if the -Show switch is also set, we display it at the same time. Notice the -Append flag on Export-Csv. This means we add to the file instead of rebuilding it; thus, a log. Now, if you don't flag anything, we display the results anyway. This way we never return null.
#if secondstosleep exists, we sleep by that many, if not, we sleep for 1 second.
if ($PSBoundParameters.ContainsKey('SecondsToSleep')) {
Start-sleep -Seconds $SecondsToSleep
} else {
Start-Sleep -Seconds 1
}
Finally, we sleep. This is our interval. Like before, if the parameter doesn't exist, we use 1 second; if it does, we use that input. This could also be done with the default-value method.
Conclusion
In conclusion, this is designed to be part of your toolbox. So, if you want to ping your DNS server for 24 hours and see if there is anything wrong you can log it outside of the shell. I would suggest this function be in the profile of every machine on your network as it can save so much time with troubleshooting. It’s easy doing a Ping Test with PowerShell.
by David | Oct 7, 2022 | Deployments, Information Technology, PowerShell
When building out scripts, we must consider the different ways they will fail. One of the ways I have seen them fail is UAC: the script needs to be run by an administrator. The question is, how do you check that you are running as an administrator? Here are the two ways I like doing this check.
The Comment Requires it
PowerShell has a handy little feature called #Requires. The idea is simple: you place a #Requires statement at the top of your script. I suggest looking at the official documentation because there is a lot you can do with it. As of PowerShell 4.0, #Requires -RunAsAdministrator is a thing. Having this requirement at the start makes the script fail immediately when the session is not elevated.
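For example, a script that must run elevated only needs the directive on its first line; PowerShell refuses to run the script at all from a non-elevated session:

```powershell
#Requires -RunAsAdministrator

# Everything below only executes in an elevated (Run as administrator) session;
# otherwise PowerShell throws a ScriptRequiresException before any code runs.
Write-Output "Running elevated."
```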
PowerShell Checks
The next method is using PowerShell to check whether the current shell is elevated through the Windows security principals. This method only uses two lines of code to produce a true or false value, so it's best kept inside a function for later use.
function Test-Administrator {
$user = [Security.Principal.WindowsIdentity]::GetCurrent();
(New-Object Security.Principal.WindowsPrincipal $user).IsInRole([Security.Principal.WindowsBuiltinRole]::Administrator)
}
The first part grabs the identity of the current user of the terminal. We then wrap that identity in a new WindowsPrincipal object, which lets us check whether the user is in the built-in Administrator role.
There we have it, how to test if a script is running as admin.
Taking it an additional step forward
Let's take this script to the next level by adding a restart in admin mode. The following code can be used to restart any terminal session in admin mode. However, it breaks in VS Code.
$CurrentProcess = [System.Diagnostics.Process]::GetCurrentProcess()
$CurrentProcessID = New-Object System.Diagnostics.ProcessStartInfo $CurrentProcess.Path
$CurrentProcessID.Arguments = '-file ' + $script:MyInvocation.MyCommand.Path
$CurrentProcessID.Verb = "runas"
[System.Diagnostics.Process]::Start($CurrentProcessID) | Out-Null
[Environment]::Exit(0)
The first part of this script grabs the current process information. Then we pass the process path into a new ProcessStartInfo object. Next, we set the arguments to run the current script file and set the verb to runas, which triggers a run-as-administrator prompt. Then we start the process we built and exit the current one. Afterward, the script runs as admin.
The Script – Run as an Administrator
function Test-Administrator {
$user = [Security.Principal.WindowsIdentity]::GetCurrent();
(New-Object Security.Principal.WindowsPrincipal $user).IsInRole([Security.Principal.WindowsBuiltinRole]::Administrator)
}
function Invoke-RunAsAdministrator {
[cmdletbinding()]
param (
[parameter(Mandatory = $true)][boolean]$Admin
)
if (!$Admin) {
$CurrentProcess = [System.Diagnostics.Process]::GetCurrentProcess()
$CurrentProcessID = New-Object System.Diagnostics.ProcessStartInfo $CurrentProcess.Path
$CurrentProcessID.Arguments = '-file ' + $script:MyInvocation.MyCommand.Path
$CurrentProcessID.Verb = "runas"
[System.Diagnostics.Process]::Start($CurrentProcessID) | Out-Null
[Environment]::Exit(0)
}
else {
Write-Verbose "Admin Rights Present"
}
}
Invoke-RunAsAdministrator -Admin (Test-Administrator)
Read-Host "Press any key to continue"
Now to make this more practical: add the above code to the Citrix Workspace installer script. Then wrap the script up into an EXE with a PS1-to-EXE program. Finally, add the little program to your toolbox for future use.
by David | Sep 30, 2022 | Information Technology, PowerShell
The other day I needed to test whether a registry key value was present on an end user's computer and create it if it didn't exist. I performed a registry key value test with PowerShell. Since I was doing more than one, I pulled an older tool from my toolbox for this one. It's small but easy to use.
The Script
function Test-RegistryValue {
param (
[parameter(Mandatory = $true)][ValidateNotNullOrEmpty()][string]$Path,
[parameter(Mandatory = $true)][ValidateNotNullOrEmpty()][string]$Value,
[switch]$ShowValue
)
try {
$Values = Get-ItemProperty -Path $Path | select-object -ExpandProperty $Value -ErrorAction Stop
if ($ShowValue) {
$Values
}
else {
$true
}
}
catch {
$false
}
}
The Breakdown
This script is only a try/catch with some added inputs. We first grab what we want to test in our parameters: two mandatory strings and a switch. The first string is the path we are going to test. The second is the value name we want to test. For example, if we want to see whether Google Earth has a version number, we would give the path HKLM:\Software\Google\Google Earth Pro and the value Version. If we want to see that version, we flip the only switch, -ShowValue. Instead of returning true or false, the function then returns the value itself.
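Using the Google Earth example from above, hypothetical calls would look like this (the key must actually exist on the machine for $true or a version string to come back):

```powershell
# Returns $true if the Version value exists under the key, otherwise $false.
Test-RegistryValue -Path 'HKLM:\Software\Google\Google Earth Pro' -Value 'Version'

# Returns the stored version string itself instead of $true.
Test-RegistryValue -Path 'HKLM:\Software\Google\Google Earth Pro' -Value 'Version' -ShowValue
```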
Try Catch
$Values = Get-ItemProperty -Path $Path | select-object -ExpandProperty $Value -ErrorAction Stop
Inside our try block, we use the Get-ItemProperty command and expand the $Value property. The -ErrorAction Stop flag turns any failure into a terminating error so the catch block can handle it. The result goes into the $Values variable.
try {
$Values = Get-ItemProperty -Path $Path | select-object -ExpandProperty $Value -ErrorAction Stop
if ($ShowValue) {
$Values
} else {
$true
}
} catch {
$false
}
Afterward, we use a basic if statement. We show the value when the -ShowValue flag is set; if it's not, we just return $true. Finally, the catch returns $false if the Get-ItemProperty command failed for any reason.
Conclusion
There are other ways to do this, but this was the quickest I have found. I added the -ShowValue switch recently because I needed it while troubleshooting code. Overall, this little guy is perfect to add to any script that deals with registry changes.
by David | Sep 23, 2022 | Azure, Exchange, Information Technology, PowerShell, Resources
While reading Reddit, I found a common thread: people need a quick way to do a SharePoint file audit. I have a PowerShell function for this in my toolbox. It leans heavily on the Search-UnifiedAuditLog cmdlet. The most common items I tend to audit are file modifications and deletions. This function covers deleted, modified, moved, renamed, downloaded, uploaded, accessed, synced, malware detected, restored from trash, locked, and finally unlocked. Search-UnifiedAuditLog is an Exchange Online command at the time of this writing, so you need to connect to Exchange Online. In this function, I am using the switch statement, and I will follow that structure for the breakdown. Let's first jump in with the function.
The Function
function Invoke-SharePointFileAudit {
[cmdletbinding()]
param (
[Parameter(Mandatory = $true)][validateset("Deleted", "Modified", "Moved", "Renamed", "Downloaded", "Uploaded", "Synced", "Accessed", "MalwareDetected", "Restored", "Locked", "unLocked")][string]$Type,
[parameter(Mandatory = $false)][switch]$KeepAlive,
[switch]$SharePointOnline,
[switch]$OneDrive,
[Nullable[DateTime]]$StartDate,
[Nullable[DateTime]]$EndDate,
[string]$Outfile,
[int]$ResultSize = 5000
)
Begin {
$Module = Get-Module ExchangeOnlineManagement -ListAvailable
if ($Module.count -eq 0) {Install-Module ExchangeOnlineManagement -Repository PSGallery -AllowClobber -Force}
$getsessions = Get-PSSession | Select-Object -Property State, Name
$isconnected = (@($getsessions) -like '@{State=Opened; Name=ExchangeOnlineInternalSession*').Count -gt 0
If (-not $isconnected) {
try {
Connect-ExchangeOnline
}
catch {
Write-Error "Exchange Online Failed. Ending"
return
}
}
#Auto Generates Start and Finish dates
if ($Null -eq $StartDate) { $StartDate = ((Get-Date).AddDays(-89)).Date }
if ($Null -eq $EndDate) { $EndDate = (Get-Date).Date }
#Tests if end date is before start date.
if ($EndDate -lt $StartDate) { $StartDate = ((Get-Date).AddDays(-89)).Date }
if ($EndDate -gt (Get-Date).Date) { $EndDate = (Get-Date).Date }
}
Process {
switch ($Type) {
"Deleted" {
$DeletedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileDeleted,FileDeletedFirstStageRecycleBin,FileDeletedSecondStageRecycleBin,FileVersionsAllDeleted,FileRecycled" -SessionId deleted -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($DeletedRecord in $DeletedRecords) {
$JSONInfo = $DeletedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $DeletedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Modified" {
$ModifiedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileModified,FileModifiedExtended" -SessionId Modified -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($ModifiedRecord in $ModifiedRecords) {
$JSONInfo = $ModifiedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $ModifiedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Moved" {
$MovedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileMoved" -SessionId Moved -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($MovedRecord in $MovedRecords) {
$JSONInfo = $MovedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $MovedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
DestinationRelativeURL = $JSONInfo.DestinationRelativeURL
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Renamed" {
$RenamedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileRenamed" -SessionId Renamed -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($RenamedRecord in $RenamedRecords) {
$JSONInfo = $RenamedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $RenamedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
DestinationFileName = $JSONInfo.DestinationFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Downloaded" {
$DownloadedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileDownloaded" -SessionId Downloaded -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($DownloadedRecord in $DownloadedRecords) {
$JSONInfo = $DownloadedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $DownloadedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Uploaded" {
$UploadedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileUploaded" -SessionId Uploaded -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($UploadedRecord in $UploadedRecords) {
$JSONInfo = $UploadedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $UploadedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Synced" {
$SyncedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileSyncDownloadedFull,FileSyncUploadedFull" -SessionId Synced -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($SyncedRecord in $SyncedRecords) {
$JSONInfo = $SyncedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $SyncedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Accessed" {
$AccessedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileAccessed,FileAccessedExtended" -SessionId Accessed -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($AccessedRecord in $AccessedRecords) {
$JSONInfo = $AccessedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $AccessedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
SourceRelativeURL = $JSONInfo.SourceRelativeUrl
SourceFileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"MalwareDetected" {
$MalwareRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileMalwareDetected" -SessionId MalwareRecords -SessionCommand ReturnLargeSet -ResultSize $ResultSize
$Return = foreach ($MalwareRecord in $MalwareRecords) {
$JSONInfo = $MalwareRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $MalwareRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Restored" {
$RestoredRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileRestored" -SessionId RestoredRecords -SessionCommand ReturnLargeSet -ResultSize 5000
$Return = foreach ($RestoredRecord in $RestoredRecords) {
$JSONInfo = $RestoredRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $RestoredRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"Locked" {
$LockedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "LockRecord" -SessionId Locked -SessionCommand ReturnLargeSet -ResultSize 5000
$Return = foreach ($LockedRecord in $LockedRecords) {
$JSONInfo = $LockedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $LockedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
"UnLocked" {
$unLockedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "UnlockRecord" -SessionId UnlockRecord -SessionCommand ReturnLargeSet -ResultSize 5000
$Return = foreach ($unLockedRecord in $unLockedRecords) {
$JSONInfo = $unLockedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $unLockedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
}
}
}
end {
if ((-not $SharePointOnline -and -not $OneDrive) -or ($SharePointOnline -and $OneDrive)) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | Export-Csv "./$OutFile.csv"
}
else {
$Return
}
}
elseif ($SharePointOnline) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | where-object { $_.workload -like "SharePoint" } | Export-Csv "./$OutFile.csv"
}
else {
$Return | where-object { $_.workload -like "SharePoint" }
}
}
elseif ($OneDrive) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | where-object { $_.workload -like "OneDrive" } | Export-Csv "./$OutFile.csv"
}
else {
$Return | where-object { $_.workload -like "OneDrive" }
}
}
if (!($KeepAlive)) {
Disconnect-ExchangeOnline -Confirm:$false -InformationAction Ignore -ErrorAction SilentlyContinue
}
}
}
The Breakdown of the SharePoint File Audit
I’m glad you came to the breakdown. It means you want to know how the code works, and that you truly care about learning. Thank you. This code repeats the same pattern several times with small variations, so I will call out the differences but won’t re-explain the parts that stay the same. The first section is our Parameters.
Parameters
We have 8 parameters, and only one of them is mandatory. First, we have the Type parameter. This mandatory parameter uses a validate set, which lets you pick which kind of audit event the function will search for:
- Deleted
- Modified
- Created
- Moved
- Renamed
- Downloaded
- Uploaded
- Synced
- Accessed
- MalwareDetected
- Restored
- Locked
- UnLocked
Afterward, we have KeepAlive. This switch lets us run the command multiple times without signing back into Exchange Online, so if you want to keep your session alive, flip that flag. Next, we have two more switches: SharePointOnline, which pulls only items touched in SharePoint itself, and OneDrive, which does the same for OneDrive. After that, we have StartDate and EndDate. These values are nullable, so you don’t need to supply them. OutFile asks for just the name of the file; the script prepends “./” so the report is saved wherever you run the command from. Finally, we have ResultSize. It defaults to the maximum of 5,000 results per search, but you can make the number smaller.
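The parameter block itself appears earlier in the full script, but based on the description above it looks roughly like this. This is a sketch, not the author’s exact code: the validate set values match the list above, while the attribute details and type choices are assumptions.

```powershell
# Sketch of the parameter block described above (attribute details are assumptions).
[cmdletbinding()]
param (
    [Parameter(Mandatory = $true, HelpMessage = "Audit event type to search for.")]
    [ValidateSet("Deleted", "Modified", "Created", "Moved", "Renamed", "Downloaded",
        "Uploaded", "Synced", "Accessed", "MalwareDetected", "Restored", "Locked", "UnLocked")]
    [string]$Type,

    [switch]$KeepAlive,        # Leave the Exchange Online session open afterward
    [switch]$SharePointOnline, # Return only SharePoint workload events
    [switch]$OneDrive,         # Return only OneDrive workload events

    [nullable[datetime]]$StartDate, # Defaults to 89 days back when omitted
    [nullable[datetime]]$EndDate,   # Defaults to today when omitted

    [string]$OutFile,                           # File name only; saved as ./<name>.csv
    [ValidateRange(1, 5000)][int]$ResultSize = 5000
)
```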
Begin
In our begin section, we want to test for the Exchange Online Management module, validate Exchange Online connectivity, and gather the date information for the start and end dates. Let’s take a look at the Exchange part first.
$Module = Get-Module ExchangeOnlineManagement -ListAvailable
The Get-Module command works with PowerShell 5.1. However, I have seen PowerShell balk at this command and fail to pull the information, so I am going to assume your PowerShell is up to date.
if ($Module.count -eq 0) {
Install-Module ExchangeOnlineManagement -Repository PSGallery -AllowClobber -Force
}
Afterward, we install the Exchange Online Management module if we don’t detect it. We use the count to see how many module objects landed in our variable; if it’s 0, it’s time to install, and we install it from the PSGallery.
$getsessions = Get-PSSession | Select-Object -Property State, Name
$isconnected = (@($getsessions) -like '@{State=Opened; Name=ExchangeOnlineInternalSession*').Count -gt 0
Now we test for an Exchange connection. We use Get-PSSession to review the current connections, then check whether the number of sessions named “ExchangeOnlineInternalSession” in the Opened state is greater than zero. $isconnected will hold true or false.
if (-not $isconnected) {
try {
Connect-ExchangeOnline
} catch {
Write-Error "Exchange Online connection failed. Ending."
return
}
}
After which, we test it. If we are not connected, we try to connect; if that throws an error, we end the function and let the user know. We are not using a credential object to authenticate because MFA should always be a thing.
#Auto Generates Start and Finish dates
if ($Null -eq $StartDate) { $StartDate = ((Get-Date).AddDays(-89)).Date }
if ($Null -eq $EndDate) { $EndDate = (Get-Date).Date }
#Tests if end date is before start date.
if ($EndDate -lt $StartDate) { $StartDate = ((Get-Date).AddDays(-89)).Date }
if ($EndDate -gt (Get-Date).Date) { $EndDate = (Get-Date).Date }
Afterward, we need to get the dates right. If the start date is null, we pull 89 days back, which keeps us inside the audit log’s standard 90-day window. We do the same with the end date: if it’s null, we grab today’s date. Now, to prevent errors, we sanity-check both values. The end date can’t be before the start date, and the end date can’t be later than the current date; the two if statements resolve both cases.
Process
We begin the process block by looking directly at our Type variable with a switch statement. The switch lets us match each Type and run the corresponding commands. Let’s look at one of the switch cases.
$DeletedRecords = Search-UnifiedAuditLog -StartDate $StartDate -EndDate $EndDate -Operations "FileDeleted,FileDeletedFirstStageRecycleBin,FileDeletedSecondStageRecycleBin,FileVersionsAllDeleted,FileRecycled" -SessionId deleted -SessionCommand ReturnLargeSet -ResultSize 5000
$Return = foreach ($DeletedRecord in $DeletedRecords) {
$JSONInfo = $DeletedRecord.AuditData | convertfrom-json
[pscustomobject][ordered]@{
TimeStamp = ($JSONInfo.creationtime).tolocaltime()
UserName = $DeletedRecord.UserIds
ClientIP = $JSONInfo.ClientIP
Source = $JSONInfo.EventSource
Workload = $JSONInfo.Workload
Operation = $JSONInfo.Operation
SiteURL = $JSONInfo.SiteURL
RelativeURL = $JSONInfo.SourceRelativeUrl
FileName = $JSONInfo.SourceFileName
ObjectID = $JSONInfo.ObjectId
}
}
The data that Search-UnifiedAuditLog produces includes a section called “AuditData”. This section has almost every piece of information you will need. The difference between each Type is the Operations filter and the session ID; the operations target the required logs and form the backbone of the SharePoint File Audit. The list below shows which operations I am using. Once the search returns, we need to pull the AuditData, which comes back in JSON format. We start by looping over the records with a foreach loop, then pipe each record’s AuditData into ConvertFrom-Json. Next, we create our PSCustomObject. Other than Moved, the other logs output almost the same information; see the script for the details.
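To see what that ConvertFrom-Json step does on its own, here is a minimal stand-alone sketch. The JSON payload is fabricated: the property names mirror the real AuditData log, but the values are made up.

```powershell
# Fabricated example of an AuditData JSON payload (values are made up).
$AuditData = '{"CreationTime":"2022-10-28T14:30:00","ClientIP":"203.0.113.10","Workload":"SharePoint","Operation":"FileDeleted","SourceFileName":"Budget.xlsx"}'

# Same pattern as the script: parse the JSON, then read properties off the object.
$JSONInfo = $AuditData | ConvertFrom-Json
$JSONInfo.Operation       # FileDeleted
$JSONInfo.SourceFileName  # Budget.xlsx
```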
Operation Filters
- Deleted
- FileDeleted
- FileDeletedFirstStageRecycleBin
- FileDeletedSecondStageRecycleBin
- FileVersionsAllDeleted
- FileRecycled
- Modified
- FileModified
- FileModifiedExtended
- Moved
- Renamed
- Downloaded
- Uploaded
- Synced
- FileSyncDownloadedFull
- FileSyncUploadedFull
- Accessed
- FileAccessed
- FileAccessedExtended
- MalwareDetected
- Restored
- Locked
- UnLocked
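Putting the parameters and operation filters together, calling the tool looks something like this. The function name here is hypothetical (substitute whatever you named the function), and both calls require an interactive Exchange Online sign-in, so treat this as a usage sketch rather than a copy-paste recipe.

```powershell
# Hypothetical invocations; 'Get-SHDSharePointFileAudit' stands in for the real function name.

# All deletions across the default ~90-day window, both workloads, dumped to the screen.
Get-SHDSharePointFileAudit -Type Deleted

# SharePoint-only downloads for a specific window, exported to ./Downloads.csv,
# keeping the Exchange Online session open for a follow-up query.
Get-SHDSharePointFileAudit -Type Downloaded -SharePointOnline `
    -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) `
    -OutFile Downloads -KeepAlive
```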
End
Finally, it’s time for the end block. This is where we will present the data we have gathered. Firstly, we need to determine if the SharePoint or Onedrives were flipped or not.
if ((-not $SharePointOnline -and -not $OneDrive) -or ($SharePointOnline -and $OneDrive)) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | Export-Csv "./$OutFile.csv"
} else {
$Return
}
}
Here we check whether both flags are unset or both are set. Then we check if the user gave us a filename. If they did, we export our report to a CSV file wherever the function is executed from. However, if the user didn’t give us a filename, we just dump all the results to the pipeline.
elseif ($SharePointOnline) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | where-object { $_.workload -like "SharePoint" } | Export-Csv "./$OutFile.csv"
}
else {
$Return | where-object { $_.workload -like "SharePoint" }
}
}
elseif ($OneDrive) {
if ($PSBoundParameters.ContainsKey("OutFile")) {
$Return | where-object { $_.workload -like "OneDrive" } | Export-Csv "./$OutFile.csv"
}
else {
$Return | where-object { $_.workload -like "OneDrive" }
}
}
if (!($KeepAlive)) {
Disconnect-ExchangeOnline -Confirm:$false -InformationAction Ignore -ErrorAction SilentlyContinue
}
Now, if the user selected one or the other, we present just that workload’s information by filtering with Where-Object. Like before, we ask if the user provided an outfile. Finally, we ask if KeepAlive was set; if it wasn’t, we disconnect from Exchange Online.
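One detail worth noting about that filter: -like with no wildcard behaves like an exact, case-insensitive match, so the Workload comparison only passes records whose workload is exactly that word. A small stand-alone sketch with fabricated records shows the effect:

```powershell
# Fabricated records standing in for $Return (only the Workload property matters here).
$Return = @(
    [pscustomobject]@{ Workload = 'SharePoint'; FileName = 'Budget.xlsx' }
    [pscustomobject]@{ Workload = 'OneDrive';   FileName = 'Notes.docx' }
    [pscustomobject]@{ Workload = 'SharePoint'; FileName = 'Plan.pptx' }
)

# Same filter the end block uses; with no wildcard, -like is an exact match on the workload.
$SharePointOnly = $Return | Where-Object { $_.Workload -like 'SharePoint' }
$SharePointOnly.Count  # 2
```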
Conclusion
In conclusion, auditing shouldn’t be difficult. With this function, we can quickly pull the info we need. I hope you enjoy this powerful little tool.