Handle with PowerShell

Let's talk about Handle. Handle is an amazing program that shows you which process has a handle open on a folder or file. It is a Sysinternals tool. Working with Handle inside your PowerShell script is not a native thing, so the first thing you will want to do is download Handle.

We first create the folder we want Handle downloaded into. In this case, the C:\Temp folder will work. Notice we check first to see if it already exists with Test-Path. We will continue this trend so we don't have to repeat the download and creation process.

if (!(Test-Path "c:\Temp")) { New-Item -Path "c:\" -Name Temp -ItemType Directory }

Now we test to see if Handle has already been downloaded. If not, we download it from http://live.sysinternals.com/handle.exe. All of the Sysinternals tools are on this website and you can programmatically download them at any time. We save handle.exe in the C:\Temp folder we created a few seconds ago.

if (!(Test-Path "C:\temp\handle.exe")) {
        Invoke-WebRequest -Uri "http://live.sysinternals.com/handle.exe" -OutFile "c:\temp\handle.exe"  -UseBasicParsing -DisableKeepAlive
    } 

Now that we have Handle, it's time to get a handle on Handle inside PowerShell. As a command-line program, it needs to be called from the command prompt, and we want to capture its output. That rules out something like Start-Process, and here is why: Start-Process starts another process and does not keep it in the current window, so you cannot capture that output without some PowerShell magic that may or may not work. So instead, we use cmd.exe itself, with the /c flag and then the path to the Handle executable.

$ProcessHandles = cmd.exe /c C:\temp\handle.exe -a -u "$FilePath" -accepteula

Let's break this down a little more. We start the Handle application with -a, which dumps all of the handle information, and that is a ton of information. The -u flag shows the owning user name when searching for handles. Grabbing both gives us the process names along with the user information. Then we give it the path of the folder we want, so we are basically giving it a target. Now we are pulling all the handle information for that target folder along with the user name and process name. The final flag is -accepteula, which accepts the EULA automatically and keeps the run unattended. We call Handle using cmd.exe /c, which brings the command output into our terminal so we can capture it in $ProcessHandles. Bam, now we have a bunch of confusing string information. The next step is to parse this string. Here is what the string looks like:

Nthandle v4.22 - Handle viewer
Copyright (C) 1997-2019 Mark Russinovich
Sysinternals - www.sysinternals.com

WINWORD.EXE        pid: 14248  type: File          ACTIVEDIRECTORY\bolding  A34: C:\temp\Change Control.docx

Now we need to handle the Handle strings. We search each line for the file name or file path with a simple Where-Object. This gives us an array of matching lines.

$Handles = $ProcessHandles | where-object { $_ -like "*$FilePath*" }

In this case we only have one, but we want to make sure it doesn't break if there is more than one. So we start a foreach loop and loop through each handle in our handles. Each handle line looks like this:

WINWORD.EXE        pid: 14248  type: File          ACTIVEDIRECTORY\bolding  A34: C:\temp\Change Control.docx

The fields are separated by spaces, so we use the Split method. We then search each piece for *.exe, as most programs are .exe at the end of the day. We could expand upon this, but we will leave it at this level. Once we have the .exe, we strip the extension with the -replace operator. Here is what the code looks like so far.

foreach ($Handle in $handles) {
        $Process = ($handle.split(' ') | where-object { $_ -like "*.exe" }) -replace '.exe', ''
}

Notice how we pipe one command into another and then wrap it with the replace. Simple one-line power right there. From here we need to test whether $Process is empty. We do this because if the file in question isn't locked, we don't want to error out, so we add a simple check that $null is not equal to $Process. The goal is to push these items into a smart system that will kill the process. However, there is one process I have discovered over the years that tends to get killed by going down this route, and that's explorer.exe. I have killed it more than once, which is why I place an exclusion for explorer.exe: we just check whether the name matches with another if statement. Here is what the code looks like so far for this loop.

$Tasks = foreach ($Handle in $handles) {
        $Process = ($handle.split(' ') | where-object { $_ -like "*.exe" }) -replace '.exe', ''
        if ($Null -ne $process) {
            if ($Process -notlike "explorer") {
                $Process
            }
        }
    }

Now, here is the fun part. We can kill these tasks from the script itself. All we have to do is loop through them and stop each process with Stop-Process. I placed a kill switch in the parameters just for this. If the kill switch is true, we loop through each task and kill it. If not, we just display the processes. It's that simple. Here is what that code looks like:

if ($kill) {
        foreach ($Task in $tasks) {
            Stop-Process -name $Task -Force
        }
    } else {
        $Tasks
    }

It’s that time, let’s put it all together and make the script.

The Script

function Set-SHDLockedFileProcess {
    param (
        [String]$FilePath,
        [switch]$kill
    )
    if (!(Test-Path "c:\Temp")) { New-Item -Path "c:\" -Name Temp -ItemType Directory }
    if (!(Test-Path "C:\temp\handle.exe")) {
        Invoke-WebRequest -Uri "http://live.sysinternals.com/handle.exe" -OutFile "c:\temp\handle.exe"  -UseBasicParsing -DisableKeepAlive
    } 
    $ProcessHandles = cmd.exe /c C:\temp\handle.exe -a -u "$FilePath" -accepteula
    $Handles = $ProcessHandles | where-object { $_ -like "*$FilePath*" }
    $Tasks = foreach ($Handle in $handles) {
        $Process = ($handle.split(' ') | where-object { $_ -like "*.exe" }) -replace '.exe', ''
        if ($Null -ne $process) {
            if ($Process -notlike "explorer") {
                $Process
            }
        }
    }
    if ($kill) {
        foreach ($Task in $tasks) {
            Stop-Process -name $Task -Force
        }
    } else {
        $Tasks
    }
}
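
As a quick usage sketch (assuming the function has been loaded into your session, and reusing the sample document from the handle output above), you would call it like this:

# Report which processes have a handle on the file.
Set-SHDLockedFileProcess -FilePath "C:\temp\Change Control.docx"

# Run it again with the kill switch to force-stop those processes.
Set-SHDLockedFileProcess -FilePath "C:\temp\Change Control.docx" -kill
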
Finding Old Snapshots with PowerShell

Do you need to find old snapshots on a Hyper-V server? It's super easy. Today we will go through how to get some basic information that lets us make judgment calls.

The Script – Find Old Snapshots

$Date = (Get-Date).AddDays(-7)
$Vms = Get-VM | where-object { $_.state -like "Running" } 
$Return = foreach ($VM in $Vms) {
    $SnapShots = $VM | Get-VMSnapshot
    foreach ($SnapShot in $SnapShots) {
        if ($snapshot.creationTime -lt $date) {
            [pscustomobject]@{
                SnapShotName         = $SnapShot.name
                SnapShotCreationDate = $SnapShot.CreationTime
                VirtualMachine       = $SnapShot.VmName
                Host                 = $SnapShot.ComputerName
            }
        }
    }
}
$Return

The Breakdown

The first part of the script sets the age requirement. In this case, we want to know about anything older than 7 days, so we use the Get-Date command and add -7 days. This gives us the date to compare against.

$Date = (Get-Date).AddDays(-7)

In this case, we only want the running machines. The reason is that powered-off machines might be in a decommissioning process or off for other reasons. So we look at the state of each VM to see if it's "Running", using a Where-Object.

$Vms = Get-VM | where-object { $_.state -like "Running" } 

Now that we have the running VMs, we want to get each one's snapshots and compare each snapshot against that date. The information we want is the name of the snapshot, the snapshot's creation date, the VM, and the host name. So we start a foreach loop. Inside the loop, an if statement checks whether the creation time is less than the date we created earlier. From there we create a PSCustomObject and pull out the information we want.

$Return = foreach ($VM in $Vms) {
    $SnapShots = $VM | Get-VMSnapshot
    foreach ($SnapShot in $SnapShots) {
        if ($snapshot.creationTime -lt $date) {
            [pscustomobject]@{
                SnapShotName         = $SnapShot.name
                SnapShotCreationDate = $SnapShot.CreationTime
                VirtualMachine       = $SnapShot.VmName
                Host                 = $SnapShot.ComputerName
            }
        }
    }
}

Then finally we output the $Return value. We can export this to a CSV and drop it into a file share. I personally do this with a Nextcloud instance; you can read more about that here. Another option is to email the report using the Microsoft Graph API or an SMTP email system. Finally, if you have confidence in the results, you can remove the old snapshots.
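
As a minimal sketch of that export step (the path and file name are just examples), dropping the report to a CSV looks like this:

# Export the old-snapshot report to a CSV so it can be dropped on a file share or emailed.
$Return | Export-Csv -Path "C:\Temp\OldSnapshots.csv" -NoTypeInformation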

Conclusion

Running this script and combining it with the file drop and a few other pieces of automation changed how I worked with multiple clients. It made for a good cleanup process and saved my clients much-needed storage space. Let me know how you use this code on your systems.

Install Perch with PowerShell

Perch is an event log tracker that can catch a lot of useful information. I like Perch because it captures failed login information, and the data is easy to sort and export. This is why many companies use the software. There are some gotchas with Perch installs, though. If you are installing it on a server, some services don't auto-start. Installing it from PowerShell also has a gotcha. This post is about how to install Perch via a PowerShell script. The method uses your token for the client's site. Let's install Perch with PowerShell.

The Script

if (!(Test-Path "$($env:SystemDrive)\Temp")) { New-Item -Path "$env:SystemDrive\" -Name Temp -ItemType Directory }
$PerchURL = "https://cdn.perchsecurity.com/downloads/perch-log-shipper-latest.exe"
$PerchFullFileName = "$($env:SystemDrive)\Temp\perch-log-shipper-latest.exe"
Invoke-WebRequest -Uri $PerchURL -Outfile $PerchFullFileName -UseBasicParsing
start-process -FilePath "$PerchFullFileName" -ArgumentList @("/qn", 'OUTPUT="TOKEN"', 'VALUE="Your Token"') 
$Timeout = 0
do {
    $Timeout = $Timeout + 5
    start-sleep -Seconds 5
} Until (((Get-service -Name "Perch*" -ErrorAction SilentlyContinue).count -ge 2) -or ($Timeout -ge 500))

if ((Get-service -Name "Perch*" -ErrorAction SilentlyContinue).count -ge 2) {
    Get-Service -name perch-auditbeat | Set-Service -StartupType Automatic
    Get-Service -name perch-auditbeat | Start-Service
} else {
    Write-Error "Services did not install."
}

The Breakdown

Let's break down the script. The first thing we do is create the download repo. I personally like to use C:\Temp, but not all machines have a C:\ drive, which is why I use the $env:SystemDrive variable. If the OS drive is D:, the code creates D:\Temp, and so on.

The next line is the URL for the latest and greatest Perch installer. This keeps your download up to date. With that stated, it also means that if they change something, you will need to catch that change, so stay current with their deployment. A good way to do that is by registering for their update emails. I also like to have a ticket every 3 to 6 months, randomly placed, to review deployments like this one. It's just a good habit.

Now that we have the URL, we build the full file path. Using $env:SystemDrive, we place perch-log-shipper-latest.exe into the Temp folder, our local repo. This makes the next command easier.

Now we Invoke-WebRequest this bad boy. Just like curl and wget, it lets us download the file. PerchURL goes in the -Uri position and PerchFullFileName is the -OutFile. Of course, we use basic parsing just in case it's an older version of PowerShell. At the time of this writing, the default PowerShell on Windows 10 is 5.1.

Now we start the installation with Start-Process, using PerchFullFileName as the target. See, using parameters helps. Our argument list is /qn for a quiet install, OUTPUT set to TOKEN, and VALUE set to the token from Perch's site.

Getting the token

To get the token, you will need to log into your perch system. At the top, select the company you wish to get the token from.

Next, you will need to click on the settings icon on the bottom left-hand corner. Then click the Network icon.

Normally we would add a -Wait flag to the installer. Things like Google Chrome do great with the -Wait flag. However, in this case, we don't want to do that. The reason we build more advanced checks is the multiple sub-processes inside the Perch install; the -Wait flag only watches the first process.

Confirming Install Perch with PowerShell

With all that, it's time to confirm the installation. The simplest way is to watch the services. Perch installs two services: perch-auditbeat and perch-winlogbeat. During the confirmation process we wait; if a timeout occurs we throw an error, and if the application installs we get the results. We start our timeout timer with $Timeout = 0, then start a do/until loop. Each pass, the system sleeps for 5 seconds and then adds 5 to the timer, which effectively gives us a stopwatch. This repeats until the conditions are met: the services named perch* number at least 2, or the timer reaches 500.
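
For reference, here is that timer loop from the script above, with comments added:

$Timeout = 0
do {
    $Timeout = $Timeout + 5    # add 5 seconds to the running timer
    start-sleep -Seconds 5     # wait 5 seconds between checks
} Until (((Get-service -Name "Perch*" -ErrorAction SilentlyContinue).count -ge 2) -or ($Timeout -ge 500))
# The loop ends once both Perch services exist or roughly 500 seconds have passed.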

Once the services are installed or the timeout is reached, we move to the next step. By default, the auditbeat service is set to manual, so we check whether we have the two services. If we do, we set perch-auditbeat to automatic and start the service. If not, we throw an error saying the services did not install, which alerts the deployment engineer to dig deeper into this machine. In my experience, the usual cause is another application that is already installed and getting in the way.

Don't forget to take a look at how you can install Perch using Intune.

Wait for service to appear – PowerShell

This past week I had to install a piece of software that took 30 minutes to install. The installer had multiple levels of processes, which made the -Wait flag completely useless. The best way to know the software was installed was to detect the service names; thus you have to wait for the service to appear. The fun part is that, depending on the Windows 10 version, it would install upwards of 11 services.

Thankfully it followed a set pattern, and the first 7 services were the only ones we needed to watch for. Another wrinkle in this installer's process is that the computer couldn't go to sleep and the screen couldn't lock. This wrinkle had me thinking outside the box a bit, and I came up with two ideas. The first idea was to set the power configuration to never sleep and the screen to never lock; that would require exporting and re-importing the power config, which is a lot of code, but doable. The second idea was to have the computer press the Scroll Lock key over and over again. Guess which one I took? Yep, the Scroll Lock. Parsing random config strings is fun and all, but it can be accident-prone. Just to be fun about it, I decided to make the timing a little random with Get-Random, so the system wouldn't flag it as a virus.

The Script – Wait for service to appear

$WShell = New-Object -Com "Wscript.Shell"
$A = 0
Do {
    $RandomNumber1 = Get-Random -Minimum 1 -Maximum 20
    $RandomNumber2 = Get-Random -Minimum $RandomNumber1 -Maximum ($RandomNumber1 + 25)
    $WShell.SendKeys("{SCROLLLOCK}")
    Start-Sleep -Seconds $RandomNumber2
    $A = $A + $RandomNumber2
} until ((Get-Service -Name "frog*").count -ge 7)
write-host "The system took around $A seconds to complete"

The Breakdown

The core of this code is a simple Do Until loop. The loop executes at least one time and then evaluates whether it needs to execute again. PowerShell gives us both While and Until: While basically means keep going while this condition is true, and Until says keep doing it until the condition is met. Here we are using Do Until. The first step is to create the Windows script shell, aka WScript.Shell, which allows us to send key commands and such. Next, we create a variable set to 0 to keep up with the time, because I like seeing the time.
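
As a quick illustration of the difference (not part of the original script, and reusing the frog* service name as an example), these two loops behave the same way:

# Do/Until: keep looping until the service shows up.
do { Start-Sleep -Seconds 5 } until (Get-Service -Name "frog*" -ErrorAction SilentlyContinue)

# Do/While: keep looping while the service is still missing.
do { Start-Sleep -Seconds 5 } while (-not (Get-Service -Name "frog*" -ErrorAction SilentlyContinue))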

We enter the do loop... Dum Dum Do? We start off by making a random number between 1 and 20 and place it into a variable. Then we make another random number whose minimum is the first random number and whose maximum is the first random number plus 25, so each sleep can be up to about 45 seconds.

Next, we send the Scroll Lock key using the WScript.Shell SendKeys method. Then we sleep for the random number of seconds and add that value to the running total. Then we come to the Until: we check whether the services are there with Get-Service. Our services all start with frog, so we use -Name "frog*". Once the count is greater than or equal to 7, we exit the loop and tell the user roughly how long the install took to complete.

The fun part about this is that you can set the service name to something that will never exist and the loop will keep running until you stop it. This will keep your computer from locking. If your company doesn't have good monitoring software, like most small businesses, it will go unnoticed.

Conclusion

And that's how you wait for a service to appear. It's not glamorous, but PowerShell does help the process along. This code's main goal is to keep the computer awake for at least 30 minutes. Always remember that great code comes with great ability.

Microsoft Graph API – Powershell, Download user Images

In my previous post, we went over how to grab user information from a client. Today we will be going over how to download user images with the Graph API. This piece is very straightforward until you get to the graph link. There is a quirk with PowerShell quoting there, and I found a good workaround.

Ok, we start off with the loop like before. We are using the /users API. Since the photo is a user-level item, you have to loop through each user by User Principal Name. This means your string goes inside double quotes "" instead of single quotes, because you want PowerShell to expand the value of $($UPN). So far, simple. The next part is the word photo. Once again, simple. Then the catch: the literal word $value has to be at the end of the URL. Inside double quotes, PowerShell treats $value as a variable and drops whatever it contains into the string. There are a few ways around this.

Option 1

Declare the variable beforehand. It's a simple and easy way to fix this problem.

$Value = '$value'
$UserPhotoLink = "https://graph.microsoft.com/v1.0/users/$($UPN)/photo/$value"

Option 2

Use the + operator to append the literal string.

$UserPhotoLink = "https://graph.microsoft.com/v1.0/users/$($UPN)/photo/" + '$value'

Download User Images with Graph API

Now that we have the link, we need to download the image. Once again we use Invoke-RestMethod with our custom header like before, but this time we give it the -OutFile parameter. Since not everyone has an image, I also set the error action to silently continue.

Invoke-RestMethod -Method get -Headers $Header -Uri $UserPhotoLink -ErrorAction SilentlyContinue -OutFile "$DropPath\$($UPN).jpg" 
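
Putting the pieces together, here is a minimal sketch of the whole download loop. It assumes $Header and $Userinfo already exist from the previous post's script, and $DropPath is just an example folder for the images:

# Assumes $Header (bearer token header) and $Userinfo (user list) come from the previous post.
$DropPath = "C:\Temp\UserPhotos"
if (!(Test-Path $DropPath)) { New-Item -Path $DropPath -ItemType Directory | Out-Null }

foreach ($User in $Userinfo) {
    $UPN = $User.userPrincipalName
    # The trailing $value must stay literal, so it is appended in single quotes.
    $UserPhotoLink = "https://graph.microsoft.com/v1.0/users/$($UPN)/photo/" + '$value'
    # Not everyone has a photo, so silently continue on errors.
    Invoke-RestMethod -Method get -Headers $Header -Uri $UserPhotoLink -ErrorAction SilentlyContinue -OutFile "$DropPath\$($UPN).jpg"
}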

That's it. The foreach loop lets you download all of your users' images with the UPN as the file name. I hope this has been helpful.

Microsoft Graph API PowerShell

In the last blog, we talked about how to create a registered app with Graph API permissions. This app's main purpose is to become the base for an employee directory built through PowerShell. If you haven't read it yet, you can do so here. Today's blog is about how to interact with the app from PowerShell. Here are the pieces of information you will need from the application:

  1. The Tenant’s ID
  2. The Application ID
  3. The Secret Key

If you have those three pieces of information, we can build a script to grab all the users. The app has "User.Read.All" permissions, which means it can pull all the information inside Azure AD that is directly linked to a user's Azure AD account. This does not include things like SharePoint, mail, or Teams; you have to grant permission for those items separately. But items like names, usernames, images, and much more are accessible as read-only.

The first thing we want to do is put the required information into variables to be used over again and again.

$AppID = "XXXXXXXX-XXXX-xxxx-xxxx-xxxxxxxxxxxx"       # Application (client) ID
$AppKey = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"       # Client secret key
$ClientID = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"    # Tenant ID, used in the token URL below

Next, we build the OAuth token URL. The $ClientID variable, which holds the tenant ID, goes inside the URL string. We use this URL later when we want to grab the access token.

$Token = "https://login.microsoftonline.com/$($ClientID)/oauth2/v2.0/token"

The body of the access token request requires a redirect URL. You can set this to "http://localhost" if you want; we really don't care unless the app has a redirect URL assigned to it, and that's for another day. So, the next step is to build the body. We do this with a basic hash table. Inside this table, we need client_id, client_secret, redirect_url, grant_type, and the scope.

$Body = @{
        client_id     = "$AppID"
        client_secret = "$AppKey"
        redirect_url  = "$redirect_url"
        grant_type    = "client_credentials"
        scope         = "https://graph.microsoft.com/.default"
}

Now that we have the token URL and the body, we need to get the access token. We do this with a POST to the token URL using the body information. Then we place that token in its own variable to be used in the header.

$request = Invoke-RestMethod -Uri $token -Body $Body -Method Post
$Access_Token = $request.access_token

Now we need to create the header. We use the Authorization key inside the header and pass the access token as a Bearer token.

$Header = @{
        Authorization = "Bearer $($Access_Token)"
}

Up to this point, we have been building the header for the next command. Now we need to switch gears and create the query URL for Graph. You can build out the string inside the Graph Explorer (link) and read up on the documentation. The permissions give us access to the users' information; you can read up on the query syntax at this link.

In our example, we want to get a list of all users and these values.

  • First Name
  • Last Name
  • Display Name
  • Email Address
  • What they use to sign in with
  • Job Title
  • Account Enabled
  • Assigned Licenses

$userInfoLink = 'https://graph.microsoft.com/v1.0/users?$select=givenname,surname,displayName,mail,userPrincipalName,jobtitle,accountenabled,assignedlicenses'

We are using the /users API to gather this information, thanks to the User.Read.All permission we granted in the last blog post. Now, we grab the first page of information using Invoke-RestMethod.

$PageInfo = Invoke-RestMethod -Headers $Header -Uri $UserInfoLink -method get 
$PageInfo.Value
$Userinfo = $PageInfo.Value

BAM! We have information. $PageInfo.Value gives you the first hundred records. Wait, only the first 100? Yep, Graph only returns the first 100 items per page. So how do you handle that? You create a loop. How do we determine if we need a loop? We look to see if the property '@odata.nextlink' exists, and we handle it with a Do While loop. The '@odata.nextlink' is the next request that needs to be executed, aka the next page of the results. The loop keeps going as long as each response contains an '@odata.nextlink'; the final page doesn't have one, so the loop stops.

Do {
        $PageInfo = Invoke-RestMethod -Headers $Header -Uri $PageInfo.'@odata.nextlink' -Method Get
        $Userinfo += $PageInfo.Value 
} while (($PageInfo | Get-Member).name -contains '@odata.nextlink')

Notice we keep adding $PageInfo.Value to $Userinfo. This builds the user array so we have all the data we need. One of the requirements for this script is to include each user's assigned license information, so we need to add the SKU part number of each license to the user. The problem is that assignedLicenses only gives back the skuId. So what we can do is loop through $UserInfo and add them accordingly, and we do this by starting a for loop. Why a for loop? Because we will be adding a member property with the SKU part numbers at each index.

for ($I = 0; $I -lt $Userinfo.count; $I++) {
}

Inside this loop, we need to get the user's licenses. We can do that with Graph as well; it's still part of the User.Read.All permission since it still uses the /users API. We select the user by giving the UPN, then ask for the license details with /licenseDetails. I know, super complex. Here is what the link looks like:

for ($I = 0; $I -lt $Userinfo.count; $I++) {
        $LiceLink = "https://graph.microsoft.com/v1.0/users/$($UserInfo[$I].Userprincipalname)/licenseDetails"
}

Next, we invoke the REST method again, using the link we generated, to pull in the license information. I have never seen a user with more than 20 licenses, so there is no need for another paging loop here.

for ($I = 0; $I -lt $Userinfo.count; $I++) {
        $LiceLink = "https://graph.microsoft.com/v1.0/users/$($UserInfo[$I].Userprincipalname)/licenseDetails"
        $UserLice = (Invoke-restmethod -Headers $Header -Uri $LiceLink -Method Get).value
}

Finally, we use Add-Member to add a LicenseInfo property containing the SKU part number of each license on that user's account.

for ($I = 0; $I -lt $Userinfo.count; $I++) {
        $LiceLink = "https://graph.microsoft.com/v1.0/users/$($UserInfo[$I].Userprincipalname)/licenseDetails"
        $UserLice = (Invoke-restmethod -Headers $Header -Uri $LiceLink -Method Get).value
        $UserInfo[$i] | Add-Member -Name "LicenseInfo" -MemberType NoteProperty -Value $UserLice.skupartnumber
}

Each time the loop runs, it pulls that user's UPN from the $UserInfo array, grabs the license details from Graph, pulls only the SKU part number, and then adds that part number to the LicenseInfo property at that position in the array. Let's put it all together now.

The Script

Function Get-GraphEmployeeReport {
    [cmdletbinding()]
    param (
        [string]$org,
        [string]$AppID = (Read-Host -Prompt "AppId"),
        [string]$AppKey = (Read-Host -Prompt "AppKey"),
        [string]$ClientID = (Read-Host -Prompt "ClientID"),
        [string]$redirect_url = "https://localhost",
        [string]$OutfilePath = "C:\FMIT\Reports\MFA",
        [switch]$Output
    )
    $Token = "https://login.microsoftonline.com/$($ClientID)/oauth2/v2.0/token"
    $Body = @{
        client_id     = "$AppID"
        client_secret = "$AppKey"
        redirect_url  = "$redirect_url"
        grant_type    = "client_credentials"
        scope         = "https://graph.microsoft.com/.default"
    }
    $request = Invoke-RestMethod -Uri $token -Body $Body -Method Post
    $Access_Token = $request.access_token
    
    $Header = @{
        Authorization = "Bearer $($Access_Token)"
    }
    $userInfoLink = 'https://graph.microsoft.com/v1.0/users?$select=givenname,surname,displayName,mail,userPrincipalName,jobtitle,accountenabled,assignedLicenses'
    $PageInfo = Invoke-RestMethod -Headers $Header -Uri $UserInfoLink -method get #| where-object { ($_.assignedLicenses.count -gt 0) -and ($_.accountEnabled -eq $true) }
    $Userinfo = $PageInfo.Value
    Do {
        $PageInfo = Invoke-RestMethod -Headers $Header -Uri $PageInfo.'@odata.nextlink' -Method Get
        $Userinfo += $PageInfo.Value 
    } while (($PageInfo | Get-Member).name -contains '@odata.nextlink')
    for ($I = 0; $I -lt $Userinfo.count; $I++) {
        $LiceLink = "https://graph.microsoft.com/v1.0/users/$($UserInfo[$I].Userprincipalname)/licenseDetails"
        $UserLice = (Invoke-restmethod -Headers $Header -Uri $LiceLink -Method Get).value
        $UserInfo[$i] | Add-Member -Name "LicenseInfo" -MemberType NoteProperty -Value $UserLice.skupartnumber
    }
    $Userinfo
}

That's all it takes to get the information from Azure AD. You can export this data to XML or JSON and integrate it with any other system you like. This also works with PowerShell Universal: you can build a paged table with a next button, or just gather all the information at once and present it. The more pages it has to pull, the longer the information will take to populate. Using User.Read.All, you are also able to pull photos with the Graph API.
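
As a hedged usage sketch (the output paths are just examples), you could capture the report and hand it off like this:

# Prompts for AppId, AppKey, and ClientID (the tenant ID) unless they are passed in.
$Report = Get-GraphEmployeeReport

# Example hand-offs: JSON for another system, or a flattened CSV for a quick review.
$Report | ConvertTo-Json -Depth 4 | Out-File "C:\Temp\EmployeeReport.json"
$Report | Select-Object displayName, userPrincipalName, jobTitle, accountEnabled,
    @{ Name = 'Licenses'; Expression = { $_.LicenseInfo -join ';' } } |
    Export-Csv -Path "C:\Temp\EmployeeReport.csv" -NoTypeInformation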

If you have any more questions, feel free to ask.