An overview of Meeting-screen template for canvas apps

Summary

I needed my app to reuse some code from the Invite and Schedule tabs of the Meeting-screen template for canvas apps. There is also a detailed reference that includes the code. However, I want to share my notes and understanding of this template, in the hope that it gives you ideas for reusing parts of it in your own app.

Overview

The Meeting-screen template presents the user with Invite and Schedule tabs. On the Invite tab, users select invitees (stored in the MyPeople collection) after typing names in the search box. The search uses Office365Users.SearchUser() to get the UPN and display name of potential invitees. All results are added to a vertical gallery so the user can pick one or more invitees.

As the user picks invitees, the app builds the MyPeople collection. Additionally, the user can type an external email address in the search box, and that email (or UPN) can also be added to the MyPeople collection by selecting the Add icon.

The MyPeople collection is then used to call Office365Outlook.FindMeetingTimes() to find available times for the selected invitees. The call also uses the start date, duration, and end date values (existing or new) from the Schedule tab.

The results populate a new MeetingTimes collection, which is bound to a gallery on the Schedule tab so the user can pick any of the available times for the meeting.

If the user changes the invitees, the start day, or the duration, the MeetingTimes collection is repopulated and the user is asked to select a different available time.

On the Schedule tab, the user can also search for rooms; the app calls Office365Outlook.GetRooms() and Office365Outlook.GetRoomList(). Once a room is selected, an appropriate available time needs to be selected again, so the app makes another call to Office365Outlook.FindMeetingTimes(), this time for the rooms' UPNs.

After everything is selected, the user types the Subject and Body and selects the Send button. The app then calls Office365Outlook.CalendarGetTables() and Office365Outlook.V2CalendarPostItem().

And that is how the invite, with the room the user selected, is sent to the invitees.

Finally

These notes may repeat some of what the links above already cover, but I think simple, to-the-point notes like these may give you ideas for building something similar or different in your own app.


Demystifying ExpandMenu Component from the Creator Kit.

Summary

Are you developing apps in Power Apps? Have you heard of the Creator Kit (https://aka.ms/CreatorKit)? If yes, great. If you have not heard of it, I highly recommend you check it out. I will put some video references at the end of this post.

This post shares what I learned about the ExpandMenu canvas component from the Creator Kit.

The Creator Kit has canvas components and PCF controls. Please note that the Power Apps Component Framework (PCF) controls are code components, and you may run into DLP policies set by your company admins that block deploying code components. If you are a pro developer, the source is located here for your study.

Please note that a new experimental feature is used, "Behavior formula for components". You will need to turn on this feature in your environment.

ExpandMenu Inner Workings

Input, Output, and Behavior properties.

All components have input and output properties. Input properties pass values into the component, and output properties expose values from it. Additionally, there is now a new kind of property, the behavior property; think of behavior properties as events raised by the component.

Input Properties

The input properties pass values to the ExpandMenu component. Note that the DefaultExpandValue property has the "Raise OnReset" flag turned on. This means the component's OnReset behavior runs whenever this property is set, and it then sets the internal variable IsOpen to the property's value.

Name                | Data Type | Default Value | Raise OnReset Flag
Items               | Table     | See #1 below  | None
IsNavigationEnabled | Boolean   | true          | None
Theme               | Record    | See #2 below  | None
DefaultExpandValue  | Boolean   | false         | Set(IsOpen, ExpandMenu.DefaultExpandValue)

Input Properties

Output Properties

GetMenuIconPath acts like a method. The component uses it internally to get the SVG path text value for an icon.

Note that IsExpanded is set to IsOpen (the component's internal variable). Any time IsOpen changes, the new value is reflected through IsExpanded. Later you will see that the screen hosting this component uses IsExpanded to control its width.

Name            | Data Type | Parameter     | Value
IsExpanded      | Boolean   |               | IsOpen
SelectedItem    | Record    | none          | glExpandMenu.Selected
GetMenuIconPath | Text      | IconName:Text | See #3 below (finds the SVG path by IconName)

Output Properties

Behavior Properties

Name           | Return Data Type | Called
OnExpandSelect | Boolean          | When a user selects the hamburger image.
OnButtonSelect | Boolean          | When a user selects a menu item.

Behavior Properties
The component's own Width property is set to If(!Self.IsExpanded, 46, 221), which expands or collapses the component horizontally.

The Component's UI Controls

The following are the UI controls. The imgExpandButton acts as the hamburger image. The glExpandMenu is a vertical gallery with the Items values bound as the list of menu items.

imgExpandButton

When a user clicks the hamburger image, the OnSelect event fires. It toggles the component-level variable IsOpen, which is tied to the component's IsExpanded output property. It also calls the OnExpandSelect behavior property.

Notice that the image is an SVG path filled with the Theme color value.

Image:
"data:image/svg+xml," & EncodeUrl(
    "<svg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' viewBox='-10 0 2068 2048'>
        <g transform='matrix(1 0 0 -1 0 2048),rotate(0,1034,1024)'>
            <path fill='" & ExpandMenu.Theme.palette.neutralPrimary & "' d='M2048 1408h-2048v128h2048v-128v0zM2048 384h-2048v128h2048v-128v0zM2048 897h-2048v127h2048v-127v0z' />
        </g>
    </svg>")

OnSelect:
/*
Toggle the IsOpen variable.
It is local to the component. Also, call the OnExpandSelect event.
*/
Set(IsOpen, !IsOpen);
ExpandMenu.OnExpandSelect()

Height: 46
Width: 46

imgExpandButton

glExpandMenu

Items: ExpandMenu.Items
Height: ExpandMenu.Height
Width: 221

OnSelect:
/*
If a screen is present and the navigation flag is not disabled,
then navigate to the screen.
Also, raise the OnButtonSelect event.
*/
If(!IsBlank(ThisItem.Screen) &&
    ExpandMenu.IsNavigationEnabled,
  Navigate(ThisItem.Screen));

//Raise the OnButtonSelect event
ExpandMenu.OnButtonSelect();

glExpandMenu

rectHighHighLight

X, Y: 5, 10
Width, Height: 3, 20

rectHighHighLight

imgIcon

It first checks whether the icon name is present in the lookup table (see #3 below). If it is not found, whatever value is passed in ThisItem.Icon is used as-is. If it is found, the image is built from the SVG path with the theme color values.

Image:
If(IsBlank(ExpandMenu.GetMenuIconPath(ThisItem.Icon)),
    ThisItem.Icon,
    "data:image/svg+xml," & EncodeUrl(
        "<svg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' viewBox='-10 0 " & 2068 & " 2048'>
            <g transform='matrix(1 0 0 -1 0 2048),rotate(0, 2068,1024)'>
                <path fill='" & ExpandMenu.Theme.palette.neutralPrimary & "' d='" & ExpandMenu.GetMenuIconPath(ThisItem.Icon) & "' />
            </g>
        </svg>"))

X, Y: 14, 12
Width, Height: 16, 16

imgIcon

lblLabel

Note that left padding is applied so the label clears the icon on its left.

Text: ThisItem.Label
ToolTip: ThisItem.ToolTip
Width, Height: Parent.Width, 40
PaddingLeft: 46

lblLabel

Use Component on the screen

Add a screen, insert the ExpandMenu component, and set the following properties.

Width: If(Self.IsExpanded, 200, 46)

Items: Table(
    {Icon: "PowerApps", Label: "Power Apps", Screen: scrExpandMenu},
    {Icon: "PowerBILogo", Label: "Power BI", Screen: scrAutoWidthLabel},
    {Icon: "PowerAutomateLogo", Label: "Power Automate", Screen: scrBreadcrumb},
    {Icon: "Dataverse", Label: "Dataverse", Screen: scrCommandBar}
)
Screen using the component.

Legend

Earlier in the post I referenced #1, #2, and #3. I kept them here at the end because they are larger pieces of code.

#1:
Table(
    {
        Icon: "PowerApps",
        Label: "Power Apps",
        Screen: App.ActiveScreen,
        Tooltip: "Power Apps Tooltip"
    }
)

#2:
{
    palette: {
        themePrimary: "#0078d4",
        themeLighterAlt: "#eff6fc",
        themeLighter: "#deecf9",
        themeLight: "#c7e0f4",
        themeTertiary: "#71afe5",
        themeSecondary: "#2b88d8",
        themeDarkAlt: "#106ebe",
        themeDark: "#005a9e",
        themeDarker: "#004578",
        neutralLighterAlt: "#faf9f8",
        neutralLighter: "#f3f2f1",
        neutralLight: "#edebe9",
        neutralQuaternaryAlt: "#e1dfdd",
        neutralQuaternary: "#d0d0d0",
        neutralTertiaryAlt: "#c8c6c4",
        neutralTertiary: "#a19f9d",
        neutralSecondary: "#605e5c",
        neutralPrimaryAlt: "#3b3a39",
        neutralPrimary: "#323130",
        neutralDark: "#201f1e",
        black: "#000000",
        white: "#ffffff"
    }
}

#3:
LookUp(
    Table(
        {
            Name: "SizeLegacy",
            Code: "E2B2",
            Path: "some code for SVG PATH"
        },
        {
            Name: "PageLink",
            Code: "E302",
            Path: "some other code for SVG PATH"
        }
    ),
    Name = IconName
).Path

Legend

Summary

This post may help you when you look at the ExpandMenu component code and want to use the same pattern for your own control. Keep your component as clean as this one.

You can look at this video for the Creator Kit Overview.

Please let me know your feedback.


Tip: Connect to various M365 PowerShell modules for a demo tenant.

Summary

This is a quick tip for connecting to a demo tenant: store the password (as a secure string) to a file once, then call the connect commands as many times as you want when testing against the demo tenant.

#Execute Read-Host once and comment out 
#Read-Host "Enter password" -AsSecureString | ConvertFrom-SecureString | Out-File "C:\Temp\passwordNEW.txt"

$TenantName="CRM106438" # Change to your tenant name
$Username = $("admin@{0}.onmicrosoft.com" -f $TenantName)
$Password = cat "C:\Temp\passwordNEW.txt" | ConvertTo-SecureString
$Creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $Username, $Password
Connect-MicrosoftTeams -Credential $Creds 
Connect-ExchangeOnline -Credential $Creds
Connect-SPOService -Url $("https://{0}-admin.sharepoint.com" -f $TenantName) -Credential $Creds
Connect-MsolService -Credential $Creds 
Connect-AzureAD -Credential $Creds 


How to download ‘ALL’ files from a very large document library?

Summary

My customer had a large document library on SharePoint Online with a deep folder structure. The total number of items was 2.7 million (2,743,321), and they wanted all of those documents downloaded to a local folder. Yes, this is an actual scenario and a real issue.

The issue was how long the script they had written took to download everything. The script worked, but it took a very long time; performance was a huge issue. Additionally, machine resources were getting affected, and the downloads happened sequentially with no parallelism.

This article shows the approach I proposed and the working solution for the customer. The post mainly consists of the scripts, so there is nothing you need to write from scratch, but you will need to be knowledgeable in PowerShell and have a little MS Graph API experience.

An approach

  1. Do not use the PnP PowerShell Get-PnPListItem call. It will take forever to complete; it is a helpful command, but not for this scenario.
  2. Instead, get the files' relative URLs using the MS Graph API.
  3. Request only the ID and WebURL properties from MS Graph, as highlighted above.
  4. Store the ID and WebURL values in CSV files, in batches of 100,000 rows.
  5. Once all the CSV files are generated, run another script (Phase 2 below).
  6. That script imports a CSV file and iterates over all the web URLs.
  7. Using the web URL value, it creates the folder and downloads the file if it is not already present, using the Get-PnPFile command.
  8. This is an important step: run the script in multiple PowerShell windows so the files download in parallel (see the sketch after this list).
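For step 8, a minimal sketch of kicking off a few batches in parallel might look like the following; the batch range and script path are assumptions for illustration, and the orchestrator script later in this post automates the same idea from a status CSV file.

# Hypothetical: launch batches 1 to 4, each in its own PowerShell window.
# Assumes the Phase 2 script below is saved as C:\scripts\DownloadFiles.ps1.
1..4 | ForEach-Object {
    Start-Process PowerShell -ArgumentList "-File C:\scripts\DownloadFiles.ps1 -batchNumber $_"
}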

Phase 1

In Phase 1, you need a way to produce CSV files containing the ID and WebURL of every item in the document library. Each CSV file holds 100K rows; if you do the math, that is 28 CSV files for 2.7 million records (your numbers will differ based on your files).

To run the script below you will need the following:

  • An Azure Function running on a queue trigger.
  • A storage account with an Azure Queue and an Azure Blob container.
  • An Azure AD app with MS Graph Sites read access (I used full control).
  • A driveID; I used the Graph PowerShell SDK to get it, as shown below.
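A minimal sketch of how I retrieved the driveID with the Graph PowerShell SDK is below; the site ID and list ID are placeholders you need to replace. The queue-triggered Azure Function script itself follows.

# Connect with a scope that can read the site (placeholder scope).
Connect-MgGraph -Scopes "Sites.Read.All"

# Placeholders: use your own site ID and the document library's list ID.
$drive = Get-MgSiteListDrive -SiteId "YOUR-SITE-ID" -ListId "YOUR-LIST-ID"
$drive.Id   # this value goes into the $driveID variable in the script below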

# Input bindings are passed in via param block.
param($QueueItem, $TriggerMetadata)

# Write out the queue message and insertion time to the information log.
Write-Host "PowerShell queue trigger function processed work item: $QueueItem"
Write-Host "Queue item insertion time: $($TriggerMetadata.InsertionTime)"


# Populate with the App Registration details and Tenant ID
$ClientId                = "TODO"
$ClientSecret            = "TODO" 
$queueName               = "TODO"
$containerName           = "TODO"  
$tenantid                = "TODO" 
$env:AzureWebJobsStorage = "TODO"
$env:LOG_FILE_PATH       = "C:\TEMP"
$GraphScopes             = "https://graph.microsoft.com/.default"
$driveID                 = "TODO" 
# To get drive id variable execute the following command
# $drives = Get-MgSiteListDrive -SiteId Your-SITE -ListId Your-LIST 
# You will need to connect to Graph. Follow this article.

# Get the access token to execute MS Graph calls.
$headers = @{
    "Content-Type" = "application/x-www-form-urlencoded"
}
# Formulate body with four parameters.
$body = "grant_type=client_credentials&client_id=$ClientId&client_secret=$ClientSecret&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default"
# Create login URL for the tenant id
$authUri = "https://login.microsoftonline.com/$tenantid/oauth2/v2.0/token"
# Make a POST call to Azure AD login URL
$response = Invoke-RestMethod $authUri  -Method 'POST' -Headers $headers -Body $body
# Using Token from the above call, create header with bearer token
$headers = @{
    "Content-Type" = "application/x-www-form-urlencoded"
    "Authorization" = $("Bearer {0}" -f $response.access_token)
}
#Function to move local file to blob storage
function MoveLogFilesToBlobContainer
{
    $storageContainer = New-AzStorageContext -ConnectionString $env:AzureWebJobsStorage | Get-AzStorageContainer  -Name $containerName
    #Write-Output $storageContainer
    Get-ChildItem $env:LOG_FILE_PATH -Filter ListOfIDs*.csv | 
    Foreach-Object {
        $blobNameWithFolder = $("{0}" -f $_.Name)
        Write-Output $("Move {0} to {1} Blob Container AS BlobName {2}." -f $_.FullName, $storageContainer.Name, $blobNameWithFolder)
        Set-AzStorageBlobContent -File $_.FullName `
            -Container $storageContainer.Name `
            -Blob $blobNameWithFolder `
            -Context $storageContainer.Context -Force
        Remove-Item -Path $_.FullName -Force
    }
}
#Function to put a message in a queue
function Put2MsgInQueue([Int]$aCounter,[String]$anUrl2Process)
{
    $FormattedMessage = $("{0},{1}" -f $aCounter,  $anUrl2Process )
    Write-Host $FormattedMessage
    $context = New-AzStorageContext -ConnectionString $env:AzureWebJobsStorage
    $queue = Get-AzStorageQueue -Name $queueName -Context $context
    # Create a new message using a constructor of the CloudQueueMessage class
    $queueMessage = [Microsoft.Azure.Storage.Queue.CloudQueueMessage]::new($FormattedMessage)
    # Add a new message to the queue
    $queue.CloudQueue.AddMessageAsync($QueueMessage)
}

function ScrapTheListItems([String]$aRestURI, [int]$batchNumber)
{
    $StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
    $restURI = $aRestURI
    Write-Output $("restURI {0}" -f $restURI);
    # 200 * 500 = 100,000 rows in file.
    # 200 rows per API call
    $batchCountSize           = 500   # 200 rows per call * 500 calls = 100,000 rows per CSV file (use a small value like 2 for testing)
    #initialize the index and array to start
    $batchIndex               = 1
    $outArray = @()
    # MAKE a call to MS GRAPH API using the bearer token header
    $response = Invoke-RestMethod $restURI  -Method 'GET' -Headers $headers
    Write-Output $("response {0}" -f $response);
    # Get the next link URL.
    $restURI = $response."@odata.nextLink"
    while ($null -ne $restURI)
    {
        # Convert an array with Name & Value pair to an object array.
        # This is needed so the object array can be stored as CSV
        foreach ( $i in $response.value)
        {
            $anObj = New-Object PSObject
            Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'id' -Value $i.Id
            Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'webUrl' -Value $i.webUrl
            $outArray += $anObj
        }

        $totalRows = $batchIndex * 200

        Write-Output $("batchIndex : {0}, call to graph API for 200 rows now total is  {1}" -f $batchIndex, $totalRows );

        if ( $batchIndex -eq $batchCountSize)
        {
            $exportCsvURLPath          = $("{0}\ListOfIDs-{1}.csv" -f $env:LOG_FILE_PATH, $batchNumber )
            Write-Output $("Create {0}" -f $exportCsvURLPath);
            # create a message in the Queue to start a new func app to process 1000 urls.
            $outArray | Export-Csv -Path "$exportCsvURLPath" -NoTypeInformation -Force
            ## MOVE TO BLOB CONTAINER
            MoveLogFilesToBlobContainer
            #initialize the index and array to start
            $batchIndex               = 1
            $outArray = @()
            # add file batch number to next 
            $batchNumber++
            ###NOW EXIT FROM LOOP
            break
        }
        else
        {
            $batchIndex++
        }
        # MAKE a call to MS GRAPH API using the bearer token header
        $response = Invoke-RestMethod $restURI  -Method 'GET' -Headers $headers
        # Get the next link URL.
        $restURI = $response."@odata.nextLink"
    }

    # The last remaining batch may be less than the batch count size
    if (($batchIndex -gt 1) -or ($outArray.Count -gt 0))
    {
            $exportCsvURLPath          = $("{0}\ListOfIDs-{1}.csv" -f $env:LOG_FILE_PATH, $batchNumber )
            Write-Output $("Create {0}" -f $exportCsvURLPath);
            # Create a message in the Queue to start a new Function App.
            $outArray | Export-Csv -Path "$exportCsvURLPath" -NoTypeInformation -Force
            ## MOVE the CSV file TO the BLOB CONTAINER
            MoveLogFilesToBlobContainer
    }
    if ($null -ne $restURI)
    {
        Put2MsgInQueue -aCounter $batchNumber -anUrl2Process $restURI
    }
    $StopWatch.Stop()
    Write-Output $("Elapsed time in TotalMinutes: {0}" -f $StopWatch.Elapsed.TotalMinutes);
}

# To start this Function add a manual queue message as "START"
if ("START" -eq $QueueItem)
{
    Write-Host "we are running first time"
    $counter = 1
    # For first time we need to make a call to MS Graph API
    $firstURI = $("https://graph.microsoft.com/v1.0/drives/$driveID/list/items?{0}" -f '$Select=Id%2CWebUrl') 
    ScrapTheListItems $firstURI  $counter
}
else
{
    # Function will always fall here with the index#, URL to fetch
    $splittedArray = $QueueItem.split(",")
    $counter = [int]$splittedArray[0]
    ScrapTheListItems $splittedArray[1] $counter
}

Phase 2

In Phase 2, the script downloads the files. It performs the following steps.

Read the CSV file for the passed-in batch number, e.g. 1, 2, 3, …, 274. The files are assumed to be in blob storage; if you want to point to a local folder, you may need to change the code.

Read all 100K records. Using the web URL, check whether the local folder and the file already exist.

If they are not present, create the folder and download the file using the Get-PnPFile command.

Run the following script in multiple PowerShell prompts so the files download in parallel. Even if you run it twice with the same batch number parameter, the script detects files that were already downloaded and skips them.


# Input bindings are passed in via param block.
param($batchNumber)

### TODO REMOVE LATER ONLY FOR DEBUGGING
$MaxFiles2Get                   = 10000
$CurrentFileNumber              = 0
#Initialize variables
$DownloadLocation               = "V:\Verification Documents"
$SiteURL                        = "https://Contoso.sharepoint.com/sites/LegalDept"
$CSVFilesPath                   = "C:\LegalDept"
$OrechestratorCSVFileName       = "0-Orchestrator.csv"
$env:LOG_FILE_PATH              = "C:\LegalDept\Logs"
$global:TotalFilesAlreadyPresent       = 0
$global:TotalFilesDownloaded           = 0
$global:ConnectPnPDoneFlag             = $false

Add-Type -AssemblyName System.Web

function Write-Log
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [Alias("LogContent")]
        [string]$Message,

        [Parameter(Mandatory=$false)]
        [Alias('LogPath')]
        [string]$Path='C:\Logs\PowerShellLog.log',
        
        [Parameter(Mandatory=$false)]
        [ValidateSet("Error","Warn","Info")]
        [string]$Level="Info",
        
        [Parameter(Mandatory=$false)]
        [switch]$NoClobber
    )

    Begin
    {
        # Set VerbosePreference to Continue so that verbose messages are displayed.
        $VerbosePreference = 'Continue'
    }
    Process
    {
        
        # If the file already exists and NoClobber was specified, do not write to the log.
        if ((Test-Path $Path) -AND $NoClobber) {
            Write-Error "Log file $Path already exists, and you specified NoClobber. Either delete the file or specify a different name."
            Return
            }

        # If attempting to write to a log file in a folder/path that doesn't exist create the file including the path.
        elseif (!(Test-Path $Path)) {
            Write-Verbose "Creating $Path."
            New-Item $Path -Force -ItemType File
            }

        else {
            # Nothing to see here yet.
            }

        # Format Date for our Log File
        $FormattedDate = Get-Date -Format "yyyy-MM-dd HH:mm:ss"

        # Write message to error, warning, or verbose pipeline and specify $LevelText
        switch ($Level) {
            'Error' {
                Write-Error $Message
                $LevelText = 'ERROR:'
                }
            'Warn' {
                Write-Warning $Message
                $LevelText = 'WARNING:'
                }
            'Info' {
                Write-Verbose $Message
                $LevelText = 'INFO:'
                }
            }
        
        # Write log entry to $Path
        "$FormattedDate $LevelText $Message" | Out-File -FilePath $Path -Append
        ## also dump to console
        #$savedColor = $host.UI.RawUI.ForegroundColor 
        #$host.UI.RawUI.ForegroundColor = "DarkGreen"
        Write-Output  $message 
        #$host.UI.RawUI.ForegroundColor = $savedColor
    }
    End
    {
    }
}

function WriteExceptionInformation($AnItem)
{
    Write-Log -Path $LogFileName  $AnItem.Exception.Message
    Write-Log -Path $LogFileName  $AnItem.Exception.StackTrace    
    <#        
    Write-Log -Path $LogFileName  $AnItem.Exception.ScriptStackTrace
    Write-Log -Path $LogFileName  $AnItem.InvocationInfo | Format-List *
    #> 
}

function UpdateOrchestratorInouFile()
{
    param (
        [string]$Status2update
    )
    $csvfile = Import-CSV -Path $("{0}\{1}" -f $CSVFilesPath, $OrechestratorCSVFileName)

    if ( $null -ne $csvfile)
    {
        $outArray = @()
        foreach ( $aRowInFile in $csvfile)
        {
            $anObj = New-Object PSObject
            Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'BatchNumber' -Value $aRowInFile.BatchNumber
            if ($aRowInFile.BatchNumber -eq $batchNumber)
            {
                # Change status to Status2update
                Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'Status' -Value $Status2update
            }
            else {
                <# Action when all if and elseif conditions are false #>
                # Keep the status as is
                Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'Status' -Value $aRowInFile.Status
            }
            $outArray += $anObj
        }
        # Important step modify the file.
        $outArray | Export-Csv -Path $("{0}\{1}" -f $CSVFilesPath, $OrechestratorCSVFileName) -NoTypeInformation -Force
    }
}


function MainWorkerFunc 
{

    $didFailStatusHappen = $false
    $importCsvURLPath          = $("ListOfIDs-{0}.csv" -f $batchNumber )
    $csvfile = Import-CSV -Path $("{0}\{1}" -f $CSVFilesPath, $importCsvURLPath)
    if ( $null -ne $csvfile)
    {
        UpdateOrchestratorInouFile -Status2update "INPROGRESS"

        try {


            # sample https://Contoso.sharepoint.com/sites/LegalDept/Verification%20Documents/Documents/FirtnameLastname0877_115502.pdf
            foreach ( $aRowInFile in $csvfile)
            {
                $webUrl2work = $aRowInFile.webUrl
                if ( $null -ne $webUrl2work)
                {
                    # remove the URL decoding from web url
                    $decodedWebUrl2work = [System.Web.HttpUtility]::UrlDecode($webUrl2work) 
                    $fileName = Split-Path -Path $decodedWebUrl2work -Leaf 
                    $splitArr = $decodedWebUrl2work.split('/')             
                    $filePath = $DownloadLocation
                    # now build a local path
                    $idx = 0;
                    foreach ( $valInArr in $splitArr)
                    {
                        # skip the first six URL segments (indices 0-5): protocol, empty, host, 'sites', site name, and library name
                        if ( $idx -ge 6)
                        {
                            # skip the file name
                            if ( $fileName -ne $valInArr)
                            {
                                # append the path to the existing
                                $filePath = $("{0}\{1}" -f $filePath, $valInArr)
                            }
                        }
                        $idx++
                    }
                    #Ensure All Folders in the Local Path
                    $LocalFolder = $filePath
                    #Create Local Folder, if it doesn't exist
                    If (!(Test-Path -Path $LocalFolder)) 
                    {
                        New-Item -ItemType Directory -Path $LocalFolder | Out-Null
                    }
                    #Download file , if it doesn't exist
                    If (!(Test-Path -LiteralPath $("{0}\{1}" -f $filePath, $fileName))) 
                    {
                        try
                        {
                            if ( $global:ConnectPnPDoneFlag -eq $false )
                            {
                                Write-Log -Path $LogFileName  $("Connecting to {0}" -f $SiteURL);
                                Connect-PnPOnline $SiteURL -ClientId "TODO" -ClientSecret "*****"
                                Write-Log -Path $LogFileName  $("Connected  to {0}" -f $SiteURL);
                                # since we are connected make this flag true
                                $global:ConnectPnPDoneFlag = $true
                            }
                            # strip the host from the URL
                            # https://Contoso.sharepoint.com/sites/LegalDept/Verification%20Documents/Documents/FirtnameLastname0877_115502.pdf
                            # should be /sites/LegalDept/Verification%20Documents/Documents/FirtnameLastname0877_115502.pdf
                            $relativeFileURL = ([uri]$webUrl2work).LocalPath
                            Write-Log -Path $LogFileName  $("Download file from {0}." -f $relativeFileURL);
                            Get-PnPFile -Url $relativeFileURL -Path $filePath -FileName "$fileName" -AsFile
                            Write-Log -Path $LogFileName  $("to {0}\{1}." -f $filePath,$fileName);
                            $global:TotalFilesDownloaded += 1
                        }
                        catch
                        {
                            WriteExceptionInformation ( $PSItem )
                            UpdateOrchestratorInouFile -Status2update "FAILED"
                            $didFailStatusHappen = $true
                            ### STOP everything if an error occurred
                            break
                        }
                    }
                    else
                    {
                        $global:TotalFilesAlreadyPresent += 1
                        Write-Log -Path $LogFileName  $("File {0}\{1} already downloaded." -f $filePath,$fileName);
                    }
                    $CurrentFileNumber += 1
                    Write-Log -Path $LogFileName  $("CurrentFileNumber {0}" -f $CurrentFileNumber);
                    # TODO REMOVE LATER
                    if ( $CurrentFileNumber -eq $MaxFiles2Get)
                    {
                        break
                    }
                }
            }
        }
        catch {
            WriteExceptionInformation ( $PSItem )
            UpdateOrchestratorInouFile -Status2update "FAILED"
            $didFailStatusHappen = $true
            ### Stop processing this batch if an error occurred;
            ### the finally block below still runs and the summary is logged afterwards.
            return
        }
        finally {
            <#Do this after the try block regardless of whether an exception occurred or not#>
            ##### 
            #Update complete only if fail did not happen before.
            if ( $true -ne $didFailStatusHappen )
            {
                UpdateOrchestratorInouFile -Status2update "COMPLETE"
        
            }
        }

    }
}

$StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
$LogFileName = $("{0}\Batch-{1:d2}-Log-{2}.txt" -f $env:LOG_FILE_PATH , $batchNumber, (Get-Date -Format "yyyy-MM-dd-HH-mm-ss"))
Write-Log -Path $LogFileName " *************************************** Start  *************************************** "

#Change Window Title
$Host.UI.RawUI.WindowTitle = $("Batch number {0}." -f $batchNumber);

MainWorkerFunc # CALL THE MAIN WORKER FUNCTION
$StopWatch.Stop()
Write-Log -Path $LogFileName " ------------------------------------------------------------------------------------- "
Write-Log -Path $LogFileName  $("Batch number {0}." -f $batchNumber);
Write-Log -Path $LogFileName  $("Total files already found present: {0}" -f $global:TotalFilesAlreadyPresent);
Write-Log -Path $LogFileName  $("Total files downloaded: {0}" -f $global:TotalFilesDownloaded);
$StopWatch.Stop()
Write-Log -Path $LogFileName  $("Elapsed time in TotalMinutes: {0}" -f $StopWatch.Elapsed.TotalMinutes);
Write-Log -Path $LogFileName " ------------------------------------------------------------------------------------- "
Write-Log -Path $LogFileName " ***************************************  End   *************************************** "

Orchestrator PowerShell Script



#Initialize variables
$CSVFilesPath                   = "C:\LegalDept"
$OrechestratorCSVFileName       = "0-Orchestrator.csv"
$env:LOG_FILE_PATH              = "C:\LegalDept\Logs"

function WriteExceptionInformation($AnItem)
{
    Write-Log -Path $LogFileName  $AnItem.Exception.Message
    Write-Log -Path $LogFileName  $AnItem.Exception.StackTrace            
    Write-Log -Path $LogFileName  $AnItem.Exception.ScriptStackTrace
    Write-Log -Path $LogFileName  $AnItem.InvocationInfo | Format-List * 
}

function Write-Log
{

    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [Alias("LogContent")]
        [string]$Message,

        [Parameter(Mandatory=$false)]
        [Alias('LogPath')]
        [string]$Path='C:\Logs\PowerShellLog.log',
        
        [Parameter(Mandatory=$false)]
        [ValidateSet("Error","Warn","Info")]
        [string]$Level="Info",
        
        [Parameter(Mandatory=$false)]
        [switch]$NoClobber
    )

    Begin
    {
        # Set VerbosePreference to Continue so that verbose messages are displayed.
        $VerbosePreference = 'Continue'
    }
    Process
    {
        
        # If the file already exists and NoClobber was specified, do not write to the log.
        if ((Test-Path $Path) -AND $NoClobber) {
            Write-Error "Log file $Path already exists, and you specified NoClobber. Either delete the file or specify a different name."
            Return
            }

        # If attempting to write to a log file in a folder/path that doesn't exist create the file including the path.
        elseif (!(Test-Path $Path)) {
            Write-Verbose "Creating $Path."
            New-Item $Path -Force -ItemType File
            }

        else {
            # Nothing to see here yet.
            }

        # Format Date for our Log File
        $FormattedDate = Get-Date -Format "yyyy-MM-dd HH:mm:ss"

        # Write message to error, warning, or verbose pipeline and specify $LevelText
        switch ($Level) {
            'Error' {
                Write-Error $Message
                $LevelText = 'ERROR:'
                }
            'Warn' {
                Write-Warning $Message
                $LevelText = 'WARNING:'
                }
            'Info' {
                Write-Verbose $Message
                $LevelText = 'INFO:'
                }
            }
        
        # Write log entry to $Path
        "$FormattedDate $LevelText $Message" | Out-File -FilePath $Path -Append
        ## also dump to console
        #$savedColor = $host.UI.RawUI.ForegroundColor 
        #$host.UI.RawUI.ForegroundColor = "DarkGreen"
        Write-Output  $message 
        #$host.UI.RawUI.ForegroundColor = $savedColor
    }
    End
    {
    }
}


function MainOrchestratorFunc {
    $csvfile = Import-CSV -Path $("{0}\{1}" -f $CSVFilesPath, $OrechestratorCSVFileName)

    if ( $null -ne $csvfile)
    {
        foreach ( $aRowInFile in $csvfile)
        {
            Write-Log -Path $LogFileName $("Batch number {0:d2} has Status {1}" -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
            switch ($aRowInFile.Status.ToUpper())
            {
                "NEW"
                {
                    Write-Log -Path $LogFileName $("Batch number {0:d2} has '{1}' Status. Spawn this batch and change status to InProgress." -f $aRowInFile.BatchNumber, $aRowInFile.Status )
                    # spawn the download script with this batch number
                    SpawnThePowerShellProcess -batchnumber2Process $aRowInFile.BatchNumber
                }
                "INPROGRESS"
                {
                    Write-Log -Path $LogFileName $("Batch number {0:d2} has '{1}' Status. Do nothing." -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
                }
                "FAILED"
                {
                    Write-Log -Path $LogFileName $("Batch number {0:d2} has '{1}' Status. Spawn this batch and change status to InProgress." -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
                    SpawnThePowerShellProcess -batchnumber2Process $aRowInFile.BatchNumber
                }
                "COMPLETE"
                {
                    Write-Log -Path $LogFileName $("Batch number {0:d2} has '{1}' Status. Do nothing." -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
                }
                default
                    {
                        Write-Log -Path $LogFileName $("Batch number {0:d2} has an INVALID Status {1}" -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
                    }
            }
        }
    }
}
function SpawnThePowerShellProcess {
    param (
        [int]$batchnumber2Process
    )
    $processOptions = @{
        FilePath = "PowerShell" 
        WorkingDirectory = "C:\scripts"
        ArgumentList = "C:\scripts\DownloadFiles.ps1 -batchNumber $batchnumber2Process" 
    }
    Start-Process @processOptions -Verb RunAs -WindowStyle Normal
}

$StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
$LogFileName = $("{0}\Orchestrator-Log-{1}.txt" -f $env:LOG_FILE_PATH , (Get-Date -Format "yyyy-MM-dd-HH-mm-ss"))
Write-Log -Path $LogFileName " *************************************** Start  *************************************** "
MainOrchestratorFunc # CALL THE MAIN ORCHESTRATOR FUNCTION
$StopWatch.Stop()
Write-Log -Path $LogFileName " ------------------------------------------------------------------------------------- "
Write-Log -Path $LogFileName  $("Elapsed time in TotalMinutes: {0}" -f $StopWatch.Elapsed.TotalMinutes);
Write-Log -Path $LogFileName " ------------------------------------------------------------------------------------- "
Write-Log -Path $LogFileName " ***************************************  End   *************************************** "

The Orchestrator CSV file

The Status field is case-insensitive.

Status     | Meaning                                    | Example row
New        | A new batch that has not yet run           | 1,New
InProgress | The batch is currently running             | 1,InProgress
Failed     | The batch failed and needs to run again    | 1,Failed
Complete   | The batch is complete; DO NOT run it again | 1,Complete
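If you need to generate the initial orchestrator file with every batch set to New, a quick sketch (assuming 28 batches and the C:\LegalDept path used above; adjust both for your environment) could be:

# Hypothetical helper: create 0-Orchestrator.csv with every batch set to New.
1..28 | ForEach-Object { [PSCustomObject]@{ BatchNumber = $_; Status = "New" } } |
    Export-Csv -Path "C:\LegalDept\0-Orchestrator.csv" -NoTypeInformation

A sample of the file part-way through a run looks like this: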

"BatchNumber","Status"
"1","COMPLETE"
"2","NEW"
"3","COMPLETE"
"4","INPROGRESS"
"5","COMPLETE"
"6","COMPLETE"
"7","INPROGRESS"
"8","COMPLETE"
"9","FAILED"
"10","COMPLETE"

Conclusion

The script and instructions are rough, but they should be helpful if you are a developer.

The customer originally used the code from here. That code works, but for the large document library they noticed the issues mentioned in the summary.

This article should help you download the files in multiple parallel processes. The Phase 2 script can be run again after a failure and will pick up where it left off, because it skips files that are already downloaded to the local folder.


How to use AAD Access Token in Connect-MgGraph?

Summary

The Microsoft Graph PowerShell SDK is a great and simple way to get MS Graph API PowerShell code working quickly. However, most of the source code and examples I found use the X.509 certificate way of authenticating. For a quick demo with an Azure AD access token there is a simpler way, which I describe in this post.

Script example

The tip is very simple. Since Connect-MgGraph does not have a client secret parameter, use Invoke-RestMethod to get the access token. Once a valid token is received, pass it to Connect-MgGraph and make the rest of the MS Graph SDK calls after that.

In the following example, I use the Get-MgGroup call after successfully connecting to MS Graph.

# The following check is only required once per user/machine.
if ( (Get-ExecutionPolicy -Scope CurrentUser) -eq "RemoteSigned" )
{
    Write-Host "RemoteSigned policy is already set."
}
else
{
    Write-Host "RemoteSigned policy is not set; setting it for the current user."
    Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
}

if (Get-Module -ListAvailable -Name Microsoft.Graph) {
    Write-Host "Microsoft.Graph Module exists"
} 
else {
    Write-Host "Microsoft.Graph Module does not exist"
    Install-Module Microsoft.Graph -Scope AllUsers
}

# Populate with the App Registration details and Tenant ID
$ClientId          = "TODO"
$ClientSecret      = "TODO" 
$tenantid          = "TODO" 
$GraphScopes       = "https://graph.microsoft.com/.default"


$headers = @{
    "Content-Type" = "application/x-www-form-urlencoded"
}

$body = "grant_type=client_credentials&client_id=$ClientId&client_secret=$ClientSecret&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default"
$authUri = "https://login.microsoftonline.com/$tenantid/oauth2/v2.0/token"
$response = Invoke-RestMethod $authUri  -Method 'POST' -Headers $headers -Body $body
$response | ConvertTo-Json
 
$token = $response.access_token
 
# Authenticate to the Microsoft Graph
Connect-MgGraph -AccessToken $token

# If you want to see debugging output of the command just add "-Debug" to the call.
Get-MgGroup -Top 10

Conclusion

I hope this helps you. I use this technique to quickly check / test the calls to the MS Graph.

Note: Please make sure your Azure AD app has the required permissions applied and consented; otherwise you will get an "Insufficient privileges to complete the operation." error.

You can also use the MS Graph Explorer as a UI way to test your API calls and check the required permissions.

https://aka.ms/GE

PS C:\WINDOWS\system32> Get-MgUser -Top 10
Get-MgUser : Insufficient privileges to complete the operation.
At line:1 char:1
+ Get-MgUser -Top 10
+ ~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: ({ ConsistencyLe...ndProperty =  }:<>f__AnonymousType59`9) [Get-MgUser_List1], RestException`1
    + FullyQualifiedErrorId : Authorization_RequestDenied,Microsoft.Graph.PowerShell.Cmdlets.GetMgUser_List1

PS C:\WINDOWS\system32> 

How to get all sites from the tenant using MS Graph API?

Summary

The PnP PowerShell command Get-PnPTenantSite, which gets all sites from the tenant, takes a long time. Additionally, it does not offer an asynchronous way to get this information from an Azure Durable Function.

This article uses the MS Graph List sites API to get the sites. To use this API, the Azure AD app requires the following application API permissions:

Sites.Read.All, Sites.ReadWrite.All

Script

$StopWatch = [System.Diagnostics.Stopwatch]::StartNew()

# You will need Azure AD app with the following API permissions.
# Application	Sites.Read.All
#
$ClientId          = "TODO"
$ClientSecret      = "TODO"
$tenantid          = "TODO"
$path2File         = 'C:\temp\test.txt' # Change this as you like.

## Get Auth Token ## 
$headersAuth = @{
    "Content-Type" = "application/x-www-form-urlencoded"
    'Accept' = '*/*'
}
$body = $("grant_type=client_credentials&client_id={0}&client_secret={1}&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default" -f $ClientId, $ClientSecret)
$outhTokenUrl = $("https://login.microsoftonline.com/{0}/oauth2/v2.0/token" -f $tenantid)
$response = Invoke-RestMethod $outhTokenUrl -Method 'POST' -Headers $headersAuth -Body $body
$response | ConvertTo-Json
$tokenExpiryTime = (get-date).AddSeconds($response.expires_in)
##
## Make the first call with $filer to your tenant name ##
##
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/json")
$headers.Add("SdkVersion", "postman-graph/v1.0")
$headers.Add("Prefer", "apiversion=2.1")
$headers.Add("Authorization", $("Bearer {0}" -f $response.access_token) )
$response = Invoke-RestMethod 'https://graph.microsoft.com/v1.0/sites?$filter=siteCollection/hostname eq ''{CHANGE TO YOUR TENANT NAME}.sharepoint.com''' -Method 'GET' -Headers $headers
$response | ConvertTo-Json
## Iterate over every page of results.
do
{
    ## Write each site URL in the current page to the file.
    foreach ( $val in $response.Value )
    {
        Write-Output $("{0}" -f $val.webUrl)
        Add-Content -Path $path2File -Value $val.webUrl
    }
    # Check if the token expired; if it did, get a new one.
    if ( (Get-Date) -gt $tokenExpiryTime )
    {
        # NOTE: use the auth headers (form-urlencoded), not the Graph headers, for the token call.
        $authResponse = Invoke-RestMethod $outhTokenUrl -Method 'POST' -Headers $headersAuth -Body $body
        $tokenExpiryTime = (Get-Date).AddSeconds($authResponse.expires_in)

        # Update the Graph header with the new token.
        $headers["Authorization"] = $("Bearer {0}" -f $authResponse.access_token)
    }

    # Fetch the next page, if there is one.
    $nextLink = $response.'@odata.nextLink'
    if ( $null -ne $nextLink )
    {
        $response = Invoke-RestMethod $nextLink -Method 'GET' -Headers $headers
        # $response | ConvertTo-Json   # uncomment for debugging
    }
} while ( $null -ne $nextLink )
$StopWatch.Stop()
Write-Host "This script took $($StopWatch.Elapsed.TotalSeconds) seconds to run."

Conclusion

Using the MS Graph API, you can quickly get the list of all sites in your tenant. The same can be done with Get-PnPTenantSite, but that command has overhead if all you want is the URL of every site.


How to hide welcome message for an empty SharePoint List?

Summary

When you create a new custom list, you will notice that the new list shows the message below.

An image with the text "Welcome to your new list" and "Select the New button to get started".

The custom welcome message for the empty list.

This post should help you to hide the welcome message.

Step By Step Solution

Step #1: Create a hideWelcome column of Yes/No type with the default value set to No.

Creating the hideWelcome column

Step #2: Add a dummy first row with hideWelcome set to Yes.

Adding a dummy row with hideWelcome set to Yes

Step #3: Add the following view formatter JSON code for the trick.

{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/v2/row-formatting.schema.json",
  "hideColumnHeader": true,
  "hideSelection": true,
  "debugMode": true,
  "rowFormatter": {
    "elmType": "div",
    "style": {
      "display":  "=if([$hideWelcome], 'none', '')"
    },
	"children": [
	  {
		"elmType": "div",
		"txtContent": "[$Title]",
		"style": {
		  "flex-grow": "1"
		}
	  }
	]
  }
}

Because the first row has hideWelcome set to Yes, the welcome message is hidden.

The hidden welcome message
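If you prefer to apply the view formatter with PnP PowerShell instead of pasting the JSON through the list UI, a minimal sketch could look like the following; the site URL, list name, view name, and JSON file path are placeholders.

# Placeholders: adjust the site URL, list name, view name, and JSON file path.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/YourSite" -Interactive

# Read the row-formatting JSON shown above from a local file.
$formatterJson = Get-Content -Path "C:\Temp\hideWelcome-rowformat.json" -Raw

# Apply it as the custom formatter of the target view.
Set-PnPView -List "Your List" -Identity "All Items" -Values @{ CustomFormatter = $formatterJson }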

Conclusion

The above trick may not work for all scenarios, as I have not tested every scenario. It is a technique to hide the welcome message based on the hidden field named hideWelcome.


How to extract User Profile Photo using MS Graph API?

Summary

The Champion Management Platform Teams app, created in the PnP community, uses the user profile photo MS Graph API to extract the profile picture and update it with a badge.

This article will demonstrate how to do the same API call with Power Automate.

Prerequisites

  • Power Automate Premium license to use the HTTP actions
  • Azure AD app with the MS Graph User.ReadWrite.All permission granted

Step by Step Solution

Step # 1: Create an Azure AD App with MS Graph Application Permission granted

Azure AD app

Step #2: Make a note of Application ID, Tenant ID, and Client Secret for the above Azure AD app.

Use these noted values in the next step.

Step #3: Create a new Power Automate flow with the "Manually trigger a flow" trigger. Initialize three variables (ApplicationID, TenantID, and ClientSecret) and define one text input named UserUPN.

Step #4: Make an HTTP call to get the app’s access token.

HTTP call to get an access token.
# Please use the highlighted values for URI, Headers and Body. 

{
    "inputs": {
        "method": "POST",
        "uri": "https://login.microsoftonline.com/@{variables('TenantID')}/oauth2/v2.0/token",
        "headers": {
            "Content-Type": "application/x-www-form-urlencoded"
        },
        "body": "grant_type=client_credentials&scope=https://graph.microsoft.com/.default&client_id=@{variables('ApplicationID')}&client_secret=@{variables('ClientSecret')}"
    },
    "metadata": {
        "operationMetadataId": "a69e019a-d351-409a-ae1b-340a23f4b775"
    }
}

Step #5: Use Parse JSON to parse the output of the above action.

Parse JSON
{
    "type": "object",
    "properties": {
        "token_type": {
            "type": "string"
        },
        "expires_in": {
            "type": "integer"
        },
        "ext_expires_in": {
            "type": "integer"
        },
        "access_token": {
            "type": "string"
        }
    }
}

Step # 6: Now make the Get Profile Image call to get the Image content of the profile photo.

Make an HTTP call to get the profile image.
# Please use the highlighted values for URI and Headers. 

{
    "inputs": {
        "method": "GET",
        "uri": "https://graph.microsoft.com/v1.0/users/@{triggerBody()['text']}/photo/$value",
        "headers": {
            "responseType": "blob",
            "Content-Type": "blob",
            "Authorization": "@{body('ParseJSONforToken')?['token_type']} @{body('ParseJSONforToken')?['access_token']}"
        }
    },
    "metadata": {
        "operationMetadataId": "775579d0-6aa9-4326-a1f9-0ad37217c304"
    }
}

Step #7: Finally, add a Compose action to capture the image content from the above action.
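If you want to verify the same Graph endpoint outside of Power Automate, a quick PowerShell sketch (using the same app registration values; the UPN and output path are placeholders) could be:

# Placeholders: replace with your app registration details and a real user UPN.
$TenantID      = "TODO"
$ApplicationID = "TODO"
$ClientSecret  = "TODO"
$UserUPN       = "user@contoso.com"

# Get an app-only token, then download the photo binary to a local file.
$body  = "grant_type=client_credentials&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default&client_id=$ApplicationID&client_secret=$ClientSecret"
$token = (Invoke-RestMethod "https://login.microsoftonline.com/$TenantID/oauth2/v2.0/token" `
            -Method POST -ContentType "application/x-www-form-urlencoded" -Body $body).access_token

Invoke-RestMethod "https://graph.microsoft.com/v1.0/users/$UserUPN/photo/`$value" `
    -Headers @{ Authorization = "Bearer $token" } -OutFile "C:\Temp\profile-photo.jpg"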

Conclusion

As shown in the above technique, you can get users' profile images from the tenant. The Azure AD app plays an important part in making the MS Graph API call. The same API call can be made with PUT, so you should be able to apply a badge or anything else to the user's profile picture.


How to Export Intune reports using Graph APIs?

Summary

The following REST API call gets the Intune report data for the tenant.

# The API is a REST call with the request body to get the report CSV file.
https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs

Please refer here for more details on the API.

Step By Step Solution

Step #1: Create an Azure AD app with the MS Graph "DeviceManagementManagedDevices.Read.All" permission.

MS Graph “DeviceManagementManagedDevices.Read.All” permission.

Please note the Application ID, Secret, and Tenant ID. You will need these three pieces of information in the PowerShell Script.

Step #2: Run the following script using PowerShell.

# Init Variables
$outputPath    = "C:\Hold"
$outputCSVPath = "C:\Hold\EAWFAreport.zip"  #might need changed

$ApplicationID   = "TOBE CHANGED"
$TenantID        = "TOBE CHANGED"
$AccessSecret    = "TOBE CHANGED"

#Create an hash table with the required value to connect to Microsoft graph
$Body = @{    
    grant_type    = "client_credentials"
    scope         = "https://graph.microsoft.com/.default"
    client_id     = $ApplicationID
    client_secret = $AccessSecret
} 

#Connect to Microsoft Graph REST web service
$ConnectGraph = Invoke-RestMethod -Uri https://login.microsoftonline.com/$TenantID/oauth2/v2.0/token -Method POST -Body $Body

#Endpoint Analytics Graph API
$GraphGroupUrl = "https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs"

# define request body as PS Object
$requestBody = @{
    reportName = "Devices"
    select = @(
        "DeviceId"
        "DeviceName"
        "SerialNumber"
        "ManagedBy"
        "Manufacturer"
        "Model"
        "GraphDeviceIsManaged"
    )

}

# Convert to PS Object to JSON object
$requestJSONBody = ConvertTo-Json $requestBody

#define header, use the token from the above rest call to AAD.
# in post method define the body is of type JSON using content-type property.
$headers = @{
    'Authorization' = $("{0} {1}" -f $ConnectGraph.token_type,$ConnectGraph.access_token)
    'Accept' = 'application/json;'
    'Content-Type' = "application/json"
}

#This API call will start a process in the background to #download the file.
$webResponse = Invoke-RestMethod $GraphGroupUrl -Method 'POST' -Headers $headers -Body $requestJSONBody -verbose


#If the call is a success, proceed to get the CSV file.
if ( -not ( $null -eq $webResponse ) )
{
    #Check status of export (GET) until status = complete
    do
    {

#format the URL to make a next call to get the file location.
        $url2GetCSV = $("https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs('{0}')" -f $webResponse.id)
        "Calling $url2GetCSV"
        $responseforCSV = Invoke-RestMethod $url2GetCSV -Method 'GET' -Headers $headers  -verbose
        if (( -not ( $null -eq $responseforCSV ) ) -and ( $responseforCSV.status -eq "completed"))
        {
            #download CSV from "URL=" to OutputCSVPath
            #### It means the completed status is true, now get the file.
            Invoke-WebRequest -Uri $responseforCSV.url -OutFile $outputCSVPath
		# Un Zip the file.
            Expand-Archive -LiteralPath $outputCSVPath -DestinationPath $outputPath

        }
        else
        {
            Write-Host "Still in progress..."
        }
        Start-Sleep -Seconds 10 # Delay for 10 seconds.
    } While (( -not ( $null -eq $responseforCSV ) ) -and ( $responseforCSV.status -eq "inprogress"))

}

After this PowerShell script runs, you should see the ZIP and CSV files in the C:\Hold folder.

Conclusion

The above steps will help you get the Intune report data file. The API is still in beta; if anything changes, I will update this post.


How to prevent ListView WebPart from making frequent Search API calls?

Summary

My customer migrated their classic sites to SharePoint Online. Some site home pages use the List View web part, and these pages make multiple query calls to the Search API every 60 seconds.

This can cause the Search API to throttle the queries, and it can lead to a bad user experience if many users visit the same page frequently. Read this article for the details on throttling.

Step by Step to diagnose the issue

To diagnose an issue like throttling, you will need the browser's developer tools; you can get more information here about the developer tools for the Edge browser.

Open the developer tools' Console window and run localStorage.clear() and sessionStorage.clear(). This clears the browser's local and session storage.

Go to the SharePoint page with the List View web part, open the Network tab, and watch the outbound traffic. You will notice that every 60 seconds the List View web part tries to refresh.

This happens because of the following setting on the List View web part:

“Automatic Refreshing interval (seconds)”

Increase the auto-refresh interval and turn on the option to show the manual refresh button.

Conclusion

This may seem like a small thing, and it causes issues only when the page is popular and many users visit it at the same time. If a user leaves the page open for a long time, the auto-refresh keeps making calls at the set interval even though the user may not need the refresh at all.

The better fix for such a page is to increase the interval and provide the manual refresh button, so users can refresh when they need to. This also reduces unnecessary calls to the Search API.
