How to retrieve Response header values from the Custom Connector in PowerApps Canvas App?

Summary

For a project, I wrote a new Power Platform custom connector for an API call that returned values in the response header. I wrote the connector with the response header values mapped, but Power Apps did not see those values; I always got blank values in Power Apps. I used the Power Apps Monitor tool, and it showed that the network response did pass the header values.

After searching and asking experts, I found that the Power Apps designer cannot read response headers. So I used ScriptBase code in the connector to read the response header values and add them to the response body along with the other response data.

NOTE: This post covers an advanced-level step, so please work through the following article before you follow the tips here.

Get started with custom connectors in Power Automate

Steps to resolve

Create a custom connector with the above steps, or take any existing custom connector for which you need to read the response header.

To get familiar with writing ScriptBase code, follow this article. Once you have done all of the above steps, you can copy the code below into the ScriptBase.

    public class Script : ScriptBase
    {
        public override async Task<HttpResponseMessage> ExecuteAsync()
        {
            if (this.Context.OperationId == "TM_CreateJob" ||
                this.Context.OperationId == "OP_CreateJob")
            {
                // Forward the request to the backend API
                HttpResponseMessage response = await this.Context.SendAsync(this.Context.Request, this.CancellationToken)
                    .ConfigureAwait(continueOnCapturedContext: false);
                // Read the original response body (not used below, but handy if you
                // need to merge its properties into the new body)
                var responseString = await response.Content.ReadAsStringAsync().ConfigureAwait(false);
                // Pull the value from the response headers and build a new body
                var headers = response.Headers;
                if (response.IsSuccessStatusCode)
                {
                    var jobId = "Not Found";
                    if (headers.Contains("Operation-Location"))
                    {
                        jobId = headers.GetValues("Operation-Location").FirstOrDefault();
                    }
                    var newBody = new JObject
                    {
                        ["jobId"] = jobId
                    };
                    response.Content = CreateJsonContent(newBody.ToString());
                }
                }
                return response;
            }
            else
            {
                var response = await this.Context.SendAsync(this.Context.Request, this.CancellationToken).ConfigureAwait(false);
                return response;
            }
        } // End of ExecuteAsync
    } // end of class Script
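With the header folded into the response body, the canvas app can now read it like any other response property. A minimal Power Fx sketch, assuming the connector is named TMConnector (the connector name and request parameters are placeholders, not part of the sample above):

    // jobId now arrives in the response body instead of the response header
    Set(
        varJobId,
        TMConnector.TM_CreateJob( /* your request parameters */ ).jobId
    );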

Conclusion

I hope this code is useful to you. I found the following articles useful while working on this.

Create custom connector for own API – https://benediktbergmann.eu/2021/12/30/create-custom-connector-for-own-api/
Make Your First Custom Connector For Power Automate And Power Apps – https://www.matthewdevaney.com/make-your-first-custom-connector-for-power-automate-and-power-apps/
Custom Connector + Custom Headers? – Power CAT Live – https://www.youtube.com/watch?v=oVR3dFpepYc
Transform A Custom Connector Response With C# Code – https://www.matthewdevaney.com/transform-a-custom-connector-response-with-c-code/

Microsoft Build 2023 (May 23-25)

Summary

The following is an alphabetical list of all Build 2023 sessions. You can click a link to get to the session recording (if it is available and you have registered for free). I have crossed out the links that are not available for later viewing.


Sessions
360 Degrees of Feedback: Enhancing the Microsoft 365 Developer Program
2023 Imagine Cup World Championship
A
A lap around the Microsoft Store CLI
Accelerate development with Visual Studio and Microsoft Power Platform
Accelerate your application development with a flexible toolkit of UI controls
Accelerate your data potential with Microsoft Fabric
Accelerating 3D simulation workflows with NVIDIA Omniverse and DGX Cloud on Azure
Achieve more with Azure PaaS: Azure App Service product experts, Q&A
Advanced developer tips and tricks in Visual Studio
AI Infused Omnichannel Communications
AI innovation in the Microsoft Power Platform
AI made easier: How the ONNX Runtime and Olive toolchain will help you, Q&A
All things client and mobile app development with .NET MAUI
AMD + Azure: Outstanding performance and the latest security features
Angular and ReactJS for mobile and desktop apps
Anjuna Demo: Confidential Computing Made Easy
Ansys and Microsoft: Collaboration and tools to transform simulation
Application reliability with Azure Load Testing and Chaos Studio, Q&A
ASP.NET Core and Blazor futures, Q&A
Automate and protect your documents with Adobe and Microsoft 365
Automated cloud application testing with Microsoft Azure and Playwright, Q&A
Automated compliance testing tools for Microsoft 365 apps
Azure Container Instances (ACI) use-cases and roadmap
Azure Cosmos DB for MongoDB: Choosing the right architecture, Q&A
Azure Linux: A container host OS for Azure Kubernetes Service (AKS) Q&A
Azure Spring Apps: The easy way to run your apps
Azure Synapse Data Explorer and Delta Lake Integration
Azure-enabled vision AI with NVIDIA AI Enterprise and Jetson
B
Best practices on managing GPUs in Azure with Run:ai
Better decision making with always up-to-date vector databases
Beyond 300M users: How to bring your Teams app to Outlook and Microsoft 365
Blazor + .NET MAUI – the perfect “hybrid”
Boost Dev Time: Blazor Hybrid App + .NET MAUI = 2X Faster Results!
Build AI-assisted communication workflows for customer engagement
Build and maintain your company Copilot with Azure ML and GPT-4
Build and Maintain your Company Copilot with Azure ML and GPT-4, Q&A
Build and ship global full stack serverless web apps, Q&A
Build apps with state-of-the-art computer vision
Build Intelligent Apps with .NET and Azure
Build IoT solutions with MQTT in Azure Event Grid
Build Modern Applications at Lightning Speed with Orkes
Build scalable and secure enterprise apps on OSS databases, Q&A
Build scalable, cloud-native apps with AKS and Azure Cosmos DB
Build secure applications with External Identities in Microsoft Entra
Build The Best With Active Tests: Shift Left with API Security
Build the next big thing with MongoDB Atlas on Microsoft Azure
Build with Microsoft Word as a platform: Word JavaScript APIs and key user scenarios
Build, Customize, and Deploy LLMs At-Scale on Azure with NVIDIA NeMo
Building Adaptive Card-based Loop components for Microsoft 365
Building AI solutions with Semantic Kernel
Building and scaling cloud-native, intelligent applications on Azure
Building and using AI models responsibly
Building and using AI models responsibly, Q&A
Building Chat Plugins for Microsoft Bing and Edge
Building computer vision applications, Q&A
Building on the Microsoft Cloud: Audio/video calling from a custom app
C
Cloud Security from Left-to-Right, Intelligent, Integrated, Automated
Cloud-native and Linux on Microsoft Azure, Q&A
Cloud-native development with .NET 8
Collaboration by design in the age of AI and the modern workplace
Connecting and securing workloads from anywhere made easy with AI/ML
Continuous delivery with GitHub Actions
Create custom Virtual Meetings apps with Azure Communication Services and Microsoft Teams
D
Data analytics for the era of AI
Database capabilities you need for your next awesome app, Q&A
Data-driven app and web development with Microsoft Power Platform, Q&A
Deep dive into .NET performance and native AOT
Deep dive into monitoring your cloud-native environment with Azure Monitor
Deep dive on Azure Load Testing in CI/CD
Deliver AI-powered experiences across cloud and edge, with Windows
Deliver apps from code to cloud with Azure Kubernetes Service
Deploy & manage storage volumes with Azure Container Storage
Designing and implementing automation and conversational AI, Q&A
Designing secure Azure SQL Database solutions
Develop from anywhere with Visual Studio Code
Develop in the cloud with Microsoft Dev Box
Develop modern connected applications on 5G, MEC/Edge, Space, and Azure
Developer career tools: Saying NO to improve mental health, Q&A
Developer joy with Scott Hanselman and friends
Development with accessibility in mind, Q&A
Do We Need Rust Language Support for Azure?
Driving better outcomes through API first development 
E
e2e testing with Playwright
Edge to Cloud with InfluxDB
Eliminate data silos with OneLake, the OneDrive for Data
Empower every BI professional to do more with Microsoft Fabric
Enable more sensitive workloads on Azure with confidential VMs and containers
Enabling Successful Confidential Computing with AMD + Azure
Enhance your solutions with new Azure AI products and features
Everything you want us to know about Azure AI services
Everything you want us to know about Azure Cognitive Search 
Execute tests with HyperExecute’s Just In Time Orchestration
Expanding your reach with Windows: How developers can win new users
Explore CIAM capabilities with External Identities in Microsoft Entra
Explore how to build in 3D with Microsoft Mesh
Explore LEADTOOLS AI-powered SDKs: Document and Medical Viewer controls
Exploring the future of AI in SQL and Spark scenarios
Extend data security into multi-cloud apps and solutions
External Identities in Microsoft Entra: the future of creative brand experiences!
F
Federated Learning with Azure Machine Learning, NVIDIA FLARE and MONAI
Fluent 2: Designing Teams apps that seamlessly integrate experiences across Microsoft 365
Focus on code, not infra with Azure Functions, Azure Spring Apps, Dapr
Full stack scale with the Microsoft Power Platform
Full stack web in .NET 8 with Blazor
Fusion of Remote Operations, Simulation and Digital Twin
G
GenAI for Knowledge: transform search of unstructured data with personality
Get full-stack visibility into your Azure environment in minutes
Get true zero-trust runtime security in Kubernetes with SUSE NeuVector
Getting started with generative AI using Azure OpenAI Service
GitHub Advanced Security for Azure DevOps: Interactive deep dive, Q&A
GraphQL: New services and tools for building API-driven apps
H
Harness the power of AI: Extend Copilot and beyond
Harnessing generative AI with NVIDIA AI and Microsoft Azure
Help us elevate your development experience on Windows
How are You Utilizing GPUs? Best Practices on Managing GPUs in Azure
How GitHub builds GitHub with GitHub
How leading AI companies 10x experimentation velocity with Statsig
How Secure is your Application and Infrastructure Code?
How to build next-gen AI services with NVIDIA AI on Azure Cloud
How to Create a PDF Document in Blazor Using the .NET PDF Library
How to optimize your vector databases while keeping your data current
How Unit Costs will change the way you think about your cloud bill
HTTP Load Balancing with NGINXaaS for Azure – Azure Native ISV Service
I
Identity APIs in Microsoft Graph: Advanced Queries deep dive
Increase developer velocity with Azure SQL Database, from data to API
Infrastructure as code in any programming language
Inject the power of the cloud and AI into your development workflow 
Inside Azure innovations with Mark Russinovich
Integrate your data with latest Data warehouse and Data Factory capabilities
Integrating Azure AI and Azure Kubernetes Service to build intelligent apps
Intune support for .NET MAUI on Android
J
Java experts at Microsoft, Q&A
K
Kickstart your .NET modernization journey with the RWA pattern
Kubernetes usage and management outside of Azure
Kusto detective agency Q&A
L
LEADTOOLS Low-Code Document Viewer, Editor and Medical Viewer Controls
Learn how to build the best Arm apps for Windows
Learn Live: Build a bot and Teams tab app with Teams Toolkit for Visual Studio Code
Learn Live: Build a web app with Blazor
Learn Live: Customize the command bar in Microsoft Power Apps
Learn Live: Enterprise 5G technologies for Azure Cloud Services
Learn Live: Get started with AI on Microsoft Azure
Learn Live: Get started with custom connectors in Power Automate
Learn Live: GitHub administration for GitHub Advanced Security
Learn Live: Introduction to Azure Arc-enabled Kubernetes
Learn Live: Introduction to Azure OpenAI Service
Learn Live: Migrate SQL workloads to Azure Managed Instances
Learn Live: Model data for Azure Cosmos DB for PostgreSQL
Learn Live: Publish a web app to Microsoft Azure with Visual Studio
Let’s talk about running your apps anywhere with Azure Arc
Level Up Your Cloud-native Security
Leverage AI Kits for Faster AI Solutions Tailored to Your Needs
M
Make a Once-in-a-Generation Leap – Migrate to Azure Ampere-based VMs
Manage resources from cloud to edge using Azure Automanage machine con
Metaverse Meets Generative AI: Opportunities & Challenges Ahead
Microsoft and Aisera AI Co-Pilot Demo
Microsoft Build opening
Microsoft Edge for Business: a dedicated work experience with AI, productivity, and more
Microsoft Edge: Bringing WebView2 to Microsoft Teams and beyond
Microsoft Edge: Building Progressive Web Apps for the sidebar
Microsoft Edge: State of web developer tools
Microsoft Fabric Data Factory Q&A
Microsoft Fabric Synapse data warehouse, Q&A
Models to outcomes with end-to-end data science workflows in Microsoft Fabric
Modern Productivity with Adobe and Microsoft
Modern workforce document automation solutions by Adobe & Microsoft
Modernize .NET and Java web apps in Azure App Service
Modernize your applications on Azure SQL Managed Instance, Q&A
Modernize your data integration to enable petabyte scale analytics with Microsoft Fabric
Modernize your Enterprise Data Warehouse & generate value from data with Microsoft Fabric
Modernize your Win32 application for security and privacy, Q&A
Modernizing high volume customer communications platforms
Modernizing with containers and serverless, Q&A
Modernizing your applications with containers and serverless​
Monitor Microsoft Azure SQL with Datadog’s Database Monitoring
Mural + Microsoft: How Mural is building for a hybrid workplace
N
Native apps for Windows on Snapdragon® compute platforms
Native Authentication for Customer-Facing Applications
New developer experiences in Windows
New PubSub capabilities in Azure Event Grid
News from around Microsoft Graph, Q&A
Next generation AI for developers with the Microsoft Cloud
Next-Level DevSecOps: Secure Supply Chain Consumption Framework, Q&A
No API Left Untested: Shift Left with API Security
NVIDIA AI Enterprise Registry for AzureML deployments of AI Workflows
O
Open for AI: Secure paths to data collaboration, volume, and diversity
OpenFL (Federated Learning) Building better AI models with private data
Operationalization of AI/ML models and use cases for ChatGPT, Q&A
Optimize your apps for Arm, Q&A
Optimizing Azure Cosmos DB: Strategies for cost efficiency and elasticity
P
Playwright, Q&A
Power real-time data streams from anywhere to Cosmos DB with Confluent
Practical deep dive into machine learning techniques and MLOps
Practical deep dive into machine learning techniques and MLOps, Q&A
Pragmatic techniques to get the most out of GitHub Copilot
Python web apps on Microsoft Azure, Q&A
Q
Q&A with the team behind Microsoft’s new Linux Distro
Qualcomm Technologies and the power of Windows AI, Q&A
Qualcomm® AI Stack for developers and extension to on-device AI
R
Rapidly Build Distributed Applications using Orchestration
Real Time analytics – Sense, analyze, generate insights with Microsoft Fabric
Real-time analytics with Azure Synapse Data Explorer
Real-Time Connectivity to Snowflake and 100s of Sources from the Power Platform
Real-time event streaming
Real-Time Solution for Critical Data Migrations and Integrations
Reduce fraud and improve engagement using Digital Wallets
Revolutionize the future: Solutions for scale with Redis Enterprise
S
Scale observability and secure data in motion with Elastic and Confluent
Scott and Mark Learn to Code
Seamlessly integrate security throughout your code to cloud workflow
Secure and observe your APIs no matter where they run
Secure, govern and manage your data at scale in Microsoft Fabric
Securely test and debug your web apps and webhooks with dev tunnels
Securing container deployments on Azure Kubernetes Service with open-source tools
Securing organizations, pipelines, and integrations in Azure DevOps, Q&A
Self-expression and pronouns in Microsoft 365
Self-serve app infrastructure using Azure Deployment Environments
Set up your dev machine in record time with WinGet and Desired State Configuration
Shaping the future of work with AI
Ship-It safely with GitHub Advanced Security
Simplify Microsoft Entra Workload Identities with DevSecOps
Simplify Your Data Stack: Automate, Orchestrate and Integrate Data
Slow Starts to Stellar Results: How AI Can Improve Team Collaboration
State of GPT
State of the Art Data Retrieval with Machine Learning & Elasticsearch
Streamline eDiscovery with new innovations, including Microsoft Graph APIs
Synapse Data Engineering, Data Science & OpenAI Roundtable
T
Take your .NET apps to the cloud with Microsoft Azure, Q&A
Technology Vendor Integrations for B2C and B2B applications – Present and Future with Microsoft Entra
The era of the AI Copilot
The future of AI and generative code, Q&A
The future of app development with the Microsoft Power Platform
The future of digital twin solutions in manufacturing
The future of edge solutions in manufacturing
The future of NuGet
The Old New Thing with Raymond Chen, Q&A
The vision of the future with Microsoft Authentication Library (MSAL) as an authentication broker
Transform productivity with AI experiences in Microsoft Fabric
Transform Teams apps into multiplayer with Live Share
Troubleshooting apps running on Kubernetes
U
Unblock Cloud Transformation with Anjuna Confidential Computing Platform
Understanding the cryptographic controls in Azure SQL Database
Unleash your Outlook Add-ins experiences into the new Outlook
Upgrade your .NET projects with Visual Studio
Upgrading from Xamarin to .NET MAUI
Using AMD based confidential VMs for your Azure Data Explorer clusters
Using Azure to improve your organization’s sustainability posture
Using Cyber Data to Financially Quantify Cyber Risk Decisions
Using Spark to accelerate your lakehouse architecture with Microsoft Fabric
UX: Designing for Copilot
V
Vector Search Isn’t Enough
Vision AI at the Edge for Industrial Inspection
W
What’s new in .NET 8 for Web, frontends, backends, and futures?
What’s new in .NET Multi-platform App UI (MAUI), Q&A
What’s new in C# 12 and beyond
What’s new in Container Networking
What’s new with Azure Messaging
What’s next for Azure landing zones?
Windows Hybrid Apps – Developing for the cloud-first future
Windows Subsystem for Android: Opportunities for mobile developers
Write network-aware applications with Azure Programmable Connectivity
Y
You really can manage ALL Microsoft Azure services and features with Terraform

How to create a JSON String array in PowerApps?

Summary

I needed to pass the following JSON payload from Power Apps to a Power Automate Run parameter. I have included only a few properties of the large payload to reduce complexity; it is enough to show the problem with the PowerApps JSON string array.

// The following array is required as an output
Set (
    varRegistryFiltersArray,
    [
        {
            conditions: ["Diabetes"],  
            facilityLocations: [
                {
                    state: "CA",
                    country: "United States"
                }
            ]
        }
    ]
);
// But the JSON function created different output.
Set(
    varRegistryFiltersArrayString,
    JSON(
        varRegistryFiltersArray,
        JSONFormat.Compact
    )
);

When I ran the JSON function on the variable ‘varRegistryFiltersArray’ above, I got the following output. As you can see, I did not want “Value”: “Diabetes”, and I did not want conditions to be an array of objects.

[
    {

        // This was not the expected output. PowerApps treats a string array like this.
        "conditions": [
            {
                "Value": "Diabetes"
            }
        ],
        "facilityLocations": [
            {
                "country": "United States",
                "state": "CA"
            }
        ]
    }
]

This is the default behavior for a Power Apps string array (it is treated as a single-column table): each string in the array becomes a record with a Value column. I searched but have not found anything simpler, so I created this technique.

Wherever I have a string array, I wrap each string in a token record as shown below. After the JSON function runs, the tokens are substituted away with the Substitute function to produce the final JSON string.

Set (
    varRegistryFiltersArray,
    [
        {
            conditions: [{replace: "Diabetes,replace"}],
            facilityLocations: [
                {
                    state: "CA",
                    country: "United States"
                }
            ]
        }
    ]
);
Set(
    varRegistryFiltersArrayString,
    JSON(
        varRegistryFiltersArray,
        JSONFormat.Compact
    )
);
//
// Substitute '{"replace":' with ''   AND
// substitute ',replace"}' with '"'.
// Note the $ sign before the strings: {{ and "" are escape sequences in interpolated strings.
Set(
    varRegistryFiltersArrayString,
    Substitute(
        Substitute(
            varRegistryFiltersArrayString,
            $"{{""replace"":",
            ""
        ),
        $",replace""}}",
        $""""
    )
);
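For reference, after the two substitutions the string should match the payload I originally wanted. For the sample data above it comes out like this (property order follows the JSON function's alphabetical output):

[{"conditions":["Diabetes"],"facilityLocations":[{"country":"United States","state":"CA"}]}]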

Conclusion

This technique worked for my large JSON payload. I hope this simple trick works for you too.


How to resolve “FHIRBase.GETPatient failed: Failed to execute ‘atob’ on ‘Window’: The string to be decoded is not correctly encoded.” error?

Summary

I tried using the FHIRBase and FHIRClinical Power Platform connectors for my project and got the following error.

FHIRBase.GETPatient failed: Failed to execute ‘atob’ on ‘Window’: The string to be decoded is not correctly encoded.

You will find the source code for the connector and sample app on GitHub.

In my scenario, I had the following two tenant environments.

  1. Power Platform tenant with an Azure AD and 25 demo users.
  2. An Azure instance with computing, storage, and other resources with Azure Health Data Services (AHDS).

Follow this link for the learning path and workshop on installing AHDS.

Get started with Azure Health Data Services

Diagnostic steps

Step # 1: I opened the Edge Developer tools to watch the network traffic.

In my case, I noticed the invoke calls were made and the response was 401. Clearly, this was an unauthorized user making the call through the connector. My Power Platform demo tenant’s user, let’s say Alex Weber, had to be added as a guest user in the second Azure AD tenant.

Step # 2: I tried again and got the same error as before. I looked at the dev tools, and this time I got a 403 response.

A 403 Forbidden error is progress. To resolve it, I added the Alex Weber user to the FHIR Contributor role on the data plane; a scripted sketch of that assignment follows below. Please follow this article to learn about the roles for the FHIR API.
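A minimal Az PowerShell sketch of that data-plane role assignment, assuming the Az module, an existing Connect-AzAccount session, and your FHIR service's resource ID (the role name "FHIR Data Contributor" and the resource path are assumptions; verify them against your Azure Health Data Services instance):

# Resource ID of the FHIR service (placeholder values)
$fhirServiceId = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.HealthcareApis/workspaces/<workspace>/fhirservices/<fhir-name>"
# Assign the data-plane role to the guest user
New-AzRoleAssignment -SignInName "AlexW@yourtenant.onmicrosoft.com" `
    -RoleDefinitionName "FHIR Data Contributor" `
    -Scope $fhirServiceId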

Step # 3: The Azure Healthcare API’s “user_impersonation” permission consent must be granted.

Step # 4: Finally, make sure that when you create the connection for the connector the first time in Power Platform, you use the Azure FHIR API’s Azure user.

In my setup, both connectors were added in the Power Platform tenant and connected to the Azure tenant.

Conclusion

After all the above steps, I was able to get the error resolved.

I hope these few simple checks and fixes resolve the problem for you. In summary, make sure the user is in the FHIR data plane role, the user_impersonation consent is granted, and the connection uses the correct Azure user.


How to clean up the FHIR objects in the Postman received as a bundle?

Summary

My need was to remove stale FHIR condition objects with the wrong concept codes in the Azure FHIR test server.

Here is a quick tip and the JavaScript code to remove such data for any FHIR resource.

Note: You should know how to get a bearer token for FHIR in Postman; I refer to that call at the end. You should also know how to make GET and POST calls in Postman.

### In Postman you can add the call with the query parameter CLEAN-UP

GET https://{your-instance}.fhir.azurehealthcareapis.com/Condition?code=254637007,23986001,44054006&_count=1000&CLEAN-UP=yes
Authorization: Bearer {{bearerToken}}
Accept: application/json

Write the following code in the Tests tab in Postman. The code expects that you have already made the POST call to get the “bearerToken” and saved it in an environment variable (a sketch of that call appears at the end of this post).

var query = {};
pm.request.url.query.all().forEach((param) => { query[param.key] = param.value});
var IsCleanUp = query['CLEAN-UP'];

if (( IsCleanUp !== undefined ) && ( IsCleanUp.toLowerCase() === 'yes' ) )
{
    console.log("CLEAN-UP is " + ' ' + IsCleanUp);
    var jsonData = JSON.parse(responseBody);
    if ( jsonData.entry !== undefined )
    {
        for (let i = 0; i < jsonData.entry.length; i++) {
            const resource = jsonData.entry[i];
            if ( undefined !== resource.fullUrl)
            {
                console.log("DELETE " + resource.fullUrl);  
                pm.sendRequest({
                    url:  resource.fullUrl, 
                    method: 'DELETE',
                    header: {
                        'Accept': 'application/json',
                        'Content-Type': 'application/x-www-form-urlencoded',
                        'Authorization': 'Bearer ' + pm.environment.get("bearerToken")
                    },
                    body: {
                    }
                }, function (err, res) {
                    if (err) {
                        console.log("DELETE failed for " + resource.fullUrl + ": " + err);
                    } else {
                        console.log("DELETED " + resource.fullUrl + " (status " + res.code + ")");
                    }
                });
            }
        }
    }
}
else{
    console.log("CLEAN-UP is " + ' ' + IsCleanUp);
}
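For reference, the bearerToken typically comes from a client-credentials POST like this sketch (the tenant ID, client ID, and client secret placeholders are yours to supply; the scope assumes the same FHIR instance as above):

POST https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials&client_id={client-id}&client_secret={client-secret}&scope=https://{your-instance}.fhir.azurehealthcareapis.com/.default

In the Tests tab of that request, save the token with: pm.environment.set("bearerToken", pm.response.json().access_token);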

Conclusion

This is a quick tip to clean up the FHIR objects in the FHIR test server.


How to add a Checkbox to a PowerApps gallery to add or remove items to/from a collection?

Summary

I needed to add a checkbox to a vertical gallery and let the user check or uncheck it to add the item to, or remove it from, a new collection.

Step 1: Add a button on a screen, and add the following code in OnSelect.

// Create a collection
ClearCollect(colVehicles,
{Id: 1,Year: 2021, Make: "Honda", Model: "CR-V"},
{Id: 2,Year: 2021,Make: "Honda",Model: "HR-V"},
{Id: 3,Year: 2020,Make: "Honda",Model: "Accord"},
{Id: 4,Year: 2020,Make: "Mazda",Model: "CX30"},
{Id: 5,Year: 2022,Make: "Mazda",Model: "CX9"},
{Id: 6,Year: 2023,Make: "Mazda",Model: "CX-5"},
{Id: 7,Year: 2016,Make: "Toyota",Model: "Camry"}
);
// Add a new "IsItemSelected" column with default "false" to a new collection.
ClearCollect ( colNewVehicles,
AddColumns(colVehicles,"IsItemSelected",false));
// Start with an empty selected collection (Filter keeps the schema but returns no rows;
// ClearCollect(col, Blank()) would leave one blank record behind)
ClearCollect(colSelectedVehicles, Filter(colNewVehicles, false));

Step 2: Bind the new “colNewVehicles” collection above to a vertical gallery. Bind the Year, Make, and Model fields.

Step 3: Now add a new Checkbox to this Gallery. Add the following code to the properties of this checkbox.

In the Default property of the Checkbox set the following code:

ThisItem.IsItemSelected

In the OnCheck property of the Checkbox set the following code:

Patch (colNewVehicles, ThisItem, {IsItemSelected: true} );
Collect(colSelectedVehicles, ThisItem);

In the OnUnCheck property of the Checkbox set the following code:

Patch ( colNewVehicles, ThisItem, {IsItemSelected: false} );
RemoveIf( colSelectedVehicles, Id = ThisItem.Id );

Step 4: (Optional) Bind the colSelectedVehicles collection to another vertical gallery to watch items get added and removed as the user selects or deselects them.

Conclusion

The above tips and steps let you add a checkbox that adds items to or removes items from a collection.

Refer

https://learn.microsoft.com/en-us/power-platform/power-fx/reference/function-clear-collect-clearcollect

https://learn.microsoft.com/en-us/power-platform/power-fx/reference/function-patch

https://learn.microsoft.com/en-us/power-platform/power-fx/reference/function-table-shaping


An overview of Meeting-screen template for canvas apps

Summary

I needed my app to reuse some code from the Invite and Schedule tabs of the Meeting-screen template for canvas apps. There is also a detailed reference with the code. However, I want to share my notes and understanding of this template; hopefully it gives you ideas for reusing some of its code in your own app.

Overview

The Meeting-screen template presents the user with Invite and Schedule tabs. On the Invite tab, users select invitees (stored in the MyPeople collection) after typing names in the search box. The search uses Office365Users.SearchUser() to get the UPN and display name of each match. All found results are added to a vertical gallery so the user can pick one or more invitees.

As the user picks invitees, the template builds the MyPeople collection. Additionally, the user can type an external email in the search box, and that email (or UPN) can also be added to the MyPeople collection by selecting the Add icon.

The MyPeople collection is used to call Office365Outlook.FindMeetingTimes() to find available times for the selected invitees. The call also uses the existing or newly entered start date, duration, and end date values from the Schedule tab. A rough sketch of this pattern follows below.
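A rough Power Fx sketch of the search-and-find-times pattern (the control names, the MyPeople column name, and the duration value are my assumptions, not the template's exact formulas; see the template reference for the real code):

// Search users as the person types (top 15 matches)
ClearCollect(
    colSearchResults,
    Office365Users.SearchUser({searchTerm: txtSearchBox.Text, top: 15})
);
// Find available times for everyone picked so far
Set(
    varMeetingTimes,
    Office365Outlook.FindMeetingTimes({
        RequiredAttendees: Concat(MyPeople, UserPrincipalName & ";"),
        MeetingDuration: 30, // duration in minutes
        Start: Text(datStartDate.SelectedDate, UTC),
        End: Text(DateAdd(datStartDate.SelectedDate, 1, TimeUnit.Days), UTC)
    })
);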

It creates a new MeetingTimes collection to bind to a new gallery on the schedule tab. Users can pick any available time for the meeting.

If the user changes the invitees, start day, or duration, the MeetingTimes collection is repopulated and the user is prompted to select a different available time.

On the Schedule tab, the user can search for rooms by calling Office365Outlook.GetRooms() and Office365Outlook.GetRoomList(). Once the user selects a room, the appropriate available times need to be selected again, so the app makes another call to Office365Outlook.FindMeetingTimes(), this time for the rooms’ UPNs.

After everything is selected and entered, the user types the Subject and Body and selects the Send button. Here the app calls Office365Outlook.CalendarGetTables() and Office365Outlook.V2CalendarPostItem().

And that is how the invite, with the user’s selected room, is sent to the invitees.

Finally

These notes may repeat some of the links I provided here, but I think these simple, to-the-point notes may give you ideas to develop something similar or different in your app.


Demystifying ExpandMenu Component from the Creator Kit.

Summary

Are you developing apps in Power Apps? Have you heard of the Creator Kit (https://aka.ms/CreatorKit)? If yes, great. If you have not heard of it, I highly recommend you check it out. I will put some video references at the end of this post.

This post will share my learning on the ExpandMenu canvas component from the Creator Kit.

The Creator Kit has canvas components and PCF controls. Please note, the Power Apps Component Framework (PCF) controls are code components; you may run into DLP policies set by your company admins that block deploying code components. If you are a pro developer, the source is located here for your study.

Please note that a new experimental feature is used, i.e. “Behavior formula for components“. You will need to turn this feature on in your environment.

ExpandMenu Inner Workings

Input, Output, and Behavior properties.

All components have input and output properties. Input properties pass values to the component, and output properties get values from the component. Additionally, there are now behavior properties; think of behavior properties as events raised by the component.

Input Properties

The input properties pass values to the ExpandMenu component. Note: the DefaultExpandValue property has the “Raise OnReset” flag turned on. This means the component’s OnReset behavior runs whenever this property is set, and it then sets the internal variable “IsOpen” to the property’s value.

Name                | Data Type | Default Value | Raise OnReset Flag
Items               | Table     | See below #1  | None
IsNavigationEnabled | Boolean   | true          | None
Theme               | Record    | See below #2  | None
DefaultExpandValue  | Boolean   | false         | Set(IsOpen, ExpandMenu.DefaultExpandValue)
Input Properties

Output Properties

The GetMenuIconPath output acts like a method; the component uses it internally to get an SVG path text value for an icon.

Note that IsExpanded is set to IsOpen (the component’s internal variable). Any time IsOpen changes, the new value is reflected through IsExpanded. Later you will see that the screen using this component makes use of IsExpanded to control the width.

Name            | Data Type | Parameter      | Value
IsExpanded      | Boolean   | none           | IsOpen
SelectedItem    | Record    | none           | glExpandMenu.Selected
GetMenuIconPath | Text      | IconName: Text | See #3 below (find the SVG path by IconName)
Output Properties

Behavior Properties

Name           | Return Data Type | Called
OnExpandSelect | Boolean          | When a user selects the hamburger image.
OnButtonSelect | Boolean          | When a user selects a menu item.
Behavior Properties

Width: If(!Self.IsExpanded, 46, 221)
The Width property controls expanding the component horizontally.

Components UI Controls

The following are the UI controls inside the component. The imgExpandButton acts as the hamburger image. The glExpandMenu is the vertical gallery with the Items values bound as the list of menu items.

imgExpandButton

When a user clicks the hamburger image, the OnSelect event fires. It toggles the component-level variable “IsOpen” (which is tied to the component’s IsExpanded output property) and calls the “OnExpandSelect” behavior property.

Notice the image is an SVG path filled with the theme color value.

Image:
    "data:image/svg+xml," & EncodeUrl("<svg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' viewBox='-10 0 2068 2048'> <g transform='matrix(1 0 0 -1 0 2048),rotate(0,1034,1024)'> <path fill='" & ExpandMenu.Theme.palette.neutralPrimary & "' d='M2048 1408h-2048v128h2048v-128v0zM2048 384h-2048v128h2048v-128v0zM2048 897h-2048v127h2048v-127v0z' /> </g> </svg>")
OnSelect:
    /*
    Toggle the IsOpen variable.
    It is local to the component. Also, call the OnExpandSelect event.
    */
    Set(IsOpen, !IsOpen);
    ExpandMenu.OnExpandSelect()
Height: 46
Width: 46
imgExpandButton

glExpandMenu

Items: ExpandMenu.Items
Height: ExpandMenu.Height
Width: 221
OnSelect:
    /*
    If a screen is present and the navigation flag is not disabled,
    then navigate to the screen.
    Also, raise the OnButtonSelect event.
    */
    If(!IsBlank(ThisItem.Screen) &&
        ExpandMenu.IsNavigationEnabled,
      Navigate(ThisItem.Screen));
    // Raise the OnButtonSelect event
    ExpandMenu.OnButtonSelect();
glExpandMenu

rectHighHighLight

X, Y: 5, 10
Width, Height: 3, 20
rectHighHighLight

imgIcon

It first checks whether the icon name is present in the lookup table (#3 below). If not found, it uses whatever image value is passed.
If present, it builds the image using the SVG path with the theme color values.

Image:
    If(IsBlank(ExpandMenu.GetMenuIconPath(ThisItem.Icon)), ThisItem.Icon,
        "data:image/svg+xml," & EncodeUrl("<svg xmlns='http://www.w3.org/2000/svg' xmlns:xlink='http://www.w3.org/1999/xlink' version='1.1' viewBox='-10 0 " & 2068 & " 2048'> <g transform='matrix(1 0 0 -1 0 2048),rotate(0, 2068,1024)'> <path fill='" & ExpandMenu.Theme.palette.neutralPrimary & "' d='" & ExpandMenu.GetMenuIconPath(ThisItem.Icon) & "' /> </g> </svg>"))
X, Y: 14, 12
Width, Height: 16, 16
imgIcon

lblLabel

Note that left padding is applied to make room for the icon to the left of the label.

Text: ThisItem.Label
Tooltip: ThisItem.ToolTip
Width, Height: Parent.Width, 40
PaddingLeft: 46
lblLabel

Use Component on the screen

Add a screen, add the component, and set the following properties:

    Width: If(Self.IsExpanded, 200, 46)
    Items: Table(
        {Icon: "PowerApps", Label: "Power Apps", Screen: scrExpandMenu},
        {Icon: "PowerBILogo", Label: "Power BI", Screen: scrAutoWidthLabel},
        {Icon: "PowerAutomateLogo", Label: "Power Automate", Screen: scrBreadcrumb},
        {Icon: "Dataverse", Label: "Dataverse", Screen: scrCommandBar}
    )

Screen using the component.

Legend

Earlier in the post, I added references to #1, #2, and #3. I kept them at the end for reference since they are large code pieces.

#1: Table({Icon: "PowerApps", Label: "Power Apps", Screen: App.ActiveScreen, Tooltip: "Power Apps Tooltip"})

#2: { palette: { themePrimary: "#0078d4", themeLighterAlt: "#eff6fc", themeLighter: "#deecf9", themeLight: "#c7e0f4", themeTertiary: "#71afe5", themeSecondary: "#2b88d8", themeDarkAlt: "#106ebe", themeDark: "#005a9e", themeDarker: "#004578", neutralLighterAlt: "#faf9f8", neutralLighter: "#f3f2f1", neutralLight: "#edebe9", neutralQuaternaryAlt: "#e1dfdd", neutralQuaternary: "#d0d0d0", neutralTertiaryAlt: "#c8c6c4", neutralTertiary: "#a19f9d", neutralSecondary: "#605e5c", neutralPrimaryAlt: "#3b3a39", neutralPrimary: "#323130", neutralDark: "#201f1e", black: "#000000", white: "#ffffff" } }

#3: LookUp(
    Table(
        { Name: "SizeLegacy", Code: "E2B2", Path: "some code for SVG PATH" },
        { Name: "PageLink", Code: "E302", Path: "some other code for SVG PATH" }
    ),
    Name = IconName
).Path

Legend

Summary

This post may help you when you look at the ExpandMenu component code and try to use the same pattern for your own new control. Keep yours as clean as this control.

You can look at this video for the Creator Kit Overview.

Please let me know your feedback.


Tip: Connect various M365 PowerShell modules for a demo tenant.

Summary

This is a quick tip for connecting to a demo tenant: store the password (a secure string) to a file once, then call the connect commands as many times as you want while testing against the demo tenant.

#Execute Read-Host once and comment out 
#Read-Host "Enter password" -AsSecureString | ConvertFrom-SecureString | Out-File "C:\Temp\passwordNEW.txt"

$TenantName="CRM106438" # Change to your tenant name
$Username = $("admin@{0}.onmicrosoft.com" -f $TenantName)
$Password = Get-Content "C:\Temp\passwordNEW.txt" | ConvertTo-SecureString
$Creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $Username, $Password
Connect-MicrosoftTeams -Credential $Creds
Connect-ExchangeOnline -Credential $Creds
Connect-SPOService -Url $("https://{0}-admin.sharepoint.com" -f $TenantName) -Credential $Creds
Connect-MsolService -Credential $Creds
Connect-AzureAD -Credential $Creds


How to download ‘ALL’ files from a very large document library?

Summary

My customer had a large document library on SharePoint Online with a deep folder structure. The total number of items was 2.7 million (2,743,321). They wanted those documents downloaded to a local folder. I know, this is an actual scenario and issue.

The issue was the time the script they wrote took to download everything. The customer's script worked, but it took a very long time; performance was a huge issue. Additionally, machine resources were getting affected, and downloads happened sequentially with no parallelism.

This article shows the proposed approach and the working solution for the customer. The blog post consists mainly of the scripts, so there is no need to write anything yourself. You will need to be knowledgeable in PowerShell and have a little MS Graph API experience.

An approach

  1. Do not use the PnP PowerShell Get-PnPListItem call. It will take forever to complete. It is a helpful command, but not for this scenario.
  2. Instead, get the files' relative URLs using the MS Graph API.
  3. Only request the ID and WebURL properties from MS Graph (see the request sketch after this list).
  4. Store the ID and WebURL properties in CSV files in batches of 100,000.
  5. Once all the CSV files are generated, run another script (Phase 2 below).
  6. That script imports a CSV file and iterates over all the web URLs.
  7. Using the web URL value, create a folder and download the file if it is not already present. Use the Get-PnPFile command to download.
  8. This is an important step: run the script in multiple PowerShell windows so the files download in parallel.
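The Graph call in step 3 boils down to a drive-items request with a $select, along these lines (the drive ID and token are placeholders; this mirrors the URL built in the Phase 1 script below):

GET https://graph.microsoft.com/v1.0/drives/{drive-id}/list/items?$select=Id,WebUrl
Authorization: Bearer {access-token}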

Phase 1

In Phase 1, you need a way to get CSV files containing the ID and WebURL of every item in the document library. Each CSV file holds 100K rows; if you do the math, that is 28 CSV files for 2.7 million records. (Your numbers will differ based on your files.)

To run the script below you will need the following:

Azure Function running on a Queue Trigger.

Storage with Azure Queue and Azure Blob.

An Azure AD app with MS Graph Sites read access. (I used full control.)

You will need a driveID. I used the Graph PowerShell SDK to get it, as sketched below.
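A minimal sketch of fetching the drive ID with the Graph PowerShell SDK (the site and list IDs are placeholders; this is the same command referenced in the comments inside the script):

# Connect with a Sites scope, then read the drive behind the document library
Connect-MgGraph -Scopes "Sites.Read.All"
$drives = Get-MgSiteListDrive -SiteId "<your-site-id>" -ListId "<your-list-id>"
$drives | Select-Object Id, Name, WebUrl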

# Input bindings are passed in via param block.
param($QueueItem, $TriggerMetadata)

# Write out the queue message and insertion time to the information log.
Write-Host "PowerShell queue trigger function processed work item: $QueueItem"
Write-Host "Queue item insertion time: $($TriggerMetadata.InsertionTime)"


# Populate with the App Registration details and Tenant ID
$ClientId                = "TODO"
$ClientSecret            = "TODO" 
$queueName               = "TODO"
$containerName           = "TODO"  
$tenantid                = "TODO" 
$env:AzureWebJobsStorage = "TODO"
$env:LOG_FILE_PATH       = "C:\TEMP"
$GraphScopes             = "https://graph.microsoft.com/.default"
$driveID                 = "TODO" 
# To get drive id variable execute the following command
# $drives = Get-MgSiteListDrive -SiteId Your-SITE -ListId Your-LIST 
# You will need to connect to Graph. Follow this article.

# Get the access token to execute MS Graph calls.
$headers = @{
    "Content-Type" = "application/x-www-form-urlencoded"
}
# Formulate body with four parameters.
$body = "grant_type=client_credentials&client_id=$ClientId&client_secret=$ClientSecret&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default"
# Create login URL for the tenant id
$authUri = "https://login.microsoftonline.com/$tenantid/oauth2/v2.0/token"
# Make a POST call to Azure AD login URL
$response = Invoke-RestMethod $authUri  -Method 'POST' -Headers $headers -Body $body
# Using Token from the above call, create header with bearer token
$headers = @{
    "Content-Type" = "application/x-www-form-urlencoded"
    "Authorization" = $("Bearer {0}" -f $response.access_token)
}
#Function to move local file to blob storage
function MoveLogFilesToBlobContainer
{
    $storageContainer = New-AzStorageContext -ConnectionString $env:AzureWebJobsStorage | Get-AzStorageContainer  -Name $containerName
    #Write-Output $storageContainer
    Get-ChildItem $env:LOG_FILE_PATH -Filter ListOfIDs*.csv | 
    Foreach-Object {
        $blobNameWithFolder = $("{0}" -f $_.Name)
        Write-Output $("Move {0} to {1} Blob Container AS BlobName {2}." -f $_.FullName, $storageContainer.Name, $blobNameWithFolder)
        Set-AzStorageBlobContent -File $_.FullName `
            -Container $storageContainer.Name `
            -Blob $blobNameWithFolder `
            -Context $storageContainer.Context -Force
        Remove-Item -Path $_.FullName -Force
    }
}
#Function to put a message in a queue
function Put2MsgInQueue([Int]$aCounter,[String]$anUrl2Process)
{
    $FormattedMessage = $("{0},{1}" -f $aCounter,  $anUrl2Process )
    Write-Host $FormattedMessage
    $context = New-AzStorageContext -ConnectionString $env:AzureWebJobsStorage
    $queue = Get-AzStorageQueue -Name $queueName -Context $context
    # Create a new message using a constructor of the CloudQueueMessage class
    $queueMessage = [Microsoft.Azure.Storage.Queue.CloudQueueMessage]::new($FormattedMessage)
    # Add the message to the queue; Wait() ensures the async call completes before the function returns
    $queue.CloudQueue.AddMessageAsync($queueMessage).Wait()
}

function ScrapTheListItems([String]$aRestURI, [int]$batchNumber)
{
    $StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
    $restURI = $aRestURI
    Write-Output $("restURI {0}" -f $restURI);
    # 200 rows per API call; 200 * 500 = 100,000 rows per file.
    # (Drop this to a small value such as 2 while debugging.)
    $batchCountSize           = 500
    #initialize the index and array to start
    $batchIndex               = 1
    $outArray = @()
    # MAKE a call to MS GRAPH API using the bearer token header
    $response = Invoke-RestMethod $restURI  -Method 'GET' -Headers $headers
    Write-Output $("response {0}" -f $response);
    # Get the next link URL.
    $restURI = $response."@odata.nextLink"
    while ($null -ne $restURI)
    {
        # Convert an array with Name & Value pair to an object array.
        # This is needed so the object array can be stored as CSV
        foreach ( $i in $response.value)
        {
            $anObj = New-Object PSObject
            Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'id' -Value $i.Id
            Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'webUrl' -Value $i.webUrl
            $outArray += $anObj
        }

        $totalRows = $batchIndex * 200

        Write-Output $("batchIndex : {0}, call to graph API for 200 rows now total is  {1}" -f $batchIndex, $totalRows );

        if ( $batchIndex -eq $batchCountSize)
        {
            $exportCsvURLPath          = $("{0}\ListOfIDs-{1}.csv" -f $env:LOG_FILE_PATH, $batchNumber )
            Write-Output $("Create {0}" -f $exportCsvURLPath);
            # Write the accumulated rows for this batch to a CSV file
            $outArray | Export-Csv -Path "$exportCsvURLPath" -NoTypeInformation -Force
            ## MOVE TO BLOB CONTAINER
            MoveLogFilesToBlobContainer
            #initialize the index and array to start
            $batchIndex               = 1
            $outArray = @()
            # add file batch number to next 
            $batchNumber++
            ###NOW EXIT FROM LOOP
            break
        }
        else
        {
            $batchIndex++
        }
        # MAKE a call to MS GRAPH API using the bearer token header
        $response = Invoke-RestMethod $restURI  -Method 'GET' -Headers $headers
        # Get the next link URL.
        $restURI = $response."@odata.nextLink"
    }

    # The last remaining batch may be less than the batch count size
    if (($batchIndex -gt 1) -or ($outArray.Count -gt 0))
    {
            $exportCsvURLPath          = $("{0}\ListOfIDs-{1}.csv" -f $env:LOG_FILE_PATH, $batchNumber )
            Write-Output $("Create {0}" -f $exportCsvURLPath);
            # Export the remaining rows to a CSV file
            $outArray | Export-Csv -Path "$exportCsvURLPath" -NoTypeInformation -Force
            ## MOVE the CSV file TO the BLOB CONTAINER
            MoveLogFilesToBlobContainer
    }
    if ($null -ne $restURI)
    {
        Put2MsgInQueue -aCounter $batchNumber -anUrl2Process $restURI
    }
    $StopWatch.Stop()
    Write-Output $("Elapsed time in TotalMinutes: {0}" -f $StopWatch.Elapsed.TotalMinutes);
}

# To start this Function add a manual queue message as "START"
if ("START" -eq $QueueItem)
{
    Write-Host "we are running first time"
    $counter = 1
    # For first time we need to make a call to MS Graph API
    $firstURI = $("https://graph.microsoft.com/v1.0/drives/$driveID/list/items?{0}" -f '$Select=Id%2CWebUrl') 
    ScrapTheListItems $firstURI  $counter
}
else
{
    # Function will always fall here with the index#, URL to fetch
    $splittedArray = $QueueItem.split(",")
    $counter = [int]$splittedArray[0]
    ScrapTheListItems $splittedArray[1] $counter
}

Phase 2

In Phase 2, the script downloads the files. It performs the following steps.

Read the CSV file for the passed-in batch number, e.g. 1, 2, 3, …, 28. The files are assumed to be in blob storage; if you want to point to a local folder, you may need to change the code.

Read all 100K records. Using the web URL, check whether the local folder and the file are present.

If not present, create the folder and download the file using the Get-PnPFile command.

Run the following in multiple PowerShell prompts so the files download in parallel (see the launcher sketch below). Even if you run it twice with the same batch number parameter, the script will figure out what is already downloaded and skip those files.
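For example, a minimal launcher sketch, assuming the Phase 2 script below is saved as C:\LegalDept\Download-Batch.ps1 (the file name and path are my placeholders):

# Each Start-Process opens a separate PowerShell window working on its own batch number
1..4 | ForEach-Object {
    Start-Process powershell.exe -ArgumentList "-NoExit", "-File", "C:\LegalDept\Download-Batch.ps1", "-batchNumber", $_
}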


# The batch number to process is passed in via the param block.
param($batchNumber)

### TODO REMOVE LATER ONLY FOR DEBUGGING
$MaxFiles2Get                   = 10000
$CurrentFileNumber              = 0
#Initialize variables
$DownloadLocation               = "V:\Verification Documents"
$SiteURL                        = "https://Contoso.sharepoint.com/sites/LegalDept"
$CSVFilesPath                   = "C:\LegalDept"
$OrechestratorCSVFileName       = "0-Orchestrator.csv"
$env:LOG_FILE_PATH              = "C:\LegalDept\Logs"
$global:TotalFilesAlreadyPresent       = 0
$global:TotalFilesDownloaded           = 0
$global:ConnectPnPDoneFlag             = $false

Add-Type -AssemblyName System.Web

function Write-Log
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [Alias("LogContent")]
        [string]$Message,

        [Parameter(Mandatory=$false)]
        [Alias('LogPath')]
        [string]$Path='C:\Logs\PowerShellLog.log',
        
        [Parameter(Mandatory=$false)]
        [ValidateSet("Error","Warn","Info")]
        [string]$Level="Info",
        
        [Parameter(Mandatory=$false)]
        [switch]$NoClobber
    )

    Begin
    {
        # Set VerbosePreference to Continue so that verbose messages are displayed.
        $VerbosePreference = 'Continue'
    }
    Process
    {
        
        # If the file already exists and NoClobber was specified, do not write to the log.
        if ((Test-Path $Path) -AND $NoClobber) {
            Write-Error "Log file $Path already exists, and you specified NoClobber. Either delete the file or specify a different name."
            Return
            }

        # If attempting to write to a log file in a folder/path that doesn't exist create the file including the path.
        elseif (!(Test-Path $Path)) {
            Write-Verbose "Creating $Path."
            New-Item $Path -Force -ItemType File
            }

        else {
            # Nothing to see here yet.
            }

        # Format Date for our Log File
        $FormattedDate = Get-Date -Format "yyyy-MM-dd HH:mm:ss"

        # Write message to error, warning, or verbose pipeline and specify $LevelText
        switch ($Level) {
            'Error' {
                Write-Error $Message
                $LevelText = 'ERROR:'
                }
            'Warn' {
                Write-Warning $Message
                $LevelText = 'WARNING:'
                }
            'Info' {
                Write-Verbose $Message
                $LevelText = 'INFO:'
                }
            }
        
        # Write log entry to $Path
        "$FormattedDate $LevelText $Message" | Out-File -FilePath $Path -Append
        ## also dump to console
        #$savedColor = $host.UI.RawUI.ForegroundColor 
        #$host.UI.RawUI.ForegroundColor = "DarkGreen"
        Write-Output  $message 
        #$host.UI.RawUI.ForegroundColor = $savedColor
    }
    End
    {
    }
}

function WriteExceptionInformation($AnItem)
{
    Write-Log -Path $LogFileName  $AnItem.Exception.Message
    Write-Log -Path $LogFileName  $AnItem.Exception.StackTrace    
    <#        
    Write-Log -Path $LogFileName  $AnItem.Exception.ScriptStackTrace
    Write-Log -Path $LogFileName  $AnItem.InvocationInfo | Format-List *
    #> 
}

function UpdateOrchestratorInouFile()
{
    param (
        [string]$Status2update
    )
    $csvfile = Import-CSV -Path $("{0}\{1}" -f $CSVFilesPath, $OrechestratorCSVFileName)

    if ( $null -ne $csvfile)
    {
        $outArray = @()
        foreach ( $aRowInFile in $csvfile)
        {
            $anObj = New-Object PSObject
            Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'BatchNumber' -Value $aRowInFile.BatchNumber
            if ($aRowInFile.BatchNumber -eq $batchNumber)
            {
                # Change status to Status2update
                Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'Status' -Value $Status2update
            }
            else {
                <# Action when all if and elseif conditions are false #>
                # Keep the status as is
                Add-Member -InputObject $anObj -MemberType NoteProperty -Name 'Status' -Value $aRowInFile.Status
            }
            $outArray += $anObj
        }
        # Important step modify the file.
        $outArray | Export-Csv -Path $("{0}\{1}" -f $CSVFilesPath, $OrechestratorCSVFileName) -NoTypeInformation -Force
    }
}


function MainWorkerFunc 
{

    $didFailStatusHappen = $false
    $importCsvURLPath          = $("ListOfIDs-{0}.csv" -f $batchNumber )
    $csvfile = Import-CSV -Path $("{0}\{1}" -f $CSVFilesPath, $importCsvURLPath)
    if ( $null -ne $csvfile)
    {
        UpdateOrchestratorInouFile -Status2update "INPROGRESS"

        try {


            # sample https://Contoso.sharepoint.com/sites/LegalDept/Verification%20Documents/Documents/FirtnameLastname0877_115502.pdf
            foreach ( $aRowInFile in $csvfile)
            {
                $webUrl2work = $aRowInFile.webUrl
                if ( $null -ne $webUrl2work)
                {
                    # remove the URL decoding from web url
                    $decodedWebUrl2work = [System.Web.HttpUtility]::UrlDecode($webUrl2work) 
                    $fileName = Split-Path -Path $decodedWebUrl2work -Leaf 
                    $splitArr = $decodedWebUrl2work.split('/')             
                    $filePath = $DownloadLocation
                    # now build a local path
                    $idx = 0;
                    foreach ( $valInArr in $splitArr)
                    {
                        # skip the first six URL segments (indices 0 through 5)
                        if ( $idx -ge 6)
                        {
                            # skip the file name
                            if ( $fileName -ne $valInArr)
                            {
                                # append the path to the existing
                                $filePath = $("{0}\{1}" -f $filePath, $valInArr)
                            }
                        }
                        $idx++
                    }
                    #Ensure All Folders in the Local Path
                    $LocalFolder = $filePath
                    #Create Local Folder, if it doesn't exist
                    If (!(Test-Path -Path $LocalFolder)) 
                    {
                        New-Item -ItemType Directory -Path $LocalFolder | Out-Null
                    }
                    #Download file , if it doesn't exist
                    If (!(Test-Path -LiteralPath $("{0}\{1}" -f $filePath, $fileName))) 
                    {
                        try
                        {
                            if ( $global:ConnectPnPDoneFlag -eq $false )
                            {
                                Write-Log -Path $LogFileName  $("Connecting to {0}" -f $SiteURL);
                                Connect-PnPOnline $SiteURL -ClientId "TODO" -ClientSecret "*****"
                                Write-Log -Path $LogFileName  $("Connected  to {0}" -f $SiteURL);
                                # since we are connected make this flag true
                                $global:ConnectPnPDoneFlag = $true
                            }
                            # strip the host from the URL
                            # https://Contoso.sharepoint.com/sites/LegalDept/Verification%20Documents/Documents/FirtnameLastname0877_115502.pdf
                            # should be /sites/LegalDept/Verification%20Documents/Documents/FirtnameLastname0877_115502.pdf
                            $relativeFileURL = ([uri]$webUrl2work).LocalPath
                            Write-Log -Path $LogFileName  $("Download file from {0}." -f $relativeFileURL);
                            Get-PnPFile -Url $relativeFileURL -Path $filePath -FileName "$fileName" -AsFile
                            Write-Log -Path $LogFileName  $("to {0}\{1}." -f $filePath,$fileName);
                            $global:TotalFilesDownloaded += 1
                        }
                        catch
                        {
                            WriteExceptionInformation ( $PSItem )
                            UpdateOrchestratorInouFile -Status2update "FAILED"
                            $didFailStatusHappen = $true
                            ### stop processing the remaining rows after a failure;
                            ### the finally block will then skip the COMPLETE update
                            break
                        }
                    }
                    else
                    {
                        $global:TotalFilesAlreadyPresent += 1
                        Write-Log -Path $LogFileName  $("File {0}\{1} already downloaded." -f $filePath,$fileName);
                    }
                    $CurrentFileNumber += 1
                    Write-Log -Path $LogFileName  $("CurrentFileNumber {0}" -f $CurrentFileNumber);
                    # TODO REMOVE LATER - stop early once $MaxFiles2Get files have been processed (testing aid)
                    if ( $CurrentFileNumber -eq $MaxFiles2Get)
                    {
                        break
                    }
                }
            }
        }
        catch {
            WriteExceptionInformation ( $PSItem )
            UpdateOrchestratorInouFile -Status2update "FAILED"
            $didFailStatusHappen = $true
            ### STOP everything if an error occurred
            ### (break outside a loop bubbles up and ends the script once the finally block has run)
            break
        }
        finally {
            <#Do this after the try block regardless of whether an exception occurred or not#>
            ##### 
            #Update complete only if fail did not happen before.
            if ( $true -ne $didFailStatusHappen )
            {
                UpdateOrchestratorInouFile -Status2update "COMPLETE"
        
            }
        }

    }
}

$StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
$LogFileName = $("{0}\Batch-{1:d2}-Log-{2}.txt" -f $env:LOG_FILE_PATH , $batchNumber, (Get-Date -Format "yyyy-MM-dd-HH-mm-ss"))
Write-Log -Path $LogFileName " *************************************** Start  *************************************** "

#Change Window Title
$Host.UI.RawUI.WindowTitle = $("Batch number {0}." -f $batchNumber);

MainWorkerFunc # CALL THE MAIN WORKER FUNCTION
$StopWatch.Stop()
Write-Log -Path $LogFileName " ------------------------------------------------------------------------------------- "
Write-Log -Path $LogFileName  $("Batch number {0}." -f $batchNumber);
Write-Log -Path $LogFileName  $("Total files already found present: {0}" -f $global:TotalFilesAlreadyPresent);
Write-Log -Path $LogFileName  $("Total files downloaded: {0}" -f $global:TotalFilesDownloaded);
Write-Log -Path $LogFileName  $("Elapsed time in TotalMinutes: {0}" -f $StopWatch.Elapsed.TotalMinutes);
Write-Log -Path $LogFileName " ------------------------------------------------------------------------------------- "
Write-Log -Path $LogFileName " ***************************************  End   *************************************** "
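
To make the path-building loop in MainWorkerFunc easier to follow, here is a minimal, self-contained sketch of the same mapping (the URL and the C:\Downloads root below are illustrative values, not taken from the scripts above):

Add-Type -AssemblyName System.Web   # for [System.Web.HttpUtility]

# Illustrative input; the real script reads webUrl values from the batch CSV.
$webUrl           = "https://Contoso.sharepoint.com/sites/LegalDept/Verification%20Documents/Documents/Sample_115502.pdf"
$DownloadLocation = "C:\Downloads"

$decoded  = [System.Web.HttpUtility]::UrlDecode($webUrl)    # %20 becomes a space
$fileName = Split-Path -Path $decoded -Leaf                 # Sample_115502.pdf
$segments = $decoded.Split('/')

# Indices 0-5 are: "https:", "", the host, "sites", the site name, and the library root.
# Everything from index 6 onward, except the file name itself, becomes a local subfolder.
$localFolder = $DownloadLocation
for ($i = 6; $i -lt $segments.Length; $i++)
{
    if ($segments[$i] -ne $fileName)
    {
        $localFolder = "{0}\{1}" -f $localFolder, $segments[$i]
    }
}
"{0}\{1}" -f $localFolder, $fileName    # prints C:\Downloads\Documents\Sample_115502.pdf

Only the folders below the document library are mirrored locally, which matches what the worker script creates before calling Get-PnPFile.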

Orchestrator PowerShell Script



#Initialize variables
$CSVFilesPath                   = "C:\LegalDept"
$OrechestratorCSVFileName       = "0-Orchestrator.csv"
$env:LOG_FILE_PATH              = "C:\LegalDept\Logs"

function WriteExceptionInformation($AnItem)
{
    Write-Log -Path $LogFileName  $AnItem.Exception.Message
    Write-Log -Path $LogFileName  $AnItem.Exception.StackTrace            
    Write-Log -Path $LogFileName  $AnItem.Exception.ScriptStackTrace
    # Out-String renders InvocationInfo to text so it can be logged as a single message
    Write-Log -Path $LogFileName  ($AnItem.InvocationInfo | Format-List * | Out-String)
}

function Write-Log
{

    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [ValidateNotNullOrEmpty()]
        [Alias("LogContent")]
        [string]$Message,

        [Parameter(Mandatory=$false)]
        [Alias('LogPath')]
        [string]$Path='C:\Logs\PowerShellLog.log',
        
        [Parameter(Mandatory=$false)]
        [ValidateSet("Error","Warn","Info")]
        [string]$Level="Info",
        
        [Parameter(Mandatory=$false)]
        [switch]$NoClobber
    )

    Begin
    {
        # Set VerbosePreference to Continue so that verbose messages are displayed.
        $VerbosePreference = 'Continue'
    }
    Process
    {
        
        # If the file already exists and NoClobber was specified, do not write to the log.
        if ((Test-Path $Path) -AND $NoClobber) {
            Write-Error "Log file $Path already exists, and you specified NoClobber. Either delete the file or specify a different name."
            Return
            }

        # If attempting to write to a log file in a folder/path that doesn't exist create the file including the path.
        elseif (!(Test-Path $Path)) {
            Write-Verbose "Creating $Path."
            # Out-Null keeps the created-file object out of the function's output
            New-Item $Path -Force -ItemType File | Out-Null
            }

        else {
            # Nothing to see here yet.
            }

        # Format Date for our Log File
        $FormattedDate = Get-Date -Format "yyyy-MM-dd HH:mm:ss"

        # Write message to error, warning, or verbose pipeline and specify $LevelText
        switch ($Level) {
            'Error' {
                Write-Error $Message
                $LevelText = 'ERROR:'
                }
            'Warn' {
                Write-Warning $Message
                $LevelText = 'WARNING:'
                }
            'Info' {
                Write-Verbose $Message
                $LevelText = 'INFO:'
                }
            }
        
        # Write log entry to $Path
        "$FormattedDate $LevelText $Message" | Out-File -FilePath $Path -Append
        ## also dump to console
        #$savedColor = $host.UI.RawUI.ForegroundColor 
        #$host.UI.RawUI.ForegroundColor = "DarkGreen"
        Write-Output  $message 
        #$host.UI.RawUI.ForegroundColor = $savedColor
    }
    End
    {
    }
}
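
# Illustrative Write-Log usage (these example messages are not from the original script):
#   Write-Log -Path $LogFileName "Starting batch"                # Level defaults to Info
#   Write-Log -Path $LogFileName -Level Warn  "Row had no webUrl"
#   Write-Log -Path $LogFileName -Level Error "Download failed"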


function MainOrchestratorFunc {
    $csvfile = Import-CSV -Path $("{0}\{1}" -f $CSVFilesPath, $OrechestratorCSVFileName)

    if ( $null -ne $csvfile)
    {
        foreach ( $aRowInFile in $csvfile)
        {
            Write-Log -Path $LogFileName $("Batch number {0:d2} has Status {1}" -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
            switch ($aRowInFile.Status.ToUpper())
            {
                "NEW"
                {
                    Write-Log -Path $LogFileName $("Batch number {0:d2} has '{1}' Status. Spawn this batch and change status to InProgress." -f $aRowInFile.BatchNumber, $aRowInFile.Status )
                    # spawn the worker script for this batch number
                    SpawnThePowerShellProcess -batchnumber2Process $aRowInFile.BatchNumber
                }
                "INPROGRESS"
                {
                    Write-Log -Path $LogFileName $("Batch number {0:d2} has '{1}' Status. Do nothing." -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
                }
                "FAILED"
                {
                    Write-Log -Path $LogFileName $("Batch number {0:d2} has '{1}' Status. Spawn this batch and change status to InProgress." -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
                    SpawnThePowerShellProcess -batchnumber2Process $aRowInFile.BatchNumber
                }
                "COMPLETE"
                {
                    Write-Log -Path $LogFileName $("Batch number {0:d2} has '{1}' Status. Do nothing." -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
                }
                default
                    {
                        Write-Log -Path $LogFileName $("Batch number {0:d2} has an INVALID Status '{1}'" -f $aRowInFile.BatchNumber, $aRowInFile.Status ) 
                    }
            }
        }
    }
}
function SpawnThePowerShellProcess {
    param (
        [int]$batchnumber2Process
    )
    $processOptions = @{
        FilePath = "PowerShell" 
        WorkingDirectory = "C:\scripts"
        # -File ensures the script path is run as a script, not parsed as a command
        ArgumentList = "-File C:\scripts\DownloadFiles.ps1 -batchNumber $batchnumber2Process" 
    }
    Start-Process @processOptions -Verb RunAs -WindowStyle Normal
}

$StopWatch = [System.Diagnostics.Stopwatch]::StartNew()
$LogFileName = $("{0}\Orchestrator-Log-{1}.txt" -f $env:LOG_FILE_PATH , (Get-Date -Format "yyyy-MM-dd-HH-mm-ss"))
Write-Log -Path $LogFileName " *************************************** Start  *************************************** "
MainOrchestratorFunc # CALL THE MAIN ORCHESTRATOR FUNCTION
$StopWatch.Stop()
Write-Log -Path $LogFileName " ------------------------------------------------------------------------------------- "
Write-Log -Path $LogFileName  $("Elapsed time in TotalMinutes: {0}" -f $StopWatch.Elapsed.TotalMinutes);
Write-Log -Path $LogFileName " ------------------------------------------------------------------------------------- "
Write-Log -Path $LogFileName " ***************************************  End   *************************************** "

The Orchestrator CSV file

The Status field is case-insensitive. The orchestrator recognizes four values:

Status New means the batch has not run yet; the orchestrator spawns it, e.g. 1,New
Status InProgress means the batch is currently running; the orchestrator does nothing, e.g. 1,InProgress
Status Failed means the batch failed and needs to run again; the orchestrator re-spawns it, e.g. 1,Failed
Status Complete means the batch finished successfully; the orchestrator does NOT run it again, e.g. 1,Complete

"BatchNumber","Status"
"1","COMPLETE"
"2","NEW"
"3","COMPLETE"
"4","INPROGRESS"
"5","COMPLETE"
"6","COMPLETE"
"7","INPROGRESS"
"8","COMPLETE"
"9","FAILED"
"10","COMPLETE"

Conclusion

The script and instructions are rough, but they should be helpful if you are a developer.

The customer originally used the code from here. That code works, but for a large document library they ran into the issues mentioned in the summary.

This article should help you download the files in multiple parallel processes. In case of failure, the phase 2 script can be run again and will pick up where it left off, skipping any files already downloaded to the local folder.
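
If you want the retries to happen unattended, a hypothetical wrapper like the one below could re-run the orchestrator until every batch reports COMPLETE. The Orchestrator.ps1 file name, the paths, and the 10-minute polling interval are all assumptions, not part of the original scripts:

# Hypothetical wrapper: re-run the orchestrator until every batch is COMPLETE.
$orchestratorCsv = "C:\LegalDept\0-Orchestrator.csv"
do
{
    & "C:\scripts\Orchestrator.ps1"     # assumed file name for the orchestrator script above
    Start-Sleep -Seconds 600            # give the spawned batches time to make progress
    $rows    = Import-Csv -Path $orchestratorCsv
    $pending = @($rows | Where-Object { $_.Status.ToUpper() -ne "COMPLETE" })
} while ($pending.Count -gt 0)

Note that a batch stuck in INPROGRESS (for example after a crash) would never be re-spawned by this loop; its status would need to be reset to Failed by hand first.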

Posted in MS Graph, PnP.PowerShell, SharePoint, SharePoint 2013, Technical Stuff | Leave a comment