How to use AAD Access Token in Connect-MgGraph?

Summary

The Microsoft Graph PowerShell SDK is a great and simple way to get MS Graph API PowerShell code working quickly. However, most of the source code and examples I have found use the X.509 certificate way of authenticating. For a quick demo with an Azure AD security token there is a simpler way, which I will describe in this post.

Script example

The tip is very simple. Since Connect-MgGraph does not have a client secret parameter, use Invoke-RestMethod to get the access token. Once a valid token is received, pass it to Connect-MgGraph and make the rest of the MS Graph SDK calls after that.

In the following example I have used the Get-MgGroup call after successfully connecting to MS Graph.

# The following commands only require one-time execution
if ( (Get-ExecutionPolicy) -eq 'RemoteSigned' )
{
    Write-Host "RemoteSigned policy exists."
}
else
{
    Write-Host "RemoteSigned policy does not exist."
    Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
}

if (Get-Module -ListAvailable -Name Microsoft.Graph) {
    Write-Host "Microsoft.Graph Module exists"
} 
else {
    Write-Host "Microsoft.Graph Module does not exist"
    Install-Module Microsoft.Graph -Scope AllUsers
}

# Populate with the App Registration details and Tenant ID
$ClientId          = "TODO"
$ClientSecret      = "TODO" 
$tenantid          = "TODO" 
$GraphScopes       = "https://graph.microsoft.com/.default"


$headers = @{
    "Content-Type" = "application/x-www-form-urlencoded"
}

$body = "grant_type=client_credentials&client_id=$ClientId&client_secret=$ClientSecret&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default"
$authUri = "https://login.microsoftonline.com/$tenantid/oauth2/v2.0/token"
$response = Invoke-RestMethod $authUri  -Method 'POST' -Headers $headers -Body $body
$response | ConvertTo-Json
 
$token = $response.access_token
 
# Authenticate to the Microsoft Graph
Connect-MgGraph -AccessToken $token
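
# Note (assumption about newer module versions): with the Microsoft.Graph module v2.x,
# -AccessToken expects a SecureString rather than a plain string, for example:
# Connect-MgGraph -AccessToken (ConvertTo-SecureString $token -AsPlainText -Force)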

# If you want to see debugging output of the command just add "-Debug" to the call.
Get-MgGroup -Top 10

Conclusion

I hope this helps you. I use this technique to quickly check / test the calls to the MS Graph.

Note: Please make sure your Azure AD app has the required permissions applied and consented; otherwise you will get an “Insufficient privileges to complete the operation.” error.
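
To quickly verify which application permissions your token actually carries, you can decode the JWT payload and inspect the roles claim. The following is a minimal sketch, assuming $token holds the access token obtained above.

# Decode the token payload (base64url) and list the app's granted roles
$payload = $token.Split('.')[1].Replace('-', '+').Replace('_', '/')
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }
$claims = [System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) | ConvertFrom-Json
$claims.roles   # e.g. Group.Read.All should be listed for Get-MgGroup to work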

You can also use the MS Graph Explorer as a UI way to test your API calls and check the required permissions.

https://aka.ms/GE

PS C:\WINDOWS\system32> Get-MgUser -Top 10
Get-MgUser : Insufficient privileges to complete the operation.
At line:1 char:1
+ Get-MgUser -Top 10
+ ~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: ({ ConsistencyLe...ndProperty =  }:<>f__AnonymousType59`9) [Get-MgUser_List1], RestException`1
    + FullyQualifiedErrorId : Authorization_RequestDenied,Microsoft.Graph.PowerShell.Cmdlets.GetMgUser_List1

PS C:\WINDOWS\system32> 
Posted in MS Graph, Technical Stuff | 1 Comment

How to get all sites from the tenant using MS Graph API?

Summary

The PnP PowerShell command Get-PnPTenantSite, which gets all sites from the tenant, takes a long time to run. Additionally, it does not offer an asynchronous way to get the information from within an Azure Durable Function.

This article uses the MS Graph List Sites API to get the sites. To use this API, the following application API permissions are required for the Azure AD app.

Sites.Read.All, Sites.ReadWrite.All

Script

$StartTime = Get-Date

# You will need Azure AD app with the following API permissions.
# Application	Sites.Read.All
#
$ClientId          = "TODO"
$ClientSecret      = "TODO"
$tenantid          = "TODO"
$path2File         = 'C:\temp\test.txt' # Change this as you like.

## Get Auth Token ## 
$headersAuth = @{
    "Content-Type" = "application/x-www-form-urlencoded"
    'Accept' = '*/*'
}
$body = $("grant_type=client_credentials&client_id={0}&client_secret={1}&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default" -f $ClientId, $ClientSecret)
$oauthTokenUrl = $("https://login.microsoftonline.com/{0}/oauth2/v2.0/token" -f $tenantid)
$response = Invoke-RestMethod $oauthTokenUrl -Method 'POST' -Headers $headersAuth -Body $body
$response | ConvertTo-Json
$tokenExpiryTime = (get-date).AddSeconds($response.expires_in)
##
## Make the first call with $filter set to your tenant name ##
##
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/json")
$headers.Add("SdkVersion", "postman-graph/v1.0")
$headers.Add("Prefer", "apiversion=2.1")
$headers.Add("Authorization", $("Bearer {0}" -f $response.access_token) )
$response = Invoke-RestMethod 'https://graph.microsoft.com/v1.0/sites?$filter=siteCollection/hostname eq ''{CHANGE TO YOUR TENANT NAME}.sharepoint.com''' -Method 'GET' -Headers $headers
$response | ConvertTo-Json
## Page through the results; the final page has no '@odata.nextLink'.
do
{
    ## iterate on the response and write the site url to the file.
    foreach ( $val in $response.Value )
    {
        Write-Output $("{0}" -f $val.webUrl)
        Add-Content -Path $path2File -Value $val.webUrl
    }

    if ( $null -eq $response.'@odata.nextLink' ) { break }

    # check if the token expired, if it did get a new one.
    if ( (Get-Date) -gt $tokenExpiryTime )
    {
        $tokenResponse = Invoke-RestMethod "https://login.microsoftonline.com/$($tenantid)/oauth2/v2.0/token" -Method 'POST' -Headers $headersAuth -Body $body
        $tokenResponse | ConvertTo-Json

        $tokenExpiryTime = (Get-Date).AddSeconds($tokenResponse.expires_in)

        # modify the header with the new token
        $headers = @{
            "Content-Type" = "application/json"
            "Authorization" = $("Bearer {0}" -f $tokenResponse.access_token)
        }
    }

    # fetch the next page of sites.
    $response = Invoke-RestMethod $response.'@odata.nextLink' -Method 'GET' -Headers $headers
    $response | ConvertTo-Json
} while ( $true )
$EndTime = Get-Date
Write-Host "This script took $(($EndTime - $StartTime).TotalMilliseconds) milliseconds to run."

Conclusion

Using the MS Graph API, you can get the list of all sites within your tenant. The same can be done using Get-PnPTenantSite, but that has overhead if you just want the site URLs of all sites.
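
For comparison, the PnP PowerShell approach mentioned above is just two commands. A minimal sketch, assuming you connect to the tenant admin site (replace the placeholder with your tenant name):

# The slower PnP PowerShell alternative
Connect-PnPOnline -Url "https://{YOUR TENANT}-admin.sharepoint.com" -Interactive
Get-PnPTenantSite | Select-Object -ExpandProperty Url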

Posted in MS Graph, SharePoint | Leave a comment

How to hide welcome message for an empty SharePoint List?

Summary

When you create a new custom list, you will notice that it shows the following message.

The empty list shows an image with the text “Welcome to your new list” and “Select the New button to get started”.

The custom welcome message for the empty list.

This post should help you to hide the welcome message.

Step By Step Solution

Step # 1: Create a hideWelcome column of Yes/No type with the default value set to No.

Create a hideWelcome column.

Step # 2: Add a dummy first row with hideWelcome set to Yes.

Add a new row with the hideWelcome column set to Yes.
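
If you prefer to script these two steps, here is a minimal PnP PowerShell sketch. It assumes a list named "MyList" and an existing Connect-PnPOnline session; adjust the names to match your list.

# Create the hideWelcome Yes/No column and add the dummy first row
Add-PnPField -List "MyList" -DisplayName "hideWelcome" -InternalName "hideWelcome" -Type Boolean
Add-PnPListItem -List "MyList" -Values @{ "Title" = "Placeholder"; "hideWelcome" = $true }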

Step # 3: Add the View Formatter JSON code for the trick.

{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/v2/row-formatting.schema.json",
  "hideColumnHeader": true,
  "hideSelection": true,
  "debugMode": true,
  "rowFormatter": {
    "elmType": "div",
    "style": {
      "display":  "=if([$hideWelcome], 'none', '')"
    },
	"children": [
	  {
		"elmType": "div",
		"txtContent": "[$Title]",
		"style": {
		  "flex-grow": "1"
		}
	  }
	]
  }
}

The first row with the hideWelcome as Yes will hide the welcome message.

The hidden welcome message

Conclusion

The above trick may not work for all scenarios, as I have not tested every one. It is a technique to hide the welcome message based on the hidden field named hideWelcome.

Posted in SharePoint, Technical Stuff | 1 Comment

How to extract User Profile Photo using MS Graph API?

Summary

The existing Champion Management Platform Teams app created in the PnP community uses the User Profile Photo MS Graph API to extract and update the profile picture with the badge.

This article will demonstrate how to do the same API call with Power Automate.

Prerequisites

  • Power Automate Premium license to use the HTTP actions
  • Azure AD app with the MS Graph User.ReadWrite.All permission granted

Step by Step Solution

Step # 1: Create an Azure AD App with MS Graph Application Permission granted

Azure AD app

Step #2: Make a note of Application ID, Tenant ID, and Client Secret for the above Azure AD app.

Use these noted values in the next step.

Step #3: Create a new Power Automate flow with the “Manually trigger a flow” trigger. Initialize three variables and define one text input named UserUPN.

Step #4: Make an HTTP call to get the app’s access token.

HTTP call to get an access token.
# Please use the highlighted values for URI, Headers and Body. 

{
    "inputs": {
        "method": "POST",
        "uri": "https://login.microsoftonline.com/@{variables('TenantID')}/oauth2/v2.0/token",
        "headers": {
            "Content-Type": "application/x-www-form-urlencoded"
        },
        "body": "grant_type=client_credentials&scope=https://graph.microsoft.com/.default&client_id=@{variables('ApplicationID')}&client_secret=@{variables('ClientSecret')}"
    },
    "metadata": {
        "operationMetadataId": "a69e019a-d351-409a-ae1b-340a23f4b775"
    }
}

Step # 5: Use Parse JSON to get the output values of the above action.

Parse JSON
{
    "type": "object",
    "properties": {
        "token_type": {
            "type": "string"
        },
        "expires_in": {
            "type": "integer"
        },
        "ext_expires_in": {
            "type": "integer"
        },
        "access_token": {
            "type": "string"
        }
    }
}

Step # 6: Now make the Get Profile Image call to get the Image content of the profile photo.

Make an HTTP call to get the profile image.
# Please use the highlighted values for URI and Headers. 

{
    "inputs": {
        "method": "GET",
        "uri": "https://graph.microsoft.com/v1.0/users/@{triggerBody()['text']}/photo/$value",
        "headers": {
            "responseType": "blob",
            "Content-Type": "blob",
            "Authorization": "@{body('ParseJSONforToken')?['token_type']} @{body('ParseJSONforToken')?['access_token']}"
        }
    },
    "metadata": {
        "operationMetadataId": "775579d0-6aa9-4326-a1f9-0ad37217c304"
    }
}

Step # 7: Finally, add a Compose action to get the image content from the above action.

Conclusion

As shown in the above technique, you can get users' profile images in the tenant. The Azure AD app plays an important part in making the MS Graph API call. The same API endpoint can also be called with PUT, which lets you apply a badge or anything else to the user profile picture.
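
If you want to try the same calls outside Power Automate, here is a minimal PowerShell sketch. It assumes $token holds an app-only access token with User.ReadWrite.All and $userUpn is the target user's UPN; the file name is only an example.

$headers = @{ Authorization = "Bearer $token" }
# GET the current profile photo as a binary file
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/users/$userUpn/photo/`$value" -Method GET -Headers $headers -OutFile .\photo.jpg
# PUT an updated photo (for example, after stamping a badge on it)
Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/users/$userUpn/photo/`$value" -Method PUT -Headers $headers -ContentType "image/jpeg" -InFile .\photo.jpg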

Posted in Technical Stuff | 1 Comment

How to Export Intune reports using Graph APIs?

Summary

The following REST API call gets the Intune report data for the tenant.

# The API is a REST call with the request body to get the report CSV file.
https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs

Please refer here for more details on the API.

Step By Step Solution

Step # 1 Create an Azure AD app with the MS Graph “DeviceManagementManagedDevices.Read.All” permission.

MS Graph “DeviceManagementManagedDevices.Read.All” permission.

Please note the Application ID, Secret, and Tenant ID. You will need these three pieces of information in the PowerShell Script.

Step # 2 Using PowerShell run the following script.

# Init Variables
$outputPath    = "C:\Hold"
$outputCSVPath = "C:\Hold\EAWFAreport.zip"  # change this path if needed

$ApplicationID   = "TOBE CHANGED"
$TenantID        = "TOBE CHANGED"
$AccessSecret    = "TOBE CHANGED"

#Create an hash table with the required value to connect to Microsoft graph
$Body = @{    
    grant_type    = "client_credentials"
    scope         = "https://graph.microsoft.com/.default"
    client_id     = $ApplicationID
    client_secret = $AccessSecret
} 

#Connect to Microsoft Graph REST web service
$ConnectGraph = Invoke-RestMethod -Uri https://login.microsoftonline.com/$TenantID/oauth2/v2.0/token -Method POST -Body $Body

#Endpoint Analytics Graph API
$GraphGroupUrl = "https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs"

# define request body as PS Object
$requestBody = @{
    reportName = "Devices"
    select = @(
        "DeviceId"
        "DeviceName"
        "SerialNumber"
        "ManagedBy"
        "Manufacturer"
        "Model"
        "GraphDeviceIsManaged"
    )

}

# Convert to PS Object to JSON object
$requestJSONBody = ConvertTo-Json $requestBody

#define header, use the token from the above rest call to AAD.
# in post method define the body is of type JSON using content-type property.
$headers = @{
    'Authorization' = $("{0} {1}" -f $ConnectGraph.token_type, $ConnectGraph.access_token)
    'Accept' = 'application/json;'
    'Content-Type' = "application/json"
}

#This API call will start a process in the background to download the file.
$webResponse = Invoke-RestMethod $GraphGroupUrl -Method 'POST' -Headers $headers -Body $requestJSONBody -verbose


#If the call is a success, proceed to get the CSV file.
if ( -not ( $null -eq $webResponse ) )
{
    #Check status of export (GET) until status = complete
    do
    {

#format the URL to make a next call to get the file location.
        $url2GetCSV = $("https://graph.microsoft.com/beta/deviceManagement/reports/exportJobs('{0}')" -f $webResponse.id)
        "Calling $url2GetCSV"
        $responseforCSV = Invoke-RestMethod $url2GetCSV -Method 'GET' -Headers $headers  -verbose
        if (( -not ( $null -eq $responseforCSV ) ) -and ( $responseforCSV.status -eq "completed"))
        {
            # The export is complete, download the ZIP from the returned URL to $outputCSVPath.
            Invoke-WebRequest -Uri $responseforCSV.url -OutFile $outputCSVPath
            # Unzip the file.
            Expand-Archive -LiteralPath $outputCSVPath -DestinationPath $outputPath
        }
        else
        {
            Write-Host "Still in progress..."
            Start-Sleep -Seconds 10 # Delay for 10 seconds before polling again.
        }
    } While (( -not ( $null -eq $responseforCSV ) ) -and ( $responseforCSV.status -ne "completed"))

}

After this PowerShell script call, you should see the Zip and CSV files in the C:\Hold Folder.
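
To take a quick look at the downloaded data, you can load the extracted CSV into PowerShell. This is a minimal sketch; the CSV file name inside the ZIP is generated by the service, so the script simply picks the first CSV found in the output folder.

# Load the extracted report and show a few of the requested columns
$reportCsv = Get-ChildItem -Path $outputPath -Filter *.csv | Select-Object -First 1
$devices   = Import-Csv -Path $reportCsv.FullName
$devices | Select-Object DeviceName, SerialNumber, Manufacturer, Model | Format-Table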

Conclusion

The above steps will help you get the Intune report data file. The API is still in beta; if anything changes, I will update this post.

Posted in Technical Stuff | 4 Comments

How to prevent ListView WebPart from making frequent Search API calls?

Summary

My customer has migrated their classic sites to SharePoint Online. Some site home pages contain list view web parts, and these pages make multiple query calls to the Search API every 60 seconds.

This can make the Search API throttle the queries, which leads to a bad user experience if many users visit the same page frequently. Read this article for details on throttling.

Step by Step to diagnose the issue

To diagnose an issue like throttling, you will need the browser's developer tools; you can get more information here about the developer tools for the Edge browser.

Open the developer tools console window and run localStorage.clear() and sessionStorage.clear(). This clears the browser's local and session storage for the site.

Go to the SharePoint page with the List View web part. Open the Network tab and watch the outbound traffic. You will notice that every 60 seconds the List View web part tries to refresh.

This happens because of the following list view web part setting:

“Automatic Refreshing interval (seconds)”

Increase the auto-refresh interval and turn it on to show the manual refresh button.

Conclusion

This may seem like a small thing, but it can cause issues if the page is popular and many users visit it at the same time. If a user leaves the page open for a long time, the auto-refresh keeps making calls at the configured interval even though the user may not need the refresh at all.

The better fix for such a page is to increase the interval and provide the manual refresh button, so users can refresh when they need to. This also reduces unnecessary calls to the Search API.

Posted in Technical Stuff | Leave a comment

How to turn on versioning on ALL document libraries for a site?

Summary

The requirement is to turn on versioning for all document libraries in a site, including subsites. Also, the major version limit should be 500.

The Document Library Versioning Settings for Major Version as 500

The following Set-PnPList command can be used to enable versioning and set the major version limit to 500 for a document library.

# 

Set-PnPList -Identity "Documents" -EnableVersioning $true -MajorVersions 500

The following command can be used to enable versioning and set the major and minor version limits.

# In this command, the minor version shown in the UI (as above) is the draft version.

Set-PnPList -Identity "Documents" -EnableVersioning $true -MajorVersions 25 -EnableMinorVersions $true -MinorVersions 10

#

There are two files to automate the process of turning on versioning. The input file, site2process.csv, contains the list of site collection URLs to process.

Url
https://m365x162783.sharepoint.com/sites/Test1
https://m365x162783.sharepoint.com/sites/Test2

The script file is located here.
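
The core of that script is straightforward. Here is a minimal sketch, assuming the PnP.PowerShell module and an interactive login; subsites would additionally need to be enumerated (for example with Get-PnPSubWeb) and connected to separately.

# Enable versioning with a 500 major version limit on every document library (BaseTemplate 101)
$sites = Import-Csv -Path .\site2process.csv
foreach ($site in $sites) {
    Connect-PnPOnline -Url $site.Url -Interactive
    Get-PnPList | Where-Object { $_.BaseTemplate -eq 101 -and -not $_.Hidden } | ForEach-Object {
        Set-PnPList -Identity $_ -EnableVersioning $true -MajorVersions 500
    }
}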

Conclusion

Using the above script, you can turn on versioning for all the document libraries.

Posted in PnP.PowerShell, SharePoint | Leave a comment

Build 2022 – May 24 to May 26

Summary

This post is to list the Build 2022 sessions by Solution Areas.

Click/Select on the Solution Area topic to get the list of sessions.

Azure
Azure – Application Innovation
Azure – Data
Azure – IoT
Azure – Infrastructure
Azure – AI
Azure – Enable
Azure – Migration and Modernization
Azure – Power BI
Azure – Mixed Reality
Microsoft 365
Microsoft 365 – Windows
Microsoft 365 – Microsoft Teams
Microsoft 365 – Microsoft 365 Apps
Microsoft 365 – Collaborative Apps
Microsoft 365 – Microsoft Graph
Microsoft 365 – Microsoft Viva
Microsoft 365 – Platform & Integration
Power Platform
Power Platform – Power Platform
Power Platform – Power Apps
Power Platform – Power BI
Power Platform – Power Automate
Power Platform – Power Virtual Agent
Security
Security – Identity & Access Management
Security – Compliance
AI & Innovation
AI at Scale
AI for Good
Startup and Other Areas
Startup
Other Areas

Top of page

Azure – Application Innovation
Level 100 – Azure – Application Innovation
“Hello, World!” in 3 programming languages 
A peek inside the developer’s toolkit
Build your own resume website and stand out to recruiters
Microsoft Community Training for Nonprofits
So many programming languages, so little time–which should I learn?
Tackling the technical interview
The New Developer’s Guide to the Cloud
“Intro to Tech Skills”? Tell me more!
Level 200 – Azure – Application Innovation
Codespaces: Managed cloud environments developers love with the controls their admins need
Maximize Agility by Automating Custom Rollouts in Azure Devops
Next Steps for Distributed Programming with .NET and Orleans
Rapidly code, test and ship from secure cloud developer environments
Scale cloud-native apps and accelerate app modernization
Sneak Peek: Azure Developer CLI
Level 300 – Azure – Application Innovation
Accelerate and secure your code to cloud development
Ask the Experts: .NET and Visual Studio
Ask the Experts: Accelerate and secure your code to cloud development
Ask the Experts: Delivering developer velocity through the entire engineering system
Ask the Experts: Deploy modern containerized apps and cloud native databases at scale
Ask the Experts: Modernize and scale enterprise Java applications
Ask the Experts: Seamless and secure Kubernetes experience and observability anywhere
Build native apps for any device with .NET and Visual Studio
Building document intelligence applications with Azure Applied AI and Azure Cognitive Services
Co-developing – accelerate your digital transformation with Microsoft by your side
Delivering developer velocity through the entire engineering system
Deploy modern containerized apps and cloud native databases at scale
Develop and Deploy Your Java Apps using Tools and Frameworks You Love
How to quickly add chat to your .NET app using the low-code Weavy Drop-in UI
Modernize and scale enterprise Java applications
NVIDIA RAPIDS Spark plugin on Azure Synapse
Seamless and secure Kubernetes experience and observability anywhere
Tooling for Incremental ASP.NET Core Migrations
Verifiable Credentials – The What, The Why, The How
What’s Next in C# 11

Top of page

Azure – Data
Level 100 – Azure – Data
GitHub + Microsoft Docs + You
Level 200 – Azure – Data
Accelerate innovation and achieve agility on a trusted, integrated platform with hybrid and multicloud capabilities
Analytics and Operational Data
Azure SQL and Azure Functions: Integration with SQL bindings
Introduction to Azure Arc enabled Kubernetes
Protect your databases using Microsoft Defender for Cloud
Securing Azure-Arc Enabled Data Services
Level 300 – Azure – Data
Ask the Experts: Accelerate and secure your code to cloud development
Ask the Experts: Build and deploy containerized applications and databases for hybrid and multicloud with Azure Arc
Ask the Experts: Modernize your applications with new features across SQL Server 2022 and Azure SQL
Azure Cosmos DB: Learn how to enable analytics over real-time operational data with Azure Synapse Link
Build and deploy containerized applications and databases for hybrid and multicloud with Azure Arc
Delivering developer velocity through the entire engineering system
Develop applications with Azure Database for MySQL – Flexible Server
Modernize your applications with new innovations across SQL Server 2022 and Azure SQL
Modernizing real-time data infrastructure with Azure

Top of page

Azure – IoT
Level 100 – Azure – IoT
Azure IoT Central Roadmap
Building the Arm64 ecosystem on Windows IoT Enterprise with the i.MX 8 platform
Create and Connect Secure and Trustworthy IoT Devices
Dual Operator – Windows IoT
Edge Device Image Builder
IoT Device Developer Experience
Windows IoT Roadmap
Windows IoT Security
Level 200 – Azure – IoT
Azure IoT: Device onboarding experiences for Edge with DPS
Azure RTOS Product Updates
Building Industrial Digital Twin Applications
Connecting MCU-IoT Devices to the Cloud
Data processing challenges in building enterprise grade IoT solutions
From the Edge to the Metaverse, how IoT powers it all
Prescriptive Architecture guidance for IoT Workloads
Project Haven: Kubernetes for the embedded edge
Reimagine the future of driving insights and actions from physical world data with Azure IoT, Azure Data, and Power Apps
Scaling to a Billion Devices – Azure IoT Platform Vision and Roadmap
Well Architected Framework for IoT
Level 300 – Azure – IoT
Ask the Experts: How Vision AI applications use NVIDIA DeepStream and Azure IOT Edge Services
Deploy IoT solutions with Azure SQL Database
Industrial IoT & Azure IoT Central

Top of page

Azure – Infrastructure
Level 100 – Azure – Infrastructure
The Essential Nature of Cloud Native Processor: Foundations, Solutions, and Benefits
Level 200 – Azure – Infrastructure
Accelerate Microsoft Teams Operator Connect with Azure based microservices
PowerShell 7
Level 300 – Azure – Infrastructure
Azure PaaS and Cloud Native Development

Top of page

Azure – AI
 Level 100 – Azure – AI
A guided journey into AI
Accelerate your Azure data and AI journey with IBM’s Data Fabric
Microsoft Build Into Focus: AI
 Level 200 – Azure – AI
Accelerate Compute on PC class devices with Azure IoT Edge for Linux on Windows (EFLOW)
Azure Machine Learning – MLOps and Responsible ML
Embrace digital transformation at the edge with Azure Percept
Solving healthcare’s toughest problems: Building clinical models using a self-supervision NLP framework
 Level 300 – Azure – AI
Ask the Experts: Scaling responsible MLOps with Azure Machine Learning
Ask the Experts: Seamless and secure Kubernetes experience and observability anywhere
Discussing accelerated model inference for AzureML Deployment with ONNX-RT, OLive, NVIDIA Triton Inference Server & Model Analyzer
Scaling responsible MLOps with Azure Machine Learning

Top of page

Azure – Enable
 Level 200 – Azure – Enable
GitHub + Azure: Better Together
Introduction to Azure Arc enabled Kubernetes

Top of page

Azure – Migration & Modernization
 Level 200 – Azure – Migration & Modernization
Building and running enterprise-grade Spring applications in the cloud

Top of page

Azure – Power BI
 Level 300 – Azure – Power BI
Ask the Experts: Democratizing your data at scale with Power BI
Democratize your data at scale with Power BI

Top of page

Azure – Mixed Reality
Level 100 – Azure – Mixed Reality
Microsoft Build Into Focus: Preparing for the metaverse

Top of page

Microsoft 365 – Windows
Level 100 – Microsoft 365 – Windows
Bring your Android apps to Windows
Level 200 – Microsoft 365 – Windows
Best practices for app publishing
Building great apps for Windows
Create a simple Windows App with WinUI
Create next-gen experiences at scale with Windows
Edge DevTools: Reinventing Browser Tools for the Future of Web Development
Enriching desktop experiences with the power and reach of the web
Evolving MSIX
Evolving the Microsoft Store for Business and Education
Finding your success in the Microsoft Store
Improving your Windows app’s performance
Promoting apps and content to your target audiences on the Microsoft Store
Set up a great dev environment for any project
 Level 300 – Microsoft 365 – Windows
Ask the Experts: Building great apps with the open platform of Windows
Ask the Experts: Develop Windows apps on and for a rich ecosystem of platforms and devices
Building great apps with the open platform of Windows
Make your cross-platform apps best on Windows
The new Microsoft Store, built for your success
Level 400
Develop Windows apps on and for a rich ecosystem of platforms and devices

Top of page

Build 2022 – Microsoft 365 – Microsoft Teams
 Level 100 – Microsoft 365 – Microsoft Teams
Enabling 3rd party apps on Teams with confidence with M365 App Compliance Program
 Level 200 – Microsoft 365 – Microsoft Teams
Accelerating your Microsoft Teams app development and time to market with Teams Toolkit
Build rich micro-capabilities on Microsoft Teams platform by leveraging link unfurling
Building custom clients for virtual events & visit scenarios using Azure Communication Services, Microsoft Teams & Microsoft Graph
Conversational apps in Microsoft Teams
Create interactive meeting apps for Microsoft Teams
Deep dive into Microsoft Teams JS SDK v2 for extending Teams apps to Outlook and Office.com
Extend Approvals across your line of business applications in Microsoft Teams
Make your meetings more interactive! Learn how to build engaging synchronous experiences your users will love
Monetize your third-party apps for Microsoft Teams
Unlock enterprise adoption of your apps in Microsoft 365 and Microsoft Teams
Level 300 – Microsoft 365 – Microsoft Teams
Ask the Experts: Microsoft Graph
Microsoft Build Into Focus: Build collaborative apps to thrive in the modern workplace
Reach 270M users and grow your business with Microsoft Teams

Top of page

Microsoft 365 – Microsoft 365 Apps
 Level 200 – Microsoft 365 – Microsoft 365 Apps
Excel add-ins and data types
 Level 300 – Microsoft 365 – Microsoft 365 Apps
Ask the Experts: Microsoft Teams

Top of page

Microsoft 365 – Collaborative Apps
Level 200 – Microsoft 365 – Collaborative Apps
Build collaborative apps with new Microsoft 365 and Microsoft Teams collaboration controls in Power Apps
Building collaborative web apps with Fluid Framework & Azure Fluid Relay
Innovate with collaborative apps and low code
Level 300
Build collaborative apps with Microsoft Teams and Microsoft 365 services

Top of page

Microsoft 365 – Microsoft Graph
Level 200 – Microsoft 365 – Microsoft Graph
Building Microsoft Graph Connectors to improve your workplace search experience and increase engagement with app content
Building Real-time Collaborative Apps with Azure, Microsoft 365, and Power Platform
Latest and greatest from Microsoft Graph to power your people centric apps
Unlocking the power of your Microsoft 365 data with Microsoft Graph Data Connect
Level 300 – Microsoft 365 – Microsoft Graph
Deep Dive into Microsoft Graph SDKs

Top of page

Microsoft 365 – Microsoft Viva
Level 200 – Microsoft 365 – Microsoft Viva
How to open your app to cross-organizational collaboration with Microsoft Teams Connect
​Building tailored employee experiences with Microsoft Viva Connections and SharePoint Framework

Top of page

Microsoft 365 – Platform & Integration
Level 200 – Microsoft 365 – Platform & Integration
Bring Microsoft Teams (Chats & Channel) collaboration to your Apps by leveraging Microsoft Graph
Everything new you need to know about Microsoft Teams Platform in 30 minutes or less

Top of page

Power Platform – Power Platform
Level 100 – Power Platform – Power Platform
Access Connector for Dataverse and Power Platform GA Launch
Implement Power Platform with Architecture Best Practices and Fusion Development 
Microsoft technologies and the dev community: Who’s building what? Get inspired!
Level 200 – Power Platform – Power Platform
Expedite application delivery with low-code and fusion teams
Level 300 – Power Platform – Power Platform
Adobe + Microsoft: Building document workflows through low-code automation

Top of page

Power Platform – Power Apps
Level 100 – Power Platform – Power Apps
Microsoft Build Into Focus: Low Code solutions using Microsoft Power Platform
Level 200 – Power Platform – Power Apps
Building Teams apps that bring value quickly through low-code
Pro developer capabilities of Power Apps and Dataverse for Access Developers
What’s new in the world of Microsoft Power Apps
Level 300 – Power Platform – Power Apps
Ask the Experts: Power up your development efforts with the latest low code innovations
Build a Power Apps component
Discover and use Web APIs with Power Apps

Top of page

Power Platform – Power BI
Level 100 – Power Platform – Power BI
Milwaukee Tool Driving its DnA Culture with Unified Analytics 
Level 200 – Power Platform – Power BI
Best practices for deploying and scaling Power BI Embedded Analytics
Enhancements to administration, security, and governance in Power BI
Integrating Power BI deployment pipelines into Azure DevOps and Azure Pipelines.
Organize & query your data lakes using Azure Synapse database templates & lake databases
The Future of Enterprise Semantic Models  – Power BI Premium
Transform how business intelligence is used with Power BI in PowerPoint and Outlook 
Using AI With Documents: Syntex, AI Builder & Azure Cognitive Services
What’s new with Synapse Gen2 data warehouse

Top of page

Power Platform – Power Automate
Level 200 – Power Platform – Power Automate
Exciting innovations in Power Automate
Seamlessly scale RPA with Power Automate + Azure VM
Level 300 – Power Platform – Power Automate
Ask the Experts: Accelerate cloud automation and conversational bot development.​
How to accelerate cloud automation and conversational bot development for developers

Top of page

Power Platform – Power Virtual Agent
Level 200 – Power Platform – Power Virtual Agent
Managing Conversational Bots in an Enterprise
What’s new for Microsoft’s Conversational AI and Power Virtual Agents

Top of page

Security – Identity & Access Management
Level 200 – Security – Identity & Access Management
Build the SOC of the future with the Azure AD Identity Protection APIs
Configure Application Security Features
Creating secure identities for apps using the Microsoft identity platform
Building on Microsoft Sentinel platform 
Level 300 – Security – Identity & Access Management
More secure, and resilient, apps built on Azure AD Continuous Access Evaluation

Top of page

Security – Compliance
Level 200 – Security – Compliance
Automate and customize retention and deletion scenarios

Top of page

AI & Innovation
Level 200 – AI at Scale
Tapping into the Qualcomm AI Engine – On-device AI driving improved and new Windows experiences
The Future of AI Development Tools
Level 300 – AI at Scale
Azure Cognitive Service deployment: AI inference with NVIDIA Triton Server
Document Intelligence using Azure Feature Store (Feathr) and SynapseML
AI & Innovation
Level 200 – AI For Good
Driving inclusion and accessibility with dev tooling and AI services
How to develop custom object models with data labeling tools and AutoML

Top of page

Startup
Architecting Your Startup Stack
Ask the Experts: Architecting Your Startup Stack
Avoid these 3 mistakes to ensure your model reaches production
Intrapreneurship: Four Steps to Successfully Growing Your Idea in the Enterprise
Microsoft for Startups unlocking the Power of OpenAI for your startup

Top of page

Build 2022 – Other Areas
Level 100
2022 Imagine Cup World Championship
Advancing Equity & Inclusion in Products
Autism, Anxiety, and Running out of Spoons
Coding with kids: Cultivating the next generation of developers through play
Contributors: Assemble! Unleashing the Power of Community
Environmental sustainability: Device and cloud solutions strategies to reduce your climate impact
Microsoft Build After Hours: Day 1
Microsoft Build After Hours: Day 2
Microsoft Build Keynote Analysis
Microsoft Build Opening
Microsoft Build Opening & Core Theme Sessions Replay (optimized for EMEA time zones)
Panel discussion: Build the skills you need for today’s tech world
Level 200
.NET MAUI – Updates and Roadmap
Ask the Experts: Continuous Cloud Modernization with Feature Flags
Future Possibilities for .NET Core and WASI (WebAssembly on the Server)
GitHub Issues – planning and tracking for developers
Minimal APIs: Past, Present, and Future
Visual Studio 2022 and Beyond
Visual Studio 2022 for Mac and Beyond
Level 300
Ask the Experts: Unlocking Better SQL Performance and Scalability
Build the Future of Web 3 Faster and Easier – New Tools and New Features
Developing and Optimizing Software for Hybrid Architecture
Native client apps with Blazor Hybrid
Testing Modern Web Apps with Playwright
Level 400
Breaking barriers to make open source work at work
Continuous Delivery with GitHub Actions
Output Caching in ASP.NET Core 7

Posted in Build 2022 | Leave a comment

How to get SharePoint audit reports using Office 365 Management APIs?

Summary

The following are the customer's concerns regarding audit reports for SharePoint sites in the Microsoft 365 cloud.

  • SPO site collection admins do not get the same GUI for site audit reports that was available in SharePoint on-premises.
  • Currently, the reports are available only as CSV files.
  • Currently, Office 365 tenant admins must run the report to get the CSV files.

As per the customer, in SharePoint on-premises the site collection administrator (SCA) could set the following audit options and pull the reports from Site Settings. The same is not possible in SPO.

  • Opening or downloading documents, viewing items in lists, viewing item properties
  • Editing items, Checking out or checking in items, Moving or copying items to another location on the site, Deleting or restoring items
  • Editing content types or columns, Searching site content, Editing users and permissions
In SharePoint on-premises, these events were available to audit.

Step by Step Solution

The SharePoint Online workload in Microsoft 365 tracks audit log data at the tenant level. The tenant administrator can get this data (in CSV format) from the compliance center portal, which is a manual process.

To automate the process, the Office 365 Management APIs can be used to get this audit data. There are two types of Office 365 Management APIs:

Office 365 Service Communications API, which can do the following:

  • Get Services: Get the list of subscribed services.
  • Get Current Status: Get a real-time view of current and ongoing service incidents.
  • Get Historical Status: Get a historical view of service incidents.
  • Get Messages: Find Incident and Message Center communications.

Office 365 Management Activity API, which can do the following:

  • Use the Office 365 Management Activity API to retrieve information about the user, admin, system, and policy actions and events from Office 365 and Azure AD activity logs.
  • Use audit and activity logs to create solutions that provide monitoring, analysis, and data visualization.

The Office 365 Management Activity API allows you to pull audit logs for the following workloads.

  1. Audit.AzureActiveDirectory
  2. Audit.Exchange
  3. Audit.SharePoint
  4. Audit.General (includes all other workloads not included in the previous content types)
  5. DLP.All (DLP events only for all workloads)

Step # 1 In Compliance Center turn on auditing

  • Go to https://compliance.microsoft.com and sign in.
  • In the left navigation pane of the Microsoft 365 compliance center, click Audit.
  • If auditing is not turned on for your organization, a banner is displayed prompting you to start recording user and admin activity.
  • Click the Start recording user and admin activity banner.
  • It may take up to 60 minutes for the change to take effect.
Compliance Center to turn on auditing

Step # 2 Register an Azure AD App

To register the Azure AD application you can follow this step.

Add and grant consent to the following Application permissions.
– Office 365 Management API
1. ActivityFeed.Read
2. ActivityFeed.ReadDlp
3. ServiceHealth.Read

Step # 3 Start a subscription for a workload(s)

Prior to making this call, you will need to get an access token using the Azure AD app (the token must be issued for the https://manage.office.com resource). The following call subscribes to Audit.SharePoint; you can add more workloads.

*** REQUEST TO BE MADE ***
POST {root}/subscriptions/start?contentType=Audit.SharePoint&PublisherIdentifier={TenantGUID}
Content-Type: application/json; utf-8
Authorization: Bearer eyJ0e...Qa6wg

RESPONSE
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
{
    "contentType": "Audit.SharePoint",
    "status": "enabled"
}
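
The same subscription call can be made from PowerShell. A minimal sketch, assuming $tenantId is set and $token holds an app-only access token requested with the scope https://manage.office.com/.default:

# Start the Audit.SharePoint subscription (a one-time operation per content type)
$headers = @{ Authorization = "Bearer $token" }
$uri = "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/start?contentType=Audit.SharePoint&PublisherIdentifier=$tenantId"
Invoke-RestMethod -Uri $uri -Method POST -Headers $headers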

Step # 4 Activity API operations

After the one-time subscription, you can call the content feed API to get audit information. The subscription can also be stopped later.

*** MAKE a Call to get the list of Audit reports ***
*** StartTime and End Time must not exceed 24 hours ***
*** The times must be in the form of yyyy-MM-ddTHH:mm:ss.fffZ ***
*** PowerShell: $startTimeDt.ToString("yyyy-MM-ddTHH:mm:ss.fffZ") ***

REQUEST

GET https://manage.office.com/api/v1.0/{{TenantID}}/activity/feed/subscriptions/content?
startTime={{startTime}}&
endTime={{endTime}}&
contentType={{contentType}}&
PublisherIdentifier={{TenantID}}

RESPONSE 

[
    {
        "contentUri": "https://manage.office.com/api/v1.0/b07282ed-2513-42ff-8322-de55ebce98f1/activity/feed/audit/20220404172343761148151$20220405052516832093374$audit_sharepoint$Audit_SharePoint$na0034",
        "contentId": "20220404172343761148151$20220405052516832093374$audit_sharepoint$Audit_SharePoint$na0034",
        "contentType": "Audit.SharePoint",
        "contentCreated": "2022-04-05T05:25:16.832Z",
        "contentExpiration": "2022-04-11T17:23:43.761Z"
    },
    {
.... removed for brevity
]

Step # 5 Using the above response, make another GET call for each “contentUri” to get the JSON audit records

As you can see, each response contains multiple audit records in JSON. The properties can differ based on which content type is subscribed. Here is a sample record showing the response values.

 {
        "AppAccessContext": {
            "AADSessionId": "5e5d69ef-a701-4ff7-9068-adc9eaa444ba",
            "CorrelationId": "ff8230a0-20f5-c000-c88a-0b7be49d5f5b",
            "UniqueTokenId": "bflaGhwgWUuOpcy6h_cdAA"
        },
        "CreationTime": "2022-04-04T19:17:56",
        "Id": "aa088334-d15b-413b-3ed1-08da166fd2f6",
        "Operation": "FileAccessed",
        "OrganizationId": "b07282ed-2513-42ff-8322-de55ebce98f1",
        "RecordType": 6,
        "UserKey": "i:0h.f|membership|sdasdasd@live.com",
        "UserType": 0,
        "Version": 1,
        "Workload": "SharePoint",
        "ClientIP": "255.255.23.12",
        "ObjectId": "https://somesite.sharepoint.com/sites/appcatalog/clientsideassets/c40b89d1-d05f-4623-b02e-b78276a050d2/navigation-panel-application-customizer_039e677240705a0c1bbc4023a93bf51e.js",
        "UserId": "bobk@somewhere.com",
        "CorrelationId": "ff8230a0-20f5-c000-c88a-0b7be49d5f5b",
        "CustomUniqueId": false,
        "EventSource": "SharePoint",
        "ItemType": "File",
        "ListId": "300cfa22-0c1e-411b-a10e-7d6b81978e76",
        "ListItemUniqueId": "a25248f3-f04d-4442-b184-accadd27335c",
        "Site": "7b8c0cf1-acef-4e86-b3dc-009f68708b39",
        "UserAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.74 Safari/537.36 Edg/99.0.1150.46",
        "WebId": "aaae54a6-b040-43b4-965f-55aaa97b95df",
        "SourceFileExtension": "js",
        "SiteUrl": "https://somesite.sharepoint.com/sites/AppCatalog/",
        "SourceFileName": "navigation-panel-application-customizer_039e677240705a0c1bbc4023a93bf51e.js",
        "SourceRelativeUrl": "ClientSideAssets/c40b89d1-d05f-4623-b02e-b78276a050d2"
    }
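
Steps 4 and 5 can also be combined in a small PowerShell sketch. It assumes the same $tenantId and $token as above, lists the content blobs for the last 24 hours, and then downloads the audit records from each contentUri:

# List available content for a 24-hour window, then fetch each blob of audit records
$headers   = @{ Authorization = "Bearer $token" }
$endTime   = (Get-Date).ToUniversalTime()
$startTime = $endTime.AddHours(-24)
$listUri   = "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/content?contentType=Audit.SharePoint&startTime=$($startTime.ToString('yyyy-MM-ddTHH:mm:ss.fffZ'))&endTime=$($endTime.ToString('yyyy-MM-ddTHH:mm:ss.fffZ'))&PublisherIdentifier=$tenantId"
$contentList = Invoke-RestMethod -Uri $listUri -Method GET -Headers $headers
foreach ($blob in $contentList) {
    # Each contentUri returns an array of audit records like the sample above
    $records = Invoke-RestMethod -Uri $blob.contentUri -Method GET -Headers $headers
    $records | Select-Object CreationTime, Operation, UserId, ObjectId
}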

Automating the above steps

After the above steps, you should be able to see the audit data, but these are all manual steps. To automate them, you will need Azure resources such as Azure Storage, Azure Functions, and Azure Cosmos DB. The following logical architecture depicts those Azure resources.

Logical architecture for automating the collection of the audit data
Service Type – Description
  • Azure Cosmos DB – To store the audit data from the Store Events Azure Function
  • Azure Service Bus – To queue messages from the Queue Events Azure Function
  • Azure Storage – Optionally, you could use an Azure Queue instead of Azure Service Bus
  • Azure Function – The Azure Function compute resources are where the Queue Events and Store Events functions run
  • Azure Application Insights – Part of Azure Monitor, required for Azure Function monitoring
  • Power BI Premium – The Power BI Premium license is required for the dashboard
List of required Azure resources for automating the audit log solution

To create the above resources in your Azure demo tenant you can use this ARM template.

The Queue Events Azure Function Code is located here.

The Store Events Azure Function Code is located here.

Once the data is gathered in Azure Cosmos DB, Power BI can generate a report similar to the following. The report shown is a basic example, but once the data is collected in Azure Cosmos DB you can create more such reports to meet your audit needs.

The report can answer the following questions:

Who performed an operation on the file, list, or library?

When was the operation performed?

What operation was performed?

Which sites are used?

The Audit report from the audit data gathered in the Azure Cosmos DB.

The Power BI PBIX file is located here.

Conclusion

The above solution is a proof of concept (POC) for SharePoint and OneDrive data. Audit data for other workloads, such as Exchange, DLP, and Azure AD, can be collected as well.

Reference:

Using the Office 365 Management Activity API and Power BI for security analysis (Part 1)

Posted in PnP.PowerShell, SharePoint 2010, SharePoint 2013, Technical Stuff | 1 Comment

Application Lifecycle Management (ALM) for Power Platform (PP) 3 of 3

Summary

This is the final part of the three-part blog post.

The Part 1 and Part 2 posts focused on the basic settings for the classic UI mode pipeline setup. In this post, I will explain a more advanced, modular, factory-pattern approach to the CI/CD Azure DevOps pipelines.

My customer has Azure DevOps Server (ADOS) with a Windows OS pool agent for the default pool.

The prerequisites defined in Part 1 still apply to this approach. Additionally, you will need some basic familiarity with YAML and Git installed on your local machine. This approach uses the Microsoft Power Platform CLI (pac CLI), which performs better than the Power Platform Build Tools described in Part 2.

Step by Step solution

Step # 1 Create an organization and project in your Azure DevOps.

Create a new project in the Azure DevOps organization

Step # 2 Initialize a blank Repository.

A blank repository

Step # 3 Clone the repository

Clone the repository to your local machine
### You will need Git installed on your local machine to run this command.
C:\SharePointDev\PowerPlatformALM>git clone {Your Clone repo URL}

Step # 4 Get the code from the GitHub repo. Copy files and folders to the root of the above local cloned repository.

The GitHub repo code is located here.

Copy the above two folders to the cloned local repo.

Step # 5 Using the following Git commands push the changed code to the Azure DevOps repo.

# this will show the status of untracked changes
git status  
# this command will add the untracked changes to the local repository as tracked
git add .
# this command will commit to the local repository
git commit -m "Added .azdops and pwsh folders"
# this command will push the local committed changes to the remote repository
git push
The four commands to push the two folders and files to the Azure DevOps repo.

The final result in the Azure DevOps repository should look similar to the following. The .azdops and pwsh folders should be top-level folders in your repository.

The Azure DevOps repository final result.

Step # 6 Create Azure Pipelines variable groups ‘Credentials’, ‘Dev’, ‘Test’, and ‘Prod’.

In the ADO project, click on Library and add the four variable groups. Now create variables in these variable groups; they are used in the YAML code. Please keep the names of the variable groups and variables as defined below.

Add four variable groups.
#
# Variable Group 'Credentials' will have the following three variables.
#
ClientId      = {Get the value from the Part1 blog post Step # 3 }
ClientSecret  = {Get the value from the Part1 blog post Step # 3 }
TenantId      = {Get the value from the Part1 blog post Step # 3 }
#
# Variable Group 'Dev', 'Test', 'Prod' will have the following variable.
# The Url value will be different for all three.
#
Url           = {Get the value from the Part1 blog post Step # 2 }

Note: The above values will be used in the pipeline.

Step # 7 Define Test and Prod Azure DevOps Environments

Under pipelines -> environments, define test and prod environments. The environments can be used for adding any approval requirements.

Create test and prod environments

For the prod environment, optionally add the approval settings and assign one or more approvers.

Assign approver to the prod environment.

Step # 8 Create a pipeline using the existing file .azdops\pipelines\export-commit.yml

(Note: a longer step)

Click Create Pipeline button under the Pipelines
Select Azure Repos Git
Select the Repository for the project.
Select the ‘Existing Azure Pipelines YAML file’ option.
Select ‘export-commit.yml’ file
Finally, review the code and click the arrow to save the pipeline. DO NOT RUN at this time.
Now rename the newly created pipeline by clicking 1, 2, and 3 as shown above.
Rename it to “Export and Commit by Solution Name Variable”
First, add the variable ‘SolutionName’ before running the pipeline.

Note: You will get the following error when you run the pipeline. You need to set the Contribute permission to Allow for the build service account.

“remote: TF401027: You need the Git ‘GenericContribute’ permission to perform this action.”

An error when you run the pipeline the first time.
Change the Contribute settings to Allow for the Build Service agent.
Upon successful pipeline run, you should see your Solution under the solution folder as shown above.

Step # 9 Set up the multi-stage build and deploy pipeline for .azdops\pipelines\build-deploy.yml

Consider the file .azdops\pipelines\build-deploy.yml as the template for your Power Platform solution. For the ‘Spark2022’ solution name from the example above, I copied the template to a new file named ‘Spark2022-build-deploy.yml’.

Make a copy of the ‘build-deploy.yml’ template file for your solution. Replace the Demo2022 text in the yml file with your solution name.

Step # 10 Create a Pipeline using the existing file .azdops\pipelines\{YOUR SOLUTION NAME}-build-deploy.yml

Note: You will repeat most of the steps from Step # 8 above. Please refer above for the screenshots.

Click on Pipeline, then click the “New pipeline” button.

Select Azure Repos Git

Select your specific repository

Select the “Existing Azure Pipelines YAML file”

Select the /.azdops/pipelines/{YOUR SOLUTION NAME}-build-deploy.yml file (you created this in Step # 9).

DO NOT run yet; click the down arrow next to the blue Run button and select Save.

Now click on Pipelines from the left menu. Click on the “All” tab.

Click on the newly created pipeline to rename, see below.

Rename the new pipeline.

Rename to something like “{YourSolutionName} Build and Deploy”. So for the above example, it would be “Spark2022 Build and Deploy”

Rename to “{YourSolutionName} Build and Deploy”

Step # 11 Create config.test.json and config.prod.json files (if not present) under the solution folder.

The two config files for test and prod go under your solution's folder.

Two config files for test and prod with empty JSON for now.

Saving these files will trigger the build; if it does not, run the pipeline manually. You will see the multiple stages for Build, Test, and Prod.

NOTE: In this pipeline, you do not need to set any variable before running it.

The three stages of the build and deploy pipeline.

Step # 12 Regarding the config.test.json and config.prod.json files.

This is an optional step. You may have environment variables for connection references, a site URL, a list GUID, etc. These values are different in each environment, and the values used for deployment can be configured in the config.{ENV}.json file.

The trick to getting the content of the JSON file is to make use of the Visual Studio Code pac cli extension.

#
# Use the *unmanaged* solution to extract the dev config settings.
#
pac solution create-settings --solution-zip Spark2022.zip --settings-file config.dev.json
Dev config Settings using Pac CLI.
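
The generated file gives you the shape to fill in per environment. As a rough illustration (the schema name and connector values below are placeholders, not from the actual solution), config.test.json might look something like this:

{
  "EnvironmentVariables": [
    { "SchemaName": "new_SiteUrl", "Value": "https://contoso.sharepoint.com/sites/Test" }
  ],
  "ConnectionReferences": [
    { "LogicalName": "new_sharedsharepointonline_12345", "ConnectionId": "", "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline" }
  ]
}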

CITIZEN DEVELOPER SCENARIO

All above steps 1 to 12 are for the Build Administrator or Technical lead of the project.

For the Citizen developer’s role, the steps are simple and as follows.

  1. Gather the requirements or feedback or bugs for the application.
  2. Create a Power Platform Solution in the Dev Environment e.g. LaptopInventory
    1. Develop a PowerApps App
    2. Develop a PowerAutomate Flow
    3. Creates a List and any other components required for the solution.
    4. Test the application and flow
  3. Run Export and Commit Pipeline (export-commit.yml) for the LaptopInventory solution.
    1. Note: you need to add a SolutionName=LaptopInventory variable before running the pipeline
    2. This action will create a new directory ‘LaptopInventory’ in the repository
  4. Create config.test.json and config.prod.json files with an empty record as {}
    1. Note: This is a one-time activity. It can be repetitive if you are adding any config variables for various environments.
  5. Create a Build and Deploy Pipeline for the solution ‘LaptopInventory’ (see Step # 9 and Step # 10)
    1. Note: This is a one-time activity
    2. Make a copy of build-deploy.yml to LaptopInventory.yml in the ‘.azdops\pipelines’ folder
    3. Change the text from ‘Demo2022’ to ‘LaptopInventory’
    4. Create a Pipeline using the new YML file
    5. Run the pipeline if it is not running already. (This action will make the Solution available in the Test Environments)
  6. Ask Testers to test and provide feedback.
  7. Repeat the above steps from here as needed for the ALM cycle.

The flow chart below gives an overview of the process. The developer runs “Export & Commit” with the solution name. This action (1) exports the solution and commits it to source control management (SCM). Checking in the solution file(s) automatically triggers the “Build & Release” pipeline, which eventually puts the managed solution in the TEST or PROD environments.

Depending on the feedback from the testers the above process is repeated by the developer.

The two pipelines show the CI and CD.

Conclusion

This concludes the three-part blog series on Application Lifecycle Management for Power Platform.

Please refer to the following articles for more information.

YAML schema reference for Azure Pipelines

Move flows across environments without resetting connections! (Please use connection references in your solutions)

Power Platform Environment variables overview

Azure DevOps CODE: microsoft-power-platform-ce/spark-prodev-practice (This blog post made use of most of the code from this repo)

GitHub Actions CODE: melody-universe/levelup-devops (If you plan to use the GitHub actions this repo can be a good start)

Introducing the ALM Accelerator for Power Platform

Thank you Melody Universe and Paul Breuler for your code and knowledge sharing.

Posted in Power Apps, Power Apps, Power Automate, SharePoint, Technical Stuff | 1 Comment