How to get SharePoint audit reports using Office 365 Management APIs?

Summary

The following are the customer concerns with SharePoint sites in the Microsoft 365 cloud when it comes to audit reports.

  • SPO site collection admins do not get the same GUI presentation for site audit reports that was available in SharePoint on-premises.
  • Currently, the reports are available only as CSV files.
  • Currently, only O365 tenant admins can run the report to get the CSV files.

As per the customer, in SharePoint on-premises an SCA could set the following audit options and pull the reports from Site Settings. The same is not possible in SPO.

  • Opening or downloading documents, viewing items in lists, viewing item properties
  • Editing items, Checking out or checking in items, Moving or copying items to another location on the site, Deleting or restoring items
  • Editing content types or columns, Searching site content, Editing users and permissions
In SharePoint on-premises these events were available to audit.

Step by Step Solution

The SharePoint Online workload in Microsoft 365 tracks audit log data at the tenant level. The tenant administrator can get this data (in CSV format) from the compliance center portal, which is a manual process for the admin.

To automate the process, the Office 365 Management APIs can be used to get the audit data. There are two types of Office 365 Management APIs:

Office 365 Service Communications API, which can do the following:

  • Get Services: Get the list of subscribed services.
  • Get Current Status: Get a real-time view of current and ongoing service incidents.
  • Get Historical Status: Get a historical view of service incidents.
  • Get Messages: Find Incident and Message Center communications.

Office 365 Management Activity API, which can do the following:

  • Retrieve information about user, admin, system, and policy actions and events from Office 365 and Azure AD activity logs.
  • Use the audit and activity logs to create solutions that provide monitoring, analysis, and data visualization.

The Office 365 Management Activity API lets you pull audit logs for the following content types.

  1. Audit.AzureActiveDirectory
  2. Audit.Exchange
  3. Audit.SharePoint
  4. Audit.General (includes all other workloads not included in the previous content types)
  5. DLP.All (DLP events only for all workloads)

Step # 1 In Compliance Center turn on auditing

  • Go to https://compliance.microsoft.com and sign in.
  • In the left navigation pane of the Microsoft 365 compliance center, click Audit.
  • If auditing is not turned on for your organization, a banner is displayed prompting you to start recording user and admin activity.
  • Click the Start recording user and admin activity banner.
  • It may take up to 60 minutes for the change to take effect.
Compliance Center to turn on auditing
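
If you prefer scripting, the same switch can be flipped with Exchange Online PowerShell. A minimal sketch, assuming the ExchangeOnlineManagement module is installed and you hold the required admin role:

# Connect to Exchange Online and enable unified audit log ingestion
Connect-ExchangeOnline
Set-AdminAuditLogConfig -UnifiedAuditLogIngestionEnabled $true

# Verify the current setting
Get-AdminAuditLogConfig | Select-Object UnifiedAuditLogIngestionEnabled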

Step # 2 Register an Azure AD App

To register the Azure AD application you can follow this step.

Add and grant consent to the following Application permissions.
– Office 365 Management APIs
1. ActivityFeed.Read
2. ActivityFeed.ReadDlp
3. ServiceHealth.Read

Step # 3 Start a subscription for one or more workloads

Prior to making this call you will need to get an access token using the Azure AD app's credentials (a PowerShell example for acquiring the token follows the response below). The following call starts a subscription for Audit.SharePoint; you can start subscriptions for more workloads the same way.

*** REQUEST TO BE MADE ***
POST {root}/subscriptions/start?contentType=Audit.SharePoint&PublisherIdentifier={TenantGUID}
Content-Type: application/json; utf-8
Authorization: Bearer eyJ0e...Qa6wg

RESPONSE
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
{
    "contentType": "Audit.SharePoint",
    "status": "enabled"
}
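
For reference, here is a minimal PowerShell sketch of acquiring the app-only access token and starting the subscription. The tenant GUID, client ID, and secret below are placeholders; your app registration from Step # 2 supplies the real values.

# --- Placeholder values: replace with your tenant GUID and Azure AD app details ---
$tenantId     = "00000000-0000-0000-0000-000000000000"
$clientId     = "11111111-1111-1111-1111-111111111111"
$clientSecret = "<app client secret>"

# Request an app-only token for the Office 365 Management API (client credentials flow)
$body = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    resource      = "https://manage.office.com"
}
$token   = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Body $body
$headers = @{ Authorization = "Bearer $($token.access_token)" }

# Start the Audit.SharePoint subscription (same call as shown above)
Invoke-RestMethod -Method Post -Headers $headers -Uri "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/start?contentType=Audit.SharePoint&PublisherIdentifier=$tenantId"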

Step # 4 Activity API operations

After subscribing once, you can call the content feed API to get the audit information. The subscription can be stopped as well.

*** Make a call to get the list of available audit content ***
*** The startTime and endTime window must not exceed 24 hours ***
*** The times must be in the form yyyy-MM-ddTHH:mm:ss.fffZ ***
*** PowerShell: $startTimeDt.ToString("yyyy-MM-ddTHH:mm:ss.fffZ") ***

REQUEST

GET https://manage.office.com/api/v1.0/{{TenantID}}/activity/feed/subscriptions/content?
startTime={{startTime}}&
endTime={{endTime}}&
contentType={{contentType}}&
PublisherIdentifier={{TenantID}}

RESPONSE 

[
    {
        "contentUri": "https://manage.office.com/api/v1.0/b07282ed-2513-42ff-8322-de55ebce98f1/activity/feed/audit/20220404172343761148151$20220405052516832093374$audit_sharepoint$Audit_SharePoint$na0034",
        "contentId": "20220404172343761148151$20220405052516832093374$audit_sharepoint$Audit_SharePoint$na0034",
        "contentType": "Audit.SharePoint",
        "contentCreated": "2022-04-05T05:25:16.832Z",
        "contentExpiration": "2022-04-11T17:23:43.761Z"
    },
    {
.... removed for brevity
]

Step # 5 For each “contentUri” in the above response, make another GET call to retrieve the JSON audit records

As you can see from the response, there are multiple audit records in the JSON. The fields can differ based on which content type is subscribed. Here is a sample record showing the response values.

 {
        "AppAccessContext": {
            "AADSessionId": "5e5d69ef-a701-4ff7-9068-adc9eaa444ba",
            "CorrelationId": "ff8230a0-20f5-c000-c88a-0b7be49d5f5b",
            "UniqueTokenId": "bflaGhwgWUuOpcy6h_cdAA"
        },
        "CreationTime": "2022-04-04T19:17:56",
        "Id": "aa088334-d15b-413b-3ed1-08da166fd2f6",
        "Operation": "FileAccessed",
        "OrganizationId": "b07282ed-2513-42ff-8322-de55ebce98f1",
        "RecordType": 6,
        "UserKey": "i:0h.f|membership|sdasdasd@live.com",
        "UserType": 0,
        "Version": 1,
        "Workload": "SharePoint",
        "ClientIP": "255.255.23.12",
        "ObjectId": "https://somesite.sharepoint.com/sites/appcatalog/clientsideassets/c40b89d1-d05f-4623-b02e-b78276a050d2/navigation-panel-application-customizer_039e677240705a0c1bbc4023a93bf51e.js",
        "UserId": "bobk@somewhere.com",
        "CorrelationId": "ff8230a0-20f5-c000-c88a-0b7be49d5f5b",
        "CustomUniqueId": false,
        "EventSource": "SharePoint",
        "ItemType": "File",
        "ListId": "300cfa22-0c1e-411b-a10e-7d6b81978e76",
        "ListItemUniqueId": "a25248f3-f04d-4442-b184-accadd27335c",
        "Site": "7b8c0cf1-acef-4e86-b3dc-009f68708b39",
        "UserAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.74 Safari/537.36 Edg/99.0.1150.46",
        "WebId": "aaae54a6-b040-43b4-965f-55aaa97b95df",
        "SourceFileExtension": "js",
        "SiteUrl": "https://somesite.sharepoint.com/sites/AppCatalog/",
        "SourceFileName": "navigation-panel-application-customizer_039e677240705a0c1bbc4023a93bf51e.js",
        "SourceRelativeUrl": "ClientSideAssets/c40b89d1-d05f-4623-b02e-b78276a050d2"
    }
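
Putting Step # 4 and Step # 5 together, here is a minimal PowerShell sketch that lists the available content blobs and downloads the audit records. It reuses the $tenantId and $headers variables from the token example above.

# Build a time window no larger than 24 hours, in the required format
$start = (Get-Date).AddHours(-23).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
$end   = (Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ")

# Step # 4: list the content blobs available for the window
$listUri = "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/content" +
           "?contentType=Audit.SharePoint&startTime=$start&endTime=$end&PublisherIdentifier=$tenantId"
$contentList = Invoke-RestMethod -Method Get -Headers $headers -Uri $listUri

# Step # 5: download each contentUri; each blob is an array of audit records
$auditRecords = foreach ($content in $contentList) {
    Invoke-RestMethod -Method Get -Headers $headers -Uri $content.contentUri
}
$auditRecords | Select-Object CreationTime, Operation, UserId, SiteUrl | Format-Table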

Automating the above steps

After all of the above steps you should be able to see the audit data. But these are all manual steps to get the content. To automate them you will need Azure resources such as Azure Storage, Azure Functions, and Azure Cosmos DB. The following logical architecture depicts those Azure resources.

Logical Architecture for automating to get the audit data
Service Type                 Description
Azure Cosmos DB              Stores the audit data written by the Store Events Azure Function
Azure Service Bus            Queues messages from the Queue Events Azure Function
Azure Storage                Optionally, an Azure Storage Queue can be used instead of Azure Service Bus
Azure Function               The compute where the Queue Events and Store Events functions run
Azure Application Insights   Part of Azure Monitor; required for Azure Function monitoring
Power BI Premium             A Power BI Premium license is required for the dashboard

List of required Azure resources for automating the audit log solution

To create the above resources in your Azure demo tenant you can use this ARM template.

The Queue Events Azure Function Code is located here.

The Store Events Azure Function Code is located here.

Once the data is gathered in Azure Cosmos DB, Power BI can generate a report similar to the following. The report shown is a basic example, but once the data is collected in Azure Cosmos DB you can create more such reports to meet your audit needs.

The report can answer the following questions:

  • Who performed an operation on a file, list, or library?
  • When was the operation performed?
  • What operation was performed?
  • What sites are used?

The Audit report from the audit data gathered in the Azure Cosmos DB.

Conclusion

The above solution is a proof of concept (POC) for SharePoint and OneDrive audit data. Audit data from other workloads, such as Exchange, DLP, and Azure AD, can be collected the same way.

Reference:

Using the Office 365 Management Activity API and Power BI for security analysis (Part 1)

Posted in Technical Stuff, SharePoint 2010, SharePoint 2013, PnP.PowerShell | Leave a comment

Application Lifecycle Management (ALM) for Power Platform (PP) 3 of 3

Summary

This is the final part of the three-part blog post.

Part1 and Part2 posts were focused on the basic settings for the Classic UI mode pipeline setup. In this post, I will explain a more advanced, modular, factory-pattern way of building the CI/CD Azure DevOps pipelines.

My customer has Azure DevOps Server (ADOS) with Windows OS agents in the default pool.

As defined in Part1, the prerequisites still apply to this approach. Additionally, you will need some basic familiarity with the YAML language and Git installed on your local machine. This approach uses the Microsoft Power Platform CLI; the Pac CLI performs better than the Power Platform Build Tools described in Part2.

Step by Step solution

Step # 1 Create an organization and project in your Azure DevOps.

Create a new project in the Azure DevOps organization

Step # 2 Initialize a blank Repository.

A blank repository

Step # 3 Clone the repository

Clone the repository in your local repository
### You will need Git installed on your local machine to run this command.
C:\SharePointDev\PowerPlatformALM>git clone {Your Clone repo URL}

Step # 4 Get the code from the GitHub repo. Copy files and folders to the root of the above local cloned repository.

The GitHub repo code is located here.

Copy the above two folders to the cloned local repo.

Step # 5 Using the following Git commands push the changed code to the Azure DevOps repo.

# this will check the status and show the untracked changes
git status  
# this command will add the untracked changes to the local repo as tracked
git add .
# this command will commit to the local repository
git commit -m "Added .azdops and pwsh folders"
# this command will push the local committed changes to the remote repository
git push
The four commands to push the two folders and files to the Azure DevOps repo.

The final result in the Azure DevOps repository should look similar to the following. The .azdops and pwsh folders should be the top-level folders in your repository.

The Azure DevOps repository final result.

Step # 6 Create Azure Pipelines variable groups ‘Credentials’, ‘Dev’, ‘Test’, and ‘Prod’.

In the ADO project, click on Library and add the four variable groups. Now create variables in these variable groups. These variable groups are used in the YAML code. Please keep the names as defined below for the variable groups and variables.

Add four variable groups.
#
# Variable Group 'Credentials' will have the following three variables.
#
ClientId      = {Get the value from the Part1 blog post Step # 3 }
ClientSecret  = {Get the value from the Part1 blog post Step # 3 }
TenantId      = {Get the value from the Part1 blog post Step # 3 }
#
# Variable Group 'Dev', 'Test', 'Prod' will have the following variable.
# The Url value will be different for all three.
#
Url           = {Get the value from the Part1 blog post Step # 2 }

Note: The above values will be used in the pipeline.

Step # 7 Define Test and Prod Azure DevOps Environments

Under pipelines -> environments, define test and prod environments. The environments can be used for adding any approval requirements.

Create test and prod environments

For the prod environment, (optional) add the approval settings. Assign one or more approvers.

Assign approver to the prod environment.

Step # 8 Create a Pipeline using the existing file .azdops\pipelines\export-commit.yml

(Note: a longer step)

Click Create Pipeline button under the Pipelines
Select Azure Repos Git
Select the Repository for the project.
Select the ‘Existing Azure Pipelines YAML file’ option.
Select ‘export-commit.yml’ file
Finally, review the code and click the arrow to save the pipeline. DO NOT RUN at this time.
Now rename the newly created pipeline by clicking 1, 2, and 3 as shown above.
Rename it to “Export and Commit by Solution Name Variable”
First, add the variable ‘SolutionName’ before running the pipeline.

Note: You will get the following error the first time you run the pipeline. You need to set Contribute to Allow for the build service account.

“remote: TF401027: You need the Git ‘GenericContribute’ permission to perform this action.”

An error when you run the pipeline the first time.
Change the Contribute settings to Allow for the Build Service agent.
Upon successful pipeline run, you should see your Solution under the solution folder as shown above.

Step # 9 Setup the multi-stage build and deploy pipeline for .azdops\pipelines\build-deploy.yml

Consider the file .azdops\pipelines\build-deploy.yml as the template for your Power Platform solution. Following the ‘Spark2022’ solution name example above, I created a copy of the template as a new file named ‘Spark2022-build-deploy.yml’.

Make a copy of the ‘build-deploy.yml’ template file for your solution and replace the ‘Demo2022’ text in the yml file with your solution name.

Step # 10 Create a Pipeline using the existing file .azdops\pipelines\{YOUR SOLUTION NAME}-build-deploy.yml

Note: You will repeat most of the steps from Step # 8 above. Please refer above for the screenshots.

Click on Pipeline, then click the “New pipeline” button.

Select Azure Repos Git

Select your specific repository

Select the “Existing Azure Pipelines YAML file”

Select the /.azdops/pipelines/{YOURSOLUTIONNAME}-build-deploy.yml file. (You created it in Step # 9.)

DO NOT run it yet; click the down arrow next to the blue Run button and select Save.

Now click on Pipelines from the left menu. Click on the “All” tab.

Click on the newly created pipeline to rename, see below.

Rename the new pipeline.

Rename to something like “{YourSolutionName} Build and Deploy”. So for the above example, it would be “Spark2022 Build and Deploy”

Rename to “{YourSolutionName} Build and Deploy”

Step # 11 Create config.test.json and config.prod.json files (if not present) under the solution folder.

The two config files for the test and prod are under your Solution Folders directory.

Two config files for test and prod with empty JSON for now.

Saving these files will trigger the build; if it does not, run the pipeline manually and you will see the multi-stage run for Build, Test, and Prod.

NOTE: In this pipeline, you do not need to set any variable before running it.

The build and deploy three stages.

Step # 12 Regarding the config.test.json and config.prod.json files.

This is an optional step. You may have environment variables for connection references, site URLs, list GUIDs, etc. These variables are different in each environment. The values for deployment can be configured in the config.ENV.json file.

The trick to getting the content of the JSON file is to make use of the Visual Studio Code pac cli extension.

#
# Use the *unmanaged* solution to extract the dev config settings.
#
pac solution create-settings --solution-zip Spark2022.zip --settings-file config.dev.json
Dev config Settings using Pac CLI.
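
The exact content of the generated settings file depends on your solution, but it is a deployment-settings JSON roughly like the following (all names and values below are illustrative):

{
  "EnvironmentVariables": [
    {
      "SchemaName": "new_SiteUrl",
      "Value": "https://contoso.sharepoint.com/sites/dev"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "new_sharedsharepointonline_abc12",
      "ConnectionId": "",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
    }
  ]
}

The config.test.json and config.prod.json files then carry the environment-specific values used by the Test and Prod deployments.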

CITIZEN DEVELOPER SCENARIO

All above steps 1 to 12 are for the Build Administrator or Technical lead of the project.

For the Citizen developer’s role, the steps are simple and as follows.

  1. Gather the requirements or feedback or bugs for the application.
  2. Create a Power Platform Solution in the Dev Environment e.g. LaptopInventory
    1. Develop a PowerApps App
    2. Develop a PowerAutomate Flow
    3. Create a List and any other components required for the solution.
    4. Test the application and flow
  3. Run Export and Commit Pipeline (export-commit.yml) for the LaptopInventory solution.
    1. Note: you need to add a SolutionName=LaptopInventory variable before running the pipeline
    2. This action will create a new directory ‘LaptopInventory’ in the repository
  4. Create config.test.json and config.prod.json files with an empty record as {}
    1. Note: This is a one-time activity. It can be repetitive if you are adding any config variables for various environments.
  5. Create a Build and Deploy Pipeline for the solution ‘LaptopInventory’ (See Step # 9 and Step # 10)
    1. Note: This is a one-time activity
    2. Make a copy of build-deploy.yml to LaptopInventory.yml in the ‘.azdops\pipelines’ folder
    3. Change the text from ‘Demo2022’ to the ‘LaptopInventory’
    4. Create a Pipeline using the new YML file
    5. Run the pipeline if it is not running already. (This action will make the Solution available in the Test Environments)
  6. Ask Testers to test and provide feedback.
  7. Repeat the above steps from here as needed for the ALM cycle.

The flow chart below gives an overview of the process. The developer runs the “Export & Commit” pipeline with the SolutionName. Action “1” exports the solution and commits it to Source Control Management (SCM). Checking in the file(s) for the solution automatically triggers the “Build & Release” pipeline, which eventually puts the managed solution in the TEST and PROD environments.

Depending on the feedback from the testers the above process is repeated by the developer.

The two pipelines show the CI and CD.

Conclusion

This concludes the three-part blog series on Application Lifecycle Management for Power Platform.

Please refer to the following articles for more information.

YAML schema reference for Azure Pipelines

Move flows across environments without resetting connections! (Please use connection references in your solutions)

Power Platform Environment variables overview

Azure DevOps CODE: microsoft-power-platform-ce/spark-prodev-practice (This blog post made use of most of the code from this repo)

GitHub Actions CODE: melody-universe/levelup-devops (If you plan to use the GitHub actions this repo can be a good start)

Introducing the ALM Accelerator for Power Platform

Thank you Melody Universe and Paul Breuler for your code and knowledge sharing.

Posted in Power Apps, Power Apps, Power Automate, SharePoint, Technical Stuff | 1 Comment

How to do CI/CD pipelines build for an SPFX on Azure DevOps for the site collection app catalog?

Summary

My customer needs a way to help developers create CI/CD pipeline builds for SPFx projects. The customer has Azure DevOps Server (ADOS) and needs guidelines for the DevOps process for SPFx projects.

The requirement is to utilize the existing shared build agents, which are Windows VMs. Additionally, the deployment should target the site collection app catalog, not the tenant app catalog.

The sample on the web provided by Andrew Connell was a very good start. I modified it to meet the above requirements. I hope this may be helpful to you.

Prerequisites

  1. Azure DevOps (ADO) Server; if you do not have it, you can utilize the online ADO service. (Follow this article to get a free five-user Basic Plan Azure subscription.)
  2. GitHub Repo

Step By Step Solution

Step # 1: Create a Jobs folder in your source repo.

Create the following three files.

  1. build.yml (File is located here.)
  2. test.yml (File is located here.)
  3. deploy.yml (File is located here.)

Then create the following file in the root of your repository:

azure-pipeline.yml (File is located here.)

Step # 2: Use the above job files to define the pipeline.

Click on the New Pipeline under the Pipelines.

Select the code location, in this case, select the Azure Repos Git.

Select the repository

Select the “Existing Azure Pipelines YAML file” option.

Now point to the azure-pipeline.yml file from Step # 1.

Step # 3: Create the site collection app catalog on the site where you want to deploy.

The site collection app catalog can be created by SPO admins.

Use the Add-SPOSiteCollectionAppCatalog PowerShell command.

OR

Use the PnP Add-PnPSiteCollectionAppCatalog command.
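
A hedged example of both approaches (the URLs are placeholders; the account needs the SharePoint administrator role):

# Option A: SharePoint Online management shell
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"
Add-SPOSiteCollectionAppCatalog -Site "https://contoso.sharepoint.com/sites/devsite"

# Option B: PnP.PowerShell
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/devsite" -Interactive
Add-PnPSiteCollectionAppCatalog -Site "https://contoso.sharepoint.com/sites/devsite"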

Step # 4: You need to create APP ID and APP SECRET

Add-in permissions in SharePoint

SharePoint Add-In — Permission XML cheat sheet

The following three URLs will be handy: the first is to register a new app, the second is to set the permissions for the app, and the third lists the apps you or others have previously created. A sample permission request XML is shown after the list below.

  1. https:// [Your SharePoint Site URL]/_layouts/15/AppRegNew.aspx
  2. https:// [Your SharePoint Site URL]/_layouts/15/AppInv.aspx
  3. https:// [Your SharePoint Site URL]/_layouts/15/AppPrincipals.aspx
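
For reference, the permission request XML entered on AppInv.aspx looks like the following. This sample grants full control at the site collection scope (the same XML appears in the ACS-related post later in this blog); adjust the scope and right to what your deployment actually needs.

<AppPermissionRequests AllowAppOnlyPolicy="true">
  <AppPermissionRequest Scope="http://sharepoint/content/sitecollection" Right="FullControl" />
</AppPermissionRequests>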

Step # 5: Finally, create the following two variables in the pipeline.

ClientID

ClientSecret

Conclusion

The above steps provide the CI/CD pipelines to deploy SPFx packages to the site collection app catalog.

To know more about SPFx, please use the following links.

SharePoint Framework Tutorials – YouTube

SharePoint Framework – YouTube

Posted in Technical Stuff | Leave a comment

Why should you use compose versus variables in apply-to-each action?

Summary

If you have a large Power Automate flow that uses a lot of variables inside loops, I bet your flow performance is not good. If that is the problem, please keep reading; you may benefit from this post.

A few simple facts:

  1. Power Automate loops can run with concurrency. In other words, actions like apply-to-each or do-until can run in parallel.
  2. If a variable is modified inside a loop using the append-to-variable action, Power Automate internally puts LOCKS on the variable. This is by design, so no other branch of the concurrent code modifies the value at the same time.
  3. This locking and unlocking of variables takes time, and your flow suffers in performance.

DON’T modify variables inside a loop.

What is the solution?

The answer is to use Compose. You may ask: but wait, isn’t the Compose action designed for constant values?

Yes, that is correct, the Compose action is for constants. But there is a technique to get the values out of the loop using Compose.

Step By Step Solution

To demonstrate the issue and the solution, I will use the Compose and Variable actions in two parallel branches of a flow. I will create an array of integers using the range function.

Step # 1 Create the Manually trigger flow

Step # 2 Add a Compose Action with expression as follows. Rename it to ‘const array’

range(0,100)

Step # 3 Add an Initialize Variable with Name as ‘array’ and type as an ‘Array’ and add the same above expression for Value. Rename action to ‘var array’

After step # 3, you will have something like this.

Step # 4 Add Apply to each action and rename to “Apply to each const array”

Step # 5: Inside this apply-to-each add a Compose action with expression for “Current Item”

Step # 6: An important step: add a TEMP Compose 2 action with its expression set to the output of the above Compose.

Step # 7: Using peek code, take the expression value. It should look like the following; you only need the bold value from it.

{
    "inputs": "@outputs('Compose')"
}

Step # 8: Add a new Compose 3 action outside of the apply-to-each loop. In its expression, use the expression captured above. Rename it to ‘log const array’.
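
Assuming the inner action keeps its default name ‘Compose’, the peek code of this new ‘log const array’ action is simply the expression captured above. Referencing a loop action from outside the loop returns an array of every iteration’s output, which is exactly what removes the need for a variable:

{
    "inputs": "@outputs('Compose')"
}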

Step # 9: You no longer need the Compose 2 action added in Step # 6, so delete it now.

After step 9, you should have the flow as shown above.

Step # 10: Now add a parallel branch next to the previous apply-to-each action. Add an Initialize variable step with Name ‘results’, Type ‘Array’, and value as two square brackets [].

Step # 11: Add a second apply-to-each on the ‘array’ variable and rename it ‘Apply to each var array’.

Step # 12: Add ‘Append to array variable’, pick the results variable, and set Value to the ‘Current item’ of the second apply-to-each action.

Step # 13: Add a final Compose to display the results variable. Name it ‘log var results’.

After step 13, the flow should look like the above.

Step # 14: Finally, turn on concurrency for both loops. Click ‘Settings’ on both apply-to-each actions, turn on concurrency, and set the degree of parallelism to 50.

setting parallelism

Run the flow now and see the results. I get the following results.

The Compose inside the loop finishes in 2 seconds, whereas the variable approach in the loop takes 12 seconds.

using Compose inside the loop vs variables.

Note: You can download and import the above sample flow into your environment. The export zip file is located here.

Conclusion

Clearly, the Compose action performs better than variables inside a loop. Additionally, concurrency with parallelism gives a better-performing flow.

I have just re-written the same technique that Pieter and John Liu have presented and blogged about.

Posted in Power Automate | Leave a comment

How to add Power Virtual Agent (PVA) Bot to a SharePoint page?

Summary

As you know, the Power Virtual Agent (PVA) is a low-code and no-code solution, and it is very easy to create a bot quickly. Now, if you want this bot on a SharePoint page, there are a few options, each with pros and cons.

In this article, I am going to list out all the options. I will share the pros and cons of each option and share code where needed. This will help guide your journey for a similar requirement.

Prerequisites

There are a few prerequisites; based on which option you pick, you will need one or more of the following resources.

  1. Power Platform Environment – Required for all options
  2. SharePoint Online tenant – Required for all options
  3. Visual Studio – Required for Option # 2
  4. Azure Web App & Azure Front Door – Required for Option # 2
  5. Visual Studio Code – Required for Option # 3
  6. SharePoint Framework (SPFx) Development Environment Setup – Required for Option # 3

Below are the three options:

Option # 1: Put the embed code from the bot in an IFRAME on the SharePoint page.

Pros:

  • Easy and quick
  • Easy to get embed code from the PVA Bot Channels.
  • Easy step to add the IFRAME on the SharePoint page

Cons:

  • The embed code contains the bot URL and token, which are effectively public once placed on the page. If this code with the URL is leaked, anyone outside of the organization’s network can access the bot. If the bot is meant to be internal to the organization, this option will not work.

Option # 2: Use DirectLine (DL) Secret and get DL Token from Azure Web Site URL for IFRAME on the SharePoint page.

Basically, in this approach the IFRAME is still used on the SharePoint page, but the URL for the IFRAME comes from an internally controlled website that is blocked from public access using IP restrictions. The internal website is configured with the Direct Line Secret and uses it to get a Direct Line Token for secure communication with the bot.

Pros:

  • The bot will NOT be accessible to the public. It will be restricted by the website to the internal IP range only. Even if the URL is leaked, it will be blocked by the IP restriction on the Azure Front Door.
  • It uses the bot’s Direct Line Token as a protected communication mechanism.

Cons:

Option # 3: Create an SPFx Application Customizer to put on the SharePoint page.

Pros:

  • The bot will not be exposed to the public. The Direct Line Secret is buried in the SPFx code and cannot be seen.

Cons:

  • There is SPFx web part development effort now and maintenance in the future (if needed).

Step By Step Solution:

Option # 1:

The following two steps will get your Bot on the SharePoint page quickly.

  1. This is an easy solution. All you need to do is to get the embed code from the Bot. Follow here.
  2. Once you have the embed code, add it to your SharePoint page. Follow here.

Option # 2:

This option uses an Azure Web App and Azure Front Door. Azure Front Door allows restricting access to the organization’s IP ranges.

The Azure Web App is an ASP.NET Core solution. There are two components in the website: the UI and the API. The UI part shows the Web Chat, and the API exchanges the Direct Line Secret for a Direct Line Token. The Direct Line Secret is stored in the website configuration or can be secured using a key vault.
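
The token exchange itself is a single REST call against the Direct Line service. A minimal PowerShell sketch of what the API component does server-side (the secret value is a placeholder and must never be sent to the browser):

# Exchange the Direct Line secret (kept server-side) for a short-lived token
$directLineSecret = "<your Direct Line secret from the bot's channel settings>"
$response = Invoke-RestMethod -Method Post `
    -Uri "https://directline.botframework.com/v3/directline/tokens/generate" `
    -Headers @{ Authorization = "Bearer $directLineSecret" }

# The page/IFRAME uses the returned token, not the secret
$response.token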

TODO: Code to follow.

Option # 3:

TODO: Code to follow.

Conclusion

Based on the above options and their pros and cons, you will be able to decide what is right for your need. They range from Option # 1, which is very easy and quick, to Option # 3, which uses modern techniques to control access.

Some helpful links for Bot development

https://aka.ms/pvaarchitectureseries

https://github.com/microsoft/PowerVirtualAgentsSamples

https://aka.ms/BotComposerSeries

Posted in Technical Stuff | Leave a comment

The tenant-wide flag DisableCustomAppAuthentication in relation to Access Control System (ACS)?

Summary

If you have used the Access Control System (ACS) and are using its APP ID and APP Secret with the PnP.PowerShell module, it may not work.

Connect-PnPOnline will work fine, but when you try to run any command you will get the following (401) Unauthorized error, even though the add-in was given Full Control for the site collection.

PS C:\WINDOWS\system32> Get-PnPList -Connection $conn1
Get-PnPList : The remote server returned an error: (401) Unauthorized.
At line:1 char:1
+ Get-PnPList -Connection $conn1
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : WriteError: (:) [Get-PnPList], WebException
    + FullyQualifiedErrorId : EXCEPTION,PnP.PowerShell.Commands.Lists.GetList

PS C:\WINDOWS\system32>

The add-in was granted Full Control at the site collection scope with the following permission XML:

<AppPermissionRequests AllowAppOnlyPolicy="true">
	<AppPermissionRequest Scope="http://sharepoint/content/sitecollection" Right="FullControl"/>
</AppPermissionRequests>

So, what is going on?

The answer lies in the tenant-wide flag “DisableCustomAppAuthentication”. ACS is being retired and should not be used for new applications. It is still supported for backward compatibility, but it is recommended to use the “Sites.Selected” permission instead.

Set-SPOTenant -DisableCustomAppAuthentication $false

After setting the flag to $false, the PnP command works.
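
To check the current value before changing anything, a quick sketch using the SharePoint Online management shell (the admin URL is a placeholder):

# Requires the Microsoft.Online.SharePoint.PowerShell module
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"
Get-SPOTenant | Select-Object DisableCustomAppAuthentication

# Re-enable ACS-based app-only authentication (as shown above)
Set-SPOTenant -DisableCustomAppAuthentication $false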

Conclusion

You can fix the ACS mechanism for your app temporarily, but it is “highly” recommended that you migrate the code to the new Azure AD authentication.

Please see the following article for the “Sites.Selected” permission.

How does the MS Graph “Sites.Selected” permission work for granular permissions for SPO sites? | Pankaj Surti’s Blog

Posted in Technical Stuff | Leave a comment

How to get a list of Site Collection Admins for a SharePoint site?

Summary

My customer had a requirement: anyone in the organization should be able to get the list of Site Collection Admins (SCAs) for any site within the tenant. The user needs this to request additional information from the SCAs.

This request can come on demand from any user within the organization. The resolution is to have a tool where a user submits a site URL. Using the requested site URL, the tool finds the SCAs for the site and fills out a multi-people field on the same request.

I will refer to this tool as the “Find SCAs Tool”.

Prerequisite

  • “FindSCARequestTracker” SharePoint List
  • Azure Logic App
  • Azure Function
  • PnP.PowerShell

Step By Step Solution

Step # 1: Create a “FindSCARequestTracker” SharePoint List

Internal Name   Display Name   Column Type
Title           Site URL       Single line of text
ListOfSCA       List Of SCAs   Multi people field
Status          Status         Read-only Choice field with New & Completed

The Status field default value is “New”.

Step # 2: “ProcessFindSCARequestTracker” Logic App

The Logic App triggers on items created or modified in the above SharePoint list. The trigger condition ensures the flow only fires when the Status field is New.

@contains(triggerBody()?['Status']?['Value'],'New')

There are only two actions in the flow. The first action makes a call to an HTTP-triggered Azure Function. The second updates the SharePoint item, setting the “Status” field to Completed and the “ListOfSCA” field to the results from the Azure Function.

The Logic App with the two main actions.

Step # 3: “find-sca-by-site” Azure Function.

The “find-sca-by-site” Azure Function is an HTTP-triggered function. The input to the call is a JSON value with the site URL. The output of the function is also JSON, with a Claims array containing the email addresses of the SCAs.
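
Conceptually, the core of the function is a PnP.PowerShell lookup of the site collection admins. A minimal sketch, assuming the certificate-based app-only connection from the note below is already configured (variable names are illustrative):

# $siteUrl comes from the JSON body of the HTTP request
$siteUrl = "https://contoso.sharepoint.com/sites/somesite"
Connect-PnPOnline -Url $siteUrl -ClientId $clientId -Tenant $tenantName `
    -CertificatePath "find-sca.pfx" -CertificatePassword $certPassword

# Collect the site collection admins' e-mail addresses
$admins = Get-PnPSiteCollectionAdmin | Select-Object -ExpandProperty Email

# Return them as the Claims array the Logic App expects
@{ Claims = @($admins) } | ConvertTo-Json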

NOTE: You can refer to my previous post to set up Azure Function and Certificate. How to set up certificate in MAG Azure Function App for PnP.PowerShell?

NOTE: To get to the code for the above Azure function, please click here.

Please make sure the Azure AD app you create has the following permissions.

The Azure AD required permissions

Results

The results after the Azure Logic App and Azure Function are executed for a request.

Conclusion

With the help of the Azure resources, a tool can be created to get the list of Site Collection Admins. This will help reduce SPO admins’ tasks, and the tool can provide the information users need.

Posted in Azure, PnP.PowerShell, SharePoint | Leave a comment

How to automate and govern the “Sites.Selected” permissions using a custom tool?

Summary

Earlier, I posted an article regarding the “Sites.Selected” MS Graph permission used to create granular permissions for sites. The following is the link to that article.

How does the MS Graph “Sites.Selected” permission work for granular permissions for SPO sites?

It is great that granting and revoking “granular” permission for reading or writing to a site can be controlled by the Admins. However, there are some gaps, such as how governance can be maintained. It becomes the Admins’ additional task to execute the scripts and maintain the list of Azure AD applications, sites, and permissions. Additionally, depending on the tenant size, executing users’ requests for access to sites can become the Admins’ nightmare.

This article addresses the above weaknesses by giving you steps to design and develop a tool that maintains the users’ requests. It will also guide you to automate granting and revoking site permissions.

I named the tool Sites Selected Request Tracker (SSRT)

Prerequisites

The following resources are required for the tool.

  1. PnP.PowerShell – The PnP.PowerShell module is used in the Azure Function to maintain the tracker list.
  2. Two SharePoint Lists – The SharePoint lists are required to track the list of Azure AD application IDs and the users’ permission requests.
  3. Azure Logic Apps – The Azure Logic App triggers a flow when an item in the request tracker list is created or modified. Based on the granting or revoking change, it also makes a call to the Azure Function.
  4. Azure Function – The Azure Function executes the Grant and Revoke using PnP.PowerShell. This is the real engine that automates the task.
  5. Azure Key Vault – The Azure Key Vault is needed to store the certificates for the Azure AD app. Please read the previous article for more information.

Architecture Diagram

The logical architecture diagram for the SSRT tool.

Architecture Diagram for the Sites Selected Request Tracker (SSRT)

SharePoint List One – CustomerAppIDs Columns

This first list holds all of the Azure AD apps (with the “Sites.Selected” permission consented). You can use a client secret or a certificate; it is recommended to use a certificate for each Azure AD app.

Internal Name   Display Name       Column Type
Title           App ID GUID        Single line of text
AppName         App Display Name   Single line of text

SharePoint List One – CustomerAppIDs Columns

SharePoint List Two – SitesSelectedTracker Columns

This second list keeps track of all the requests for sites. After adding the Azure AD app information to the first list, the Admins add the sites for a specific Azure AD application with a Read or Write selection. Whenever an Admin changes an item to Revoke, the Logic App revokes the permission and deletes the item from the tracker list.

Internal Name       Display Name        Column Type           Description
Title               Site URL            Single line of text   Stores the URL of the site that needs permissions
ApplicationID       ApplicationID       Lookup                Reference to the app ID and AppName columns from the above list
ReadWrite           Read Or Write       Choice                Read or Write choice; default is Read
GrantRevoke         Grant Or Revoke     Choice                Grant or Revoke choice; default is Grant
ReadWriteCopy       ReadWriteCopy       Single line of text   Used internally by the flow; hidden from the user; default is None
GrantRevokeCopy     GrantRevokeCopy     Single line of text   Used internally by the flow; hidden from the user; default is Grant
RecordEngineSteps   RecordEngineSteps   Enhanced rich text    Used internally by the flow; hidden from user entry; the engine adds step descriptions here

SharePoint List Two – SitesSelectedTracker Columns

“ProcessReadWrite” Azure Logic Apps

  • The Logic App triggers when an item is created or modified in the SitesSelectedTracker list.
  • The trigger conditions are:
    • ReadWrite and ReadWriteCopy are not equal, OR
    • GrantRevoke and GrantRevokeCopy are not equal.
  • If the above condition is met, the Azure Logic App makes a call to the “SPOSitesSelected” Azure Function with the following parameters.
# The following request body is passed to the Azure Function.
{
  "Action": "@{triggerBody()?['GrantRevoke']?['Value']}",
  "ClientAppID": "@{triggerBody()?['ApplicationID']?['Value']}",
  "DisplayName": "@{triggerBody()?['ApplicationID_x003a_AppName']?['Value']}",
  "Permission": "@{triggerBody()?['ReadWrite']?['Value']}",
  "SiteURL": "@{triggerBody()?['Title']}"
}

NOTE: To get to the code for the above Azure function, please click here.

“SPOSitesSelected” HTTPTriggered Azure Function

NOTE: You can refer to my previous post to set up Azure Function and Certificate. How to setup certificate in MAG Azure Function App for PnP.PowerShell?

NOTE: To get to the code for the above Azure function, please click here.
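
Conceptually, the function connects app-only with the admin app and branches on the Action value, using the same cmdlets covered in the earlier Sites.Selected post. A minimal sketch (connection details and variable names are illustrative):

# $body holds the JSON shown above: Action, ClientAppID, DisplayName, Permission, SiteURL
Connect-PnPOnline -Url $body.SiteURL -ClientId $adminAppId -Tenant $tenantName `
    -CertificatePath "admin-app.pfx" -CertificatePassword $certPassword

if ($body.Action -eq "Grant") {
    Grant-PnPAzureADAppSitePermission -AppId $body.ClientAppID -DisplayName $body.DisplayName `
        -Permissions $body.Permission -Site $body.SiteURL
}
else {
    # Revoke: look up the existing permission for the site and remove it
    $perm = Get-PnPAzureADAppSitePermission -Site $body.SiteURL
    Revoke-PnPAzureADAppSitePermission -Site $body.SiteURL -PermissionId $perm.Id
}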

Conclusion

As described, the Sites Selected Request Tracker (SSRT) tool addresses the governance and automation issues for the “Sites.Selected” permission.

Posted in Azure, PnP.PowerShell, SharePoint | 1 Comment

How to resolve “The attempted operation is prohibited because it exceeds the list view threshold.” for Remove-PnPWeb?

Summary

A customer got the following error when removing a sub-site.

PS C:\Temp> Remove-PnPWeb -Identity testcaa -Force
Remove-PnPWeb : The attempted operation is prohibited because it exceeds the list view threshold.
At line:1 char:1
+ Remove-PnPWeb -Identity vacaa -Force
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : WriteError: (:) [Remove-PnPWeb], ServerException
    + FullyQualifiedErrorId : EXCEPTION,PnP.PowerShell.Commands.RemoveWeb

PS C:\Temp>

Clearly, the issue was due to one or more large lists present in the sub-site. Since the entire sub-site, including the large list, needs to be deleted, the large list must be deleted first to eliminate the above error.

The following code removes the items from the large list in batches, which then allows the list to be deleted. Once all large lists were deleted, the sub-site was successfully deleted.

$action = "Delete"
# TODO Change your site or subsite URL
$siteUrl = "https://[YOUR TENANT].sharepoint.com/sites/Contoso/testcaa"
# TODO Change the list name 
$listName = "Contoso Community Care Data" 
$ErrorActionPreference = "Stop"
Connect-PnPOnline -Url $siteUrl -UseWebLogin
$Stoploop = $false
[int]$Retrycount = "0"

write-host $("Start time " + (Get-Date))
do {
  try {

    if($action -eq "Delete")
    {
      $listItems = Get-PnPListItem -List $listName -Fields "ID" -PageSize 1000
      $itemCount = $listItems.Count
      while($itemCount -gt 0)
      {
        $batch = New-PnPBatch
        #delete in batches of 1000, 
        #if itemcount is less than 1000 , all will be deleted 
        if($itemCount -lt 1000)
        {
          $noDeletions = 0
        }
        else
        {
          $noDeletions = $itemCount -1000
        }
        for($i=$itemCount-1;$i -ge $noDeletions;$i--)
        {
          Remove-PnPListItem -List $listName -Identity $listItems[$i].Id -Batch $batch 
        }
        Invoke-PnPBatch -Batch $batch
        $itemCount = $itemCount-1000
     }
   }
   Write-Host "Job completed"
   $Stoploop = $true
  }

  catch {
    if ($Retrycount -gt 3) {
       Write-Host "Could not send Information after 3 retries."
       $Stoploop = $true
    }
    else {
      Write-Host "Could not send Information retrying in 30 seconds..."
      Start-Sleep -Seconds 30
      Connect-PnPOnline -Url $siteUrl -Interactive
      $Retrycount = $Retrycount + 1
    }
  }
}
While ($Stoploop -eq $false)
write-host $("End time " + (Get-Date)) 

Conclusion

To delete a site, the items in its large lists and the lists themselves must be deleted first. Once that is done, the site or sub-site can be deleted.

Reference:

PnP Batch Add or Delete items from very large list, i.e. more than 300k items – Microsoft Tech Community

The fastest way to create SharePoint list items – Waldek Mastykarz

Posted in SharePoint, SharePoint 2010, SharePoint 2013 | Leave a comment

How does the MS Graph “Sites.Selected” permission work for granular permissions for SPO sites?

Summary

To provide granular access to sites, Azure Access Control (ACS) was used in the past. The app ID and secret can be created using add-ins; more info is described here. Note: Please check the ACS retirement info.

Now, with the new Sites.Selected MS Graph permission, you can grant permissions at a granular level. This blog post simplifies and demonstrates the use of PnP PowerShell to create granular permissions.

For example, if the customer wants an application to have read access to only a few sites, the “Sites.Selected” permission technique can be used to meet the need.

Step by Step Solution

Step # 1 Create Azure AD app with MS Graph Sites.FullControl.All permission

NOTE: Make sure you select MS Graph, not SharePoint; SharePoint also exposes a permission with the same name, but that is not the valid one here.

Please make a note of the application id.

Admin App

Step # 2 Create Azure AD app with MS Graph Sites.Selected permission

Client-App
Add SharePoint Sites.Selected permission also if you are using PnP.PowerShell.

Step # 3 Create a PFX and CER certificate using PnP PowerShell.

New-PnPAzureCertificate -OutPfx pnpSites-Selected.pfx -OutCert pnpSites-Selected.cer -CertificatePassword (ConvertTo-SecureString -String "pass@word1" -AsPlainText -Force)

The above command will create two files; store them in a known directory.

Step # 4 Upload the CER file to the Azure AD app created in Step #1 i.e. Admin app.

Step # 5 Using PnP PowerShell command Grant-PnPAzureADAppSitePermission


$adminAppId = "9a6f4c8a-e9cf-44fd-b3ad-4413ed66a2ce"; #admin-app TODO to replace
$clientAppId = "ca2a60c7-d09a-4875-841b-117a02b504fd"; #client-app

$tenantPrefix = "M365x162783"; # replace with your tenant id TODO to replace
$tenantName = $tenantPrefix +".onmicrosoft.com";
$spoTenantName = "https://" + $tenantPrefix + ".sharepoint.com";

# site to apply granular permission, 
# it can be repeated for more than one sites
$site2apply = "https://m365x162783.sharepoint.com/sites/lbtest1"

$password = (ConvertTo-SecureString -AsPlainText 'pass@word1' -Force)

$adminConn = Connect-PnPOnline -Url $spoTenantName -ClientId $adminAppId -CertificatePath 'c:\Temp\pnpSites-Selected.pfx' -CertificatePassword $password  -Tenant $tenantName


#### GRANT
Grant-PnPAzureADAppSitePermission -AppId $clientAppId -DisplayName "Thisapp" -Permissions Read -Site $site2apply -Verbose

Step # 6 To check that the client app has access to read the lists of the site

$clientAppId = "ca2a60c7-d09a-4875-841b-117a02b504fd"; #client-app

$tenantPrefix = "M365x162783"; # replace with your tenant id TODO to replace
$tenantName = $tenantPrefix +".onmicrosoft.com";

$site2apply = "https://m365x162783.sharepoint.com/sites/lbtest1"

$clientConn = Connect-PnPOnline -Url $site2apply -ClientId $clientAppId -CertificatePath 'c:\Temp\pnpSites-Selected.pfx' -CertificatePassword $password  -Tenant $tenantName

Get-PnPList

Click here to see the code for how to read all lists using non-PnP command i.e. REST calls.

Step # 7 To change the permission from “Read” to “Write”.

NOTE: The following code is an extension from the above code variables set in the above steps.

#### GET
$perms = Get-PnPAzureADAppSitePermission -Site $site2apply

#### SET
Set-PnPAzureADAppSitePermission -Site $site2apply -Permissions Write -PermissionId $perms.Id

Step # 8 To revoke the permission.

NOTE: The following code is an extension from the above code variables set in the above steps.

#### GET
$perms = Get-PnPAzureADAppSitePermission -Site $site2apply

#### REVOKE
Revoke-PnPAzureADAppSitePermission -Site $site2apply -PermissionId $perms.Id

Conclusion

The granular access to sites using Sites.Selected can be achieved using the PnP.PowerShell module.

Please see the next follow-up article.

I have used the REST-related code in Step # 6 from Srinivas’s blog:

Use Microsoft Graph to Set Granular Permissions to SharePoint Online Sites for Azure AD Application – DEV Community

Controlling app access on a specific SharePoint site collections is now available in Microsoft Graph – Microsoft 365 Developer Blog

Posted in PnP.PowerShell, SharePoint | 4 Comments