Monday, 26 January 2026

Call a React Component with TypeScript and Zod: A Step-by-Step, Production-Ready Pattern

The fastest way to call a component in React is to render it via JSX and pass strictly typed props. This article shows how to call a component in React with TypeScript and Zod so you get compile-time and runtime safety, clear state management, and production-ready patterns.

The Problem

Developers often "call" (render) a component without strict typing or validation, leading to runtime bugs, unclear state, and hard-to-test UI.

Prerequisites

Node.js 20+, pnpm or npm, React 19, TypeScript 5+, Zod 3+, a modern browser. Ensure tsconfig has strict: true.

The Solution (Step-by-Step)

Step 1: Bootstrap a minimal TypeScript + React app

// package.json (excerpt) - ensures React 19 and strict TS
{
  "name": "react-call-component-ts",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc -b && vite build",
    "preview": "vite preview"
  },
  "dependencies": {
    "react": "^19.0.0",
    "react-dom": "^19.0.0",
    "zod": "^3.22.0"
  },
  "devDependencies": {
    "typescript": "^5.6.0",
    "vite": "^5.0.0",
    "@types/react": "^18.3.0",
    "@types/react-dom": "^18.3.0"
  }
}
// tsconfig.json - strict mode enabled for maximum safety
{
  "compilerOptions": {
    "target": "ES2022",
    "lib": ["ES2022", "DOM"],
    "jsx": "react-jsx",
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "strict": true,
    "noFallthroughCasesInSwitch": true,
    "noUncheckedIndexedAccess": true,
    "skipLibCheck": true
  },
  "include": ["src"]
}

Step 2: Create a strictly typed child component with runtime validation

// src/components/Greeting.tsx
import React, { memo } from "react";
import { z } from "zod";

// 1) Define compile-time props shape via TypeScript
export type GreetingProps = {
  name: string;                 // Required user name
  mode: "friendly" | "formal";  // Discriminated literal union for behavior
};

// 2) Define runtime schema using Zod for additional safety in production
const greetingPropsSchema = z.object({
  name: z.string().min(1, "name is required"),
  mode: z.union([z.literal("friendly"), z.literal("formal")])
});

// 3) React.memo to avoid unnecessary re-renders when props are stable
export const Greeting = memo(function Greeting(props: GreetingProps) {
  // Validate props at runtime to fail fast in dev and log issues in prod
  const result = greetingPropsSchema.safeParse(props);
  if (!result.success) {
    // Render a small fallback and log schema errors for debugging
    console.error("Greeting props invalid:", result.error.format());
    return <p role="alert">Invalid greeting config</p>;
  }

  // Safe, parsed props
  const { name, mode } = result.data;

  // Render based on discriminated union value
  if (mode === "friendly") {
    return <p>Hi, {name}! Welcome back.</p>;
  }

  return <p>Hello, {name}. It is good to see you.</p>;
});

Explanation: We "call" a component in React by placing it in JSX like <Greeting name="Sam" mode="friendly" />. The TypeScript type enforces correct usage at compile time; Zod enforces it at runtime.
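
For reference, here is a minimal parent sketch (an illustrative addition, not part of the app above) showing both calls side by side, using the Greeting component from Step 2:

// src/components/GreetingExample.tsx (illustrative sketch)
import React from "react";
import { Greeting } from "./Greeting";

export function GreetingExample() {
  return (
    <section>
      {/* The compiler rejects missing or misspelled props at build time */}
      <Greeting name="Sam" mode="friendly" />
      <Greeting name="Dr. Lovelace" mode="formal" />
    </section>
  );
}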

Step 3: Manage parent state with discriminated unions and render the child

// src/App.tsx
import React, { useEffect, useState } from "react";
import { Greeting } from "./components/Greeting";

// Discriminated union for page state: guarantees exhaustive checks
type PageState =
  | { kind: "loading" }
  | { kind: "ready"; userName: string }
  | { kind: "error"; message: string };

export function App() {
  const [state, setState] = useState<PageState>({ kind: "loading" });

  // Simulate fetching the current user, then set ready state
  useEffect(() => {
    const timer = setTimeout(() => {
      // In a real app, replace with a fetch call and proper error handling
      setState({ kind: "ready", userName: "Sam" });
    }, 300);
    return () => clearTimeout(timer);
  }, []);

  // Render different UI based on discriminated union state
  if (state.kind === "loading") {
    return <p>Loading…</p>;
  }

  if (state.kind === "error") {
    return <p role="alert">Error: {state.message}</p>;
  }

  // Key line: this is how you "call" (render) the component with props
  return (
    <main>
      <h1>Dashboard</h1>
      <Greeting name={state.userName} mode="friendly" />

      {/* Rendering a list of components safely */}
      {(["Ada", "Linus", "Grace"] as const).map((n) => (
        <Greeting key={n} name={n} mode="formal" />
      ))}
    </main>
  );
}

Step 4: Mount the app

// src/main.tsx
import React from "react";
import { createRoot } from "react-dom/client";
import { App } from "./App";

const container = document.getElementById("root");
if (!container) throw new Error("Root container missing");

createRoot(container).render(
  // StrictMode helps surface potential issues
  <React.StrictMode>
    <App />
  </React.StrictMode>
);

Best Practices & Security

Pro-Tip: Use React.memo for presentational components to avoid unnecessary re-renders.

Pro-Tip: Use discriminated unions for UI state to guarantee exhaustive handling and safer refactors.
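
To make that concrete, here is a minimal sketch of the exhaustiveness guarantee; the FetchState type and describe function are hypothetical, not part of the app above:

// Sketch: exhaustive handling of a discriminated union
type FetchState =
  | { kind: "idle" }
  | { kind: "loading" }
  | { kind: "done"; value: string };

function describe(state: FetchState): string {
  switch (state.kind) {
    case "idle":
      return "Not started";
    case "loading":
      return "Working…";
    case "done":
      return `Finished: ${state.value}`;
    default: {
      // If a new variant is added and not handled above, this assignment fails to compile.
      const unreachable: never = state;
      return unreachable;
    }
  }
}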

Pro-Tip: Validate at runtime with Zod for boundary inputs (API responses, query params, environment-driven config).

Pro-Tip: Prefer useCallback and stable prop shapes when passing callbacks to memoized children.
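
A small sketch of that tip (the SaveButton and Editor components are hypothetical): the callback keeps a stable identity while its dependencies are unchanged, so the memoized child skips re-renders.

// Sketch: stable callback passed to a memoized child
import React, { memo, useCallback, useState } from "react";

const SaveButton = memo(function SaveButton({ onSave }: { onSave: () => void }) {
  return <button onClick={onSave}>Save</button>;
});

export function Editor() {
  const [draft, setDraft] = useState("");

  // Without useCallback, a fresh function is created on every render,
  // which defeats React.memo on SaveButton.
  const handleSave = useCallback(() => {
    console.log("Saving draft of length", draft.length);
  }, [draft]);

  return (
    <>
      <textarea value={draft} onChange={(e) => setDraft(e.target.value)} />
      <SaveButton onSave={handleSave} />
    </>
  );
}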

Pro-Tip: Keep components pure; avoid hidden side effects inside render logic.

Security note (front-end): Do not embed secrets in the client. If you integrate with Azure or any backend, call a secured API instead of accessing resources directly from the browser.

Security note (Azure backend integration): Use Managed Identity and DefaultAzureCredential in the server/API, not the frontend. Grant the server's managed identity least-privilege RBAC roles only. Example: for Azure Storage reads, assign Storage Blob Data Reader to the API identity at the specific container scope.

Security note (data flow): Validate user input and API responses at the edge (API) with Zod or similar, then keep the front-end strictly typed.

Summary

• You call a component in React by rendering it in JSX with strictly typed, validated props.

• Discriminated unions make UI state predictable, and React.memo boosts performance.

• For real backends, keep secrets server-side, use Managed Identity with least-privilege RBAC, and validate at the edge.

Testing Quickstart

Test a component render with React Testing Library

// src/components/Greeting.test.tsx
import React from "react";
import { render, screen } from "@testing-library/react";
import "@testing-library/jest-dom";
import { Greeting } from "./Greeting";

test("renders friendly greeting", () => {
  render(<Greeting name="Sam" mode="friendly" />);
  expect(screen.getByText(/Hi, Sam!/)).toBeInTheDocument();
});

test("renders formal greeting", () => {
  render(<Greeting name="Ada" mode="formal" />);
  expect(screen.getByText(/Hello, Ada\./)).toBeInTheDocument();
});

This test verifies the component is "called" with valid props and renders deterministic output. For invalid props, assert that the fallback appears and that a console error is logged, as sketched below.
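
A hedged sketch of that negative test, assuming a Jest-style runner (swap jest.spyOn for vi.spyOn under Vitest); it casts past the compile-time check so the runtime Zod path is exercised:

// src/components/Greeting.invalid.test.tsx (sketch)
import React from "react";
import { render, screen } from "@testing-library/react";
import "@testing-library/jest-dom";
import { Greeting, type GreetingProps } from "./Greeting";

test("renders fallback and logs for invalid props", () => {
  const errorSpy = jest.spyOn(console, "error").mockImplementation(() => {});

  // Cast bypasses TypeScript on purpose so Zod's safeParse rejects the empty name
  const badProps = { name: "", mode: "friendly" } as GreetingProps;
  render(<Greeting {...badProps} />);

  expect(screen.getByText(/invalid greeting config/i)).toBeInTheDocument();
  expect(errorSpy).toHaveBeenCalled();

  errorSpy.mockRestore();
});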

React 19: The Practical Difference Between Hooks and Components (With TypeScript and Azure Integration)

The difference between React hooks and components matters because it defines how you separate logic from presentation. Problem: teams mix stateful logic directly inside UI components and struggle to test, reuse, and scale. Solution: put data fetching, validation, and side effects in reusable hooks; keep rendering in lean components. Value: cleaner architecture, easier testing, fewer bugs, and production-ready integration with Azure using least privilege.

The Problem

Developers often blur the line between where logic lives (hooks) and where UI renders (components). This leads to duplicated code, tangled effects, and UI tests that are slow and brittle. We need a clear pattern: hooks encapsulate logic and I/O; components focus on layout and accessibility.

Prerequisites

Node.js v20+, TypeScript 5+ with strict mode, React 19, TanStack Query v5+, Zod v3+, Azure Functions Core Tools v4+, .NET 8 SDK, Azure CLI (or azd), and a browser-compatible fetch API.

The Solution (Step-by-Step)

1) Define strict runtime and compile-time types

// src/schemas/user.ts
import { z } from "zod";

// Zod schema for runtime validation and safe parsing
export const UserSchema = z.object({
  id: z.string().uuid(),
  email: z.string().email(),
  name: z.string().min(1),
});

export const UsersSchema = z.array(UserSchema);

export type User = z.infer<typeof UserSchema>;

2) Create a focused hook: logic, data fetching, and validation

// src/hooks/useUsers.ts
import { useMemo } from "react";
import { useQuery, QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { UsersSchema, type User } from "../schemas/user";

// Discriminated union for explicit UI states
export type UsersState =
  | { status: "loading" }
  | { status: "error"; error: string }
  | { status: "success"; data: ReadonlyArray<User> };

// Fetch function with runtime validation and descriptive errors
async function fetchUsers(): Promise<ReadonlyArray<User>> {
  const res = await fetch("/api/users", { headers: { "accept": "application/json" } });
  if (!res.ok) {
    // Include status for observability; avoid leaking server internals
    throw new Error(`Request failed: ${res.status}`);
  }
  const json = await res.json();
  // Validate and coerce; throws if shape is wrong
  return UsersSchema.parse(json);
}

export function useUsers(): UsersState {
  const { data, error, status } = useQuery({
    queryKey: ["users"],
    queryFn: fetchUsers,
    staleTime: 60_000, // cache for 1 minute
    retry: 2,          // conservative retry policy
  });

  // Map TanStack Query status to a strict discriminated union for the UI
  return useMemo((): UsersState => {
    if (status === "pending") return { status: "loading" };
    if (status === "error") return { status: "error", error: (error as Error).message };
    // At this point data is defined and validated by Zod
    return { status: "success", data: data ?? [] };
  }, [status, error, data]);
}

// Optional: provide a QueryClient at the app root (copy-paste ready)
export const queryClient = new QueryClient();

// In your app root (e.g., src/main.tsx):
// import { createRoot } from "react-dom/client";
// import { QueryClientProvider } from "@tanstack/react-query";
// import { queryClient } from "./hooks/useUsers";
// import { App } from "./App";
// createRoot(document.getElementById("root")!).render(
//   <QueryClientProvider client={queryClient}>
//     <App />
//   </QueryClientProvider>
// );

3) Keep components presentational and accessible

// src/components/UsersList.tsx
import React from "react";
import { useUsers } from "../hooks/useUsers";

// Functional component focuses on rendering and accessibility
export function UsersList(): JSX.Element {
  const state = useUsers();

  if (state.status === "loading") {
    // Keep loading states lightweight and non-blocking
    return <p role="status" aria-live="polite">Loading users...</p>;
  }

  if (state.status === "error") {
    // Display a user-friendly message without revealing internals
    return <p role="alert">Could not load users. Please try again.</p>;
  }

  // Success path: minimal, semantic markup
  return (
    <ul aria-label="Users">
      {state.data.map(u => (
        <li key={u.id}>{u.name} ({u.email})</li>
      ))}
    </ul>
  );
}

4) Optional Azure back end: least-privilege, Managed Identity

This example shows an Azure Functions .NET 8 HTTP API that returns users. It authenticates to Azure Cosmos DB using DefaultAzureCredential and a system-assigned Managed Identity, avoiding connection strings. Assign only the necessary RBAC role.

// FunctionApp/Program.cs (.NET 8 isolated worker)
using Azure.Identity; // DefaultAzureCredential
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.DependencyInjection;
using System.Net;
using Azure.Core;
using Microsoft.Azure.Cosmos; // Azure Cosmos DB .NET SDK (Microsoft.Azure.Cosmos package)

var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .ConfigureServices(services =>
    {
        // Use managed identity via DefaultAzureCredential
        services.AddSingleton<TokenCredential>(_ => new DefaultAzureCredential());

        services.AddSingleton<CosmosClient>(sp =>
        {
            var credential = sp.GetRequiredService<TokenCredential>();
            // Endpoint from configuration (no keys). Use App Settings.
            var endpoint = Environment.GetEnvironmentVariable("COSMOS_ENDPOINT")!;
            return new CosmosClient(endpoint, credential);
        });
    })
    .Build();

await host.RunAsync();

// FunctionApp/GetUsers.cs
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using System.Net;
using Microsoft.Azure.Cosmos;
using System.Text.Json;

namespace FunctionApp;

public class GetUsers(CosmosClient cosmos)
{
    // HTTP-triggered function returning JSON users
    [Function("GetUsers")] 
    public async Task<HttpResponseData> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "users")] HttpRequestData req)
    {
        var db = cosmos.GetDatabase("app");
        var container = db.GetContainer("users");

        var iterator = container.GetItemQueryIterator<UserDoc>("SELECT c.id, c.email, c.name FROM c");
        var results = new List<UserDoc>();
        while (iterator.HasMoreResults)
        {
            var page = await iterator.ReadNextAsync();
            results.AddRange(page);
        }

        var res = req.CreateResponse(HttpStatusCode.OK);
        res.Headers.Add("Content-Type", "application/json");
        await res.WriteStringAsync(JsonSerializer.Serialize(results));
        return res;
    }
}

public record UserDoc(string id, string email, string name);

Required RBAC (principle of least privilege): Assign the Function App's system-assigned identity the Cosmos DB Built-in Data Reader role scoped to the specific database or container. Avoid account-level permissions.

5) Minimal IaC for role assignment (Azure Bicep)

// main.bicep: grant the Function App's managed identity data-plane read access to Cosmos DB
// Note: Cosmos DB data-plane roles are assigned via sqlRoleAssignments on the Cosmos account,
// not via Microsoft.Authorization/roleAssignments.
param cosmosAccountName string
param functionPrincipalId string // Function App's system-assigned identity principalId

// Existing Cosmos DB account in this resource group
resource cosmosAccount 'Microsoft.DocumentDB/databaseAccounts@2023-04-15' existing = {
  name: cosmosAccountName
}

// Cosmos DB Built-in Data Reader (data-plane role, well-known ID ending in ...0001)
var dataReaderRoleDefinitionId = resourceId('Microsoft.DocumentDB/databaseAccounts/sqlRoleDefinitions', cosmosAccountName, '00000000-0000-0000-0000-000000000001')

// Scope the assignment to the "app" database; narrow to a container for a tighter scope
resource cosmosDataReader 'Microsoft.DocumentDB/databaseAccounts/sqlRoleAssignments@2023-04-15' = {
  parent: cosmosAccount
  name: guid(cosmosAccount.id, functionPrincipalId, dataReaderRoleDefinitionId)
  properties: {
    roleDefinitionId: dataReaderRoleDefinitionId
    principalId: functionPrincipalId
    scope: '${cosmosAccount.id}/dbs/app'
  }
}

Note: Use azd to provision and configure environment variables like COSMOS_ENDPOINT. Never embed secrets or connection strings in code.

6) Wire up the React client to the Azure Function

// src/hooks/useUsers.ts (override the URL to your deployed Function App)
async function fetchUsers(): Promise<ReadonlyArray<User>> {
  const res = await fetch(import.meta.env.VITE_API_BASE + "/users", {
    headers: { accept: "application/json" },
    credentials: "include", // if using auth; otherwise omit
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const json = await res.json();
  return UsersSchema.parse(json);
}

7) Testing hooks and components separately

// tests/useUsers.test.tsx
import { describe, it, expect } from "vitest";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { renderHook, waitFor } from "@testing-library/react";
import type { ReactNode } from "react";
import { useUsers } from "../src/hooks/useUsers";

function wrapper({ children }: { children: ReactNode }) {
  const client = new QueryClient();
  return <QueryClientProvider client={client}>{children}</QueryClientProvider>;
}

describe("useUsers", () => {
  it("returns success after fetching", async () => {
    global.fetch = async () => new Response(JSON.stringify([]), { status: 200 });
    const { result } = renderHook(() => useUsers(), { wrapper });

    await waitFor(() => {
      expect(result.current.status).toBe("success");
    });
  });
});
// tests/UsersList.test.tsx
import { describe, it, expect } from "vitest";
import { render, screen } from "@testing-library/react";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import "@testing-library/jest-dom/vitest";
import type { ReactElement } from "react";
import { UsersList } from "../src/components/UsersList";

function renderWithQuery(ui: ReactElement) {
  const client = new QueryClient();
  return render(<QueryClientProvider client={client}>{ui}</QueryClientProvider>);
}

describe("UsersList", () => {
  it("renders loading state", () => {
    renderWithQuery(<UsersList />);
    expect(screen.getByRole("status")).toHaveTextContent(/loading/i);
  });
});

Best Practices & Security

Hooks own side effects; components remain pure and predictable. Validate all external data with Zod and use strict TypeScript to catch issues at compile time. For Azure, prefer Managed Identity with DefaultAzureCredential and apply the smallest RBAC scope required. Keep API base URLs and configuration in environment variables managed by azd or your CI/CD system, not in source. For database reads with Entity Framework in .NET APIs, use AsNoTracking() to avoid unnecessary change tracking.
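
As a small illustration of the configuration point on the front end, here is a hedged sketch that validates Vite-style public config at startup with Zod; the src/config.ts path is an assumption, and VITE_API_BASE is the variable used in Step 6:

// src/config.ts (sketch): fail fast if required public config is missing
import { z } from "zod";

const ConfigSchema = z.object({
  // Only VITE_-prefixed variables are exposed to the browser; never put secrets here
  VITE_API_BASE: z.string().url(),
});

export const config = ConfigSchema.parse({
  VITE_API_BASE: import.meta.env.VITE_API_BASE,
});

// Usage: fetch(`${config.VITE_API_BASE}/users`)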

Summary

Hooks encapsulate reusable logic, I/O, and validation, while components render UI and stay testable. Strong typing with Zod and discriminated unions keeps state explicit and safe. Azure integration is secure with Managed Identity, least-privilege RBAC, and IaC via azd or Bicep.

SharePoint List vs Library: Key Differences, Use Cases, and Best Practices

Overview: What’s the Difference Between a List and a Library in SharePoint?

A question many teams ask is: what is the difference between a list and a library in SharePoint? In simple terms, a SharePoint list manages rows of data (like a table), while a SharePoint document library manages files and their metadata. Understanding how they differ helps you choose the right container for your content and build a scalable information architecture.

Core Definitions

What is a SharePoint List?

A list stores structured data as items, similar to a spreadsheet or database table. Each item contains columns (text, number, choice, date, person, lookup, etc.). Lists are ideal for tracking processes and records that are not file-based.

  • Examples: Issue tracker, asset inventory, change requests, event registrations.
  • Typical columns: Status, Priority, Due Date, Assigned To, Category.

What is a SharePoint Document Library?

A document library stores files (documents, images, PDFs) plus metadata about those files. Libraries are designed for document-centric collaboration with rich file features.

  • Examples: Policies and procedures, project documents, design assets, client deliverables.
  • Typical metadata: Document Type, Owner, Project, Department, Confidentiality.

Key Differences at a Glance

  • Primary content: Lists store items (rows of data); libraries store files with metadata.
  • File handling: Libraries support check-in/out, file previews, co-authoring, and Office integration; lists don’t need file operations.
  • Versioning: Lists track item versions; libraries track both file and metadata versions with richer controls.
  • Templates & content types: Libraries often use document content types (e.g., Policy, Contract) with specific templates; lists use item content types.
  • Views & formatting: Both support custom views and conditional formatting; libraries add file-centric filters (e.g., by file type).
  • Automation: Both integrate with Power Automate; libraries frequently use flows for approvals and publishing.
  • Permissions: Both support unique permissions; libraries commonly secure folders or documents for compliance.

When to Use a List vs. a Library

Choose a List When

  • You track structured records without needing to store a file per record.
  • You need form-based data entry and validation across many columns.
  • You want lightweight workflows for requests, approvals, or status tracking.
  • You plan to integrate with Power Apps to build a data-driven app.

Choose a Library When

  • Your primary asset is a file (Word, Excel, PowerPoint, PDF, image, CAD).
  • You need co-authoring, track changes, and document version history.
  • You require document sets to group related files with shared metadata.
  • You want retention labels, records management, and approval workflows.

Practical Examples

Example 1: IT Asset Tracking (List)

Create a list with columns such as Asset Tag (single line), Model (choice), Assigned To (person), Purchase Date (date), Warranty Expiry (date), and Status (choice). Build views for “Assigned” and “In Repair”. Automate notifications when Warranty Expiry is within 30 days.
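
As an optional automation sketch (not part of the list setup itself), a scheduled job could query the list through Microsoft Graph and feed a notification step. The list ID and the WarrantyExpiry internal column name are assumptions, and filtering generally expects the column to be indexed:

// Sketch: find assets whose warranty expires within 30 days via Microsoft Graph
async function findExpiringAssets(siteId: string, listId: string, accessToken: string) {
  const cutoff = new Date(Date.now() + 30 * 24 * 60 * 60 * 1000).toISOString();

  const url =
    `https://graph.microsoft.com/v1.0/sites/${siteId}/lists/${listId}/items` +
    `?expand=fields(select=Title,AssetTag,WarrantyExpiry)` +
    `&$filter=fields/WarrantyExpiry le '${cutoff}'`;

  const res = await fetch(url, {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      // Graph requires this Prefer header when filtering on non-indexed SharePoint columns
      Prefer: "HonorNonIndexedQueriesWarningMayFailRandomly",
    },
  });
  if (!res.ok) throw new Error(`Graph query failed: ${res.status}`);

  const { value } = await res.json();
  return value; // hand off to an email or Teams notification step
}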

Example 2: Policy Management (Library)

Use a library with metadata: Policy Type (choice), Owner (person), Review Cycle (choice), Effective Date (date), Compliance Tag (choice). Enable major/minor versioning, check-out, and an approval flow. Use views for “Pending Review” and “Effective Policies.”

Example 3: Project Delivery Docs (Library with Document Sets)

Create a library using Document Sets for each project. Metadata like Client, Project Manager, Phase, and Confidentiality classify files. Configure folders or sets with unique permissions for client-specific access.

Power Features and Governance

Versioning and Check-In/Out

Libraries provide robust versioning for files, enabling approval, drafts, and rollbacks. Lists also version items, which is useful for audit trails on data changes.

Metadata and Content Types

Both support custom columns and content types. Use site columns to enforce consistency across sites. For libraries, align document content types with templates and approval policies.

Views, Filters, and Formatting

Use views like Group By, conditional formatting, and filters to surface relevant content. In libraries, combine metadata-driven navigation with pinned filters to flatten folder hierarchies.

Automation and Integrations

Leverage Power Automate for alerts, approvals, and review reminders. Use Power Apps to create forms for lists (e.g., requests), and Office desktop/web apps for library co-authoring.

Performance and Limits

  • Thresholds: Both are affected by the list view threshold (commonly 5,000 items for certain operations). Use indexed columns and filtered views to scale.
  • File handling: Libraries include file size limits and supported types; consider chunked uploads and OneDrive sync for large files.

Security and Compliance

  • Apply sensitivity labels and retention labels to libraries holding regulated documents.
  • Use unique permissions sparingly; favor SharePoint groups and inheritance to keep access manageable.
  • Enable auditing in Purview/M365 for critical lists and libraries.

Quick Decision Guide

  • If you primarily manage data records without files, choose a List.
  • If you primarily manage files and need collaboration features, choose a Library.
  • Combine both when needed: store requests in a list and link to documents in a library via lookup columns.

Best Practices

  • Design metadata first to enable better search, filters, and governance.
  • Favor views over deep folders, especially in libraries.
  • Standardize with site columns and content types for consistency.
  • Document naming conventions and permissions to reduce confusion.
  • Train users on co-authoring, versioning, and approvals in libraries.

FAQ

Can a list store files?

Lists can include an attachment per item, but this is limited and lacks rich document management features. For file-centric work, use a library.

Can I convert a list to a library?

No direct conversion exists. Instead, create a library, migrate files, and map metadata. Keep the list for tracking if needed.

Do both support Power Automate?

Yes. Triggers and actions exist for both list items and library documents, enabling approvals, notifications, and archival flows.

What Is a Document Set in SharePoint? Definition, Benefits, and Best Practices

What Is a Document Set in SharePoint?

A Document Set in SharePoint is a special content type that lets you manage multiple related documents as a single unit. Think of it like a project or case folder with its own metadata, shared versioning, and standardized templates that apply to every file inside. Document Sets streamline document management by grouping files that belong together—such as proposals, briefs, and reports—so teams can work consistently and efficiently.

Key Benefits of Using Document Sets

  • Unified metadata: Apply shared properties (e.g., Client, Project ID, Case Number) to the entire set and inherit them across all documents.
  • Consistent templates: Start each set with predefined document templates (like a cover sheet, briefing note, and checklist) to enforce standards.
  • Batch operations: Move, copy, share, or archive the entire set as one unit, reducing manual steps and errors.
  • Versioning at set level: Capture milestones of the whole set, not just individual files, for complete auditability.
  • Improved governance: Centrally control content types, policies, and workflows for entire document collections.
  • Better findability: Search and filter by shared metadata so related files surface together.
  • Repeatable processes: Package best-practice structure into a reusable set for repeat scenarios.

Real-World Examples

Marketing Campaign Kit

  • Templates: Creative brief, timeline, asset checklist, budget sheet.
  • Shared metadata: Campaign name, region, launch date, product line.
  • Outcome: Faster kickoff and consistent deliverables across teams.

Client Project Workspace

  • Templates: Statement of Work, Project Plan, Risk Log, Status Report.
  • Shared metadata: Client, Project ID, Account Manager, Phase.
  • Outcome: Centralized visibility and fewer filing mistakes.

Legal Case File

  • Templates: Case summary, evidence index, correspondence log.
  • Shared metadata: Case number, matter type, jurisdiction, confidentiality level.
  • Outcome: Strong compliance and easier audits.

How Document Sets Work

Document Sets are built on SharePoint content types. You enable the Document Set feature, create a new Document Set content type, assign templates and metadata, and add it to a library. Users then create a new set just like they would create a new folder—except it comes preconfigured with rules, templates, and shared properties.

Step-by-Step: Setting Up a Document Set

  • Enable the feature: Ensure the Document Set feature is activated at the site collection level (SharePoint Online has it available by default in most scenarios).
  • Create a content type: In Site Settings, create a new content type that inherits from Document Set.
  • Define metadata: Add site columns (e.g., Client, Project ID) that will apply across the set.
  • Add templates: Upload starter files (DOCX, XLSX, PPTX, etc.) to the Document Set so each new set is pre-populated.
  • Configure welcome page: Customize the Document Set home (welcome) page to guide users with instructions, links, and key properties.
  • Add to library: Add your Document Set content type to the target document library and set it as default if desired.
  • Permissions and policies: Apply permissions, retention labels, and workflows as needed.

Best Practices for SharePoint Document Sets

  • Design metadata first: Standardize site columns and content types to avoid future refactoring.
  • Keep it simple: Limit required fields to what users can reliably fill in during creation.
  • Template discipline: Use a minimal, approved set of templates to avoid clutter and confusion.
  • Automate where possible: Use Power Automate to create sets from requests, populate metadata, or move to an archive library at project close.
  • Govern naming: Enforce naming conventions (e.g., PROJ-1234 - Client - Phase) via guidance or automation.
  • Secure the set: If needed, break inheritance on the set to restrict access, but use sparingly to reduce admin overhead.
  • Train and document: Provide a short guide on when to use Document Sets vs. folders or standard libraries.

When to Use Document Sets vs. Alternatives

  • Use Document Sets when: You need shared metadata, standardized templates, and milestone versioning across multiple related files.
  • Use standard folders when: You only need lightweight grouping without metadata or templates.
  • Use separate libraries when: You need distinct permissions, advanced retention, or unique workflows per group.

Limitations and Considerations

  • Sync and OneDrive: Document Sets behave like folders in sync clients, but advanced features (welcome page) are web-only.
  • M365 sensitivity labels: Apply labels thoughtfully at the library or item level to avoid conflicts with set-level permissions.
  • Migrations: Ensure your migration tool supports Document Sets, content types, and metadata mapping.
  • External sharing: Verify sharing policies; sharing a set exposes all items inside.
  • Mobile experience: Core functions work, but configuration and welcome page customization are best on web.

Quick FAQ

Is a Document Set the same as a folder?

No. While it looks like a folder, a Document Set adds shared metadata, templates, a welcome page, and set-level versioning and policies.

Can I use approvals and workflows?

Yes. You can trigger flows on set creation, status changes, or on items within the set using Power Automate.

Does search recognize Document Sets?

Yes. Shared properties help group results, and you can refine search by Document Set metadata.

Conclusion

Document Sets in SharePoint provide a structured, repeatable way to manage related content with consistent metadata, templates, and lifecycle governance. When designed thoughtfully, they reduce errors, accelerate delivery, and improve compliance across projects, cases, and campaigns.

Saturday, 24 January 2026

Integrate Bootstrap and jQuery in SPFx


1. Create your solution with "No JavaScript Framework"

2. Install the Bootstrap and jQuery packages with the npm commands below

npm install jquery --save
npm install @types/jquery --save
npm install bootstrap --save
npm install @types/bootstrap --save
npm install @types/jqueryui --save

3. Add references in the web part where you want to use Bootstrap and jQuery

import * as jQuery from 'jquery';
import 'bootstrap/dist/css/bootstrap.css';
import 'bootstrap/dist/js/bootstrap.js';

4. Suppose we want to show a Bootstrap modal; paste the HTML below inside the render method

<button type="button" id="btnModel" class="${styles.button}">Open Modal</button>
<div class="modal fade" id="myModal" role="dialog">
    <div class="modal-dialog">
        <div class="modal-content">
            <div class="modal-header">
                <button type="button" class="close" data-dismiss="modal">&times;</button>
                <h4 class="modal-title">Modal Header</h4>
            </div>
            <div class="modal-body">
                <p>Welcome to the Bootstrap modal popup in a SharePoint Framework client-side web part</p>
            </div>
            <div class="modal-footer">
                <button type="button" class="btn btn-default" data-dismiss="modal">Close</button>
            </div>
        </div>
    </div>
</div>



5. Create a function to open the modal popup on the button click event

private PageLoad(): void {
  jQuery("#btnModel").click(() => {
    jQuery("#myModal").modal("show");
  });
}

6. Call the PageLoad() function inside the render method after the HTML is rendered

this.PageLoad();






Send Birthday Email from SharePoint Online using Flow


Below are the steps to configure a flow that sends birthday emails based on a SharePoint list.


1. Configure the Recurrence trigger (for example, to run once per day).

2. Add a Get items action that reads the SharePoint list.



3. Add an Apply to each loop over the returned items to check the condition and send the email.


4. Configure the Condition action with the expressions below.
Left side:

if(equals(items('Apply_to_each')?['DateOfBirth'],null),'',formatDateTime(items('Apply_to_each')?['DateOfBirth'], 'MM/dd'))

is equal to
Right side:
formatDateTime(utcNow(), 'MM/dd')

5. Add a Send an email action in the If yes branch (when the condition is true).


What Is a Site Collection in SharePoint? Architecture, Use Cases, and Best Practices

What is a Site Collection in SharePoint?

A site collection in SharePoint is a logical container that groups a top-level site and all its subsites, content, permissions, and features under a single governance boundary. In practical terms, a site collection helps organizations separate projects, departments, or business units so each can manage its own settings, templates, and lifecycle without affecting others.

Key Components and Architecture

Every site collection starts with a top-level site that defines core settings, including features, templates, and governance policies. Beneath it, you can have one or more subsites (in classic architectures) that inherit or customize permissions, navigation, and content types. Storage, search scopes, and features are typically managed at the collection level for consistency and control.

  • Top-Level Site: The root of the site collection, controlling default features and policies.
  • Subsites (Classic): Child sites that can inherit or break from parent settings.
  • Content Database Association: Each site collection is mapped to a content database for storage and performance boundaries.
  • Features and Templates: Enabled at the site collection level to standardize experience and governance.

Modern SharePoint: Site Collections vs. Subsites

Modern SharePoint favors flat information architecture using standalone site collections (team and communication sites) connected via hub sites, rather than deep subsite hierarchies. This improves flexibility, security scoping, and lifecycle management, while enabling consistent navigation and branding across related sites.

  • Flat Structure: Create separate site collections for teams/projects; avoid deep subsite trees.
  • Hub Sites: Associate related site collections to share navigation, theme, and search.
  • Scalability: Independent lifecycle for each site; easier to archive or delete without ripple effects.

Permissions and Security Boundaries

A site collection acts as a primary security and governance boundary. Permissions can be managed at the site collection, site, library, folder, or item level, but keeping most permissioning at the site collection or site level simplifies administration and reduces risk.

  • Default Groups: Owners, Members, and Visitors roles help maintain least-privilege access.
  • Inheritance: Inheriting permissions streamlines management; break inheritance only when necessary.
  • Sensitivity: Use separate site collections for sensitive or regulated data to isolate risk and auditing.

When to Create a New Site Collection

Use a new site collection when you need clear boundaries, autonomy, or distinct policies. This ensures better scalability, performance, and governance.

  • Distinct Ownership: Different owners or admins from other departments or projects.
  • Unique Compliance Needs: Separate retention labels, DLP policies, or auditing requirements.
  • Lifecycle Autonomy: Independent archiving, deletion, or migration plans.
  • Performance Boundaries: Distribute content across site collections to manage growth.

Practical Examples

Example 1: Departmental Sites

An HR site collection with its own permissions, templates (e.g., policy libraries), and retention labels separate from Finance.

Example 2: Project Portfolios

Each strategic project gets a dedicated team site (site collection), all associated to a PMO hub for unified navigation and roll-up news.

Example 3: External Collaboration

A site collection configured for guest access to collaborate with vendors while isolating internal-only content.

Best Practices for Managing Site Collections

  • Adopt a Flat IA: Prefer many site collections connected by hubs over deep subsite trees.
  • Standardize Templates: Use site templates and provisioning to enforce consistency.
  • Govern Permissions: Keep permissions simple; minimize broken inheritance.
  • Apply Sensitivity Labels: Classify sites to control sharing and data loss prevention.
  • Set Lifecycle Policies: Define archival and deletion timelines from the start.
  • Monitor Storage and Activity: Regularly review usage and cleanup stale content.
  • Document Ownership: Assign clear site owners and secondary admins.

Common FAQs

Is a team site the same as a site collection?

In modern SharePoint, each new team or communication site is typically its own site collection, which simplifies management and scaling.

Can I convert subsites into separate site collections?

Yes, but it requires planned migration. Many organizations flatten their hierarchy over time to improve governance and performance.

How do hub sites relate to site collections?

Hub sites connect multiple site collections for shared navigation, branding, and search, without merging their security or content.

Mastering User Input: Best Practices for UX, Accessibility, and Secure Web Forms

What Is User Input and Why It Matters

User input is the data that visitors provide to your website or app—through forms, search bars, surveys, and interactive components. Optimizing user input improves conversion rates, enhances user experience, reduces errors, and protects your site from security threats.

Designing Frictionless User Input Experiences

Clear, intuitive interfaces help users complete tasks quickly and confidently. Focus on reducing cognitive load and guiding users step-by-step.

  • Use clear labels: Place labels close to fields and avoid ambiguous terminology.
  • Provide helper text: Show format hints (e.g., MM/DD/YYYY) and real-time guidance.
  • Minimize fields: Only ask for information you truly need to boost completion rates.
  • Group related fields: Logical sections (e.g., Personal Info, Payment) improve scannability.
  • Offer sensible defaults: Pre-fill known values and use smart suggestions where possible.

Examples of UX Enhancements

  • Inline validation: As a user types an email, show “Looks good” or “Please include @”.
  • Progress indicators: In multi-step forms, display steps like 1 of 3 to set expectations.
  • Accessible error summaries: After submission, list errors at the top with links to each field.

Accessibility First: Inclusive User Input

Accessible user input ensures all users, including those using assistive technologies, can interact with your forms.

  • Associate labels correctly: Ensure every input has a visible, programmatically associated label.
  • Provide clear error messages: Explain what went wrong and how to fix it in simple language.
  • Use sufficient contrast: Labels, placeholders, and errors must be readable.
  • Keyboard navigability: All fields and buttons must be reachable and operable via keyboard.
  • Descriptive help: Use concise instructions and indicate required fields without relying solely on color.

Validation and Error Handling That Builds Trust

Validation prevents incorrect data while maintaining a smooth experience. Use both client-side and server-side checks to balance usability and security.

  • Client-side validation: Guide users in real time to correct issues early.
  • Server-side validation: Enforce rules reliably to protect your system and data integrity.
  • Actionable errors: Say “Password must be at least 12 characters” instead of “Invalid password”.
  • Preserve input: If errors occur, keep the user’s entries so they don’t have to retype.

Common Validation Patterns

  • Email: Check structure (name@domain.tld) and provide clear messages for missing or extra characters.
  • Password: Enforce length and complexity while showing live strength indicators.
  • Phone and dates: Guide formatting with masks and examples (a schema sketch follows this list).
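
One possible server-side expression of those patterns, sketched with TypeScript and Zod (the field names and rules are illustrative, not prescriptive):

// Sketch: shared validation schema for the patterns above
import { z } from "zod";

export const SignupSchema = z.object({
  email: z.string().email("Please enter an email like name@domain.tld"),
  password: z.string().min(12, "Password must be at least 12 characters"),
  phone: z
    .string()
    .regex(/^\+?[0-9\s\-()]{7,20}$/, "Please enter a valid phone number"),
  birthDate: z.coerce.date().max(new Date(), "Date must be in the past"),
});

// On the server: reject invalid input with actionable, field-level messages
export function validateSignup(input: unknown) {
  const result = SignupSchema.safeParse(input);
  if (!result.success) {
    return { ok: false as const, errors: result.error.flatten().fieldErrors };
  }
  return { ok: true as const, data: result.data };
}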

Security Essentials for Handling User Input

Secure handling of user input protects your users and your platform from attacks.

  • Sanitize and encode: Prevent script injection by cleaning and safely rendering user-supplied data (see the encoding sketch after this list).
  • Validate on the server: Never trust client-side checks alone.
  • Use allowlists: Accept only expected characters and formats where possible.
  • Rate limiting: Throttle requests to stop brute-force and abuse.
  • Secure storage: Hash and salt sensitive credentials; encrypt data at rest and in transit.
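
A minimal sketch of the output-encoding idea in TypeScript (most frameworks and template engines do this for you; this only illustrates what "encode before rendering" means):

// Sketch: encode user-supplied text before inserting it into HTML
const HTML_ESCAPES: Record<string, string> = {
  "&": "&amp;",
  "<": "&lt;",
  ">": "&gt;",
  '"': "&quot;",
  "'": "&#39;",
};

export function escapeHtml(untrusted: string): string {
  return untrusted.replace(/[&<>"']/g, (ch) => HTML_ESCAPES[ch] ?? ch);
}

// escapeHtml('<img src=x onerror=alert(1)>')
// => '&lt;img src=x onerror=alert(1)&gt;'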

Optimizing User Input for Conversions

Small improvements can meaningfully increase completion rates and revenue.

  • Autofill and autocomplete: Support browser hints to reduce typing effort.
  • Mobile-friendly fields: Use appropriate input modes (numeric, email) to trigger the right keyboard.
  • Trust signals: Show security badges and privacy assurances near sensitive fields.
  • Clear CTAs: Use specific button text like “Create Account” instead of “Submit”.

Measuring and Iterating

  • Track drop-off: Identify fields causing abandonment and simplify or remove them.
  • A/B test: Experiment with copy, layout, and step count to find the highest-performing flow.
  • Monitor error rates: High error frequency signals confusing instructions or strict rules.

Checklist: High-Quality User Input Experiences

  • Clear labels, concise copy, and minimal fields
  • Accessible structure, keyboard support, and readable errors
  • Client- and server-side validation with helpful feedback
  • Robust security: sanitization, encoding, allowlists, and rate limits
  • Conversion-focused design with autofill, input modes, and strong CTAs

When you prioritize user input design, you deliver experiences that are faster, safer, and more inclusive—ultimately improving satisfaction and conversion rates across your product.

Avoiding the Pitfalls of Stopping Permission Inheritance: Performance Hotspots, Safer Patterns, and Azure-First Remediation

Stopping permission inheritance (i.e., breaking inheritance) often seems like a quick fix for access control, but it introduces hidden operational and performance costs. This article explains why breaking inheritance in SharePoint and Azure RBAC leads to complexity, where performance issues occur, and how to remediate with .NET 8, TypeScript, and Azure-first patterns using Managed Identities and Infrastructure as Code.

The Problem

Breaking inheritance creates many one-off permission entries. Over time, this causes:

  • Permission sprawl: hard-to-audit, hard-to-revoke access scattered across items/resources.
  • Performance degradation: larger ACL evaluations, slower queries, and increased throttling risk.
  • Operational friction: brittle reviews, noisy exceptions, and confusing user experiences.

In SharePoint, many uniquely permissioned items slow list queries and complicate sharing. In Azure, assigning roles at leaf scopes (instead of using inherited assignments at management group or subscription levels) increases evaluation overhead and management burden.

Prerequisites

  • .NET 8 SDK
  • Node.js v20+
  • Azure CLI (az) and Azure Bicep
  • Contributor access to a test subscription (for deploying IaC) and Reader/Authorization permissions for audit scenarios

The Solution (Step-by-Step)

Step 1: What to avoid when stopping inheritance

  • SharePoint: Avoid per-item unique permissions for large lists; prefer groups at the site or library level, with exceptions gated by policy and approval.
  • Azure RBAC: Avoid many role assignments at the resource/resource-group level; grant least privilege at a higher scope when practical, and use groups instead of direct user assignments.

Step 2: .NET 8 Minimal API to detect RBAC hotspots (top-level statements, DI, Managed Identity)

// File: Program.cs (.NET 8, top-level statements, minimal API)
// Purpose: Audit Azure RBAC for non-inherited, leaf-level role assignments (hotspots).
// Auth: Uses DefaultAzureCredential (Managed Identity preferred in Azure).
// Note: This sample reads role assignments and surfaces "leaf" hotspots for review.

using Azure;
using Azure.Core;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.Authorization;
using Azure.ResourceManager.Authorization.Models;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = WebApplication.CreateBuilder(args);

// Register Azure clients via DI
builder.Services.AddSingleton<TokenCredential>(_ => new DefaultAzureCredential());
// ArmClient is the entry point to Resource Manager APIs
builder.Services.AddSingleton<ArmClient>(sp => new ArmClient(sp.GetRequiredService<TokenCredential>()));

var app = builder.Build();

// GET /rbac/hotspots?subscriptionId=<subId>
// Heuristic: Identify role assignments at deeper scopes (resource, resource group) that
// could be consolidated at higher scopes to reduce sprawl and evaluation overhead.
app.MapGet("/rbac/hotspots", async (string subscriptionId, ArmClient arm) =>
{
    // Acquire subscription resource
    var sub = arm.GetSubscriptionResource(new ResourceIdentifier($"/subscriptions/{subscriptionId}"));

    // List role assignments across the subscription
    var assignments = new List<RoleAssignmentData>();
    await foreach (var ra in sub.GetRoleAssignmentsAsync())
    {
        assignments.Add(ra.Data);
    }

    // Group by scope depth (subscription=1, resourceGroup=2, resource=3+)
    // Deeper scopes are more likely to be "breaks" from inheritance-like patterns.
    var hotspots = assignments
        .GroupBy(a => ScopeDepth(a.Scope))
        .OrderByDescending(g => g.Key)
        .Select(g => new
        {
            depth = g.Key,
            count = g.Count(),
            sampleScopes = g.Select(x => x.Scope).Distinct().Take(5).ToArray()
        });

    return Results.Ok(new
    {
        analyzed = assignments.Count,
        hotspots
    });

    // Simple helper to score scope depth by path segments.
    static int ScopeDepth(string scope)
    {
        // Example: /subscriptions/{id} => depth ~ 1
        // /subscriptions/{id}/resourceGroups/{rg} => depth ~ 2
        // /subscriptions/{id}/resourceGroups/{rg}/providers/... => depth >= 3
        return scope.Split('/', StringSplitOptions.RemoveEmptyEntries).Length / 2;
    }
})
.WithName("GetRbacHotspots")
.Produces<object>(200);

app.Run();

Why this helps: Large counts of deep-scope assignments often indicate broken inheritance patterns (per-resource grants). Consolidating to group-based roles at higher scopes can reduce policy evaluation work and administrative overhead.

Step 3: TypeScript Azure Function to list deep-scope role assignments with retry (Managed Identity)

// File: index.ts (Azure Functions on Node 20, v3 Node.js programming model)
// Purpose: Enumerate role assignments and flag deep-scope patterns.
// Auth: ManagedIdentityCredential (no client secrets). Strict typing. Basic retry for 429.
//
// Ensure you enable a system-assigned identity on the Function App and grant it Reader on the subscription.
// package.json should include: @azure/functions, @azure/identity, @azure/arm-authorization, zod

import { AzureFunction, Context } from "@azure/functions";
import { ManagedIdentityCredential } from "@azure/identity";
import { AuthorizationManagementClient, RoleAssignment } from "@azure/arm-authorization";
import { z } from "zod";

// Validate required environment variable via Zod (strict typing)
const EnvSchema = z.object({
  SUBSCRIPTION_ID: z.string().min(1)
});
const env = EnvSchema.parse(process.env);

const httpTrigger: AzureFunction = async function (context: Context): Promise<void> {
  // Create credential using Managed Identity (no secrets in code or env)
  const credential = new ManagedIdentityCredential();
  const authClient = new AuthorizationManagementClient(credential, env.SUBSCRIPTION_ID);

  // Simple retry wrapper for throttle-prone calls (e.g., large tenants)
  async function withRetry<T>(fn: () => Promise<T>, attempts = 5, delayMs = 1000): Promise<T> {
    let lastErr: unknown;
    for (let i = 0; i < attempts; i++) {
      try {
        return await fn();
      } catch (err: unknown) {
        const anyErr = err as { statusCode?: number };
        if (anyErr?.statusCode === 429 || anyErr?.statusCode === 503) {
          await new Promise((r) => setTimeout(r, delayMs * (i + 1))); // Exponential-ish backoff
          lastErr = err;
          continue;
        }
        throw err;
      }
    }
    throw lastErr;
  }

  // List role assignments at subscription scope
  const assignments: RoleAssignment[] = [];

  // Wrap the enumeration in the retry helper; the SDK returns an async iterator
  await withRetry(async () => {
    assignments.length = 0; // reset the buffer if a retry re-runs the enumeration
    for await (const item of authClient.roleAssignments.listForSubscription()) {
      assignments.push(item);
    }
  });

  // Score depth by scope path complexity
  const scoreDepth = (s: string): number => s.split("/").filter(Boolean).length / 2;
  const hotspots = assignments
    .map(a => ({ id: a.id!, scope: a.scope!, depth: scoreDepth(a.scope!) }))
    .filter(x => x.depth >= 3); // resource-level assignments

  context.res = {
    status: 200,
    headers: { "content-type": "application/json" },
    body: {
      analyzed: assignments.length,
      resourceLevelAssignments: hotspots.length,
      hotspotSamples: hotspots.slice(0, 10)
    }
  };
};

export default httpTrigger;

Why this helps: A quick serverless audit allows teams to discover where inheritance-like patterns are being bypassed in Azure RBAC, which is a frequent source of performance and governance friction.

Step 4: Azure Bicep to deploy a Function App with Managed Identity and least privilege

// File: main.bicep
// Purpose: Deploy a Function App with system-assigned managed identity,
// and assign Reader at the resource group scope (principle of least privilege).
// Includes a deterministic GUID for role assignment name.
//
// Note: Reader role definition ID is acdd72a7-3385-48ef-bd42-f606fba81ae7
// (Allows viewing resources, not making changes.)

@description('Location for resources')
param location string = resourceGroup().location

@description('Function App name')
param functionAppName string

@description('Storage account name (must be globally unique)')
param storageAccountName string

var roleDefinitionIdReader = '/providers/Microsoft.Authorization/roleDefinitions/acdd72a7-3385-48ef-bd42-f606fba81ae7'

// Storage (for Functions)
resource stg 'Microsoft.Storage/storageAccounts@2023-01-01' = {
  name: storageAccountName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
  properties: {
    allowBlobPublicAccess: false
    minimumTlsVersion: 'TLS1_2'
    supportsHttpsTrafficOnly: true
  }
}

// Hosting plan (Consumption)
resource plan 'Microsoft.Web/serverfarms@2022-09-01' = {
  name: '${functionAppName}-plan'
  location: location
  sku: {
    name: 'Y1'
    tier: 'Dynamic'
  }
}

// Function App with system-assigned identity
resource func 'Microsoft.Web/sites@2022-09-01' = {
  name: functionAppName
  location: location
  kind: 'functionapp'
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    serverFarmId: plan.id
    siteConfig: {
      appSettings: [
        { name: 'FUNCTIONS_WORKER_RUNTIME', value: 'node' }
        { name: 'WEBSITE_RUN_FROM_PACKAGE', value: '1' }
      ]
    }
  }
}

// Assign Reader at the resource group scope to the Function's identity
// Use a stable, deterministic GUID based on scope + principal to avoid duplicates.
resource readerAssign 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(resourceGroup().id, func.identity.principalId, roleDefinitionIdReader)
  scope: resourceGroup()
  properties: {
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', 'acdd72a7-3385-48ef-bd42-f606fba81ae7')
    principalId: func.identity.principalId
    principalType: 'ServicePrincipal'
  }
}

Why this helps: You deploy secure defaults and enforce least privilege by design. The deterministic GUID prevents accidental duplicate role assignments. Reader is explicitly chosen to avoid write permissions while enabling inventory and audits.

Where performance issues occur

  • SharePoint: Many items with unique permissions increase ACL checks and can slow list queries, indexing, and certain sharing operations. Batch operations on items with unique permissions are more likely to hit throttling.
  • Azure RBAC: Thousands of per-resource role assignments increase evaluation work during authorization and complicate policy and compliance scans. It also prolongs investigations during incidents.
  • Auditing and reviews: Per-user, per-resource assignments inflate review surfaces and make access recertification slow and error-prone.

Best Practices & Security

  • Use Managed Identity or Azure AD Workload Identity. Avoid client secrets for server-side workloads. Do not store secrets in environment variables. If you must handle secrets, use Azure Key Vault with RBAC and Managed Identity.
  • Prefer group-based assignments over direct user assignments. This simplifies reviews and minimizes churn.
  • Favor higher-scope role assignments with least privilege. Start at management group or subscription only when justified and narrow to Reader or custom roles that fit the minimal required actions.
  • When exceptions are necessary, document them. Add expiration and owners to each exception.
  • For SharePoint, grant permissions at the site or library level using groups. Reserve item-level breaks for rare, time-bound cases.
  • Monitor continuously. Integrate with Azure Monitor and Azure Policy to detect excessive deep-scope assignments. Create alerts on abnormal growth of role assignments or access anomalies.
  • Implement retry and backoff for API calls that can throttle (429/503), both in audits and operational tooling.
  • Standardize terminology. Use "inheritance" consistently to avoid confusion in documentation and automation.

Recommendation: If you need read-only inventory across a subscription, assign the Reader role (roleDefinitionId acdd72a7-3385-48ef-bd42-f606fba81ae7) to a Managed Identity and call Azure SDKs or ARM REST with exponential backoff.

Summary

  • Breaking inheritance increases complexity and can degrade performance; reserve it for rare, time-bound exceptions.
  • Automate audits with Managed Identity using .NET 8 and TypeScript to find deep-scope hotspots and consolidate access.
  • Ship secure-by-default with Bicep: least privilege (Reader), deterministic role assignment IDs, and continuous monitoring via Azure services.

Configuring Permission in SharePoint with .NET 8 and Microsoft Graph (Azure-first)

If you need to automate permission in SharePoint reliably, use Microsoft Graph with .NET 8 and Azure-managed identities. The goal: grant site-scoped access (least privilege via Sites.Selected), verify effective roles, and perform read/write operations without client secrets.

The Problem

SharePoint permissions are often over-provisioned or managed manually. That leads to audit gaps, break-glass patterns, and production drift. You need a repeatable, least-privilege approach that grants only the required access to specific sites, automates verification, and avoids client secrets.

Prerequisites

Required tools and permissions:

  • .NET 8 SDK
  • Azure CLI v2.58+ (logged in as a tenant admin for one-time grants)
  • Microsoft Graph application permissions consent capability (tenant admin)
  • Azure subscription access to create a user-assigned managed identity (Contributor on resource group)

The Solution (Step-by-Step)

Step 1. Choose the authentication model

Use managed identity for workloads in Azure (Functions, Container Apps). This removes client secrets entirely. For CI/CD, use workload identity federation instead of secrets.

  • Runtime principal: user-assigned/system-assigned managed identity
  • Graph permission model: application permission Sites.Selected for the runtime principal
  • Grant site-scoped roles: read or write at the specific SharePoint site level

Why Sites.Selected: it blocks blanket access (e.g., Sites.Read.All) and forces explicit grants per site.
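
For orientation before the .NET tool below, the underlying grant is a single Microsoft Graph POST to the site's permissions collection. A hedged TypeScript sketch (the site ID, app ID, and display name are placeholders; the admin token needs a permission such as Sites.FullControl.All to create the grant):

// Sketch: grant "read" or "write" on one site to an app/managed identity (Sites.Selected model)
async function grantSitePermission(
  adminToken: string,
  siteId: string,
  targetAppId: string,
  targetAppName: string,
  role: "read" | "write"
) {
  const res = await fetch(`https://graph.microsoft.com/v1.0/sites/${siteId}/permissions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${adminToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      roles: [role],
      grantedToIdentities: [
        { application: { id: targetAppId, displayName: targetAppName } },
      ],
    }),
  });
  if (!res.ok) throw new Error(`Grant failed: ${res.status}`);
  return res.json();
}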

Step 2. Infrastructure as Code (Bicep): create a user-assigned managed identity

// main.bicep
targetScope = 'resourceGroup'

// User-assigned managed identity that will call Microsoft Graph
resource uami 'Microsoft.ManagedIdentity/userAssignedIdentities@2023-01-31' = {
  name: 'sp-sites-selected-uami'
  location: resourceGroup().location
}

output uamiClientId string = uami.properties.clientId
output uamiPrincipalId string = uami.properties.principalId

After deployment, assign Microsoft Graph application permission Sites.Selected to the managed identity’s service principal. This is a one-time admin action.

# Assign Graph app role (Sites.Selected) to the managed identity service principal
# 1) Get Graph service principal (well-known)
GRAPH_SP_ID=$(az ad sp list --filter "appId eq '00000003-0000-0000-c000-000000000000'" --query "[0].id" -o tsv)

# 2) Get the Sites.Selected app role ID
SITES_SELECTED_ROLE_ID=$(az ad sp show --id $GRAPH_SP_ID --query "appRoles[?value=='Sites.Selected' && contains(allowedMemberTypes, 'Application')].id | [0]" -o tsv)

# 3) Get your managed identity's service principal object id
UAMI_PRINCIPAL_ID=<uamiPrincipalId from bicep output>

# 4) Assign the app role to the managed identity (Azure CLI has no dedicated command; call Microsoft Graph via az rest)
az rest --method POST \
  --uri "https://graph.microsoft.com/v1.0/servicePrincipals/$UAMI_PRINCIPAL_ID/appRoleAssignments" \
  --headers "Content-Type=application/json" \
  --body "{\"principalId\":\"$UAMI_PRINCIPAL_ID\",\"resourceId\":\"$GRAPH_SP_ID\",\"appRoleId\":\"$SITES_SELECTED_ROLE_ID\"}"

For a managed identity, the app role assignment itself is the grant: no separate admin consent flow is required. Validate the result in Entra ID under Enterprise applications > Your Managed Identity > Permissions.

Step 3. One-time site-scoped grant to the managed identity

Sites.Selected requires a per-site grant. Use an admin-only tool to grant read/write on a single site to the managed identity. Below is a .NET 8 minimal API you can run locally as a tenant admin via Azure CLI authentication to grant permissions. It uses top-level statements, DI, and the Microsoft Graph SDK (v5).

using Azure.Identity;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Graph;
using Microsoft.Graph.Models;

var builder = WebApplication.CreateBuilder(args);

// Admin credential for local, one-time operations only.
// Uses Azure CLI token of a tenant admin. No client secrets.
builder.Services.AddSingleton((sp) =>
{
    // Authenticate as the signed-in Azure CLI account
    var credential = new AzureCliCredential();

    // Graph SDK v5 accepts an Azure TokenCredential plus scopes directly;
    // ".default" requests the app permissions already granted to the principal.
    return new GraphServiceClient(
        credential,
        new[] { "https://graph.microsoft.com/.default" });
});

var app = builder.Build();

// Code Overview: Grants site-level permission (read or write) to a target application or managed identity
// Endpoint parameters:
//  - hostname: e.g., "contoso.sharepoint.com"
//  - sitePath: e.g., "/sites/Engineering"
//  - targetAppId: the clientId of the target app or managed identity (UAMI clientId)
//  - role: "read" or "write" (least privilege)
app.MapPost("/grant", async (GraphServiceClient graph,
    string hostname, string sitePath, string targetAppId, string role) =>
{
    // 1) Resolve the site by hostname and site path
    var site = await graph.Sites[$"{hostname}:{sitePath}"].GetAsync();
    if (site is null) return Results.NotFound("Site not found.");

    // 2) Prepare the permission grant
    var requestedRole = role.Equals("write", StringComparison.OrdinalIgnoreCase) ? "write" : "read";

    var permission = new Permission
    {
        // Roles supported for Sites.Selected are "read" and "write"
        Roles = new List<string> { requestedRole },
        GrantedToIdentities = new List<IdentitySet>
        {
            new IdentitySet
            {
                Application = new Identity
                {
                    // targetAppId can be the clientId of a user-assigned managed identity
                    // or an app registration clientId
                    Id = targetAppId,
                    DisplayName = "Sites.Selected App"
                }
            }
        }
    };

    // 3) Grant the permission at the site-level
    var created = await graph.Sites[site.Id].Permissions.PostAsync(permission);

    // 4) Return what was granted for audit
    return Results.Ok(new
    {
        SiteId = site.Id,
        GrantedRoles = created?.Roles,
        GrantedTo = created?.GrantedToIdentities?.Select(x => x.Application?.Id)
    });
});

app.Run();

Run this locally with an Azure CLI context that has tenant admin privileges. This API grants the site-scoped role to your managed identity (identified by its clientId). Keep this tool restricted and audit its use.
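
For an audit trail, a hedged sketch like the following can be added to the same admin tool (before app.Run()) to list the application grants currently on a site and confirm what the /grant endpoint created:

// Verification sketch: list current site-level application grants for audit.
app.MapGet("/permissions", async (GraphServiceClient graph, string hostname, string sitePath) =>
{
    var site = await graph.Sites[$"{hostname}:{sitePath}"].GetAsync();
    if (site is null) return Results.NotFound("Site not found.");

    var permissions = await graph.Sites[site.Id].Permissions.GetAsync();

    return Results.Ok(permissions?.Value?.Select(p => new
    {
        p.Id,
        p.Roles,
        Applications = p.GrantedToIdentities?.Select(i => i.Application?.Id)
    }));
});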

Step 4. Runtime workload: access the site using the managed identity

Your production service (Azure Functions or Container Apps) uses DefaultAzureCredential to get a token for Graph and perform read actions (if granted read) or write actions (if granted write).

using Azure.Identity;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Graph;

var builder = WebApplication.CreateBuilder(args);

// Graph client using managed identity in Azure (no secrets)
// In local dev, DefaultAzureCredential falls back to Azure CLI login.
builder.Services.AddSingleton((sp) =>
{
    var credential = new DefaultAzureCredential();

    // Graph SDK v5 takes the credential and scopes directly
    return new GraphServiceClient(
        credential,
        new[] { "https://graph.microsoft.com/.default" });
});

var app = builder.Build();

// Code Overview: Example read-only endpoint listing drive root items for a given site.
// Requires the managed identity to have Sites.Selected + "read" on that site.
// hostname and sitePath arrive as query parameters, e.g. ?hostname=contoso.sharepoint.com&sitePath=/sites/Engineering
app.MapGet("/drive-items", async (GraphServiceClient graph, string hostname, string sitePath) =>
{
    // 1) Resolve site by hostname and site path
    var site = await graph.Sites[$"{hostname}:{sitePath}"].GetAsync();
    if (site is null) return Results.NotFound("Site not found.");

    // 2) Resolve the site's default document library (drive)
    var drive = await graph.Sites[site.Id].Drive.GetAsync();
    if (drive is null) return Results.NotFound("Default drive not found.");

    // 3) Read the drive root's children (read permission is sufficient)
    var root = await graph.Drives[drive.Id].Root.GetAsync();
    if (root is null) return Results.NotFound("Drive root not found.");

    var items = await graph.Drives[drive.Id].Items[root.Id].Children.GetAsync();

    return Results.Ok(items?.Value?.Select(i => new { i.Id, i.Name, i.Size, i.LastModifiedDateTime }));
});

app.Run();
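
If the identity holds "write" on the site, a write operation follows the same pattern. The sketch below is an assumption (not part of the runtime service above): it creates a folder in the document library root, would sit above app.Run() alongside the read endpoint, and also needs using Microsoft.Graph.Models; for DriveItem and Folder.

// Write sketch: create a folder in the site's default document library root.
// Requires Sites.Selected + "write" on the site; add "using Microsoft.Graph.Models;" at the top of the file.
app.MapPost("/folders", async (GraphServiceClient graph, string hostname, string sitePath, string folderName) =>
{
    var site = await graph.Sites[$"{hostname}:{sitePath}"].GetAsync();
    if (site is null) return Results.NotFound("Site not found.");

    var drive = await graph.Sites[site.Id].Drive.GetAsync();
    if (drive is null) return Results.NotFound("Default drive not found.");

    var root = await graph.Drives[drive.Id].Root.GetAsync();
    if (root is null) return Results.NotFound("Drive root not found.");

    // POST /drives/{drive-id}/items/{root-id}/children with a folder facet creates the folder.
    var created = await graph.Drives[drive.Id].Items[root.Id].Children.PostAsync(new DriveItem
    {
        Name = folderName,
        Folder = new Folder()
    });

    return Results.Ok(new { Id = created?.Id, Name = created?.Name, Url = created?.WebUrl });
});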

Step 5. Optional: strict front-end validation for an internal admin form

If you build an admin UI to call the grant API, validate inputs with Zod and strict TypeScript types.

import { z } from 'zod';

// Discriminated union for role
const RoleSchema = z.union([z.literal('read'), z.literal('write')]);

// Strict payload schema
export const GrantPayloadSchema = z.object({
  hostname: z.string().min(1).regex(/^([a-zA-Z0-9-]+)\.sharepoint\.com$/),
  sitePath: z.string().min(1).startsWith('/'),
  targetAppId: z.string().uuid(),
  role: RoleSchema
});

export type GrantPayload = z.infer<typeof GrantPayloadSchema>;

// Example safe submit
export async function submitGrant(payload: GrantPayload) {
  const parsed = GrantPayloadSchema.parse(payload); // throws on invalid input
  const res = await fetch('/grant?'
    + new URLSearchParams({
      hostname: parsed.hostname,
      sitePath: parsed.sitePath,
      targetAppId: parsed.targetAppId,
      role: parsed.role
    }), { method: 'POST' });

  if (!res.ok) throw new Error('Grant failed');
  return res.json();
}

Best Practices & Security

  • Best Practice: Use Sites.Selected and grant per-site roles (read/write) instead of tenant-wide Sites.Read.All or Sites.FullControl.All.
  • Best Practice: Prefer managed identity (Azure) or workload identity federation (CI/CD) over client secrets.
  • Best Practice: Separate duties. Keep the grant tool under tenant admin control; the runtime app should never be able to self-elevate.
  • Best Practice: Log every permission grant with who, what, when, and siteId. Store in an immutable audit store.
  • Best Practice: Implement retries with exponential backoff for Graph calls and handle 429/5xx responses gracefully.
  • Best Practice: Validate all inputs server-side. Reject unknown hostnames and unexpected site paths (see the sketch after this list).
  • Best Practice: Monitor with Microsoft 365 audit logs and alert on unexpected grants or role changes.
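
A minimal server-side validation sketch for the point above, mirroring the front-end Zod schema from Step 5; the hostname pattern and error messages are illustrative:

// Server-side validation for the grant endpoint's query parameters.
// The hostname pattern mirrors the front-end Zod schema; tighten it to your tenant as needed.
using System.Text.RegularExpressions;

public static partial class GrantInputValidation
{
    [GeneratedRegex(@"^[a-zA-Z0-9-]+\.sharepoint\.com$")]
    private static partial Regex HostnamePattern();

    public static bool IsValid(string hostname, string sitePath, string targetAppId, string role, out string? error)
    {
        error = null;

        if (!HostnamePattern().IsMatch(hostname))
            error = "Hostname must be a *.sharepoint.com host.";
        else if (!sitePath.StartsWith('/'))
            error = "Site path must start with '/'.";
        else if (!Guid.TryParse(targetAppId, out _))
            error = "targetAppId must be a GUID (app registration or UAMI clientId).";
        else if (role is not ("read" or "write"))
            error = "Role must be 'read' or 'write'.";

        return error is null;
    }
}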

Minimum permissions required (explicit)

  • Runtime principal (managed identity): Microsoft Graph application permission Sites.Selected.
  • Admin/grant principal: Microsoft Graph application permission Sites.FullControl.All or delegated SharePoint admin role sufficient to create site-level grants via Graph.
  • SharePoint site roles used for Sites.Selected: read (equivalent to site Read), write (equivalent to Edit). Grant the lowest role that satisfies requirements.

Error handling notes

  • Catch the Graph SDK v5 ODataError exception and map common statuses: 403 (insufficient role), 404 (site not found), 429 (throttling, honoring Retry-After). A sketch follows these notes.
  • Add circuit breakers/timeouts. Do not leak raw exception messages in responses.
  • Return correlation IDs in responses to aid troubleshooting.
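
To make those notes concrete, here is a hedged sketch that wraps a Graph SDK v5 call inside the minimal API project, maps 403/404 to problem responses, and retries 429/5xx with exponential backoff; the attempt count and correlation-ID handling are assumptions.

// Error-handling sketch for Graph SDK v5 calls: map 403/404, retry 429/5xx with backoff.
using Microsoft.Graph.Models.ODataErrors;

public static class GraphCallGuard
{
    public static async Task<IResult> ExecuteAsync<T>(Func<Task<T>> graphCall, string correlationId, int maxAttempts = 3)
    {
        for (var attempt = 1; ; attempt++)
        {
            try
            {
                var result = await graphCall();
                return Results.Ok(new { correlationId, result });
            }
            catch (ODataError ex) when (ex.ResponseStatusCode == 403)
            {
                return Results.Problem(statusCode: 403, detail: "Insufficient site role.", instance: correlationId);
            }
            catch (ODataError ex) when (ex.ResponseStatusCode == 404)
            {
                return Results.Problem(statusCode: 404, detail: "Site or item not found.", instance: correlationId);
            }
            catch (ODataError ex) when ((ex.ResponseStatusCode is 429 or >= 500) && attempt < maxAttempts)
            {
                // Throttled or transient server error: back off exponentially, then retry.
                await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
            }
        }
    }
}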

Summary

  • Grant permission in SharePoint using Sites.Selected for least privilege and auditability.
  • Use managed identities or workload identity federation to remove secrets and simplify compliance.
  • Automate grants and access with .NET 8, Microsoft Graph SDK, and IaC for consistent, repeatable operations.