Sunday, 22 February 2026

Build a secure MCP-style server for Power Apps with .NET 8 Azure Functions and Azure AD

Creating an MCP-style server for Power Apps typically means exposing a secure, typed HTTP API that Power Apps can call via a Custom Connector. This article shows why Azure Functions (.NET 8 isolated) with Azure AD (Entra ID) is the right foundation and how to ship a production-ready MCP-style server with OpenAPI, validation, and zero secret management.

The Problem

You need a reliable backend for Power Apps that enforces validation and security, offers a clean contract (OpenAPI), scales serverlessly, and avoids Function Keys or shared secrets. You also want workflows and data operations to be testable and observable.

Prerequisites

  • .NET 8 SDK
  • Azure CLI 2.58+ and Azure Functions Core Tools v4+
  • Azure subscription with permissions to create Resource Group, Function App, and Managed Identity
  • Power Apps environment with permission to create Custom Connectors

Implementation Details

Project setup

// Terminal commands (run locally)
// 1) Create a .NET 8 isolated Azure Functions app
func init PowerAppsMcpServer --worker-runtime dotnet-isolated --target-framework net8.0

cd PowerAppsMcpServer

// 2) Add packages for validation, OpenAPI, and DI helpers
dotnet add package FluentValidation
dotnet add package FluentValidation.DependencyInjectionExtensions
dotnet add package Microsoft.Azure.Functions.Worker.Extensions.OpenApi
dotnet add package Azure.Identity
dotnet add package Microsoft.Extensions.Azure
dotnet add package Microsoft.Azure.Functions.Worker.ApplicationInsights

Program.cs (minimal hosting, DI, OpenAPI)

using Azure.Core;
using Azure.Identity;
using FluentValidation;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Extensions.OpenApi.Extensions;
using Microsoft.Extensions.Azure;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using PowerAppsMcpServer;

// Top-level statements for .NET 8 minimal hosting
var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    // Enables the OpenAPI endpoints at /api/swagger.json and /api/swagger/ui
    .ConfigureOpenApi()
    .ConfigureServices(services =>
    {
        // Application Insights for the isolated worker
        services.AddApplicationInsightsTelemetryWorkerService();
        services.ConfigureFunctionsApplicationInsights();

        // Register validators
        services.AddValidatorsFromAssemblyContaining<CreateOrderRequestValidator>();

        // Register Azure SDK clients using DefaultAzureCredential (Managed Identity in Azure)
        services.AddAzureClients(clientBuilder =>
        {
            clientBuilder.UseCredential(new DefaultAzureCredential());
            // Example: clientBuilder.AddSecretClient(new Uri("https://<your-kv>.vault.azure.net/"));
        });

        // Register domain services
        services.AddSingleton<IOrderService, OrderService>();
    })
    .Build();

host.Run();

Contracts and validation

namespace PowerAppsMcpServer;

public sealed record CreateOrderRequest(
    string CustomerId,      // Must be a known customer
    string Sku,             // Product SKU
    int Quantity            // >= 1
);

public sealed record CreateOrderResponse(
    string OrderId,
    string Status,
    string Message
);

// CreateOrderRequestValidator.cs (separate file)
using FluentValidation;

namespace PowerAppsMcpServer;

public sealed class CreateOrderRequestValidator : AbstractValidator<CreateOrderRequest>
{
    public CreateOrderRequestValidator()
    {
        // Ensure CustomerId is present and well-formed
        RuleFor(x => x.CustomerId)
            .NotEmpty().WithMessage("CustomerId is required.")
            .Length(3, 64);

        // Basic SKU constraints
        RuleFor(x => x.Sku)
            .NotEmpty().WithMessage("Sku is required.")
            .Length(2, 64);

        // Quantity must be positive
        RuleFor(x => x.Quantity)
            .GreaterThan(0).WithMessage("Quantity must be at least 1.");
    }
}

Domain service (DI, testable logic)

using System;
using System.Threading;
using System.Threading.Tasks;

namespace PowerAppsMcpServer;

public interface IOrderService
{
    Task<CreateOrderResponse> CreateAsync(CreateOrderRequest request, CancellationToken ct = default);
}

public sealed class OrderService : IOrderService // Stateless service; no constructor dependencies needed
{
    public Task<CreateOrderResponse> CreateAsync(CreateOrderRequest request, CancellationToken ct = default)
    {
        // Simulate business logic; replace with real persistence/integration
        var orderId = $"ORD-{Guid.NewGuid():N}";
        return Task.FromResult(new CreateOrderResponse(orderId, "Created", "Order accepted"));
    }
}

HTTP-triggered Function with Azure AD auth and OpenAPI

using System.Net;
using System.Text.Json;
using FluentValidation;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Azure.WebJobs.Extensions.OpenApi.Core.Attributes;
using Microsoft.OpenApi.Models;

namespace PowerAppsMcpServer;

public sealed class CreateOrderFunction(
    ILogger<CreateOrderFunction> logger,
    IValidator<CreateOrderRequest> validator,
    IOrderService orderService)
{
    // This endpoint is described for OpenAPI and secured via Azure AD (set on the Function App)
    [Function("CreateOrder")]
    [OpenApiOperation(operationId: "CreateOrder", tags: new[] { "orders" }, Summary = "Create order", Description = "Creates an order with validated input.")]
    [OpenApiRequestBody(contentType: "application/json", bodyType: typeof(CreateOrderRequest), Required = true, Description = "Order payload")]
    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(CreateOrderResponse), Summary = "Order created")]
    public async Task<HttpResponseData> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "orders")] HttpRequestData req,
        FunctionContext ctx)
    {
        // Note: Authentication is enforced by EasyAuth/Azure AD at the Function App level,
        // so the trigger uses Anonymous instead of a Function Key. Callers (the Custom
        // Connector) must still present a valid Entra ID token.

        var payload = await JsonSerializer.DeserializeAsync<CreateOrderRequest>(req.Body, new JsonSerializerOptions
        {
            PropertyNameCaseInsensitive = true
        });

        if (payload is null)
        {
            var bad = req.CreateResponse(HttpStatusCode.BadRequest);
            await bad.WriteStringAsync("{\"error\":\"EmptyOrInvalidBody\"}");
            return bad;
        }

        // Validate request
        var result = await validator.ValidateAsync(payload);
        if (!result.IsValid)
        {
            var bad = req.CreateResponse(HttpStatusCode.BadRequest);
            await bad.WriteStringAsync(JsonSerializer.Serialize(new
            {
                error = "ValidationFailed",
                details = result.Errors.Select(e => new { e.PropertyName, e.ErrorMessage })
            }));
            return bad;
        }

        // Execute domain logic
        var responseModel = await orderService.CreateAsync(payload!, ctx.CancellationToken);

        // Return success
        var ok = req.CreateResponse(HttpStatusCode.OK);
        await ok.WriteStringAsync(JsonSerializer.Serialize(responseModel));
        return ok;
    }
}

Note: Set the Function App to be secured by Azure AD (App Service Authentication/Authorization). Do not rely on Function Keys in production.

OpenAPI for the Custom Connector

  • Run locally and navigate to /api/swagger/ui to inspect the contract. The OpenAPI JSON is available at /api/swagger.json.
  • Export this JSON and import it when creating your Power Apps Custom Connector, selecting OAuth 2.0 (Azure AD) as the authentication type.

Power Apps integration (Custom Connector)

Authentication setup

  • Create an Azure AD App Registration for the Custom Connector (client app) and expose an application ID URI for the Function App (resource app) if you choose user-assigned scopes. Alternatively, enable EasyAuth with “Log in with Azure Active Directory”.
  • In the Custom Connector, choose OAuth 2.0 (Azure AD). Provide the Authorization URL, Token URL, and the client application details. Use the Application ID URI or scope configured for the Function App.
  • Grant users access via Azure AD and Power Platform permissions so they can acquire tokens and use the connector.

Calling the connector from Power Apps

// Power Apps (Canvas) example formula usage (pseudo):
// Assuming Custom Connector named 'OrdersApi'
Set(
    createResult,
    OrdersApi.CreateOrder({
        CustomerId: "CUST-001",
        Sku: "WIDGET-42",
        Quantity: 2
    })
);
// Access response fields: createResult.OrderId, createResult.Status, createResult.Message

Deployment

// Azure deployment with CLI (example)
// 1) Login
az login

// 2) Create resource group
az group create -n rg-mcp-powerapps -l eastus

// 3) Create storage and function app (Linux, isolated, .NET 8)
az storage account create -n mcpfuncstor$RANDOM -g rg-mcp-powerapps -l eastus --sku Standard_LRS
az functionapp create -n mcp-func-app-$RANDOM -g rg-mcp-powerapps -s <storageName> --consumption-plan-location eastus --runtime dotnet-isolated --functions-version 4

// 4) Enable App Service Authentication with Azure AD (EasyAuth)
// Replace <clientId> and <tenantId> as per your Entra ID (AAD) setup
az webapp auth microsoft update \
  --resource-group rg-mcp-powerapps \
  --name mcp-func-app-XXXX \
  --client-id <clientId> \
  --issuer "https://login.microsoftonline.com/<tenantId>/v2.0" \
  --unauthenticated-client-action RedirectToLoginPage

// 5) Deploy code
func azure functionapp publish mcp-func-app-XXXX

RBAC roles to assign

  • Deployment automation: Contributor on the resource group or scoped roles to Function App and Storage Account.
  • Function App management: Azure Functions Contributor (for CI/CD and app updates).
  • Storage access (if using Managed Identity to access blobs/queues): Storage Blob Data Reader or Storage Blob Data Contributor as needed, following least privilege.
  • Application Insights access (read-only dashboards): Monitoring Reader.
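
These assignments can be scripted with the Azure CLI. A minimal sketch, assuming the Function App's system-assigned identity is already enabled and using placeholder subscription and resource names (the role name matches the storage bullet above):

# Look up the Function App identity and grant it read access to blobs (placeholders throughout)
principalId=$(az functionapp identity show -n mcp-func-app-XXXX -g rg-mcp-powerapps --query principalId -o tsv)

az role assignment create \
  --assignee-object-id $principalId \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<subscriptionId>/resourceGroups/rg-mcp-powerapps/providers/Microsoft.Storage/storageAccounts/<storageName>"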

Validation and typing on the client

If you invoke this API from a TypeScript app (e.g., React), validate payloads before sending. Below is a strictly typed example using Zod.

// TypeScript (strict) example with Zod
import { z } from "zod";

const CreateOrderRequest = z.object({
  CustomerId: z.string().min(3).max(64),
  Sku: z.string().min(2).max(64),
  Quantity: z.number().int().positive()
});
type CreateOrderRequest = z.infer<typeof CreateOrderRequest>;

async function createOrder(apiBase: string, token: string, body: CreateOrderRequest) {
  const payload = CreateOrderRequest.parse(body); // Validates at runtime
  const res = await fetch(`${apiBase}/api/orders`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${token}`
    },
    body: JSON.stringify(payload)
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return await res.json() as { OrderId: string; Status: string; Message: string };
}

Best Practices & Security

  • Authentication: Use Azure AD (EasyAuth) with the Custom Connector. Avoid Function Keys to eliminate shared secrets.
  • Authorization: Scope access by Entra app roles or scopes; assign least-privilege RBAC to managed identities.
  • Secrets: Prefer Managed Identity and DefaultAzureCredential to access downstream services. Avoid storing credentials in app settings.
  • Validation: Enforce input validation on the server (FluentValidation) and on the client (Zod) for robust defense-in-depth.
  • Observability: Enable Application Insights. Correlate requests by logging operation IDs and include key business fields.
  • Resiliency: Add retry policies on the client, idempotency keys for order creation, and request timeouts to prevent retries creating duplicates.
  • Versioning: Version endpoints (e.g., /api/v1/orders) and keep your Custom Connector mapped to a specific version.
  • Cost: Consumption plan scales to zero; set sampling in Application Insights to control telemetry cost.
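
The idempotency advice above is worth making concrete. Below is a minimal sketch of an idempotency store the CreateOrder function could consult before calling IOrderService; the "Idempotency-Key" header name and the in-memory dictionary are illustrative only (production code should persist keys in durable storage such as Table Storage):

// Register with services.AddSingleton<IdempotencyStore>() and inject into the function.
using System.Collections.Concurrent;

namespace PowerAppsMcpServer;

public sealed class IdempotencyStore
{
    // Illustrative in-memory cache; swap for a durable store in production
    private readonly ConcurrentDictionary<string, CreateOrderResponse> _completed = new();

    public bool TryGet(string key, out CreateOrderResponse? existing) =>
        _completed.TryGetValue(key, out existing);

    public void Save(string key, CreateOrderResponse response) =>
        _completed[key] = response;
}

// Inside CreateOrderFunction.Run, before calling orderService.CreateAsync:
// if (req.Headers.TryGetValues("Idempotency-Key", out var keys) &&
//     keys.FirstOrDefault() is string key &&
//     idempotencyStore.TryGet(key, out var cached) && cached is not null)
// {
//     var dup = req.CreateResponse(HttpStatusCode.OK);
//     await dup.WriteStringAsync(JsonSerializer.Serialize(cached)); // return the earlier result instead of creating twice
//     return dup;
// }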

Monitoring and alerting

With Application Insights, create alerts on failed requests or latency thresholds.

// Examples (Kusto)
// 1) High failure rate in the last 15 minutes
requests
| where timestamp > ago(15m)
| summarize FailureRate = 100.0 * countif(success == false) / count()

// 2) P95 latency by operation
requests
| where timestamp > ago(1h)
| summarize p95_duration = percentile(duration, 95) by operation_Name

Summary

  • Build your MCP-style server for Power Apps with .NET 8 isolated Azure Functions, DI, and FluentValidation.
  • Secure the API with Azure AD and avoid Function Keys; integrate via a Custom Connector using OAuth 2.0.
  • Ship with OpenAPI, monitoring, and clear RBAC to ensure production readiness.

Surface What’s New: Power Apps integration with a secure .NET 8 API for MCP server updates

The question of "what is new in Power Apps for MCP servers" lacks a precise product definition, so rather than speculating on features, this guide shows how to reliably surface and version "What’s New" updates in Power Apps using a secure .NET 8 backend, strict TypeScript validation, and Azure-native security. You will get a production-ready pattern that Power Apps can call via a Custom Connector, keeping your release notes current without manual edits.

The Problem

Teams need a trustworthy way to display "What’s New" for MCP server in Power Apps, but the upstream source and format of updates can change. Hardcoding content or querying unsecured endpoints leads to drift, security gaps, and poor developer experience.

Prerequisites

  • .NET 8 SDK
  • Node.js 20+ and a package manager (pnpm/npm)
  • Azure subscription with permissions to create: Azure Functions or Container Apps, Key Vault, Azure API Management, Application Insights
  • Entra ID app registration for the API (OAuth 2.0)
  • Power Apps environment (to build a Custom Connector)

The Solution (Step-by-Step)

1) .NET 8 Minimal API that normalizes "What’s New" items

This minimal API demonstrates production-ready design: DI-first, HttpClientFactory, global exception handling, validation, versioned contract, and managed identity for downstream access.

// Program.cs (.NET 8, file-scoped namespace, minimal API, DI-centric)
using System.Net.Http.Json;
using System.Text.Json;
using System.Text.Json.Serialization;
using Azure.Identity; // DefaultAzureCredential for managed identity
using Azure.Security.KeyVault.Secrets;
using Microsoft.AspNetCore.Diagnostics;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Azure;

var builder = WebApplication.CreateBuilder(args);

// Strongly typed options for upstream source configuration
builder.Services.Configure<NewsOptions>(builder.Configuration.GetSection("News"));

// HttpClient with resilient handler
builder.Services.AddHttpClient<INewsClient, NewsClient>()
    .SetHandlerLifetime(TimeSpan.FromMinutes(5));

// Azure clients via DefaultAzureCredential (uses Managed Identity in Azure)
builder.Services.AddAzureClients(azure =>
{
    azure.UseCredential(new DefaultAzureCredential());
    var kvUri = builder.Configuration["KeyVaultUri"]; // e.g., https://my-kv.vault.azure.net/
    if (!string.IsNullOrWhiteSpace(kvUri))
    {
        azure.AddSecretClient(new Uri(kvUri));
    }
});

// Application Insights (OpenTelemetry auto-collection can also be used)
builder.Services.AddApplicationInsightsTelemetry();

var app = builder.Build();

// Global exception handler producing problem+json
app.UseExceptionHandler(errorApp =>
{
    errorApp.Run(async context =>
    {
        var feature = context.Features.Get<IExceptionHandlerPathFeature>();
        var problem = new ProblemDetails
        {
            Title = "Unexpected error",
            Detail = app.Environment.IsDevelopment() ? feature?.Error.ToString() : "",
            Status = StatusCodes.Status500InternalServerError,
            Type = "https://httpstatuses.com/500"
        };
        context.Response.StatusCode = problem.Status ?? 500;
        context.Response.ContentType = "application/problem+json";
        await context.Response.WriteAsJsonAsync(problem);
    });
});

app.MapGet("/health", () => Results.Ok(new { status = "ok" }));

// Versioned route for the normalized news feed (v1)
app.MapGet("/api/v1/news", async (
    INewsClient client
) => Results.Ok(await client.GetNewsAsync()))
.WithName("GetNewsV1")
.Produces<NewsItemV1[]>(StatusCodes.Status200OK)
.ProducesProblem(StatusCodes.Status500InternalServerError);

app.Run();

// Options to control the upstream feed location and parsing mode
public sealed class NewsOptions
{
    public string? SourceUrl { get; init; } // Upstream JSON or RSS converted via a worker
    public string Format { get; init; } = "json"; // json|rss (extend as needed)
}

// Public DTO contract exposed to Power Apps via Custom Connector
public sealed class NewsItemV1
{
    public required string Id { get; init; } // stable identifier
    public required string Title { get; init; }
    public required string Summary { get; init; }
    public required DateTimeOffset PublishedAt { get; init; }
    public string? Category { get; init; } // optional taxonomy
    public string? Url { get; init; } // link to detail page
}

// Client interface for fetching and normalizing upstream data
public interface INewsClient
{
    Task<NewsItemV1[]> GetNewsAsync(CancellationToken ct = default);
}

// Implementation that reads upstream source, validates, and normalizes
public sealed class NewsClient(HttpClient http, Microsoft.Extensions.Options.IOptions<NewsOptions> options) : INewsClient
{
    private readonly HttpClient _http = http;
    private readonly NewsOptions _opts = options.Value;
    private static readonly JsonSerializerOptions JsonOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
    };

    public async Task<NewsItemV1[]> GetNewsAsync(CancellationToken ct = default)
    {
        if (string.IsNullOrWhiteSpace(_opts.SourceUrl))
            throw new InvalidOperationException("News:SourceUrl is not configured.");

        // Fetch upstream JSON and map to a stable contract consumed by Power Apps
        var upstreamItems = await _http.GetFromJsonAsync<UpstreamItem[]>(_opts.SourceUrl, JsonOptions, ct)
            ?? Array.Empty<UpstreamItem>();

        // Normalize and order by publish date desc
        return upstreamItems
            .Select(u => new NewsItemV1
            {
                Id = u.Id ?? Guid.NewGuid().ToString("n"),
                Title = u.Title ?? "Untitled",
                Summary = u.Summary ?? string.Empty,
                PublishedAt = u.PublishedAt == default ? DateTimeOffset.UtcNow : u.PublishedAt,
                Category = u.Category,
                Url = u.Url
            })
            .OrderByDescending(n => n.PublishedAt)
            .ToArray();
    }

    // Internal model matching the upstream source (keep flexible)
    private sealed class UpstreamItem
    {
        public string? Id { get; init; }
        public string? Title { get; init; }
        public string? Summary { get; init; }
        public DateTimeOffset PublishedAt { get; init; }
        public string? Category { get; init; }
        public string? Url { get; init; }
    }
}

Configuration (appsettings.json):

{
  "News": {
    "SourceUrl": "https://<your-source>/mcp-news.json",
    "Format": "json"
  },
  "KeyVaultUri": "https://<your-kv>.vault.azure.net/"
}

2) Secure Azure deployment with Managed Identity and API Management

  • Deploy as Azure Functions (isolated) or Azure Container Apps. Enable System-Assigned Managed Identity.
  • Expose through Azure API Management with OAuth 2.0 (Entra ID) for inbound auth. Create a Power Apps Custom Connector pointing to APIM.

Required RBAC roles (assign to the managed identity or DevOps service principal):

  • Key Vault Secrets User (to read secrets if you store upstream source credentials or API keys)
  • App Configuration Data Reader (if using Azure App Configuration instead of appsettings)
  • API Management Service Contributor (to publish and manage the API surface)
  • Monitoring Reader (to view Application Insights telemetry)
  • Storage Blob Data Reader (only if the upstream source is in Azure Storage)
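
If you automate these assignments, the CLI pattern looks like the following sketch (resource names are placeholders; the Key Vault Secrets User role only applies when the vault uses the Azure RBAC permission model):

# Enable the system-assigned identity and capture its principal ID
az functionapp identity assign -n <function-app> -g <resource-group>
principalId=$(az functionapp identity show -n <function-app> -g <resource-group> --query principalId -o tsv)

# Grant least-privilege access to Key Vault secrets
az role assignment create \
  --assignee-object-id $principalId \
  --assignee-principal-type ServicePrincipal \
  --role "Key Vault Secrets User" \
  --scope "/subscriptions/<subscriptionId>/resourceGroups/<resource-group>/providers/Microsoft.KeyVault/vaults/<key-vault-name>"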

Pro-Tip: Favor Managed Identity and DefaultAzureCredential across services; avoid connection strings and embedded secrets entirely.

3) Strict TypeScript models with Zod and versioning

The client schema mirrors the v1 API and can evolve with v2+ while keeping backward compatibility in Power Apps and React.

// news.schema.ts (TypeScript, strict mode)
import { z } from "zod";

// Discriminated union enables future breaking changes with clear versioning
export const NewsItemV1 = z.object({
  id: z.string().min(1),
  title: z.string().min(1),
  summary: z.string().default(""),
  publishedAt: z.string().datetime({ offset: true }), // ISO 8601 with timezone offset (matches DateTimeOffset serialization)
  category: z.string().optional(),
  url: z.string().url().optional()
});

export const NewsResponseV1 = z.array(NewsItemV1);

export type TNewsItemV1 = z.infer<typeof NewsItemV1>;
export type TNewsResponseV1 = z.infer<typeof NewsResponseV1>;

// Future-proof: union for versioned responses
export const NewsResponse = z.union([
  z.object({ version: z.literal("v1"), data: NewsResponseV1 })
]);
export type TNewsResponse = z.infer<typeof NewsResponse>;

4) React 19 component using TanStack Query

Functional component with error and loading states, plus Zod runtime validation.

// NewsPanel.tsx (React 19)
import React from "react";
import { useQuery } from "@tanstack/react-query";
import { NewsResponseV1, type TNewsItemV1 } from "./news.schema";

async function fetchNews(): Promise<TNewsItemV1[]> {
  const res = await fetch("/api/v1/news", { headers: { Accept: "application/json" } });
  if (!res.ok) throw new Error(`Failed: ${res.status}`);
  const json = await res.json();
  const parsed = NewsResponseV1.safeParse(json);
  if (!parsed.success) throw new Error("Schema validation failed");
  return parsed.data;
}

export function NewsPanel(): React.JSX.Element {
  const { data, error, isLoading } = useQuery({
    queryKey: ["news", "v1"],
    queryFn: fetchNews,
    staleTime: 60_000
  });

  if (isLoading) return <div>Loading updates…</div>;
  if (error) return <div>Failed to load updates.</div>;

  return (
    <ul>
      {data!.map(item => (
        <li key={item.id}>
          <strong>{item.title}</strong> — {new Date(item.publishedAt).toLocaleDateString()}<br />
          {item.summary}
        </li>
      ))}
    </ul>
  );
}

5) Power Apps Custom Connector

Create a Custom Connector targeting the APIM endpoint /api/v1/news. Map the response to your app data schema. Add the connector to your Power App and display the items in a gallery. When the upstream feed changes, you only update the backend normalizer, not the app.
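
For example, if the connector is named NewsApi and exposes the GetNewsV1 operation (both names are illustrative and depend on how you import the OpenAPI definition), a Canvas app can load the feed into a collection and bind a gallery to it:

// Power Apps (Canvas), e.g. in App.OnStart or a screen's OnVisible
ClearCollect(colNews, NewsApi.GetNewsV1());
// Set the gallery's Items property to colNews and show the title, publishedAt, and summary fields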

Best Practices & Security

  • Authentication: Use Entra ID for APIM inbound auth. Backend-to-Azure uses DefaultAzureCredential with Managed Identity.
  • Secrets: Store upstream tokens in Key Vault; assign Key Vault Secrets User to the app’s managed identity.
  • Telemetry: Enable Application Insights. Track request IDs and dependency calls for the upstream fetch.
  • Authorization: Restrict APIM access via OAuth scopes and, if needed, IP restrictions or rate limits.
  • Resilience: Configure retry and timeout on HttpClient with sensible limits; add circuit breakers if the upstream is unreliable.
  • Versioning: Pin /api/v1/news; introduce /api/v2/news when the contract changes. Version TypeScript schemas alongside API versions.
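
For the resilience bullet, one low-ceremony option is the standard resilience handler from the Microsoft.Extensions.Http.Resilience package. A sketch, assuming you add that package and tune the limits to your upstream's behavior:

// Program.cs (sketch): retry, timeout, and circuit-breaker defaults around the typed HttpClient
builder.Services.AddHttpClient<INewsClient, NewsClient>()
    .SetHandlerLifetime(TimeSpan.FromMinutes(5))
    .AddStandardResilienceHandler(options =>
    {
        // Keep each attempt well below the Custom Connector timeout
        options.AttemptTimeout.Timeout = TimeSpan.FromSeconds(10);
        options.Retry.MaxRetryAttempts = 3;
    });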

App Insights integration (example):

// Add to Program.cs before app.Run(); ensure Application Insights is enabled
app.Use(async (ctx, next) =>
{
    // Correlate requests with upstream calls via trace IDs
    ctx.Response.Headers["Request-Id"] = System.Diagnostics.Activity.Current?.Id ?? Guid.NewGuid().ToString();
    await next();
});

Testing strategy:

  • API: Unit test NewsClient with mocked HttpMessageHandler; integration test /api/v1/news in-memory.
  • TypeScript: Schema tests to ensure validation rejects malformed payloads; component tests for loading/error states.
  • Contract: Add a CI step that fetches a sample payload from the upstream and validates against NewsResponseV1.
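
A minimal xUnit sketch of the first bullet, using a hand-rolled HttpMessageHandler stub so no mocking library is needed (test and type names are illustrative):

// NewsClientTests.cs
using System.Net;
using System.Text;
using Microsoft.Extensions.Options;
using Xunit;

public sealed class NewsClientTests
{
    // Stub handler that always returns the canned response
    private sealed class StubHandler(HttpResponseMessage response) : HttpMessageHandler
    {
        protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken ct)
            => Task.FromResult(response);
    }

    [Fact]
    public async Task GetNewsAsync_NormalizesAndOrdersItems()
    {
        var json = """
        [
          { "id": "a", "title": "Older", "summary": "s", "publishedAt": "2026-01-01T00:00:00Z" },
          { "id": "b", "title": "Newer", "summary": "s", "publishedAt": "2026-02-01T00:00:00Z" }
        ]
        """;
        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent(json, Encoding.UTF8, "application/json")
        };

        var http = new HttpClient(new StubHandler(response));
        var options = Options.Create(new NewsOptions { SourceUrl = "https://example.test/mcp-news.json" });
        var sut = new NewsClient(http, options);

        var items = await sut.GetNewsAsync();

        Assert.Equal(2, items.Length);
        Assert.Equal("Newer", items[0].Title); // newest first
    }
}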

Security roles recap:

  • API surface: API Management Service Contributor
  • Secrets: Key Vault Secrets User
  • Config (if used): App Configuration Data Reader
  • Monitoring: Monitoring Reader
  • Storage (if used): Storage Blob Data Reader

Pro-Tip: Use APIM policies (validate-content, rate-limit-by-key) to protect the API consumed by Power Apps.

Summary

  • Do not guess "what’s new" for MCP server; centralize updates behind a stable, versioned API.
  • Use Managed Identity, DefaultAzureCredential, and APIM OAuth to secure end-to-end access for Power Apps.
  • Validate with Zod, monitor with Application Insights, and evolve safely via API/schema versioning.

Saturday, 7 February 2026

Mastering Dynamics CRM Plugin Triggers: Pre-Validation, Pre-Operation, Post-Operation, and Async with Azure-Ready Patterns

Dynamics CRM plugin triggers define when your custom logic runs in the Dataverse pipeline. If you understand how Dynamics CRM plugin triggers behave across Pre-Validation, Pre-Operation, Post-Operation, and Asynchronous execution, you can write reliable, idempotent, and production-ready business logic that scales with Azure.

The Problem

Developers struggle to pick the correct stage and execution mode for Dynamics 365/Dataverse plugins, causing issues like recursion, lost transactions, or performance bottlenecks. You need clear rules, copy-paste-safe examples, and guidance on automation, security, and Azure integration without manual portal steps.

Prerequisites

• .NET 8 SDK installed (for companion services and automation)
• Power Platform Tools (PAC CLI) installed
• Azure CLI (az) installed, logged in with least-privilege account
• Access to a Dataverse environment and solution where you can register plugins
• Basic familiarity with IPlugin, IPluginExecutionContext, and IServiceProvider

The Solution (Step-by-Step)

1) Know the stages and when to use each

• Pre-Validation (Stage 10, synchronous): Validate input early, block bad requests before the main transaction. Good for authorization and schema checks.
• Pre-Operation (Stage 20, synchronous): Mutate Target before it’s saved. Good for defaulting fields, data normalization, or cross-field validation.
• Post-Operation (Stage 40, synchronous): Runs after the record is saved, still in the transaction. Good for operations that must be atomic with the main operation (e.g., child record creation that must roll back with parent).
• Post-Operation (Asynchronous): Offload non-transactional, latency-tolerant work (notifications, integrations). Improves throughput and user experience.

2) Messages and images

• Common messages: Create, Update, Delete, Assign, SetState, Associate/Disassociate, Merge, Retrieve/RetrieveMultiple (use sparingly to avoid performance impact).
• Filtering attributes (Update): Only trigger when specific columns change to reduce overhead.
• Images: Use Pre-Image for old values, Post-Image for new values. Keep images minimal to reduce payload and improve performance.

3) Synchronous Pre-Operation example (mutate data safely)

Target framework note: Dataverse runtime support for .NET versions can vary. The C# syntax below follows modern patterns while remaining compatible with the Dataverse plugin model. Always target the supported framework for your environment at build time.

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Extensions.DependencyInjection; // For building a small DI container inside the plugin
using System.Globalization;

// File-scoped namespace for clean organization
namespace Company.Plugins;

// The Dataverse runtime instantiates plugins via the public parameterless constructor.
public sealed class AccountNormalizeNamePlugin : IPlugin
{
    // Build a tiny DI container once per plugin instance to follow DI principles instead of static helpers.
    private readonly IServiceProvider _rootServices;

    public AccountNormalizeNamePlugin()
    {
        var services = new ServiceCollection();
        services.AddSingleton<INameNormalizer, TitleCaseNameNormalizer>();
        _rootServices = services.BuildServiceProvider();
    }

    public void Execute(IServiceProvider serviceProvider)
    {
        // Standard service access from the pipeline
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        // Guard: Ensure we only run on Update of account and when 'name' changes
        if (!string.Equals(context.PrimaryEntityName, "account", StringComparison.OrdinalIgnoreCase) ||
            !string.Equals(context.MessageName, "Update", StringComparison.OrdinalIgnoreCase))
        {
            return;
        }

        // Prevent recursion: depth should be 1 for first-level execution
        if (context.Depth > 1) return;

        var target = context.InputParameters.Contains("Target") ? context.InputParameters["Target"] as Entity : null;
        if (target == null) return;

        // Run only when 'name' was provided in this Update
        if (!target.Contains("name")) return;

        // Resolve our normalizer from DI
        var normalizer = _rootServices.GetRequiredService<INameNormalizer>();

        // Normalize 'name' to Title Case
        var originalName = target.GetAttributeValue<string>("name");
        var normalized = normalizer.Normalize(originalName);
        target["name"] = normalized;

        tracing.Trace($"AccountNormalizeNamePlugin: normalized '{originalName}' to '{normalized}'.");
    }
}

// Service abstraction for testability and SRP
public interface INameNormalizer
{
    string Normalize(string? input);
}

public sealed class TitleCaseNameNormalizer : INameNormalizer
{
    public string Normalize(string? input)
    {
        if (string.IsNullOrWhiteSpace(input)) return input ?? string.Empty;
        var textInfo = CultureInfo.InvariantCulture.TextInfo;
        return textInfo.ToTitleCase(input.Trim().ToLowerInvariant());
    }
}

Registration guidelines: Register this on account Update, Stage Pre-Operation (20), Synchronous, with filtering attributes = name. Add a minimal Pre-Image if you need original values.

4) Synchronous Post-Operation example (atomic child creation)

using System;
using Microsoft.Xrm.Sdk;

namespace Company.Plugins;

public sealed class ContactCreateWelcomeTaskPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        // Only run on Contact Create, after it is created (Post-Operation)
        if (!string.Equals(context.PrimaryEntityName, "contact", StringComparison.OrdinalIgnoreCase) ||
            !string.Equals(context.MessageName, "Create", StringComparison.OrdinalIgnoreCase))
        {
            return;
        }

        if (context.Depth > 1) return;

        var contactId = context.PrimaryEntityId;
        if (contactId == Guid.Empty) return;

        // Create a follow-up task; if this plugin throws, both contact and task roll back
        var task = new Entity("task");
        task["subject"] = "Welcome new contact";
        task["regardingobjectid"] = new EntityReference("contact", contactId);
        task["prioritycode"] = new OptionSetValue(1); // High
        service.Create(task);

        tracing.Trace("ContactCreateWelcomeTaskPlugin: created welcome task.");
    }
}

5) Asynchronous Post-Operation example (offload integration)

Use Async Post-Operation for non-transactional work such as calling Azure services. Prefer a durable, retry-enabled mechanism (queue, function) over direct HTTP. The plugin should enqueue a message; an Azure Function (managed identity) processes it.

using System;
using Microsoft.Xrm.Sdk;

namespace Company.Plugins;

public sealed class ContactCreatedEnqueueIntegrationPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        if (!string.Equals(context.PrimaryEntityName, "contact", StringComparison.OrdinalIgnoreCase) ||
            !string.Equals(context.MessageName, "Create", StringComparison.OrdinalIgnoreCase))
        {
            return;
        }

        // Idempotency key: use the contact id
        var contactId = context.PrimaryEntityId;
        if (contactId == Guid.Empty) return;

        // Example: write an integration record for downstream Azure Function (poll or Dataverse Change Tracking)
        // This avoids secrets and direct outbound calls from the plugin.
        var integrationLog = new Entity("new_integrationmessage"); // Custom table
        integrationLog["new_name"] = $"ContactCreated:{contactId}";
        integrationLog["new_payload"] = contactId.ToString();
        service.Create(integrationLog);

        tracing.Trace("ContactCreatedEnqueueIntegrationPlugin: queued integration message.");
    }
}

6) Automate registration with PAC CLI (no manual portal)

:: Batch/PowerShell snippet to build and register the assembly
:: 1) Build plugin project (target a runtime supported by your environment)
dotnet build .\src\Company.Plugins\Company.Plugins.csproj -c Release

:: 2) Pack into a solution if applicable
pac solution pack --zipfile .\dist\CompanySolution.zip --folder .\solution

:: 3) Import or update solution into the environment
pac auth create --url https://<yourorg>.crm.dynamics.com --cloud Public
pac solution import --path .\dist\CompanySolution.zip --activate-plugins

This keeps registration repeatable in CI/CD without manual steps.

7) Azure companion Minimal API (for outbound webhooks or admin tools)

For external processing, build a Minimal API or Azure Function with managed identity and Azure RBAC. Example Minimal API (.NET 8) that reads from Storage using DefaultAzureCredential.

using Azure;
using Azure.Identity;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Azure;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Use DefaultAzureCredential to prefer Managed Identity in Azure and dev fallbacks locally
builder.Services.AddAzureClients(az =>
{
    az.UseCredential(new DefaultAzureCredential());
    az.AddBlobServiceClient(new Uri(builder.Configuration["BLOB_ENDPOINT"]!));
});

var app = builder.Build();

// Simple endpoint to fetch a blob; secure this behind Azure AD (AAD) in production
app.MapGet("/files/{name}", async (string name, BlobServiceClient blobs) =>
{
    // Access container 'docs' with RBAC: Storage Blob Data Reader/Contributor on the Managed Identity
    var container = blobs.GetBlobContainerClient("docs");
    var client = container.GetBlobClient(name);

    if (!await container.ExistsAsync()) return Results.NotFound("Container not found.");
    if (!await client.ExistsAsync()) return Results.NotFound("Blob not found.");

    var stream = await client.OpenReadAsync();
    return Results.Stream(stream, "application/octet-stream");
});

await app.RunAsync();

Required Azure RBAC role for the app's managed identity: Storage Blob Data Reader (read-only) or Storage Blob Data Contributor (read-write) on the storage account or specific container scope.

8) IaC with Bicep: storage + managed identity + role assignment

// main.bicep
targetScope = 'resourceGroup'

param location string = resourceGroup().location
param storageName string
param identityName string = 'dv-plugin-mi'

// Storage Account
resource stg 'Microsoft.Storage/storageAccounts@2023-05-01' = {
  name: storageName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}

// User Assigned Managed Identity
resource uami 'Microsoft.ManagedIdentity/userAssignedIdentities@2023-01-31' = {
  name: identityName
  location: location
}

// Blob Data Reader role on storage for the identity
resource role 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(stg.id, uami.id, '2a2b9908-6ea1-4ae2-8e65-a410df84e7d1') // Storage Blob Data Reader
  scope: stg
  properties: {
    principalId: uami.properties.principalId
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', '2a2b9908-6ea1-4ae2-8e65-a410df84e7d1')
    principalType: 'ServicePrincipal'
  }
}

Deploy with: az deployment group create -g <rg> -f main.bicep -p storageName=<name>.

Best Practices & Security

Pick the right trigger

• Pre-Validation: Reject invalid input early (authorization, schema, required business rules).
• Pre-Operation: Mutate data before save, avoid external calls here.
• Post-Operation (sync): Keep logic small and deterministic to minimize transaction time.
• Post-Operation (async): Offload long-running and I/O-heavy work.

Recursion, idempotency, and performance

• Check context.Depth to prevent infinite loops.
• Use idempotency keys (primary entity id) in integration logs.
• Keep images and columns minimal; filter attributes to reduce trigger noise.
• Use AsNoTracking() in external EF Core services when reading data.

Security and authentication

• Use Azure AD and Managed Identity for external services; never store secrets in plugin code.
• Apply least privilege with Azure RBAC. Examples: Storage Blob Data Reader/Contributor for the app workload identity; Key Vault Secrets User if retrieving secrets via a separate process.
• In Dataverse, ensure the application user has the minimal security roles necessary for the operations (table-level privileges only on the entities it touches).

Automation and IaC

• Use PAC CLI and CI/CD to register and update plugins, avoiding manual portal steps.
• Use Bicep or azd to provision Azure resources, assign RBAC, and configure endpoints.

Error handling and resiliency

• Synchronous plugins should throw InvalidPluginExecutionException only for business errors that must roll back the transaction.
• For external work, prefer async steps that enqueue messages and rely on Azure Functions with retry policies and dead-letter queues (e.g., Azure Storage Queues or Service Bus).
• Trace key events with ITracingService for diagnosability.
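
A short sketch of the first bullet: a Pre-Validation guard that aborts the operation with a user-facing business error (the column name and message are illustrative):

using System;
using Microsoft.Xrm.Sdk;

namespace Company.Plugins;

public sealed class AccountCreditLimitGuardPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        var target = context.InputParameters.Contains("Target") ? context.InputParameters["Target"] as Entity : null;
        if (target is null) return;

        // Business rule: reject a negative credit limit early
        if (target.TryGetAttributeValue<Money>("creditlimit", out var credit) && credit.Value < 0)
        {
            // Throwing here cancels the operation and surfaces the message to the caller
            throw new InvalidPluginExecutionException("Credit limit cannot be negative.");
        }
    }
}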

Testing strategy

• Abstract logic behind interfaces and inject into the plugin to enable unit testing without Dataverse.
• Use fakes for IOrganizationService and validate behavior under different stages and messages.
• Add integration tests in a sandbox environment using PAC CLI to seed and verify behavior.
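
Because the plugin's logic lives behind INameNormalizer, the interesting behavior can be covered without Dataverse at all. An illustrative xUnit example for TitleCaseNameNormalizer:

using Company.Plugins;
using Xunit;

public sealed class TitleCaseNameNormalizerTests
{
    [Theory]
    [InlineData("  contoso LTD ", "Contoso Ltd")]
    [InlineData("ACME corp", "Acme Corp")]
    public void Normalize_ReturnsTitleCase(string input, string expected)
    {
        var sut = new TitleCaseNameNormalizer();
        Assert.Equal(expected, sut.Normalize(input));
    }

    [Fact]
    public void Normalize_ReturnsEmptyForNull()
    {
        var sut = new TitleCaseNameNormalizer();
        Assert.Equal(string.Empty, sut.Normalize(null));
    }
}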

References

• Azure RBAC built-in roles: https://learn.microsoft.com/azure/role-based-access-control/built-in-roles
• DefaultAzureCredential: https://learn.microsoft.com/dotnet/api/azure.identity.defaultazurecredential
• Power Platform CLI: https://learn.microsoft.com/power-platform/developer/cli/introduction

Summary

• Choose the correct trigger: Pre-Validation for guards, Pre-Operation for mutation, Post-Operation for atomic side-effects, Async for integrations.
• Enforce security: Managed Identity for auth, Azure RBAC with least privilege, and no secrets in code.
• Automate everything: PAC CLI for plugin registration, Bicep for Azure resources, and add retries and dead-lettering for resilient async flows.

Sunday, 1 February 2026

MCP Server in AI: A Complete Guide to the Model Context Protocol for Tool-Enabled AI

What Is an MCP Server in AI?

The term MCP server in AI refers to a server that implements the Model Context Protocol (MCP), a standardized way for AI clients (like chat assistants or agents) to securely access tools, data sources, and workflows. An MCP server exposes capabilities—such as APIs, databases, files, prompts, and utility functions—so AI systems can request them in a predictable, controlled manner.

Why MCP Matters

MCP creates a consistent contract between AI clients and external resources. Instead of bespoke integrations, developers can add or swap back-end capabilities with less friction. This improves maintainability, security, and reliability while enabling richer, more grounded AI behavior.

  • Standardization: One protocol to expose many tools/resources.
  • Security: Clear permissions and controlled access to data and actions.
  • Scalability: Add new tools or data sources without redesigning the AI client.
  • Traceability: Requests and responses are structured for logging and auditing.

How an MCP Server Works

At a high level, the AI client connects to an MCP server and discovers what it can do. The client then issues structured requests for actions or data, and the MCP server fulfills them via its configured tools and resources.

Core Components

  • Client: The AI application (chatbot/agent) that understands MCP and sends requests.
  • Server: The MCP endpoint that advertises capabilities and executes requests.
  • Tools: Actions the server can perform (e.g., call an API, run a query, send an email).
  • Resources: Data the server can read (files, database tables, knowledge bases).
  • Prompts/Templates: Reusable instruction blocks or chains the client can invoke via the server.
  • Sessions: Contextual interactions that can track state across multiple requests.

Typical Request Flow

  • Capability discovery: The client lists available tools/resources from the MCP server.
  • Request: The client sends a structured call (e.g., a tools/call request naming the tool and its arguments, as illustrated after this list).
  • Execution: The server runs the tool or fetches the resource safely and deterministically.
  • Response: The server returns results with metadata (status, content type, usage notes).
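
Concretely, MCP exchanges are JSON-RPC 2.0 messages. A simplified illustration of the discovery and invocation steps (the tool name and payloads are invented for this example; see the MCP specification for the full schemas):

// Discovery: the client asks which tools the server exposes
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

// Invocation: the client calls one tool with structured arguments
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_documents",
    "arguments": { "query": "quarterly revenue", "top": 5 }
  }
}

// Response: results the client can log, audit, and feed back to the model
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [ { "type": "text", "text": "3 matching documents found ..." } ]
  }
}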

Benefits for Teams and Developers

  • Faster integrations: Plug in new data sources or utilities via MCP without rewriting the client.
  • Access control: Gate sensitive operations and monitor usage centrally.
  • Consistency: Uniform patterns for error handling, timeouts, and retries.
  • Observability: Better logs and diagnostics for AI tool calls.

Use Cases and Examples

Enterprise Knowledge and Operations

  • Search internal documents: A tool that queries a document index or enterprise search.
  • Pull CRM records: Read-only resource access to customer profiles and activity history.
  • Create tickets: A tool to open an issue in a tracker with validated fields.

Data and Analytics

  • SQL query tool: Safely run parameterized queries against a data warehouse.
  • Metrics fetcher: Read metrics or dashboards for real-time insights.
  • Report generator: Produce summarized reports and export to files.

Automation and Productivity

  • Email sender: A tool to draft and send emails with approval steps.
  • Calendar manager: Create and modify events with conflict checks.
  • File utilities: Read, write, and transform files with strict path controls.

Security and Best Practices

  • Principle of least privilege: Expose only the tools and data needed.
  • Input validation: Enforce schemas and sanitize parameters for tools.
  • Audit logging: Log requests, results, and errors with minimal sensitive data.
  • Rate limiting and quotas: Prevent abuse and control costs.
  • Secrets management: Store API keys and credentials securely, never in prompts.

High-Level Setup Steps

  • Define capabilities: Identify which tools, resources, and prompts to expose.
  • Implement adapters: Connect to APIs, databases, and file systems with constrained permissions.
  • Describe schemas: Use structured inputs/outputs to ensure predictable behavior.
  • Configure policies: Authentication, authorization, and rate limits per tool or resource.
  • Test and observe: Validate responses, edge cases, and error handling with logs and metrics.

FAQ

Is an MCP server the same as a normal API?

No. An MCP server is a standardized interface purpose-built for AI clients to discover and use tools/resources consistently, whereas a normal API is typically application-specific.

Can I use MCP with existing systems?

Yes. You can wrap existing APIs, databases, or automation scripts as MCP tools/resources with appropriate permissions and validation.

How does MCP help with reliability?

By enforcing structured calls, typed parameters, and clear error semantics, MCP reduces ambiguity and makes failures easier to detect and recover from.

Key Takeaways

  • An MCP server in AI standardizes how AI clients access tools, data, and workflows.
  • It improves security, observability, and maintainability for AI-enabled applications.
  • Adopt best practices—least privilege, validation, logging—to run MCP safely at scale.

Monday, 26 January 2026

Call a React Component with TypeScript and Zod: A Step-by-Step, Production-Ready Pattern

The fastest way to call a component in React is to render it via JSX and pass strictly typed props. This article shows how to call a component in React with TypeScript and Zod so you get compile-time and runtime safety, clear state management, and production-ready patterns.

The Problem

Developers often "call" (render) a component without strict typing or validation, leading to runtime bugs, unclear state, and hard-to-test UI.

Prerequisites

Node.js 20+, pnpm or npm, React 19, TypeScript 5+, Zod 3+, a modern browser. Ensure tsconfig has strict: true.

The Solution (Step-by-Step)

Step 1: Bootstrap a minimal TypeScript + React app

// package.json (excerpt) - ensures React 19 and strict TS
{
  "name": "react-call-component-ts",
  "private": true,
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "tsc -b && vite build",
    "preview": "vite preview"
  },
  "dependencies": {
    "react": "^19.0.0",
    "react-dom": "^19.0.0",
    "zod": "^3.22.0"
  },
  "devDependencies": {
    "typescript": "^5.6.0",
    "vite": "^5.0.0",
    "@types/react": "^18.3.0",
    "@types/react-dom": "^18.3.0"
  }
}
// tsconfig.json - strict mode enabled for maximum safety
{
  "compilerOptions": {
    "target": "ES2022",
    "lib": ["ES2022", "DOM"],
    "jsx": "react-jsx",
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "strict": true,
    "noFallthroughCasesInSwitch": true,
    "noUncheckedIndexedAccess": true,
    "skipLibCheck": true
  },
  "include": ["src"]
}

Step 2: Create a strictly typed child component with runtime validation

// src/components/Greeting.tsx
import React, { memo } from "react";
import { z } from "zod";

// 1) Define compile-time props shape via TypeScript
export type GreetingProps = {
  name: string;                 // Required user name
  mode: "friendly" | "formal";  // Discriminated literal union for behavior
};

// 2) Define runtime schema using Zod for additional safety in production
const greetingPropsSchema = z.object({
  name: z.string().min(1, "name is required"),
  mode: z.union([z.literal("friendly"), z.literal("formal")])
});

// 3) React.memo to avoid unnecessary re-renders when props are stable
export const Greeting = memo(function Greeting(props: GreetingProps) {
  // Validate props at runtime to fail fast in dev and log issues in prod
  const result = greetingPropsSchema.safeParse(props);
  if (!result.success) {
    // Render a small fallback and log schema errors for debugging
    console.error("Greeting props invalid:", result.error.format());
    return <p role="alert">Invalid greeting config</p>;
  }

  // Safe, parsed props
  const { name, mode } = result.data;

  // Render based on discriminated union value
  if (mode === "friendly") {
    return <p>Hi, {name}! Welcome back.</p>;
  }

  return <p>Hello, {name}. It is good to see you.</p>;
});

Explanation: We "call" a component in React by placing it in JSX like <Greeting name="Sam" mode="friendly" />. The TypeScript type enforces correct usage at compile time; Zod enforces it at runtime.

Step 3: Manage parent state with discriminated unions and render the child

// src/App.tsx
import React, { useEffect, useState } from "react";
import { Greeting } from "./components/Greeting";

// Discriminated union for page state: guarantees exhaustive checks
type PageState =
  | { kind: "loading" }
  | { kind: "ready"; userName: string }
  | { kind: "error"; message: string };

export function App() {
  const [state, setState] = useState<PageState>({ kind: "loading" });

  // Simulate fetching the current user, then set ready state
  useEffect(() => {
    const timer = setTimeout(() => {
      // In a real app, replace with a fetch call and proper error handling
      setState({ kind: "ready", userName: "Sam" });
    }, 300);
    return () => clearTimeout(timer);
  }, []);

  // Render different UI based on discriminated union state
  if (state.kind === "loading") {
    return <p>Loading…</p>;
  }

  if (state.kind === "error") {
    return <p>Error: {state.message}</p>;
  }

  // Key line: this is how you "call" (render) the component with props
  return (
    <main>
      <h1>Dashboard</h1>
      <Greeting name={state.userName} mode="friendly" />

      {/* Rendering a list of components safely */}
      {(["Ada", "Linus", "Grace"] as const).map((n) => (
        <Greeting key={n} name={n} mode="formal" />
      ))}
    </main>
  );
}

Step 4: Mount the app

// src/main.tsx
import React from "react";
import { createRoot } from "react-dom/client";
import { App } from "./App";

const container = document.getElementById("root");
if (!container) throw new Error("Root container missing");

createRoot(container).render(
  // StrictMode helps surface potential issues
  <React.StrictMode>
    <App />
  </React.StrictMode>
);

Best Practices & Security

Pro-Tip: Use React.memo for presentational components to avoid unnecessary re-renders.

Pro-Tip: Use discriminated unions for UI state to guarantee exhaustive handling and safer refactors.

Pro-Tip: Validate at runtime with Zod for boundary inputs (API responses, query params, environment-driven config).

Pro-Tip: Prefer useCallback and stable prop shapes when passing callbacks to memoized children.
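
A small sketch of that tip (component and handler names are illustrative):

// Stable callback passed to a memoized child
import React, { memo, useCallback, useState } from "react";

type CounterButtonProps = { onIncrement: () => void };

// Memoized child: re-renders only when its props change
const CounterButton = memo(function CounterButton({ onIncrement }: CounterButtonProps) {
  return <button onClick={onIncrement}>Increment</button>;
});

export function CounterPanel() {
  const [count, setCount] = useState(0);

  // useCallback keeps the same function identity across renders,
  // so CounterButton is not re-rendered on every parent update
  const handleIncrement = useCallback(() => setCount(c => c + 1), []);

  return (
    <div>
      <p>Count: {count}</p>
      <CounterButton onIncrement={handleIncrement} />
    </div>
  );
}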

Pro-Tip: Keep components pure; avoid hidden side effects inside render logic.

Security note (front-end): Do not embed secrets in the client. If you integrate with Azure or any backend, call a secured API instead of accessing resources directly from the browser.

Security note (Azure backend integration): Use Managed Identity and DefaultAzureCredential in the server/API, not the frontend. Grant the server's managed identity least-privilege RBAC roles only. Example: for Azure Storage reads, assign Storage Blob Data Reader to the API identity at the specific container scope.

Security note (data flow): Validate user input and API responses at the edge (API) with Zod or similar, then keep the front-end strictly typed.

Summary

• You call a component in React by rendering it in JSX with strictly typed, validated props.

• Discriminated unions make UI state predictable, and React.memo boosts performance.

• For real backends, keep secrets server-side, use Managed Identity with least-privilege RBAC, and validate at the edge.

Testing Quickstart

Test a component render with React Testing Library

// src/components/Greeting.test.tsx
import React from "react";
import { render, screen } from "@testing-library/react";
import "@testing-library/jest-dom";
import { Greeting } from "./Greeting";

test("renders friendly greeting", () => {
  render(<Greeting name="Sam" mode="friendly" />);
  expect(screen.getByText(/Hi, Sam!/)).toBeInTheDocument();
});

test("renders formal greeting", () => {
  render(<Greeting name="Ada" mode="formal" />);
  expect(screen.getByText(/Hello, Ada\./)).toBeInTheDocument();
});

This test verifies the component is "called" with valid props and renders deterministic output. For invalid props, assert that the fallback appears and console error is triggered.

React 19: The Practical Difference Between Hooks and Components (With TypeScript and Azure Integration)

The difference between react hooks and components matters because it defines how you separate logic from presentation. Problem: teams mix stateful logic directly inside UI and struggle to test, reuse, and scale. Solution: put data fetching, validation, and side effects in reusable hooks; keep rendering in lean components. Value: cleaner architecture, easier testing, fewer bugs, and production-ready integration with Azure using least privilege.

The Problem

Developers often blur the line between where logic lives (hooks) and where UI renders (components). This leads to duplicated code, tangled effects, and UI tests that are slow and brittle. We need a clear pattern: hooks encapsulate logic and I/O; components focus on layout and accessibility.

Prerequisites

Node.js v20+, TypeScript 5+ with strict mode, React 19, TanStack Query v5+, Zod v3+, Azure Functions Core Tools v4+, .NET 8 SDK, Azure CLI (or azd), and a browser-compatible fetch API.

The Solution (Step-by-Step)

1) Define strict runtime and compile-time types

// src/schemas/user.ts
import { z } from "zod";

// Zod schema for runtime validation and safe parsing
export const UserSchema = z.object({
  id: z.string().uuid(),
  email: z.string().email(),
  name: z.string().min(1),
});

export const UsersSchema = z.array(UserSchema);

export type User = z.infer<typeof UserSchema>;

2) Create a focused hook: logic, data fetching, and validation

// src/hooks/useUsers.ts
import { useMemo } from "react";
import { useQuery, QueryClient } from "@tanstack/react-query";
import { UsersSchema, type User } from "../schemas/user";

// Discriminated union for explicit UI states
export type UsersState =
  | { status: "loading" }
  | { status: "error"; error: string }
  | { status: "success"; data: ReadonlyArray<User> };

// Fetch function with runtime validation and descriptive errors
async function fetchUsers(): Promise<ReadonlyArray<User>> {
  const res = await fetch("/api/users", { headers: { "accept": "application/json" } });
  if (!res.ok) {
    // Include status for observability; avoid leaking server internals
    throw new Error(`Request failed: ${res.status}`);
  }
  const json = await res.json();
  // Validate and coerce; throws if shape is wrong
  return UsersSchema.parse(json);
}

export function useUsers(): UsersState {
  const { data, error, status } = useQuery({
    queryKey: ["users"],
    queryFn: fetchUsers,
    staleTime: 60_000, // cache for 1 minute
    retry: 2,          // conservative retry policy
  });

  // Map TanStack Query status to a strict discriminated union for the UI
  return useMemo((): UsersState => {
    if (status === "pending") return { status: "loading" };
    if (status === "error") return { status: "error", error: (error as Error).message };
    // At this point data is defined and validated by Zod
    return { status: "success", data: data ?? [] };
  }, [status, error, data]);
}

// Optional: provide a QueryClient at the app root (copy-paste ready)
export const queryClient = new QueryClient();

// In your app root (e.g., src/main.tsx):
// import { createRoot } from "react-dom/client";
// import { QueryClientProvider } from "@tanstack/react-query";
// import { queryClient } from "./hooks/useUsers";
// import { App } from "./App";
// createRoot(document.getElementById("root")!).render(
//   <QueryClientProvider client={queryClient}>
//     <App />
//   </QueryClientProvider>
// );

3) Keep components presentational and accessible

// src/components/UsersList.tsx
import React from "react";
import { useUsers } from "../hooks/useUsers";

// Functional component focuses on rendering and accessibility
export function UsersList(): React.JSX.Element {
  const state = useUsers();

  if (state.status === "loading") {
    // Keep loading states lightweight and non-blocking
    return <p role="status" aria-live="polite">Loading users...</p>;
  }

  if (state.status === "error") {
    // Display a user-friendly message without revealing internals
    return <p role="alert">Could not load users. Please try again.</p>;
  }

  // Success path: minimal, semantic markup
  return (
    <ul aria-label="Users">
      {state.data.map(u => (
        <li key={u.id}>{u.name} ({u.email})</li>
      ))}
    </ul>
  );
}

4) Optional Azure back end: least-privilege, Managed Identity

This example shows an Azure Functions .NET 8 HTTP API that returns users. It authenticates to Azure Cosmos DB using DefaultAzureCredential and a system-assigned Managed Identity, avoiding connection strings. Assign only the necessary RBAC role.

// FunctionApp/Program.cs (.NET 8 isolated worker)
using Azure.Identity; // DefaultAzureCredential
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.DependencyInjection;
using System.Net;
using Azure.Core;
using Microsoft.Azure.Cosmos; // Cosmos DB .NET SDK v3 (Microsoft.Azure.Cosmos package)

var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    .ConfigureServices(services =>
    {
        // Use managed identity via DefaultAzureCredential
        services.AddSingleton<TokenCredential>(_ => new DefaultAzureCredential());

        services.AddSingleton<CosmosClient>(sp =>
        {
            var credential = sp.GetRequiredService<TokenCredential>();
            // Endpoint from configuration (no keys). Use App Settings.
            var endpoint = Environment.GetEnvironmentVariable("COSMOS_ENDPOINT")!;
            return new CosmosClient(endpoint, credential);
        });
    })
    .Build();

await host.RunAsync();

// FunctionApp/GetUsers.cs
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using System.Net;
using Microsoft.Azure.Cosmos;
using System.Text.Json;

namespace FunctionApp;

public class GetUsers(CosmosClient cosmos)
{
    // HTTP-triggered function returning JSON users
    [Function("GetUsers")] 
    public async Task<HttpResponseData> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "users")] HttpRequestData req)
    {
        var db = cosmos.GetDatabase("app");
        var container = db.GetContainer("users");

        var iterator = container.GetItemQueryIterator<UserDoc>("SELECT c.id, c.email, c.name FROM c");
        var results = new List<UserDoc>();
        while (iterator.HasMoreResults)
        {
            var page = await iterator.ReadNextAsync();
            results.AddRange(page);
        }

        var res = req.CreateResponse(HttpStatusCode.OK);
        res.Headers.Add("Content-Type", "application/json; charset=utf-8");
        await res.WriteStringAsync(JsonSerializer.Serialize(results));
        return res;
    }
}

public record UserDoc(string id, string email, string name);

Required RBAC (principle of least privilege): Assign the Function App's system-assigned identity the Cosmos DB Built-in Data Reader role scoped to the specific database or container. Avoid account-level permissions.

5) Minimal IaC for role assignment (Azure Bicep)

// main.bicep: Cosmos DB data-plane role assignment for the Function App's managed identity
// Note: Cosmos DB data-plane access uses sqlRoleAssignments on the account, not Microsoft.Authorization role assignments
param cosmosAccountName string    // Cosmos DB account name
param functionPrincipalId string  // Function App's system-assigned identity principalId

resource cosmosAccount 'Microsoft.DocumentDB/databaseAccounts@2023-04-15' existing = {
  name: cosmosAccountName
}

// Cosmos DB Built-in Data Reader role definition (read-only data access)
var dataReaderRoleDefinitionId = resourceId('Microsoft.DocumentDB/databaseAccounts/sqlRoleDefinitions', cosmosAccountName, '00000000-0000-0000-0000-000000000001')

resource roleAssignment 'Microsoft.DocumentDB/databaseAccounts/sqlRoleAssignments@2023-04-15' = {
  parent: cosmosAccount
  name: guid(functionPrincipalId, cosmosAccount.id, dataReaderRoleDefinitionId)
  properties: {
    roleDefinitionId: dataReaderRoleDefinitionId
    principalId: functionPrincipalId
    scope: '${cosmosAccount.id}/dbs/app' // least privilege: scope to the "app" database only
  }
}

Note: Use azd to provision and configure environment variables like COSMOS_ENDPOINT. Never embed secrets or connection strings in code.

6) Wire up the React client to the Azure Function

// src/hooks/useUsers.ts (override the URL to your deployed Function App)
async function fetchUsers(): Promise<ReadonlyArray<User>> {
  const res = await fetch(import.meta.env.VITE_API_BASE + "/users", {
    headers: { accept: "application/json" },
    credentials: "include", // if using auth; otherwise omit
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  const json = await res.json();
  return UsersSchema.parse(json);
}
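
Since step 6 reads import.meta.env.VITE_API_BASE, it helps to declare that variable for the compiler. Below is a minimal sketch, assuming a Vite project; the file name follows Vite's convention and VITE_API_BASE is the only custom variable assumed here.

// src/vite-env.d.ts
/// <reference types="vite/client" />

interface ImportMetaEnv {
  // Base URL of the deployed Function App, e.g. https://<your-app>.azurewebsites.net/api
  readonly VITE_API_BASE: string;
}

interface ImportMeta {
  readonly env: ImportMetaEnv;
}

Set VITE_API_BASE in a local .env file for development and as a build-time variable in CI/CD for deployed environments, rather than hard-coding the URL.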

7) Testing hooks and components separately

// tests/useUsers.test.tsx
import { describe, it, expect } from "vitest";
import React from "react";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { renderHook, waitFor } from "@testing-library/react";
import { useUsers } from "../src/hooks/useUsers";

function wrapper({ children }: { children: React.ReactNode }) {
  const client = new QueryClient();
  return <QueryClientProvider client={client}>{children}</QueryClientProvider>;
}

describe("useUsers", () => {
  it("returns success after fetching", async () => {
    global.fetch = async () => new Response(JSON.stringify([]), { status: 200 });
    const { result } = renderHook(() => useUsers(), { wrapper });

    await waitFor(() => {
      expect(result.current.status).toBe("success");
    });
  });
});

// tests/UsersList.test.tsx
import { describe, it, expect } from "vitest";
import React from "react";
import { render, screen } from "@testing-library/react";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { UsersList } from "../src/components/UsersList";

function renderWithQuery(ui: React.ReactElement) {
  const client = new QueryClient();
  return render(<QueryClientProvider client={client}>{ui}</QueryClientProvider>);
}

describe("UsersList", () => {
  it("renders loading state", () => {
    renderWithQuery(<UsersList />);
    expect(screen.getByRole("status")).toHaveTextContent(/loading/i);
  });
});
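
The error path is worth its own test. Below is a minimal sketch using the same setup as above; note that useUsers sets retry: 2, so the error state only appears after the retries finish, which is why the timeouts are extended.

// tests/UsersList.error.test.tsx
import { describe, it, expect } from "vitest";
import React from "react";
import { render, screen, waitFor } from "@testing-library/react";
import { QueryClient, QueryClientProvider } from "@tanstack/react-query";
import { UsersList } from "../src/components/UsersList";

describe("UsersList (error state)", () => {
  it("renders an alert when the request fails", async () => {
    // Simulate a failing API response; res.ok is false, so fetchUsers throws
    global.fetch = async () => new Response("Server error", { status: 500 });

    const client = new QueryClient();
    render(
      <QueryClientProvider client={client}>
        <UsersList />
      </QueryClientProvider>
    );

    // The hook retries twice with backoff before surfacing the error state
    await waitFor(
      () => expect(screen.getByRole("alert")).toHaveTextContent(/could not load/i),
      { timeout: 10_000 }
    );
  }, 15_000);
});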

Best Practices & Security

Hooks own side effects; components remain pure and predictable. Validate all external data with Zod and use strict TypeScript to catch issues at compile time. For Azure, prefer Managed Identity with DefaultAzureCredential and apply the smallest RBAC scope required. Keep API base URLs and configuration in environment variables managed by azd or your CI/CD system, not in source. For database reads with Entity Framework in .NET APIs, use AsNoTracking() to avoid unnecessary change tracking.

Summary

Hooks encapsulate reusable logic, I/O, and validation, while components render UI and stay testable. Strong typing with Zod and discriminated unions keeps state explicit and safe. Azure integration is secure with Managed Identity, least-privilege RBAC, and IaC via azd or Bicep.

SharePoint List vs Library: Key Differences, Use Cases, and Best Practices

Overview: What’s the Difference Between a List and a Library in SharePoint?

The primary question many teams ask is the difference between list and library in SharePoint. In simple terms, a SharePoint list manages rows of data (like a table), while a SharePoint document library manages files and their metadata. Understanding how they differ helps you choose the right container for your content and build a scalable information architecture.

Core Definitions

What is a SharePoint List?

A list stores structured data as items, similar to a spreadsheet or database table. Each item contains columns (text, number, choice, date, person, lookup, etc.). Lists are ideal for tracking processes and records that are not file-based.

  • Examples: Issue tracker, asset inventory, change requests, event registrations.
  • Typical columns: Status, Priority, Due Date, Assigned To, Category.

What is a SharePoint Document Library?

A document library stores files (documents, images, PDFs) plus metadata about those files. Libraries are designed for document-centric collaboration with rich file features.

  • Examples: Policies and procedures, project documents, design assets, client deliverables.
  • Typical metadata: Document Type, Owner, Project, Department, Confidentiality.

Key Differences at a Glance

  • Primary content: Lists store items (rows of data); libraries store files with metadata.
  • File handling: Libraries support check-in/out, file previews, co-authoring, and Office integration; lists don’t need file operations.
  • Versioning: Lists track item versions; libraries track both file and metadata versions with richer controls.
  • Templates & content types: Libraries often use document content types (e.g., Policy, Contract) with specific templates; lists use item content types.
  • Views & formatting: Both support custom views and conditional formatting; libraries add file-centric filters (e.g., by file type).
  • Automation: Both integrate with Power Automate; libraries frequently use flows for approvals and publishing.
  • Permissions: Both support unique permissions; libraries commonly secure folders or documents for compliance.

When to Use a List vs. a Library

Choose a List When

  • You track structured records without needing to store a file per record.
  • You need form-based data entry and validation across many columns.
  • You want lightweight workflows for requests, approvals, or status tracking.
  • You plan to integrate with Power Apps to build a data-driven app.

Choose a Library When

  • Your primary asset is a file (Word, Excel, PowerPoint, PDF, image, CAD).
  • You need co-authoring, track changes, and document version history.
  • You require document sets to group related files with shared metadata.
  • You want retention labels, records management, and approval workflows.

Practical Examples

Example 1: IT Asset Tracking (List)

Create a list with columns such as Asset Tag (single line), Model (choice), Assigned To (person), Purchase Date (date), Warranty Expiry (date), and Status (choice). Build views for “Assigned” and “In Repair”. Automate notifications when Warranty Expiry is within 30 days.

Example 2: Policy Management (Library)

Use a library with metadata: Policy Type (choice), Owner (person), Review Cycle (choice), Effective Date (date), Compliance Tag (choice). Enable major/minor versioning, check-out, and an approval flow. Use views for “Pending Review” and “Effective Policies.”

Example 3: Project Delivery Docs (Library with Document Sets)

Create a library using Document Sets for each project. Metadata like Client, Project Manager, Phase, and Confidentiality classify files. Configure folders or sets with unique permissions for client-specific access.

Power Features and Governance

Versioning and Check-In/Out

Libraries provide robust versioning for files, enabling approval, drafts, and rollbacks. Lists also version items, which is useful for audit trails on data changes.

Metadata and Content Types

Both support custom columns and content types. Use site columns to enforce consistency across sites. For libraries, align document content types with templates and approval policies.

Views, Filters, and Formatting

Use views like Group By, conditional formatting, and filters to surface relevant content. In libraries, combine metadata-driven navigation with pinned filters to flatten folder hierarchies.

Automation and Integrations

Leverage Power Automate for alerts, approvals, and review reminders. Use Power Apps to create forms for lists (e.g., requests), and Office desktop/web apps for library co-authoring.

Performance and Limits

  • Thresholds: Both are affected by the list view threshold (commonly 5,000 items for certain operations). Use indexed columns and filtered views to scale; see the query sketch after this list.
  • File handling: Libraries include file size limits and supported types; consider chunked uploads and OneDrive sync for large files.
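
To illustrate filtering on an indexed column, the sketch below queries a list through the SharePoint REST API from TypeScript. The list title ("Assets"), the column names, and the assumption of an already-authenticated browser session (for example, inside an SPFx web part) are placeholders for this example.

// A minimal sketch: query a large list by an indexed column, keeping each request small
async function getAssignedAssets(siteUrl: string): Promise<unknown[]> {
  const url =
    `${siteUrl}/_api/web/lists/getbytitle('Assets')/items` +
    `?$select=Title,AssetTag,Status` +
    `&$filter=Status eq 'Assigned'` + // filter on an indexed column to avoid throttling
    `&$top=500`;                      // page size well under the 5,000-item threshold

  const res = await fetch(url, {
    headers: { accept: "application/json;odata=nometadata" },
    credentials: "include", // reuse the signed-in SharePoint session
  });
  if (!res.ok) {
    throw new Error(`SharePoint request failed: ${res.status}`);
  }
  const json = await res.json();
  return json.value as unknown[];
}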

Security and Compliance

  • Apply sensitivity labels and retention labels to libraries holding regulated documents.
  • Use unique permissions sparingly; favor SharePoint groups and inheritance to keep access manageable.
  • Enable auditing in Purview/M365 for critical lists and libraries.

Quick Decision Guide

  • If you primarily manage data records without files, choose a List.
  • If you primarily manage files and need collaboration features, choose a Library.
  • Combine both when needed: store requests in a list and link to documents in a library via lookup columns.

Best Practices

  • Design metadata first to enable better search, filters, and governance.
  • Favor views over deep folders, especially in libraries.
  • Standardize with site columns and content types for consistency.
  • Document naming conventions and permissions to reduce confusion.
  • Train users on co-authoring, versioning, and approvals in libraries.

FAQ

Can a list store files?

Lists can include an attachment per item, but this is limited and lacks rich document management features. For file-centric work, use a library.
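
For completeness, adding an attachment programmatically looks like the sketch below, using the SharePoint REST API from TypeScript. The list title ("Tasks"), the item id, and the assumption of an already-authenticated browser session are placeholders for this example.

// A minimal sketch: attach a file to a list item via the SharePoint REST API
async function addAttachment(siteUrl: string, itemId: number, fileName: string, content: Blob): Promise<void> {
  // REST writes require a form digest token
  const digestRes = await fetch(`${siteUrl}/_api/contextinfo`, {
    method: "POST",
    headers: { accept: "application/json;odata=nometadata" },
    credentials: "include",
  });
  const { FormDigestValue } = await digestRes.json();

  const res = await fetch(
    `${siteUrl}/_api/web/lists/getbytitle('Tasks')/items(${itemId})/AttachmentFiles/add(FileName='${fileName}')`,
    {
      method: "POST",
      headers: { accept: "application/json;odata=nometadata", "X-RequestDigest": FormDigestValue },
      body: content,
      credentials: "include",
    }
  );
  if (!res.ok) {
    throw new Error(`Attachment upload failed: ${res.status}`);
  }
}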

Can I convert a list to a library?

No direct conversion exists. Instead, create a library, migrate files, and map metadata. Keep the list for tracking if needed.

Do both support Power Automate?

Yes. Triggers and actions exist for both list items and library documents, enabling approvals, notifications, and archival flows.

What Is a Document Set in SharePoint? Definition, Benefits, and Best Practices

What Is a Document Set in SharePoint?

A Document Set in SharePoint is a special content type that lets you manage multiple related documents as a single unit. Think of it like a project or case folder with its own metadata, shared versioning, and standardized templates that apply to every file inside. Document Sets streamline document management by grouping files that belong together—such as proposals, briefs, and reports—so teams can work consistently and efficiently.

Key Benefits of Using Document Sets

  • Unified metadata: Apply shared properties (e.g., Client, Project ID, Case Number) to the entire set and inherit them across all documents.
  • Consistent templates: Start each set with predefined document templates (like a cover sheet, briefing note, and checklist) to enforce standards.
  • Batch operations: Move, copy, share, or archive the entire set as one unit, reducing manual steps and errors.
  • Versioning at set level: Capture milestones of the whole set, not just individual files, for complete auditability.
  • Improved governance: Centrally control content types, policies, and workflows for entire document collections.
  • Better findability: Search and filter by shared metadata so related files surface together.
  • Repeatable processes: Package best-practice structure into a reusable set for repeat scenarios.

Real-World Examples

Marketing Campaign Kit

  • Templates: Creative brief, timeline, asset checklist, budget sheet.
  • Shared metadata: Campaign name, region, launch date, product line.
  • Outcome: Faster kickoff and consistent deliverables across teams.

Client Project Workspace

  • Templates: Statement of Work, Project Plan, Risk Log, Status Report.
  • Shared metadata: Client, Project ID, Account Manager, Phase.
  • Outcome: Centralized visibility and fewer filing mistakes.

Legal Case File

  • Templates: Case summary, evidence index, correspondence log.
  • Shared metadata: Case number, matter type, jurisdiction, confidentiality level.
  • Outcome: Strong compliance and easier audits.

How Document Sets Work

Document Sets are built on SharePoint content types. You enable the Document Set feature, create a new Document Set content type, assign templates and metadata, and add it to a library. Users then create a new set just like they would create a new folder—except it comes preconfigured with rules, templates, and shared properties.

Step-by-Step: Setting Up a Document Set

  • Enable the feature: Ensure the Document Set feature is activated at the site collection level (SharePoint Online has it available by default in most scenarios).
  • Create a content type: In Site Settings, create a new content type that inherits from Document Set.
  • Define metadata: Add site columns (e.g., Client, Project ID) that will apply across the set.
  • Add templates: Upload starter files (DOCX, XLSX, PPTX, etc.) to the Document Set so each new set is pre-populated.
  • Configure welcome page: Customize the Document Set home (welcome) page to guide users with instructions, links, and key properties.
  • Add to library: Add your Document Set content type to the target document library and set it as default if desired.
  • Permissions and policies: Apply permissions, retention labels, and workflows as needed.

Best Practices for SharePoint Document Sets

  • Design metadata first: Standardize site columns and content types to avoid future refactoring.
  • Keep it simple: Limit required fields to what users can reliably fill in during creation.
  • Template discipline: Use a minimal, approved set of templates to avoid clutter and confusion.
  • Automate where possible: Use Power Automate to create sets from requests, populate metadata, or move to an archive library at project close.
  • Govern naming: Enforce naming conventions (e.g., PROJ-1234 - Client - Phase) via guidance or automation.
  • Secure the set: If needed, break inheritance on the set to restrict access, but use sparingly to reduce admin overhead.
  • Train and document: Provide a short guide on when to use Document Sets vs. folders or standard libraries.

When to Use Document Sets vs. Alternatives

  • Use Document Sets when: You need shared metadata, standardized templates, and milestone versioning across multiple related files.
  • Use standard folders when: You only need lightweight grouping without metadata or templates.
  • Use separate libraries when: You need distinct permissions, advanced retention, or unique workflows per group.

Limitations and Considerations

  • Sync and OneDrive: Document Sets behave like folders in sync clients, but advanced features (welcome page) are web-only.
  • M365 sensitivity labels: Apply labels thoughtfully at the library or item level to avoid conflicts with set-level permissions.
  • Migrations: Ensure your migration tool supports Document Sets, content types, and metadata mapping.
  • External sharing: Verify sharing policies; sharing a set exposes all items inside.
  • Mobile experience: Core functions work, but configuration and welcome page customization are best on web.

Quick FAQ

Is a Document Set the same as a folder?

No. While it looks like a folder, a Document Set adds shared metadata, templates, a welcome page, and set-level versioning and policies.

Can I use approvals and workflows?

Yes. You can trigger flows on set creation, status changes, or on items within the set using Power Automate.

Does search recognize Document Sets?

Yes. Shared properties help group results, and you can refine search by Document Set metadata.

Conclusion

Document Sets in SharePoint provide a structured, repeatable way to manage related content with consistent metadata, templates, and lifecycle governance. When designed thoughtfully, they reduce errors, accelerate delivery, and improve compliance across projects, cases, and campaigns.

Saturday, 24 January 2026

Integrate Bootstrap and jQuery in SPFx


1. Create your solution with "No JavaScript Framework"

2. Install jQuery and Bootstrap using the npm commands below. (This example assumes Bootstrap 3 or 4, whose modal plugin depends on jQuery; Bootstrap 5 removed jQuery support and uses different markup.)

npm install jquery --save
npm install @types/jquery --save
npm install bootstrap --save
npm install @types/bootstrap --save
npm install @types/jqueryui --save

3. Add the references in the web part file where you want to use Bootstrap and jQuery

import * as jQuery from 'jquery';
import 'bootstrap/dist/css/bootstrap.css';
import 'bootstrap/dist/js/bootstrap.js';

4. Suppose we want to show a Bootstrap modal. Paste the HTML below into the markup that the render method assigns to this.domElement.innerHTML

<button type="button" id="btnModel" class="${styles.button}">Open Modal</button>
<div class="modal fade" id="myModal" role="dialog">
    <div class="modal-dialog">
        <div class="modal-content">
            <div class="modal-header">
                <button type="button" class="close" data-dismiss="modal">&times;</button>
                <h4 class="modal-title">Modal Header</h4>
            </div>
            <div class="modal-body">
                <p>Welcome bootstrap model popup in sharepoint framework client side webpart</p>
            </div>
            <div class="modal-footer">
                <button type="button" class="btn btn-default" data-dismiss="modal">Close</button>
            </div>
        </div>
    </div>
</div>



5. Create a function that opens the modal popup on the button's click event

private PageLoad(): void {
  jQuery("#btnModel").on("click", () => {
    jQuery("#myModal").modal("show");
  });
}

6. Call the PageLoad() function inside the render method, after the HTML has been bound

this.PageLoad();
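
Putting it together, the render method of a no-framework web part might look like the sketch below. The class name, the styles import, and the trimmed-down modal markup are illustrative; use the full HTML from step 4 in your own web part.

public render(): void {
  // Bind the markup from step 4, then wire up the click handler from step 5
  this.domElement.innerHTML = `
    <button type="button" id="btnModel" class="${styles.button}">Open Modal</button>
    <div class="modal fade" id="myModal" role="dialog">
      <div class="modal-dialog">
        <div class="modal-content">
          <div class="modal-body">
            <p>Welcome Bootstrap modal popup in a SharePoint Framework web part</p>
          </div>
        </div>
      </div>
    </div>`;

  this.PageLoad();
}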






Send Birthday Email from SharePoint Online using Flow


Below are the steps to configure a flow that sends birthday emails based on dates stored in a SharePoint list.


1. Configure the Recurrence trigger (typically once per day for a birthday check).

2. Use the Get items action to read the items from the SharePoint list.



3. Use an Apply to each loop over the returned items to check the condition and send the email.


4. Configure the Condition action as follows.

Left side (expression):

if(equals(items('Apply_to_each')?['DateOfBirth'],null),'',formatDateTime(items('Apply_to_each')?['DateOfBirth'], 'MM/dd'))

Operator: is equal to

Right side (expression):

formatDateTime(utcNow(), 'MM/dd')

5. Add a Send an email action in the 'If yes' branch to send the birthday email when the condition is true.