Showing posts with label Azure. Show all posts

Sunday, 22 February 2026

Surface What’s New: Power Apps integration with a secure .NET 8 API for MCP server updates

The question "what’s new in Power Apps for MCP server" lacks a precise product definition, so rather than speculate on features, this guide shows how to reliably surface and version "What’s New" updates in Power Apps using a secure .NET 8 backend, strict TypeScript validation, and Azure-native security. You get a production-ready pattern that Power Apps calls via a Custom Connector, keeping your release notes current without manual edits.

The Problem

Teams need a trustworthy way to display "What’s New" for MCP server in Power Apps, but the upstream source and format of updates can change. Hardcoding content or querying unsecured endpoints leads to drift, security gaps, and poor developer experience.

Prerequisites

  • .NET 8 SDK
  • Node.js 20+ and a package manager (pnpm/npm)
  • Azure subscription with permissions to create: Azure Functions or Container Apps, Key Vault, Azure API Management, Application Insights
  • Entra ID app registration for the API (OAuth 2.0)
  • Power Apps environment (to build a Custom Connector)

The Solution (Step-by-Step)

1) .NET 8 Minimal API that normalizes "What’s New" items

This minimal API demonstrates production-ready design: DI-first, HttpClientFactory, global exception handling, validation, versioned contract, and managed identity for downstream access.

// Program.cs (.NET 8 minimal API, top-level statements, DI-centric)
using System.Net.Http.Json;
using System.Text.Json;
using System.Text.Json.Serialization;
using Azure.Identity; // DefaultAzureCredential for managed identity
using Azure.Security.KeyVault.Secrets;
using Microsoft.AspNetCore.Diagnostics;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Azure;

var builder = WebApplication.CreateBuilder(args);

// Strongly typed options for upstream source configuration
builder.Services.Configure<NewsOptions>(builder.Configuration.GetSection("News"));

// HttpClient with resilient handler
builder.Services.AddHttpClient<INewsClient, NewsClient>()
    .SetHandlerLifetime(TimeSpan.FromMinutes(5));

// Azure clients via DefaultAzureCredential (uses Managed Identity in Azure)
builder.Services.AddAzureClients(azure =>
{
    azure.UseCredential(new DefaultAzureCredential());
    var kvUri = builder.Configuration["KeyVaultUri"]; // e.g., https://my-kv.vault.azure.net/
    if (!string.IsNullOrWhiteSpace(kvUri))
    {
        azure.AddSecretClient(new Uri(kvUri));
    }
});

// Application Insights (OpenTelemetry auto-collection can also be used)
builder.Services.AddApplicationInsightsTelemetry();

var app = builder.Build();

// Global exception handler producing problem+json
app.UseExceptionHandler(errorApp =>
{
    errorApp.Run(async context =>
    {
        var feature = context.Features.Get<IExceptionHandlerPathFeature>();
        var problem = new ProblemDetails
        {
            Title = "Unexpected error",
            Detail = app.Environment.IsDevelopment() ? feature?.Error.ToString() : "",
            Status = StatusCodes.Status500InternalServerError,
            Type = "https://httpstatuses.com/500"
        };
        context.Response.StatusCode = problem.Status ?? 500;
        context.Response.ContentType = "application/problem+json";
        await context.Response.WriteAsJsonAsync(problem);
    });
});

app.MapGet("/health", () => Results.Ok(new { status = "ok" }));

// Versioned route for the normalized news feed (v1)
app.MapGet("/api/v1/news", async (
    INewsClient client
) => Results.Ok(await client.GetNewsAsync()))
.WithName("GetNewsV1")
.Produces<NewsItemV1[]>(StatusCodes.Status200OK)
.ProducesProblem(StatusCodes.Status500InternalServerError);

app.Run();

// Options to control the upstream feed location and parsing mode
public sealed class NewsOptions
{
    public string? SourceUrl { get; init; } // Upstream JSON or RSS converted via a worker
    public string Format { get; init; } = "json"; // json|rss (extend as needed)
}

// Public DTO contract exposed to Power Apps via Custom Connector
public sealed class NewsItemV1
{
    public required string Id { get; init; } // stable identifier
    public required string Title { get; init; }
    public required string Summary { get; init; }
    public required DateTimeOffset PublishedAt { get; init; }
    public string? Category { get; init; } // optional taxonomy
    public string? Url { get; init; } // link to detail page
}

// Client interface for fetching and normalizing upstream data
public interface INewsClient
{
    Task<NewsItemV1[]> GetNewsAsync(CancellationToken ct = default);
}

// Implementation that reads upstream source, validates, and normalizes
public sealed class NewsClient(HttpClient http, Microsoft.Extensions.Options.IOptions<NewsOptions> options) : INewsClient
{
    private readonly HttpClient _http = http;
    private readonly NewsOptions _opts = options.Value;
    private static readonly JsonSerializerOptions JsonOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
    };

    public async Task<NewsItemV1[]> GetNewsAsync(CancellationToken ct = default)
    {
        if (string.IsNullOrWhiteSpace(_opts.SourceUrl))
            throw new InvalidOperationException("News:SourceUrl is not configured.");

        // Fetch upstream JSON and map to a stable contract consumed by Power Apps
        var upstreamItems = await _http.GetFromJsonAsync<UpstreamItem[]>(_opts.SourceUrl, JsonOptions, ct)
            ?? Array.Empty<UpstreamItem>();

        // Normalize and order by publish date desc
        return upstreamItems
            .Select(u => new NewsItemV1
            {
                Id = u.Id ?? Guid.NewGuid().ToString("n"),
                Title = u.Title ?? "Untitled",
                Summary = u.Summary ?? string.Empty,
                PublishedAt = u.PublishedAt == default ? DateTimeOffset.UtcNow : u.PublishedAt,
                Category = u.Category,
                Url = u.Url
            })
            .OrderByDescending(n => n.PublishedAt)
            .ToArray();
    }

    // Internal model matching the upstream source (keep flexible)
    private sealed class UpstreamItem
    {
        public string? Id { get; init; }
        public string? Title { get; init; }
        public string? Summary { get; init; }
        public DateTimeOffset PublishedAt { get; init; }
        public string? Category { get; init; }
        public string? Url { get; init; }
    }
}

Configuration (appsettings.json):

{
  "News": {
    "SourceUrl": "https://<your-source>/mcp-news.json",
    "Format": "json"
  },
  "KeyVaultUri": "https://<your-kv>.vault.azure.net/"
}

Pro-Tip: Cache the normalized feed for a short interval (for example with IMemoryCache) so frequent Power Apps refreshes do not hammer the upstream source.

2) Secure Azure deployment with Managed Identity and API Management

  • Deploy as Azure Functions (isolated) or Azure Container Apps. Enable System-Assigned Managed Identity.
  • Expose through Azure API Management with OAuth 2.0 (Entra ID) for inbound auth. Create a Power Apps Custom Connector pointing to APIM.

Required RBAC roles (assign to the managed identity or DevOps service principal):

  • Key Vault Secrets User (to read secrets if you store upstream source credentials or API keys)
  • App Configuration Data Reader (if using Azure App Configuration instead of appsettings)
  • API Management Service Contributor (to publish and manage the API surface)
  • Monitoring Reader (to view Application Insights telemetry)
  • Storage Blob Data Reader (only if the upstream source is in Azure Storage)

Pro-Tip: Favor Managed Identity and DefaultAzureCredential across services; avoid connection strings and embedded secrets entirely.

3) Strict TypeScript models with Zod and versioning

The client schema mirrors the v1 API and can evolve with v2+ while keeping backward compatibility in Power Apps and React.

// news.schema.ts (TypeScript, strict mode)
import { z } from "zod";

// Discriminated union enables future breaking changes with clear versioning
export const NewsItemV1 = z.object({
  id: z.string().min(1),
  title: z.string().min(1),
  summary: z.string().default(""),
  publishedAt: z.string().datetime({ offset: true }), // DateTimeOffset serializes with a UTC offset
  category: z.string().nullish(), // optional() alone would reject explicit nulls from the API
  url: z.string().url().nullish()
});

export const NewsResponseV1 = z.array(NewsItemV1);

export type TNewsItemV1 = z.infer<typeof NewsItemV1>;
export type TNewsResponseV1 = z.infer<typeof NewsResponseV1>;

// Future-proof: union for versioned responses
export const NewsResponse = z.union([
  z.object({ version: z.literal("v1"), data: NewsResponseV1 })
]);
export type TNewsResponse = z.infer<typeof NewsResponse>;

4) React 19 component using TanStack Query

Functional component with error and loading states, plus Zod runtime validation.

// NewsPanel.tsx (React 19)
import React from "react";
import { useQuery } from "@tanstack/react-query";
import { NewsResponseV1, type TNewsItemV1 } from "./news.schema";

async function fetchNews(): Promise<TNewsItemV1[]> {
  const res = await fetch("/api/v1/news", { headers: { Accept: "application/json" } });
  if (!res.ok) throw new Error(`Failed: ${res.status}`);
  const json = await res.json();
  const parsed = NewsResponseV1.safeParse(json);
  if (!parsed.success) throw new Error("Schema validation failed");
  return parsed.data;
}

export function NewsPanel(): React.ReactElement {
  const { data, error, isLoading } = useQuery({
    queryKey: ["news", "v1"],
    queryFn: fetchNews,
    staleTime: 60_000
  });

  if (isLoading) return <div>Loading updates…</div>;
  if (error) return <div>Failed to load updates.</div>;

  return (
    <ul>
      {data!.map(item => (
        <li key={item.id}>
          <strong>{item.title}</strong> — {new Date(item.publishedAt).toLocaleDateString()}<br />
          {item.summary}
        </li>
      ))}
    </ul>
  );
}

5) Power Apps Custom Connector

Create a Custom Connector targeting the APIM endpoint /api/v1/news. Map the response to your app data schema. Add the connector to your Power App and display the items in a gallery. When the upstream feed changes, you only update the backend normalizer, not the app.
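For reference, a minimal OpenAPI 2.0 definition for the connector could look like the sketch below; the host and title are placeholders, and the response schema mirrors the NewsItemV1 contract from the backend:

```json
{
  "swagger": "2.0",
  "info": { "title": "MCP Whats New", "version": "1.0" },
  "host": "your-apim.azure-api.net",
  "basePath": "/",
  "schemes": ["https"],
  "paths": {
    "/api/v1/news": {
      "get": {
        "operationId": "GetNewsV1",
        "produces": ["application/json"],
        "responses": {
          "200": {
            "description": "Normalized news items",
            "schema": {
              "type": "array",
              "items": {
                "type": "object",
                "properties": {
                  "id": { "type": "string" },
                  "title": { "type": "string" },
                  "summary": { "type": "string" },
                  "publishedAt": { "type": "string", "format": "date-time" },
                  "category": { "type": "string" },
                  "url": { "type": "string" }
                }
              }
            }
          }
        }
      }
    }
  }
}
```

Import this definition when creating the Custom Connector so the gallery bindings map directly to the v1 fields.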

Best Practices & Security

  • Authentication: Use Entra ID for APIM inbound auth. Backend-to-Azure uses DefaultAzureCredential with Managed Identity.
  • Secrets: Store upstream tokens in Key Vault; assign Key Vault Secrets User to the app’s managed identity.
  • Telemetry: Enable Application Insights. Track request IDs and dependency calls for the upstream fetch.
  • Authorization: Restrict APIM access via OAuth scopes and, if needed, IP restrictions or rate limits.
  • Resilience: Configure retry and timeout on HttpClient with sensible limits; add circuit breakers if the upstream is unreliable.
  • Versioning: Pin /api/v1/news; introduce /api/v2/news when the contract changes. Version TypeScript schemas alongside API versions.

App Insights integration (example):

// Add to Program.cs before app.Run(); ensure Application Insights is enabled
app.Use(async (ctx, next) =>
{
    // Correlate requests with upstream calls via trace IDs
    ctx.Response.Headers["Request-Id"] = System.Diagnostics.Activity.Current?.Id ?? Guid.NewGuid().ToString();
    await next();
});

Testing strategy:

  • API: Unit test NewsClient with mocked HttpMessageHandler; integration test /api/v1/news in-memory.
  • TypeScript: Schema tests to ensure validation rejects malformed payloads; component tests for loading/error states.
  • Contract: Add a CI step that fetches a sample payload from the upstream and validates against NewsResponseV1.
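The contract step in the last bullet can be sketched without extra dependencies. In a real pipeline you would reuse the Zod schema (NewsResponseV1); the hand-rolled checks below are illustrative only and mirror the v1 contract:

```typescript
// contract-check.ts — dependency-free sketch of the CI contract step.
type Candidate = Record<string, unknown>;

function isNewsItemV1(x: Candidate): boolean {
  return (
    typeof x.id === "string" && x.id.length > 0 &&
    typeof x.title === "string" && x.title.length > 0 &&
    typeof x.summary === "string" &&
    typeof x.publishedAt === "string" && !Number.isNaN(Date.parse(x.publishedAt)) &&
    (x.category == null || typeof x.category === "string") &&
    (x.url == null || typeof x.url === "string")
  );
}

export function validateNewsPayload(json: unknown): boolean {
  return Array.isArray(json) &&
    json.every((i) => typeof i === "object" && i !== null && isNewsItemV1(i as Candidate));
}
```

Fail the CI job when validateNewsPayload returns false for the sampled upstream payload.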

Security roles recap:

  • API surface: API Management Service Contributor
  • Secrets: Key Vault Secrets User
  • Config (if used): App Configuration Data Reader
  • Monitoring: Monitoring Reader
  • Storage (if used): Storage Blob Data Reader

Pro-Tip: Use APIM policies (validate-content, rate-limit-by-key) to protect the API consumed by Power Apps.
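As a sketch, an APIM inbound policy combining OAuth validation with rate limiting might look like this; the tenant ID, audience, and limits are placeholders to adapt:

```xml
<!-- Inbound policy sketch for the /api/v1/news operation. -->
<policies>
  <inbound>
    <base />
    <validate-jwt header-name="Authorization" failed-validation-httpcode="401">
      <openid-config url="https://login.microsoftonline.com/{tenant-id}/v2.0/.well-known/openid-configuration" />
      <audiences>
        <audience>api://your-api-app-id</audience>
      </audiences>
    </validate-jwt>
    <rate-limit-by-key calls="60" renewal-period="60"
        counter-key="@(context.Request.IpAddress)" />
  </inbound>
  <backend><base /></backend>
  <outbound><base /></outbound>
  <on-error><base /></on-error>
</policies>
```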

Summary

  • Do not guess "what’s new" for MCP server; centralize updates behind a stable, versioned API.
  • Use Managed Identity, DefaultAzureCredential, and APIM OAuth to secure end-to-end access for Power Apps.
  • Validate with Zod, monitor with Application Insights, and evolve safely via API/schema versioning.

Tuesday, 20 January 2026

Dataverse CRUD with a .NET 8 Console App using Managed Identity, DI, and Polly

Dataverse CRUD with a .NET 8 console app: this guide shows how to build a production-ready C# application that performs create, read, update, and delete operations against Microsoft Dataverse using Dependency Injection, DefaultAzureCredential, resilient retries (Polly), and clean configuration. You get copy-pasteable code, least-privilege security guidance, and optional Azure Container Apps + Key Vault deployment via azd/Bicep.

The Problem

You need a reliable way to run Dataverse CRUD from a console app without hardcoded secrets, avoiding static helpers, and ensuring the code is cloud-ready and testable. Many examples skip DI, retries, and security, leading to brittle apps.

Prerequisites

  • .NET 8 SDK
  • Azure CLI v2.58+ and Azure Developer CLI (azd) v1.9+
  • Power Platform tooling access to create an Application User in Dataverse mapped to your app's Service Principal or Managed Identity
  • A Dataverse environment URL (e.g., https://yourorg.crm.dynamics.com)

The Solution (Step-by-Step)

1) Project setup and configuration

Create a new console app and add packages.

// Bash: create project and add packages
dotnet new console -n DataverseCrud.Console --framework net8.0
cd DataverseCrud.Console
dotnet add package Microsoft.PowerPlatform.Dataverse.Client --version 1.1.27
dotnet add package Azure.Identity --version 1.11.3
dotnet add package Polly.Extensions.Http --version 3.0.0
dotnet add package Microsoft.Extensions.Http.Polly --version 8.0.1

Add appsettings.json for configuration (no secrets).

{
  "Dataverse": {
    "OrganizationUrl": "https://yourorg.crm.dynamics.com",
    "ApiVersion": "v9.2", 
    "DefaultTableLogicalName": "account"
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning"
    }
  }
}

Pro-Tip: Keep ApiVersion configurable so you can upgrade Dataverse API without code changes.
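To make that configurability concrete, here is a small sketch (in TypeScript for brevity, though the app is C#) of how OrganizationUrl and ApiVersion compose the documented Dataverse Web API root; the helper name is illustrative:

```typescript
// Composes the Dataverse Web API root from the configured values.
// https://{org}/api/data/{version}/ is the documented Web API URL shape.
export function webApiRoot(organizationUrl: string, apiVersion: string): string {
  const base = organizationUrl.replace(/\/+$/, ""); // drop trailing slashes
  return `${base}/api/data/${apiVersion}/`;
}
```

Bumping Dataverse__ApiVersion in configuration then changes every raw Web API call without a rebuild.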

2) Implement the Generic Host with DI, typed options, and retry policies

Program.cs uses file-scoped namespaces, DI, and structured logging via Microsoft.Extensions.Logging. Polly adds resilient retries for transient failures.

using System.Net.Http.Headers;
using Azure.Core;
using Azure.Identity;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using Polly;
using Polly.Extensions.Http;
using Microsoft.PowerPlatform.Dataverse.Client;

namespace DataverseCrud.ConsoleApp;

// Options record for strongly-typed config
public sealed record DataverseOptions(string OrganizationUrl, string ApiVersion, string DefaultTableLogicalName);

// Abstraction for testability
public interface IDataverseRepository
{
    // CRUD signatures using basic types for clarity
    Task<Guid> CreateAsync(string tableLogicalName, IDictionary<string, object> attributes, CancellationToken ct);
    Task<Dictionary<string, object>?> RetrieveAsync(string tableLogicalName, Guid id, string columnsCsv, CancellationToken ct);
    Task UpdateAsync(string tableLogicalName, Guid id, IDictionary<string, object> attributes, CancellationToken ct);
    Task DeleteAsync(string tableLogicalName, Guid id, CancellationToken ct);
}

// Concrete repository using ServiceClient
public sealed class DataverseRepository(IOptions<DataverseOptions> options, ILogger<DataverseRepository> logger, ServiceClient serviceClient) : IDataverseRepository
{
    private readonly DataverseOptions _opt = options.Value;
    private readonly ILogger<DataverseRepository> _logger = logger;
    private readonly ServiceClient _client = serviceClient;

    // Create a row in Dataverse
    public async Task<Guid> CreateAsync(string tableLogicalName, IDictionary<string, object> attributes, CancellationToken ct)
    {
        // Convert to Entity representation
        var entity = new Microsoft.Xrm.Sdk.Entity(tableLogicalName);
        foreach (var kvp in attributes)
            entity.Attributes[kvp.Key] = kvp.Value;

        var id = await _client.CreateAsync(entity, ct).ConfigureAwait(false);
        _logger.LogInformation("Created {Table} record with Id {Id}", tableLogicalName, id);
        return id;
    }

    // Retrieve a row with selected columns
    public async Task<Dictionary<string, object>?> RetrieveAsync(string tableLogicalName, Guid id, string columnsCsv, CancellationToken ct)
    {
        var columns = new Microsoft.Xrm.Sdk.Query.ColumnSet(columnsCsv.Split(',', StringSplitOptions.TrimEntries | StringSplitOptions.RemoveEmptyEntries));
        var entity = await _client.RetrieveAsync(tableLogicalName, id, columns, ct).ConfigureAwait(false);
        if (entity == null)
        {
            _logger.LogWarning("Entity not found: {Table}/{Id}", tableLogicalName, id);
            return null;
        }
        return entity.Attributes.ToDictionary(k => k.Key, v => v.Value);
    }

    // Update specific attributes
    public async Task UpdateAsync(string tableLogicalName, Guid id, IDictionary<string, object> attributes, CancellationToken ct)
    {
        var entity = new Microsoft.Xrm.Sdk.Entity(tableLogicalName) { Id = id };
        foreach (var kvp in attributes)
            entity.Attributes[kvp.Key] = kvp.Value;

        await _client.UpdateAsync(entity, ct).ConfigureAwait(false);
        _logger.LogInformation("Updated {Table} record {Id}", tableLogicalName, id);
    }

    // Delete a row
    public async Task DeleteAsync(string tableLogicalName, Guid id, CancellationToken ct)
    {
        await _client.DeleteAsync(tableLogicalName, id, ct).ConfigureAwait(false);
        _logger.LogInformation("Deleted {Table} record {Id}", tableLogicalName, id);
    }
}

public class Program
{
    public static async Task Main(string[] args)
    {
        // Note: HostApplicationBuilder is not IDisposable; dispose the built host instead.
        var host = Host.CreateApplicationBuilder(args);

        // Bind options from configuration
        host.Services.AddOptions<DataverseOptions>()
            .Bind(host.Configuration.GetSection("Dataverse"))
            .ValidateDataAnnotations()
            .Validate(opt => !string.IsNullOrWhiteSpace(opt.OrganizationUrl) && !string.IsNullOrWhiteSpace(opt.ApiVersion));

        host.Services.AddLogging(builder =>
        {
            builder.AddSimpleConsole(o =>
            {
                o.TimestampFormat = "yyyy-MM-dd HH:mm:ss ";
                o.SingleLine = true;
            });
        });

        // Resilient named HttpClient (Polly retries) available for raw Dataverse Web API calls
        host.Services.AddHttpClient("dataverse")
            .SetHandlerLifetime(TimeSpan.FromMinutes(5))
            .AddPolicyHandler(GetRetryPolicy());

        // Register ServiceClient using DefaultAzureCredential (prefers Managed Identity in Azure)
        host.Services.AddSingleton(sp =>
        {
            var opts = sp.GetRequiredService<IOptions<DataverseOptions>>().Value;

            // In production on Azure Container Apps/VMs, Managed Identity is used.
            // Locally, Azure CLI or Visual Studio developer credentials are used.
            var credential = new DefaultAzureCredential();

            // ServiceClient takes a token provider delegate, not a TokenCredential directly.
            return new ServiceClient(
                instanceUrl: new Uri(opts.OrganizationUrl),
                tokenProviderFunction: async url =>
                {
                    var scope = $"{new Uri(url).GetLeftPart(UriPartial.Authority)}/.default";
                    var token = await credential.GetTokenAsync(new TokenRequestContext(new[] { scope }), default);
                    return token.Token;
                },
                useUniqueInstance: true);
        });

        host.Services.AddSingleton<IDataverseRepository, DataverseRepository>();

        using var app = host.Build();

        // Execute a CRUD flow
        var logger = app.Services.GetRequiredService<ILogger<Program>>();
        var repo = app.Services.GetRequiredService<IDataverseRepository>();
        var cfg = app.Services.GetRequiredService<IOptions<DataverseOptions>>().Value;

        using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(2));

        try
        {
            // 1) Create
            var accountId = await repo.CreateAsync(cfg.DefaultTableLogicalName, new Dictionary<string, object>
            {
                ["name"] = "Contoso (Console App)",
                ["telephone1"] = "+1-425-555-0100"
            }, cts.Token);

            // 2) Retrieve
            var data = await repo.RetrieveAsync(cfg.DefaultTableLogicalName, accountId, "name,telephone1", cts.Token);
            logger.LogInformation("Retrieved: {Data}", string.Join("; ", data!.Select(kv => $"{kv.Key}={kv.Value}")));

            // 3) Update
            await repo.UpdateAsync(cfg.DefaultTableLogicalName, accountId, new Dictionary<string, object>
            {
                ["telephone1"] = "+1-425-555-0101"
            }, cts.Token);

            // 4) Delete
            await repo.DeleteAsync(cfg.DefaultTableLogicalName, accountId, cts.Token);

            logger.LogInformation("CRUD flow completed successfully.");
        }
        catch (Exception ex)
        {
            logger.LogError(ex, "Failure during CRUD flow");
            Environment.ExitCode = 1;
        }

    }

    // Exponential backoff for transient HTTP 5xx/429 responses
    private static IAsyncPolicy<HttpResponseMessage> GetRetryPolicy() =>
        HttpPolicyExtensions
            .HandleTransientHttpError()
            .OrResult(r => (int)r.StatusCode == 429)
            .WaitAndRetryAsync(5, retryAttempt => TimeSpan.FromSeconds(Math.Pow(2, retryAttempt)));
}

Pro-Tip: Add structured properties to logs (e.g., table, id) for easier querying in Application Insights.

3) Dataverse authentication without secrets

The ServiceClient uses DefaultAzureCredential, which in Azure will use Managed Identity. In local development, it falls back to developer credentials. You must create an Application User in Dataverse mapped to the Service Principal or Managed Identity and assign the least-privilege Dataverse security role needed for CRUD on targeted tables.

Pro-Tip: Scope Dataverse privileges to only the tables and operations your app requires.

4) Optional: Run in Azure Container Apps with Managed Identity (IaC via azd/Bicep)

If you want to run this console app on a schedule or on-demand in Azure, deploy to Azure Container Apps with a System-Assigned Managed Identity. Use Key Vault to hold non-secret config like OrganizationUrl and ApiVersion for central governance.

// azure.yaml for azd (simplified)
name: dataverse-crud
metadata:
  template: dataverse-crud
services:
  app:
    project: ./
    language: dotnet
    host: containerapp
    docker:
      path: ./Dockerfile
      context: ./

// main.bicep (simplified): Container App + Key Vault + MI and RBAC
param location string = resourceGroup().location
param appName string = 'dataverse-crud-app'
param kvName string = 'dvcrud-kv-${uniqueString(resourceGroup().id)}'

resource kv 'Microsoft.KeyVault/vaults@2023-07-01' = {
  name: kvName
  location: location
  properties: {
    tenantId: subscription().tenantId
    sku: { family: 'A', name: 'standard' }
    accessPolicies: [] // Use RBAC rather than access policies
    enableRbacAuthorization: true
  }
}

resource caEnv 'Microsoft.App/managedEnvironments@2023-05-01' = {
  name: 'dvcrud-env'
  location: location
}

resource app 'Microsoft.App/containerApps@2023-05-01' = {
  name: appName
  location: location
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    managedEnvironmentId: caEnv.id
    template: {
      containers: [
        {
          name: 'app'
          image: 'ghcr.io/yourrepo/dataverse-crud:latest'
          env: [
            // Pull configuration from Key Vault via secrets or set as plain env if non-secret
            { name: 'Dataverse__OrganizationUrl', value: 'https://yourorg.crm.dynamics.com' },
            { name: 'Dataverse__ApiVersion', value: 'v9.2' },
            { name: 'Dataverse__DefaultTableLogicalName', value: 'account' }
          ]
        }
      ]
    }
  }
}

// Grant the Container App's Managed Identity access to Key Vault secrets if used for configuration
resource kvAccess 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(kv.id, 'kv-secrets-user', app.name)
  scope: kv
  properties: {
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', '4633458b-17de-408a-b874-0445c86b69e6') // Key Vault Secrets User
    principalId: app.identity.principalId
    principalType: 'ServicePrincipal'
  }
}

Azure RBAC roles used here: Key Vault Secrets User on the vault to read secrets if you store any configuration as secrets. The Container App requires no additional Azure RBAC to call Dataverse; it authenticates via its Managed Identity to Entra ID, and Dataverse access is governed by the Application User and Dataverse security role assignments.

Pro-Tip: Prefer System-Assigned Managed Identity for lifecycle simplicity unless you need identity reuse across apps, in which case use a User-Assigned Managed Identity.

5) Testing and mocking

Abstract your repository behind IDataverseRepository to unit test business logic without hitting Dataverse. For integration tests, use a test environment with a restricted Dataverse security role.

// Example unit test using a fake repository
public sealed class FakeRepo : IDataverseRepository
{
    public List<string> Calls { get; } = new();
    public Task<Guid> CreateAsync(string t, IDictionary<string, object> a, CancellationToken ct) { Calls.Add("create"); return Task.FromResult(Guid.NewGuid()); }
    public Task<Dictionary<string, object>?> RetrieveAsync(string t, Guid id, string cols, CancellationToken ct) { Calls.Add("retrieve"); return Task.FromResult<Dictionary<string, object>?>(new()); }
    public Task UpdateAsync(string t, Guid id, IDictionary<string, object> a, CancellationToken ct) { Calls.Add("update"); return Task.CompletedTask; }
    public Task DeleteAsync(string t, Guid id, CancellationToken ct) { Calls.Add("delete"); return Task.CompletedTask; }
}

Pro-Tip: Use dependency injection to pass a fake or mock implementation into services that depend on IDataverseRepository.

Best Practices & Security

Authentication and secrets: Use DefaultAzureCredential. Do not embed client secrets or connection strings. In Azure, rely on Managed Identity. Locally, authenticate via developer credentials.

Dataverse permissions: Assign a least-privilege Dataverse security role to the Application User (e.g., CRUD on specific tables only). Avoid granting environment-wide privileges.

Azure RBAC: If using Key Vault, grant only the Key Vault Secrets User role to the app’s Managed Identity. Do not grant Owner or Contributor without justification.

Resilience: The Polly retry policy handles 5xx/429 responses with exponential backoff; tune retry counts/timeouts per SLOs.

Observability: Use Microsoft.Extensions.Logging with structured state for IDs and table names. Forward logs to Application Insights via the built-in provider when running in Azure.

Configuration hygiene: Keep OrganizationUrl, ApiVersion, and default table logical names in configuration. Validate options at startup.

Token reuse: DefaultAzureCredential and underlying token providers cache tokens. Avoid recreating credentials per request; register ServiceClient as a singleton as shown.

Pro-Tip: For read-only operations at scale, prefer the Web API with select columns and server-side paging; avoid retrieving full records when not needed.
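The paging half of that tip can be sketched independently of any SDK: Dataverse Web API responses carry a continuation URL in @odata.nextLink that you follow until it is absent. A TypeScript sketch with an injected fetcher (names are illustrative) keeps the loop testable:

```typescript
// Follow-the-continuation loop for Dataverse Web API paging.
// @odata.nextLink is the documented continuation property; the fetcher
// is injected so the loop can be unit tested without a live endpoint.
type Page<T> = { value: T[]; "@odata.nextLink"?: string };

export async function fetchAllPages<T>(
  firstUrl: string,
  fetchPage: (url: string) => Promise<Page<T>>
): Promise<T[]> {
  const all: T[] = [];
  let url: string | undefined = firstUrl;
  while (url !== undefined) {
    const page = await fetchPage(url);
    all.push(...page.value);
    url = page["@odata.nextLink"]; // undefined on the last page
  }
  return all;
}
```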

Summary

• You built a .NET 8 console app that performs Dataverse CRUD using DI, DefaultAzureCredential, and Polly retries.

• Security is hardened with Managed Identity, least-privilege Dataverse roles, and optional Azure RBAC for Key Vault.

• The solution is production-ready, testable, and configurable, with an optional path to deploy on Azure Container Apps via azd/Bicep.

Next step: map your Managed Identity or app registration to a Dataverse Application User, assign a minimal security role, and run the CRUD flow against your target tables.


Wednesday, 14 January 2026

What’s New in PnP for SPFx: PnPjs v3+, React Controls, and Secure Patterns

PnP for SPFx has evolved with practical updates that reduce bundle size, improve performance, and harden security. The problem: teams migrating or maintaining SPFx solutions are unsure which PnP changes truly matter and how to adopt them safely. The solution: adopt PnPjs v3+ modular imports, leverage updated PnP SPFx React Controls where it makes sense, and implement concrete RBAC permissions with least privilege. The value: smaller bundles, faster pages, and auditable access aligned to enterprise security.

The Problem

Developers building SPFx web parts and extensions need a clear, production-grade path to modern PnP usage. Without guidance, projects risk bloated bundles, brittle permissions, and fragile data access patterns.

Prerequisites

  • Node.js v20+
  • SPFx v1.18+ (Yo @microsoft/sharepoint generator)
  • TypeScript 5+ with strict mode enabled
  • Office 365 tenant with App Catalog and permission to deploy apps
  • PnPjs v3+ and @pnp/spfx-controls-react
  • Optional: PnP PowerShell (latest), Azure CLI if integrating with Azure services

The Solution (Step-by-Step)

1) Adopt PnPjs v3+ with strict typing, ESM, and SPFx behavior

Use modular imports and the SPFx behavior to bind to the current context. Validate runtime data with Zod for resilient web parts.

/* Strict TypeScript example for SPFx with PnPjs v3+ */
import { spfi, SPFI, SPFx } from "@pnp/sp"; // Core factory, interface, and SPFx context behavior
import "@pnp/sp/webs"; // Bring in web API surface (sp.web)
import "@pnp/sp/lists"; // Bring in lists API surface
import "@pnp/sp/items"; // Bring in list items API surface
import { z } from "zod"; // Runtime schema validation

// Minimal shape for data we expect from SharePoint
const TaskSchema = z.object({
  Id: z.number(),
  Title: z.string(),
  Status: z.string().optional(),
});

type Task = z.infer<typeof TaskSchema>;

// SPFx helper to create a bound SP instance. This avoids global state and is testable.
export function getSP(context: unknown): SPFI {
  // context should be the WebPartContext or Extension context
  return spfi().using(SPFx(context as object));
}

// Fetch list items with strong typing and runtime validation
export async function fetchTasks(sp: SPFI, listTitle: string): Promise<readonly Task[]> {
  // Select only the fields needed for minimal payloads
  const raw = await sp.web.lists.getByTitle(listTitle).items.select("Id", "Title", "Status")();
  // Validate at runtime to catch unexpected shapes
  const parsed = z.array(TaskSchema).parse(raw);
  return parsed;
}

Why this matters: smaller imports improve tree shaking, and behaviors keep your data layer clean, testable, and context-aware.

2) Use batching and caching behaviors for fewer round-trips

Batch multiple reads to reduce network overhead, and apply caching for read-heavy views.

import { spfi, SPFI, SPFx } from "@pnp/sp";
import "@pnp/sp/webs";
import "@pnp/sp/lists";
import "@pnp/sp/items";
import { Caching } from "@pnp/queryable"; // Behavior for query caching

export function getCachedSP(context: unknown): SPFI {
  return spfi().using(SPFx(context as object)).using(
    Caching({
      store: "local", // Use localStorage for simplicity; consider "session" for sensitive data
      expireFunc: () => new Date(Date.now() + 30_000), // 30s cache lifetime; tune to your UX needs
    })
  );
}

export async function batchedRead(sp: SPFI, listTitle: string): Promise<{ count: number; first: string }> {
  // Create a batched instance
  const [batchedSP, execute] = sp.batched();

  // Queue multiple operations
  const itemsPromise = batchedSP.web.lists.getByTitle(listTitle).items.select("Id", "Title")();
  const topItemPromise = batchedSP.web.lists.getByTitle(listTitle).items.top(1).select("Title")();

  // Execute the batch
  await execute();

  const items = await itemsPromise;
  const top = await topItemPromise;

  return { count: items.length, first: (top[0]?.Title ?? "") };
}

Pro-Tip: Combine select, filter, and top to minimize payloads and speed up rendering.
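To make the composition concrete, here is a small, dependency-free sketch of the OData query string that select, filter, and top ultimately map to on the wire (the `buildQuery` helper is hypothetical, for illustration only — PnPjs builds this for you):

```typescript
// Hypothetical helper illustrating how select/filter/top map onto OData query parameters.
type QueryOptions = {
  select?: readonly string[]; // $select: trims each item to the named fields
  filter?: string;            // $filter: server-side predicate, e.g. "Status eq 'Open'"
  top?: number;               // $top: caps the number of items returned
};

export function buildQuery(options: QueryOptions): string {
  const parts: string[] = [];
  if (options.select && options.select.length > 0) {
    parts.push(`$select=${options.select.join(",")}`);
  }
  if (options.filter) {
    parts.push(`$filter=${encodeURIComponent(options.filter)}`);
  }
  if (options.top !== undefined) {
    parts.push(`$top=${options.top}`);
  }
  return parts.length > 0 ? `?${parts.join("&")}` : "";
}

// Example: the narrow query shape used in the snippets above
const q = buildQuery({ select: ["Id", "Title"], filter: "Status eq 'Open'", top: 10 });
// q === "?$select=Id,Title&$filter=Status%20eq%20'Open'&$top=10"
```

Every field you leave out of $select is a field SharePoint never serializes, which is why narrow projections pay off on list views with many columns.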

3) Use PnP SPFx React Controls when they save time

Prefer controls that encapsulate complex, well-tested UX patterns. Examples:

  • PeoplePicker for directory-aware selection
  • FilePicker for consistent file selection
  • ListView for performant tabular data
import * as React from "react";
import { PeoplePicker, PrincipalType } from "@pnp/spfx-controls-react/lib/PeoplePicker";

// Strongly typed shape for selected people
export type Person = {
  id: string;
  text: string;
  secondaryText?: string;
};

type Props = {
  // Pass the web part's SPFx context down via props instead of reading a global
  context: unknown;
  onChange: (people: readonly Person[]) => void;
};

export function PeopleSelector(props: Props): JSX.Element {
  return (
    <div>
      <PeoplePicker
        context={props.context as never} // single boundary cast; type this prop with the control's context type in real code
        titleText="Select people"
        personSelectionLimit={3}
        principalTypes={[PrincipalType.User]}
        showtooltip
        required={false}
        onChange={(items) => {
          const mapped: readonly Person[] = items.map((i) => ({
            id: String(i.id ?? ""), // id is optional on IPersonaProps
            text: i.text ?? "", // text is optional too; Person requires a string
            secondaryText: i.secondaryText,
          }));
          props.onChange(mapped);
        }}
      />
    </div>
  );
}

Pro-Tip: Keep these controls behind thin adapters so you can swap or mock them in tests without touching business logic.
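The adapter idea above can be sketched in a few lines (all names here are hypothetical): business logic depends on a tiny interface rather than on the control, so tests substitute an in-memory fake without rendering anything.

```typescript
// Hypothetical adapter boundary: business logic depends on this interface,
// not on @pnp/spfx-controls-react directly.
export type Person = { id: string; text: string; secondaryText?: string };

export interface PeopleSource {
  search(query: string): Promise<readonly Person[]>;
}

// A production adapter would wrap the real control or a Graph call; this fake
// is what unit tests inject instead.
export class FakePeopleSource implements PeopleSource {
  constructor(private readonly people: readonly Person[]) {}

  async search(query: string): Promise<readonly Person[]> {
    const q = query.toLowerCase();
    return this.people.filter((p) => p.text.toLowerCase().includes(q));
  }
}

// Business logic under test only ever sees the interface.
export async function firstMatchName(source: PeopleSource, query: string): Promise<string | undefined> {
  const results = await source.search(query);
  return results[0]?.text;
}
```

Swapping the PeoplePicker for a different control later means writing one new adapter, not touching every consumer.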

4) Streamline deployment with PnP PowerShell

Automate packaging and deployment to ensure consistent, auditable releases.

# Install: https://pnp.github.io/powershell/
# Deploy an SPFx package to the tenant app catalog and install to a site
Connect-PnPOnline -Url https://contoso-admin.sharepoint.com -Interactive

# Publish/overwrite SPPKG into the tenant catalog
Add-PnPApp -Path .\sharepoint\solution\my-solution.sppkg -Scope Tenant -Publish -Overwrite

# Install the app to a specific site
Connect-PnPOnline -Url https://contoso.sharepoint.com/sites/ProjectX -Interactive
$pkg = Get-PnPApp | Where-Object { $_.Title -eq "My Solution" }
Install-PnPApp -Identity $pkg.Id # Installs from the tenant catalog (the default -Scope Tenant)

Pro-Tip: Run these commands from CI using OIDC to Azure AD (no stored secrets) and conditional approvals for production sites.

5) Security and RBAC: explicit, least-privilege permissions

Be explicit about the minimal roles required:

  • SharePoint site and list permissions: Read (for read-only web parts), Edit or Contribute (only when creating/updating items). Prefer item- or list-scoped permissions over site-wide.
  • Graph delegated permissions in SPFx: User.Read, User.ReadBasic.All, Sites.Read.All (only if cross-site reads are required). Request them via webApiPermissionRequests in package-solution.json. Avoid .All scopes unless necessary.
  • Azure service calls via backend API: If your SPFx calls an Azure Function or App Service, secure the backend with Entra ID and assign a Managed Identity to the backend. Grant that identity minimal roles such as Storage Blob Data Reader or Storage Blob Data Contributor on specific storage accounts or containers only.

Pro-Tip: Prefer resource-specific consent to SharePoint or Graph endpoints and scope consents to the smallest set of sites or resources.
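For the Graph delegated scopes mentioned above, the request lives in package-solution.json; a minimal fragment looks like the sketch below (the scope shown is one of the examples from the list, not a recommendation for every solution):

```json
{
  "solution": {
    "webApiPermissionRequests": [
      {
        "resource": "Microsoft Graph",
        "scope": "User.ReadBasic.All"
      }
    ]
  }
}
```

After deployment, a tenant admin still has to approve these requests on the API access page in the SharePoint admin center before your code can acquire tokens for them.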

6) Add an error boundary for resilient UI

SPFx runs inside complex pages; isolate failures so one component does not break the whole canvas.

import * as React from "react";

type BoundaryState = { hasError: boolean };

export class ErrorBoundary extends React.Component<React.PropsWithChildren<unknown>, BoundaryState> {
  state: BoundaryState = { hasError: false };

  static getDerivedStateFromError(): BoundaryState {
    return { hasError: true };
  }

  componentDidCatch(error: unknown): void {
    // Log to a centralized telemetry sink (e.g., Application Insights)
    // Avoid PII; sanitize messages before sending
    console.error("ErrorBoundary caught:", error);
  }

  render(): React.ReactNode {
    if (this.state.hasError) {
      return <div role="alert">Something went wrong. Please refresh or try again later.</div>;
    }
    return this.props.children;
  }
}

Wrap your data-heavy components with ErrorBoundary and fail gracefully.

7) Modernize imports for tree shaking and smaller bundles

Only import what you use. Avoid star imports.

// Good: minimal surface
import { spfi } from "@pnp/sp";
import "@pnp/sp/items";
import "@pnp/sp/lists";

// Avoid: broad or legacy preset imports that include APIs you don't need
// import "@pnp/sp/presets/all";

Pro-Tip: Run webpack-bundle-analyzer to confirm reductions as you trim imports.

Best Practices & Security

  • Principle of Least Privilege: grant Read before Edit or Contribute; avoid tenant-wide Sites.Read.All unless essential.
  • Runtime validation: use Zod to guard against content type or field drift.
  • Behavior-driven PnPjs: keep SPFx context in a factory; never in globals.
  • Resiliency: add retries/backoff for throttling with PnPjs behaviors; display non-blocking toasts for transient failures.
  • No secrets in client code: if integrating with Azure, call a backend secured with Entra ID; use Managed Identities on the backend instead of keys.
  • Accessibility: ensure controls include aria labels and keyboard navigation.
  • Observability: log warnings and errors with correlation IDs to diagnose issues across pages.
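The resiliency bullet above can be sketched as a small, dependency-free retry helper with exponential backoff; this is a simplified stand-in for PnPjs's built-in retry behaviors, with illustrative names and delays:

```typescript
// Minimal exponential-backoff retry for transient failures such as HTTP 429/503.
// PnPjs ships retry behaviors; this standalone sketch shows the underlying idea.
export async function withRetry<T>(
  operation: () => Promise<T>,
  maxAttempts = 4,
  baseDelayMs = 200
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await operation();
    } catch (err) {
      lastError = err;
      if (attempt === maxAttempts - 1) break; // out of attempts; rethrow below
      // Exponential backoff with jitter: ~200ms, ~400ms, ~800ms, ...
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```

In a real web part you would wrap only the data calls (not render logic) and surface the final failure through the error boundary or a non-blocking toast.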

Pro-Tip: For heavy reads, combine batching with narrow select filters and increase cache duration carefully; always provide a user-initiated refresh.

Summary

  • PnPjs v3+ with behaviors, batching, and caching delivers smaller, faster, and cleaner SPFx data access.
  • PnP SPFx React Controls accelerate complex UX while remaining testable behind adapters.
  • Explicit RBAC and runtime validation raise your security bar without slowing delivery.