Sunday, 22 February 2026

Build a secure MCP-style server for Power Apps with .NET 8 Azure Functions and Azure AD

Creating an MCP server for Power Apps typically means exposing a secure, typed HTTP API that Power Apps can call via a Custom Connector. This article shows why Azure Functions (.NET 8 isolated) with Azure AD (Entra ID) is the right foundation and how to ship a production-ready MCP-style server with OpenAPI, validation, and zero secret management.

The Problem

You need a reliable backend for Power Apps that enforces validation and security, offers a clean contract (OpenAPI), scales serverlessly, and avoids Function Keys or shared secrets. You also want workflows and data operations to be testable and observable.

Prerequisites

  • .NET 8 SDK
  • Azure CLI 2.58+ and Azure Functions Core Tools v4+
  • Azure subscription with permissions to create Resource Group, Function App, and Managed Identity
  • Power Apps environment with permission to create Custom Connectors

Implementation Details

Project setup

# Terminal commands (run locally)
# 1) Create a .NET 8 isolated Azure Functions app (Functions Core Tools v4)
func init PowerAppsMcpServer --worker-runtime dotnet-isolated

cd PowerAppsMcpServer

# 2) Add packages for validation, OpenAPI, DI helpers, and Application Insights
dotnet add package FluentValidation
dotnet add package FluentValidation.DependencyInjectionExtensions
dotnet add package Microsoft.Azure.Functions.Worker.Extensions.OpenApi
dotnet add package Azure.Identity
dotnet add package Microsoft.Extensions.Azure
dotnet add package Microsoft.Azure.Functions.Worker.ApplicationInsights

Program.cs (minimal hosting, DI, OpenAPI)

using Azure.Core;
using Azure.Identity;
using FluentValidation;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Extensions.OpenApi.Extensions;
using Microsoft.Extensions.Azure;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

// Top-level statements for .NET 8 minimal hosting
var host = new HostBuilder()
    .ConfigureFunctionsWorkerDefaults()
    // Enables OpenAPI endpoints at /api/swagger.json and /api/swagger/ui
    .ConfigureOpenApi()
    .ConfigureServices(services =>
    {
        // Application Insights for the isolated worker
        services.AddApplicationInsightsTelemetryWorkerService();
        services.ConfigureFunctionsApplicationInsights();

        // Register validators
        services.AddValidatorsFromAssemblyContaining<CreateOrderRequestValidator>();

        // Register Azure SDK clients using DefaultAzureCredential (Managed Identity in Azure)
        services.AddAzureClients(clientBuilder =>
        {
            clientBuilder.UseCredential(new DefaultAzureCredential());
            // Example: clientBuilder.AddSecretClient(new Uri("https://<your-kv>.vault.azure.net/"));
        });

        // Register domain services
        services.AddSingleton<IOrderService, OrderService>();
    })
    .Build();

host.Run();

Contracts and validation

namespace PowerAppsMcpServer;

public sealed record CreateOrderRequest(
    string CustomerId,      // Must be a known customer
    string Sku,             // Product SKU
    int Quantity            // >= 1
);

public sealed record CreateOrderResponse(
    string OrderId,
    string Status,
    string Message
);

// CreateOrderRequestValidator.cs (separate file)
using FluentValidation;

namespace PowerAppsMcpServer;

public sealed class CreateOrderRequestValidator : AbstractValidator<CreateOrderRequest>
{
    public CreateOrderRequestValidator()
    {
        // Ensure CustomerId is present and well-formed
        RuleFor(x => x.CustomerId)
            .NotEmpty().WithMessage("CustomerId is required.")
            .Length(3, 64);

        // Basic SKU constraints
        RuleFor(x => x.Sku)
            .NotEmpty().WithMessage("Sku is required.")
            .Length(2, 64);

        // Quantity must be positive
        RuleFor(x => x.Quantity)
            .GreaterThan(0).WithMessage("Quantity must be at least 1.");
    }
}

Domain service (DI, testable logic)

using System;
using System.Threading;
using System.Threading.Tasks;

namespace PowerAppsMcpServer;

public interface IOrderService
{
    Task<CreateOrderResponse> CreateAsync(CreateOrderRequest request, CancellationToken ct = default);
}

public sealed class OrderService : IOrderService // Stateless service; safe to register as a singleton
{
    public Task<CreateOrderResponse> CreateAsync(CreateOrderRequest request, CancellationToken ct = default)
    {
        // Simulate business logic; replace with real persistence/integration
        var orderId = $"ORD-{Guid.NewGuid():N}";
        return Task.FromResult(new CreateOrderResponse(orderId, "Created", "Order accepted"));
    }
}

HTTP-triggered Function with Azure AD auth and OpenAPI

using System.Net;
using System.Text.Json;
using FluentValidation;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Azure.Functions.Worker.Extensions.OpenApi.Attributes;
using Microsoft.OpenApi.Models;

namespace PowerAppsMcpServer;

public sealed class CreateOrderFunction(
    ILogger<CreateOrderFunction> logger,
    IValidator<CreateOrderRequest> validator,
    IOrderService orderService)
{
    // This endpoint is described for OpenAPI and secured via Azure AD (set on the Function App)
    [Function("CreateOrder")]
    [OpenApiOperation(operationId: "CreateOrder", tags: new[] { "orders" }, Summary = "Create order", Description = "Creates an order with validated input.")]
    [OpenApiRequestBody(contentType: "application/json", bodyType: typeof(CreateOrderRequest), Required = true, Description = "Order payload")]
    [OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(CreateOrderResponse), Summary = "Order created")]
    public async Task<HttpResponseData> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "orders")] HttpRequestData req,
        FunctionContext ctx)
    {
        // AuthorizationLevel.Anonymous: authentication is enforced by EasyAuth/Azure AD at the
        // Function App level, so no Function Key is needed. Callers still require a valid
        // AAD token, acquired for them by the Custom Connector.

        var payload = await JsonSerializer.DeserializeAsync<CreateOrderRequest>(req.Body, new JsonSerializerOptions
        {
            PropertyNameCaseInsensitive = true
        });

        // Guard against an empty body
        if (payload is null)
        {
            var empty = req.CreateResponse(HttpStatusCode.BadRequest);
            await empty.WriteStringAsync(JsonSerializer.Serialize(new { error = "EmptyOrInvalidBody" }));
            return empty;
        }

        // Validate request
        var result = await validator.ValidateAsync(payload);
        if (!result.IsValid)
        {
            var bad = req.CreateResponse(HttpStatusCode.BadRequest);
            await bad.WriteStringAsync(JsonSerializer.Serialize(new
            {
                error = "ValidationFailed",
                details = result.Errors.Select(e => new { e.PropertyName, e.ErrorMessage })
            }));
            return bad;
        }

        // Execute domain logic
        var responseModel = await orderService.CreateAsync(payload, ctx.CancellationToken);

        // Return success
        var ok = req.CreateResponse(HttpStatusCode.OK);
        await ok.WriteStringAsync(JsonSerializer.Serialize(responseModel));
        return ok;
    }
}

Note: Set the Function App to be secured by Azure AD (App Service Authentication/Authorization). Do not rely on Function Keys in production.

OpenAPI for the Custom Connector

  • Run locally and navigate to /api/swagger/ui to inspect the contract. The OpenAPI JSON is available at /api/swagger.json.
  • Export this JSON and import it when creating your Power Apps Custom Connector, selecting OAuth 2.0 (Azure AD) as the authentication type.
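Before importing, it helps to sanity-check the exported contract. The sketch below (TypeScript; function and field names are illustrative) verifies the minimal fields a Custom Connector import needs; the real import additionally validates operations and schemas.

```typescript
// Minimal pre-import sanity check for an exported OpenAPI document.
// Field names follow the OpenAPI specification; the sample doc is illustrative.
interface OpenApiDoc {
  openapi?: string;  // OpenAPI 3.x version marker
  swagger?: string;  // OpenAPI 2.0 version marker
  info?: { title?: string; version?: string };
  paths?: Record<string, unknown>;
}

function openApiImportProblems(doc: OpenApiDoc): string[] {
  const problems: string[] = [];
  if (!doc.openapi && !doc.swagger) problems.push("Missing 'openapi'/'swagger' version field.");
  if (!doc.info?.title) problems.push("Missing info.title (used as the connector name).");
  if (!doc.paths || Object.keys(doc.paths).length === 0) problems.push("No paths defined.");
  return problems;
}

// A well-formed export produces an empty problem list
const exported: OpenApiDoc = {
  openapi: "3.0.1",
  info: { title: "OrdersApi", version: "1.0" },
  paths: { "/api/orders": {} }
};
```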

Power Apps integration (Custom Connector)

Authentication setup

  • Create an Azure AD app registration for the Custom Connector (client app), and on the Function App’s app registration (resource app) expose an Application ID URI with a delegated scope the connector can request. Alternatively, enable EasyAuth and require sign-in with Azure Active Directory.
  • In the Custom Connector, choose OAuth 2.0 (Azure AD). Provide the Authorization URL, Token URL, and the client application details. Use the Application ID URI or scope configured for the Function App.
  • Grant users access via Azure AD and Power Platform permissions so they can acquire tokens and use the connector.
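For reference, the Authorization URL and Token URL the connector asks for follow the standard Entra v2.0 endpoint format. A tiny sketch (TypeScript; the tenant id shown is a placeholder):

```typescript
// Builds the Azure AD (Entra ID) v2.0 OAuth endpoints that a Custom Connector's
// OAuth 2.0 security settings ask for. Only the tenant id varies.
function entraOAuthEndpoints(tenantId: string): { authorizationUrl: string; tokenUrl: string } {
  const base = `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0`;
  return {
    authorizationUrl: `${base}/authorize`,
    tokenUrl: `${base}/token`
  };
}

// Placeholder tenant id for illustration
const endpoints = entraOAuthEndpoints("00000000-0000-0000-0000-000000000000");
```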

Calling the connector from Power Apps

// Power Apps (Canvas) example formula usage (pseudo):
// Assuming Custom Connector named 'OrdersApi'
Set(
    createResult,
    OrdersApi.CreateOrder({
        CustomerId: "CUST-001",
        Sku: "WIDGET-42",
        Quantity: 2
    })
);
// Access response fields: createResult.OrderId, createResult.Status, createResult.Message

Deployment

# Azure deployment with CLI (example)
# 1) Login
az login

# 2) Create resource group
az group create -n rg-mcp-powerapps -l eastus

# 3) Create storage and function app (Linux, isolated, .NET 8)
STORAGE_NAME=mcpfuncstor$RANDOM
az storage account create -n $STORAGE_NAME -g rg-mcp-powerapps -l eastus --sku Standard_LRS
az functionapp create -n mcp-func-app-$RANDOM -g rg-mcp-powerapps -s $STORAGE_NAME --consumption-plan-location eastus --runtime dotnet-isolated --functions-version 4

# 4) Enable App Service Authentication with Azure AD (EasyAuth)
# Replace <clientId> and <tenantId> as per your AAD setup.
# Return401 suits APIs called by a connector; RedirectToLoginPage suits browser apps.
az webapp auth microsoft update \
  --resource-group rg-mcp-powerapps \
  --name mcp-func-app-XXXX \
  --client-id <clientId> \
  --issuer "https://login.microsoftonline.com/<tenantId>/v2.0" \
  --unauthenticated-client-action Return401

# 5) Deploy code
func azure functionapp publish mcp-func-app-XXXX

RBAC roles to assign

  • Deployment automation: Contributor on the resource group or scoped roles to Function App and Storage Account.
  • Function App management: Azure Functions Contributor (for CI/CD and app updates).
  • Storage access (if using Managed Identity to access blobs/queues): Storage Blob Data Reader or Storage Blob Data Contributor as needed, following least privilege.
  • Application Insights access (read-only dashboards): Monitoring Reader.

Validation and typing on the client

If you invoke this API from a TypeScript app (e.g., React), validate payloads before sending. Below is a strictly typed example using Zod.

// TypeScript (strict) example with Zod
import { z } from "zod";

const CreateOrderRequest = z.object({
  CustomerId: z.string().min(3).max(64),
  Sku: z.string().min(2).max(64),
  Quantity: z.number().int().positive()
});
type CreateOrderRequest = z.infer<typeof CreateOrderRequest>;

async function createOrder(apiBase: string, token: string, body: CreateOrderRequest) {
  const payload = CreateOrderRequest.parse(body); // Validates at runtime
  const res = await fetch(`${apiBase}/api/orders`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${token}`
    },
    body: JSON.stringify(payload)
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return await res.json() as { OrderId: string; Status: string; Message: string };
}

Best Practices & Security

  • Authentication: Use Azure AD (EasyAuth) with the Custom Connector. Avoid Function Keys to eliminate shared secrets.
  • Authorization: Scope access by Entra app roles or scopes; assign least-privilege RBAC to managed identities.
  • Secrets: Prefer Managed Identity and DefaultAzureCredential to access downstream services. Avoid storing credentials in app settings.
  • Validation: Enforce input validation on the server (FluentValidation) and on the client (Zod) for robust defense-in-depth.
  • Observability: Enable Application Insights. Correlate requests by logging operation IDs and include key business fields.
  • Resiliency: Add retry policies on the client, idempotency keys for order creation, and request timeouts to prevent retries creating duplicates.
  • Versioning: Version endpoints (e.g., /api/v1/orders) and keep your Custom Connector mapped to a specific version.
  • Cost: Consumption plan scales to zero; set sampling in Application Insights to control telemetry cost.
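The resiliency bullet deserves a concrete shape. A client-side sketch (TypeScript; the "Idempotency-Key" header name and helper names are conventions, not part of the API above): capped exponential backoff plus a deterministic key, so retried CreateOrder calls cannot create duplicate orders.

```typescript
// Capped exponential backoff: 200ms, 400ms, 800ms, ... up to capMs.
function backoffDelays(attempts: number, baseMs = 200, capMs = 5_000): number[] {
  return Array.from({ length: attempts }, (_, i) => Math.min(capMs, baseMs * 2 ** i));
}

// Deterministic idempotency key: retrying the same logical request resends the
// same key (e.g. as an "Idempotency-Key" header), letting the server de-duplicate.
function idempotencyKey(customerId: string, sku: string, clientRequestId: string): string {
  return `${customerId}:${sku}:${clientRequestId}`;
}
```

A real client would pair these with a per-attempt timeout (AbortController) so slow requests fail fast instead of stacking up retries.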


Monitoring and alerting

With Application Insights, create alerts on failed requests or latency thresholds.

// Examples (Kusto)
// 1) High failure rate in the last 15 minutes
requests
| where timestamp > ago(15m)
| summarize FailureRate = 100.0 * countif(success == false) / count()

// 2) P95 latency by operation
requests
| where timestamp > ago(1h)
| summarize p95_duration = percentile(duration, 95) by operation_Name

Summary

  • Build your MCP-style server for Power Apps with .NET 8 isolated Azure Functions, DI, and FluentValidation.
  • Secure the API with Azure AD and avoid Function Keys; integrate via a Custom Connector using OAuth 2.0.
  • Ship with OpenAPI, monitoring, and clear RBAC to ensure production readiness.

Surface What’s New: Power Apps integration with a secure .NET 8 API for MCP server updates

The question "what's new in Power Apps for MCP server?" lacks a precise product definition, so rather than speculating on features, this guide shows how to reliably surface and version "What’s New" updates in Power Apps using a secure .NET 8 backend, strict TypeScript validation, and Azure-native security. You will get a production-ready pattern that Power Apps can call via a Custom Connector, keeping your release notes current without manual edits.

The Problem

Teams need a trustworthy way to display "What’s New" for MCP server in Power Apps, but the upstream source and format of updates can change. Hardcoding content or querying unsecured endpoints leads to drift, security gaps, and poor developer experience.

Prerequisites

  • .NET 8 SDK
  • Node.js 20+ and a package manager (pnpm/npm)
  • Azure subscription with permissions to create: Azure Functions or Container Apps, Key Vault, Azure API Management, Application Insights
  • Entra ID app registration for the API (OAuth 2.0)
  • Power Apps environment (to build a Custom Connector)

The Solution (Step-by-Step)

1) .NET 8 Minimal API that normalizes "What’s New" items

This minimal API demonstrates production-ready design: DI-first, HttpClientFactory, global exception handling, validation, versioned contract, and managed identity for downstream access.

// Program.cs (.NET 8, file-scoped namespace, minimal API, DI-centric)
using System.Net.Http.Json;
using System.Text.Json;
using System.Text.Json.Serialization;
using Azure.Identity; // DefaultAzureCredential for managed identity
using Azure.Security.KeyVault.Secrets;
using Microsoft.AspNetCore.Diagnostics;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Azure;

var builder = WebApplication.CreateBuilder(args);

// Strongly typed options for upstream source configuration
builder.Services.Configure<NewsOptions>(builder.Configuration.GetSection("News"));

// HttpClient with resilient handler
builder.Services.AddHttpClient<INewsClient, NewsClient>()
    .SetHandlerLifetime(TimeSpan.FromMinutes(5));

// Azure clients via DefaultAzureCredential (uses Managed Identity in Azure)
builder.Services.AddAzureClients(azure =>
{
    azure.UseCredential(new DefaultAzureCredential());
    var kvUri = builder.Configuration["KeyVaultUri"]; // e.g., https://my-kv.vault.azure.net/
    if (!string.IsNullOrWhiteSpace(kvUri))
    {
        azure.AddSecretClient(new Uri(kvUri));
    }
});

// Application Insights (OpenTelemetry auto-collection can also be used)
builder.Services.AddApplicationInsightsTelemetry();

var app = builder.Build();

// Global exception handler producing problem+json
app.UseExceptionHandler(errorApp =>
{
    errorApp.Run(async context =>
    {
        var feature = context.Features.Get<IExceptionHandlerPathFeature>();
        var problem = new ProblemDetails
        {
            Title = "Unexpected error",
            Detail = app.Environment.IsDevelopment() ? feature?.Error.ToString() : "",
            Status = StatusCodes.Status500InternalServerError,
            Type = "https://httpstatuses.com/500"
        };
        context.Response.StatusCode = problem.Status ?? 500;
        context.Response.ContentType = "application/problem+json";
        await context.Response.WriteAsJsonAsync(problem);
    });
});

app.MapGet("/health", () => Results.Ok(new { status = "ok" }));

// Versioned route for the normalized news feed (v1)
app.MapGet("/api/v1/news", async (
    INewsClient client
) => Results.Ok(await client.GetNewsAsync()))
.WithName("GetNewsV1")
.Produces<NewsItemV1[]>(StatusCodes.Status200OK)
.ProducesProblem(StatusCodes.Status500InternalServerError);

app.Run();

// Options to control the upstream feed location and parsing mode
public sealed class NewsOptions
{
    public string? SourceUrl { get; init; } // Upstream JSON or RSS converted via a worker
    public string Format { get; init; } = "json"; // json|rss (extend as needed)
}

// Public DTO contract exposed to Power Apps via Custom Connector
public sealed class NewsItemV1
{
    public required string Id { get; init; } // stable identifier
    public required string Title { get; init; }
    public required string Summary { get; init; }
    public required DateTimeOffset PublishedAt { get; init; }
    public string? Category { get; init; } // optional taxonomy
    public string? Url { get; init; } // link to detail page
}

// Client interface for fetching and normalizing upstream data
public interface INewsClient
{
    Task<NewsItemV1[]> GetNewsAsync(CancellationToken ct = default);
}

// Implementation that reads upstream source, validates, and normalizes
public sealed class NewsClient(HttpClient http, Microsoft.Extensions.Options.IOptions<NewsOptions> options) : INewsClient
{
    private readonly HttpClient _http = http;
    private readonly NewsOptions _opts = options.Value;
    private static readonly JsonSerializerOptions JsonOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
    };

    public async Task<NewsItemV1[]> GetNewsAsync(CancellationToken ct = default)
    {
        if (string.IsNullOrWhiteSpace(_opts.SourceUrl))
            throw new InvalidOperationException("News:SourceUrl is not configured.");

        // Fetch upstream JSON and map to a stable contract consumed by Power Apps
        var upstreamItems = await _http.GetFromJsonAsync<UpstreamItem[]>(_opts.SourceUrl, JsonOptions, ct)
            ?? Array.Empty<UpstreamItem>();

        // Normalize and order by publish date desc
        return upstreamItems
            .Select(u => new NewsItemV1
            {
                Id = u.Id ?? Guid.NewGuid().ToString("n"),
                Title = u.Title ?? "Untitled",
                Summary = u.Summary ?? string.Empty,
                PublishedAt = u.PublishedAt == default ? DateTimeOffset.UtcNow : u.PublishedAt,
                Category = u.Category,
                Url = u.Url
            })
            .OrderByDescending(n => n.PublishedAt)
            .ToArray();
    }

    // Internal model matching the upstream source (keep flexible)
    private sealed class UpstreamItem
    {
        public string? Id { get; init; }
        public string? Title { get; init; }
        public string? Summary { get; init; }
        public DateTimeOffset PublishedAt { get; init; }
        public string? Category { get; init; }
        public string? Url { get; init; }
    }
}

Configuration (appsettings.json):

{
  "News": {
    "SourceUrl": "https://<your-source>/mcp-news.json",
    "Format": "json"
  },
  "KeyVaultUri": "https://<your-kv>.vault.azure.net/"
}


2) Secure Azure deployment with Managed Identity and API Management

  • Deploy as Azure Functions (isolated) or Azure Container Apps. Enable System-Assigned Managed Identity.
  • Expose through Azure API Management with OAuth 2.0 (Entra ID) for inbound auth. Create a Power Apps Custom Connector pointing to APIM.

Required RBAC roles (assign to the managed identity or DevOps service principal):

  • Key Vault Secrets User (to read secrets if you store upstream source credentials or API keys)
  • App Configuration Data Reader (if using Azure App Configuration instead of appsettings)
  • API Management Service Contributor (to publish and manage the API surface)
  • Monitoring Reader (to view Application Insights telemetry)
  • Storage Blob Data Reader (only if the upstream source is in Azure Storage)

Pro-Tip: Favor Managed Identity and DefaultAzureCredential across services; avoid connection strings and embedded secrets entirely.

3) Strict TypeScript models with Zod and versioning

The client schema mirrors the v1 API and can evolve with v2+ while keeping backward compatibility in Power Apps and React.

// news.schema.ts (TypeScript, strict mode)
import { z } from "zod";

// Discriminated union enables future breaking changes with clear versioning
export const NewsItemV1 = z.object({
  id: z.string().min(1),
  title: z.string().min(1),
  summary: z.string().default(""),
  publishedAt: z.string().datetime(),
  category: z.string().optional(),
  url: z.string().url().optional()
});

export const NewsResponseV1 = z.array(NewsItemV1);

export type TNewsItemV1 = z.infer<typeof NewsItemV1>;
export type TNewsResponseV1 = z.infer<typeof NewsResponseV1>;

// Future-proof: union for versioned responses
export const NewsResponse = z.union([
  z.object({ version: z.literal("v1"), data: NewsResponseV1 })
]);
export type TNewsResponse = z.infer<typeof NewsResponse>;

4) React 19 component using TanStack Query

Functional component with error and loading states, plus Zod runtime validation.

// NewsPanel.tsx (React 19)
import React from "react";
import { useQuery } from "@tanstack/react-query";
import { NewsResponseV1, type TNewsItemV1 } from "./news.schema";

async function fetchNews(): Promise<TNewsItemV1[]> {
  const res = await fetch("/api/v1/news", { headers: { Accept: "application/json" } });
  if (!res.ok) throw new Error(`Failed: ${res.status}`);
  const json = await res.json();
  const parsed = NewsResponseV1.safeParse(json);
  if (!parsed.success) throw new Error("Schema validation failed");
  return parsed.data;
}

export function NewsPanel(): React.JSX.Element {
  const { data, error, isLoading } = useQuery({
    queryKey: ["news", "v1"],
    queryFn: fetchNews,
    staleTime: 60_000
  });

  if (isLoading) return <div>Loading updates…</div>;
  if (error) return <div>Failed to load updates.</div>;

  return (
    <ul>
      {data!.map(item => (
        <li key={item.id}>
          <strong>{item.title}</strong> — {new Date(item.publishedAt).toLocaleDateString()}<br />
          {item.summary}
        </li>
      ))}
    </ul>
  );
}

5) Power Apps Custom Connector

Create a Custom Connector targeting the APIM endpoint /api/v1/news. Map the response to your app data schema. Add the connector to your Power App and display the items in a gallery. When the upstream feed changes, you only update the backend normalizer, not the app.

Best Practices & Security

  • Authentication: Use Entra ID for APIM inbound auth. Backend-to-Azure uses DefaultAzureCredential with Managed Identity.
  • Secrets: Store upstream tokens in Key Vault; assign Key Vault Secrets User to the app’s managed identity.
  • Telemetry: Enable Application Insights. Track request IDs and dependency calls for the upstream fetch.
  • Authorization: Restrict APIM access via OAuth scopes and, if needed, IP restrictions or rate limits.
  • Resilience: Configure retry and timeout on HttpClient with sensible limits; add circuit breakers if the upstream is unreliable.
  • Versioning: Pin /api/v1/news; introduce /api/v2/news when the contract changes. Version TypeScript schemas alongside API versions.
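The circuit-breaker idea from the resilience bullet can be sketched as a tiny state machine (TypeScript; thresholds and names are illustrative — in .NET you would typically reach for a library such as Polly instead):

```typescript
// After `threshold` consecutive failures the circuit opens and calls are
// rejected until `cooldownMs` has elapsed, then one trial call is allowed.
class CircuitBreaker {
  private failures = 0;
  private openedAt: number | null = null;

  constructor(private threshold = 3, private cooldownMs = 30_000) {}

  canCall(now: number): boolean {
    if (this.openedAt === null) return true;
    if (now - this.openedAt >= this.cooldownMs) {
      // Half-open: allow one trial call after the cooldown
      this.openedAt = null;
      this.failures = 0;
      return true;
    }
    return false;
  }

  recordSuccess(): void {
    this.failures = 0;
    this.openedAt = null;
  }

  recordFailure(now: number): void {
    this.failures += 1;
    if (this.failures >= this.threshold) this.openedAt = now;
  }
}
```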

App Insights integration (example):

// Add to Program.cs before app.Run(); ensure Application Insights is enabled
app.Use(async (ctx, next) =>
{
    // Correlate requests with upstream calls via trace IDs
    ctx.Response.Headers["Request-Id"] = System.Diagnostics.Activity.Current?.Id ?? Guid.NewGuid().ToString();
    await next();
});

Testing strategy:

  • API: Unit test NewsClient with mocked HttpMessageHandler; integration test /api/v1/news in-memory.
  • TypeScript: Schema tests to ensure validation rejects malformed payloads; component tests for loading/error states.
  • Contract: Add a CI step that fetches a sample payload from the upstream and validates against NewsResponseV1.
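The schema tests can be illustrated without a test runner. Below is a dependency-free mirror of the NewsItemV1 rules (the real tests would call NewsResponseV1.safeParse); it shows the kind of malformed payloads the suite must reject.

```typescript
// Dependency-free mirror of the NewsItemV1 checks, for illustration only.
type Candidate = Record<string, unknown>;

function isValidNewsItemV1(item: Candidate): boolean {
  return (
    typeof item.id === "string" && item.id.length > 0 &&
    typeof item.title === "string" && item.title.length > 0 &&
    typeof item.summary === "string" &&
    typeof item.publishedAt === "string" && !Number.isNaN(Date.parse(item.publishedAt))
  );
}

// A valid item and one with an empty id and a bad timestamp
const good: Candidate = { id: "1", title: "Update", summary: "", publishedAt: "2026-02-01T00:00:00Z" };
const bad: Candidate = { id: "", title: "Update", summary: "", publishedAt: "not-a-date" };
```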

Security roles recap:

  • API surface: API Management Service Contributor
  • Secrets: Key Vault Secrets User
  • Config (if used): App Configuration Data Reader
  • Monitoring: Monitoring Reader
  • Storage (if used): Storage Blob Data Reader

Pro-Tip: Use APIM policies (validate-content, rate-limit-by-key) to protect the API consumed by Power Apps.
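A sketch of what those policies look like in the APIM inbound section (values and counter keys are illustrative; check the policy reference for your tier):

```xml
<inbound>
  <base />
  <!-- Throttle per caller (here keyed by IP; a subscription or token claim also works) -->
  <rate-limit-by-key calls="60" renewal-period="60"
      counter-key="@(context.Request.IpAddress)" />
  <!-- Reject unexpected content types and oversized payloads -->
  <validate-content unspecified-content-type-action="prevent"
      max-size="4096" size-exceeded-action="prevent">
    <content type="application/json" validate-as="json" action="prevent" />
  </validate-content>
</inbound>
```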

Summary

  • Do not guess "what’s new" for MCP server; centralize updates behind a stable, versioned API.
  • Use Managed Identity, DefaultAzureCredential, and APIM OAuth to secure end-to-end access for Power Apps.
  • Validate with Zod, monitor with Application Insights, and evolve safely via API/schema versioning.

Saturday, 7 February 2026

Mastering Dynamics CRM Plugin Triggers: Pre-Validation, Pre-Operation, Post-Operation, and Async with Azure-Ready Patterns

Dynamics CRM plugin triggers define when your custom logic runs in the Dataverse pipeline. If you understand how Dynamics CRM plugin triggers behave across Pre-Validation, Pre-Operation, Post-Operation, and Asynchronous execution, you can write reliable, idempotent, and production-ready business logic that scales with Azure.

The Problem

Developers struggle to pick the correct stage and execution mode for Dynamics 365/Dataverse plugins, causing issues like recursion, lost transactions, or performance bottlenecks. You need clear rules, copy-paste-safe examples, and guidance on automation, security, and Azure integration without manual portal steps.

Prerequisites

• .NET 8 SDK installed (for companion services and automation)
• Power Platform Tools (PAC CLI) installed
• Azure CLI (az) installed, logged in with least-privilege account
• Access to a Dataverse environment and solution where you can register plugins
• Basic familiarity with IPlugin, IPluginExecutionContext, and IServiceProvider

The Solution (Step-by-Step)

1) Know the stages and when to use each

• Pre-Validation (Stage 10, synchronous): Validate input early, block bad requests before the main transaction. Good for authorization and schema checks.
• Pre-Operation (Stage 20, synchronous): Mutate Target before it’s saved. Good for defaulting fields, data normalization, or cross-field validation.
• Post-Operation (Stage 40, synchronous): Runs after the record is saved, still in the transaction. Good for operations that must be atomic with the main operation (e.g., child record creation that must roll back with parent).
• Post-Operation (Asynchronous): Offload non-transactional, latency-tolerant work (notifications, integrations). Improves throughput and user experience.

2) Messages and images

• Common messages: Create, Update, Delete, Assign, SetState, Associate/Disassociate, Merge, Retrieve/RetrieveMultiple (use sparingly to avoid performance impact).
• Filtering attributes (Update): Only trigger when specific columns change to reduce overhead.
• Images: Use Pre-Image for old values, Post-Image for new values. Keep images minimal to reduce payload and improve performance.

3) Synchronous Pre-Operation example (mutate data safely)

Target framework note: Dataverse plug-in assemblies must target .NET Framework 4.6.2, so runtime support differs from .NET 8. The C# below uses modern syntax that still compiles for that target (with a recent LangVersion); always build against the framework your environment supports.

using System;
using System.Globalization;
using Microsoft.Xrm.Sdk;
using Microsoft.Extensions.DependencyInjection; // DI container for plugin-internal services

// File-scoped namespace for clean organization
namespace Company.Plugins;

// The Dataverse runtime instantiates plugins via the parameterless constructor; keep instances stateless and thread-safe.
public sealed class AccountNormalizeNamePlugin : IPlugin
{
    // Build a tiny DI container once per plugin instance to follow DI principles instead of static helpers.
    private readonly IServiceProvider _rootServices;

    public AccountNormalizeNamePlugin()
    {
        var services = new ServiceCollection();
        services.AddSingleton<INameNormalizer, TitleCaseNameNormalizer>();
        _rootServices = services.BuildServiceProvider();
    }

    public void Execute(IServiceProvider serviceProvider)
    {
        // Standard service access from the pipeline
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        // Guard: Ensure we only run on Update of account and when 'name' changes
        if (!string.Equals(context.PrimaryEntityName, "account", StringComparison.OrdinalIgnoreCase) ||
            !string.Equals(context.MessageName, "Update", StringComparison.OrdinalIgnoreCase))
        {
            return;
        }

        // Prevent recursion: depth should be 1 for first-level execution
        if (context.Depth > 1) return;

        var target = context.InputParameters.Contains("Target") ? context.InputParameters["Target"] as Entity : null;
        if (target == null) return;

        // Run only when 'name' was provided in this Update
        if (!target.Contains("name")) return;

        // Resolve our normalizer from DI
        var normalizer = _rootServices.GetRequiredService<INameNormalizer>();

        // Normalize 'name' to Title Case
        var originalName = target.GetAttributeValue<string>("name");
        var normalized = normalizer.Normalize(originalName);
        target["name"] = normalized;

        tracing.Trace($"AccountNormalizeNamePlugin: normalized '{originalName}' to '{normalized}'.");
    }
}

// Service abstraction for testability and SRP
public interface INameNormalizer
{
    string Normalize(string? input);
}

public sealed class TitleCaseNameNormalizer : INameNormalizer
{
    public string Normalize(string? input)
    {
        if (string.IsNullOrWhiteSpace(input)) return input ?? string.Empty;
        var textInfo = CultureInfo.InvariantCulture.TextInfo;
        return textInfo.ToTitleCase(input.Trim().ToLowerInvariant());
    }
}

Registration guidelines: Register this on account Update, Stage Pre-Operation (20), Synchronous, with filtering attributes = name. Add a minimal Pre-Image if you need original values.

4) Synchronous Post-Operation example (atomic child creation)

using System;
using Microsoft.Xrm.Sdk;

namespace Company.Plugins;

public sealed class ContactCreateWelcomeTaskPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        // Only run on Contact Create, after it is created (Post-Operation)
        if (!string.Equals(context.PrimaryEntityName, "contact", StringComparison.OrdinalIgnoreCase) ||
            !string.Equals(context.MessageName, "Create", StringComparison.OrdinalIgnoreCase))
        {
            return;
        }

        if (context.Depth > 1) return;

        var contactId = context.PrimaryEntityId;
        if (contactId == Guid.Empty) return;

        // Create a follow-up task; if this plugin throws, both contact and task roll back
        var task = new Entity("task");
        task["subject"] = "Welcome new contact";
        task["regardingobjectid"] = new EntityReference("contact", contactId);
        task["prioritycode"] = new OptionSetValue(1); // High
        service.Create(task);

        tracing.Trace("ContactCreateWelcomeTaskPlugin: created welcome task.");
    }
}

5) Asynchronous Post-Operation example (offload integration)

Use Async Post-Operation for non-transactional work such as calling Azure services. Prefer a durable, retry-enabled mechanism (queue, function) over direct HTTP. The plugin should enqueue a message; an Azure Function (managed identity) processes it.

using System;
using Microsoft.Xrm.Sdk;

namespace Company.Plugins;

public sealed class ContactCreatedEnqueueIntegrationPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        if (!string.Equals(context.PrimaryEntityName, "contact", StringComparison.OrdinalIgnoreCase) ||
            !string.Equals(context.MessageName, "Create", StringComparison.OrdinalIgnoreCase))
        {
            return;
        }

        // Idempotency key: use the contact id
        var contactId = context.PrimaryEntityId;
        if (contactId == Guid.Empty) return;

        // Example: write an integration record for downstream Azure Function (poll or Dataverse Change Tracking)
        // This avoids secrets and direct outbound calls from the plugin.
        var integrationLog = new Entity("new_integrationmessage"); // Custom table
        integrationLog["new_name"] = $"ContactCreated:{contactId}";
        integrationLog["new_payload"] = contactId.ToString();
        service.Create(integrationLog);

        tracing.Trace("ContactCreatedEnqueueIntegrationPlugin: queued integration message.");
    }
}

6) Automate registration with PAC CLI (no manual portal)

:: Batch snippet to build and register the assembly (use # for comments in PowerShell)
:: 1) Build plugin project (target a runtime supported by your environment)
dotnet build .\src\Company.Plugins\Company.Plugins.csproj -c Release

:: 2) Pack into a solution if applicable
pac solution pack --zipfile .\dist\CompanySolution.zip --folder .\solution

:: 3) Import or update solution into the environment
pac auth create --url https://<yourorg>.crm.dynamics.com --cloud Public
pac solution import --path .\dist\CompanySolution.zip --activate-plugins true

This keeps registration repeatable in CI/CD without manual steps.

7) Azure companion Minimal API (for outbound webhooks or admin tools)

For external processing, build a Minimal API or Azure Function with managed identity and Azure RBAC. Example Minimal API (.NET 8) that reads from Storage using DefaultAzureCredential.

using Azure;
using Azure.Identity;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Azure;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);

// Use DefaultAzureCredential to prefer Managed Identity in Azure and dev fallbacks locally
builder.Services.AddAzureClients(az =>
{
    az.UseCredential(new DefaultAzureCredential());
    az.AddBlobServiceClient(new Uri(builder.Configuration["BLOB_ENDPOINT"]!));
});

var app = builder.Build();

// Simple endpoint to fetch a blob; secure this behind Azure AD (AAD) in production
app.MapGet("/files/{name}", async (string name, BlobServiceClient blobs) =>
{
    // Access container 'docs' with RBAC: Storage Blob Data Reader/Contributor on the Managed Identity
    var container = blobs.GetBlobContainerClient("docs");
    var client = container.GetBlobClient(name);

    if (!await container.ExistsAsync()) return Results.NotFound("Container not found.");
    if (!await client.ExistsAsync()) return Results.NotFound("Blob not found.");

    var stream = await client.OpenReadAsync();
    return Results.Stream(stream, "application/octet-stream");
});

await app.RunAsync();

Required Azure RBAC role for the app's managed identity: Storage Blob Data Reader (read-only) or Storage Blob Data Contributor (read-write) on the storage account or specific container scope.
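If you assign the role with the Azure CLI instead of Bicep, the commands look like this; the resource group, identity, and storage account names are placeholders to substitute.

```
:: Look up the managed identity's principal id and the storage account's resource id
az identity show -g <rg> -n dv-plugin-mi --query principalId -o tsv
az storage account show -g <rg> -n <storageName> --query id -o tsv

:: Assign Storage Blob Data Reader at storage-account scope (least privilege for read-only)
az role assignment create --assignee-object-id <principalId> --assignee-principal-type ServicePrincipal --role "Storage Blob Data Reader" --scope <storageAccountId>
```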

8) IaC with Bicep: storage + managed identity + role assignment

// main.bicep
targetScope = 'resourceGroup'

param location string = resourceGroup().location
param storageName string
param identityName string = 'dv-plugin-mi'

// Storage Account
resource stg 'Microsoft.Storage/storageAccounts@2023-05-01' = {
  name: storageName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}

// User Assigned Managed Identity
resource uami 'Microsoft.ManagedIdentity/userAssignedIdentities@2023-01-31' = {
  name: identityName
  location: location
}

// Blob Data Reader role on storage for the identity
resource role 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(stg.id, uami.id, 'ba92f5b4-2d11-453d-a403-e96b0029c9fe') // Storage Blob Data Reader
  scope: stg
  properties: {
    principalId: uami.properties.principalId
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', 'ba92f5b4-2d11-453d-a403-e96b0029c9fe')
    principalType: 'ServicePrincipal'
  }
}

Deploy with: az deployment group create -g <rg> -f main.bicep -p storageName=<name>.

Best Practices & Security

Pick the right trigger

• Pre-Validation: Reject invalid input early (authorization, schema, required business rules).
• Pre-Operation: Mutate data before save; avoid external calls here.
• Post-Operation (sync): Keep logic small and deterministic to minimize transaction time.
• Post-Operation (async): Offload long-running and I/O-heavy work.
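When one plugin class is registered for more than one stage, you can branch on the active stage inside Execute. This is a sketch: the stage codes (10, 20, 40) are the platform's documented values, while the three method names are hypothetical.

```csharp
// Inside Execute(), after resolving IPluginExecutionContext as 'context'.
// Stage codes: 10 = Pre-Validation, 20 = Pre-Operation, 40 = Post-Operation.
switch (context.Stage)
{
    case 10:
        ValidateBusinessRules(context);   // guard: throw before the transaction starts
        break;
    case 20:
        MutateTarget(context);            // change attributes before save; no external calls
        break;
    case 40:
        CreateAtomicSideEffects(context); // keep small; runs inside the transaction when sync
        break;
}
```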

Recursion, idempotency, and performance

• Check context.Depth to prevent infinite loops.
• Use idempotency keys (primary entity id) in integration logs.
• Keep images and columns minimal; filter attributes to reduce trigger noise.
• Use AsNoTracking() in external EF Core services when reading data.

Security and authentication

• Use Azure AD and Managed Identity for external services; never store secrets in plugin code.
• Apply least privilege with Azure RBAC. Examples: Storage Blob Data Reader/Contributor for the app workload identity; Key Vault Secrets User if retrieving secrets via a separate process.
• In Dataverse, ensure the application user has the minimal security roles necessary for the operations (table-level privileges only on the entities it touches).

Automation and IaC

• Use PAC CLI and CI/CD to register and update plugins, avoiding manual portal steps.
• Use Bicep or azd to provision Azure resources, assign RBAC, and configure endpoints.

Error handling and resiliency

• Synchronous plugins should throw InvalidPluginExecutionException only for business errors that must roll back the transaction.
• For external work, prefer async steps that enqueue messages and rely on Azure Functions with retry policies and dead-letter queues (e.g., Azure Storage Queues or Service Bus).
• Trace key events with ITracingService for diagnosability.
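The first point can be sketched as a guard inside a synchronous plugin's Execute; the business rule here is hypothetical, but InvalidPluginExecutionException is the platform's mechanism for rolling back the transaction with a user-facing message.

```csharp
// Inside Execute(): 'target' is the Entity from context.InputParameters["Target"].
if (target.Contains("creditlimit") &&
    target.GetAttributeValue<Money>("creditlimit")?.Value < 0)
{
    tracing.Trace("Rejecting save: negative credit limit.");
    // Rolls back the transaction and shows this message to the user
    throw new InvalidPluginExecutionException("Credit limit cannot be negative.");
}
```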

Testing strategy

• Abstract logic behind interfaces and inject into the plugin to enable unit testing without Dataverse.
• Use fakes for IOrganizationService and validate behavior under different stages and messages.
• Add integration tests in a sandbox environment using PAC CLI to seed and verify behavior.
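For instance, the TitleCaseNameNormalizer from the Pre-Operation example can be tested with no Dataverse dependency at all; xUnit is assumed here, but any test framework works.

```csharp
using Xunit;

public sealed class TitleCaseNameNormalizerTests
{
    private readonly TitleCaseNameNormalizer _sut = new();

    [Fact]
    public void Normalize_TrimsAndTitleCases()
        => Assert.Equal("Contoso Ltd", _sut.Normalize("  CONTOSO LTD "));

    [Fact]
    public void Normalize_ReturnsEmptyForNull()
        => Assert.Equal(string.Empty, _sut.Normalize(null));
}
```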

References

• Azure RBAC built-in roles: https://learn.microsoft.com/azure/role-based-access-control/built-in-roles
• DefaultAzureCredential: https://learn.microsoft.com/dotnet/api/azure.identity.defaultazurecredential
• Power Platform CLI: https://learn.microsoft.com/power-platform/developer/cli/introduction

Summary

• Choose the correct trigger: Pre-Validation for guards, Pre-Operation for mutation, Post-Operation for atomic side-effects, Async for integrations.
• Enforce security: Managed Identity for auth, Azure RBAC with least privilege, and no secrets in code.
• Automate everything: PAC CLI for plugin registration, Bicep for Azure resources, and add retries and dead-lettering for resilient async flows.

Sunday, 1 February 2026

MCP Server in AI: A Complete Guide to the Model Context Protocol for Tool-Enabled AI

What Is an MCP Server in AI?

The term MCP server in AI refers to a server that implements the Model Context Protocol (MCP), a standardized way for AI clients (like chat assistants or agents) to securely access tools, data sources, and workflows. An MCP server exposes capabilities—such as APIs, databases, files, prompts, and utility functions—so AI systems can request them in a predictable, controlled manner.

Why MCP Matters

MCP creates a consistent contract between AI clients and external resources. Instead of bespoke integrations, developers can add or swap back-end capabilities with less friction. This improves maintainability, security, and reliability while enabling richer, more grounded AI behavior.

  • Standardization: One protocol to expose many tools/resources.
  • Security: Clear permissions and controlled access to data and actions.
  • Scalability: Add new tools or data sources without redesigning the AI client.
  • Traceability: Requests and responses are structured for logging and auditing.

How an MCP Server Works

At a high level, the AI client connects to an MCP server and discovers what it can do. The client then issues structured requests for actions or data, and the MCP server fulfills them via its configured tools and resources.

Core Components

  • Client: The AI application (chatbot/agent) that understands MCP and sends requests.
  • Server: The MCP endpoint that advertises capabilities and executes requests.
  • Tools: Actions the server can perform (e.g., call an API, run a query, send an email).
  • Resources: Data the server can read (files, database tables, knowledge bases).
  • Prompts/Templates: Reusable instruction blocks or chains the client can invoke via the server.
  • Sessions: Contextual interactions that can track state across multiple requests.

Typical Request Flow

  • Capability discovery: The client lists available tools/resources from the MCP server.
  • Request: The client sends a structured call (e.g., a tools/call request with typed parameters).
  • Execution: The server runs the tool or fetches the resource safely and deterministically.
  • Response: The server returns results with metadata (status, content type, usage notes).
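On the wire, MCP uses JSON-RPC 2.0. The exchange below sketches the discovery and invocation steps; tools/list and tools/call are the spec's method names, while the search_documents tool and its arguments are hypothetical.

```json
// Client -> server: discover available tools
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

// Client -> server: invoke a (hypothetical) document search tool
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_documents",
    "arguments": { "query": "renewal policy", "top": 5 }
  }
}

// Server -> client: structured result with content and status metadata
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [ { "type": "text", "text": "3 documents matched the query." } ],
    "isError": false
  }
}
```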

Benefits for Teams and Developers

  • Faster integrations: Plug in new data sources or utilities via MCP without rewriting the client.
  • Access control: Gate sensitive operations and monitor usage centrally.
  • Consistency: Uniform patterns for error handling, timeouts, and retries.
  • Observability: Better logs and diagnostics for AI tool calls.

Use Cases and Examples

Enterprise Knowledge and Operations

  • Search internal documents: A tool that queries a document index or enterprise search.
  • Pull CRM records: Read-only resource access to customer profiles and activity history.
  • Create tickets: A tool to open an issue in a tracker with validated fields.

Data and Analytics

  • SQL query tool: Safely run parameterized queries against a data warehouse.
  • Metrics fetcher: Read metrics or dashboards for real-time insights.
  • Report generator: Produce summarized reports and export to files.

Automation and Productivity

  • Email sender: A tool to draft and send emails with approval steps.
  • Calendar manager: Create and modify events with conflict checks.
  • File utilities: Read, write, and transform files with strict path controls.

Security and Best Practices

  • Principle of least privilege: Expose only the tools and data needed.
  • Input validation: Enforce schemas and sanitize parameters for tools.
  • Audit logging: Log requests, results, and errors with minimal sensitive data.
  • Rate limiting and quotas: Prevent abuse and control costs.
  • Secrets management: Store API keys and credentials securely, never in prompts.

High-Level Setup Steps

  • Define capabilities: Identify which tools, resources, and prompts to expose.
  • Implement adapters: Connect to APIs, databases, and file systems with constrained permissions.
  • Describe schemas: Use structured inputs/outputs to ensure predictable behavior.
  • Configure policies: Authentication, authorization, and rate limits per tool or resource.
  • Test and observe: Validate responses, edge cases, and error handling with logs and metrics.
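A tool descriptor produced by the "describe schemas" step might look like the sketch below; the create_ticket tool is hypothetical, and inputSchema is standard JSON Schema as used by MCP tool definitions.

```json
{
  "name": "create_ticket",
  "description": "Open an issue in the tracker with validated fields.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "title":    { "type": "string", "maxLength": 120 },
      "priority": { "type": "string", "enum": ["low", "medium", "high"] }
    },
    "required": ["title"]
  }
}
```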

FAQ

Is an MCP server the same as a normal API?

No. An MCP server is a standardized interface purpose-built for AI clients to discover and use tools/resources consistently, whereas a normal API is typically application-specific.

Can I use MCP with existing systems?

Yes. You can wrap existing APIs, databases, or automation scripts as MCP tools/resources with appropriate permissions and validation.

How does MCP help with reliability?

By enforcing structured calls, typed parameters, and clear error semantics, MCP reduces ambiguity and makes failures easier to detect and recover from.
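A failed tools/call, for example, comes back in the same structured envelope as a success, which is what makes failures easy to detect and handle uniformly; the error text below is illustrative.

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "result": {
    "content": [ { "type": "text", "text": "Query rejected: parameter 'top' must be 50 or less." } ],
    "isError": true
  }
}
```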

Key Takeaways

  • An MCP server in AI standardizes how AI clients access tools, data, and workflows.
  • It improves security, observability, and maintainability for AI-enabled applications.
  • Adopt best practices—least privilege, validation, logging—to run MCP safely at scale.