Thursday, 15 January 2026

Const vs readonly in C#: Practical Rules, .NET 8 Examples, and When to Use Each

Const vs readonly in C#: use const for compile-time literals that never change and readonly for runtime-initialized values that should not change after construction. This article shows clear rules, .NET 8 examples with Dependency Injection, and production considerations so you pick the right tool every time.

The Problem

Mixing const and readonly without intent leads to brittle releases, hidden performance costs, and binary-compatibility breaks. You need a simple, reliable decision framework and copy-paste-ready code that works in modern .NET 8 Minimal APIs with DI.

Prerequisites

  • .NET 8 SDK: Needed to compile and run the Minimal API and C# 12 features.
  • An editor (Visual Studio Code or Visual Studio 2022+): For building and debugging the examples.
  • Azure CLI (optional): If you apply the security section with Managed Identity and RBAC for external configuration.

The Solution (Step-by-Step)

1) What const means

  • const is a compile-time constant. The value is inlined at call sites during compilation.
  • Allowed only for types whose values can be compile-time constants: the primitive numeric types, char, bool, string, and enum types.
  • Changing a public const value in a library can break consumers until they recompile, because callers hold the old inlined value.
// File: AppConstants.cs
namespace MyApp;

// Static class is acceptable here because it only holds constants and does not manage state or dependencies.
public static class AppConstants
{
    // Compile-time literals. Safe to inline and extremely fast to read.
    public const string AppName = "OrdersService";    // Inlined at compile-time
    public const int DefaultPageSize = 50;             // Only use if truly invariant
}

2) What readonly means

  • readonly fields are assigned exactly once at runtime: either at the declaration or in a constructor.
  • Use readonly when the value is not known at compile-time (e.g., injected through DI, environment-based, or computed) but must not change after creation.
  • static readonly is runtime-initialized once per type and is not inlined by callers, preserving binary compatibility across versions.
// File: Slug.cs
namespace MyApp;

// Simple immutable value object using readonly field.
public sealed class Slug
{
    public readonly string Value; // Assigned once; immutable thereafter.

    public Slug(string value)
    {
        // Validate then assign. Once assigned, cannot change.
        Value = string.IsNullOrWhiteSpace(value)
            ? throw new ArgumentException("Slug cannot be empty")
            : value.Trim().ToLowerInvariant();
    }
}

3) Prefer static readonly for non-literal shared values

  • Use static readonly for objects like Regex, TimeSpan, Uri, or configuration-derived values that are constant for the process lifetime.
// File: Parsing.cs
using System.Text.RegularExpressions;

namespace MyApp;

public static class Parsing
{
    // Compiled Regex cached for reuse. Not a compile-time literal, so static readonly, not const.
    public static readonly Regex SlugPattern = new(
        pattern: "^[a-z0-9-]+$",
        options: RegexOptions.Compiled | RegexOptions.CultureInvariant
    );
}

4) Minimal API (.NET 8) with DI using readonly

// File: Program.cs
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Options;
using MyApp;

var builder = WebApplication.CreateBuilder(args);

// Bind options from configuration once. Keep them immutable after construction.
builder.Services.Configure<PaginationOptions>(builder.Configuration.GetSection("Pagination"));

// Register ProductService for DI.
builder.Services.AddSingleton<ProductService>();

var app = builder.Build();

// Use const for literal tags and other truly invariant strings.
app.MapGet("/products", (ProductService svc, int? pageSize) => svc.List(pageSize))
   .WithTags(AppConstants.AppName);

app.Run();

// Top-level statements must precede type declarations, so the types live below
// (or, more typically, in their own files).
namespace MyApp
{
    // Options record for settings that may vary by environment.
    public sealed record PaginationOptions(int DefaultPageSize, int MaxPageSize);

    // Service depending on options. Primary constructor (C# 12) for clarity.
    public sealed class ProductService(IOptions<PaginationOptions> options)
    {
        private readonly int _defaultPageSize = options.Value.DefaultPageSize; // readonly: set once from DI

        public IResult List(int? pageSize)
        {
            // Enforce the immutable default from DI; callers can't mutate _defaultPageSize.
            var size = pageSize is > 0 ? pageSize.Value : _defaultPageSize;
            return Results.Ok(new { PageSize = size, Source = "DI/readonly" });
        }
    }
}

5) When to use which (decision rules)

  • Use const when: the value is a true literal that will never change across versions, and you accept inlining (e.g., mathematical constants, semantic tags, fixed route segments).
  • Use readonly when: the value is computed, injected, environment-specific, or may change across versions without forcing consumer recompilation.
  • Use static readonly for: reference types (Regex, Uri) and structs (TimeSpan, Guid) that cannot be expressed as compile-time constants but are shared across the app.
  • Avoid public const in libraries for values that might change; prefer public static readonly to avoid binary-compatibility issues.

6) Performance and threading

  • const reads are effectively free due to inlining.
  • static readonly reads are a single memory read; their initialization is thread-safe under CLR type-initializer semantics (see the sketch after this list).
  • RegexOptions.Compiled with static readonly avoids repeated parsing and allocation under load.
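
To illustrate the type-initializer point above, here is a minimal sketch (the class and its contents are illustrative); the CLR runs the initializer exactly once, before first use, even under concurrent access:

using System.Collections.Generic;

namespace MyApp;

public static class StatusCodes
{
    // The type initializer assigns this exactly once, before first use,
    // and the CLR guarantees that assignment is thread-safe.
    public static readonly IReadOnlyDictionary<string, int> Map = new Dictionary<string, int>
    {
        ["active"] = 1,
        ["archived"] = 2
    };
}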

7) Advanced: readonly struct for immutable value types

  • Use readonly struct to guarantee that no instance member mutates state and to let the compiler skip defensive copies.
  • Prefer struct only for small, immutable value types to avoid copying overhead.
// File: Money.cs
namespace MyApp;

public readonly struct Money
{
    public decimal Amount { get; }
    public string Currency { get; }

    public Money(decimal amount, string currency)
    {
        Amount = amount;
        Currency = currency;
    }

    // Methods cannot mutate fields because the struct is readonly.
    public Money Convert(decimal rate) => new(Amount * rate, Currency);
}

8) Binary compatibility and versioning

  • Public const values are inlined into consuming assemblies. If you change the const and do not recompile consumers, they keep the old value. This is a breaking behavior.
  • Public static readonly values are not inlined. Changing them in your library updates behavior without requiring consumer recompilation (see the sketch after this list).
  • Guideline: For public libraries, avoid public const except for values guaranteed to never change (e.g., mathematical constants or protocol IDs defined as forever-stable).
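
To make the inlining difference concrete, here is a minimal sketch (LibraryInfo and its values are illustrative): a consumer compiled against an older package keeps the inlined const until it is rebuilt, while the static readonly value always comes from the library loaded at runtime.

// File: LibraryInfo.cs (shipped in a class library)
namespace MyLibrary;

public static class LibraryInfo
{
    // Inlined into every consumer at compile time; changing it requires consumers to recompile.
    public const string ProtocolId = "orders-v1";

    // Read from the library at runtime; shipping a new package version updates consumers automatically.
    public static readonly string BuildLabel = "2026.01.15";
}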

9) Testing and static analysis

  • Roslyn analyzers: Enable CA1802 (use const) to suggest const when fields can be made const; enable IDE0044 to suggest readonly for fields assigned only in constructor.
  • CI/CD: Treat analyzer warnings as errors for categories Design, Performance, and Style to enforce immutability usage consistently.
  • Unit tests: Assert immutability by verifying no public setters exist and by attempting to mutate through reflection only in dedicated tests if necessary. A minimal example follows this list.
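
As a minimal illustration of the unit-test bullet (xUnit is assumed; Slug is the value object from step 2), a reflection check can assert that a type exposes no public setters:

using System.Linq;
using System.Reflection;
using MyApp;
using Xunit;

namespace MyApp.Tests;

public class ImmutabilityTests
{
    [Fact]
    public void Slug_Has_No_Public_Setters()
    {
        // Any publicly settable property would break the immutability contract.
        var writableProperties = typeof(Slug)
            .GetProperties(BindingFlags.Public | BindingFlags.Instance)
            .Where(p => p.SetMethod is { IsPublic: true });

        Assert.Empty(writableProperties);
    }
}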

10) Cross-language note: TypeScript immutability parallel

If your stack includes TypeScript, mirror the C# intent with readonly and schema validation.

// File: settings.ts
// Strict typing; no 'any'. Enforce immutability on config and validate with Zod.
import { z } from "zod";

// Zod schema for runtime validation
export const ConfigSchema = z.object({
  apiBaseUrl: z.string().url(),
  defaultPageSize: z.number().int().positive(),
}).strict();

export type Config = Readonly<{
  apiBaseUrl: string;           // readonly by type
  defaultPageSize: number;      // readonly by type
}>;

export function loadConfig(env: NodeJS.ProcessEnv): Config {
  // Validate at runtime, then freeze object to mimic readonly semantics
  const parsed = ConfigSchema.parse({
    apiBaseUrl: env.API_BASE_URL,
    defaultPageSize: Number(env.DEFAULT_PAGE_SIZE ?? 50),
  });
  return Object.freeze(parsed) as Config;
}

Best Practices & Security

  • Best Practice: Use const only for literals that are guaranteed stable across versions. For anything configuration-related, prefer readonly or static readonly loaded via DI.
  • Best Practice: Static classes holding only const or static readonly are acceptable because they do not manage state or dependencies.
  • Security: If loading values from Azure services (e.g., Azure App Configuration or Key Vault), use Managed Identity instead of connection strings. Grant the minimal RBAC roles required: for Azure App Configuration, assign App Configuration Data Reader to the managed identity; for Key Vault, assign Key Vault Secrets User; for reading resource metadata, the Reader role is sufficient. Do not embed secrets in const or readonly fields. A minimal sketch follows this list.
  • Operational Safety: Avoid public const for values that may change; use public static readonly to prevent consumer inlining issues and to reduce breaking changes.
  • Observability: Expose configuration values carefully in logs; never log secrets. If you must log, redact or hash values and keep them in readonly fields populated via DI.
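
As a minimal sketch of the Managed Identity pattern above (assuming the Microsoft.Extensions.Configuration.AzureAppConfiguration package is referenced; the AppConfig:Endpoint setting name is illustrative), configuration is pulled at startup with DefaultAzureCredential and then bound into immutable options:

// File: Program.cs (excerpt)
using Azure.Identity;

var builder = WebApplication.CreateBuilder(args);

// Connect with DefaultAzureCredential (Managed Identity in Azure, developer sign-in locally); no connection string or secret.
var appConfigEndpoint = builder.Configuration["AppConfig:Endpoint"]; // e.g. https://myconfig.azconfig.io (illustrative)
if (!string.IsNullOrEmpty(appConfigEndpoint))
{
    builder.Configuration.AddAzureAppConfiguration(options =>
        options.Connect(new Uri(appConfigEndpoint), new DefaultAzureCredential()));
}

// Values flow into options bound once and held in readonly fields, as shown earlier.
builder.Services.Configure<PaginationOptions>(builder.Configuration.GetSection("Pagination"));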

Summary

  • Use const for true compile-time literals that never change; prefer static readonly for public values to avoid consumer recompilation.
  • Use readonly (and static readonly) for runtime-initialized, immutable values, especially when sourced via DI or environment configuration.
  • Harden production: enforce analyzers in CI, adopt Managed Identity with least-privilege RBAC, and avoid embedding secrets or changeable values in const.

Wednesday, 14 January 2026

What’s New in PnP for SPFx: PnPjs v3+, React Controls, and Secure Patterns

PnP for SPFx has evolved with practical updates that reduce bundle size, improve performance, and harden security. The problem: teams migrating or maintaining SPFx solutions are unsure which PnP changes truly matter and how to adopt them safely. The solution: adopt PnPjs v3+ modular imports, leverage updated PnP SPFx React Controls where it makes sense, and implement concrete RBAC permissions with least privilege. The value: smaller bundles, faster pages, and auditable access aligned to enterprise security.

The Problem

Developers building SPFx web parts and extensions need a clear, production-grade path to modern PnP usage. Without guidance, projects risk bloated bundles, brittle permissions, and fragile data access patterns.

Prerequisites

  • Node.js v20+
  • SPFx v1.18+ (Yo @microsoft/sharepoint generator)
  • TypeScript 5+ with strict mode enabled
  • Office 365 tenant with App Catalog and permission to deploy apps
  • PnPjs v3+ and @pnp/spfx-controls-react
  • Optional: PnP PowerShell (latest), Azure CLI if integrating with Azure services

The Solution (Step-by-Step)

1) Adopt PnPjs v3+ with strict typing, ESM, and SPFx behavior

Use modular imports and the SPFx behavior to bind to the current context. Validate runtime data with Zod for resilient web parts.

/* Strict TypeScript example for SPFx with PnPjs v3+ */
import { spfi, SPFI, SPFx } from "@pnp/sp"; // Core PnPjs factory, interface, and the SPFx context behavior
import "@pnp/sp/webs"; // Bring in the web API surface (sp.web)
import "@pnp/sp/lists"; // Bring in lists API surface
import "@pnp/sp/items"; // Bring in list items API surface
import { z } from "zod"; // Runtime schema validation

// Minimal shape for data we expect from SharePoint
const TaskSchema = z.object({
  Id: z.number(),
  Title: z.string(),
  Status: z.string().optional(),
});

type Task = z.infer<typeof TaskSchema>;

// SPFx helper to create a bound SP instance. This avoids global state and is testable.
export function getSP(context: unknown): SPFI {
  // context should be the WebPartContext or Extension context
  return spfi().using(SPFx(context as object));
}

// Fetch list items with strong typing and runtime validation
export async function fetchTasks(sp: SPFI, listTitle: string): Promise<readonly Task[]> {
  // Select only the fields needed for minimal payloads
  const raw = await sp.web.lists.getByTitle(listTitle).items.select("Id", "Title", "Status")();
  // Validate at runtime to catch unexpected shapes
  const parsed = z.array(TaskSchema).parse(raw);
  return parsed;
}

Why this matters: smaller imports improve tree shaking, and behaviors keep your data layer clean, testable, and context-aware.

2) Use batching and caching behaviors for fewer round-trips

Batch multiple reads to reduce network overhead, and apply caching for read-heavy views.

import { spfi, SPFI, SPFx } from "@pnp/sp";
import "@pnp/sp/webs";
import "@pnp/sp/lists";
import "@pnp/sp/items";
import "@pnp/sp/batching"; // Adds the batched() method used below
import { Caching } from "@pnp/queryable"; // Behavior for query caching

export function getCachedSP(context: unknown): SPFI {
  return spfi().using(SPFx(context as object)).using(
    Caching({
      store: "local", // Use localStorage for simplicity; consider "session" for sensitive data
      // Expire cached entries after 30 seconds; tune to your UX needs
      expireFunc: () => new Date(Date.now() + 30_000),
    })
  );
}

export async function batchedRead(sp: SPFI, listTitle: string): Promise<{ count: number; first: string }> {
  // Create a batched instance
  const [batchedSP, execute] = sp.batched();

  // Queue multiple operations
  const itemsPromise = batchedSP.web.lists.getByTitle(listTitle).items.select("Id", "Title")();
  const topItemPromise = batchedSP.web.lists.getByTitle(listTitle).items.top(1).select("Title")();

  // Execute the batch
  await execute();

  const items = await itemsPromise;
  const top = await topItemPromise;

  return { count: items.length, first: (top[0]?.Title ?? "") };
}

Pro-Tip: Combine select, filter, and top to minimize payloads and speed up rendering.

3) Use PnP SPFx React Controls when they save time

Prefer controls that encapsulate complex, well-tested UX patterns. Examples:

  • PeoplePicker for directory-aware selection
  • FilePicker for consistent file selection
  • ListView for performant tabular data
import * as React from "react";
import { PeoplePicker, PrincipalType } from "@pnp/spfx-controls-react/lib/PeoplePicker";

// Strongly typed shape for selected people
export type Person = {
  id: string;
  text: string;
  secondaryText?: string;
};

// Derive the context type the control expects so the adapter stays strongly typed without 'any'.
type PeoplePickerContext = React.ComponentProps<typeof PeoplePicker>["context"];

type Props = {
  // Pass the SPFx context down from the web part instead of reading it from a global
  context: PeoplePickerContext;
  onChange: (people: readonly Person[]) => void;
};

export function PeopleSelector(props: Props): JSX.Element {
  return (
    <div>
      <PeoplePicker
        context={props.context}
        titleText="Select people"
        personSelectionLimit={3}
        principalTypes={[PrincipalType.User]}
        showtooltip
        required={false}
        onChange={(items) => {
          const mapped: readonly Person[] = items.map((i) => ({
            id: i.id !== undefined ? String(i.id) : "",
            text: i.text ?? "",
            secondaryText: i.secondaryText,
          }));
          props.onChange(mapped);
        }}
      />
    </div>
  );
}

Pro-Tip: Keep these controls behind thin adapters so you can swap or mock them in tests without touching business logic.

4) Streamline deployment with PnP PowerShell

Automate packaging and deployment to ensure consistent, auditable releases.

# Install: https://pnp.github.io/powershell/
# Deploy an SPFx package to the tenant app catalog and install to a site
Connect-PnPOnline -Url https://contoso-admin.sharepoint.com -Interactive

# Publish/overwrite SPPKG into the tenant catalog
Add-PnPApp -Path .\sharepoint\solution\my-solution.sppkg -Scope Tenant -Publish -Overwrite

# Install the app to a specific site
Connect-PnPOnline -Url https://contoso.sharepoint.com/sites/ProjectX -Interactive
$pkg = Get-PnPApp -Scope Tenant | Where-Object { $_.Title -eq "My Solution" }
Install-PnPApp -Identity $pkg.Id -Scope Tenant

Pro-Tip: Run these commands from CI using OIDC to Azure AD (no stored secrets) and conditional approvals for production sites.

5) Security and RBAC: explicit, least-privilege permissions

Be explicit about the minimal roles required:

  • SharePoint site and list permissions: Read (for read-only web parts), Edit or Contribute (only when creating/updating items). Prefer item- or list-scoped permissions over site-wide.
  • Graph delegated permissions in SPFx: User.Read, User.ReadBasic.All, Sites.Read.All (only if cross-site reads are required). Request them via webApiPermissionRequests in package-solution.json and approve them on the API access page in the SharePoint admin center. Avoid .All scopes unless necessary.
  • Azure service calls via backend API: If your SPFx calls an Azure Function or App Service, secure the backend with Entra ID and assign a Managed Identity to the backend. Grant that identity minimal roles such as Storage Blob Data Reader or Storage Blob Data Contributor on specific storage accounts or containers only.

Pro-Tip: Prefer resource-specific consent to SharePoint or Graph endpoints and scope consents to the smallest set of sites or resources.

6) Add an error boundary for resilient UI

SPFx runs inside complex pages; isolate failures so one component does not break the whole canvas.

import * as React from "react";

type BoundaryState = { hasError: boolean };

export class ErrorBoundary extends React.Component<React.PropsWithChildren<unknown>, BoundaryState> {
  state: BoundaryState = { hasError: false };

  static getDerivedStateFromError(): BoundaryState {
    return { hasError: true };
  }

  componentDidCatch(error: unknown): void {
    // Log to a centralized telemetry sink (e.g., Application Insights)
    // Avoid PII; sanitize messages before sending
    console.error("ErrorBoundary caught:", error);
  }

  render(): React.ReactNode {
    if (this.state.hasError) {
      return <div role="alert">Something went wrong. Please refresh or try again later.</div>;
    }
    return this.props.children;
  }
}

Wrap your data-heavy components with ErrorBoundary and fail gracefully.

7) Modernize imports for tree shaking and smaller bundles

Only import what you use. Avoid star imports.

// Good: minimal surface
import { spfi } from "@pnp/sp";
import "@pnp/sp/items";
import "@pnp/sp/lists";

// Avoid: broad or legacy preset imports that include APIs you don't need
// import "@pnp/sp/presets/all";

Pro-Tip: Run webpack-bundle-analyzer to confirm reductions as you trim imports.

Best Practices & Security

  • Principle of Least Privilege: grant Read before Edit or Contribute; avoid tenant-wide Sites.Read.All unless essential.
  • Runtime validation: use Zod to guard against content type or field drift.
  • Behavior-driven PnPjs: keep SPFx context in a factory; never in globals.
  • Resiliency: add retries/backoff for throttling with PnPjs behaviors; display non-blocking toasts for transient failures.
  • No secrets in client code: if integrating with Azure, call a backend secured with Entra ID; use Managed Identities on the backend instead of keys.
  • Accessibility: ensure controls include aria labels and keyboard navigation.
  • Observability: log warnings and errors with correlation IDs to diagnose issues across pages.

Pro-Tip: For heavy reads, combine batching with narrow select filters and increase cache duration carefully; always provide a user-initiated refresh.

Summary

  • PnPjs v3+ with behaviors, batching, and caching delivers smaller, faster, and cleaner SPFx data access.
  • PnP SPFx React Controls accelerate complex UX while remaining testable behind adapters.
  • Explicit RBAC and runtime validation raise your security bar without slowing delivery.

What’s New in C# in 2026: Trends, Confirmed Features, and How to Stay Ahead

Overview

Curious about what’s new in C# in 2026? This guide explains how to track official changes, highlights confirmed features available today, and outlines likely areas of evolution so you can plan upgrades with confidence without relying on rumors.

Confirmed C# Features You Can Use Today

While 2026 updates may vary by release timing, several modern C# features (through recent versions) are already production-ready and shape how teams write code:

  • Primary constructors for classes and structs: Concise initialization patterns that reduce boilerplate. Example in text: class Point(int x, int y) { public int X = x; public int Y = y; }
  • Collection expressions: Easier literal-like creation and transformations for collections without verbose constructors.
  • Enhanced pattern matching: More expressive matching for complex data, improving readability and safety over nested if statements.
  • Required members: Enforce construction-time initialization for critical properties to prevent invalid states.
  • Raw string literals: Cleaner multi-line strings for JSON, SQL, and HTML content without excessive escaping (see the snippet after this list).
  • Improved lambda and generic math support: More powerful functional patterns and numeric abstractions for algorithms and analytics.
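
For instance, raw string literals and collection expressions keep embedded JSON and list construction readable (a small illustrative snippet; the payload shape is made up):

// Raw string literal: no escaping needed for quotes or braces.
string payload = """
{
  "name": "OrdersService",
  "retries": 3
}
""";

// Collection expressions: concise construction, including spreads.
int[] basePorts = [8080, 8081];
int[] allPorts = [.. basePorts, 9090];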

Example: Clean Data Pipelines with Patterns and Collections

Imagine normalizing input records to a canonical model. With patterns and collection expressions, you can match on shape and materialize results concisely:

List<User> normalized =
[
    .. records.Select(r => r switch
    {
        { Type: "User", Id: > 0 } => new User(r.Id, r.Name),
        { Type: "System" } => new User(0, "system"),
        _ => new User(-1, "unknown")
    })
];

Likely Areas of Evolution in 2026 (Roadmap-Oriented)

The following areas are commonly emphasized by the C# and .NET ecosystem and are reasonable to monitor in 2026. Treat these as directional, not promises—always verify in official release notes:

  • Pattern matching refinements: Continued expressiveness and performance improvements for complex domain modeling.
  • Source generator ergonomics: Smoother authoring and consumption for meta-programming scenarios.
  • Performance and AOT: Tighter integration with native AOT for smaller, faster apps, especially for microservices and tools.
  • Incremental build and tooling upgrades: Faster inner loops in IDEs and CI with richer diagnostics and analyzers.
  • Cloud-native and container-first defaults: Templates and libraries that minimize cold starts and memory footprints.

Why This Matters

These focus areas help teams ship faster, reduce runtime costs, and maintain safer, more maintainable codebases with fewer dependencies.

How to Verify What’s New in 2026

Use official channels to confirm 2026 updates and avoid misinformation:

  • .NET and C# release notes: Check the official docs for What’s New pages and language proposals.
  • GitHub dotnet/roslyn: Track accepted language proposals and compiler changes.
  • Preview SDKs: Install preview .NET SDKs and enable preview features to test changes early.
  • Conference keynotes: Watch Build, .NET Conf, and Ignite sessions for roadmap confirmation.
  • Analyzer baselines: Enable latest analyzers in your editorconfig to surface language and API guidance as it lands.

2026-Ready Upgrade Checklist

  • Target latest LTS runtime where feasible: Plan migration paths with staged environment rollouts.
  • Enable nullable and latest language version: Adopt safety defaults and modern features incrementally.
  • Introduce required members and primary constructors: Improve model correctness and reduce boilerplate.
  • Adopt collection expressions and raw strings: Simplify data composition and configuration handling.
  • Measure before and after: Use BenchmarkDotNet and dotnet-counters to justify changes with data.
  • Harden CI/CD: Add analyzers, test coverage gates, and API compatibility checks.
  • Container and AOT pilots: Trial native AOT for suitable workloads to reduce cold start and memory.

Practical Examples to Modernize Safely

Primary Constructors with Required Members

public class Order(int id)
{
    // Set once from the primary constructor parameter.
    public int Id { get; } = id;

    // 'required' forces callers to supply this in an object initializer,
    // for example: new Order(1) { Customer = customer }.
    public required Customer Customer { get; init; }
}

This pattern enforces valid construction and makes intent explicit.

Collection Expressions for Pipelines

Widget[] dashboard = [.. services.Select(s => new Widget(s.Name, s.Status))];

Readable, composable, and easy to refactor.

Pattern Matching for Safer Branching

string Describe(object o) => o switch
{
    int n and > 0 => "positive int",
    string s and not "" => "string",
    null => "null",
    _ => "other"
};

Key Takeaways

  • Use modern C# features available today to unlock productivity and correctness.
  • Treat 2026 items as directional until confirmed; verify via official sources.
  • Adopt a measured upgrade plan with benchmarks, analyzers, and staged rollouts.

Next Steps

  • Audit your codebase for opportunities to use required members, primary constructors, and collection expressions.
  • Spin up a branch targeting the latest SDK and enable preview features in a test project.
  • Document findings, measure impact, and schedule incremental adoption across services.

Tuesday, 13 January 2026

What’s New in SharePoint in 2026? Trends, Roadmap Clues, and How to Prepare

What’s New in SharePoint in 2026? A Practical Guide

The question of what’s new in SharePoint in 2026 matters to IT leaders, intranet owners, and content teams planning their digital workplace. As of now, Microsoft has not publicly announced a definitive 2026 feature list, but current releases and roadmap patterns point to clear themes you can prepare for today.

What We Know vs. What to Watch

What we know: SharePoint continues to evolve within Microsoft 365—deepening integrations with Teams, Viva, OneDrive, and Power Platform, and investing in performance, security, and AI-driven content experiences.

What to watch: Expect enhancements that make content creation faster, governance more automated, and experiences more personalized—without forcing disruptive rebuilds of existing sites.

Key Themes Likely to Shape SharePoint in 2026

  • AI-assisted content and governance: More copilots and suggestions to draft pages, summarize documents, tag content, and recommend policies.
  • Richer Teams and Loop integration: Easier co-authoring, fluid components embedded in pages, and consistent permissions across apps.
  • Employee experience alignment: Closer ties with Viva Connections, Topics, and Learning to surface targeted content where people work.
  • Performance and design upgrades: Faster page loads, modern web parts, better mobile rendering, and improved templating for consistent branding.
  • Automated lifecycle and compliance: Smarter retention, sensitivity labeling, and archiving guided by content signals.
  • External collaboration controls: Safer B2B sharing, guest management, and activity monitoring without friction.
  • Low-code acceleration: Deeper Power Automate and Power Apps hooks to turn content into streamlined workflows.

How to Prepare Your SharePoint Environment Now

  • Standardize on modern: Migrate classic sites and pages to modern to unlock coming improvements and reduce tech debt.
  • Tighten information architecture: Use hub sites, site templates, content types, and metadata so AI and search can perform better.
  • Establish governance guardrails: Define provisioning, naming, guest access, and lifecycle policies—then automate where possible.
  • Optimize content readiness: Clean up stale libraries, add alt text, use consistent titles, and adopt page templates for quality and accessibility.
  • Integrate with Teams and Viva: Pin intranet resources in Teams, configure Viva Connections dashboards, and align audiences.
  • Measure what matters: Track site analytics, search terms, and task completion to inform future design changes.

Examples to Guide Your 2026 Planning

Example 1: News Hub Modernization

A communications team adopts modern page templates, audience targeting, and image renditions. They tag content with consistent metadata and automate approvals via Power Automate. Result: faster publishing, higher engagement, and analytics that guide future content.

Example 2: Policy Library with Compliance

HR builds a centralized policy site using content types, versioning, and sensitivity labels. Automated reminders prompt owners to review policies quarterly. Users get summaries and related links surfaced contextually in Teams.

Example 3: Project Sites at Scale

PMO uses request forms triggering automated site provisioning with standard navigation, permissions, and retention. Project dashboards surface risks, decisions, and documents, while lifecycle rules archive inactive sites.

Frequently Asked Questions

Will I need to rebuild my intranet? Unlikely. Focus on modern experiences, clean IA, and governance so new capabilities can layer onto your existing sites.

How do I future‑proof content? Use modern pages, structured metadata, accessible media, and standardized templates to benefit from search, AI, and analytics.

What about security and compliance? Expect continued investment in labeling, DLP, auditing, and lifecycle automation—so set clear policies now and automate enforcement.

Bottom Line

While specifics on what’s new in SharePoint in 2026 are not officially detailed, the direction is clear: smarter creation, stronger governance, tighter integration, and better performance. If you invest today in modern foundations, metadata, governance, and measurement, you’ll be ready to adopt 2026 capabilities with minimal disruption and maximum impact.

Power Platform 2026: What’s New, What to Expect, and How to Prepare

Curious about what's new on the Power Platform in 2026? While real-time details may vary by region and release wave, this guide outlines the most credible trends, expected enhancements, and practical steps to help you prepare for Power Platform 2026 updates across Power Apps, Power Automate, Power BI, Dataverse, and governance.

Where to Find the Official 2026 Updates

Microsoft typically publishes semiannual release plans (Wave 1 and Wave 2). For authoritative details on Power Platform 2026 features, review the latest release plans, product blogs, and admin center message posts. Use these sources to validate dates, preview availability, and regional rollout specifics.

What’s Likely New Across the Power Platform in 2026

Power Apps: Faster, Smarter, More Governed

  • Deeper Copilot in app building: Expect broader natural language to app/screen/form experiences, with smarter controls suggestions and data-binding from sample prompts.
  • Advanced performance profiles: More diagnostics and client-side performance hints for complex canvas and model-driven apps.
  • Reusable design systems: Expanded theming and component libraries to accelerate enterprise-wide UI consistency.
  • ALM-ready solutions: Tighter pipelines from dev to prod with improved solution layering and environment variables.

Power Automate: End-to-End Automation Intelligence

  • Process mining at scale: Richer discovery and variant analysis integrated with automation recommendations.
  • Copilot for flows: Natural language flow creation, repair suggestions, and test data generation for robust automation.
  • Robotic automation hardening: More resilient desktop flows with enhanced error handling, retries, and monitoring.
  • Event-driven patterns: Expanded triggers and durable patterns for long-running business processes.

Power BI: Fabric-First and AI-Assisted Insights

  • Seamless Fabric integration: Tighter lakehouse/warehouse connectivity, semantic models, and item-level security alignment.
  • AI-assisted analysis: Enhanced narrative summaries, anomaly detection, and Q&A responsiveness.
  • Governed self-service: Broader deployment pipelines, endorsements, and lineage to scale enterprise BI safely.

Dataverse, Security, and Governance

  • Managed Environments maturity: More policy templates for DLP, data residency, and solution lifecycle guardrails.
  • Dataverse scalability: Performance, indexing, and data mesh patterns for cross-domain collaboration.
  • Compliance and audit: Finer-grained logging, retention, and admin analytics for regulated industries.

Integration Trends to Watch in 2026

  • Connector ecosystem growth: More premium and enterprise-grade connectors with higher throughput and better error transparency.
  • Microsoft Fabric alignment: Unified governance and pipelines spanning data engineering, science, and BI.
  • Responsible AI: Stronger content filters, prompt controls, and audit trails for Copilot experiences.
  • Hybrid and multi-cloud: Expanded patterns for secure integration with non-Microsoft services via standard protocols.

How to Prepare Your Organization

  • Adopt Managed Environments: Standardize policies, usage analytics, and solution checks before large rollouts.
  • Harden ALM pipelines: Use solution-based development, branches, and automated validations to reduce drift.
  • Establish a design system: Build a component library for consistent UX across Power Apps.
  • Create a data foundation: Align Dataverse and Fabric models; document ownership, lineage, and policies.
  • Upskill on Copilot: Train makers and data teams to co-create with AI and review outputs for accuracy and compliance.
  • Triage automations: Prioritize high-ROI flows, add observability, and plan for exception handling.

Example Roadmap to Adopt 2026 Features

  • Quarter 1: Inventory apps/flows, implement Managed Environments, define DLP, and set ALM baselines.
  • Quarter 2: Pilot Copilot-assisted app/flow creation in a sandbox; benchmark performance and quality.
  • Quarter 3: Integrate with Fabric for governed datasets; standardize deployment pipelines for BI.
  • Quarter 4: Scale process mining, consolidate connectors, and publish shared components across teams.

Examples: Practical Use Cases

  • Field service app: Use Copilot to scaffold screens from a Dataverse table, add offline capabilities, and enforce role-based access.
  • Invoice automation: Discover steps via process mining, build an approval flow with exception routing, and log metrics to a central dashboard.
  • Executive analytics: Publish Fabric-backed semantic models with certified datasets and AI-generated summaries for board reviews.

Key Takeaways

  • 2026 will emphasize AI + governance: Expect Copilot advances paired with stronger controls and telemetry.
  • Data is the backbone: Align Dataverse and Fabric to reduce duplication and increase trust.
  • Prepare early: Solid ALM, DLP, and design systems make adopting new features faster and safer.

Staying Current

Track the semiannual release plans, enable previews in non-production environments, and review admin center announcements. Validate features in a controlled rollout before scaling to production to ensure performance, security, and compliance objectives are met.

Sunday, 11 January 2026

Binding Pre-Search on Lookup Fields in Model-Driven Apps Using TypeScript

The Problem

Developers often need to filter lookup fields dynamically in Microsoft Power Apps model-driven forms. The challenge is binding a pre-search event to a lookup control using TypeScript with strong typing, ensuring maintainability and avoiding the use of 'any'.

Prerequisites

  • Node.js v20+
  • TypeScript 5+
  • @types/xrm installed for strong typing
  • Power Apps environment with a model-driven app
  • Appropriate security roles: System Customizer or equivalent with form customization privileges

The Solution (Step-by-Step)

1. Add TypeScript Reference

// Add Xrm type definitions at the top of your file
/// <reference types="xrm" />

2. Implement the Pre-Search Binding

// Function to bind pre-search event on a lookup field
function addPreSearchToLookup(executionContext: Xrm.Events.EventContext): void {
    const formContext = executionContext.getFormContext();

    // Get the lookup control by name, typed as a LookupControl (no 'any')
    const lookupControl = formContext.getControl<Xrm.Controls.LookupControl>("new_lookupfield");

    if (lookupControl) {
        // Add pre-search event handler
        lookupControl.addPreSearch(() => {
            // Example custom filter XML: restrict results to active rows; adjust to your scenario
            const filterXml =
                "<filter type='and'><condition attribute='statecode' operator='eq' value='0' /></filter>";

            // Apply the filter to the lookup for the given table
            lookupControl.addCustomFilter(filterXml, "entitylogicalname");
        });
    }
}

// Register this function on the form's OnLoad event in the form editor

Explanation:

  • We use addPreSearch to inject a custom filter before the lookup search executes.
  • Strong typing is enforced using Xrm.Controls.LookupControl.
  • No 'any' type is used, ensuring type safety.

Best Practices & Security

  • Type Safety: Always use @types/xrm for strong typing and avoid 'any'.
  • Security Roles: Ensure users have roles like System Customizer or App Maker to customize forms.
  • Principle of Least Privilege: Assign only necessary roles to users who manage form scripts.
  • Deployment Automation: Use Power Platform CLI or Azure DevOps pipelines for ALM. For IaC, consider Azure Bicep to manage environment configurations.
  • Testing: Test the pre-search logic in a sandbox environment before deploying to production.

Pro-Tip: Use lookupControl.addCustomFilter() with specific entity names to avoid applying filters globally, which can cause unexpected behavior.

Summary

  • Bind pre-search events using addPreSearch for dynamic lookup filtering.
  • Enforce strong typing with @types/xrm and avoid 'any'.
  • Secure and automate deployments using Power Platform CLI and Azure IaC tools.

Saturday, 10 January 2026

Binding a Dropdown List in ASP.NET Core with EF Core and Azure Managed Identity

The Problem

Developers often need to populate dropdown lists dynamically from a database in ASP.NET Core applications. The challenge is to do this securely and efficiently, avoiding hardcoded connection strings and ensuring the solution is production-ready with Azure best practices.

Prerequisites

  • .NET 8 SDK installed
  • Azure CLI configured
  • Azure SQL Database with Managed Identity enabled
  • Visual Studio Code or IDE of choice

The Solution (Step-by-Step)

Step 1: Configure Azure Managed Identity and Database Access

Assign your App Service or Azure Function a system-assigned Managed Identity. Then create a contained database user for that identity in Azure SQL (CREATE USER [your-app-name] FROM EXTERNAL PROVIDER) and grant it only the database roles it needs, such as db_datareader for read-only lookups. Azure RBAC roles such as SQL DB Contributor cover management operations, not data access.

Step 2: Configure EF Core with DefaultAzureCredential

using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);

// The connection string uses "Authentication=Active Directory Default", so Microsoft.Data.SqlClient
// acquires tokens via DefaultAzureCredential (Managed Identity in Azure, developer sign-in locally).
var connectionString = builder.Configuration["AZURE_SQL_CONNECTION_STRING"];

builder.Services.AddDbContext<AppDbContext>(options =>
{
    options.UseSqlServer(connectionString, sqlOptions =>
    {
        sqlOptions.EnableRetryOnFailure(); // Resiliency for transient faults
    });
});

var app = builder.Build();
app.Run();

Explanation: The connection string specifies Authentication=Active Directory Default, so SqlClient authenticates through DefaultAzureCredential and the Managed Identity, avoiding hardcoded secrets. The connection string references the server and database but contains no username or password.

Step 3: Create the Model and DbContext

public class Category
{
    public int Id { get; set; }
    public string Name { get; set; } = string.Empty;
}

public class AppDbContext(DbContextOptions<AppDbContext> options) : DbContext(options)
{
    public DbSet<Category> Categories => Set<Category>();
}

Explanation: The Category entity represents the dropdown data source.

Step 4: Implement the Controller Action

using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Rendering;

[ApiController]
[Route("api/[controller]")]
public class CategoriesController(AppDbContext context) : ControllerBase
{
    [HttpGet("dropdown")]
    public IActionResult GetDropdown()
    {
        // Fetch categories without tracking for performance
        var categories = context.Categories.AsNoTracking()
            .Select(c => new SelectListItem
            {
                Value = c.Id.ToString(),
                Text = c.Name
            }).ToList();

        return Ok(categories);
    }
}

Explanation: The action retrieves categories and maps them to SelectListItem for dropdown binding. AsNoTracking() improves performance for read-only queries.

Step 5: Bind in Razor View

@model YourViewModel

<select asp-for="SelectedCategoryId" asp-items="Model.Categories"></select>

Explanation: The Razor view uses asp-items to bind the dropdown list.
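
For completeness, a minimal view model backing that Razor view might look like the following sketch (property names mirror the asp-for and asp-items attributes above):

using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc.Rendering;

public class YourViewModel
{
    // Bound by asp-for="SelectedCategoryId"
    public int SelectedCategoryId { get; set; }

    // Bound by asp-items="Model.Categories"; populate from the Categories query shown earlier
    public List<SelectListItem> Categories { get; set; } = [];
}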

Best Practices & Security

  • Use Managed Identity with DefaultAzureCredential instead of storing credentials in code or configuration files.
  • Apply the Principle of Least Privilege by granting the Managed Identity only the database roles it needs (for example, db_datareader for read-only lookups) rather than broad administrative roles.
  • Validate user input in controllers to prevent injection attacks.
  • Use AsNoTracking() for read-only queries to improve performance.

Summary

  • Bind dropdown lists using EF Core and strongly typed models.
  • Secure database access with Azure Managed Identity and least-privilege database roles.
  • Follow best practices for performance and security in production.

Wednesday, 7 January 2026

What's New in SharePoint Framework (SPFx): Latest Features and Updates

Introduction to SharePoint Framework (SPFx)

The SharePoint Framework (SPFx) is a modern development model for building custom solutions in SharePoint Online and on-premises. It enables developers to create responsive, mobile-friendly web parts and extensions using popular web technologies like React, Angular, and TypeScript. In this article, we’ll explore what’s new in SPFx and how these updates can enhance your SharePoint development experience.

Key New Features in SPFx

Microsoft continuously improves SPFx to provide developers with better tools and capabilities. Here are some of the latest updates:

  • Support for Microsoft Teams: SPFx now allows developers to build apps that work seamlessly in both SharePoint and Microsoft Teams, improving collaboration and integration.
  • Improved Performance: The latest versions of SPFx include optimizations for faster load times and better resource management, ensuring a smoother user experience.
  • Yarn and NPM Compatibility: Developers can now use Yarn or NPM for package management, offering flexibility in managing dependencies.
  • Enhanced Tooling: Updated Yeoman generators and Gulp tasks make scaffolding and building projects easier than ever.

Benefits of the Latest SPFx Updates

The new features in SPFx bring several benefits to developers and organizations:

  • Cross-Platform Development: Build solutions that work across SharePoint Online, on-premises, and Microsoft Teams.
  • Modern Web Standards: Leverage React, Angular, and other frameworks for creating dynamic, responsive solutions.
  • Faster Deployment: Improved build processes and performance optimizations reduce time-to-market for custom solutions.

Examples of SPFx Use Cases

Here are some practical examples of how organizations use SPFx:

  • Custom Web Parts: Create tailored web parts for dashboards, reports, and interactive content.
  • Extensions: Add custom actions, field customizers, and application customizers to enhance SharePoint functionality.
  • Teams Integration: Build apps that provide a unified experience across SharePoint and Microsoft Teams.

Conclusion

The latest updates in SharePoint Framework (SPFx) make it a powerful tool for modern SharePoint development. By leveraging these new features, developers can create more efficient, integrated, and user-friendly solutions. Stay updated with SPFx releases to maximize your SharePoint investment.

What's New in .NET Core: Latest Features and Enhancements for Developers

Introduction to What's New in .NET Core

The latest updates in .NET Core bring powerful features and performance improvements that developers have been waiting for. Whether you are building web applications, APIs, or microservices, understanding what's new in .NET Core can help you stay ahead in the development world.

Key Features in the Latest .NET Core Release

The new version of .NET Core introduces several enhancements that improve productivity, security, and performance. Here are some of the most notable updates:

  • Improved Performance: The runtime and libraries have been optimized for faster execution and reduced memory usage.
  • Cross-Platform Support: Enhanced compatibility for Linux, macOS, and Windows environments.
  • New APIs: Additional APIs for better integration and functionality in modern applications.
  • Enhanced Security: Built-in security features to protect against common vulnerabilities.

Benefits of Upgrading to the Latest .NET Core

Upgrading to the latest .NET Core version ensures that your applications are future-proof and take advantage of the newest technologies. Some benefits include:

  • Better performance and scalability for enterprise applications.
  • Access to modern development tools and frameworks.
  • Long-term support and regular security updates.

Example: Building a Web API with .NET Core

Creating a Web API in .NET Core is now easier than ever. With the new templates and simplified configuration, developers can quickly set up RESTful services that are fast, secure, and maintainable.
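
As a small illustration (the route and payload are made up), a minimal API in .NET 8 needs only a few lines:

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// A single GET endpoint returning JSON; add more endpoints, auth, and OpenAPI as needed.
app.MapGet("/api/products", () => Results.Ok(new[] { "Keyboard", "Mouse", "Monitor" }));

app.Run();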

Conclusion

Staying updated with what's new in .NET Core is essential for developers who want to build high-performing, secure, and scalable applications. Explore the latest features today and take your development skills to the next level.

Tuesday, 6 January 2026

How to Create a Polymorphic Column in Dataverse: A Complete Guide

What is a Polymorphic Column in Dataverse?

A polymorphic column in Dataverse is a special type of column that can reference multiple tables instead of being limited to a single table. This feature is particularly useful when you need flexibility in your data model, allowing a single column to store references to different entity types without creating multiple lookup fields.
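
For example, with the Dataverse SDK (Microsoft.Xrm.Sdk; the customerid column on the incident table is the built-in Customer lookup), the same column can hold a reference to either an account or a contact — a minimal sketch:

using System;
using Microsoft.Xrm.Sdk;

public static class CaseFactory
{
    // The 'customerid' column on incident is polymorphic: it accepts an account or a contact.
    public static Guid CreateCase(IOrganizationService service, Guid accountId)
    {
        var incident = new Entity("incident");
        incident["title"] = "Billing question";
        incident["customerid"] = new EntityReference("account", accountId);
        // For a person instead of a company, the same column takes a contact reference:
        // incident["customerid"] = new EntityReference("contact", contactId);
        return service.Create(incident);
    }
}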

Why Use Polymorphic Columns?

Polymorphic columns provide several benefits for developers and businesses:

  • Flexibility: They allow a single column to relate to multiple tables, reducing complexity.
  • Efficiency: Simplifies data relationships and reduces the need for redundant fields.
  • Scalability: Ideal for scenarios where the related entity can vary, such as activities linked to different record types.

Steps to Create a Polymorphic Column in Dataverse

Follow these steps to create a polymorphic column in Microsoft Dataverse:

  • Step 1: Navigate to the Power Apps Maker Portal and select your environment.
  • Step 2: Open the Tables section and choose the table where you want to add the column.
  • Step 3: Click on Add Column and select Lookup as the data type.
  • Step 4: In the Related Table dropdown, choose Activity or Customer to enable polymorphic behavior. These are the two primary polymorphic relationships supported by Dataverse.
  • Step 5: Save and publish your changes to make the column available in your apps and flows.

Best Practices for Using Polymorphic Columns

To ensure optimal performance and maintainability, consider these best practices:

  • Use polymorphic columns only when necessary to avoid unnecessary complexity.
  • Document the relationships clearly for future reference.
  • Test your apps thoroughly to ensure the column behaves as expected across different scenarios.

By following these steps and best practices, you can effectively create and manage polymorphic columns in Dataverse, enhancing the flexibility and scalability of your data model.