The Foundation: How I’m Building a Real-World Ordering Demo for Deep EF Core & Lambda Exploration





Before I write a single deep dive about lambda expressions, expression trees, or EF Core translation, I want to be very deliberate about what those examples live inside.


Because lambdas don’t exist in isolation.

They behave very differently depending on where they execute, which layer they live in, and whether they end up as compiled delegates or as expression trees.

So instead of starting with syntax, I’m starting with a real application, one that’s complex enough to break if you misuse lambdas, but still small enough to understand.

In this post, I’ll walk you through:

  • The demo application I’m building

  • The architectural choices I’ve made

  • And why each decision matters for the rest of the series

Why I’m Not Using Toy Examples

I could explain lambdas with:

users.Where(u => u.IsActive);

But that example never fails.

It doesn’t show where the filtering runs, what it costs, or what happens when translation fails.

In production systems, lambdas:

  • cross layers

  • get reused in different contexts

  • are composed dynamically

  • and eventually hit a database provider that doesn’t care how elegant your syntax was

So I need a demo where:

  • bad lambda usage actually hurts

  • good lambda design pays off
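As a tiny preview of the distinction the whole series hinges on, here is the same filter written two ways, using the Order entity this post builds later. One compiles to a delegate that can only run in memory; the other is captured as an expression tree that EF Core can translate:

using System.Linq.Expressions;
using Ordering.Domain;

// Compiled to a delegate: it can only execute in memory, after the data has been fetched.
Func<Order, bool> inMemoryFilter = o => o.Status == OrderStatus.Submitted;

// Captured as an expression tree: EF Core can inspect it and translate it into SQL.
Expression<Func<Order, bool>> translatableFilter = o => o.Status == OrderStatus.Submitted;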



The Architecture at a Glance (Before We Go Deeper)

Before I talk about domains, entities, or queries, I want to step back and show you the full shape of the system. Because when we start discussing lambdas later, where the code lives will matter just as much as what the code does.

At a high level, this demo application is split into a Blazor frontend and four logical layers, orchestrated locally using .NET Aspire:

  • Ordering.Blazor — the Blazor frontend: user intent and rendering

  • Ordering.Api — the Web API: the orchestration boundary

  • Ordering.Application — use cases and query intent

  • Ordering.Domain — entities, relationships, and rules

  • Ordering.Infrastructure — EF Core, PostgreSQL, execution and translation

This separation isn’t academic; it’s what allows me to:

  • Understand execution context

  • Control where lambdas run

  • Test translation behaviour in isolation

  • And benchmark real performance differences later

Let me briefly explain what belongs in each layer before we dig deeper into them.


Blazor: User Intent, Not Business Logic

The Blazor frontend exists to:

  • capture user intent

  • render data

  • manage UI state

It does not:

  • filter collections

  • build LINQ queries

  • know anything about EF Core

This is deliberate.

Later in the series, I’ll show why filtering in the UI with lambdas is one of the easiest ways to accidentally pull far more data over the wire, and into memory, than you ever intended.

By keeping Blazor “dumb”, I ensure:

  • all lambdas that matter live on the server

  • execution context is predictable


API: Orchestration, Not Query Logic

The Web API layer:

  • accepts requests

  • validates input

  • invokes application use cases

  • returns DTOs

It does not:

  • compose queries

  • contain Where, Select, or Include logic

  • understand how data is persisted

This keeps the API thin and makes it a perfect boundary for:

  • testing

  • performance measurement

  • future protocol changes (HTTP → gRPC, etc.)


Application Layer: Where Lambdas Live

This is the most important layer for the entire series.

The Application layer is where use cases live and where query intent is expressed.

This is where I’ll place the reusable filters and projection expressions that the rest of the series dissects.

Why?

Because this layer:

  • is independent of EF Core

  • can be tested in isolation

  • expresses what we want, not how it’s executed

Everything we discuss about lambdas later will anchor here.
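To make that concrete, here is the shape of thing I mean: a hypothetical Application-layer helper (not part of the demo yet) that exposes a reusable filter as an expression tree, so EF Core can still translate it when Infrastructure eventually executes the query.

using System.Linq.Expressions;
using Ordering.Domain;

namespace Ordering.Application;

// Hypothetical example: a reusable, translatable filter owned by the Application layer.
// It describes *what* we want, and leaves execution to Infrastructure.
public static class OrderFilters
{
    public static Expression<Func<Order, bool>> ForTenant(Guid tenantId)
        => order => order.Customer.TenantId == tenantId;
}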


Domain Layer: Stability and Constraints

The Domain layer contains the entities, relationships, and rules of the ordering model: tenants, customers, orders, and order items.

It knows:

  • what an Order is

  • what states are valid

  • what relationships exist

It knows nothing about:

  • databases

  • LINQ

  • lambdas

  • EF Core

This separation matters because:

  • domain rules shouldn’t care how queries are executed

  • lambdas should describe selection, not behaviour


Infrastructure: Execution and Translation

Infrastructure is where:

  • EF Core lives

  • PostgreSQL is configured

  • expression trees are finally executed or translated

This is the layer that:

  • consumes the expressions

  • turns them into SQL

  • executes them efficiently (or not)

By isolating Infrastructure, I can:

  • swap providers

  • benchmark behaviour

  • inspect generated SQL

  • demonstrate translation failures cleanly


Aspire: The Glue That Makes This Practical

All of this is orchestrated locally using .NET Aspire.

Aspire:

  • provisions PostgreSQL

  • wires connection strings

  • starts all services together

  • removes local setup friction

This is what makes the demo runnable, benchmarkable, and reproducible.


Why This Architectural Overview Matters

Without this context, later posts about lambdas become confusing.

When I say:

  • “this lambda runs in memory”

  • “this expression gets translated”

  • “this breaks SQL generation”

…it only makes sense if you know:

  • which layer you’re in

  • what dependencies are present

  • and who is responsible for execution

This section is the map.


Why I’m Using PostgreSQL

PostgreSQL is intentional.

It gives me a real, production-grade relational engine behind EF Core’s Npgsql provider, not an in-memory stand-in.

More importantly:

  • EF Core’s translation behaviour is visible

  • ToQueryString() becomes meaningful

  • benchmarks show real differences

Using Postgres prevents the demo from lying to you.


Why Aspire Is Non-Negotiable for This Demo

Most demo repos fail before the code runs.

I don’t want readers to:

  • install Postgres manually

  • configure connection strings

  • run scripts

  • debug local setup issues

With Aspire:

  • infrastructure is declared in code

  • services and databases are wired together

  • connection strings flow automatically

  • the entire system starts with F5

This matters because:

  • readers actually run the code

  • benchmarks are reproducible

  • query behaviour matches production more closely

Aspire turns this from a blog series into a working reference system.


Setting Up the Demo App in Visual Studio 2026 (Aspire + Postgres + Clean Architecture)

We’re going to start by creating an empty solution in Visual Studio and naming it something like "OrderDemo".

First, open Visual Studio 2026 and select "Create a new project" from the start screen.


On the Create a new project dialog, use the template search box and type Blank Solution.

This should return the Blank Solution template. Select it and click Next.

On the next screen, name the solution OrderDemo.

I’m going to store mine in my usual working directory at: "D:/Development/Blogs/OrderDemo" 

You can store it wherever you like; the location has no impact on how the application runs at runtime.

Finally, click Create to generate the solution.


Add the Clean Architecture projects

In Solution Explorer:

Right-click the Solution → Add → New Project and create these projects:

Class libraries

  • Ordering.Domain (Class Library)

  • Ordering.Application (Class Library)

  • Ordering.Infrastructure (Class Library)

Applications

  • Ordering.Api (ASP.NET Core Web API — minimal API is fine)

  • Ordering.Blazor (Blazor Web App / Blazor Server is easiest for a demo)

Now add references:

Ordering.Application

  • references Ordering.Domain

Ordering.Infrastructure

  • references Ordering.Application

  • references Ordering.Domain

Ordering.Api

  • references Ordering.Application

  • references Ordering.Infrastructure

Ordering.Blazor

  • references Ordering.Application


Add .NET Aspire

  • Right-click Ordering.Api → Add → .NET Aspire Orchestrator Support.

  • When prompted for the Aspire version, select Latest (at the time of writing, this is 9.5), then click OK.

  • Visual Studio will create two new projects:

    • OrderDemo.ServiceDefaults
    • OrderDemo.AppHost

  • Next, right-click Ordering.Blazor → Add → .NET Aspire Orchestrator Support.

  • You will be informed that an Aspire orchestrator already exists and that Ordering.Blazor will be added to the existing orchestration.

  • Click OK to complete the setup.


Add Postgres to the .NET Aspire AppHost

  • Right-click OrderDemo.AppHost → Add → .NET Aspire Packages

  • From the list of available resources, select Aspire.Hosting.PostgreSQL and click Next.
  • Open Program.cs in OrderDemo.AppHost.

  • Locate the following line:

    var builder = DistributedApplication.CreateBuilder(args);
  • Immediately below it, add the PostgreSQL resource:

    var postgres = builder.AddPostgres("postgres").AddDatabase("orderingdb");

What This Code Is Doing

builder.AddPostgres("postgres") registers a PostgreSQL container resource with Aspire named postgres.
Aspire will:

  • Start a PostgreSQL container when the AppHost runs
  • Manage ports, credentials, and lifecycle automatically
  • Expose connection information to other services

.AddDatabase("orderingdb") creates a logical database named orderingdb inside that PostgreSQL instance.

This allows multiple databases to exist within the same PostgreSQL container if needed.

var postgres = ...

  • Stores a reference to the PostgreSQL resource so other projects (APIs, workers, etc.) can depend on it.

    At this point, PostgreSQL exists only at the orchestration level, nothing is connected to it yet.
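Optionally, the Postgres resource can be customised here as well. These extensions come from the same Aspire.Hosting.PostgreSQL package; treat this as a sketch of what is available rather than something the demo needs:

var postgres = builder.AddPostgres("postgres")
    .WithDataVolume()   // persist data between AppHost runs
    .WithPgAdmin();     // add a pgAdmin container for browsing the database

var orderingDb = postgres.AddDatabase("orderingdb");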

Adding PostgreSQL as a Dependency of Ordering.Api

Still in Program.cs of OrderDemo.AppHost, locate where the API project is registered:

We are going to update our Aspire bindings as follows:

var builder = DistributedApplication.CreateBuilder(args);

var postgres = builder.AddPostgres("postgres").AddDatabase("orderingdb");

var api = builder.AddProject<Projects.Ordering_Api>("ordering-api").WithReference(postgres).WaitFor(postgres);

builder.AddProject<Projects.Ordering_Blazor>("ordering-blazor").WithReference(api);

builder.Build().Run();

What the Dependency Does

  • WithReference(postgres) tells Aspire:

    • Ordering.Api depends on PostgreSQL

    • A connection string for the orderingdb resource will be automatically injected into the API’s configuration

  • WaitFor(postgres) tells Aspire:

    • The API must not start until PostgreSQL is ready

  • WithReference(api) tells Aspire:

    • Ordering.Blazor depends on Ordering.Api

    • The Ordering.Api URI is automatically injected into Ordering.Blazor’s configuration

This means:

  • No hardcoded connection strings

  • No local secrets in appsettings.json

  • Consistent local, CI, and cloud behavior


Add EF Core Postgres integration the "Aspire way"

In Ordering.Api, install the following NuGet packages:

  • Npgsql.EntityFrameworkCore.PostgreSQL

  • Microsoft.EntityFrameworkCore.Design (for migrations tooling)

  • Microsoft.EntityFrameworkCore.Tools (Optional but common)

Ordering.Infrastructure gets its own EF Core package in the next section, when we add the DbContext.

If you’re using Aspire service defaults, you may already have logging/health packages, so you typically don’t need anything else in the API.


Create the entity model (Domain)

In Ordering.Domain, add these entities (minimal but “real”):

Tenant.cs

namespace Ordering.Domain;

public sealed class Tenant
{
    public Guid Id { get; set; } = Guid.NewGuid();
    public string Name { get; set; } = default!;
}

Customer.cs

namespace Ordering.Domain;

public sealed class Customer
{
    public Guid Id { get; set; } = Guid.NewGuid();
    public Guid TenantId { get; set; }
    public Tenant Tenant { get; set; } = default!;
    public string Name { get; set; } = default!;
}

Order.cs

namespace Ordering.Domain;

public enum OrderStatus { Draft, Submitted, Paid, Shipped, Cancelled }

public sealed class Order
{
    public Guid Id { get; set; } = Guid.NewGuid();
    public Guid CustomerId { get; set; }
    public Customer Customer { get; set; } = default!;
    public DateTime CreatedUtc { get; set; } = DateTime.UtcNow;
    public OrderStatus Status { get; set; } = OrderStatus.Draft;
    public List<OrderItem> Items { get; set; } = new();
    public decimal Total => Items.Sum(i => i.Quantity * i.UnitPrice);
}

OrderItem.cs

namespace Ordering.Domain;

public sealed class OrderItem
{
    public Guid Id { get; set; } = Guid.NewGuid();
    public Guid OrderId { get; set; }
    public Order Order { get; set; } = default!;
    public string Sku { get; set; } = default!;
    public int Quantity { get; set; }
    public decimal UnitPrice { get; set; }
}

This model is intentionally chosen to create later “lambda pressure”:

  • navigation properties that invite joins (Order → Customer → Tenant)

  • a computed property (Order.Total) that EF Core cannot translate

  • aggregates over child collections (Items)

  • a multi-tenant boundary (TenantId) that every query must respect
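To preview one of those pressure points: Total is a plain computed C# property, so EF Core has nothing to map it to. A filter written against it fails to translate, while the same intent expressed over mapped columns translates cleanly. A sketch, assuming a db of type OrderingDbContext:

// Fails to translate: Total is computed in C#, not mapped to a column.
var bigOrders = await db.Orders
    .Where(o => o.Total > 50m)
    .ToListAsync();

// Translates cleanly: the same intent expressed over mapped columns.
var bigOrdersTranslated = await db.Orders
    .Where(o => o.Items.Sum(i => i.Quantity * i.UnitPrice) > 50m)
    .ToListAsync();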


Create DbContext (Infrastructure)

In Ordering.Infrastructure, install the following NuGet package:

Microsoft.EntityFrameworkCore

then add:

OrderingDbContext.cs

using Microsoft.EntityFrameworkCore;
using Ordering.Domain;

namespace Ordering.Infrastructure;

public sealed class OrderingDbContext : DbContext
{
    public OrderingDbContext(DbContextOptions<OrderingDbContext> options) : base(options)
    {
    }

    public DbSet<Tenant> Tenants => Set<Tenant>();
    public DbSet<Customer> Customers => Set<Customer>();
    public DbSet<Order> Orders => Set<Order>();
    public DbSet<OrderItem> OrderItems => Set<OrderItem>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.HasDefaultSchema("ordering");

        modelBuilder.Entity<Tenant>()
            .HasKey(x => x.Id);

        modelBuilder.Entity<Customer>()
            .HasKey(x => x.Id);

        modelBuilder.Entity<Customer>()
            .HasOne(x => x.Tenant)
            .WithMany()
            .HasForeignKey(x => x.TenantId);

        modelBuilder.Entity<Order>()
            .HasKey(x => x.Id);

        modelBuilder.Entity<Order>()
            .HasOne(x => x.Customer)
            .WithMany()
            .HasForeignKey(x => x.CustomerId);

        modelBuilder.Entity<OrderItem>()
            .HasKey(x => x.Id);

        modelBuilder.Entity<OrderItem>()
            .HasOne(x => x.Order)
            .WithMany(o => o.Items)
            .HasForeignKey(x => x.OrderId);

        base.OnModelCreating(modelBuilder);
    }
}

DependencyInjection.cs

using Microsoft.Extensions.DependencyInjection;

namespace Ordering.Infrastructure;

public static class DependencyInjection
{
    public static IServiceCollection AddInfrastructure(this IServiceCollection services)
    {
        // The DbContext itself is registered in the API, where Aspire supplies the connection string.
        return services;
    }
}

We keep the “DbContext registration” in the API because Aspire owns the connection provisioning.


Wire up the API (Aspire + EF Core + minimal endpoints)

In Ordering.Api/Program.cs:

using Microsoft.EntityFrameworkCore;
using Ordering.Application.Dtos; // adjust to wherever you place the DTOs created later in this post
using Ordering.Infrastructure;

var builder = WebApplication.CreateBuilder(args);

builder.AddServiceDefaults(); // Aspire defaults (telemetry, health checks, service discovery)

// Registers the DbContext using the connection string Aspire injects for the "orderingdb" resource.
builder.Services.AddDbContext<OrderingDbContext>(options =>
{
    var cs = builder.Configuration.GetConnectionString("orderingdb");
    options.UseNpgsql(cs, npgOptions =>
        npgOptions.MigrationsHistoryTable("__EFMigrationsHistory", "ordering"));
});

builder.Services.AddEndpointsApiExplorer();
builder.Services.AddOpenApi(); // required for MapOpenApi() below

var app = builder.Build();

app.MapDefaultEndpoints(); // Aspire default health endpoints

if (app.Environment.IsDevelopment())
{
    app.MapOpenApi();
}

app.UseHttpsRedirection();

// Minimal "seed" endpoint for the day-1 demo.
app.MapPost("/seed", async (OrderingDbContext db) =>
{
    if (await db.Tenants.AnyAsync())
        return Results.Ok("Already seeded.");

    var tenant = new Ordering.Domain.Tenant { Name = "Demo Tenant" };
    var customer = new Ordering.Domain.Customer { Name = "Demo Customer", Tenant = tenant };

    var order = new Ordering.Domain.Order
    {
        Customer = customer,
        Status = Ordering.Domain.OrderStatus.Submitted
    };

    order.Items.Add(new Ordering.Domain.OrderItem { Sku = "WP-CAP-001", Quantity = 2, UnitPrice = 12.50m });
    order.Items.Add(new Ordering.Domain.OrderItem { Sku = "WP-BALL-002", Quantity = 1, UnitPrice = 29.99m });

    db.Add(order);
    await db.SaveChangesAsync();

    return Results.Ok("Seeded.");
});

app.MapGet("/tenants/{tenantId:guid}/orders", async (Guid tenantId, OrderingDbContext db) =>
{
    // This query will become VERY relevant in later posts (expression trees + translation).
    var orders = await db.Orders
        .Where(o => o.Customer.TenantId == tenantId)
        .Select(OrderSummaryDto.Projection)
        .ToListAsync();

    return Results.Ok(orders);
});

app.Run();

A few important points:

  • The "orderingdb" name must match the resource name you created in the AppHost via .AddDatabase("orderingdb").

  • The /tenants/{tenantId}/orders endpoint is intentionally designed to produce real SQL translation behaviour later.
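As an aside, Aspire also ships a dedicated client integration package, Aspire.Npgsql.EntityFrameworkCore.PostgreSQL, which wraps this registration together with health checks and telemetry in a single call. I’m keeping the explicit AddDbContext above so the connection string handling stays visible for later posts, but the alternative looks roughly like this:

// Alternative registration via the Aspire EF Core + Npgsql integration package.
// "orderingdb" must still match the resource name declared in the AppHost.
builder.AddNpgsqlDbContext<OrderingDbContext>("orderingdb");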


Create the database schema (EF Core migrations)

With Aspire, PostgreSQL is created and started on-demand when you run OrderDemo.AppHost. That means:

  • You can’t rely on a pre-existing database being there.

  • You want the API to bootstrap itself once Postgres is available.

  • The most reliable approach is: start container → connect → apply migrations.

This is exactly what Database.Migrate() is for.

First, let's create a migration.

Creating the Initial EF Core Migration (Package Manager Console)

When your solution is split into API and Infrastructure, EF Core migrations should live in Ordering.Infrastructure, while the start-up project must be Ordering.Api so configuration and DI are available.

Right-click the Ordering.Api project → Set as Startup Project

Open Package Manager Console

In Visual Studio:

  1. Go to Tools → NuGet Package Manager

  2. Select Package Manager Console

Make sure the solution is fully loaded before continuing.

Select the correct project

At the top of the Package Manager Console:

  • Set Default project to Ordering.Infrastructure (the project that contains OrderingDbContext and will own the migrations)

This ensures EF Core knows where your DbContext and migrations live.

Create the migration

Run the following command:

Add-Migration InitialCreate

That’s it.

EF Core will:

  • Inspect OrderingDbContext

  • Generate the migration files

  • Add a Migrations folder to Ordering.Infrastructure

Verify the migration

In Ordering.Infrastructure you should now see a Migrations folder containing InitialCreate.cs and the model snapshot.

Open InitialCreate.cs and confirm it contains:

  • Table creation logic

  • Correct PostgreSQL column types

  • A clean Up() and Down() method

Why this works with Aspire

Even though PostgreSQL does not yet exist, this is fine because:

  • Migrations are generated from the model, not the database

  • Aspire will start PostgreSQL later

  • Database.Migrate() will apply this migration automatically on startup (we'll add this in the next section)

You do not need to run:

Update-Database

Apply EF Core migrations at API startup

In Ordering.Api / Program.cs, after:

var app = builder.Build();

Add this:

using (var scope = app.Services.CreateScope())
{
    var db = scope.ServiceProvider.GetRequiredService<OrderingDbContext>();
    db.Database.Migrate();
}

Then leave the rest of the pipeline (the endpoint mappings and app.Run()) exactly as it was.

What this does

  • Creates a scoped service provider (same as a request scope)

  • Resolves your OrderingDbContext

  • Runs Migrate(), which:

    • Creates the database schema if it’s not there

    • Applies any pending migrations

    • Does nothing if everything is already up-to-date

So in local Aspire dev:

  • First run: container starts → schema gets created

  • Future runs: migration check is fast → no changes applied


Ensure AppHost is the startup project

In Solution Explorer, right-click OrderDemo.AppHost and select Set as Startup Project.

You always run the system from the AppHost; it's the orchestrator that starts Postgres, the API, and Blazor together.


Run the application: Start the AppHost

Now press F5 (or Ctrl+F5).

What should happen:

  • Aspire starts the PostgreSQL container

  • Aspire starts Ordering.Api

  • Aspire starts Ordering.Blazor

  • The API runs db.Database.Migrate() on startup and creates the schema in the ordering database

You should see the Aspire dashboard open automatically. From there you can click into:

  • Ordering.Blazor to open the UI

  • Ordering.Api to view logs and test endpoints

  • postgres to confirm the resource is running

Seeding the database (sanity check)

The API exposes a simple seed endpoint to make it easy to verify that:

  • PostgreSQL is running

  • Migrations were applied correctly

  • EF Core can read and write data

app.MapPost("/seed", async (OrderingDbContext db) => { if (await db.Tenants.AnyAsync()) return Results.Ok("Already seeded."); var tenant = new Ordering.Domain.Tenant { Name = "Demo Tenant" }; var customer = new Ordering.Domain.Customer { Name = "Demo Customer", Tenant = tenant }; var order = new Ordering.Domain.Order { Customer = customer, Status = Ordering.Domain.OrderStatus.Submitted }; order.Items.Add(new Ordering.Domain.OrderItem { Sku = "WP-CAP-001", Quantity = 2, UnitPrice = 12.50m }); order.Items.Add(new Ordering.Domain.OrderItem { Sku = "WP-BALL-002", Quantity = 1, UnitPrice = 29.99m }); db.Add(order); await db.SaveChangesAsync(); return Results.Ok("Seeded."); });

What this endpoint does

  • Checks whether any tenants already exist
    → prevents reseeding on every run

  • Creates:

    • A Tenant

    • A Customer linked to that tenant

    • An Order with two OrderItems

  • Persists everything in a single unit of work via EF Core

This makes it ideal as a one-click validation endpoint during development.



Fix: “You need to update Aspire.AppHost” (and then run the solution)

At this point we’ve added PostgreSQL, wired it into the AppHost, and enabled EF Core migrations at startup. Now when you try to run the app you may see an error along the lines of:

You need to update Aspire.AppHost

(or a message indicating the AppHost / Aspire workload is out of date)

This usually happens when your solution projects (or Visual Studio) are targeting a newer Aspire version than the one currently installed on your machine.

In Visual Studio:

  • Right-click the solution → Manage NuGet Packages for Solution…

Go to the Updates tab and update anything Aspire-related, especially:

  • Aspire.Hosting.*
  • Microsoft.Extensions.ServiceDiscovery.*

  • Microsoft.Extensions.Hosting.* (if prompted)

  • Aspire.* components you’ve referenced

    After updating, Build the solution.

    Tip: Keep OrderDemo.AppHost and OrderDemo.ServiceDefaults on the same Aspire package family/version to avoid runtime mismatches.


    How to use the seed endpoint when running under Aspire (using Postman)

    To verify that everything is wired up correctly end-to-end, we’ll invoke the /seed endpoint using Postman.
    Postman is an ideal tool at this stage because it allows you to:
    • Call APIs without needing the UI to be finished
    • Explicitly control HTTP methods, headers, and payloads
    • Quickly validate API behavior during local development
    • Re-run the same requests consistently as your API evolves

    Run the AppHost

    1. Make sure OrderDemo.AppHost is set as the startup project

    2. Press F5 (or Ctrl+F5)

    Aspire will:

    • Start the PostgreSQL container

    • Start Ordering.Api

    • Start Ordering.Blazor

    • Apply EF Core migrations on API startup

    The Aspire dashboard will open automatically.

    Get the API URL from the Aspire dashboard

    • In the Aspire dashboard, click Ordering.Api

    • Copy the exposed HTTP endpoint URL (for example):

    https://localhost:7283

    (Your port will differ.)

    Create the request in Postman

    • Open Postman

    • Create a new request

    • Configure it as follows:

      • Method: POST

      • URL:

        https://localhost:7283/seed
    • No headers or body are required for this endpoint

    Send the request

    Click Send.

    You should receive one of the following responses:

    • First run

      200 OK "Seeded."
    • Subsequent runs

      200 OK "Already seeded."

    Why Postman is a great fit for API-first development

    Using Postman here is deliberate:

    • UI-independent testing: you can validate your API and data access long before the Blazor UI is complete.

    • Fast feedback loop: a single request confirms that:

      • PostgreSQL is running

      • Migrations applied correctly

      • EF Core can persist data

      • The API pipeline is functioning end-to-end

    • Repeatable and shareable: Postman requests can be saved, grouped into collections, and shared with other developers or testers.

    • Real-world usage simulation: Postman calls your API exactly the way a real client would over HTTP, making it a much more realistic test than calling services directly in code.


    At this point, if /seed succeeds, you’ve proven that:

    • Aspire orchestration is working

    • Infrastructure is ready

    • Your API is correctly connected to PostgreSQL

    From here, you can confidently move on to building query endpoints and wiring up the Blazor UI on top of a known-good foundation.


    Adding a Simple Blazor Page (Seed Data + Load Orders)

    This Blazor page is intentionally simple.

    It does two things only:

    1. Calls the API to seed demo data

    2. Calls the API to load orders for a tenant

    It does not:

    • filter data

    • apply business logic

    • use LINQ

    • contain lambdas beyond trivial UI callbacks

That restraint is important; we'll come back to it later in the series.


    Configure HttpClient in the Blazor project

    In Ordering.Blazor/Program.cs, make sure you have an HttpClient configured to talk to the API.

using Ordering.Blazor.Components;

var builder = WebApplication.CreateBuilder(args);

builder.AddServiceDefaults();

// The API name must match what you registered in the AppHost.
builder.Services.AddHttpClient("OrderingApi", client =>
{
    client.BaseAddress = new Uri("https://ordering-api");
});

builder.Services.AddRazorComponents()
    .AddInteractiveServerComponents();

var app = builder.Build();

app.MapDefaultEndpoints();

app.UseStaticFiles(); // serve static assets (your template may use MapStaticAssets instead)
app.UseAntiforgery(); // required by MapRazorComponents

app.MapRazorComponents<App>()
    .AddInteractiveServerRenderMode();

app.Run();

    Aspire resolves https://ordering-api automatically via service discovery.
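That resolution comes from the ServiceDefaults project Aspire generated earlier: it typically wires service discovery (and resilience) into every HttpClient the application creates. Roughly, inside its Extensions.cs, you'll find something like this (shown as a sketch; your generated file may differ slightly):

builder.Services.ConfigureHttpClientDefaults(http =>
{
    // Standard retry/timeout/circuit-breaker policies.
    http.AddStandardResilienceHandler();

    // Lets logical names like "https://ordering-api" resolve to the endpoints Aspire assigned.
    http.AddServiceDiscovery();
});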


    Create DTOs in the Ordering.Application project

In the Ordering.Application project, add a folder called Dtos and create the following files. Each one needs a using for the Ordering.Domain namespace, and OrderSummaryDto also needs using System.Linq.Expressions:

    CustomerDto.cs

    public sealed record CustomerDto(
        Guid Id,
        Guid TenantId,
        string Name)
    {
        public static CustomerDto FromEntity(Customer customer)
            => new(
                customer.Id,
                customer.TenantId,
                customer.Name
            );
    }

    OrderDto.cs

    public sealed record OrderDto(
      Guid Id,
      Guid CustomerId,
      DateTime CreatedUtc,
      OrderStatus Status,
      decimal Total)
    {
        public static OrderDto FromEntity(Order order)
            => new(
                order.Id,
                order.CustomerId,
                order.CreatedUtc,
                order.Status,
                order.Total
            );
    }

    OrderItemDto.cs

    public sealed record OrderItemDto(
    Guid Id,
    Guid OrderId,
    string Sku,
    int Quantity,
    decimal UnitPrice)
    {
        public static OrderItemDto FromEntity(OrderItem item)
            => new(
                item.Id,
                item.OrderId,
                item.Sku,
                item.Quantity,
                item.UnitPrice
            );
    }


    OrderSummaryDto.cs

    public sealed record OrderSummaryDto(
        Guid Id,
        DateTime CreatedUtc,
        OrderStatus Status,
        string Customer,
        int ItemCount,
        decimal Total)
    {
            public static Expression<Func<Order, OrderSummaryDto>> Projection
                => o => new OrderSummaryDto(
                    o.Id,
                    o.CreatedUtc,
                    o.Status,
                    o.Customer.Name,
                    o.Items.Count,
                    o.Items.Sum(i => i.Quantity * i.UnitPrice)
                );
    }


    TenantDto.cs

    public sealed record TenantDto(
        Guid Id,
        string Name)
    {
        public static TenantDto FromEntity(Tenant tenant)
            => new(
                tenant.Id,
                tenant.Name
            );
    }

    We deliberately do not expose our domain entities directly to the outside world. Instead, we use Data Transfer Objects (DTOs) such as CustomerDto, OrderDto, OrderItemDto, OrderSummaryDto, and TenantDto to shape exactly what data leaves the domain and how it is consumed by the API or UI.

    Clear Separation of Concerns

    Our domain entities (Customer, Order, OrderItem, Tenant, etc.) are designed to model business behaviour and rules, not API contracts. By introducing DTOs:

    • Domain models remain free to evolve

    • API responses stay stable and intentional

    • UI and client code are decoupled from persistence concerns

    This separation becomes critical as the system grows and business logic becomes more complex.

    Explicit Mapping Is Intentional

    Each DTO contains a FromEntity method (or projection) that makes the transformation explicit and discoverable:

    public static CustomerDto FromEntity(Customer customer)

    This approach has several advantages:

    • No “magic” reflection or runtime mapping

    • Compile-time safety

    • Easy debugging and refactoring

    • Clear ownership of how data is exposed

    If a field is missing or renamed, the compiler tells us immediately.

    Optimised Queries with Projections

    For read-heavy endpoints, especially lists and summaries, we go a step further.
    OrderSummaryDto exposes an Expression<Func<Order, OrderSummaryDto>> projection:

    public static Expression<Func<Order, OrderSummaryDto>> Projection

    This allows Entity Framework to:

    • Translate the projection directly into SQL

    • Fetch only the required columns

    • Perform aggregations (Count, Sum) in the database

    • Avoid loading full entity graphs into memory

    The result is better performance, lower memory usage, and cleaner query code.
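If you want to see that translation for yourself once the app is running, EF Core can hand you the SQL without executing it. A quick sketch, for example inside a temporary endpoint or integration test:

var sql = db.Orders
    .Where(o => o.Customer.TenantId == tenantId)
    .Select(OrderSummaryDto.Projection)
    .ToQueryString();

Console.WriteLine(sql); // the single SELECT that EF Core will send to PostgreSQL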

    Preventing Over-Fetching and Data Leaks

    DTOs also act as a security boundary:

    • Internal fields (audit data, internal IDs, navigation properties) are never exposed accidentally

    • Multi-tenant boundaries (TenantId) are explicit and visible

    • API contracts remain intentional and minimal

    This is especially important in multi-tenant systems where accidental data exposure can be catastrophic.

    Why Records?

    We use record types for DTOs because they are:

    • Immutable by default

    • Value-based (ideal for data transfer)

    • Concise and readable

    • Well-suited to API responses

    DTOs represent data snapshots, not mutable domain objects — records reflect that intent perfectly.

    Key Takeaway

    DTOs are not boilerplate; they are a deliberate architectural boundary.

    By combining:

    • Explicit FromEntity mappings

    • Database-level projections

    • Immutable record types

    we get clearer code, safer APIs, and significantly better performance without relying on hidden abstractions.


    Add the Blazor page (two buttons)

    Create a new Razor component:

    Pages/Orders.razor

    @page "/orders" @inject IHttpClientFactory HttpClientFactory <h3>Order Demo</h3> <div class="mb-3"> <button class="btn btn-primary me-2" @onclick="SeedData"> Seed Demo Data </button> <button class="btn btn-secondary" @onclick="LoadOrders"> Load Orders </button> </div> @if (_loading) { <p>Loading...</p> } else if (_orders.Any()) { <table class="table table-striped"> <thead> <tr> <th>Order Id</th> <th>Created</th> <th>Status</th> <th>Customer</th> <th>Items</th> <th>Total</th> </tr> </thead> <tbody> @foreach (var order in _orders) { <tr> <td>@order.Id</td> <td>@order.CreatedUtc.ToString("u")</td> <td>@order.Status</td> <td>@order.Customer</td> <td>@order.ItemCount</td> <td>@order.Total.ToString("C")</td> </tr> } </tbody> </table> } @code { private readonly List<OrderSummaryDto> _orders = new(); private bool _loading; // Hard-coded for demo simplicity // Later posts will replace this with proper tenant context private Guid _tenantId = Guid.Empty; private async Task SeedData() { var client = HttpClientFactory.CreateClient("OrderingApi"); await client.PostAsync("/seed", null); // In a real app you would return the tenantId from the API // For now, this is intentionally simple for the demo _tenantId = await LoadTenantId(client); } private async Task LoadOrders() { if (_tenantId == Guid.Empty) return; _loading = true; _orders.Clear(); var client = HttpClientFactory.CreateClient("OrderingApi"); var orders = await client.GetFromJsonAsync<List<OrderSummaryDto>>( $"/tenants/{_tenantId}/orders"); if (orders is not null) _orders.AddRange(orders _loading = false; } private async Task<Guid> LoadTenantId(HttpClient client) { // Simple helper endpoint could be added later // For now, this is a placeholder to keep the demo focused var tenantId = await client.GetFromJsonAsync<Guid>("/tenants/demo"); return tenantId; } }

    Add a minimal API endpoint to return the demo tenant ID

    In Ordering.Api/Program.cs, add this temporary demo endpoint:

    app.MapGet("/tenants/demo", async (OrderingDbContext db) => { var tenantId = await db.Tenants .Select(t => t.Id) .FirstOrDefaultAsync(); return tenantId == Guid.Empty ? Results.NotFound() : Results.Ok(tenantId); });

    This keeps the Blazor page simple without polluting the architecture.


    Add a navigation link (optional but nice)

    In Ordering.Blazor/Components/Layout/NavMenu.razor (or equivalent):

    <li class="nav-item px-3"> <NavLink class="nav-link" href="orders"> <span class="oi oi-list-rich" aria-hidden="true"></span> Orders </NavLink> </li>

    Why this page is intentionally minimal

    This page is not about Blazor.

    It exists so that:

    • you can see data flow end-to-end

    • Aspire orchestration feels real

    • later lambda posts have a UI-driven use case

    • you can point out what the UI is deliberately not doing

    Later in the series, this page becomes a teaching tool:

    • “Why we don’t filter here”

    • “Why we don’t sort here”

    • “Why lambdas belong in the Application layer”


    Result

    When you press F5 on the AppHost:

    1. Aspire starts Postgres, API, and Blazor

    2. Navigate to /orders

    3. Click Seed Demo Data

    4. Click Load Orders

    5. See real data from Postgres rendered in Blazor

    At this point, your demo application:

    • is runnable

    • feels complete

    • supports real EF Core queries

    • and is perfectly set up for the lambda deep dives


    How This Sets Up the Lambda Deep Dives

    With this foundation in place, I can now show:

    • the same lambda compiled as a delegate and captured as an expression tree

    • how EF Core consumes expression trees

    • what breaks translation and why

    • how expression composition works

    • how performance changes based on execution context

    And every example will point to real code in this demo app.


    What’s Next

    In the next post, I’ll zoom in on lambdas themselves and answer the question most developers never fully unpack:

    How can the same lambda syntax execute in memory in one place, and turn into SQL in another?

    That distinction is the key to everything that follows.


    Final Thoughts

    This repository will evolve as the series progresses; each post builds on the same codebase.

    Lambdas aren’t powerful because they’re concise.

    They’re powerful because of what they compile into, where that compilation happens, and how that intent flows through an architecture.

    That’s why I didn’t want to start this series with syntax.

    In real systems, lambda expressions:

    • cross architectural boundaries

    • get reused in ways you didn’t originally expect

    • determine whether work happens in memory or in the database

    • and quietly influence performance, scalability, and correctness

    If you don’t have a clear mental model of the system they live in, it’s easy to write code that looks clean but behaves poorly under real load.

    This demo application exists to make those trade-offs visible.

    Every post that follows will build on this foundation, using real queries, real data, and real architectural constraints, not contrived examples that never fail.

    If anything in this post sparked questions, disagreements, or “we’ve been bitten by that before” moments, I’d genuinely love to hear about it.

    You can reach me via https://dotnetconsult.tech, whether it’s to discuss EF Core internals, lambda expression pitfalls, architecture decisions, or challenges you’re seeing in your own systems.

    The next post dives directly into lambdas themselves, specifically, how the same syntax can compile into very different runtime constructs, and why that distinction underpins everything else in this series.

    Thanks for reading, and I’ll see you in the next deep dive.
