The Foundation: How I’m Building a Real-World Ordering Demo for Deep EF Core & Lambda Exploration
Before I write a single deep dive about lambda expressions, expression trees, or EF Core translation, I want to be very deliberate about what those examples live inside.
Because lambdas don’t exist in isolation.
They behave very differently depending on where they live, who consumes them, and whether they are executed or translated.
So instead of starting with syntax, I’m starting with a real application, one that’s complex enough to break if you misuse lambdas, but still small enough to understand.
In this post, I’ll walk you through:
- The demo application I’m building
- The architectural choices I’ve made
- And why each decision matters for the rest of the series
Why I’m Not Using Toy Examples
I could explain lambdas with a one-line filter over an in-memory list.
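Something like this, a snippet that works anywhere and proves nothing (a made-up example, purely for illustration):

```csharp
using System.Collections.Generic;
using System.Linq;

var numbers = new List<int> { 1, 2, 3, 4, 5 };

// The lambda compiles to a plain delegate and runs entirely in memory.
var evens = numbers.Where(n => n % 2 == 0).ToList(); // [2, 4]
```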
But that example never fails.
It doesn’t show where execution actually happens, what breaks in translation, or why expression reuse matters.
In production systems, lambdas:
- cross layers
- get reused in different contexts
- are composed dynamically
- and eventually hit a database provider that doesn’t care how elegant your syntax was
So I need a demo where:
- bad lambda usage actually hurts
- good lambda design pays off
The Architecture at a Glance (Before We Go Deeper)
Before I talk about domains, entities, or queries, I want to step back and show you the full shape of the system. Because when we start discussing lambdas later, where the code lives will matter just as much as what the code does.
At a high level, this demo application is split into clear logical layers, orchestrated locally using .NET Aspire: a Blazor frontend, a Web API, an Application layer, a Domain layer, and an Infrastructure layer.
This separation isn’t academic, it’s what allows me to:
- Understand execution context
- Control where lambdas run
- Test translation behaviour in isolation
- And benchmark real performance differences later
Let me briefly explain what belongs in each layer before we dig deeper into them.
Blazor: User Intent, Not Business Logic
The Blazor frontend exists to:
- capture user intent
- render data
- manage UI state
It does not:
- filter collections
- build LINQ queries
- know anything about EF Core
This is deliberate.
Later in the series, I’ll show why filtering in the UI with lambdas is one of the easiest ways to accidentally:
- pull too much data
- hide performance issues
- break multi-tenant boundaries
By keeping Blazor “dumb”, I ensure:
- all lambdas that matter live on the server
- execution context is predictable
API: Orchestration, Not Query Logic
The Web API layer:
- accepts requests
- validates input
- invokes application use cases
- returns DTOs
It does not:
- compose queries
- contain Where, Select, or Include logic
- understand how data is persisted
This keeps the API thin and makes it a perfect boundary for:
- testing
- performance measurement
- future protocol changes (HTTP → gRPC, etc.)
Application Layer: Where Lambdas Live
This is the most important layer for the entire series.
The Application layer is where:
- query intent is defined
- lambdas are expressed as expressions
This is where I’ll place:
- Expression<Func<T, bool>>
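A sketch of the kind of thing I mean (the names here are placeholders, not final demo code):

```csharp
using System;
using System.Linq.Expressions;

// Application layer: query intent expressed as *data*, with no EF Core dependency.
// Order and its TenantId come from the Domain layer.
public static class OrderQueries
{
    // A reusable, translatable tenant filter.
    public static Expression<Func<Order, bool>> ForTenant(Guid tenantId)
        => order => order.TenantId == tenantId;
}
```

Because this returns an expression tree rather than a compiled delegate, Infrastructure can hand it to EF Core for SQL translation, and tests can inspect it without a database.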
Why?
Because this layer:
- is independent of EF Core
- can be tested in isolation
- expresses what we want, not how it’s executed
Everything we discuss about lambdas later will anchor here.
Domain Layer: Stability and Constraints
The Domain layer contains:
- entities
It knows:
- what an Order is
- what states are valid
- what relationships exist
It knows nothing about:
- databases
- LINQ
- lambdas
- EF Core
This separation matters because:
- domain rules shouldn’t care how queries are executed
- lambdas should describe selection, not behaviour
Infrastructure: Execution and Translation
Infrastructure is where:
- EF Core lives
- PostgreSQL is configured
- expression trees are finally executed or translated
This is the layer that:
- consumes the expressions
- turns them into SQL
- executes them efficiently (or not)
By isolating Infrastructure, I can:
- swap providers
- benchmark behaviour
- inspect generated SQL
- demonstrate translation failures cleanly
Aspire: The Glue That Makes This Practical
All of this is orchestrated locally using .NET Aspire.
Aspire:
- provisions PostgreSQL
- wires connection strings
- starts all services together
- removes local setup friction
This is what makes the demo runnable, benchmarkable, and reproducible.
Why This Architectural Overview Matters
Without this context, later posts about lambdas become confusing.
When I say:
- “this lambda runs in memory”
- “this expression gets translated”
- “this breaks SQL generation”
…it only makes sense if you know:
- which layer you’re in
- what dependencies are present
- and who is responsible for execution
This section is the map.
Why I’m Using PostgreSQL
PostgreSQL is intentional.
It gives me:
- real query plans
- realistic performance characteristics
- excellent container support
More importantly:
- EF Core’s translation behaviour is visible
- ToQueryString() becomes meaningful
- benchmarks show real differences
Using Postgres prevents the demo from lying to you.
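As a quick sketch of what that visibility looks like (assuming a DbContext with an Orders set, as wired up later in this post):

```csharp
// Prints the SQL EF Core would send to Postgres, without executing the query.
var sql = db.Orders
    .Where(o => o.TenantId == tenantId)
    .ToQueryString();

Console.WriteLine(sql);
```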
Why Aspire Is Non-Negotiable for This Demo
Most demo repos fail before the code runs.
I don’t want readers to:
- install Postgres manually
- configure connection strings
- run scripts
- debug local setup issues
With Aspire:
- infrastructure is declared in code
- services and databases are wired together
- connection strings flow automatically
- the entire system starts with F5
This matters because:
- readers actually run the code
- benchmarks are reproducible
- query behaviour matches production more closely
Aspire turns this from a blog series into a working reference system.
Setting Up the Demo App in Visual Studio 2026 (Aspire + Postgres + Clean Architecture)
We’re going to start by creating an empty solution in Visual Studio and naming it something like "OrderDemo".
First, open Visual Studio 2026 and select "Create a new project" from the start screen.
On the Create a new project dialog, use the template search box and type “Blank Solution”.
This should return the Blank Solution template. Select it and click Next.
On the next screen, name the solution OrderDemo.
I’m going to store mine in my usual working directory at: "D:/Development/Blogs/OrderDemo"
You can store it wherever you like; the location has no impact on how the application runs at runtime.
Finally, click Create to generate the solution.
Add the Clean Architecture projects
Right-click Solution → Add → New Project and create these projects:
Class libraries
- Ordering.Domain (Class Library)
- Ordering.Application (Class Library)
- Ordering.Infrastructure (Class Library)

Applications

- Ordering.Api (ASP.NET Core Web API; minimal API is fine)
- Ordering.Blazor (Blazor Web App; Blazor Server is easiest for a demo)
Now add references:
- Ordering.Application references Ordering.Domain
- Ordering.Infrastructure references Ordering.Application and Ordering.Domain
- Ordering.Api references Ordering.Application and Ordering.Infrastructure
- Ordering.Blazor references Ordering.Application
Add .NET Aspire
- Right-click Ordering.Api → Add → .NET Aspire Orchestrator Support.
- When prompted for the Aspire version, select Latest (at the time of writing, this is 9.5), then click OK.
Visual Studio will create two new projects:
- OrderDemo.ServiceDefaults
- OrderDemo.AppHost
- Next, right-click Ordering.Blazor → Add → .NET Aspire Orchestrator Support.
- You will be informed that an Aspire orchestrator already exists and that Ordering.Blazor will be added to the existing orchestration.
- Click OK to complete the setup.
Add Postgres to the .NET Aspire AppHost
- Right-click OrderDemo.AppHost → Add → .NET Aspire Packages
- From the list of available resources, select Aspire.Hosting.PostgreSQL and click Next.
- Open Program.cs in OrderDemo.AppHost.
- Locate the following line:

  var builder = DistributedApplication.CreateBuilder(args);

- Immediately below it, add the PostgreSQL resource:
What This Code Is Doing
var postgres = builder.AddPostgres("postgres").AddDatabase("orderingdb");

AddPostgres("postgres") registers a PostgreSQL container with Aspire named postgres. Aspire will:

- Start a PostgreSQL container when the AppHost runs
- Manage ports, credentials, and lifecycle automatically
- Expose connection information to other services

AddDatabase("orderingdb") creates a logical database named orderingdb inside the PostgreSQL instance.

var postgres = ... stores a reference to the PostgreSQL resource so other projects (APIs, workers, etc.) can depend on it.
At this point, PostgreSQL exists only at the orchestration level, nothing is connected to it yet.
Adding PostgreSQL as a Dependency of Ordering.Api
Still in Program.cs of OrderDemo.AppHost, locate where the API project is registered:
We are going to update our Aspire wiring as follows:
var builder = DistributedApplication.CreateBuilder(args);
var postgres = builder.AddPostgres("postgres").AddDatabase("orderingdb");
var api = builder.AddProject<Projects.Ordering_Api>("ordering-api")
    .WithReference(postgres)
    .WaitFor(postgres);

builder.AddProject<Projects.Ordering_Blazor>("ordering-blazor")
    .WithReference(api);
builder.Build().Run();
What the Dependency Does
- WithReference(postgres) tells Aspire that Ordering.Api depends on PostgreSQL, so a connection string will be automatically injected into the API’s configuration
- WaitFor(postgres) ensures the API does not start until PostgreSQL is ready
- WithReference(api) tells Aspire that Ordering.Blazor depends on Ordering.Api, so the API’s URI is automatically injected into Ordering.Blazor’s configuration
This means:
- No hardcoded connection strings
- No local secrets in appsettings.json
- Consistent local, CI, and cloud behavior
Add EF Core Postgres integration the “Aspire way”

In Ordering.Api, install the following NuGet packages:

- Npgsql.EntityFrameworkCore.PostgreSQL
- Microsoft.EntityFrameworkCore.Design (for migrations tooling)
- Microsoft.EntityFrameworkCore.Tools (optional but common)

In Ordering.Infrastructure, install the following NuGet packages:

- Microsoft.EntityFrameworkCore.Design (for migrations tooling)
- Microsoft.EntityFrameworkCore.Tools (optional but common)
If you’re using Aspire service defaults, you may already have logging/health packages, so you typically don’t need anything else.
Create the entity model (Domain)
In Ordering.Domain, add these entities (minimal but “real”):
Tenant.cs
Customer.cs
Order.cs
OrderItem.cs
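Here is a minimal sketch of the shape I have in mind; the exact members are illustrative, but they cover everything the rest of this post relies on (TenantId, a Customer navigation, items, totals, and a status):

```csharp
using System;
using System.Collections.Generic;

public class Tenant
{
    public Guid Id { get; set; }
    public string Name { get; set; } = "";
}

public class Customer
{
    public Guid Id { get; set; }
    public Guid TenantId { get; set; }   // multi-tenant boundary
    public string Name { get; set; } = "";
}

public enum OrderStatus { Pending, Paid, Shipped, Cancelled }

public class Order
{
    public Guid Id { get; set; }
    public Guid TenantId { get; set; }
    public Guid CustomerId { get; set; }
    public Customer Customer { get; set; } = null!;   // enables navigation filtering
    public OrderStatus Status { get; set; }           // status workflow
    public List<OrderItem> Items { get; set; } = new();
}

public class OrderItem
{
    public Guid Id { get; set; }
    public Guid OrderId { get; set; }
    public string ProductName { get; set; } = "";
    public decimal UnitPrice { get; set; }
    public int Quantity { get; set; }
}
```

I’ve left the order total as something to compute in projections (Items.Sum(...)) rather than a stored column; either choice works for the series.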
This model is intentionally chosen to create later “lambda pressure”:
- multi-tenant filtering (TenantId)
- navigation filtering (Order.Customer.TenantId)
- aggregates (Total)
- status workflows
Create DbContext (Infrastructure)
In Ordering.Infrastructure, install the following NuGet package:
Microsoft.EntityFrameworkCore
then add:
OrderingDbContext.cs
DependencyInjection.cs
We keep the “DbContext registration” in the API because Aspire owns the connection provisioning.
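A minimal sketch of the DbContext (mapping details omitted; adjust to taste):

```csharp
using Microsoft.EntityFrameworkCore;

// Infrastructure layer: the only place that knows EF Core exists.
public class OrderingDbContext : DbContext
{
    public OrderingDbContext(DbContextOptions<OrderingDbContext> options)
        : base(options) { }

    public DbSet<Tenant> Tenants => Set<Tenant>();
    public DbSet<Customer> Customers => Set<Customer>();
    public DbSet<Order> Orders => Set<Order>();
    public DbSet<OrderItem> OrderItems => Set<OrderItem>();
}
```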
Wire up the API (Aspire + EF Core + minimal endpoints)
In Ordering.Api/Program.cs:
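One plausible shape for the wiring, sketched under this post’s assumptions (OrderSummaryDto.Projection refers to the projection described in the DTO section; your exact code may differ):

```csharp
using Microsoft.EntityFrameworkCore;

var builder = WebApplication.CreateBuilder(args);

builder.AddServiceDefaults(); // Aspire service defaults (telemetry, health checks)

// "orderingdb" must match the AppHost resource name from AddDatabase("orderingdb").
builder.Services.AddDbContext<OrderingDbContext>(options =>
    options.UseNpgsql(builder.Configuration.GetConnectionString("orderingdb")));

var app = builder.Build();

// The lambda below is captured as an expression tree and translated to SQL.
app.MapGet("/tenants/{tenantId}/orders", async (Guid tenantId, OrderingDbContext db) =>
    await db.Orders
        .Where(o => o.TenantId == tenantId)
        .Select(OrderSummaryDto.Projection)
        .ToListAsync());

app.Run();
```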
A few important points:
- The "orderingdb" name must match the resource name you created in the AppHost via .AddDatabase("orderingdb").
- The /tenants/{tenantId}/orders endpoint is intentionally designed to produce real SQL translation behaviour later.
Create the database schema (EF Core migrations)
With Aspire, PostgreSQL is created and started on-demand when you run OrderDemo.AppHost. That means:
- You can’t rely on a pre-existing database being there.
- You want the API to bootstrap itself once Postgres is available.
- The most reliable approach is: start container → connect → apply migrations.
This is exactly what Database.Migrate() is for.
First, let’s create a migration.
Creating the Initial EF Core Migration (Package Manager Console)
Open Package Manager Console
In Visual Studio:
- Go to Tools → NuGet Package Manager
- Select Package Manager Console
Make sure the solution is fully loaded before continuing.
Select the correct project
At the top of the Package Manager Console:
- Set Default project to Ordering.Api
This ensures EF Core knows where your DbContext and migrations live.
Create the migration
Run the following command:

Add-Migration InitialCreate

That’s it.
EF Core will:
- Inspect OrderingDbContext
- Generate the migration files
- Add a Migrations folder to Ordering.Api
Verify the migration
In Ordering.Api you should now see a new Migrations folder containing the InitialCreate migration and a model snapshot.
Open InitialCreate.cs and confirm it contains:
- Table creation logic
- Correct PostgreSQL column types
- Clean Up() and Down() methods
Why this works with Aspire
Even though PostgreSQL does not yet exist, this is fine because:
- Migrations are generated from the model, not the database
- Aspire will start PostgreSQL later
- Database.Migrate() will apply this migration automatically on startup (we’ll add this in the next section)
You do not need to run Update-Database manually.
Add EF Core migration application at API startup

In Ordering.Api/Program.cs, immediately after the line var app = builder.Build();, apply any pending migrations, then keep the rest of the pipeline unchanged.
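A sketch of the startup block, using the standard scoped-migration pattern:

```csharp
var app = builder.Build();

// Apply EF Core migrations before the app starts serving requests.
using (var scope = app.Services.CreateScope())
{
    var db = scope.ServiceProvider.GetRequiredService<OrderingDbContext>();
    db.Database.Migrate(); // creates the schema and applies any pending migrations
}

// ...then keep the rest of the pipeline (endpoints, app.Run()) unchanged.
```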
What this does
- Creates a scoped service provider (same as a request scope)
- Resolves your OrderingDbContext
- Runs Migrate(), which:
  - Creates the database schema if it’s not there
  - Applies any pending migrations
  - Does nothing if everything is already up-to-date

So in local Aspire dev:

- First run: container starts → schema gets created
- Future runs: migration check is fast → no changes applied
Ensure AppHost is the startup project
In Solution Explorer, right-click OrderDemo.AppHost and select Set as Startup Project.
You always run the system from the AppHost; it’s the orchestrator that starts Postgres, the API, and Blazor together.
Run the application: Start the AppHost
Now press F5 (or Ctrl+F5).
What should happen:
- Aspire starts the PostgreSQL container
- Aspire starts Ordering.Api
- Aspire starts Ordering.Blazor
- The API runs db.Database.Migrate() on startup and creates the schema in the orderingdb database
You should see the Aspire dashboard open automatically. From there you can click into:
- Ordering.Blazor to open the UI
- Ordering.Api to view logs and test endpoints
- postgres to confirm the resource is running
Seeding the database (sanity check)
The API exposes a simple seed endpoint to make it easy to verify that:
- PostgreSQL is running
- Migrations were applied correctly
- EF Core can read and write data
What this endpoint does
- Checks whether any tenants already exist → prevents reseeding on every run
- Creates:
  - A Tenant
  - A Customer linked to that tenant
  - An Order with two OrderItems
- Persists everything in a single unit of work via EF Core
This makes it ideal as a one-click validation endpoint during development.
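The endpoint I use looks roughly like this (entity member names are illustrative):

```csharp
app.MapPost("/seed", async (OrderingDbContext db) =>
{
    // Prevent reseeding on every run.
    if (await db.Tenants.AnyAsync())
        return Results.Ok("Already seeded");

    var tenant = new Tenant { Id = Guid.NewGuid(), Name = "Acme" };
    var customer = new Customer { Id = Guid.NewGuid(), TenantId = tenant.Id, Name = "Jane Doe" };

    var order = new Order
    {
        Id = Guid.NewGuid(),
        TenantId = tenant.Id,
        CustomerId = customer.Id,
        Items =
        {
            new OrderItem { Id = Guid.NewGuid(), ProductName = "Widget", UnitPrice = 9.99m, Quantity = 2 },
            new OrderItem { Id = Guid.NewGuid(), ProductName = "Gadget", UnitPrice = 19.99m, Quantity = 1 },
        }
    };

    // One unit of work: everything is saved together.
    db.AddRange(tenant, customer, order);
    await db.SaveChangesAsync();

    return Results.Ok("Seeded");
});
```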
Fix: “You need to update Aspire.AppHost” (and then run the solution)
At this point we’ve added PostgreSQL, wired it into the AppHost, and enabled EF Core migrations at startup. Now when you try to run the app you may see an error along the lines of:
You need to update Aspire.AppHost
(or a message indicating the AppHost / Aspire workload is out of date)
This usually happens when your solution projects (or Visual Studio) are targeting a newer Aspire version than the one currently installed on your machine.
In Visual Studio:
- Right-click the solution → Manage NuGet Packages for Solution…
- Go to the Updates tab and update anything Aspire-related, especially:
  - Aspire.Hosting.*
  - Microsoft.Extensions.ServiceDiscovery.*
  - Microsoft.Extensions.Hosting.* (if prompted)
  - Any Aspire.* components you’ve referenced
- After updating, build the solution.

Tip: Keep OrderDemo.AppHost and OrderDemo.ServiceDefaults on the same Aspire package family/version to avoid runtime mismatches.
How to use the seed endpoint when running under Aspire (using Postman)
To verify that everything is wired up correctly end-to-end, we’ll invoke the /seed endpoint using Postman.
Postman is an ideal tool at this stage because it allows you to:
- Call APIs without needing the UI to be finished
- Explicitly control HTTP methods, headers, and payloads
- Quickly validate API behavior during local development
- Re-run the same requests consistently as your API evolves
Run the AppHost
- Make sure OrderDemo.AppHost is set as the startup project
- Press F5 (or Ctrl+F5)

Aspire will:

- Start the PostgreSQL container
- Start Ordering.Api
- Start Ordering.Blazor
- Apply EF Core migrations on API startup

The Aspire dashboard will open automatically.
Get the API URL from the Aspire dashboard
- In the Aspire dashboard, click Ordering.Api
- Copy the exposed HTTP endpoint URL (your port will differ)
Create the request in Postman
- Open Postman
- Create a new request
- Configure it as follows:
  - Method: POST
  - URL: the API base URL you copied, followed by /seed
  - No headers or body are required for this endpoint
Send the request
Click Send.
You should receive one of the following responses:

- First run: the demo data is created and a success message is returned
- Subsequent runs: the API reports that the data already exists and skips reseeding
Why Postman is a great fit for API-first development
Using Postman here is deliberate:
- UI-independent testing: you can validate your API and data access long before the Blazor UI is complete.
- Fast feedback loop: a single button click confirms that:
  - PostgreSQL is running
  - Migrations applied correctly
  - EF Core can persist data
  - The API pipeline is functioning end-to-end
- Repeatable and sharable: Postman requests can be saved, grouped into collections, and shared with other developers or testers.
- Real-world usage simulation: Postman calls your API exactly the way a real client would over HTTP, making it a much more realistic test than calling services directly in code.
At this point, if /seed succeeds, you’ve proven that:
- Aspire orchestration is working
- Infrastructure is ready
- Your API is correctly connected to PostgreSQL
From here, you can confidently move on to building query endpoints and wiring up the Blazor UI on top of a known-good foundation.
Adding a Simple Blazor Page (Seed Data + Load Orders)
This Blazor page is intentionally simple.
It does two things only:

1. Calls the API to seed demo data
2. Calls the API to load orders for a tenant

It does not:

- filter data
- apply business logic
- use LINQ
- contain lambdas beyond trivial UI callbacks
That restraint is important; we’ll come back to it later in the series.
Configure HttpClient in the Blazor project
In Ordering.Blazor/Program.cs, make sure you have an HttpClient configured to talk to the API.
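A sketch of the registration, assuming the API was named "ordering-api" in the AppHost; the https+http scheme lets Aspire service discovery resolve the real address at runtime:

```csharp
builder.Services.AddHttpClient("ordering-api", client =>
    client.BaseAddress = new Uri("https+http://ordering-api"));
```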
Create DTOs in the Ordering.Application project
In the Ordering.Application project, add a folder called Dtos and create the following:

- CustomerDto.cs
- OrderDto.cs
- OrderItemDto.cs
- OrderSummaryDto.cs
- TenantDto.cs
We deliberately do not expose our domain entities directly to the outside world. Instead, we use Data Transfer Objects (DTOs) such as CustomerDto, OrderDto, OrderItemDto, OrderSummaryDto, and TenantDto to shape exactly what data leaves the domain and how it is consumed by the API or UI.
Clear Separation of Concerns
Our domain entities (Customer, Order, OrderItem, Tenant, etc.) are designed to model business behaviour and rules, not API contracts. By introducing DTOs:
- Domain models remain free to evolve
- API responses stay stable and intentional
- UI and client code are decoupled from persistence concerns
This separation becomes critical as the system grows and business logic becomes more complex.
Explicit Mapping Is Intentional
Each DTO contains a FromEntity method (or projection) that makes the transformation explicit and discoverable:
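For example, a sketch of one of these DTOs (the exact shape in your code may differ):

```csharp
public record OrderItemDto(Guid Id, string ProductName, decimal UnitPrice, int Quantity)
{
    // Explicit, compiler-checked mapping: no reflection, no hidden conventions.
    public static OrderItemDto FromEntity(OrderItem item) =>
        new(item.Id, item.ProductName, item.UnitPrice, item.Quantity);
}
```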
This approach has several advantages:
- No “magic” reflection or runtime mapping
- Compile-time safety
- Easy debugging and refactoring
- Clear ownership of how data is exposed
If a field is missing or renamed, the compiler tells us immediately.
Optimised Queries with Projections
For read-heavy endpoints, especially lists and summaries, we go a step further.
OrderSummaryDto exposes an Expression<Func<Order, OrderSummaryDto>> projection:
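A sketch of that projection (member names are illustrative):

```csharp
using System;
using System.Linq;
using System.Linq.Expressions;

public record OrderSummaryDto(Guid Id, OrderStatus Status, int ItemCount, decimal Total)
{
    // An expression tree, not a delegate: EF Core can inspect and translate it,
    // so Count and Sum execute in the database, not in memory.
    public static Expression<Func<Order, OrderSummaryDto>> Projection =>
        order => new OrderSummaryDto(
            order.Id,
            order.Status,
            order.Items.Count,
            order.Items.Sum(i => i.UnitPrice * i.Quantity));
}
```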
This allows Entity Framework to:
- Translate the projection directly into SQL
- Fetch only the required columns
- Perform aggregations (Count, Sum) in the database
- Avoid loading full entity graphs into memory
The result is better performance, lower memory usage, and cleaner query code.
Preventing Over-Fetching and Data Leaks
DTOs also act as a security boundary:
- Internal fields (audit data, internal IDs, navigation properties) are never exposed accidentally
- Multi-tenant boundaries (TenantId) are explicit and visible
- API contracts remain intentional and minimal
This is especially important in multi-tenant systems where accidental data exposure can be catastrophic.
Why Records?
We use record types for DTOs because they are:
- Immutable by default
- Value-based (ideal for data transfer)
- Concise and readable
- Well-suited to API responses
DTOs represent data snapshots, not mutable domain objects — records reflect that intent perfectly.
Key Takeaway
DTOs are not boilerplate; they are a deliberate architectural boundary.
By combining:
- Explicit FromEntity mappings
- Database-level projections
- Immutable record types
we get clearer code, safer APIs, and significantly better performance without relying on hidden abstractions.
Add the Blazor page (two buttons)
Create a new Razor component:
Pages/Orders.razor
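A minimal sketch of the page (DTO names and the named HttpClient are assumptions carried over from earlier sections):

```razor
@page "/orders"
@inject IHttpClientFactory HttpClientFactory

<h3>Orders</h3>

<button @onclick="Seed">Seed Demo Data</button>
<button @onclick="LoadOrders">Load Orders</button>

<ul>
    @foreach (var order in _orders)
    {
        <li>@order.Id | @order.Status | @order.Total</li>
    }
</ul>

@code {
    private List<OrderSummaryDto> _orders = new();

    private async Task Seed()
    {
        var client = HttpClientFactory.CreateClient("ordering-api");
        await client.PostAsync("/seed", null);
    }

    private async Task LoadOrders()
    {
        var client = HttpClientFactory.CreateClient("ordering-api");
        var tenantId = await client.GetFromJsonAsync<Guid>("/demo/tenant-id");
        _orders = await client.GetFromJsonAsync<List<OrderSummaryDto>>(
            $"/tenants/{tenantId}/orders") ?? new();
    }
}
```

Note there is no filtering and no LINQ here; just two HTTP calls and rendering.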
Add a minimal API endpoint to return the demo tenant ID
In Ordering.Api/Program.cs, add this temporary demo endpoint:
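One way to sketch it (demo-only; delete it once real tenant handling exists):

```csharp
// Temporary: returns the first tenant's Id so the UI can query orders
// without any tenant-management UI.
app.MapGet("/demo/tenant-id", async (OrderingDbContext db) =>
    await db.Tenants.Select(t => t.Id).FirstAsync());
```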
This keeps the Blazor page simple without polluting the architecture.
Add a navigation link (optional but nice)
In Ordering.Blazor/Components/Layout/NavMenu.razor (or equivalent):
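Something along these lines, matching the default Blazor template markup (your layout may differ):

```razor
<div class="nav-item px-3">
    <NavLink class="nav-link" href="orders">Orders</NavLink>
</div>
```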
Why this page is intentionally minimal
This page is not about Blazor.
It exists so that:
- you can see data flow end-to-end
- Aspire orchestration feels real
- later lambda posts have a UI-driven use case
- you can point out what the UI is deliberately not doing
Later in the series, this page becomes a teaching tool:
- “Why we don’t filter here”
- “Why we don’t sort here”
- “Why lambdas belong in the Application layer”
Result
When you press F5 on the AppHost:
- Aspire starts Postgres, API, and Blazor
- Navigate to /orders
- Click Seed Demo Data
- Click Load Orders
- See real data from Postgres rendered in Blazor
At this point, your demo application:
- is runnable
- feels complete
- supports real EF Core queries
- and is perfectly set up for the lambda deep dives
How This Sets Up the Lambda Deep Dives
With this foundation in place, I can now show:
- the same lambda compiled as:
  - a delegate
  - an expression tree
- how EF Core consumes expression trees
- what breaks translation and why
- how expression composition works
- how performance changes based on execution context
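The first of those distinctions can be sketched in two lines; same syntax, very different compiled artifacts (Order and OrderStatus are the demo’s domain types):

```csharp
using System;
using System.Linq.Expressions;

// Compiled to IL: runs in memory, opaque to EF Core.
Func<Order, bool> asDelegate = o => o.Status == OrderStatus.Paid;

// Compiled to an expression tree: data that EF Core can inspect and translate to SQL.
Expression<Func<Order, bool>> asExpression = o => o.Status == OrderStatus.Paid;
```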
And every example will point to real code in this demo app.
What’s Next
In the next post, I’ll zoom in on lambdas themselves and answer the question most developers never fully unpack:
How can the same lambda syntax execute in memory in one place, and turn into SQL in another?
That distinction is the key to everything that follows.
Final Thoughts
Lambdas aren’t powerful because they’re concise.
They’re powerful because of what they compile into, where that compilation happens, and how that intent flows through an architecture.
That’s why I didn’t want to start this series with syntax.
In real systems, lambda expressions:
- cross architectural boundaries
- get reused in ways you didn’t originally expect
- determine whether work happens in memory or in the database
- and quietly influence performance, scalability, and correctness
If you don’t have a clear mental model of the system they live in, it’s easy to write code that looks clean but behaves poorly under real load.
This demo application exists to make those trade-offs visible.
Every post that follows will build on this foundation, using real queries, real data, and real architectural constraints, not contrived examples that never fail.
If anything in this post sparked questions, disagreements, or “we’ve been bitten by that before” moments, I’d genuinely love to hear about it.
You can reach me via https://dotnetconsult.tech, whether it’s to discuss EF Core internals, lambda expression pitfalls, architecture decisions, or challenges you’re seeing in your own systems.
The next post dives directly into lambdas themselves, specifically, how the same syntax can compile into very different runtime constructs, and why that distinction underpins everything else in this series.
Thanks for reading, and I’ll see you in the next deep dive.

