Using native ahead-of-time compilation (AOT) for AWS Lambda
27 February 2025
Performance is crucial for serverless applications, especially when dealing with cold starts - the time taken for Lambda functions to initialise their runtime environments when invoked from an idle state. Since their inception, cold start times for .NET Lambdas have been particularly problematic due to the extra overhead of initialising the .NET runtime and just-in-time compilation.
One technique for mitigating cold start times in .NET applications is Native Ahead-of-Time (AOT) compilation, introduced in .NET 7 and significantly expanded in .NET 8. Here’s what you need to know.

What is native AOT in .NET?
Traditionally, C# code is compiled into Intermediate Language (IL) at build time. During runtime, .NET’s Just-In-Time (JIT) compiler then converts the IL into executable machine code.
With Native AOT, both compilation steps are combined into a single process, producing a trimmed, native machine code executable at build time. This removes JIT overhead, delivering a smaller, faster, and ready-to-run application - crucial for optimising AWS Lambda performance.
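Enabling Native AOT is a project-level switch. A minimal sketch of the relevant csproj properties (the `InvariantGlobalization` setting is an optional extra that further shrinks the binary):

```xml
<PropertyGroup>
  <TargetFramework>net8.0</TargetFramework>
  <!-- Compile to native machine code at publish time instead of relying on the JIT -->
  <PublishAot>true</PublishAot>
  <!-- Optional: drop culture-specific globalisation data to reduce binary size -->
  <InvariantGlobalization>true</InvariantGlobalization>
</PropertyGroup>
```

Publishing with `dotnet publish -c Release -r linux-arm64` then produces a self-contained native executable for that target, with trimming applied automatically.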
Why should you use native AOT?
1. Smaller Application Size
When we build an application using Native AOT, a single executable is produced, containing only the minimum required code. This is achieved through aggressive trimming, which removes unused framework code, metadata, and runtime dependencies. Additionally, because JIT compilation is completely eliminated, the final application is leaner and optimised for deployment.
The result? A smaller binary, reducing AWS Lambda package size, which is a proven technique for improving cold start performance.
2. Faster Startup Time
Cold starts are one of the biggest challenges in serverless applications. They affect both end users, through slower response times, and the engineering team, through higher compute costs.
So, what performance gains can we expect from utilising Native AOT? A great benchmark comes from the serverless-dotnet-demo GitHub repository, which compares cold and warm start times across different .NET implementations. The table below shows the cold and warm start times of Lambdas running a minimal API on .NET 6 and Native AOT on .NET 8. Here we observe a 76.34% reduction in median cold start time - just by enabling Native AOT. This translates to lower compute costs and a better experience for end users.
| | Cold Start (ms) - p50 | Cold Start (ms) - p99 | Warm Start (ms) - p50 | Warm Start (ms) - p99 |
|---|---|---|---|---|
| .NET 6 Minimal API on ARM64 | 2105.21 | 2215.31 | 6.2 | 20.08 |
| .NET 8 Native AOT Minimal API on ARM64 | 498 | 895 | 5.6 | 16.1 |
3. Lower Memory Usage
.NET applications typically run within the Common Language Runtime (CLR), a virtual machine that demands additional memory and compute resources for JIT compilation, metadata storage, and garbage collection overhead. These overheads are removed with AOT, leading to lower memory usage. This matters because Lambda costs are calculated based on memory allocation and execution time, so by using Native AOT we can often run on a lower memory tier, optimising costs without sacrificing performance.
The tradeoffs
1. Limited Reflection Support
Reflection in .NET allows applications to inspect and manipulate metadata, types, methods, properties, and assemblies at runtime. Because Native AOT removes the runtime compilation step, many reflection-based operations no longer work dynamically. In particular, dynamic JSON serialisation no longer works by default, rendering the `Newtonsoft.Json` package unusable. This has major compatibility implications, which are explained in the next section.
One workaround is to use System.Text.Json source generators to precompile JSON serialisation logic at build time. This requires listing each serialisable type in a `[JsonSerializable(typeof(...))]` attribute on a partial `JsonSerializerContext` class, allowing the compiler to generate the necessary serialisation code and ensuring AOT compatibility.
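As a minimal sketch of the source-generator pattern (the `OrderDto` type and `LambdaJsonContext` names are illustrative, not from any particular library):

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

// The type we want to serialise without runtime reflection.
public record OrderDto(string Id, decimal Total);

// A partial context class: the System.Text.Json source generator emits
// serialisation code at build time for every type listed here.
[JsonSerializable(typeof(OrderDto))]
public partial class LambdaJsonContext : JsonSerializerContext
{
}

public static class Example
{
    public static string Serialise(OrderDto order) =>
        // Passing the generated type info avoids reflection entirely,
        // which keeps this call AOT-compatible.
        JsonSerializer.Serialize(order, LambdaJsonContext.Default.OrderDto);
}
```

Every type that crosses a serialisation boundary (request bodies, responses, events) needs its own `[JsonSerializable]` entry on the context class, which is where most of the migration effort tends to go.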
2. Compatibility Issues
Since Native AOT is relatively new in the .NET ecosystem, many libraries are not yet fully compatible with it. In our experience, the determining factor is often whether a library relies on `Newtonsoft.Json` for serialisation. Most NuGet packages are gradually adding support, such as the Lambda Powertools package and the AWS SDK. With some packages, such as the Auth0 SDK, we weren’t so lucky and had to build our own custom client as a workaround.
3. Increased Build Times and Platform-Specific Builds
Native AOT must fully compile and trim the application at build time, which can significantly increase project build times. In our experience, each Lambda project takes ~2-3 minutes per build, which can impact developer feedback loops. This slower build time also affects CI/CD pipelines, though this can be mitigated by parallelising the builds across your projects.
Another key consideration is platform-specific compilation. Unlike traditional .NET applications, which can run on any platform with the .NET runtime, AOT produces a native binary that is OS-specific. This means, for example, that a Lambda function built on a Windows machine cannot run on Lambda’s Linux environment.
Additionally, if you want to use Graviton Lambda functions, you must build your AOT .NET application on an ARM system. Thankfully, GitHub Actions provides ARM64 runners (which are currently in beta), making it easier to automate builds for ARM-based Lambda functions.
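A minimal sketch of a workflow using a GitHub-hosted ARM64 runner (the runner label reflects what is available at the time of writing, and the project path is hypothetical):

```yaml
jobs:
  build:
    # GitHub-hosted ARM64 runner, so the native binary targets linux-arm64
    runs-on: ubuntu-24.04-arm
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '8.0.x'
      # Produces a self-contained native executable suitable for Graviton Lambdas
      - run: dotnet publish src/MyLambda -c Release -r linux-arm64 -o publish
```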
Should you use AOT in your serverless application?
As always, the answer is it depends. The performance benefits of Native AOT, such as faster cold starts and lower memory usage, make it a great choice for new .NET serverless projects, especially as more NuGet packages are becoming AOT-compatible.
But what about an existing, mature serverless project? I would probably advise against a full migration due to a number of challenges:
- JSON serialisation refactoring: Since dynamic serialisation won’t work by default, migrating to AOT requires significant refactoring to use source generators (`[JsonSerializable]`).
- Dependency compatibility: If your project relies on libraries that are not yet AOT-compatible, you may need to replace them or build custom workarounds.
- Overall effort: For large projects, the engineering effort to convert to AOT may outweigh the benefits.
A migration carries less risk if you have extensive testing in place. Before adopting AOT, ensure that your project has strong test coverage, particularly for serialisation and external dependencies, to catch regressions early.
Some alternatives
What are the alternatives if a full AOT migration isn’t possible? Firstly, I would recommend AWS Lambda SnapStart. It became generally available for .NET in November 2024. SnapStart can reduce cold start times without requiring a major refactor. It works by taking a snapshot of the Lambda’s memory state after initialisation and restoring it for subsequent invocations. Although this is slower than Native AOT, it is still a big improvement compared to a standard .NET Lambda.
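Enabling SnapStart is a configuration change rather than a code change. A sketch using an AWS SAM template (the function name and handler string are hypothetical):

```yaml
Resources:
  ApiFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: dotnet8
      Handler: MyApp::MyApp.Function::Handler  # hypothetical handler
      # SnapStart snapshots are taken of published versions, so an alias is needed
      AutoPublishAlias: live
      SnapStart:
        ApplyOn: PublishedVersions
```

Because the snapshot is taken after initialisation, any one-off startup work (configuration loading, dependency injection container setup) is captured once and restored on each invocation.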
Another alternative approach is to prioritise AOT migration for user-facing Lambdas, such as API handlers, where reducing cold starts would have the biggest impact. This means scheduled or event-driven Lambdas would remain unchanged where cold start times aren’t as impactful.
Article By

Adam McAllister
Software Engineer