Cold Starts in AWS Lambda


This article describes AWS Lambda, the dynamically scaled, pay-per-execution compute service. Lambda instances are added and removed dynamically. When a new instance handles its first request, the response time increases; this delay is called a cold start.

Read more: Cold Starts in Serverless Functions.

When Does Cold Start Happen?

The very first cold start happens when the first request comes in after deployment.

After that request is processed, the instance stays alive to be reused for subsequent requests. There is no predefined threshold after which the instance gets recycled; the empirical data show some variance in the idle period.

The following chart estimates the probability of an instance being recycled after a given period of inactivity:

Probability of a cold start happening before minute X

Cold starts typically happen 5 to 7 minutes after the previous request.

Read more: When Does Cold Start Happen on AWS Lambda?

How Slow Are Cold Starts?

The following chart shows the typical range of cold starts in AWS Lambda, broken down per language. The darker ranges are the most common 67% of durations, and lighter ranges include 95%.

Typical cold start durations per language

JavaScript, Python, Go, Java, and Ruby are all comparable: most of the time they complete within 400 milliseconds and almost always within 700 milliseconds.

C# is a distinct underdog. The chart shows statistics for instances with 2+ GB of allocated RAM, which are faster than smaller ones (see below). Cold starts at this instance size span between 0.4 and 0.9 seconds.

Lambda functions packaged as Docker images are slower still. A basic container based on the recommended Node.js base image starts up in 0.6 to 1.4 seconds.

View detailed distributions: Cold Start Duration per Language.

Does Instance Size Matter?

AWS Lambda has a setting to define the memory size that gets allocated to a single instance of a function. Are larger instances faster to load?

Most language runtimes show no visible difference in cold start duration across instance sizes. Here is the chart for JavaScript:

Cold start durations per instance size, JavaScript functions

However, .NET (C#/F#) functions are the exception. The bigger the instance, the faster it starts up:

Cold start durations per instance size, C# functions

Same comparison for larger functions: Cold Start Duration per Instance Size.

Does Package Size Matter?

The above charts show the statistics for tiny “Hello World”-style functions. Adding dependencies and thus increasing the deployed package size will further increase the cold start durations.

The following chart compares three JavaScript functions with varying numbers of referenced NPM packages:

Cold start durations per deployment size (zipped)

Indeed, functions with many dependencies can be 5 to 10 times slower to start.

The following chart compares three functions packaged as container images of different sizes. Each container is based on the official Node.js image but has different additional files baked in:

Cold start durations per Docker image extra size

Container image size does not seem to influence the cold start duration.

What Is The Effect Of VPC Access?

AWS Lambda might need to access resources inside Amazon Virtual Private Cloud (Amazon VPC). In the past, configuring VPC access slowed down cold starts significantly.

This is no longer the case: the effect of VPC access is now minimal:

Cold start durations of the same Node.js Lambda with and without VPC access


Cloud developer and researcher.
Software engineer at Pulumi. Microsoft Azure MVP.
