Deploy a Micronaut function as a GraalVM Native Image to AWS Lambda

Learn how to deploy a Micronaut function built as a GraalVM Native Image to an AWS Lambda Custom Runtime

Authors: Sergio del Amo

Micronaut Version: 3.2.7

1. Introduction

Please read about Micronaut AWS Lambda Support to learn more about the different Lambda runtimes, triggers, and handlers, and how to integrate them with a Micronaut application.

The biggest problem with Java applications on Lambda is mitigating cold starts. Executing a GraalVM Native Image of a Micronaut function in a Lambda Custom Runtime is a solution to this problem.

If you want to respond to triggers such as queue events, S3 events, or single endpoints, you should code your Micronaut functions as Serverless functions.

2. Getting Started

In this guide, we will deploy a Micronaut function written in Kotlin as a GraalVM Native image to an AWS Lambda custom runtime.

3. What you will need

To complete this guide, you will need the following:

  • Some time on your hands

  • A decent text editor or IDE

  • JDK 1.8 or greater installed with JAVA_HOME configured appropriately

4. Solution

We recommend that you follow the instructions in the next sections and create the application step by step. However, you can go right to the completed example.

5. Writing the Application

Create an application using the Micronaut Command Line Interface or with Micronaut Launch.

mn create-function-app example.micronaut.micronautguide --features=graalvm,aws-lambda --build=maven --lang=kotlin
If you don’t specify the --build argument, Gradle is used as the build tool.
If you don’t specify the --lang argument, Java is used as the language.

The previous command creates a Micronaut application with the default package example.micronaut in a directory named micronautguide.

If you use Micronaut Launch, select Serverless function as application type and add the graalvm and aws-lambda features.

5.1. Enable annotation Processing

If you use Java or Kotlin and IntelliJ IDEA, make sure to enable annotation processing.


6. Code

The generated project contains sample code. Let’s explore it.

We want to support JavaBeans as input and output types.

The input is a Book object:

package example.micronaut

import io.micronaut.core.annotation.Introspected

@Introspected
class Book {
    var name: String? = null
}

  • Annotate the class with @Introspected to generate the Bean Metainformation at compile time.

The output is a BookSaved object:

package example.micronaut

import io.micronaut.core.annotation.Introspected

@Introspected
class BookSaved {
    var name: String? = null
    var isbn: String? = null
}

  • Annotate the class with @Introspected to generate the Bean Metainformation at compile time.

The application contains a class extending MicronautRequestHandler:

package example.micronaut

import io.micronaut.function.aws.MicronautRequestHandler
import java.util.UUID

class BookRequestHandler : MicronautRequestHandler<Book?, BookSaved?>() {

    override fun execute(input: Book?): BookSaved? {
        return if (input != null) {
            val bookSaved = BookSaved()
            bookSaved.name = input.name
            bookSaved.isbn = UUID.randomUUID().toString()
            bookSaved
        } else {
            null
        }
    }
}
The generated test shows how to verify the function behaviour:

package example.micronaut

import org.junit.jupiter.api.Assertions
import org.junit.jupiter.api.Test

class BookRequestHandlerTest {

    @Test
    fun testHandler() {
        val bookRequestHandler = BookRequestHandler()
        val book = Book()
        book.name = "Building Microservices"
        val bookSaved = bookRequestHandler.execute(book)
        Assertions.assertEquals(book.name, bookSaved!!.name)
        Assertions.assertNotNull(bookSaved.isbn)
        bookRequestHandler.applicationContext.close()
    }
}
  • When you instantiate the Handler, the application context starts.

  • Remember to close your application context when you end your test. You can use your handler to obtain it.

  • Invoke the execute method of the handler.

7. Testing the Application

To run the tests:

./mvnw test

8. Lambda

Create a Lambda function. As the runtime, select Custom Runtime.

create function bootstrap

8.1. Upload Code

The Micronaut framework eases the deployment of your functions as a Custom AWS Lambda runtime.

The main API you will interact with is AbstractMicronautLambdaRuntime. This is an abstract class which you can subclass to create your custom runtime mainClass. That class includes the code to perform the Processing Tasks described in the Custom Runtime documentation.

The generated project contains such a class:

package example.micronaut

import com.amazonaws.services.lambda.runtime.RequestHandler
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyResponseEvent
import io.micronaut.function.aws.runtime.AbstractMicronautLambdaRuntime
import java.net.MalformedURLException

class BookLambdaRuntime : AbstractMicronautLambdaRuntime<APIGatewayProxyRequestEvent?, APIGatewayProxyResponseEvent?, Book?, BookSaved?>() {

    override fun createRequestHandler(vararg args: String?): RequestHandler<Book?, BookSaved?>? {
        return BookRequestHandler()
    }

    companion object {
        @JvmStatic
        fun main(vararg args: String) {
            try {
                BookLambdaRuntime().run(*args)
            } catch (e: MalformedURLException) {
                e.printStackTrace()
            }
        }
    }
}

To build the application, run:

./mvnw package -Dpackaging=docker-native -Dmicronaut.runtime=lambda -Pgraalvm

The above command generates a ZIP file which contains a GraalVM Native Image of the application, and a bootstrap file which executes the native image. The GraalVM Native Image of the application is generated inside a Docker container.
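The bootstrap file is what the Custom Runtime executes on startup. A minimal sketch of such a script is shown below; the executable name micronautguide is an assumption here, and the actual name inside your generated ZIP may differ:

```shell
#!/bin/sh
set -eu
# Launch the native executable. BookLambdaRuntime's main method then
# polls the Lambda Runtime API in a loop, dispatching each invocation
# to the request handler.
./micronautguide
```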

Once you have a ZIP file, upload it

lambda custom runtime uploadcode
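If you prefer the command line over the console, you can also create the function with the AWS CLI. This is a sketch: the function name, role ARN, and ZIP path are placeholders you must adapt to your account and build output:

```shell
# Create a Custom Runtime Lambda function from the generated ZIP.
# provided.al2 is the Amazon Linux 2 custom runtime; the handler value
# is not used by the custom runtime but the parameter is required.
aws lambda create-function \
  --function-name micronautguide \
  --runtime provided.al2 \
  --handler bootstrap \
  --role arn:aws:iam::123456789012:role/lambda-role \
  --zip-file fileb://target/function.zip
```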

8.2. Handler

The handler used is the one created at BookLambdaRuntime.

Thus, you don’t need to specify the handler in the AWS Lambda console.

However, I like to specify it in the console as well:


lambda custom runtime bookrequest handler

8.3. Test

You can test it easily. As Event Template, use apigateway-aws-proxy to get started:

test event
  "body": "{\"name\": \"Building Microservices\"}",
  "resource": "/",
  "path": "/",
  "httpMethod": "POST",
  "isBase64Encoded": false,
  "queryStringParameters": {},
  "multiValueQueryStringParameters": {},
  "pathParameters": {},
  "stageVariables": {},

You should see a 200 response:

test result
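Besides the console, you can invoke the deployed function from the command line with the AWS CLI. This sketch assumes the function name used earlier and that event.json contains the test event above:

```shell
# Invoke the function with the API Gateway proxy test event and print
# the response. With AWS CLI v2 you may also need:
#   --cli-binary-format raw-in-base64-out
aws lambda invoke \
  --function-name micronautguide \
  --payload file://event.json \
  response.json
cat response.json
```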

9. Next steps

Explore more features with Micronaut Guides.

Read more about:

10. Help with the Micronaut Framework

Object Computing, Inc. (OCI) sponsored the creation of this Guide. A variety of consulting and support services are available.