@serverless-stack/aws-lambda-ric

AWS Lambda Runtime Interface Client for NodeJS

AWS Lambda NodeJS Runtime Interface Client

We have open-sourced a set of software packages, Runtime Interface Clients (RIC), that implement the Lambda Runtime API, allowing you to seamlessly extend your preferred base images to be Lambda compatible. The Lambda Runtime Interface Client is a lightweight interface that allows your runtime to receive requests from and send requests to the Lambda service.
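
Conceptually, a Runtime Interface Client is a small loop around the Runtime API's HTTP endpoints: it asks the Lambda service for the next invocation, calls your handler with the event payload, and posts the result back. The sketch below illustrates that loop with Node's built-in http module; it is a simplification for intuition only, not this package's actual implementation (a real client also reports errors, propagates tracing headers, and builds a much richer context object).

// Simplified sketch of the loop a Runtime Interface Client runs.
const http = require("http");

// The Lambda service exposes the Runtime API at this host:port.
const [host, port] = process.env.AWS_LAMBDA_RUNTIME_API.split(":");

function call(method, path, body) {
    return new Promise((resolve, reject) => {
        const req = http.request({ host, port, method, path }, (res) => {
            let data = "";
            res.on("data", (chunk) => (data += chunk));
            res.on("end", () => resolve({ headers: res.headers, body: data }));
        });
        req.on("error", reject);
        req.end(body);
    });
}

async function run(handler) {
    for (;;) {
        // 1. Block until the Lambda service hands over the next invocation.
        const next = await call("GET", "/2018-06-01/runtime/invocation/next");
        const requestId = next.headers["lambda-runtime-aws-request-id"];

        // 2. Invoke the user handler with the deserialized event payload.
        const result = await handler(JSON.parse(next.body), { awsRequestId: requestId });

        // 3. Post the handler's result back to the Lambda service.
        await call(
            "POST",
            "/2018-06-01/runtime/invocation/" + requestId + "/response",
            JSON.stringify(result)
        );
    }
}

// e.g. run(require("./app").handler);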

The Lambda NodeJS Runtime Interface Client is vended through npm. You can include this package in your preferred base image to make that base image Lambda compatible.

Requirements

The NodeJS Runtime Interface Client package currently supports NodeJS versions:

  • 10.x
  • 12.x
  • 14.x

Usage

Creating a Docker Image for Lambda with the Runtime Interface Client

The first step is to choose the base image. The supported Linux OS distributions are:

  • Amazon Linux 2
  • Alpine
  • CentOS
  • Debian
  • Ubuntu

The Runtime Interface Client can be installed outside the Dockerfile as a dependency of the function you want to run in Lambda (run the command below in your function directory to add the dependency to package.json):

npm install aws-lambda-ric --save

or inside the Dockerfile:

RUN npm install aws-lambda-ric
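
If you install the client as a function dependency, the function's package.json ends up with an entry along these lines (the name my-function and the version shown are purely illustrative; use whatever npm resolves at install time):

{
  "name": "my-function",
  "version": "1.0.0",
  "dependencies": {
    "aws-lambda-ric": "^2.0.0"
  }
}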

The next step is to copy your Lambda function code into the image's working directory.

# Copy function code
RUN mkdir -p ${FUNCTION_DIR}
COPY myFunction/* ${FUNCTION_DIR}

WORKDIR ${FUNCTION_DIR}

# If the dependency is not in package.json uncomment the following line
# RUN npm install aws-lambda-ric

RUN npm install

Then set the ENTRYPOINT of the Docker image to invoke the Runtime Interface Client, and set the CMD argument to specify the desired handler.

Example Dockerfile (to keep the final image small, it uses a multi-stage build):

# Define custom function directory
ARG FUNCTION_DIR="/function"

FROM node:12-buster as build-image

# Include global arg in this stage of the build
ARG FUNCTION_DIR

# Install aws-lambda-cpp build dependencies
RUN apt-get update && \
    apt-get install -y \
    g++ \
    make \
    cmake \
    unzip \
    libcurl4-openssl-dev

# Copy function code
RUN mkdir -p ${FUNCTION_DIR}
COPY myFunction/* ${FUNCTION_DIR}

WORKDIR ${FUNCTION_DIR}

# If the dependency is not in package.json uncomment the following line
# RUN npm install aws-lambda-ric

RUN npm install

# Grab a fresh slim copy of the image to reduce the final size
FROM node:12-buster-slim

# Include global arg in this stage of the build
ARG FUNCTION_DIR

# Set working directory to function root directory
WORKDIR ${FUNCTION_DIR}

# Copy in the built dependencies
COPY --from=build-image ${FUNCTION_DIR} ${FUNCTION_DIR}

ENTRYPOINT ["/usr/local/bin/npx", "aws-lambda-ric"]
CMD ["app.handler"]

Example NodeJS handler app.js:

"use strict";

exports.handler = async (event, context) => {
    return 'Hello World!';
}
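
The handler receives the deserialized event as its first argument and a context object as its second. A slightly fuller, hypothetical variant could use both (awsRequestId is one of the standard context properties):

"use strict";

// Hypothetical variant: read a field from the event and the request id
// from the context object supplied by the runtime client.
exports.handler = async (event, context) => {
    const name = event.name || "World";
    return `Hello ${name}! (request ${context.awsRequestId})`;
};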

Local Testing

To make it easy to locally test Lambda functions packaged as container images, we open-sourced a lightweight web server, the Lambda Runtime Interface Emulator (RIE), which allows a function packaged as a container image to accept HTTP requests. You can install the AWS Lambda Runtime Interface Emulator on your local machine to test your function. Then, when you run the container image, you set the entry point to be the emulator.

To install the emulator and test your Lambda function

  1. From your project directory, run the following command to download the RIE from GitHub and install it on your local machine.
mkdir -p ~/.aws-lambda-rie && \
    curl -Lo ~/.aws-lambda-rie/aws-lambda-rie https://github.com/aws/aws-lambda-runtime-interface-emulator/releases/latest/download/aws-lambda-rie && \
    chmod +x ~/.aws-lambda-rie/aws-lambda-rie
  2. Run your Lambda function image using the docker run command.
docker run -d -v ~/.aws-lambda-rie:/aws-lambda -p 9000:8080 \
    --entrypoint /aws-lambda/aws-lambda-rie \
    myfunction:latest \
        /usr/local/bin/npx aws-lambda-ric app.handler

This runs the image as a container and starts up an endpoint locally at http://localhost:9000/2015-03-31/functions/function/invocations.

  3. Post an event to the following endpoint using a curl command:
curl -XPOST "http://localhost:9000/2015-03-31/functions/function/invocations" -d '{}'

This command invokes the function running in the container image and returns a response.
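
With the example app.js handler above, the return value is serialized as JSON, so the expected response body is simply the quoted string:

"Hello World!"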

Alternatively, you can include the RIE as part of your base image. See the AWS documentation on how to build the RIE into your base image.

Development

Building the package

Clone this repository and run:

make init
make build

Running tests

Make sure the project is built:

make init build

Then,

  • to run unit tests: make test
  • to run integration tests: make test-integ
  • to run smoke tests: make test-smoke

Troubleshooting

While running integration tests, you might encounter a Docker Hub rate limit error with the following body:

You have reached your pull rate limit. You may increase the limit by authenticating and upgrading: https://www.docker.com/increase-rate-limits

To fix this issue, authenticate to a Docker Hub account by setting the Docker Hub credentials as the following CodeBuild environment variables:

DOCKERHUB_USERNAME=<dockerhub username>
DOCKERHUB_PASSWORD=<dockerhub password>

The recommended way is to have the CodeBuild job retrieve these credentials from AWS Secrets Manager.
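
For example, a CodeBuild buildspec can pull both values from a Secrets Manager secret; the secret name dockerhub-credentials and its JSON keys below are placeholders:

version: 0.2

env:
  secrets-manager:
    # Placeholder secret name and JSON keys -- substitute your own.
    DOCKERHUB_USERNAME: dockerhub-credentials:username
    DOCKERHUB_PASSWORD: dockerhub-credentials:password

phases:
  pre_build:
    commands:
      # Log in to Docker Hub so image pulls are not rate limited.
      - echo "$DOCKERHUB_PASSWORD" | docker login --username "$DOCKERHUB_USERNAME" --password-stdin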

Security

If you discover a potential security issue in this project, we ask that you notify AWS/Amazon Security via our vulnerability reporting page. Please do not create a public GitHub issue.

License

This project is licensed under the Apache-2.0 License.