Serverless run times with custom Bash AWS Lambda layers
A Code Bash
Amazon Web Services (AWS) Lambda functions [1] have grown immeasurably popular over the last year or two and will unquestionably have a place in the future when it comes to running repetitive tasks on cloud infrastructure while managing to avoid the costs and overhead of maintaining servers.
Something that tripped me up initially when I began adopting Lambda functions, however, was the favoritism shown to the Python programming language. Node.js and Python seem to have become the de facto sweetheart languages (or run times) for Lambda functions, with Ruby appearing at points, too. Although I can pick away at and tweak Python scripts, learning the language has never been a priority for me; as a result, creating a payload for Lambda functions was never easy.
I discovered something very clever the other day that allows Bash to be used as a custom run time in AWS Lambda functions. A GitHub project [2] clearly extols the virtues of using Bash as a serverless run time. In this article, I explore a relatively new addition to Lambda functions in AWS called "layers" and, with the use of a Bash layer provided on the GitHub page by author Graham Krizek [3], I run a Bash script that uses the latest and greatest serverless tech on AWS.
Bashes and Bumps
In simple terms, you might say that an AWS layer is an additional library you can inject into your Lambda function to run alongside your payload. Why would you want to do that, you might well ask? Well, you might have a common piece of code that you want to share around your infrastructure that gets updated periodically. It might contain anything, such as SSL/TLS certificates for your payload to connect securely into a service, or a shared library from a vendor that's useful to bolt onto your payloads and treat as a module.
According to AWS, a layer "… let[s] you keep your deployment package small, which makes development easier" and is a "… ZIP archive that contains libraries, a custom runtime, or other dependencies" [4].
Currently, you can add a maximum of five layers to a function, with a few size limits so that the "… total unzipped size of the function and all layers can't exceed the unzipped deployment package size limit of 250 MB" [4]. If you're likely to embrace serverless tech to a massive degree, the AWS page on Lambda limits [5] will help explain the relatively sane limitations currently enforced.
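To make that more concrete, a layer is published as a versioned ZIP archive from the CLI. You won't need to do this yourself to follow along (the Bash layer used below is already published publicly by its author), but a minimal sketch looks something like the following, with a hypothetical layer name and filename:

$ aws lambda publish-layer-version --layer-name my-shared-layer \
  --description "Shared code for my payloads" \
  --zip-file fileb://my-shared-layer.zip \
  --compatible-runtimes provided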
When I've created Python Lambda functions in the past, I've always added layers – after getting the Python working. Of course, that won't be possible when it comes to running Bash scripts because they're not native to Lambda functions in AWS. Therefore, the symbiotic relationship between the function and the layer means that they need each other to exist and, as a result, must be present at the same time.
With that little bit of context in place and without further procrastination, I'll crack on and show you how to create a new function.
Injection
After a non-trivial amount of eye strain, I recently deployed layers in Lambda functions with Terraform on a large AWS estate. At the time of writing, Terraform's layer support still needs a little maturation to meet some people's requirements. That said, Terraform development moves like lightning, and features are improved and added all the time. Whether you are interested in using Terraform or not, its makers discuss the important aws_lambda_layer_version resource on their website [6]:
resource "aws_lambda_layer_version" "lambda_layer"{ filename = "lambda_layer_payload.zip" layer_name = "lambda_layer_name" compatible_runtimes = ["nodejs8.10"] }
The long and short of using this resource is that you simply reference the layer's Amazon Resource Name (ARN) from within your aws_lambda_function resource to inject the layer into a Lambda function, configuring the layer as in the example above. One gotcha: if, instead of the filename argument, the payload is pulled in from an AWS S3 bucket with the s3_key argument, the S3 bucket must exist in the same region where the Lambda function will run.
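To make the injection concrete, here is a minimal sketch of the function side. The resource names and IAM role are hypothetical, and, depending on your Terraform version, you may need the older "${...}" interpolation syntax around the references:

resource "aws_lambda_function" "lambda_function" {
  function_name = "lambda_function_name"
  filename      = "lambda_function_payload.zip"
  handler       = "index.handler"
  runtime       = "nodejs8.10"

  # Hypothetical IAM execution role, assumed to be defined elsewhere
  role = aws_iam_role.lambda_role.arn

  # Inject the layer by referencing its ARN
  layers = [aws_lambda_layer_version.lambda_layer.arn]
}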
In the example in this article, I'm going to use a combination of the AWS command-line interface (CLI) and the AWS Management Console to create and populate a Lambda function, so I'll leave Terraform behind for now.
No Gooey
To begin, I presume you have set up your AWS credentials. If you're not sure how to do that, AWS has instructions [7] that walk you through the process step by step. You'll need to ensure that you have a relatively new version of the CLI installed on your machine (I always use the pip Python package installer when possible to avoid older packages) and have created an access key for your user, along with a secret key. The AWS instructions also mention using the Identity and Access Management (IAM) section of the AWS Management Console to create your credentials.
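If you need them, the two commands below cover that setup; this is a minimal sketch that assumes pip is already on your machine (aws configure will prompt for your access key, secret key, and default region):

$ pip install --upgrade awscli
$ aws configure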
Once you're ready, log in to the AWS Management Console in your browser and navigate to the Lambda section of the website. Look for the orange Create function button at top right to make sure you're in the right place. Here, you'll be able to view any newly created Lambda functions.
The GitHub repository offers instructions on how to make use of the Bash layer so you can run Bash scripts natively. The README file [8] contains all the information needed and starts off by showing how to create a brand new Lambda function from scratch via the AWS CLI, with the Bash layer already injected. The command from the GitHub README is:
$ aws lambda create-function --function-name bashFunction \
  --role bashFunctionRole --handler index.handler --runtime provided \
  --layers arn:aws:lambda:<region>:744348701589:layer:bash:8 \
  --zip-file fileb://function.zip
You'll note that <region> is left unpopulated; layer ARNs are region-specific, which, as mentioned before, is one limitation to bear in mind when it comes to AWS Lambda layers. (I will be using the Dublin, Ireland, region, eu-west-1, for my command.) You'll also need to create a Bash script that can be loaded into your Lambda function as a payload.
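For what it's worth, my Dublin version of the command looks like the following sketch; note that in practice the --role flag expects the full ARN of an IAM execution role, so the account ID and role name here are placeholders:

$ aws lambda create-function --function-name bashFunction \
  --role arn:aws:iam::123456789012:role/bashFunctionRole \
  --handler index.handler --runtime provided \
  --layers arn:aws:lambda:eu-west-1:744348701589:layer:bash:8 \
  --zip-file fileb://function.zip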
Listing 1 is a Bash script, acting as the payload inside a ZIP file, that runs in a Lambda function; it is a slight adaptation of the Krizek example [9]. Having saved the script as index.sh somewhere on your filesystem where you can find it, you'll need to compress it into a ZIP file to be compatible with AWS Lambda:
Listing 1
index.sh
handler () {
  set -e

  # Event data is sent as the first parameter
  EVENT_DATA=$1

  # This is the event data
  echo "$EVENT_DATA"

  # Example of command usage
  EVENT_JSON=$(echo "$EVENT_DATA" | jq .)

  # Example of a command whose output will show up in CloudWatch Logs
  echo "I am a Bash command!"

  # This is the return value, because it's being sent to stderr (>&2)
  echo "{\"success\": true}" >&2
}
$ zip list_buckets.zip index.sh
  adding: index.sh (deflated 39%)
As you can see, you should now have list_buckets.zip available on your filesystem.
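If you already created the function with the README command above (which pointed at function.zip), you can swap in this new payload rather than recreating the function; the standard subcommand for that is:

$ aws lambda update-function-code --function-name bashFunction \
  --zip-file fileb://list_buckets.zip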
Before continuing, however, you'll have to pay attention to some minor gotchas, such as explicitly stating which handler to use when invoking the Lambda function and the fact that Bash behaves differently with error handling. (See the "Take Note" box for more details.)
Take Note
Be aware of two Lilliputian mantraps in your own scripts.
If you've used Bash before, you'll know that it handles errors a little unusually; therefore, set -e must be present inside your function's payload. The injected layer needs this to notice an error, so it can then ask the Lambda function to stop running. Additionally, with this Bash run time, you should religiously send any output intended as the function's result to stderr, instead of the usual destination, stdout: the output (known as the "return" value in AWS Lambda-speak) is read from stderr. To send something to stderr, simply append >&2 to the end of the command line in your script.
When you use a method like this, logs are created as expected, and any return output (i.e., the stuff that's intended for the user) ends up in CloudWatch as hoped.
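To see both behaviors for yourself, you can invoke the function from the CLI and inspect the return value; a quick sketch (on version 2 of the AWS CLI, you may also need --cli-binary-format raw-in-base64-out when passing a JSON payload):

$ aws lambda invoke --function-name bashFunction \
  --payload '{"hello": "world"}' response.json
$ cat response.json
{"success": true}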