AWS Bites Podcast

64. How do you write Lambda Functions in Rust?

Published 2023-01-20 - Listen on your favourite podcast player

Are you curious about using Rust to write AWS Lambda functions?

In this episode of AWS Bites, we will be discussing the pros and cons of using Rust for serverless applications. With Rust, you'll be able to take advantage of its fast performance and memory efficiency. Plus, its programming model makes it easy to write safe and correct code. However, there is no native Lambda runtime for Rust: instead, AWS builds and maintains a library that implements a custom runtime. This custom runtime is built on top of the Tokio async runtime and even has a built-in middleware engine, which makes it easy to hook in reusable logic and build your own middleware.

But what if you're new to Rust? Don't worry, we'll also walk you through the steps to write your first Lambda in Rust. From cargo-lambda to the Serverless Framework plugin for Rust, we'll be sharing different alternatives for building and deploying your Rust-based Lambda functions.

So join us on this journey as we explore the exciting world of Rust and Lambda.

AWS Bites is sponsored by fourTheorem, an AWS Consulting Partner offering training, cloud migration, and modern application architecture.

In this episode, we mentioned the following resources:

Let's talk!

Do you agree with our opinions? Do you have interesting AWS questions you'd like us to chat about? Leave a comment on YouTube or connect with us on Twitter: @eoins, @loige.

Help us to make this transcription better! If you find an error, please submit a PR with your corrections.

Luciano: Unless you have been living under a rock, you have probably noticed that Rust is gaining more and more traction by the day. So today we want to talk about writing Lambda functions in Rust. We will be discussing why you might want to do something like this, analyzing pros and cons, then we will look at the steps needed to author and publish your very first Lambda in Rust. Finally, we will give our opinion on what's going to be the future of Rust in Lambda, and why we are so excited about it. My name is Luciano and today I'm joined by Eoin and this is AWS Bites podcast. AWS Bites is sponsored by fourTheorem. fourTheorem is an AWS consulting partner offering training, cloud migration, and modern application architecture. Find out more at fourtheorem.com. You will find the link in the show notes.

Eoin: Luciano, I know you've been learning Rust and sharing as you learn on your channel, which we can put the link for in the show notes. Could you start with a quick description? What is Rust and why is it gaining so much traction?

Luciano: Yeah, I'm gonna try to give my own view. I think Rust is a relatively new programming language, and by that I mean that the first stable release, 1.0, came out in 2015. So it's not that new. There is a good bit of background there, and the language existed for a while even before version 1.0. So it is, I think, still relatively new in the sense that it's getting more and more traction these days, but it has already been around for a while. So it's not an immature language.

There is some good background there. It was initially adopted in the context of Mozilla to rewrite some parts of Firefox, or at least that was probably the first serious project where Rust was adopted. Before that it was more of a research language. It is a strongly typed, compiled language that doesn't have a garbage collector, which makes for an interesting comparison with Go, for instance, which is also strongly typed but has a garbage collector.

It was initially born as a systems programming language because it was particularly focused on memory safety and performance, trying to give people a safer experience compared to things like C and C++, along with a more modern toolchain. But the interesting thing is that these days it doesn't get promoted as a systems programming language anymore; it gets promoted as a general-purpose programming language.

The website says "a language empowering everyone to build reliable and efficient software", and this is true because in Rust these days you can build pretty much anything, from operating systems up to front-end applications using WebAssembly. So I think the language is becoming very general purpose, without losing its major characteristics, like being strongly typed, being very efficient, and having an interesting memory model. It is also used heavily by AWS, and one of the more famous examples is Firecracker, which is the engine that powers Lambda and Fargate. It's open source, so you can check it out if you want to see what a moderately complicated Rust project looks like.

Eoin: Okay that's pretty interesting. So even if you're already using Lambda, Firecracker is already in the picture there because it's running the Lambda infrastructure, that container micro VM infrastructure is already written in Rust. Okay so that's pretty interesting, but I guess the idea of writing Lambda functions themselves in Rust is something that's still pretty new. So what has been your experience with that so far? Have you given it a try?

Luciano: Yeah, so I want to emphasize that I'm not an expert in Rust by any means. It's a language that I've been playing with, so to speak, for the last three to three and a half years, mostly doing coding challenges, some simple games, and building small libraries. I also did a lot of Advent of Code, and I've been streaming that on Twitch with some friends. We're going to have the link in the show notes if you're curious to see these kinds of exercises.

So I didn't really do anything production ready. My perception right now is just from the point of view of somebody who has been having fun with Rust. We might do this episode again after we build our first production-ready Lambda in Rust and maybe reconsider some of today's observations. But I have actually built one Lambda in Rust for my own personal use case. I have a simple automation workflow on Twitter: when somebody follows me, a hook fires and sends a welcome message to that person saying, hey, thank you for following me, can I help you with anything? It's just a nice way to get in touch with my followers, start a conversation, and get to know them a little bit better. And I built this in Rust just because it was the perfect excuse to see what building a Lambda in Rust would look like in a very simple use case. Even though it's not really that simple, because it still needs to do a fair amount of integration: it needs to connect with the Twitter APIs, which are HTTP requests, there needs to be some state stored in DynamoDB, so the SDK gets used as well, plus I tried to do some testing, automated deployment, and so on. So even if it's a very simple Lambda, there is still enough complexity to understand what writing Lambdas in Rust would look like.

Eoin: All right, that's pretty interesting, so I guess people can check out your Twitch channel to have a look at what you've been doing. When I'm hearing about writing Lambda functions in Rust, I kind of think back to the limited experience I have at writing Lambda functions in languages like C#/.NET and Java. I've used Java quite a lot in the past but I wouldn't jump to it immediately for Lambda. One of the reasons is that I guess historically it has been slow, right?

There have been a lot of steps taken recently to overcome that, with things like GraalVM and then SnapStart very recently at re:Invent, but generally when it comes to compiled languages, I wouldn't rush to them, because I feel that dynamic languages are just a lot simpler when it comes to writing Lambda functions, especially when dealing with JSON payloads, etc. So I feel like there are more steps to perform when you're using compiled languages. What is Rust like in that context and what is the appeal of using it for Lambda? And then I suppose the other question is, what's the build and deployment experience like? Are there a lot more steps or do they manage to simplify it somehow?

Luciano: Yeah, so there is definitely an extra step for Rust as well, because it's a compiled language and you need to make sure that you compile your code before you are able to execute it. In this particular case, you need to compile your code for the specific target architecture, which is the one where the Lambda effectively runs, not the one that you have in your own development environment. So there might be a little bit of complexity there if you need to cross-compile from one architecture to another. There are tools that can help, and we will probably mention some of them later, but yeah, there is definitely an extra step there just because of the characteristics of the language.

On the other hand, it's a very fast and memory-efficient language, which has a very positive effect on cold starts. I've seen incredibly small cold starts, sometimes even single-digit milliseconds, and generally they don't go above 20 milliseconds for relatively simple Lambdas written in Rust. Also, one thing that I really like is that the programming model helps you write safe and correct code, and this is actually one of the reasons why I'm enjoying learning and writing Rust: the language itself is very mature.

I think it brings in a lot of the nice things that came out of many other languages. For instance, there are no null types. You need to use Option, and you need to be very explicit about whether a value is always going to be there or whether there may be a case where it's not. And every time you have that kind of situation, you need to explicitly decide what happens when the value is missing.
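
As a rough illustration of what Luciano describes, here is a minimal sketch (the Follower type and field names are made up for the example): an optional field has to be declared as an Option, and the compiler forces you to handle the missing case before you can use the value.

    // A field that may or may not be present must be an Option<...>
    struct Follower {
        handle: String,
        display_name: Option<String>,
    }

    fn greeting(f: &Follower) -> String {
        // The compiler forces you to handle the "missing" case explicitly
        match &f.display_name {
            Some(name) => format!("Thanks for the follow, {name}!"),
            None => format!("Thanks for the follow, @{}!", f.handle),
        }
    }

    fn main() {
        let f = Follower { handle: "loige".to_string(), display_name: None };
        println!("{}", greeting(&f));
    }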

This explicitness is something that I found helps a lot to prevent certain classes of bugs. By just learning the idiomatic way of writing Rust and the constructs that the language gives you, I think it's much easier to end up writing code that is more correct and has fewer bugs, which is definitely another advantage to keep in mind when choosing a language. And it's also very modern in terms of toolchain and library ecosystem. For instance, there is a tool called Cargo, which is somewhat similar to npm if you come from Node.js, but it's very rich and allows you not just to install dependencies but also to build your code.

It can easily be extended for testing and a bunch of other things, so there is this one tool that does almost everything you could possibly need. If you compare that with Python, for instance, where you might juggle a dozen or more tools for installing dependencies, packaging, building native dependencies, and so on, it's a very different experience. Another thing is that it supports async, so you can write very efficient servers, or code that relies heavily on I/O, without necessarily having to spin up multiple threads.
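
As a small sketch of what that looks like (assuming the tokio crate with its macros and time features; the fetch function and the delays are just stand-ins for real I/O such as HTTP calls), two awaited tasks can make progress concurrently on Tokio's async runtime without spawning extra threads:

    use std::time::Duration;
    use tokio::time::sleep;

    // A stand-in for an I/O-bound call (HTTP request, DynamoDB query, ...)
    async fn fetch(label: &str) -> String {
        sleep(Duration::from_millis(100)).await;
        format!("result from {label}")
    }

    #[tokio::main]
    async fn main() {
        // Both "requests" run concurrently on the async runtime
        let (a, b) = tokio::join!(fetch("twitter-api"), fetch("dynamodb"));
        println!("{a}, {b}");
    }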

That's another interesting thing for a compiled language, and the Rust community actually spent a lot of time trying to come up with a design for async that doesn't hurt performance; it's considered one of the most efficient implementations of async, so that aspect is also very interesting. The thing that is not as nice, or at least might initially look a little bit weird, is that there isn't an official runtime for Rust in AWS Lambda, so if you look at the list of supported runtimes, Rust is not there.

So how is it even possible to write Lambdas in Rust? Well, effectively you can do that by creating your own custom runtime. AWS makes that a lot easier than doing it from scratch, because they literally give you an official library written by AWS (we will have the link in the show notes). You use that library in your own code, in your own handler, and at the end of the day you package one single binary which is a custom runtime: it contains the runtime itself, from the library, plus your handler, which is your own custom code.
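
To make this concrete, here is a minimal sketch of what that single binary can look like, following the pattern documented for the lambda_runtime crate (the Request and Response types and the greeting logic are made up for the example; Cargo.toml would need lambda_runtime, tokio, serde, and serde-related derives, with versions depending on when you read this):

    use lambda_runtime::{service_fn, Error, LambdaEvent};
    use serde::{Deserialize, Serialize};

    #[derive(Deserialize)]
    struct Request {
        name: String,
    }

    #[derive(Serialize)]
    struct Response {
        message: String,
    }

    // Your business logic: the runtime deserializes the incoming event into `Request`
    async fn handler(event: LambdaEvent<Request>) -> Result<Response, Error> {
        Ok(Response {
            message: format!("Hello, {}!", event.payload.name),
        })
    }

    #[tokio::main]
    async fn main() -> Result<(), Error> {
        // The library-provided runtime polls Lambda for events and invokes your handler
        lambda_runtime::run(service_fn(handler)).await
    }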

So you can imagine that you wrap your handler with a custom runtime, but that runtime is given to you by AWS in the form of a Rust library. Another reason I think this is a very interesting and modern approach is that this custom runtime is fully async, and in Rust there are different runtimes for async. One of the most famous is called Tokio (T-O-K-I-O), and Tokio is very interesting because it's kind of an ecosystem of libraries.

It's not just the runtime; there is a lot more that it gives you, and one of those things is something called Tower, which is a way to define services in the most generic sense. You can imagine that a Lambda is a service on its own, and when you use Tower, a service automatically comes with something like a middleware engine. So basically you don't need something like Middy if you're writing a Lambda in Rust, because it's already built in. You can just write your middleware straight away and connect it to your handlers. If you want some examples of what you can do with that, there is - in the Tower repository itself... actually no, in the Lambda runtime repository itself, the AWS Rust runtime - an example of how to use the Tower library in combination with the runtime to enable tracing. You just enable it as a middleware around your handler code, and of course you can build your own custom middleware for validation, logging, deserialization, error handling, and so on.
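
As a very rough sketch of the idea (this is not the tracing example from the runtime repository): lambda_runtime::run accepts a Tower service, so you can wrap your handler with layers built via tower::ServiceBuilder. The little logging "middleware" below uses map_request, which assumes the tower crate with its util feature enabled; the handler just echoes the payload back.

    use lambda_runtime::{service_fn, Error, LambdaEvent};
    use serde_json::Value;
    use tower::ServiceBuilder;

    async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
        Ok(event.payload)
    }

    #[tokio::main]
    async fn main() -> Result<(), Error> {
        // Wrap the handler in Tower layers before handing it to the runtime
        let service = ServiceBuilder::new()
            // A tiny "middleware": log the request id of every invocation
            .map_request(|event: LambdaEvent<Value>| {
                println!("handling request {}", event.context.request_id);
                event
            })
            .service(service_fn(handler));

        lambda_runtime::run(service).await
    }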

Eoin: Is this a little bit like Middy for Rust, then? It's just built into that runtime library they provide?

Luciano: In a way, I would say yes, because you already have that engine. So when we say Middy, it's probably just Middy core, just the engine. There isn't yet a very mature ecosystem of middleware specialized for Lambda. That could be another interesting open source project, if you or somebody else wants to explore building something like the Middy middleware ecosystem for Rust.

Eoin: Yeah okay, that's an interesting approach because I guess every language has a slightly different take on that. So I think, based on what you said, I'm kind of convinced that there's enough of a benefit there for me to try it but I've got literally zero experience with Rust, I've just seen the hype emerge. So where do I start? How would you start? Would I start with Rust first or would I just jump in and try and write my first Lambda function in Rust? What's the best place to begin?

Luciano: I'm gonna assume for now that you know a little bit of Rust, even if maybe that's not the case. We can discuss later some tips on how to get started, and let's just focus for now on: if you know a little bit of Rust, how do you write your first Lambda in Rust? Of course, you need to have Rust installed. There is a very good tool called rustup, which allows you to install Rust on your system and also to keep it up to date every time there are new releases, so it's definitely the most recommended way to install Rust on any machine.

And that way you can easily keep it up to date. This week I actually came across an interesting article that showcases exactly what the experience of building a Lambda in Rust from scratch looks like. It's not a Hello World Lambda, it's actually an interesting one, because it runs a query over a compressed JSON file, so there is quite a bit of logic and it can be an interesting use case. This is coming from a company called Scanner, which does logs and tracing, so they are actually showing some of their own code. At the end of the day it's probably a simplification, but it's an actual business case, and they actually use Rust in Lambda in their own environment.

One thing that I learned through this article is that there is a relatively new tool from somebody at AWS called cargo-lambda, which is an extension of Cargo, the tool we mentioned before that does package management, testing, building, and so on. You can install this extension and it gives you a bunch of helpers built directly into Cargo: it can start a new Lambda project in Rust, creating the scaffolding for the structure of the code and how you organize the libraries you need, and it also gives you the ability to test your code locally, compile it, and deploy it to AWS.

So it's kind of a one-stop-shop tool: you just install it, it extends your main tool, which is Cargo, and then you just run 'cargo lambda something' to do all the different things you might want to do for writing, running, and deploying Lambdas written in Rust. So the experience is that, if you use something like cargo-lambda, you run - I think it's 'cargo lambda new' or something like that - to create the structure. That already gives you a Rust project with the runtime installed.

It's a little bit like Python or Node, where you have a file where you define all your external dependencies. All this structure is created for you: you will have a Cargo.toml, which is basically the package.json of Rust if you want, and in that file you will already have a reference to the latest version of the Rust runtime. It also creates a main file, which is basically your entry point when you compile your Rust code.

Compilation is going to start from that file, it's going to look for a main function, and that's what gets executed first. That main function is already hooked into the Rust runtime library, so literally all you have left to write is your handler. So, using this tool, the experience of writing Lambda code is not very different from the experience you would have writing something in Python or Node.js. Another interesting thing, which is a common gotcha at the beginning, is that if you're used to writing Lambdas in Python or JavaScript, it's very easy to receive generic pieces of JSON in your event, process them, read some data, do something with it, and then return another very generic JSON string as an answer.

Of course, when you use compiled, strongly-typed languages, it gets a little bit more complicated, because you cannot just say "whatever data type". You need to be very specific about the data you are receiving and the data you are returning. There is a library that can help a lot when you still want to keep that very generic ability to handle JSON input and you don't want to create a very strict serializer.

Maybe you want to accept very generic JSON. This library is called Serde, and it's the most common library for serialization and deserialization in Rust, and it also supports JSON. With this library, you can write a handler that can receive any arbitrary JSON, and then you have different ways to extract data from it. It effectively gives you an object that's like a tree, and you have to traverse this tree and verify: okay, am I getting an object here, does this property actually exist, is it a string, is it an integer?

It takes a little bit more code to actually extract your information, but you don't have to deserialize into a very specific type if you don't want to. Actually, I think the runtime comes with the most common types already built in, for instance HTTP types, so if you are building handlers that target very specific kinds of standard AWS events, there are types already built in that you can just use, and you will have access to all the properties that are expected for those events.
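
Here is a small sketch of that "generic JSON" style with serde_json, assuming a made-up event shape (a detail object containing a userId field): you walk the tree as a Value and decide what to do at every step where something might be missing.

    use lambda_runtime::{service_fn, Error, LambdaEvent};
    use serde_json::{json, Value};

    async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
        // Traverse the untyped JSON tree, handling every "maybe missing" step explicitly
        let user_id = event
            .payload
            .get("detail")
            .and_then(|detail| detail.get("userId"))
            .and_then(|id| id.as_str())
            .unwrap_or("unknown");

        Ok(json!({ "greetedUser": user_id }))
    }

    #[tokio::main]
    async fn main() -> Result<(), Error> {
        lambda_runtime::run(service_fn(handler)).await
    }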

Then the next step is: how do you do local testing? With cargo-lambda you can just run 'cargo lambda watch', I think is the command, which spins up a local version of your Lambda that restarts when you change your code. So it's kind of a live-reloading server at that point, and then you can use 'cargo lambda invoke', I believe is the other command, to send an event to this locally running server and see exactly what happens when your code receives that event. You can keep doing that until you are happy with the result, and at that point you can run 'cargo lambda build', which will effectively compile your code. You can even pass a flag to build for Graviton, which is something I haven't tested yet, but I'm very excited to try it because I'm hearing that it makes the performance in certain cases even more interesting. And when you bootstrap your project you also get a GitHub Actions CI template that you can use if you want to automate the testing and building process in your CI, if you use GitHub Actions.
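
Put together, the cargo-lambda workflow Luciano describes looks roughly like this (the project name and sample payload are made up; check the cargo-lambda docs for the exact flags, which may change between versions):

    # scaffold a new Rust Lambda project
    cargo lambda new my-twitter-bot

    # run a local, live-reloading emulator of the function
    cargo lambda watch

    # in another terminal, send a test event to the local function
    cargo lambda invoke --data-ascii '{"name": "world"}'

    # compile a release build, optionally targeting Graviton (ARM64)
    cargo lambda build --release --arm64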

Eoin: I like the idea that this cargo-lambda tool gives you this kind of one-stop shop for everything you need to create and build Rust Lambda projects, but I'm kind of wondering how it fits in with the rest of the ecosystem that we know already, like SAM and Serverless Framework, which allow you to create multiple functions and also deploy other resources along with them. Is the Rust movement with Lambda moving towards its own kind of closed ecosystem that doesn't link through to the other tools, or can you use some of those other tools with Rust as well?

Luciano: Yeah, I don't think we have a critical mass of Rust Lambda projects today to say that the ecosystem is going one direction or another. I think it's still very, very new. We actually saw, when we were creating the show notes for this episode, that there is a Serverless Framework plugin for Rust that should help you build and structure things correctly if you're using Rust. We haven't tested it yet, so your mileage may vary, but there seem to be different kinds of tools; it's not just cargo-lambda.

You can also use container images, so rather than building a zip file with a binary inside, you can package everything as a container. And we can reference an article that you wrote some time ago, Eoin, if you want some generic guidance. You will need to adjust that for Rust, and maybe you can find some other material that gives you more specific guidance on how to do that for Rust.

I did try to do things manually myself, because when I wrote my first Lambda I didn't know about cargo-lambda - maybe it wasn't even there. It takes a little bit of work, because you need to figure out exactly what the right flags are to compile your code for the right architecture, and then how to zip your file correctly so that the runtime is actually bootstrapped correctly. So it's not that obvious, and when I did it the first time there weren't a lot of guides out there, so I had to spend a bit of time doing trial and error until I figured out exactly how the configuration should work. So I'm looking forward to spending more time playing with cargo-lambda, because it seems like the kind of tool I would have wanted when I was doing this particular exercise, where I ended up doing a lot of stuff manually and figuring things out by trial and error.
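
For reference, the manual approach Luciano describes typically looks something like the sketch below for an x86_64 target (the binary name is made up; the key detail is that custom runtimes expect the executable to be named bootstrap at the root of the zip):

    # add a Linux target suitable for the Lambda environment
    rustup target add x86_64-unknown-linux-musl

    # cross-compile a release build for that target
    cargo build --release --target x86_64-unknown-linux-musl

    # custom runtimes look for an executable named "bootstrap"
    cp target/x86_64-unknown-linux-musl/release/my-twitter-bot bootstrap
    zip lambda.zip bootstrap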

Then another interesting thing is that in that particular example I mentioned before, the Twitter integration, I needed to connect to something like DynamoDB, so you might wonder: what is the SDK experience like? It is good, actually, because now - and by now I mean probably since re:Invent two years ago, if I'm not wrong - there is an official Rust SDK supported by AWS. Before then there used to be a community-maintained library, so it maybe wasn't the most up-to-date or official experience. Anyway, now there is an official SDK, so you can expect it to be as good as the other ones, given that it's maintained by AWS.

It is still a little bit verbose, but I think that's just the nature of strictly-typed languages: when you use these kinds of SDKs, you are constructing a lot of objects, combining them, and then calling specific methods, and that tends to be a lot more verbose than doing the same in JavaScript or Python. Although one thing that I like is that there are a lot of builder-pattern utilities, which give you shortcuts to build objects where you can use all the defaults and just specify the things you want to customize. It still takes a little bit of work to get used to it coming from the JavaScript or Python SDK, but I think eventually you get used to it, and the type system can actually be a good guide to see exactly which options are supported, which ones are required, and so on. So yeah, it can be a bit verbose, but it also gives you a bit more guidance than you get with JavaScript or Python.
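
As a hedged sketch of that builder style with the official Rust SDK (the table name, attribute name, and follower handle are made up, and module paths such as types::AttributeValue have moved between SDK versions, so check the docs for the version you use):

    use aws_sdk_dynamodb::{types::AttributeValue, Client};
    use lambda_runtime::Error;

    // Store a follower's handle, using the fluent builders the SDK generates
    async fn record_follower(client: &Client, handle: &str) -> Result<(), Error> {
        client
            .put_item()
            .table_name("followers")
            .item("handle", AttributeValue::S(handle.to_string()))
            .send()
            .await?;
        Ok(())
    }

    #[tokio::main]
    async fn main() -> Result<(), Error> {
        // Region and credentials are resolved from the Lambda environment
        let config = aws_config::load_from_env().await;
        let client = Client::new(&config);
        record_follower(&client, "some_new_follower").await
    }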

Maybe I can say why I am particularly excited about Rust and Lambda together. The first thing, and we already mentioned it, is that there are all these cases where memory and speed are directly related to cost, and this is particularly true for serverless. Using Rust can really give you the most optimized experience you can possibly get, so you can eventually save a lot of money. Basically, what I'm trying to say is that if you're writing something in Rust for Lambda, you should be able to get very well optimized memory consumption and performance.

So you are reducing to the minimum the two dimensions that ultimately determine the cost of running your Lambda. I think in all the cases where you might be concerned about performance and cost, investing in Rust is probably a good investment. Also, as we already said, the language itself pushes you to write code that is generally more correct, so it's potentially harder to introduce certain kinds of bugs. That can be another advantage if you have business-sensitive areas where you want to make sure you are not writing bugs, or at least reduce the risk of writing them. I also expect Rust to become more and more prominent in general, so we will see a larger ecosystem, more libraries, more tutorials, more examples, and more people willing to help, or even to be hired by your company with that expertise already. All of that, I think, contributes to the future of Rust and Lambda, so we will probably see more use cases of Rust in Lambda just because the ecosystem is growing.

Eoin: Yeah, it sounds like there are a lot of benefits. From what you said, you've got a modern ecosystem and great tooling, both for Rust itself and for Rust with Lambda. You've got the performance benefit, which is definitely going to be of interest to people who are really trying to optimize, and that also relates to cost: if you've got a really hot Lambda and you can potentially get an order-of-magnitude improvement in performance with Rust, that has a huge appeal. So there are lots of positives there. In the interest of balance, are there any drawbacks we should be aware of? And what's your advice in summary? Should people try to adopt it now, or hold off trying Rust with Lambda?

Luciano: Yeah, so I think the main decision point on whether you should use Rust or not is probably the learning curve, because there is still a bit of a learning curve. It's not a language that most people know, and it's a language that comes with a very specific background that is closer to systems programming than it is to web programming. So if you are coming from web programming, there might be a little bit more to learn before you can fully appreciate the language, and then you need to learn all the distinctive characteristics of the language: the syntax, the memory model, and so on.

So there is definitely a learning curve there, and it's not something to underestimate, because if you are building Lambdas you're probably trying to optimize for speed of delivery. If you also need to learn a new language on top of that, there are probably some sacrifices to be made in the short term to give your team time to learn the language and master it enough to be proficient with it, and only then will you get all the benefits that Rust gives you. So that's something that only you can evaluate in the context of your company, whether it makes sense. But if you are doing a side project or some very small experimental Lambda, it might be something worth trying just to get a feeling for what it looks like.

I would say, though, that before you try it, if you don't know Rust itself, it may be worth learning a little bit of Rust first, just to get used to the characteristics of the language. You can do some coding challenges. I really like exercism.org - we will have a link in the show notes - because it's a totally free platform. It looks a bit like LeetCode, one of those platforms where you can do coding challenges, but they have an especially good Rust track, which is basically a sequence of, I think, 50 exercises that guides you through all the basics and the main concepts that you need to know in Rust.

So that could be a very good way to get started. It's actually quite fun, and the website is really well polished, so maybe that's a low-effort way to get started with Rust. Then, when you complete that track, you can start to see what it looks like to actually write a Lambda with it. So I think that's probably everything for today; we probably covered way more than we originally wanted to. I really encourage everyone to try this and give us your feedback, whether you found the experience interesting and useful, or maybe it was just too difficult and you couldn't really progress and had to go back to JavaScript or Python or something else. And if you build something with it, always let us know what you built, because we are always curious to hear use cases and understand how people use Lambda, serverless, and AWS. We can definitely have some nice conversations after that, and we can also revisit our own assumptions and our own understanding of Rust and Lambda, and maybe come back later with a new episode with a refreshed perspective on this particular topic. So thank you very much for being with us today, and we look forward to seeing you in the next episode.