AWS Bites Podcast

67. Top AWS Developer productivity tools

Published 2023-02-10 - Listen on your favourite podcast player

Are you tired of feeling overwhelmed by the vast AWS landscape? Do you find yourself constantly struggling to keep up with all the tasks at hand? Look no further! In this episode of AWS Bites podcast, Eoin and Luciano share their top six time-saving tools to help you reclaim your productivity and make the most of your AWS experience. These tools are designed to make your life easier and allow you to achieve more in less time.

But don't worry, this won't be a boring lecture. Get ready to have some fun as they reveal their top tricks and tips, from profiles and SSO to terminal gems and CLI magic. These tools will have you feeling like a kid in a candy store, soaring through your AWS work with ease. And if that wasn't enough, they've got a few extra special surprises in store to take your AWS skills to new heights.

So buckle up and get ready for a wild ride, it's time to have some fun with AWS!

AWS Bites is sponsored by fourTheorem, an AWS Consulting Partner offering training, cloud migration, and modern application architecture.

In this episode, we mentioned the following resources:

Let's talk!

Do you agree with our opinions? Do you have interesting AWS questions you'd like us to chat about? Leave a comment on YouTube or connect with us on Twitter: @eoins, @loige.

Help us to make this transcription better! If you find an error, please submit a PR with your corrections.

Eoin: The breadth of AWS services is huge. And when you're working on a cloud-based application, sometimes it seems like there's so much to do and it's easy to get lost and distracted. Today we have six time-saving tools to share with you to help make you a more productive AWS guru. My name is Eoin and I'm joined by Luciano and this is the AWS Bites podcast. AWS Bites is sponsored by fourTheorem. fourTheorem is an AWS consulting partner offering training, cloud migration and modern application architecture. Find out more at fourTheorem.com.

You'll find that link in the show notes. Going into today's episode then, Luciano, we're going to try and share our top six tools for AWS productivity, things you may or may not have heard about. And if you hang around to the end, we've actually got a couple more bonus tips and tools to help you take your AWS expertise to the next level. And I'm going to start off with one that I think is really a time-saver for me and really helps you to get organized and helps with a bit of security as well.

So there's a couple of things that I'd say first before I reveal what it is. If you're using AWS with the SDK or the command line, I would really recommend using AWS profiles and try to avoid just sharing environment variables and setting up environment variables for different accounts and roles as you need them. Profiles just make it a little bit easier because you've got named profiles for different AWS accounts and roles.

And you can set those up using your credentials file, but it's even better if you set it up with AWS SSO. We talked a bit about this before. We've got a previous episode, which we can link, all about identity management on AWS and why you might use SSO instead of just IAM users. When you're using SSO, you can end up with an organization with lots of accounts. SSO allows you to have different sets of credentials managed automatically, so you never have to copy and paste environment variables or lines into your credentials file.
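As a sketch, an SSO-backed named profile in your AWS config file looks something like this (the start URL, account ID and role name here are hypothetical):

```ini
# ~/.aws/config -- a hypothetical SSO-backed named profile
[profile my-dev-account]
sso_start_url = https://my-org.awsapps.com/start
sso_region = eu-west-1
sso_account_id = 123456789012
sso_role_name = DeveloperAccess
region = eu-west-1
```

With this in place, `aws --profile my-dev-account ...` picks up short-lived SSO credentials automatically, with nothing pasted into your credentials file.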

It's all managed by the SSO credential process for you. But that can get a bit cumbersome if you've got tens or hundreds of accounts to work with. And if you're working with different companies or different customers, you really need to be careful that you're using the right set of credentials at the right time. And there's a really good tool that I started using in the last year, and you can find it on granted.dev.

And the tool is called Assume, and it's a CLI tool that you can install, and it will allow you to assume a role on the command line. And it will basically scan your config file, your AWS config file, allow you to select which permission set, which role you want to assume, and then those credentials will be loaded into your shell. So that's one of the things it does. The other thing it does, which I really like, is it works with your browser.

So I'm using Firefox containers, but this granted.dev tool also has its own containers plugin for Firefox that allows you to log into the console using a given profile. So you don't have to set up the Firefox container manually and log in every time. It will automate a lot of that for you. So you can have multiple different tabs open for different AWS accounts. They get different colors. They're very easy to identify, and it streamlines all of that process for you.

Now, one thing I'd say is that you do need to somehow load all of your accounts and SSO permission sets into your AWS config file for this to work, and I use Ben Kehoe's aws-sso-util to do that. It will essentially populate your config file with all of the permission sets and all of the accounts that you have, and you can give it multiple SSO start URLs and just get it to scan them all.
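Putting the two together, a rough sketch of that workflow looks like this (the start URL and profile name are hypothetical, and the exact flags are worth checking against each tool's docs):

```sh
# Populate ~/.aws/config with a profile per account/permission set
aws-sso-util configure populate \
  --sso-start-url https://my-org.awsapps.com/start \
  --sso-region eu-west-1 \
  --region eu-west-1

# Assume a role with Granted; with no argument it shows an interactive picker
assume my-dev-account

# Or open the AWS console in the browser for that profile
assume -c my-dev-account
```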

And then you can suddenly have hundreds of profiles in your config file, automated for you, and then granted.dev will allow you to log in on the CLI or in the browser really easily. And there's an alternative for this. Some of our colleagues at fourTheorem use another one from synfinatic on GitHub called aws-sso-cli. It does the two things together. It works in a slightly different way, so you can try both, I would say, but I've been really happy with the assume tool from Granted. So that's it. When you have this set up, another thing to be aware of is that you still have to keep track of what profile you're using at any given time. I think you've got a solution for this, Luciano.

Luciano: Yeah, definitely. So my solution for a long time has been to run aws sts get-caller-identity every once in a while. But of course, that's not an exact science. You end up forgetting about it, and you might end up doing stuff on the wrong account without realizing it until a little too late. So recently I started to use something called Starship, which is a kind of universal terminal prompt.

By that I mean that it works with most of the shells that you might be using. So it works with Bash, Zsh, Fish and so on in a seamless way. And the reason why I really like this particular one is because it's almost zero config. Of course, you can configure it and customize it to your needs, but out of the box it comes with a lot of very good defaults. And one of the things it does out of the box is recognize your AWS profile: by looking at a few different things, it's able to display in your terminal which profile you are currently using. So that can be something very good to just keep there, because for every single command you're going to have an indication, right in your terminal prompt, of which profile is going to be used. Of course, you can do the same with Zsh plugins if you prefer, or maybe you want to manually customize your Bash prompt. But all of that will require a little bit more work on the configuration side, so you need to be a bit more expert in the specific tools to achieve the same kind of thing. Anything else we can do to be productive in the terminal?
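The AWS profile indicator described above comes from Starship's aws module, which you can tweak in ~/.config/starship.toml; this is a sketch based on its documented options:

```toml
# ~/.config/starship.toml -- show the active AWS profile (and region) in the prompt
[aws]
format = 'on [$symbol($profile )(\($region\) )]($style)'
symbol = '☁️ '
style = 'bold yellow'
```

With this, a prompt segment like "on ☁️ my-dev-account (eu-west-1)" appears whenever an AWS profile is active.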

Eoin: There's a tool that I really like, but it's actually been discontinued. I'm going to talk about it anyway, because there is an alternative, and it's called AWS Shell. People probably know the AWS CLI; it's fairly widely used across all the operating systems. AWS Shell was kind of an extension to that, published by AWS, and it gave you a much more interactive layer on top of the AWS CLI. Now, the AWS Shell project has kind of been discontinued since the version 2 CLI came out.

But its functionality has kind of been integrated now into the AWS CLI version 2, at least most of it. So now you've got this option, and the option is --cli-auto-prompt. When you run aws with --cli-auto-prompt, it will give you an interactive, curses-type interface. So when you type aws, it'll suddenly give you a pre-populated list of all the services you could use. You type in ec2, and then it'll give you all of the actions you can use, like describe-instances. And once you've done that, it'll give you a list of the options, and you can browse documentation for the options inline in your terminal. So if you're tired of switching between one terminal for the AWS CLI and one for AWS CLI help, or your browser, then this is a way to get it all done in one shell, and it works really well. So I really like that one. Any more CLI tips?
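The auto-prompt mode can be used per invocation or made the default; a sketch, assuming AWS CLI v2:

```sh
# One-off: start any command in auto-prompt mode
aws --cli-auto-prompt

# Or turn it on by default for your profile.
# 'on' always prompts; 'on-partial' prompts only when a command is incomplete.
aws configure set cli_auto_prompt on-partial
```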

Luciano: Yeah, I actually have another one. One of the things that I often find myself doing when working with the CLI is consuming large blobs of JSON, because when you use the AWS CLI, maybe you, I don't know, try to describe a CloudFormation stack, or maybe list a bunch of resources, and you sometimes get back very big objects, and it's hard to find whatever you're looking for in that big object.

So there are many solutions to this problem, but one of my favorite tools is jq, which is a CLI helper. You can just pipe the output of other commands that produce JSON into jq, and then jq will do a bunch of things for you. By default, if you don't pass any parameter, it's just going to nicely format and color the JSON. So if your JSON isn't already as human readable as possible, with proper indentation and spacing, jq will do all of that for you, and that's already a big step towards making it a little more readable.

But you can also go a step further and use the filter expressions that jq supports to narrow down that output. And this is something that you can sometimes use to build automation scripts, because sometimes you run a specific AWS command and, from the output of that command, you want to extract only a very specific piece of information. So you can pipe that command into jq with a specific query to extract that information, and then maybe pipe the result into something else, with xargs or some other kind of pipeline.
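For example, with a hypothetical blob shaped like the output of aws cloudformation describe-stacks (the data and stack name here are invented):

```shell
# Hypothetical JSON, shaped like `aws cloudformation describe-stacks` output
json='{"Stacks":[{"StackName":"my-app","StackStatus":"CREATE_COMPLETE"}]}'

# '.' is the identity filter: it just pretty-prints (and colors) the JSON
echo "$json" | jq '.'

# A filter expression extracts only the piece you care about
echo "$json" | jq -r '.Stacks[0].StackName'   # prints: my-app
```

In a real pipeline you would replace the echo with the aws command itself, e.g. `aws cloudformation describe-stacks | jq -r '.Stacks[0].StackName'`.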

So that's something that I find myself doing quite often, and it can be very convenient. Now, there is a bit of a learning curve. Learning jq expressions is something you'll need to practice a little and spend some time on. You will get it wrong, so it can be a little frustrating at first, running the same command over and over with different filters until you actually get what you want.

So I recently discovered another tool called ijq, which stands for interactive jq, and it's basically the same thing. You can still pipe the output of another command into this tool, but it starts a kind of interactive, curses-type view where you can try different filters, and in real time it's going to show you the effect of that filter on the actual JSON object. So that's something you can use to speed up the process of figuring out the right filter for what you're trying to achieve.

Some people might actually know that something similar is already built into the AWS CLI, and there are two flags that you can use if you don't want to bother installing or learning jq. These two flags are --filter, which works on the server side, so you are basically running a command and AWS will execute the filter on the server and just return less data to your client.

Or, if you just want to filter client side, so you're still getting all the data from AWS but displaying less information based on your filter, you can use --query. So --filter does the server-side filtering of the output, and --query does the same thing, but on the client side. I'm not too sure if --filter is available for every single command, but I'm quite sure that --query is available in pretty much every one.
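As a sketch of the two flags (note that EC2 actually spells the server-side flag --filters, and the spelling and availability vary by service):

```sh
# Client-side: JMESPath filtering of the full response with --query
aws ec2 describe-instances \
  --query 'Reservations[].Instances[].{Id: InstanceId, State: State.Name}'

# Server-side: only matching data comes back from AWS
aws ec2 describe-instances \
  --filters 'Name=instance-state-name,Values=running'
```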

So maybe it's worth double checking which one you have available, depending on the commands you're using. And finally, I have one last tip that I find useful, especially when working with something like S3 or DynamoDB or data storage in general, when you're trying to figure out how to make a specific change, but that change might actually be disruptive. If you are not 100% sure that your command is right, you can use the --dryrun flag, which is going to kind of simulate that command and give you some useful output, but without actually making mutations in your account.
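A minimal sketch with a hypothetical bucket and folder:

```sh
# Preview what would be uploaded, downloaded or deleted, without touching anything.
# Note: the s3 commands spell it --dryrun; some services (e.g. ec2) use --dry-run.
aws s3 sync ./local-folder s3://my-bucket/backup --dryrun
```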

So it's a nice way, for instance, when you want to see if a specific folder needs a sync with S3, but you are not sure that you're checking the right folders, or that you actually want to sync at that moment. In that case, you can use the dry-run flag and it's going to show you exactly what would change if you removed it, but without applying the changes. So you can double check and, if you are okay with it, remove the dry-run flag and run it again. I think that covers this particular point, but it kind of leads me to ask you a very specific question. Are there specific tips more on the documentation side? Like, how do we learn more about the SDK or CloudFormation? Because these are other areas where I find myself wasting time trying to figure out how to do a specific thing. I know it's possible, but I don't remember exactly the right way of doing it.

Eoin: Some people rely on extensions in their IDE to integrate that. I just use Dash, which is a macOS application for local offline documentation. And I really like the way it works because it's simple. What I really like about it is that it keeps me out of the browser. If you end up going to the browser for documentation, there's a risk that you get distracted and fall down a hole on the internet with another tab.

But if you use Dash, then it's just for documentation, and you can load documentation sets, or doc sets, into it. So there's one there for the AWS CloudFormation docs and the AWS SDKs. Actually, funnily enough, we talked recently about the new AWS SDK for JavaScript, written in TypeScript. Dash doesn't seem to have doc sets for the version 3 SDK yet. It still has the older version 2 AWS SDK for JavaScript. So that's something that might limit you.

I find that the version 3 documentation is quite limited anyway, so I don't think it makes a huge difference. Dash also has great doc sets for all of the programming languages as well. So if you just want JavaScript docs or TypeScript docs and lots of other stuff, check it out. And I think there are alternatives for Windows and Linux as well. I don't know what they are, but if you search for Dash alternatives, you'll find them. So yeah, that's a hot tip, I think, and a good one to make sure you don't get distracted as you switch between code and documentation. What do you think, Luciano? What else would you lean on? What other suggestions have you got to help you be more productive developing code on AWS?

Luciano: One thing that I've been using a lot in the last few months is GitHub Copilot. And I find it in general very useful if used with moderation. Of course, you shouldn't trust GitHub Copilot suggestions blindly; always verify that they really are doing exactly what you were trying to do. And sometimes there are subtle mistakes in there. So definitely double check, triple check, GitHub Copilot suggestions, and just be sure that it's actually what you want to do.

But when you do that, it's still saving you a lot of time, because sometimes it autocompletes with a significant amount of code that would take you a while to write yourself. Or maybe you don't remember exactly the signature of something, and chances are that what Copilot gives you is more correct than what you can remember. So that's definitely something I would suggest. And it seems to work quite well with the SDK, at least the ones for JavaScript and Python, so far for my use cases.

So sometimes you don't remember exactly how to do something, like copying a file into S3. If you start to write the code, chances are that Copilot is going to give you a good enough suggestion that will speed up your development time. And again, always double check, but it's sometimes faster to go through Copilot and then double check later, for instance with your IDE, where you can highlight the specific function and see the documentation.

But again, it's writing a significant chunk of code for you, so that can speed up your development time. Now, if you don't like Copilot for whatever reason, there are alternatives. I haven't used them, but I've heard of people using CodeWhisperer or Tabnine as alternatives to Copilot. So they might be good enough and could be interesting to try out. Moving on: when we were preparing this particular list of suggestions, we wanted to select only six of them just to keep the episode short. But I think there are still some mentions that we want to give. So Eoin, what do you have as honourable mentions?

Eoin: One that I'd really recommend. Bookmark this one: the IAM policy simulator. If you need to troubleshoot access denied errors with IAM, the IAM policy simulator opens up a fairly crude but very useful web UI that you can use to simulate any action against any service. You just have to already be logged into the console with credentials, then you open it up in the same browser and it will allow you to run a simulation. It should be able to identify whether you get access denied or not and, with a reasonable level of accuracy, why you're denied.

So whether it's an explicit deny, a missing allow, or something in a service control policy or permissions boundary, you should be able to see that. Another one, for Python developers, or actually anybody doing data work in general, is the AWS SDK for pandas. That's the new name for it. It used to be called AWS Data Wrangler, and it's a Python module that provides a whole set of very convenient APIs for dealing with data.

So reading Parquet files from S3 and writing them; it's built on top of PyArrow, and it allows you to read and write from S3, Redshift and other AWS data services, and it even allows you to do things like reading logs without having to do all the pagination and stuff yourself. So definitely check that out if you're a Python developer. The last one is CloudShell, and it's sometimes forgotten, but it's right there at the top of every browser session when you log into the AWS console. You just click on that terminal icon and you get a shell into your AWS account that you can use for exploring the resources in your account. You get a limited amount of storage, and it's retained for 30 days or something like that. And I think it's completely free, one per user, per account. The only thing that's missing right now is that it doesn't have VPC access. I hope that's coming soon. So what have you got? What are your three from the best of the best?

Luciano: I like your list, and I definitely vouch for the IAM policy simulator. It saved my life a few times. It's not perfect; sometimes the labels don't quite make sense, but you still get a lot of value from it, and it gives you a lot of useful information when you are in trouble with permissions and need to figure out why. So plus one on that. Other than that, I have a few. I'm going to be a little bit selfish, because we have worked on some of them.

So the first one is SLIC Watch, which is, I'm just going to say, a plugin for the Serverless Framework, but in reality we have extended it to work also with CDK and other tooling. So it's probably going to work with most of the infrastructure-as-code tools that you like to use. And basically what SLIC Watch does is try to make serverless observability easier by giving you a set of built-in defaults.

It's going to create dashboards and alarms based on best practices, just by looking at your current infrastructure through your infrastructure-as-code definition. So check it out. We'll have the link in the show notes. If you like it, let us know. If it doesn't work for you, also let us know why. It's an open source project, so you are also welcome to contribute, open issues and send PRs. If you find it useful, let's work on it together and make it better.
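As a sketch of what enabling it can look like in a Serverless Framework project (check the SLIC Watch README for the exact plugin name and current options; the SNS topic ARN here is hypothetical):

```yaml
# serverless.yml (sketch)
plugins:
  - serverless-slic-watch-plugin

custom:
  slicWatch:
    # hypothetical SNS topic to receive notifications from the generated alarms
    topicArn: arn:aws:sns:eu-west-1:123456789012:slic-watch-alarms
```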

The second one is Middy, which is a middleware framework for Node.js Lambda functions. The idea is that when you write Lambda code, especially if you are building an application that contains a lot of Lambdas, maybe you are building an API and you have a lot of endpoints spread across different Lambdas, you might end up with a lot of code duplication. There are a lot of concerns that are kind of repeated between Lambdas.

For instance, the way you do validation, the way you do access control, the way you serialize and deserialize input and output: all these kinds of concerns end up being repeated over and over in a way that is generally not very testable or reusable. So Middy helps with that. It basically uses the middleware pattern heavily to extract all these concerns out of the core business logic of your handler.

So it tries to keep your handler as pure as possible, moving all these concerns outside in a way that makes them more reusable and testable. And there are a bunch of built-in middlewares that you can just use for the most common types of concerns that we have seen in the last few years. It recently became the most used framework for Lambda in the Node.js space, judging by the number of weekly downloads.

So that's at least an indication that it's useful to people, and you might want to check it out if you're not using it yet. The last one, which is closely related to Middy, is Powertools, which is a library... It's actually multiple libraries for different languages, built by different teams in AWS. For instance, you have the TypeScript one, which is actually good even for plain JavaScript, of course.

You have the Python one, and there's also a Java one, I believe. So it's a set of libraries that do more or less the same things: they try to make your experience writing Lambdas a little bit better by giving you tools especially for observability, like making it easy to do structured logging, making it easier to register custom metrics, and tracing as well. So check it out. I especially like the TypeScript one and the Python one. I've used them a lot and they are quite useful. They will save you a lot of time, because all these things take time to set up correctly, and if you use the library, it mostly works out of the box.

Eoin: We've actually got an article on Powertools that we can link in the show notes, as well as links to our previous episodes on SLIC Watch and Middy. So hopefully people will find that useful. At this point, I guess I'm curious what we missed. What are the glaringly obvious tools and productivity tips that everyone out there uses? Please put them in the YouTube comments or send them to us on Twitter. We want to know about them. Maybe we'll have enough for another episode. Thanks very much for listening to this one. It's been great to have you with us. Keep listening and we'll see you in the next episode.