Getting Started with the AWS SDK for Rust: Creating an Amazon S3 Bucket
Learn how to use the AWS SDK for Rust to access, manage, and interact with AWS services from your Rust applications. Contact us for more information or help!
Choosing the right programming language is important for developers who want to optimize performance and cost in the cloud. Although Rust has been around since 2006, it didn’t begin to gain mainstream adoption until its 1.0 release in 2015.
In 2020, the AWS Open Source blog published a post expressing a desire to support the Rust community. In December 2021, AWS formally announced that a developer preview of the AWS SDK for Rust was available.
Using languages such as PowerShell, Python, and Ruby, developers have been able to prototype applications quickly, and effortlessly build integrations, while sacrificing some performance and efficiency. Reducing prototyping time is a win for developers, but once the workload scales in production, compute costs can add up quickly.
This is where Rust comes into play.
Rust gives developers an efficient way to build applications without the runtime overhead of a garbage collector, and without the burden of manual memory management. It’s similar to writing a C++ application, but without the risks of use-after-free bugs, dangling pointers, and other memory safety pitfalls.
Compared to high-level languages, Rust programs will almost certainly use less RAM and fewer CPU cycles. More efficiency means higher density; higher density results in lower costs, while continuing to operate at the same scale.
Rust is able to handle these memory concerns by implementing something called the “borrow checker.” The idea behind the borrow checker is that a value can be lent out to other parts of the program as either an immutable or a mutable reference. The borrow checker ensures that a reference can never outlive the data it points to, and it closely tracks which scope has “ownership” of the data.
Once a Rust application has been developed and compiled, the developer has a strong level of confidence that their application will safely handle references to memory at runtime.
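To make borrowing a little more concrete, here is a minimal sketch (not from the original walkthrough) of an immutable borrow. The main() function owns the vector and lends it to a helper function as a reference, so the data is never copied and ownership never changes hands.

fn total(values: &[u64]) -> u64 {
    // `values` is an immutable borrow; this function can read the data, but not modify or free it
    values.iter().sum()
}

fn main() {
    let numbers = vec![1, 2, 3]; // main() owns this vector
    let sum = total(&numbers); // lend it out as an immutable reference
    println!("Sum: {sum}"); // `numbers` is still valid and owned here
}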
There are quite a few different ways that you can use Rust with AWS. Now that you’re convinced you should write Rust applications for deployment in the cloud, let’s explore how you can get started! In this article, we’ll look at using the AWS SDK for Rust to manage AWS cloud resources. In other articles, we’ll cover serverless functions with Rust, and containerizing Rust applications in AWS.
The first thing you'll need to do is install the Rust toolchain. You can do this on Linux, Windows, or macOS. In fact, you can even install the Rust toolchain inside a Linux container, or use the pre-built Rust container image on Docker Hub.
Visit https://rustup.rs to find the Rust toolchain installer, known as Rustup. There will be a simple shell command that you can copy and paste into your terminal. This command will install the Rust compiler, the Cargo CLI package management tool, and several other utilities commonly used by Rust projects.
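At the time of writing, the installer one-liner for Linux and macOS looks like the following; check rustup.rs for the current command before running it.

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh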
After you’ve installed the Rust toolchain, you can create a new Rust application project. We’ll use the Cargo CLI to create a skeleton project, which we can then customize further. Run the following command to create your new project directory, with the default Rust files.
mkdir ~/git; cd ~/git;
cargo new myrustcloud
You should see a new folder called myrustcloud in your $HOME/git directory. You can open up this project folder inside your code editor of choice, such as Microsoft Visual Studio Code.
In order to call out to AWS service APIs from a Rust application, we need to install the dependency for the appropriate service. In Rust, dependencies are called “crates.” These crates are generally hosted on the community crates.io registry, but can also be sourced from any git repository.
Any Rust project that calls AWS APIs will need the aws-config crate. Although the name sounds similar to the AWS Config service, that is not what it refers to. Rather, the aws-config crate is a general-purpose crate that lets you specify credentials, regions, and other AWS connection settings. Let’s install that crate into our project with the following command.
cargo add aws-config
Every Rust project with AWS SDK support will also need the tokio crate. Rust defines async syntax in the language itself, but it doesn’t ship an async runtime; Tokio provides that runtime, and the AWS SDK depends on it. We won’t be focusing on the intricacies of tokio in this article. For now, just know that you need it.
cargo add tokio --features=full
Your Cargo.toml file should now contain a couple of entries underneath the [dependencies] heading.
Each service in AWS has a separate “crate” that you can optionally install, if you need to interact with that service. All of the AWS service crates have a standard name prefix of aws-sdk-<service_name>. This per-service approach to dependency management speeds up development tooling and compilation times, and minimizes your compiled binary size. If the entire AWS SDK were provided as a monolithic crate, your development environment would be slower and much larger on-disk.
For this example, we’ll interact with the Amazon S3 service, and create a new bucket in a given AWS region. Let’s add the Rust crate for the S3 service. You will need to run the following command from the context of your project directory.
cargo add aws-sdk-s3
When you run this command, the crate will be downloaded into your $HOME/.cargo/registry/cache directory. In fact, there’s one more directory level below that, which is named after the remote registry the crate was downloaded from. That directory should be named something similar to index.crates.io-<hash>.
Now that we have imported the necessary dependencies, it’s time to write some Rust code! Open up your src/main.rs file, and change the definition of the main() function to the following code. This allows the tokio async executor to drive your entire program, starting from main().
#[tokio::main]
async fn main() {
// Do AWS stuff here
}
Next, we need to add our AWS credentials to the program. All of the AWS SDKs support several different mechanisms for supplying credentials. One of those mechanisms is setting a few environment variables, which the SDK can read automatically.
To avoid adding any more dependencies, we are going to define the environment variables directly in our program. This is generally not a good practice, but for now it will keep our code straightforward.
Add the following code to the beginning of your main() function.
// Configure your AWS credentials as process-scoped environment variables
use std::env::set_var;
set_var("AWS_ACCESS_KEY_ID", "<paste_value_here>");
set_var("AWS_SECRET_ACCESS_KEY", "<paste_value_here>");
set_var("AWS_SESSION_TOKEN", "<paste_value_here>"); // This may not be required, if you don’t have one.
Next, you’ll need to define an AWS region, using the AWS_REGION environment variable. You can substitute us-west-2, in the example below, for any other supported AWS region.
set_var("AWS_REGION", "us-west-2");
Now let’s use the AWS SDK to load our AWS credentials and region from the environment variables that we’ve just defined. The aws_config crate exports an async function called load_from_env(), which handles this for us. Since it’s an async function, we will need to use Rust’s await keyword to pause execution of the main() function, until the configuration has completed.
let myconfig = aws_config::load_from_env().await;
Great, we’re done configuring the AWS SDK. It’s time to call a service API!
In the previous section, we configured the AWS SDK, but now we’re ready to start using it to call resource management APIs. For starters, let’s learn how to create an S3 bucket.
The S3 service crate provides a “client” object, which is constructed using the SDK configuration as input. We can also apply an alias to the S3 crate, making it easier to reference from our Rust program. Use the “use x as y” syntax to accomplish this, like below.
use aws_sdk_s3 as s3;
let mys3 = s3::Client::new(&myconfig);
Now that you’ve created the S3 client object, you can call various S3 APIs from that client! The AWS crates provide “fluent” interfaces that allow you to call Rust functions to configure a request, before it’s sent to the service API. We’ll use the create_bucket() function, on the S3 client, to create a new request, and provide some input parameters before we send it.
The only mandatory input for creating an S3 bucket is the bucket’s name. We’ll use the bucket() function to configure the name of the bucket that we want to create. You can do the following all on one line, but let’s break it down into individual steps, to make it easier to understand.
First, create a mutable create_bucket() request. This creates the request in memory, and exposes additional functions to alter the request inputs.
let mut create_request = mys3.create_bucket();
Next, use the bucket() function to set the name of the bucket that will be created. Assign the mutated value back to the same mutable variable.
create_request = create_request.bucket("stratusgrid-999");
S3 buckets also have something unique called a “location constraint.” The constraint determines which region the bucket will be created in. This value needs to match the region that you configured in your AWS_REGION environment variable. The S3 crate provides a data structure that can be used to provide this input.
use s3::types::builders::CreateBucketConfigurationBuilder;
// Start with an empty bucket configuration builder
let constraint = CreateBucketConfigurationBuilder::default();
// Set the location constraint to match the AWS_REGION we configured earlier, then build the configuration
let region = constraint
    .location_constraint(s3::types::BucketLocationConstraint::UsWest2)
    .build();
I know that might look intimidating, but trust me, it works!
After creating the constraint, assign it to the create_bucket request with the create_bucket_configuration() function.
create_request = create_request.create_bucket_configuration(region);
Finally, we will send the request to the Amazon S3 API, by using the send() function. The send() function is async, so we need to await it as well.
let result = create_request.send().await;
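The send() function returns a Result value, so you can check whether the request succeeded before moving on. Here is a minimal way to inspect the outcome (this check isn't part of the original example):

match &result {
    Ok(_) => println!("Bucket created successfully."),
    Err(error) => eprintln!("Failed to create bucket: {error}"),
}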
With that, the request is sent to Amazon S3 and your bucket should be created! Now it’s time to run the program and see if it actually works. Make sure you save your main.rs file, and then use the command below, from your project directory, to compile and execute the program.
cargo run
After your program completes, feel free to explore the AWS Management Console to validate that the bucket exists.
If you need to create additional buckets in the same AWS region, you can create an array containing a list of bucket names. Once you’ve created the array, you can then loop over it, and use the code above to create them all!
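As a rough sketch, that loop might look like the following. It assumes the same mys3 client and region from earlier, and the bucket names below are just placeholders.

let bucket_names = ["stratusgrid-demo-1", "stratusgrid-demo-2", "stratusgrid-demo-3"];
for name in bucket_names {
    // Each request needs its own location constraint configuration
    let constraint = CreateBucketConfigurationBuilder::default()
        .location_constraint(s3::types::BucketLocationConstraint::UsWest2)
        .build();
    let result = mys3
        .create_bucket()
        .bucket(name)
        .create_bucket_configuration(constraint)
        .send()
        .await;
    println!("Created {name}: {}", result.is_ok());
}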
In this article, we have explored how to create an Amazon S3 bucket using the AWS SDK for Rust. Although the AWS APIs can be a little confusing for newcomers, the more practice you have with them, the easier they become to work with.
Now that you've learned how to use the create_bucket() S3 API, try calling some other S3 APIs on your own! For example, try calling delete_bucket() instead.
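As a hint, the delete call follows the same fluent pattern as the create call; just remember that a bucket must be empty before Amazon S3 will allow you to delete it. A hypothetical one-liner might look like this:

let delete_result = mys3.delete_bucket().bucket("stratusgrid-999").send().await;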
Don’t forget to follow the StratusGrid blog and YouTube channel, for more cloud and software content!
AWS has contributed a fair amount of code to the global Rust developer community. Did you know that the AWS Lambda service itself runs on Rust? That’s right, the AWS Firecracker project is written in Rust, and provides the compute isolation for your Lambda functions!
Rust is a powerful language that can be used to build efficient and scalable applications. The AWS SDK for Rust makes it easy to interact with AWS services from Rust applications. If you're looking for a way to improve the performance and cost-effectiveness of your cloud applications, I encourage you to give Rust and the AWS SDK for Rust a try.
If you need help getting started with Rust or the AWS SDK for Rust, please don't hesitate to contact us. We have a team of experienced AWS engineers who can help you get up and running quickly. We also offer a variety of other cloud services, such as managed Kubernetes and managed hosting for Rust applications.
Contact StratusGrid today to learn more!