Introduction
As modern software systems have grown in complexity, so has the need to break up systems that have outgrown their initial size. This increase in complexity makes systems harder to maintain, update, and upgrade.
This paved the way for microservices that allowed massive monolithic systems to be broken down into smaller services that are loosely coupled but interact to deliver the total functionality of the initial monolithic solution. The loose coupling provides agility and eases the process of maintenance and the addition of new features without having to modify entire systems.
It is in these microservice architectures that Queueing Systems come in handy to facilitate the communication between the separate services that make up the entire setup.
In this post, we will dive into queueing systems, particularly Amazon's Simple Queue Service (SQS), and demonstrate how we can leverage its features in a microservice environment.
What is Message Queueing?
Before the internet and email came into the picture, people over long distances communicated mostly through the exchange of letters. The letters contained the messages to be shared and were posted at the local post office station from where they would be transferred to the recipient's address.
This might have differed from region to region but the idea was the same. People entrusted intermediaries to deliver their messages for them as they went ahead with their lives.
When a system is broken down into smaller components or services that are expected to work together, they will need to communicate and pass around information from one service to another, depending on the functionality of the individual services.
Message queueing facilitates this process by acting as the "post office service" for microservices. Messages are put in a queue and the target services pick up and act on the ones addressed to them. The messages can contain anything - such as instructions on what steps to take, the data to act upon or save, or asynchronous jobs to be performed.
Message queueing is a mechanism that allows components of a system to communicate and exchange information in an asynchronous manner. This means that the loosely coupled systems do not have to wait for immediate feedback on the messages they send and they can be freed up to continue handling other requests. When the time comes and the response is required, the service can look for the response in the message queue.
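To make the idea concrete, here is a minimal in-memory sketch of a queue in JavaScript. This is an illustration only: real brokers such as SQS add persistence, delivery guarantees, and visibility timeouts on top of this basic contract.

```javascript
// A minimal in-memory sketch of the queueing idea (illustration only).
// The producer enqueues a message and moves on; the consumer polls later.
class MessageQueue {
    constructor() {
        this.messages = [];
    }

    // Producer side: fire-and-forget
    send(message) {
        this.messages.push(message);
    }

    // Consumer side: pick up the oldest pending message, if any
    receive() {
        return this.messages.shift();
    }
}

const queue = new MessageQueue();
queue.send({ type: 'order', itemName: 'Phone case' });
// The producer continues immediately; the consumer reads when it is ready.
const next = queue.receive();
console.log(next.itemName); // prints "Phone case"
```

A real message queue decouples the two sides across processes or machines, but the send/receive contract stays the same.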
Here are some examples of popular message queues or brokers:
- Amazon Simple Queue Service - which is the focus of this article
- RabbitMQ - which is open-source and provides asynchronous messaging capabilities
- Apache Kafka - which is a distributed streaming platform that supports the pub/sub mode of interaction
- Others include Apache RocketMQ, NSQ, and HornetQ
Use-Cases of Message Queueing
Message queues are not needed for every system out there, but there are certain scenarios in which they are worth the effort and resources required to set up and maintain them. When utilized appropriately, message queues are advantageous in several ways.
First, message queues support the decoupling of large systems by providing the communication mechanism in a loosely-coupled system.
Redundancy is bolstered through the usage of message queues by maintaining state in case a service fails. When a failed or faulty service resumes operations, all the operations it was meant to handle will still be in the queue and it can pick them up and continue with the transactions, which could have been otherwise lost.
Message queueing facilitates batching of operations such as sending out emails or inserting records into a database. Batch instructions can be saved in a queue and all processed at the same time in order instead of being processed one by one, which can be inefficient.
Queueing systems can also be useful in ensuring the consistency of operations by ensuring that they are executed in the order they were received. This is especially important when particular components or services of a system have been replicated in order to handle an increased load. This way, the system will scale well to handle the load and also ensure that processed transactions are consistent and in order since all the replicated services will be fetching their instructions from the message queue that will act as the single source of truth.
Amazon Simple Queue Service - SQS
Like most other offerings from Amazon Web Services, the Simple Queue Service (SQS) is a message queueing solution that is distributed and fully managed by Amazon, much like serverless computing via Chalice.
SQS allows us to send and receive messages or instructions between software components enabling us to implement and scale microservices in our systems without the hassle of setting up and maintaining a queueing system.
Like other AWS services, SQS scales dynamically based on demand while ensuring the security of the data passed through (optional) encryption of the messages.
Demo Project
To explore the Amazon Simple Queue Service, we will create a decoupled system in Node.js, in which each component will interact with the others by sending and retrieving messages from SQS.
Since we are a small organization that doesn't have the bandwidth to handle orders as they come in, we will have one service to receive users' orders and another that will deliver all orders posted that day to our email inbox at a certain time of the day for batch processing. All orders will be stored in the queue until they are collected by our second service and delivered to our email inbox.
Our microservices will consist of simple Node.js APIs, one that receives the order information from users and another that sends confirmation emails to the users.
Sending email confirmations asynchronously through the messaging queue will allow our orders service to continue receiving orders despite the load since it does not have to worry about sending the emails.
Also, in case the mail service goes down, once brought back up, it will continue dispatching emails from the queue, therefore, we won't have to worry about lost orders.
Amazon Web Services
For this project, we will need an active and valid AWS account which we can sign up for at the AWS homepage. AWS requires that we not only offer some personal details but also our billing details. In order to avoid being billed for this demo project, we will be using the AWS Free Tier for testing and development purposes.
We will also need to install the AWS CLI tool in order to interact with our AWS resources from our machines. Instructions to install the AWS CLI tool on multiple platforms can be found here.
With the AWS CLI tool in place, we can head to the AWS Console and under our profile dropdown there is a section called "My Security Credentials". Here we will be able to create credentials that will be used when interacting with the AWS console.
These credentials will be also used by the Amazon CLI tool, which we will configure by running:
$ aws configure
We will get a prompt to fill in our `Access Key ID`, `Secret Access Key`, and default region and output format. The last two are optional, but we will need the access key and secret that we obtained from the AWS console dashboard.
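The configuration prompt looks roughly like this (the key values shown here are placeholders; use the credentials generated in the console):

```shell
$ aws configure
AWS Access Key ID [None]: <YOUR_ACCESS_KEY_ID>
AWS Secret Access Key [None]: <YOUR_SECRET_ACCESS_KEY>
Default region name [None]: us-east-1
Default output format [None]: json
```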
With our AWS account up and running, and the AWS CLI configured, we can set up our AWS Simple Queue Service by navigating to the SQS home page.
As we can see in the following screenshot, after specifying our queue name as `nodeshop.fifo`, we are presented with two queue options:
We have the option to choose between a Standard Queue or a FIFO Queue. A Standard Queue does not maintain the order of the messages it receives and is better suited for projects that prioritize throughput over the order of events.
A FIFO queue, on the other hand, maintains the order of the messages as received and they are also retrieved in the same First-In-First-Out order.
Given that we will be building a mini shopping platform, it is important to maintain the order of requests since we hope to sell items to people in the order of their purchases. Once we have chosen the kind of queue we require, we can modify some additional configuration of our queue:
We can configure the time before a message is automatically deleted from a queue and the size of a message, among other options. For now, we will configure our FIFO queue using the default values. With our queue ready, we can now create our Node.js APIs that will read from and write to our Amazon SQS FIFO queue.
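As a side note, the same queue can also be created and inspected from the command line using the AWS CLI we configured earlier. The commands below are a sketch and assume our account has SQS permissions:

```shell
# Create the FIFO queue (the .fifo suffix is mandatory for FIFO queues)
$ aws sqs create-queue --queue-name nodeshop.fifo --attributes FifoQueue=true

# Look up the queue URL, which our Node.js services will need later
$ aws sqs get-queue-url --queue-name nodeshop.fifo
```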
Setting Up Node.js and NPM
Instructions to set up Node.js on multiple platforms can be found here on the Node.js official website. The Node Package Manager (NPM) ships with Node.js, and we can verify our installation as follows:
# Node.js version
$ node -v
v12.12.0
# NPM version
$ npm -v
6.11.3
The next step is to set up our folders as follows:
# create folder and move into it
$ mkdir nodeshop_apis && cd $_
# create the orders and emails services folders
$ mkdir orderssvc emailssvc
Setting Up the Node APIs
We will build the `orders` service first since it is the one that receives the orders from the users and posts the information onto our queue. Our `emails` service will then read from the queue and dispatch the emails.
We will initialize a Node.js project and install the Express.js framework, which we will use to build our minimalist API. We will also install the body-parser middleware to parse our request data for us.
To achieve this in our root folder:
# initialize node project
$ npm init
# install express and body-parser
$ npm install express body-parser --save
Once Express and `body-parser` are installed, they will be automatically added to the dependencies section of our `package.json` file thanks to the `--save` option.
Since we will have multiple services that will be running at the same time, we will also install the npm-run-all package to help us start up all our services at the same time and not have to run commands in multiple terminal windows:
$ npm install npm-run-all --save
With `npm-run-all` installed, let us now tweak the `scripts` entry in our `package.json` file to include the commands to start our services and one command to run them all:
{
    // Truncated for brevity...
    "scripts": {
        "start-orders-svc": "node ./orderssvc/index.js 8081",
        "start-emails-svc": "node ./emailssvc/index.js",
        "start": "npm-run-all -p -r start-orders-svc"
    },
    // ...
}
We add the commands `start-orders-svc` and `start-emails-svc` to run our `orders` and `emails` services respectively. The `start` command uses `npm-run-all` to execute them; for now, it only runs the `orders` service, and we will add the `emails` service to it once that service is implemented.
With this setup, running all our services will be as easy as executing the following command:
$ npm start
We can create our `orders` API in the `./orderssvc/index.js` file as follows:
const express = require('express');
const bodyParser = require('body-parser');

const port = process.argv.slice(2)[0];
const app = express();

app.use(bodyParser.json());

app.get('/index', (req, res) => {
    res.send("Welcome to NodeShop Orders.");
});

app.listen(port, () => console.log(`Orders service listening on port ${port}`));
After adding the required libraries to our Express `app`, the `/index` endpoint responds by simply sending a welcome message. Finally, the API listens on a port that we specify when starting it up.
We'll start the app by running the `npm start` command and interact with our APIs using the Postman application:
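If Postman is not at hand, a quick curl request against the running service verifies the endpoint just as well:

```shell
$ curl http://localhost:8081/index
Welcome to NodeShop Orders.
```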
We will implement the `emails` service later on. For now, our `orders` service is set up, and we can now implement our business logic.
Implementation: Orders Service
To implement our business logic, we will start with the `orders` service, which will receive our orders and write them to our Amazon SQS queue.
We will achieve this by introducing a new route and controller to handle order input from the end-user, and send the order data to our Amazon SQS queue.
Before implementing the controller, we will need to install the Amazon SDK for Node.js:
$ npm install aws-sdk --save
Our new "/order" endpoint will receive a payload that contains the order data and send it to our SQS queue using the AWS SDK:
// ./orderssvc/index.js
//
// Code removed for brevity...
//
// Import the AWS SDK
const AWS = require('aws-sdk');
// Configure the region
AWS.config.update({region: 'us-east-1'});
// Create an SQS service object
const sqs = new AWS.SQS({apiVersion: '2012-11-05'});
const queueUrl = "SQS_QUEUE_URL";
// The new endpoint
app.post('/order', (req, res) => {

    let orderData = {
        'userEmail': req.body['userEmail'],
        'itemName': req.body['itemName'],
        'itemPrice': req.body['itemPrice'],
        'itemsQuantity': req.body['itemsQuantity']
    };

    let sqsOrderData = {
        MessageAttributes: {
            "userEmail": {
                DataType: "String",
                StringValue: orderData.userEmail
            },
            "itemName": {
                DataType: "String",
                StringValue: orderData.itemName
            },
            "itemPrice": {
                DataType: "Number",
                StringValue: orderData.itemPrice
            },
            "itemsQuantity": {
                DataType: "Number",
                StringValue: orderData.itemsQuantity
            }
        },
        MessageBody: JSON.stringify(orderData),
        // Note: deduplicating on the user's email means repeat orders from
        // the same user within SQS's 5-minute deduplication window would be
        // dropped as duplicates
        MessageDeduplicationId: req.body['userEmail'],
        MessageGroupId: "UserOrders",
        QueueUrl: queueUrl
    };

    // Send the order data to the SQS queue
    let sendSqsMessage = sqs.sendMessage(sqsOrderData).promise();

    sendSqsMessage.then((data) => {
        console.log(`OrdersSvc | SUCCESS: ${data.MessageId}`);
        res.send("Thank you for your order. Check your inbox for the confirmation email.");
    }).catch((err) => {
        console.log(`OrdersSvc | ERROR: ${err}`);
        res.send("We ran into an error. Please try again.");
    });
});
The AWS SDK requires us to build a payload object specifying the data we are sending to the queue; in our case, we define it as `sqsOrderData`.

We then pass this object to the `sendMessage()` function, which will send our message to the queue using the credentials we used to configure the AWS CLI. Finally, we wait for the response and notify the user that their order has been received successfully and that they should check for the email confirmation.
To test the `orders` service, we run the command `npm start` and send the following payload to `localhost:8081/order`:
{
    "itemName": "Phone case",
    "itemPrice": "10",
    "userEmail": "[email protected]",
    "itemsQuantity": "2"
}
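The same payload can also be posted from the terminal with curl, assuming the orders service is running locally on port 8081:

```shell
$ curl -X POST http://localhost:8081/order \
    -H 'Content-Type: application/json' \
    -d '{"itemName": "Phone case", "itemPrice": "10", "userEmail": "[email protected]", "itemsQuantity": "2"}'
```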
This will submit our order to the `orders` service, from where the message will be sent to our SQS queue. We can view the order in the SQS queue through the AWS console, as shown:

Our `orders` service has been able to receive a user's order and successfully send the data to our queue on the Simple Queue Service.
Implementation: Emails Service
Our `orders` service is ready and already receiving orders from users. The `emails` service will be responsible for reading the messages stored in the queue and dispatching confirmation emails to the users. This service is not notified when orders are placed and therefore has to keep checking the queue for any new orders.
To ensure that our `emails` service continually checks for new orders, we will use the `sqs-consumer` library, which will periodically poll for new orders and dispatch the emails to the users. `sqs-consumer` will also delete the messages from the queue once they have been successfully processed.
We will start by installing the `sqs-consumer` library, together with `nodemailer`, which we will use to send the emails, by running the following command:

$ npm install sqs-consumer nodemailer --save
Now we can implement the `emails` service as follows:
const AWS = require('aws-sdk');
const nodemailer = require('nodemailer');
const { Consumer } = require('sqs-consumer');

// Configure the region
AWS.config.update({region: 'us-east-1'});

const queueUrl = "SQS_QUEUE_URL";

// Configure Nodemailer to use Gmail
let transport = nodemailer.createTransport({
    host: 'smtp.googlemail.com',
    port: 587,
    auth: {
        user: 'Email address',
        pass: 'Password'
    }
});

function sendMail(message) {
    let sqsMessage = JSON.parse(message.Body);
    const emailMessage = {
        from: 'sender_email_address', // Sender address
        to: sqsMessage.userEmail, // Recipient address
        subject: 'Order Received | NodeShop', // Subject line
        html: `<p>Hi ${sqsMessage.userEmail}.</p> <p>Your order of ${sqsMessage.itemsQuantity} ${sqsMessage.itemName} has been received and is being processed.</p> <p> Thank you for shopping with us! </p>` // HTML body
    };

    transport.sendMail(emailMessage, (err, info) => {
        if (err) {
            console.log(`EmailsSvc | ERROR: ${err}`);
        } else {
            console.log(`EmailsSvc | INFO: ${info}`);
        }
    });
}

// Create our consumer
const app = Consumer.create({
    queueUrl: queueUrl,
    handleMessage: async (message) => {
        sendMail(message);
    },
    sqs: new AWS.SQS()
});

app.on('error', (err) => {
    console.error(err.message);
});

app.on('processing_error', (err) => {
    console.error(err.message);
});

console.log('Emails service is running');
app.start();
We create a new `sqs-consumer` application by using the `Consumer.create()` function and provide the queue URL and the function to handle the messages fetched from the SQS queue.

In our case, the function `sendMail()` will take the message fetched from the queue, extract the details of the user's order, and then send an email to the user using `Nodemailer`. Check out our article on sending emails in Node.js if you'd like to learn more.
Our `emails` service is now ready. To integrate it into our execution script, we will simply modify the `scripts` option in our `package.json`:
{
    // Truncated for brevity...
    "scripts": {
        "start-orders-svc": "node ./orderssvc/index.js 8081",
        "start-emails-svc": "node ./emailssvc/index.js",
        // Update this line
        "start": "npm-run-all -p -r start-orders-svc start-emails-svc"
    },
    // ...
}
When we submit a new order through the `orders` service, we get the following email delivered in our inbox:
Conclusion
In this post, we used Node.js and Express to create an API that was meant to receive users' orders and post the order details to our SQS queue on AWS. We then built another service to fetch the messages as posted on the queue and send confirmation emails to the users that posted the orders.
We separated the ordering logic from the email management logic and brought the two services together using a message queue system. This way, our `orders` service can handle order placement while the `emails` service dispatches the emails to the users.
The source code for this project is available here on GitHub.