Message Queue — In a Nutshell

What is a message queue?

A message queue is a buffer that decouples the sending and receiving of messages. It offers:

  • Asynchronous operation by nature: the producer does not need to wait for the consumer to retrieve and process the message.
  • Decoupling: separating the posting and receipt of messages allows multiple producers and consumers to communicate through one or more queues.

Workflow:

  • The producer pushes a message to the queue.
  • The consumer polls a message from the queue (or peeks at the next available message without removing it from the queue).
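The workflow above can be sketched with Python's standard-library `queue` module standing in for a real message broker (the message fields here are made up for illustration):

```python
import queue

# In-process stand-in for a message queue.
q = queue.Queue()

# Producer pushes a message to the queue.
q.put({"order_id": 42, "action": "ship"})

# Consumer polls the next message from the queue.
msg = q.get()
print(msg["order_id"])  # 42
q.task_done()  # signal that processing is complete
```

In a real system the producer and consumer would run in separate processes or services, with the queue hosted by a broker such as RabbitMQ or Azure Service Bus.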

When to use:

  • Decoupling workloads: Decouple message processing from sending so the producer’s thread is not blocked. This is very common in asynchronous event-driven systems.
  • Load balancing: When processing a message is expensive, a queue can distribute the load across additional consumers (workers).
  • Load levelling: During a peak request window, a sudden increase in message volume can exhaust the current workers (this is called back pressure); the queue acts as a buffer so the load on the workers is amortized over time. The queue can also implement a throttling mechanism: once it grows beyond a certain size, it rejects incoming requests with status code 503 so the workers are not overwhelmed.
  • Reliability: A dead-letter queue can be introduced to hold messages that the consumer is unable to process successfully.
  • Resilient message handling: A message queue can add resiliency to the consumers in your system. For example, a consumer can “peek” at and lock the next available message in a queue. This retrieves a copy of the message but locks the original in the queue so no other consumer processes it. If processing fails, the lock expires and the message is released back to the queue.
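A minimal sketch of the load-levelling/throttling idea, using a bounded in-process queue (the `enqueue` helper and the 202/503 codes are illustrative assumptions, not a real broker API):

```python
import queue

# Bounded queue: when full, reject with an HTTP-style 503 instead of
# letting back pressure pile work onto exhausted workers.
work_queue = queue.Queue(maxsize=2)

def enqueue(message):
    """Return 202 if the message is accepted, 503 if the queue is at capacity."""
    try:
        work_queue.put_nowait(message)
        return 202
    except queue.Full:
        return 503

print(enqueue("job-1"))  # 202
print(enqueue("job-2"))  # 202
print(enqueue("job-3"))  # 503 -- queue is full, caller should back off/retry
```

Once workers drain the queue, new requests are accepted again, so bursts are absorbed rather than dropped outright.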

Patterns

  • One-way messaging:

The sender simply posts a message to the queue and leaves it to the receiver to process it at some point.

  • Request/response:

The sender posts a message to a queue and waits for an acknowledgment from the receiver. This is more reliable than one-way messaging, as the sender can implement custom retry or error-handling logic when no response comes back within a given timespan.

However, this usually requires a separate communications channel in the form of a dedicated message queue to which the receiver can post its response messages.

E.g. the ReplyTo property from Azure Service Bus Queues
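A hedged in-process sketch of request/response: the sender tags its request with a correlation id and the name of a dedicated reply queue (mirroring the ReplyTo idea), then waits with a timeout so it can trigger retry logic. All names here are illustrative:

```python
import queue
import uuid

request_queue = queue.Queue()
reply_queue = queue.Queue()  # the dedicated "reply-to" channel

def receiver():
    # Take the next request, process it, and post an acknowledgment
    # to the queue the sender named, echoing the correlation id.
    msg = request_queue.get()
    msg["reply_to"].put({"correlation_id": msg["correlation_id"], "status": "ok"})

# Sender posts a request tagged with a correlation id and reply queue.
correlation_id = str(uuid.uuid4())
request_queue.put({
    "correlation_id": correlation_id,
    "reply_to": reply_queue,
    "body": "do work",
})

receiver()  # in practice this runs in a separate process or service

# Sender waits for the response; the timeout is the hook for retry logic.
try:
    reply = reply_queue.get(timeout=5)
    assert reply["correlation_id"] == correlation_id
except queue.Empty:
    pass  # no acknowledgment in time: retry, or escalate the error
```

Matching on the correlation id lets one reply queue serve many outstanding requests without mixing up responses.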

  • Broadcast/Fanout:

The sender posts a message, and multiple receivers each read a copy of it. This is often combined with the publisher/subscriber model.

Some filtering is possible with the message metadata, e.g. messages labeled “red” are delivered to receiver A and “blue” ones to receiver B.
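The fanout-with-filtering idea can be sketched in-process by giving each subscriber its own queue and matching on a label (the `subscribe`/`publish` helpers are illustrative assumptions, not a real pub/sub API):

```python
import queue

subscribers = []  # list of (label filter, subscriber's own queue)

def subscribe(label=None):
    # Each subscriber gets a private queue; label=None means "receive everything".
    q = queue.Queue()
    subscribers.append((label, q))
    return q

def publish(message, label=None):
    # Fanout: every subscriber whose filter matches gets its own copy.
    for wanted, q in subscribers:
        if wanted is None or wanted == label:
            q.put(message)

receiver_a = subscribe(label="red")
receiver_b = subscribe(label="blue")

publish("stop sign", label="red")   # delivered only to receiver A
publish("clear sky", label="blue")  # delivered only to receiver B
```

Real brokers offer the same idea as topic subscriptions with filter rules, so routing logic lives in the messaging layer rather than in the receivers.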

That’s it!

Happy Reading!
