Point-to-Point Integrations with EventBridge Pipes

Tim Rosenblüh
16 October 2023
Reading time: 1 min

Introduction

EventBridge Pipes is a new feature that was released at last year's re:Invent (2022) and is intended to simplify point-to-point integrations between different AWS services.

Until now, these point-to-point connections had to be established manually, e.g. via Lambda functions that passed the respective payloads (events/messages) from the producer to the consumer. With EventBridge Pipes, this intermediate step is now handled entirely by the service.

That means: No more glue code!


In addition, EventBridge Pipes offers the possibility to enrich or filter data on the way from the producer to the consumer. So you can add, remove or transform the data depending on your use case. If you want to learn more about this, take a look at the documentation.
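
As a small illustration, here is a sketch of what a filter criterion could look like (e.g. when defining the pipe via the CDK or the API). It assumes an SQS source whose message body is JSON; the eventType field is purely hypothetical:

// Hypothetical filter: only forward messages whose JSON body contains
// an "eventType" field with the value "ORDER_CREATED". Pipes use the
// same pattern syntax as EventBridge rules, and for SQS sources a
// pattern can match into the parsed message body.
const filterCriteria = {
  filters: [
    {
      pattern: JSON.stringify({
        body: { eventType: ["ORDER_CREATED"] },
      }),
    },
  ],
};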

Components

Producers

To get an overview of the available services, let’s first have a look at some possible producers:

Amazon SQS

In the context of EventBridge Pipes, a queue can serve as a source of messages which typically come from a preceding application/microservice.
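
For orientation, this is roughly the shape of a single record as the pipe hands it to the target; records arrive as a batch (an array), and the field list below is trimmed to the most relevant properties:

// Rough, trimmed shape of one SQS record delivered by a pipe. Note
// that "body" is always a string; a JSON payload has to be parsed by
// the consumer or an enrichment step.
interface SqsPipeRecord {
  messageId: string;
  receiptHandle: string;
  body: string;
  eventSourceARN: string;
  awsRegion: string;
}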

DynamoDB Streams

With DynamoDB Streams, changes to a table can easily be used as a basis for further processing. For example, you can react to the creation, update, or removal of an entry.
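
A filter like the following sketch would, for example, let only newly created items through; DynamoDB stream records carry the change type in their eventName field (INSERT, MODIFY, or REMOVE):

// Pass only INSERT events from a DynamoDB stream to the target.
const onlyInserts = {
  filters: [{ pattern: JSON.stringify({ eventName: ["INSERT"] }) }],
};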

Kinesis Data Streams

Kinesis Data Streams can also be used as a producer for a pipe. This allows for a wide range of data to be streamed to further processing steps.

In addition, the following services can also be used with EventBridge Pipes:

  • Amazon MQ Broker
  • Amazon MSK Stream
  • Self-managed Apache Kafka stream

To get a deeper insight into the various sources/producers, it is worth taking a look at the documentation.

Consumers

With a basic understanding of the available producers, we can now take a look at some of the consumer services we can use. Since there are quite a few more options here, I will only mention a subset.

If you want to know about all possible consumers, follow the link to the documentation.

Step Functions state machine

A Step Functions workflow can be used as a consumer. The State Machine is executed as soon as a message/event is available on the producer’s side.

SNS Topic

We can also use SNS Topics as a target. For example, notifications can be sent based on the producer’s messages to downstream services.

API Gateway

API Gateway can also function as a consumer with pipes to invoke a specific API resource and forward your event to other services or applications.

Demo

In the following section, we will create a little example that uses SQS as the producer and a Step Functions State Machine as the consumer and connect the two using EventBridge Pipes.

Before we look at the implementation with pipes, let’s first go over how the integration between SQS and Step Functions was created previously:

[Diagram: SQS queue → Lambda function → Step Functions state machine]

We would typically use a Lambda function to establish the connection between the two services. This glue code has to be written and maintained by us. Additionally, we have to make sure that our function is triggered by SQS and set up the call that starts the Step Functions workflow, so that the message from the queue is used as input for the state machine. These tasks, while not extensive, still leave room to introduce errors into the integration. On top of that, this additional function can cause non-negligible costs depending on the number of invocations. A sketch of such a function follows below.
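
To make this concrete, here is a minimal sketch of such a glue function, assuming the AWS SDK for JavaScript v3 and a STATE_MACHINE_ARN environment variable (both names are chosen for this illustration):

import { SQSEvent } from "aws-lambda";
import { SFNClient, StartExecutionCommand } from "@aws-sdk/client-sfn";

const sfn = new SFNClient({});

// Glue code: start one state machine execution per SQS message.
export const handler = async (event: SQSEvent): Promise<void> => {
  for (const record of event.Records) {
    await sfn.send(
      new StartExecutionCommand({
        stateMachineArn: process.env.STATE_MACHINE_ARN,
        input: record.body, // the message body becomes the execution input
      })
    );
  }
};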

This is where EventBridge Pipes comes in and allows us to create this integration without an intermediate step. We therefore no longer need the glue code or, in this case, the additional Lambda function:

[Diagram: SQS queue → EventBridge Pipe → Step Functions state machine]

Implementation

  • To create a pipe, simply navigate to the EventBridge console and select the Pipes option on the left.
  • The pipe can then be created via Create pipe.
  • You can select your source via the dropdown. In this case, we use SQS and select the queue (the queue has to be created beforehand).
  • The optional filtering and enrichment steps can be skipped.
  • As the target, we select Step Functions and choose the state machine we want to use (the state machine used in this example consists of a single Pass state to illustrate the functionality).

After creating the pipe the visualization should look like this:

[Image: pipe visualization in the EventBridge console, showing the SQS source and the Step Functions target]

Of course, CloudFormation or the CDK could also be used for this demo. However, at the time of writing this blog post, only L1 constructs are available; L2 constructs will likely follow in the future.
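
For completeness, here is a minimal sketch of the pipe as an L1 construct in TypeScript. The queue, state machine, and IAM role are assumed to exist elsewhere in the stack (their names are chosen for this illustration), and the role needs permission to read from the queue and to start executions of the state machine:

import * as pipes from "aws-cdk-lib/aws-pipes";

// Minimal L1 pipe: SQS queue -> Step Functions state machine.
// queue, stateMachine and pipeRole are assumed to be defined above.
new pipes.CfnPipe(this, "SqsToSfnPipe", {
  roleArn: pipeRole.roleArn,
  source: queue.queueArn,
  target: stateMachine.stateMachineArn,
  targetParameters: {
    stepFunctionStateMachineParameters: {
      // Standard workflows are started asynchronously.
      invocationType: "FIRE_AND_FORGET",
    },
  },
});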

Test the implementation

To test the integration, we can then put the following message in the queue:

{
  "message": "Hello from SQS!"
}
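
If you prefer not to use the console, sending the message with the AWS SDK for JavaScript v3 could look like this sketch (the queue URL is a placeholder):

import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({});

// Send the test message (replace the placeholder queue URL).
await sqs.send(
  new SendMessageCommand({
    QueueUrl: "https://sqs.eu-central-1.amazonaws.com/123456789012/my-queue",
    MessageBody: JSON.stringify({ message: "Hello from SQS!" }),
  })
);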

In the Step Functions execution input, we can then see the message that has arrived:

[
  {
    // ...
    "body": "{\n  \"message\": \"Hello from SQS!\"\n}",
    // ...
  }
]

With this, we have successfully established the connection between SQS and our Step Functions State Machine without having to worry about implementing the integration.

Of course, we could go further and filter our message or add an enrichment step to extend the message with data from various other sources. For a general introduction, however, this will suffice. If you want to learn more about the various integrations or just want a deeper understanding, follow the link to the documentation.

Summary

EventBridge Pipes greatly simplifies many point-to-point integrations between AWS services. The glue code that previously held these integrations together is simply no longer necessary. This not only reduces the time required to implement these connections but also removes a potential source of errors, speeding up the overall development process.

Thank you for reading this blog post. Feel free to provide me with any feedback!