Praseela Radhakrishnan, Perficient Blogs

Sending and Receiving an Event through EventBridge with Multiple Targets
Tue, 10 Aug 2021
https://blogs.perficient.com/2021/08/10/sending-and-receiving-an-event-through-eventbridge-with-multiple-targets/

In this post, we will show how to send an event to Amazon EventBridge using a custom rule and receive it by adding targets. Here I have added two targets to form a simple pub/sub implementation: Amazon SNS as our publishing service, Amazon SQS as a subscriber, and AWS CloudWatch to monitor the logs of successful events.

What is EventBridge?

Amazon EventBridge is a serverless event bus that makes it easier to connect applications with data from a variety of sources. These event sources can be custom applications, AWS services and partner SaaS applications.

It provides flexible filtering to control which events are published to the event bus and then, based on the target routing rules, sends the eligible events to the configured target applications or services.

Step 1: Create Event:

Create a new event bus in the EventBridge console and name it test-event-bus.

Step 2: Create Custom Rule:

On the EventBridge homepage, select Rules.

  • From the Event bus dropdown, select the test-event-bus.
  • Click Create rule and name it sample-eventBridge-rule.

Step 3: Define the Pattern:

  • Choose Event pattern
  • Under Event matching pattern, select Custom pattern and add your custom pattern.

[Screenshot: event pattern definition]

The rule checks the incoming event data. The source, detail-type, and detail fields are the three parameters in the event pattern, and they act as constants. Only if an event matches them does it pass the rule.

Here I have filtered the event based on "Jack" and "Jim" in the detail (our event message) params. If I send an input containing Jack or Jim, the message will pass. If we send any value outside our custom pattern, the event is still created by EventBridge, but it does not match the rule, so we cannot see that event in SNS, SQS, or the CloudWatch log.
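An event pattern of roughly this shape would implement that filter; the source and detail-type values below are placeholders (not necessarily the ones used in the original project), and the match is on a message field inside detail:

```json
{
  "source": ["com.sample.events"],
  "detail-type": ["sample-event"],
  "detail": {
    "message": ["Jack", "Jim"]
  }
}
```

An array of values in a pattern field means "match any of these", so events whose detail.message is neither Jack nor Jim are dropped by the rule.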

Step 4: Create Target:

For Select targets, choose the AWS service that you want to act when EventBridge detects an event of the selected type.

We can configure up to five targets for a single rule. Here I have used two targets:

  • Choose target as CloudWatch log group and create a log group as sample-eventBridge-log.
  • Choose target as SNS topic and select the SNS topic name as test-eventBridge-topic.

Target for CloudWatch:

[Screenshot: CloudWatch target configuration]

Target for SNS:

[Screenshot: SNS target configuration]

In the SNS target I have transformed the input message using an input path and an input template. As shown below, the first part is the Input path, where the required data values are extracted from the event payload. The second part is the Input template, where the outgoing event data is created by incorporating the previously extracted values. One thing to note here is that the outgoing event data doesn't need to be JSON!

[Screenshot: input transformer configuration]
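A sketch of such a transformer, assuming the event carries its text in a detail.message field (the JSON path and the wording of the template are placeholders):

```
Input path:
{ "message": "$.detail.message" }

Input template:
"Message received from EventBridge: <message>"
```

The input path names each value to extract; the template then references those names in angle brackets, and because the template is a plain quoted string here, the outgoing message is plain text rather than JSON.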

Step 5: Send events:

I have created a simple EventBridge application using Spring Boot with a Gradle project.

[Screenshot: project structure]

Dependencies:

implementation group: 'software.amazon.awssdk', name: 'eventbridge', version: '2.16.101'

In EventBridgeController we have two API calls: one sends an event to EventBridge, and the other retrieves the rules for an event bus. Finally, I added an SQS listener, which consumes the message produced when a triggered event reaches the SNS target of the rule. I have subscribed this queue to the SNS topic, so whenever an event successfully passes the rule, the pub/sub messaging occurs.

[Screenshot: EventBridgeController]

In the service class, the PutEventsRequest action sends multiple events to EventBridge in a single request.

[Screenshot: service class]
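A sketch of that service call with the AWS SDK for Java v2; the bus, source, and detail-type names are placeholders, and they must match what the rule's pattern expects:

```java
// Minimal sketch, assuming the software.amazon.awssdk:eventbridge dependency.
import software.amazon.awssdk.services.eventbridge.EventBridgeClient;
import software.amazon.awssdk.services.eventbridge.model.PutEventsRequest;
import software.amazon.awssdk.services.eventbridge.model.PutEventsRequestEntry;
import software.amazon.awssdk.services.eventbridge.model.PutEventsResponse;

public class EventSender {

    public PutEventsResponse send(EventBridgeClient client, String message) {
        PutEventsRequestEntry entry = PutEventsRequestEntry.builder()
                .eventBusName("test-event-bus")
                .source("com.sample.events")       // must match the rule's source
                .detailType("sample-event")        // must match the rule's detail-type
                .detail("{\"message\": \"" + message + "\"}") // detail is a JSON string
                .build();

        // One PutEventsRequest can carry multiple entries (up to 10 per call)
        return client.putEvents(PutEventsRequest.builder().entries(entry).build());
    }
}
```

Even non-matching events get an event ID back in the PutEventsResponse, which is why a failed match is visible in the response but never in the targets.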

Step 6: Testing the event:

From local:

[Screenshot: sending an event from the local application]

From AWS console:

[Screenshot: sending an event from the AWS console]

Below is the triggered event output from both the AWS console and Postman. AWS SQS consumes only the successful (matching) message, as highlighted. If the event did not satisfy the rule, it is still created with an ID but is never consumed from SQS, as the second log in the screenshot below shows.

[Screenshot: application logs]

CloudWatch log for the successful event:

[Screenshot: CloudWatch success log]

Conclusion:

This is a brief write-up on EventBridge, focusing mainly on the event routing rule configurations. Used wisely, EventBridge can bring a lot of versatility to the entire event ingestion and delivery mechanism. Please look into my GitHub repo for the full implementation of the applications and feel free to contribute to the repo.

I hope you found the details I shared here useful. Happy eventing!

Multiple Keys/Value pair with Custom Filters in Apache Kafka
Wed, 21 Jul 2021
https://blogs.perficient.com/2021/07/21/multiple-keys-value-pair-with-custom-filters-in-apache-kafka/

Introduction:

Before I even start talking about Apache Kafka here: is it possible to use a list of keys in Kafka? Yes, it's all possible.

In this post, we'll see how to use multiple key/value pairs with multiple filters. First, we create a simple Kafka producer and a Kafka consumer in a Spring Boot application using a very simple method.

Apache Kafka and its Architecture:

Kafka is a distributed streaming platform used to publish and subscribe to streams of records.

  • Kafka maintains feeds of messages in categories called topics.
  • Processes that publish messages to a Kafka topic are called producers.
  • Processes that subscribe to topics and process the feed of published messages are called consumers.
  • Kafka runs as a cluster of one or more servers, each of which is called a broker.

Producer —> Kafka Cluster (Broker & Zookeeper) —> Consumer

Kafka Setup:

Follow the step-by-step instructions on the Apache Kafka website, and you'll get Kafka up and running in your local environment.

Create a new Spring Boot Application:

In this example, I'm working with a Spring Boot application as a Gradle project in the Eclipse IDE. You can use Spring Initializr to generate the project and unzip it locally.

Dependencies:

These are the Kafka dependencies to add in your build.gradle file.
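For reference, a Spring Boot Gradle project typically declares Spring for Apache Kafka like this (the version is managed by the Spring Boot dependency plugin):

```groovy
implementation 'org.springframework.kafka:spring-kafka'
// optional, for embedded-broker tests:
testImplementation 'org.springframework.kafka:spring-kafka-test'
```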

Project Structure:

[Screenshot: project structure]

1.Spring Boot Controller for Kafka:

Create a controller class with two endpoints: one where the message is a String, and a second one for JSON. The KafkaController class exposes these two endpoints; we send a message through Postman → it goes to the producer, which publishes it to the Kafka topic → and then our consumer catches it and handles it the way we set it up: simply logging to the console.

In this controller, we pass a list of keys with a value (multiple keys with a single value) as a request param, and finally the topic name is read from the property file.

[Screenshot: KafkaController]

2.The Kafka Producer:

2.1. Producer Configuration:

To enable the Kafka producer, the kafkaTemplateString() and producerFactoryString() methods should be implemented in the KafkaConfiguration class. kafkaTemplateString() returns a new KafkaTemplate based on the configuration defined in producerFactoryString(). We then autowire the KafkaTemplate and use this instance to publish messages to the topic.

[Screenshot: producer configuration]

Here I have used two input types. The configuration steps are almost the same; we only need to change the value serializer to JsonSerializer instead of StringSerializer and change the ProducerFactory value type to your POJO class.

Create a service class containing the implementation for both the String and JSON input types coming from the controller layer. Here I loop over the keys, because the producer pushes only a single message to the topic at a time.

[Screenshot: producer service]
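The loop described above can be sketched like this with Spring Kafka; the class, method, and topic parameter names are assumptions, not the exact ones from the original project:

```java
// Producer sketch: one record per key, same message value each time.
import java.util.List;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class ProducerService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public ProducerService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(List<String> keys, String message, String topic) {
        // The producer can only publish one record at a time, so loop over the keys
        for (String key : keys) {
            kafkaTemplate.send(topic, key, message); // (topic, key, value)
        }
    }
}
```

Sending the key explicitly matters here: the consumer-side filter below decides whether to deliver a record by inspecting exactly this key.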

That's it for the producer! Let's move to the consumer, the service responsible for catching messages and handling them further (based on your own logic).

3.The Kafka Consumer:

3.1.Custom Filter:

Here I have added a custom filter on the record key. We pass multiple keys with a single message from the producer; in the consumer, a KafkaListener then retrieves only the messages with a specific key.

[Screenshot: record filter strategy]

In my case I have negated the recordFilterStrategy condition, because I return the value/message based on a specific key. You can change the condition accordingly.
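The filter contract is easy to get backwards, so here are its discard semantics in plain Java, independent of Spring Kafka (the names are illustrative): the strategy answers "should this record be dropped?", so to keep only one key you return true for every other key.

```java
public class FilterContractDemo {

    // Mirrors RecordFilterStrategy.filter(record): returning true means DISCARD.
    static boolean discard(String recordKey, String wantedKey) {
        return !wantedKey.equals(recordKey);
    }

    public static void main(String[] args) {
        String wanted = "Key2";
        for (String key : new String[] {"Key1", "Key2", "Key3"}) {
            if (!discard(key, wanted)) {
                System.out.println("delivered: " + key);
            }
        }
        // Only "delivered: Key2" is printed; Key1 and Key3 are discarded.
    }
}
```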

3.2.Consumer Configuration:

Implement the consumerFactoryWithFilter() and stringFilterKafkaListenerContainerFactory() methods in the KafkaConfiguration class; both are used to enable the custom Kafka filter by setting the RecordFilterStrategy (message filtering strategy) on the listener container. When the strategy returns true, the message is discarded; when it returns false, the message reaches the listening container normally.

[Screenshot: consumer configuration]

The configuration steps are almost the same; we only need to change the value deserializer to JsonDeserializer instead of StringDeserializer and change the ConsumerFactory value type to your POJO class.
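A sketch of the String-typed consumer configuration with the filter wired in; the bean names follow the post, while the bootstrap server, group id, and filtered key are placeholder assumptions:

```java
// Consumer config sketch: a listener container factory with a record filter.
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class KafkaConfiguration {

    @Bean
    public ConsumerFactory<String, String> consumerFactoryWithFilter() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> stringFilterKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactoryWithFilter());
        // true = discard; here only records whose key is "Key2" survive
        factory.setRecordFilterStrategy(record -> !"Key2".equals(record.key()));
        return factory;
    }
}
```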

The consumer pulls messages from the topic based on the consumer configuration. The method intended to consume the message should be annotated with @KafkaListener, passing the topic, the groupId and, most importantly, the containerFactory as parameters.

The containerFactory refers back to the consumer configuration in the KafkaConfiguration class, which determines whether a filter is applied to the given message.

[Screenshot: Kafka listeners]
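A sketch of a listener wired to the filtering container factory; the topic and group names are placeholders:

```java
// Listener sketch: the containerFactory name selects the filtered configuration.
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class ConsumerService {

    @KafkaListener(topics = "test-topic",
                   groupId = "group_id",
                   containerFactory = "stringFilterKafkaListenerContainerFactory")
    public void consumeWithFilter(String message) {
        // Only records that survived the RecordFilterStrategy ever reach this method
        System.out.println("Consumed message: " + message);
    }
}
```

A second listener on the same topic that simply omits the containerFactory attribute (or points at an unfiltered factory) receives every record, which is the "without filter" variant shown below.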

In this example I have shown listeners with and without the filter for both input types. If you need filtering, set the record filter strategy; otherwise go with the default flow.

4.Testing API with Postman:

POST: http://localhost:9000/kafka/publish?keys=Key1,Key2,Key3&message=TestMessage

String with Filter:

[Screenshot: string message with filter]

String without Filter:

[Screenshot: string message without filter]

Json with filter:

POST: http://localhost:9000/kafka/publishTransaction?keys=Key1,Key2,Key3

[Screenshot: Postman request]

Sample request:

{
  "name": "TestMessage",
  "type": "json"
}

Output:

[Screenshot: consumer output]

Use these commands to start the ZooKeeper and Kafka servers.
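For a local Apache Kafka installation, the standard start commands (run from the Kafka installation directory, ZooKeeper first, then the broker in a second terminal) are:

```
bin/zookeeper-server-start.sh config/zookeeper.properties

bin/kafka-server-start.sh config/server.properties
```

On Windows, the equivalent .bat scripts live under bin\windows.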

I hope you like the explanation. Please look into my GitHub repo for the full implementation of the applications and feel free to contribute to the repo.

RequestBody and Multipart on Spring Boot
Mon, 27 Jul 2020
https://blogs.perficient.com/2020/07/27/requestbody-and-multipart-on-spring-boot/

Introduction:

Recently I faced an issue in my project with using multipart and JSON simultaneously. I would like to share how this issue can be fixed.

For this scenario, I have taken a sample Spring Boot application that uses JSON/POJO and multipart. Below are the steps to pass multiple files along with a POJO at the same time.

Swagger-UI (2.9.2) doesn't support APIs that take a list of multipart files, so I used Postman. Let's go for it.

Step 1: Create a simple Spring-Boot application.

Create a Spring or Spring Boot application in the Eclipse IDE. Alternatively, we can generate it from Spring Initializr on the Spring.io website.

Project Structure:

[Screenshot: project structure]

Step 2: Create one Controller, Model and Service.

Now, I'm taking the user details, and a user can upload a list of files simultaneously. The file details do not need to be added to the User POJO class.

User:

[Screenshot: User POJO]

Created getters and setters for the POJO class.

Step 3: Use Multipart in Controller.

As we all know, the controller defines our actions when we want to create an API and get a response through it.

To pass both JSON and multipart in the POST method, we need to mention both content types in the consumes part:

consumes = { MediaType.APPLICATION_JSON_VALUE, MediaType.MULTIPART_FORM_DATA_VALUE }

And we need to pass the given parameters as a user String and multipart files. Note that we can pass only String + file, not POJO + file.

@RequestPart("user") String user, @RequestPart("file") List<MultipartFile> file

Then convert the String to JSON using ObjectMapper in the service layer.

UserController:

[Screenshot: UserController]

Step 4: Change the POJO to String.

Usually we add @RequestBody and mention the class name when creating data with a POST method. But here we should accept both JSON and multipart, so the annotation should be altered:

         @RequestPart("user") String user instead of @RequestBody User user
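Taken together, Steps 3 and 4 give a controller along these lines; the class, endpoint, and service names are assumptions based on the post's structure:

```java
// Controller sketch: JSON arrives as a String part, files as multipart parts.
import java.util.List;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestPart;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class UserController {

    private final UserService userService;

    public UserController(UserService userService) {
        this.userService = userService;
    }

    @PostMapping(value = "/upload",
            consumes = { MediaType.APPLICATION_JSON_VALUE, MediaType.MULTIPART_FORM_DATA_VALUE })
    public User upload(@RequestPart("user") String user,
                       @RequestPart("file") List<MultipartFile> file) throws Exception {
        // The "user" part is a raw JSON String; the service converts it to a POJO
        return userService.getJson(user, file);
    }
}
```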

Step 5: Invoke the String to Json.

Convert the String to JSON using ObjectMapper. Call it from the controller to the service as below:

         User userJson = userService.getJson(user, file);
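On the service side, a sketch of getJson using Jackson's ObjectMapper; the method name follows the post, and the file handling is left as a placeholder:

```java
// Service sketch: deserialize the "user" part into the User POJO.
import java.util.List;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.stereotype.Service;
import org.springframework.web.multipart.MultipartFile;

@Service
public class UserService {

    private final ObjectMapper objectMapper = new ObjectMapper();

    public User getJson(String user, List<MultipartFile> files) throws Exception {
        // Turn the raw JSON string from the request part into the POJO
        User userJson = objectMapper.readValue(user, User.class);
        // ... store or process the uploaded files here as needed ...
        return userJson;
    }
}
```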

UserService:

[Screenshot: UserService]

Step 6: Testing with our API.

[Screenshot: running the application in Eclipse]

Step 7: Upload the Multipart file and POJO in Postman.

Now, test the response using Postman by adding the body parameters and values as below.

Endpoint: POST http://localhost:8080/upload

[Screenshot: Postman request]

Sample Request:

{
  "firstName": "Aadhi",
  "lastName": "Sakthi",
  "age": 10,
  "place": "Chennai"
}

Now let us attach the sample file which we need to pass in the request.

Response:

[Screenshot: Postman response]

Looks like all done and dusted…!!!

You may get the source code from here: PraseelaRadhakrishnan/Spring-Boot/Spring-Boot-Upload

Conclusion:

Hopefully this documentation will be useful for you.
