To receive messages from the exst-prod-integration-hub-outbound-eit.fifo SQS queue, follow the steps below.
- Open the SQS queue (exst-prod-integration-hub-outbound-eit.fifo) in the console and check the number of available messages.
- Go to the Lambda function (PROD_SQS_Receive_Message) and click Add trigger (the function code is listed below).
- Select SQS as the source.
- Select exst-prod-integration-hub-outbound-eit.fifo as the queue.
- Set Batch size to 1 so that each invocation receives a single message (one JSON record per log line).
- Set Maximum concurrency to 2.
- Click Add on the trigger page.
- Confirm the trigger on the SQS page under “Lambda triggers”.
- Once the Lambda execution is done, select the trigger on the SQS page and delete it; otherwise it will keep receiving messages and posting them to CloudWatch.
- Go to CloudWatch, search under Log groups for “/aws/lambda/PROD_SQS_Receive_Message”, and look at the Log streams.
- In the log group, click “View in Logs Insights” at the top right. Choose a custom timeline and click the date twice so the logs cover the full day, then paste the query below and click Run query.
- Once the results are displayed, verify the number of messages against the metrics, click “Export results”, and select “Download table (JSON)”.
- Rename the file with the date and message count, then upload it to the S3 bucket under the json folder.
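The trigger settings above can also be expressed programmatically. The sketch below only builds the parameter payload for Lambda's CreateEventSourceMapping API; the queue ARN (account ID and region) is a placeholder, and the boto3 call is shown in a comment for reference only.

```python
import json

# Hypothetical account ID and region -- substitute your own.
QUEUE_ARN = "arn:aws:sqs:us-east-1:123456789012:exst-prod-integration-hub-outbound-eit.fifo"

def trigger_params(queue_arn, function_name):
    """Build the CreateEventSourceMapping payload matching the steps above."""
    return {
        "EventSourceArn": queue_arn,
        "FunctionName": function_name,
        "BatchSize": 1,  # one message per invocation, so each log line is one JSON record
        "ScalingConfig": {"MaximumConcurrency": 2},
    }

params = trigger_params(QUEUE_ARN, "PROD_SQS_Receive_Message")
print(json.dumps(params, indent=2))
# With boto3 installed and AWS credentials configured, the trigger could be created with:
#   boto3.client("lambda").create_event_source_mapping(**params)
```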
Alternatively, you can follow the manual steps:
- Click the log stream, click Actions, and select “Download search results (CSV)” to export the results in CSV format.
- Open https://jsonlint.com/, validate the JSON, and send it to Laurie.
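If you prefer to validate the export locally instead of pasting it into jsonlint.com, Python's built-in json module performs the same check. This is a minimal sketch; the sample string below is a hypothetical two-record export, not real queue data.

```python
import json

def validate_json_text(text):
    """Return (ok, parsed_or_error); json.loads raises ValueError on invalid input."""
    try:
        return True, json.loads(text)
    except ValueError as err:
        return False, str(err)

# Example: a two-record export similar to "Download table (JSON)" output.
sample = '[{"messageId": "abc-123"}, {"messageId": "def-456"}]'
ok, parsed = validate_json_text(sample)
print(ok, len(parsed) if ok else parsed)  # → True 2
```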
SQS Queue : exst-prod-integration-hub-outbound-eit.fifo
Lambda Function: PROD_SQS_Receive_Message
import json

def lambda_handler(event, context):
    # Print each SQS record so it lands in the CloudWatch log stream.
    for record in event['Records']:
        print(record)
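To sanity-check the handler outside AWS, it can be invoked with a minimal SQS-shaped event. The record below is a hypothetical example carrying only a messageId and body; a real Lambda SQS event record also includes fields such as receiptHandle, attributes, and eventSourceARN.

```python
def lambda_handler(event, context):
    # Same handler as above: print each SQS record to the log.
    for record in event['Records']:
        print(record)

# Hypothetical single-record event, matching Batch size = 1.
sample_event = {"Records": [{"messageId": "abc-123", "body": "{\"order\": 42}"}]}
lambda_handler(sample_event, None)
```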
CloudWatch Log Group : /aws/lambda/PROD_SQS_Receive_Message
fields @message
| filter @message like "messageId"
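The Logs Insights query above keeps only log events whose @message contains "messageId". For reference, a local equivalent over a downloaded text/CSV export is a simple substring filter; the log lines below are hypothetical examples of what the stream contains.

```python
# Hypothetical log lines from the PROD_SQS_Receive_Message log stream.
log_lines = [
    "START RequestId: 11111111-2222-3333-4444-555555555555",
    "{'messageId': 'abc-123', 'body': '...'}",
    "END RequestId: 11111111-2222-3333-4444-555555555555",
]

# Equivalent of: fields @message | filter @message like "messageId"
matches = [line for line in log_lines if "messageId" in line]
print(len(matches))  # → 1
```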
S3 Bucket : exst-prod-ih-outbound-eit-sqs