Description
I am creating an intermediary service between my frontend and my backend using an AWS Lambda function.
The function's job is to take the payload from the UI and forward it to the backend. If the backend is live, it returns the 200
response (progress of the ML model, short polling). If the backend is not live (say the backend pod is restarting), it should use the previous JSON saved in a response.json
file and return that to the frontend until the pod has restarted and is live to serve again.
Here is my Lambda function.
My AWS Lambda code
import json
import os
import requests

def save_json_file_to_tmp(response_payload, filename="response.json"):
    tmp_dir = "/tmp"
    tmp_file_path = os.path.join(tmp_dir, filename)
    with open(tmp_file_path, "w") as f:
        json.dump(response_payload, f)

def read_json_file(filename="response.json"):
    tmp_dir = "/tmp"
    tmp_file_path = os.path.join(tmp_dir, filename)
    with open(tmp_file_path, "r") as f:
        # Load the JSON payload.
        response_payload = json.load(f)
    return response_payload
def lambda_handler(event, context):
    """Forwards a POST request to another service and returns the response to the caller.

    Args:
        event: The Lambda event.
        context: The Lambda context.

    Returns:
        A JSON response object.
    """
    print(event)
    print(type(event))
    print("=========================================================")
    # Get the request data from the event.
    try:
        request_data = json.loads(event["body"])
    except KeyError:
        request_data = event
    # Get the destination URL from the request data.
    dest_url = request_data["dest_url"]
    # Forward the POST request to the destination service.
    response = requests.post(dest_url, json=request_data)
    # Check the response status code.
    if response.status_code == 200:
        # Get the response payload.
        response_payload = json.loads(response.content)
        print("++++++++++++++++++++++++++++++++++")
        print(response_payload)
        print("++++++++++++++++++++++++++++++++++")
        # Save the response payload to a temporary file.
        save_json_file_to_tmp(response_payload)
        # Return the response payload to the caller.
        return {
            "statusCode": 200,
            "body": json.dumps(response_payload),
            "headers": {
                "Content-Type": "application/json"
            }
        }
    elif response.status_code == 502:
        # Read the previously saved response payload from the temporary file.
        response_payload = read_json_file()
        # Return the cached payload to the caller.
        return {
            "statusCode": 200,
            "body": json.dumps(response_payload),
            "headers": {
                "Content-Type": "application/json"
            }
        }
    else:
        # Return an error response.
        return {
            "statusCode": 500,
            "body": json.dumps({
                "error": "Unexpected response from destination service."
            }),
            "headers": {
                "Content-Type": "application/json"
            }
        }
Now, to test the Lambda, I created a test event checkLiveOrNot:
{
    "app": "my_app",
    "session_id": "3432428374827347234",
    "table": "progress",
    "user_id": "[email protected]",
    "flag": false,
    "project_name": "mybig1gbproject",
    "user_name": "john doe",
    "file_name": "",
    "fit_progress": 0,
    "model_fit": 0,
    "total_fit": 0,
    "model_calc": 0,
    "total_calc": 0,
    "calc_progress": 0,
    "model_arch": "lambda",
    "dest_url": "https://mybackendservice/api/maincore/progress"
}
and invoked it, and it returned the proper output in the Lambda test console.
Test Event Name
checkLiveOrNot
Response
{
    "statusCode": 200,
    "body": "{"_id": "34347tyi738483h8d73h", "session_id": "3432428374827347234", "data": 0, "model_no": 0, "table": "progress", "total_fit": 1, "total_calc": 1, "orc_fail": 0, "model_progress": 0.0, "current_model": 0, "model_progress": 100.0, "current_calc": 1}",
    "headers": {
        "Content-Type": "application/json"
    }
}
Function Logs
START RequestId: 0d820ce34rtry9-2e11-420dsdfa-961e-c4434515b18f Version: $LATEST
{'app': 'my-app', 'session_id': '34347tyi738483h8d73h', 'table': 'progress', 'user_id': '[email protected]', 's3_flag': False, 'project_name': 'checksampleforapitesting', 'user_name': 'danish.xavier', 'file_name': '', 'model_fit_progress': 0, 'current_model_fit': 0, 'total_model_fit': 0, 'current_model_calc': 0, 'total_model_calc': 0, 'model_calc_progress': 0, 'model_arch': 'eks', 'dest_url': 'https://promo-internal-promo-usv-demotemp-env.mmx.mco.solutions.iqvia.com/api/core/progress_lambda'}
<class 'dict'>
=========================================================
++++++++++++++++++++++++++++++++++
{'_id': '34347tyi738483h8d73h', 'session_id': '34347tyi738483h8d73h', 'data': 0, 'model_no': 0, 'table': 'progress', 'total_fit': 1, 'total_calc': 1, 'orc_fail': 0, 'model_progress': 0.0, 'current_model': 0, 'model_progress': 100.0, 'current_calc': 1}
++++++++++++++++++++++++++++++++++
END RequestId: 0d820ce9-2e11-420a-961e-c4434515b18f
REPORT RequestId: 0d82dfdf0ce9-2dfdfe11-42dfdf0a-961e-c44345dfdf15b18f Duration: 281.79 ms Billed Duration: 282 ms Memory Size: 128 MB Max Memory Used: 54 MB Init Duration: 324.60 ms
Request ID
0d820ce9-2e11-420a-961e-c4434515b18f
Now, when I send a POST request to this Lambda via Postman with the same payload, I get a 200 response but nothing is shown on the Postman screen. I also tried sending the request using Python.
Here is the code.
import json
import requests

# Create the request headers
headers = {
    'Content-Type': 'application/json'
}

# Create the request body
payload = {
    "app": "my_app",
    "session_id": "3432428374827347234",
    "table": "progress",
    "user_id": "[email protected]",
    "flag": False,
    "project_name": "mybig1gbproject",
    "user_name": "john doe",
    "file_name": "",
    "fit_progress": 0,
    "model_fit": 0,
    "total_fit": 0,
    "model_calc": 0,
    "total_calc": 0,
    "calc_progress": 0,
    "model_arch": "lambda",
    "dest_url": "https://mybackendservice/api/maincore/progress"
}

# Send the POST request
response = requests.post(
    'https://8973043wa7lrta.execute-api.us-east-1.amazonaws.com/dev/progress_lambda',
    headers=headers,
    json=payload
)

# Check the response status code
if response.status_code == 200:
    # The request was successful
    print('Request successful')
    # Get the response payload
    response_payload = response.content
    print(response_payload)
The output from the program:
Request successful
b''
2 Answers
Is there a possibility of this:
when response.status_code == 502, your code also returns statusCode == 200, but the response_payload is empty?
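You can make that failure mode visible by guarding the cache read. The sketch below is not your exact code: the read_cached_response and handle_502 names, the None fallback, and the 503 status are my choices to illustrate the idea that a fresh Lambda instance has no /tmp/response.json, so the 502 branch has nothing to return.

```python
import json
import os

def read_cached_response(filename="response.json"):
    """Return the cached payload from /tmp, or None if this Lambda
    instance has never saved one (e.g. right after a cold start)."""
    tmp_file_path = os.path.join("/tmp", filename)
    if not os.path.exists(tmp_file_path):
        return None
    with open(tmp_file_path, "r") as f:
        return json.load(f)

def handle_502():
    cached = read_cached_response()
    if cached is None:
        # Nothing cached on this instance: return an explicit error
        # instead of a 200 with an empty body.
        return {
            "statusCode": 503,
            "body": json.dumps({"error": "Backend restarting, no cached progress yet."}),
            "headers": {"Content-Type": "application/json"},
        }
    return {
        "statusCode": 200,
        "body": json.dumps(cached),
        "headers": {"Content-Type": "application/json"},
    }
```

Returning a distinct status for "no cache yet" lets the frontend tell "backend restarting, retry" apart from real progress data.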
If I understand your function correctly: you use the directory /tmp to save the result of a call to an external service (an API or something else). Later, when the Lambda function is called again and the external service returns 502 instead of 200, you want the Lambda function to re-use the previously saved response.
But this is not reliable using the temporary directory /tmp.
This directory can be used in a Lambda function, but only for saving something that is re-used within the same execution environment. /tmp is not shared between multiple instances of the same Lambda function, and a new instance starts with an empty /tmp.
The solution could be to not write/read using /tmp, but instead to use an external service to persist the response, like S3, or any other database or caching system.
Check out the AWS Lambda FAQ:
And also: