
I have the following Python scripts:

script1.py

script2.py

script3.py

script4.py

When I run these scripts from my local machine, they create log files, and I then send those log files by email manually.

The first script creates one log file when I run it, and the second script creates another log file.

The second script has logic that creates a JSON file from the database data (which I query through DBeaver) and uploads that JSON file to the HubSpot API using a token and the API details.

Now I want to automate these scripts in AWS. I already have the Python code; how can I deploy and automate the scripts so that they still produce the log files and send them by email?

To summarize, the second Python script connects to the database, extracts the users' data, converts it to JSON, and uploads that JSON file to the HubSpot API.

In AWS, how would it create that JSON file and upload it every week?

Can anyone please help me with this? I'm very new to AWS.

Here is a sample of the code from the second script:

    import datetime
    import json
    import logging

    import psycopg2
    import requests

    logger = logging.getLogger(__name__)
    logger.setLevel(logging.DEBUG)


    class LOAD:
        def DB_conn(self):
            # Connect to the PostgreSQL database (credentials masked)
            self.connection = psycopg2.connect(
                host='', database="!!", user="@@", password="**", port="#",
            )
            # Send this run's log output to a timestamped log file
            log_filename = datetime.datetime.now().strftime('load_%Y-%m-%d--%H-%M-%S.log')
            fh = logging.FileHandler(log_filename)
            fh.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
            logger.addHandler(fh)

        def Data(self):
            try:
                self.DB_conn()
                query = 'select ******'
                cursor = self.connection.cursor()
                cursor.execute(query)
                records = cursor.fetchall()
                number_rows = f"Total number of rows in table: {cursor.rowcount}\n"
                logger.info(number_rows)
                # --------------------
                # (the logic that turns `records` into `json_stri` was elided in the question)
                # --------------------
                logger.info('converting dict to json data\n')
                now = datetime.datetime.now()
                filename = now.strftime('User_data_file_%Y-%m-%d--%H-%M-%S.json')
                with open(filename, 'w+') as f:
                    json.dump(json_stri, f, indent=4, default=str)
                logger.info('JSON file successfully created!\n')
                # Hand the freshly written file over to the HubSpot upload step
                self.upload(filename)
            except Exception as e:
                print("\n--- ERROR OCCURRED ---\n\n", e, "\n\n--- QUITTING TASK ---\n")
                logger.error(f'FAILED TO CREATE JSON\n {e} \n')

        @staticmethod
        def chunk_list(list_to_chunk, number_of_list_items):
            """Yield successive chunks of number_of_list_items items from the list."""
            for i in range(0, len(list_to_chunk), number_of_list_items):
                yield list_to_chunk[i:i + number_of_list_items]

        def upload(self, filename):
            try:
                url = 'https://api.hubapi.com/contacts/v1/contact/batch'
                headers = {
                    'Authorization': "Bearer p*****-bfa4-",
                    'Accept': 'application/json',
                    'Content-Type': 'application/json',
                }
                with open(filename, 'r') as run:
                    json_data = json.load(run)
                r = requests.post(
                    url,
                    data=json.dumps(json_data),
                    headers=headers,
                    verify=False,
                    timeout=(10, 15),
                    stream=True,
                )
                if r.status_code == 200:  # status_code is an int, not a string
                    logger.info("File uploaded successfully")
            except Exception as e:
                logger.error(f'FAILED TO LOAD JSON FILE\n ERROR:> {e} \n')
                logger.error(f'ERROR OCCURRED----\n\n ERROR:> {e} \n\n--- QUITTING TASK --- \n')


    if __name__ == '__main__':
        BATCH = LOAD()
        BATCH.Data()

So this is the logic of the second script. Can anyone please suggest some ideas?

2 Answers


  1. First, if your tasks take no longer than 15 minutes and need no more than 10 GB of memory, consider moving those code parts to Lambda functions. If your tasks take longer than that, you can consider AWS Batch/EC2/ECS, which can handle long-running tasks.

    For the automation part, take a look at EventBridge. There you can create scheduled (cron-based) or event-based triggers to start your workflow; a minimal sketch is shown at the end of this answer.

    Finally, if you have more than one Lambda function and you want to orchestrate them, take a look at Step Functions, which is relatively easy to use.
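
    As a rough illustration of the Lambda + EventBridge combination, here is a minimal sketch of the Lambda side, assuming the poster's script2.py (with its LOAD class) plus psycopg2 and requests are packaged into the function's deployment bundle or a layer; the handler module name and the weekly cron expression below are only example values.

        # lambda_function.py -- hypothetical wrapper around the question's LOAD class
        import logging

        from script2 import LOAD  # assumes script2.py ships inside the deployment package

        logger = logging.getLogger()
        logger.setLevel(logging.INFO)


        def lambda_handler(event, context):
            """Entry point invoked by an EventBridge schedule."""
            batch = LOAD()
            batch.Data()  # extract from Postgres, build the JSON file, push it to HubSpot
            return {"status": "done"}

        # Example EventBridge schedule expression for "every Monday at 09:00 UTC":
        #   cron(0 9 ? * MON *)
        # Create a rule (or an EventBridge Scheduler schedule) with that expression
        # and set this Lambda function as its target.

    One caveat: inside Lambda the filesystem is read-only except for /tmp, so the log file and JSON file paths in the script would need to point there (or the data kept in memory).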

  2. My solution consists of two parts:

    1. Get the logs (optionally) and the JSON files into an S3 bucket.
    2. Create an SQS queue, add an s3:ObjectCreated notification, and subscribe a Lambda function to the SQS queue (there is a good article on that). Finally, send the files from S3 to email in that function.

    Explanation:

    1. There is no need to send all the logs to S3 unless it's required. Any Lambda function writes its log streams to CloudWatch Logs, where you can set up a Subscription Filter and send an email notification, say, when an ERROR occurs (a related sketch appears at the end of this answer). It's a common pattern.
    2. Sending the JSON files from your code to an S3 bucket is a relatively easy task using the AWS SDK (boto3); see the sketch right after this list.
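
    As a sketch of point 2, assuming boto3 is available and using a placeholder bucket name (my-hubspot-exports) and key prefix:

        import boto3  # AWS SDK for Python

        s3 = boto3.client("s3")


        def upload_to_s3(local_path, bucket="my-hubspot-exports", key=None):
            """Upload a local file (e.g. the generated JSON) to an S3 bucket."""
            s3.upload_file(local_path, bucket, key or local_path)

        # e.g. right after the JSON file is written in the script:
        # upload_to_s3(filename, key=f"hubspot/{filename}")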

    I hope that helps.
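
    And as one concrete way to get the error e-mail from point 1, here is a sketch that uses a metric filter plus a CloudWatch alarm and an SNS e-mail subscription (a simpler route to an e-mail than a subscription filter); the log group name, metric names, and address are placeholders:

        import boto3

        logs = boto3.client("logs")
        cloudwatch = boto3.client("cloudwatch")
        sns = boto3.client("sns")

        # 1) Count log events containing "ERROR" in the function's log group
        logs.put_metric_filter(
            logGroupName="/aws/lambda/hubspot-export",  # placeholder log group
            filterName="error-count",
            filterPattern="ERROR",
            metricTransformations=[{
                "metricName": "HubspotExportErrors",
                "metricNamespace": "Custom/HubspotExport",
                "metricValue": "1",
            }],
        )

        # 2) E-mail subscription that will receive the alarm notifications
        topic_arn = sns.create_topic(Name="hubspot-export-alerts")["TopicArn"]
        sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="you@example.com")

        # 3) Alarm that fires whenever at least one ERROR is logged in a 5-minute window
        cloudwatch.put_metric_alarm(
            AlarmName="hubspot-export-errors",
            Namespace="Custom/HubspotExport",
            MetricName="HubspotExportErrors",
            Statistic="Sum",
            Period=300,
            EvaluationPeriods=1,
            Threshold=0,
            ComparisonOperator="GreaterThanThreshold",
            TreatMissingData="notBreaching",
            AlarmActions=[topic_arn],
        )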
