
I created a model with Facebook Prophet, and I am now wondering what the “best” way is to access its predictions from an online web application (Django).

The requirements are that I have to train/update my model on a weekly basis with data from my Django application (PostgreSQL). The predictions will be saved, and I want to be able to access this data from my Django application.

After looking into Google Cloud and AWS, I couldn’t find any solution that hosts my model in a way that lets me simply access predictions via an API.

My best idea/approach to solve that right now:

1) Build a Flask application that trains my models on a weekly basis. Predictions are saved in a PostgreSQL database. The training data will be a weekly CSV export from my Django web application.

2) Create an API in my Flask application that serves the predictions from the database.

3) From my Django application, I can call the API and access the data whenever needed (a rough sketch of steps 2 and 3 is below).
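
A rough sketch of what steps 2) and 3) could look like. The `forecasts` table, the connection string, the `/predictions/<item_id>` endpoint, and the Flask host are placeholders I made up for illustration, not part of my actual setup:

    # Flask side: expose stored predictions from PostgreSQL
    import psycopg2
    from flask import Flask, jsonify

    app = Flask(__name__)
    DB_DSN = "dbname=mydb user=myuser password=secret host=localhost"  # placeholder

    @app.route("/predictions/<item_id>")
    def predictions(item_id):
        conn = psycopg2.connect(DB_DSN)
        try:
            with conn.cursor() as cur:
                # forecasts is a placeholder table holding the weekly Prophet output
                cur.execute(
                    "SELECT ds, yhat, yhat_lower, yhat_upper "
                    "FROM forecasts WHERE item_id = %s ORDER BY ds",
                    (item_id,),
                )
                rows = cur.fetchall()
        finally:
            conn.close()
        return jsonify([
            {"ds": str(ds), "yhat": y, "yhat_lower": lo, "yhat_upper": hi}
            for ds, y, lo, hi in rows
        ])

    # Django side: call the Flask API whenever predictions are needed
    import requests

    def get_predictions(item_id):
        resp = requests.get(f"http://flask-host:5000/predictions/{item_id}", timeout=10)
        resp.raise_for_status()
        return resp.json()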

I am pretty sure this approach is clunky and probably not how it is usually done. Do you have any feedback or ideas on how to solve it better? In short:

1) Predict data from a PostgreSQL database.

2) Serve predictions in a Django web application.

2 Answers


  1. The simplest way to serve pre-calculated forecast values from Prophet is to publish CSV files to S3 or another file server. You can refresh your models every few days and write the forecast output to S3:

    import boto3
    from io import StringIO
    
    # bucket_name is your own S3 bucket name
    DESTINATION = bucket_name
    
    def write_dataframe_to_csv_on_s3(dataframe, filename):
        """ Write a dataframe to a CSV on S3 """
        print("Writing {} records to {}".format(len(dataframe), filename))
        # Create an in-memory buffer
        csv_buffer = StringIO()
        # Write the dataframe to the buffer as CSV
        dataframe.to_csv(csv_buffer, sep=",", index=False)
        # Create the S3 object
        s3_resource = boto3.resource("s3")
        # Write the buffer contents to the S3 object
        s3_resource.Object(DESTINATION, filename).put(Body=csv_buffer.getvalue())
    
    # forecast is the dataframe returned by Prophet's model.predict()
    results = forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper']].copy()
    
    # output and file_name are your own S3 prefix and file name
    write_dataframe_to_csv_on_s3(results, output + file_name + ".csv")
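
    If you then want the data in the Django application, one option is to read the same CSV back from S3 with boto3 (the bucket and key names here are placeholders):

    import boto3
    import pandas as pd
    from io import StringIO

    def read_forecast_from_s3(bucket, key):
        """ Read a forecast CSV written by the snippet above back into a dataframe """
        s3_resource = boto3.resource("s3")
        body = s3_resource.Object(bucket, key).get()["Body"].read().decode("utf-8")
        return pd.read_csv(StringIO(body), parse_dates=["ds"])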
    
  2. One of the reasons I visited this question was that I wasn’t sure which way to go. The answer above looks like a great alternative.
    However, I didn’t have many constraints on my Django application, and I was looking for a simpler approach for someone with a use case similar to mine.

    My solution:

    • I have a Django project, a Django app that serves my website, and a Django app for the Prophet model.
    • The Prophet model is re-trained exactly once each day (after some condition is met).
    • Each time the model is trained, it predicts for the new data and saves the predictions to a CSV file (these could also be stored in a database). It also stores the trained model using pickle (a rough sketch follows the project hierarchy below).
    • Now I have access to the trained model and the pre-computed predictions by importing the Django app wherever I need them.

    The project hierarchy:

    project/
        project/
        django-app-for-website/
        django-app-for-prophet/
        manage.py
        requirements.txt
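
    A rough sketch of the daily retrain step inside django-app-for-prophet; the file paths, function names, and forecast horizon are placeholders for illustration:

    import pickle
    import pandas as pd
    from prophet import Prophet  # or: from fbprophet import Prophet, depending on the installed version

    MODEL_PATH = "django-app-for-prophet/artifacts/model.pkl"        # placeholder
    FORECAST_PATH = "django-app-for-prophet/artifacts/forecast.csv"  # placeholder

    def retrain_and_predict(history, periods=30):
        """ Fit Prophet on the latest data, then store the model and its forecast """
        model = Prophet()
        model.fit(history)  # history needs 'ds' and 'y' columns
        future = model.make_future_dataframe(periods=periods)
        forecast = model.predict(future)
        # Persist the fitted model with pickle
        with open(MODEL_PATH, "wb") as f:
            pickle.dump(model, f)
        # Persist the predictions as a CSV (these could also go into the database)
        forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper']].to_csv(FORECAST_PATH, index=False)
        return forecast

    # Anywhere else in the project (e.g. a view in django-app-for-website):
    def load_forecast():
        return pd.read_csv(FORECAST_PATH, parse_dates=["ds"])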
    

    The performance of my project isn’t affected much by this setup, and performance isn’t my priority for now. If it is yours, I wouldn’t recommend this solution.

    If you’re looking for the simplest way to serve a Prophet model, this is what I could come up with. Just another possible solution.
