Running Ollama in a Dockerfile for a Python LangChain Application
Background Info
I have a Python application that uses LangChain and Ollama. Running it locally works perfectly fine because I have the Ollama client running on my machine. What I want to do is host this application on a serverless…
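One common way to remove the dependency on a locally running Ollama client is to bundle the Ollama server into the application image itself. The sketch below is a minimal, hypothetical Dockerfile illustrating that approach; the file names (`requirements.txt`, `main.py`) and the model name (`llama3`) are assumptions, not taken from the question:

```dockerfile
# Hypothetical sketch: run the Ollama server inside the same container
# as the Python LangChain app.
FROM python:3.11-slim

# Install Ollama via its official install script
RUN apt-get update && apt-get install -y curl \
    && curl -fsSL https://ollama.com/install.sh | sh

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# Start the Ollama server in the background, pull a model,
# then launch the application (names are placeholders).
CMD ollama serve & sleep 5 && ollama pull llama3 && python main.py
```

Note that running two processes in one container like this is a pragmatic shortcut; on many serverless platforms a separate Ollama service (with the app pointing at it via the `base_url` setting of LangChain's Ollama integration) may be a cleaner fit.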