
I am trying to get values out of OpenAI's JSON mode for a project. I have tried the json package, built-in parsing, etc., but I get:

TypeError: the JSON object must be str, bytes or bytearray, not ChatCompletionMessage

json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

AttributeError: 'str' object has no attribute 'text'
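
For context on the first error: json.loads needs the message's .content string, not the ChatCompletionMessage object itself. A minimal standalone sketch (using a stand-in dataclass in place of the real SDK object):

```python
import json
from dataclasses import dataclass

# Stand-in for the SDK's ChatCompletionMessage (for illustration only)
@dataclass
class FakeMessage:
    content: str

message = FakeMessage(content='{"city": "Washington"}')

# json.loads(message) would raise:
#   TypeError: the JSON object must be str, bytes or bytearray, not FakeMessage
data = json.loads(message.content)  # pass the .content string instead
print(data["city"])  # Washington
```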

3 Answers


  1. According to the documentation, you can set response_format to { "type": "json_object" } to enable JSON mode. Below is an example:

    response = openai.Completion.create(
      model="model_name",
      prompt="prompt_text",
      response_format={ "type": "json_object" }
    )
    
  2. The currently provided answers will not give you responses in JSON consistently.

    There are three things you need to do if you want to get responses in JSON consistently:

    1. Use the gpt-4-1106-preview or gpt-3.5-turbo-1106 models.
    2. Set the system message where you explicitly state that you want to get responses in JSON format.
    3. Set the response_format parameter to json_object.

    Code:

    import os

    from openai import OpenAI

    client = OpenAI(
      api_key=os.getenv("OPENAI_API_KEY"),
    )
    
    completion = client.chat.completions.create(
      model="gpt-4-1106-preview",
      messages=[
        {"role": "system", "content": "You are a helpful assistant. Your response should be in JSON format."},
        {"role": "user", "content": "Hello!"}
      ],
      response_format={"type": "json_object"}
    )
    
    print(completion.choices[0].message.content)
    

    You can check if the response is valid JSON with the following code (use it in the same Python script as the code above):

    import json
    
    def is_json(myjson):
      try:
        json.loads(myjson)
      except ValueError:
        return False
      return True
    
    print(is_json(completion.choices[0].message.content))
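
    Once the content validates, json.loads gives you a regular dict you can index to get values (a sketch with a sample string standing in for a live completion):

```python
import json

# Sample string standing in for completion.choices[0].message.content
content = '{"greeting": "Hello!", "language": "en"}'

data = json.loads(content)  # dict, not a string
print(data["greeting"])     # Hello!
print(data["language"])     # en
```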
    
  3. If you are happy to go with a Completions model like gpt-3.5-turbo-instruct rather than a Chat Completions model like GPT-4, I believe this gives a JSON response, although error responses may not be in JSON format:

    import os
    import requests
    
    # API_KEY must be set, e.g. from an environment variable
    API_KEY = os.getenv("OPENAI_API_KEY")
    
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}"
    }
    payload = {
        "model": "gpt-3.5-turbo-instruct",
        "prompt": "capital city of US",
        "max_tokens": 200,
    }
    
    response = requests.post("https://api.openai.com/v1/completions", headers=headers, json=payload)
    response = response.json()
    
    reply_text = response['choices'][0]['text']
    
    print(reply_text)
    

    But I still can’t get it to accept a specific response_format specification.
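
    A workaround (an assumption on my part, not documented behavior of this endpoint) is to ask for JSON in the prompt itself and guard the parse, since the legacy completions endpoint may still return plain prose:

```python
import json

# reply_text would come from response['choices'][0]['text'] above;
# a sample value stands in here for illustration
reply_text = '{"capital": "Washington, D.C."}'

try:
    data = json.loads(reply_text)
    print(data["capital"])
except json.JSONDecodeError:
    # The model ignored the JSON instruction; fall back to raw text
    print(reply_text)
```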
