I am trying to get values from OpenAI JSON mode for a project. I tried the json package, built-in parsing, etc., but I get errors like:

TypeError: the JSON object must be str, bytes or bytearray, not ChatCompletionMessage
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
AttributeError: 'str' object has no attribute 'text'
3 Answers
According to the documentation, you can set the response_format parameter to { "type": "json_object" } to enable JSON mode.

The currently provided answers will not give you responses in JSON consistently.
There are three things you need to do if you want to get responses in JSON consistently:

1. Use one of the newer models (i.e., gpt-4-1106-preview or gpt-3.5-turbo-1106).
2. Set the response_format parameter to { "type": "json_object" }.
3. Instruct the model in the system or user message to respond in JSON (otherwise the API returns an error).

Code:
You can check if the response is valid JSON with the following code (use it in the same Python script as the code above):
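One way to write such a check (is_valid_json is an illustrative helper name, not part of the SDK; the commented line shows how it would be applied to the chat response):

```python
import json


def is_valid_json(text: str) -> bool:
    """Return True if text parses as JSON, False otherwise."""
    try:
        json.loads(text)
    except (ValueError, TypeError):
        return False
    return True


# In the script above you would call:
# print(is_valid_json(response.choices[0].message.content))
print(is_valid_json('{"winner": "Los Angeles Dodgers"}'))  # True
print(is_valid_json("not json"))  # False
```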
If you are happy to go with a Completions model like gpt-3.5-turbo-instruct rather than a Chat Completions model like GPT-4, I believe this gives a JSON response, though error responses may not be in JSON format:
But I still can’t get it to accept a specific response_format specification.