
From the main page and the official documentation, OpenAI clearly states that we can use the response_format parameter as follows:

[screenshot from the OpenAI documentation showing the response_format parameter]

What I’m trying in Postman is the following:

{
    "instructions": "You are a personal math tutor. When asked a question, write and run Python code to answer the question.",
    "name": "Math Tutor",
    "tools": [{"type": "code_interpreter"}],
    "model": "GPT-4 Turbo",
    "type": "json_schema"
 }

But I got the following error:

{
    "error": {
        "message": "Unknown parameter: 'type'.",
        "type": "invalid_request_error",
        "param": "type",
        "code": "unknown_parameter"
    }
}

Some popular LLMs I asked suggested trying the following in Postman instead:

{
    "instructions": "You are a personal math tutor. When asked a question, write and run Python code to answer the question.",
    "name": "Math_Tutor",
    "tools": [{"type": "code_interpreter"}],
    "model": "gpt-4-turbo",
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "strict": true,
            "name": "MathTutorResponse",
            "schema": {
                "type": "object",
                "properties": {
                    "result": {
                        "type": "number"
                    },
                    "explanation": {
                        "type": "string"
                    }
                },
                "required": ["result", "explanation"],
                "additionalProperties": false
            }
        }
    }
}

But I got the following error:

{
    "error": {
        "message": "Invalid parameter: 'response_format' of type 'json_schema' is not supported with model version `gpt-4-turbo`.",
        "type": "invalid_request_error",
        "param": "response_format",
        "code": null
    }
}

Am I missing anything here? Please provide a snippet showing how to configure the assistant output to be in JSON format.

2 Answers


  1. You have confused two things:

    As stated in the official OpenAI documentation:

    Structured Outputs is the evolution of JSON mode. While both ensure
    valid JSON is produced, only Structured Outputs ensures schema
    adherence.
    Both Structured Outputs and JSON mode are supported in the
    Chat Completions API, Assistants API, Fine-tuning API and Batch API.

    We recommend always using Structured Outputs instead of JSON mode when
    possible.

    However, Structured Outputs with response_format: {type: "json_schema", ...} is only supported with the gpt-4o-mini,
    gpt-4o-mini-2024-07-18, and gpt-4o-2024-08-06 model snapshots and
    later.

    While both use the response_format parameter, you enable them differently. See the comparison below for how to use each one properly:

    Structured Outputs
        Outputs valid JSON: Yes
        Adheres to schema: Yes (see supported schemas)
        Compatible models: gpt-4o-mini, gpt-4o-2024-08-06 and later
        Enabling: response_format: { type: "json_schema", json_schema: {"strict": true, "schema": ...} }

    JSON mode
        Outputs valid JSON: Yes
        Adheres to schema: No
        Compatible models: gpt-3.5-turbo, gpt-4-*, gpt-4o-*
        Enabling: response_format: { type: "json_object" }
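    Concretely, keep your second request body and only change the model to a supported snapshot. A minimal sketch follows; the endpoint and header are my assumptions for the Assistants API (POST https://api.openai.com/v1/assistants with the OpenAI-Beta: assistants=v2 header), and the schema name MathTutorResponse is simply the one from your own example:

    {
        "instructions": "You are a personal math tutor. When asked a question, write and run Python code to answer the question.",
        "name": "Math_Tutor",
        "tools": [{"type": "code_interpreter"}],
        "model": "gpt-4o-2024-08-06",
        "response_format": {
            "type": "json_schema",
            "json_schema": {
                "strict": true,
                "name": "MathTutorResponse",
                "schema": {
                    "type": "object",
                    "properties": {
                        "result": {"type": "number"},
                        "explanation": {"type": "string"}
                    },
                    "required": ["result", "explanation"],
                    "additionalProperties": false
                }
            }
        }
    }

    If you have to stay on gpt-4-turbo, fall back to JSON mode instead: set "response_format": {"type": "json_object"} and also instruct the model in your prompt to produce JSON, keeping in mind that JSON mode does not enforce your schema.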
  2. You may also want to check the version of the OpenAI API you are targeting; for me, an older version was what blocked Structured Outputs.

    I changed the environment variable from 2024-03-01-preview to 2024-08-01-preview, and it worked.
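    Version strings in that date-preview format look like Azure OpenAI api-version values, so this tip most likely applies if you are calling an Azure OpenAI resource. A rough sketch, assuming a hypothetical resource named YOUR-RESOURCE and a client that reads the version from an environment variable (the Python SDK's AzureOpenAI client reads OPENAI_API_VERSION, for example):

    # environment variable picked up by the Azure OpenAI client
    OPENAI_API_VERSION=2024-08-01-preview

    # or pass it directly as a query parameter in Postman
    POST https://YOUR-RESOURCE.openai.azure.com/openai/assistants?api-version=2024-08-01-preview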
