
I’ve been trying to integrate gpt-3.5-turbo into my Flutter app while maintaining the chat history. I used FlutterFlow to generate the boilerplate code and then downloaded it to edit further. The model is integrated and the chat history works, but I can’t figure out how to add the "system" message to the prompt.

Here’s the API call code:

class OpenAIChatGPTGroup {
  static String baseUrl = 'https://api.openai.com/v1';
  static Map<String, String> headers = {
    'Content-Type': 'application/json',
  };
  static SendFullPromptCall sendFullPromptCall = SendFullPromptCall();
}

class SendFullPromptCall {
  Future<ApiCallResponse> call({
    String? apiKey = 'sk-xxxxxxxxxx',
    dynamic? promptJson,
  }) {
    final prompt = _serializeJson(promptJson);
    final body = '''
{
  "messages": ${prompt},
  "temperature": 0.8,
  "model": "gpt-3.5-turbo"
}''';
    return ApiManager.instance.makeApiCall(
      callName: 'Send Full Prompt',
      apiUrl: '${OpenAIChatGPTGroup.baseUrl}/chat/completions',
      callType: ApiCallType.POST,
      headers: {
        ...OpenAIChatGPTGroup.headers,
        'Authorization': 'Bearer $apiKey',
      },
      params: {},
      body: body,
      bodyType: BodyType.JSON,
      returnBody: true,
      encodeBodyUtf8: true,
      decodeUtf8: true,
      cache: false,
    );
  }

  dynamic createdTimestamp(dynamic response) => getJsonField(
        response,
        r'''$.created''',
      );
  dynamic role(dynamic response) => getJsonField(
        response,
        r'''$.choices[:].message.role''',
      );
  dynamic content(dynamic response) => getJsonField(
        response,
        r'''$.choices[:].message.content''',
      );
}

Here’s the _serializeJson() function:

String _serializeJson(dynamic jsonVar, [bool isList = false]) {
  jsonVar ??= (isList ? [] : {});
  try {
    return json.encode(jsonVar);
  } catch (_) {
    return isList ? '[]' : '{}';
  }
}
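
As a sanity check (this snippet isn’t part of the generated code), a flat list of message maps just serializes to a plain JSON array:

// Quick sanity check: a flat history of message maps encodes to a JSON array.
final history = [
  {"role": "user", "content": "Hello!"},
];
print(_serializeJson(history));
// prints: [{"role":"user","content":"Hello!"}]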

Here’s the code in the onPressed() function of the submit button:

setState(() {
  _model.chatHistory = functions.saveChatHistory(
      _model.chatHistory,
      functions.convertToJSON(_model.textController.text));
});
_model.chatGPTResponse = await OpenAIChatGPTGroup.sendFullPromptCall.call(
  apiKey: 'sk-xxxxxxxxxx',
  promptJson: _model.chatHistory,
);

Here’s the saveChatHistory() function:

dynamic saveChatHistory(
  dynamic chatHistory,
  dynamic newChat,
) {
  // If chatHistory is already a list, append newChat; otherwise start a new list containing newChat
  if (chatHistory is List) {
    chatHistory.add(newChat);
    return chatHistory;
  } else {
    return [newChat];
  }
}

Here’s the convertToJSON() function:

dynamic convertToJSON(String prompt) {
  // Take the prompt and return a single message object: {"role": "user", "content": prompt}
  return json.decode('{"role": "user", "content": "$prompt"}');
}

I’ve tried adding the "system" message in the convertToJSON() function like this:

dynamic convertToJSON(String prompt) {
  // Return a list with a system message followed by the user message
  return json.decode('[{"role": "system", "content": "system message"}, {"role": "user", "content": "$prompt"}]');
}

but this returns a 400 (Bad Request) error.
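
If I understand the flow correctly, saveChatHistory() then appends that whole list to chatHistory as a single element, so the serialized request body ends up with a nested messages array, roughly:

{
  "messages": [[{"role": "system", "content": "system message"}, {"role": "user", "content": "..."}]],
  "temperature": 0.8,
  "model": "gpt-3.5-turbo"
}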

2 Answers


  1. I think something is going wrong when you serialize the prompt in your _serializeJson() method.

    Even if your prompt has only one message, the messages field has to be a list. I would try hardcoding the request body to simulate a conversation between the user and the system, to check whether there’s any problem with the HTTP request itself.
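
    For example (a quick throwaway test, not production code), inside SendFullPromptCall.call() you could temporarily replace the serialized prompt with a fixed conversation:

    // Temporary test: skip _serializeJson and send a known-good conversation
    // to verify the HTTP request itself works.
    final prompt = '''
    [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Hello!"}
    ]''';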

    If the request works with a hardcoded body, you can be sure the problem is in your serialization code.

  2. The OpenAI docs have many examples of how to use their APIs. In your case, the messages parameter takes a list of objects; these objects are the message entities that make up the conversation.

    You can add an initial system message that tells the model how it should behave. For example:

       {
          "model": "gpt-3.5-turbo",
          "messages": [
            {"role": "system", "content": "You are a helpful assistant."}, 
            {"role": "user", "content": "Hello!"},
            ...
          ]
        }
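
    One way to get that shape with your existing helpers (just a sketch, and the system text below is a placeholder) is to keep convertToJSON() returning a single message object and have saveChatHistory() add the system message when the conversation starts:

       // Sketch: seed the history with a system message the first time it is created.
       dynamic saveChatHistory(
         dynamic chatHistory,
         dynamic newChat,
       ) {
         // Existing conversation: just append the new user message.
         if (chatHistory is List) {
           chatHistory.add(newChat);
           return chatHistory;
         }
         // First message: open with the system message, then the user message.
         return [
           {"role": "system", "content": "You are a helpful assistant."},
           newChat,
         ];
       }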
    