chat method Null safety

Future<String> chat(
  String prompt,
  {bool addToHistory = true}
)

Sends a prompt and returns the complete response as a single String.

This method accumulates all streaming chunks into a single buffer and returns the final result, which is useful when you don't need to display tokens in real time. When addToHistory is true (the default), both the prompt and the response are appended to the conversation history.

final response = await ai.chat('Tell me a joke');
print(response);
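Passing addToHistory: false might look like the following sketch, assuming the same ai instance as above; it is useful for one-off queries that should not influence later turns of the conversation.

```dart
// Hypothetical one-off query: neither this prompt nor its response
// is appended to the conversation history, so subsequent calls to
// chat() behave as if this exchange never happened.
final summary = await ai.chat(
  'Summarize the conversation so far in one sentence.',
  addToHistory: false,
);
print(summary);
```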

Implementation

Future<String> chat(String prompt, {bool addToHistory = true}) async {
  if (addToHistory) {
    _conversationHistory.add({'role': 'user', 'content': prompt});
  }

  final buffer = StringBuffer();
  await for (final chunk in _performStream(prompt)) {
    buffer.write(chunk);
  }

  final response = buffer.toString();

  if (addToHistory) {
    _conversationHistory.add({'role': 'assistant', 'content': response});
  }

  return response;
}
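The accumulation pattern above can be sketched in isolation. This is a self-contained illustration, not the library's code: fakeStream stands in for the private _performStream, and collect mirrors the StringBuffer loop that chat uses to join streamed chunks into one String.

```dart
import 'dart:async';

// Stand-in for _performStream: emits a response in small chunks,
// the way a streaming LLM API delivers tokens.
Stream<String> fakeStream() async* {
  yield 'Why did the ';
  yield 'developer go broke? ';
  yield 'Because he used up all his cache.';
}

// Same technique as chat(): consume the stream with await-for and
// accumulate every chunk into a StringBuffer.
Future<String> collect(Stream<String> chunks) async {
  final buffer = StringBuffer();
  await for (final chunk in chunks) {
    buffer.write(chunk);
  }
  return buffer.toString();
}

Future<void> main() async {
  final full = await collect(fakeStream());
  print(full);
}
```

Because await-for suspends until the stream is done, collect only completes once every chunk has arrived, which is exactly why chat returns the complete response rather than partial output.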