noc_llm_dart library
NocLLM Dart - Lightweight, asynchronous Dart & Flutter library for LLM interaction.
Supports Cloud APIs (OpenAI, Gemini, Groq, Sumopod) and Local APIs (LM Studio, Ollama) with SSE streaming and auto-provider detection.
Quick Start
```dart
import 'dart:io'; // for stdout

import 'package:noc_llm_dart/noc_llm_dart.dart';

void main() async {
  final ai = NocAI(
    apiKey: 'YOUR_API_KEY',
    baseUrl: 'https://api.openai.com/v1',
    model: 'gpt-3.5-turbo',
  );

  // Streaming (real-time chunks)
  await for (final chunk in ai.stream('Tell me a short fable')) {
    stdout.write(chunk);
  }

  // Non-streaming (full response)
  final response = await ai.chat('Tell me a joke');
  print(response);

  ai.dispose();
}
```
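Because the provider is auto-detected from the base URL, the same client can target a local server instead of a cloud API. A hedged sketch pointing at Ollama's default OpenAI-compatible endpoint (`http://localhost:11434/v1`); the model name is whatever you have pulled locally, and the assumption that local servers ignore the `apiKey` value is mine, not the library's documentation:

```dart
import 'dart:io';

import 'package:noc_llm_dart/noc_llm_dart.dart';

void main() async {
  final local = NocAI(
    // Local servers typically ignore the key (assumption - verify
    // against your server's configuration).
    apiKey: 'not-needed-locally',
    // Ollama's default OpenAI-compatible endpoint; NocProvider is
    // auto-detected from this URL.
    baseUrl: 'http://localhost:11434/v1',
    model: 'llama3', // any locally available model
  );

  await for (final chunk in local.stream('Summarize SSE in one line')) {
    stdout.write(chunk);
  }

  local.dispose();
}
```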
Classes
- `NocAI`: The main NocLLM AI client for Dart & Flutter.
- `NocAIConfig`: Configuration for a `NocAI` instance.
Enums
- `NocProvider`: Provider type, auto-detected from the base URL.
Exceptions / Errors
- `NocAuthException`: Thrown when the API returns an authentication error.
- `NocConnectionException`: Thrown when the TCP/HTTP connection fails.
- `NocLLMException`: Base exception for all NocLLM errors.
- `NocParseException`: Thrown when the stream response cannot be parsed.
- `NocRateLimitException`: Thrown when the API returns a rate-limit error.
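Since `NocLLMException` is the base of every error above, a call site can catch the specific failures first and fall back to the base type. A minimal sketch (the handler messages are illustrative, not library output):

```dart
import 'dart:io';

import 'package:noc_llm_dart/noc_llm_dart.dart';

void main() async {
  final ai = NocAI(
    apiKey: 'YOUR_API_KEY',
    baseUrl: 'https://api.openai.com/v1',
    model: 'gpt-3.5-turbo',
  );
  try {
    print(await ai.chat('Hello'));
  } on NocAuthException {
    stderr.writeln('Authentication failed - check apiKey.');
  } on NocRateLimitException {
    stderr.writeln('Rate limited - back off and retry later.');
  } on NocConnectionException {
    stderr.writeln('Could not reach the endpoint - is the server up?');
  } on NocLLMException catch (e) {
    // Base type also covers NocParseException and any future errors.
    stderr.writeln('LLM error: $e');
  } finally {
    ai.dispose();
  }
}
```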