What do you think of my LLM Chat app so far?
Here are some of the features already included (and more are coming):
- Chat with AI models – Local inference via Ollama
- Reasoning support – View the model's thinking process (DeepSeek-R1, Qwen-QwQ, etc.)
- Vision models – Analyze images with llava, bakllava, moondream
- Image generation – Local GGUF models with GPU acceleration (CUDA)
- Fullscreen images – Click generated images to view them fullscreen
- Image attachments – File picker or clipboard paste (Ctrl+V)
- DeepSearch – Web search with tool use
- Inference stats – Token counts, speed, duration (like Ollama's verbose output; quick sketch after the list)
- Regenerate – Re-run any AI response
- Copy – One-click copy of AI responses
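If you're curious where the chat and inference-stats numbers come from: Ollama's /api/chat endpoint returns token counts and durations alongside the reply, so there's no extra instrumentation needed. Here's a rough Python sketch of that call (not the app's actual code, which may be structured differently; the model name is just an example):

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # default local Ollama endpoint


def chat_with_stats(model: str, prompt: str) -> None:
    """Send one chat turn to a local Ollama server and print verbose-style stats."""
    resp = requests.post(
        OLLAMA_URL,
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # single JSON response; simplest way to read the stats
        },
        timeout=300,
    )
    resp.raise_for_status()
    data = resp.json()

    # The assistant's reply
    print(data["message"]["content"])

    # Durations are reported in nanoseconds, counts are tokens.
    eval_count = data.get("eval_count", 0)
    eval_duration_s = data.get("eval_duration", 0) / 1e9
    total_duration_s = data.get("total_duration", 0) / 1e9
    tokens_per_s = eval_count / eval_duration_s if eval_duration_s else 0.0

    print(f"prompt tokens : {data.get('prompt_eval_count', 0)}")
    print(f"output tokens : {eval_count}")
    print(f"speed         : {tokens_per_s:.1f} tok/s")
    print(f"total time    : {total_duration_s:.2f} s")


if __name__ == "__main__":
    chat_with_stats("deepseek-r1", "Why is the sky blue?")  # example model name
```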