This is incorrect. While the project does include many third-party integrations, it also has integrations for Ollama (which runs almost any open-source model available in GGUF format) and Whisper.cpp, so both the transcription and the LLM processing can be done entirely with open-source models on anyone's local machine.
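For what it's worth, here's a minimal sketch of what such a fully local pipeline looks like, independent of the project's own integration code: transcribe an audio file with a whisper.cpp CLI build, then send the transcript to a local Ollama server for summarization. The binary path, model files, audio filename, and model name (`llama3.1`) are placeholders you'd swap for your own setup; the Ollama call uses its standard `/api/generate` endpoint on port 11434.

```python
import json
import subprocess
import urllib.request

# 1. Transcribe locally with whisper.cpp (paths/flags assume a typical CLI build;
#    -otxt writes the transcript next to the input as audio.wav.txt).
subprocess.run(
    ["./whisper-cli", "-m", "models/ggml-base.en.bin", "-f", "audio.wav", "-otxt"],
    check=True,
)
with open("audio.wav.txt", "r", encoding="utf-8") as f:
    transcript = f.read()

# 2. Process the transcript with a local model served by Ollama.
payload = json.dumps({
    "model": "llama3.1",  # any GGUF-backed model pulled into Ollama
    "prompt": f"Summarize the following transcript:\n\n{transcript}",
    "stream": False,
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Nothing in that flow leaves the machine: whisper.cpp handles speech-to-text and Ollama handles the LLM step, both against local model weights.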