Hi! My name is Sergii, and I'm the developer of Geeps. I'm based in Canada, and at my day job I build software for the healthcare industry. Geeps is a project I develop in my spare time.
Soon after the release of ChatGPT, OpenAI gave developers access to the same powerful models through its API. Projects like Chatbot UI and MacGPT quickly became popular and demonstrated that users could connect to the best models using their own API keys. I really liked this approach – I could still pay for the service while making my usage more flexible and cost-efficient.
The pace of development in the AI world is insane, and big players like OpenAI, Google, Anthropic, and others release new models and capabilities on a monthly basis. And all of these providers naturally try to lock you into their ecosystems by bundling many capabilities into their Pro/Max subscriptions.
Those are genuinely great offerings, and I'm a subscriber myself, but their combined cost can escalate quickly, especially if you want to use SOTA models: a top-tier subscription from each of these companies costs $200+ per month, and that's before you count other players in the field like xAI, Mistral, Cohere, and all the Chinese labs, each with a monthly subscription of its own.
And if you don't pay for a subscription? You're limited to older models, and the ads are coming your way!
My preference is to use intelligence as a utility and pay for actual usage. It's great to be able to switch between providers and models based on your specific needs.
Fortunately, this is still possible through direct APIs, and it already works great in AI-assisted programming tools like Cursor, Copilot, Cline, Claude Code, OpenCode, Codex, Pi, and many others. There are also excellent LLM routers like OpenRouter that give you flexible access to multiple providers and models.
The aim of Geeps is to be a simple and portable cross-platform chat client that works with multiple AI providers in a single app.