
Osaurus released a Mac app where local and cloud AI models work together

Osaurus launched a macOS app that brings local and cloud AI models together in a single window. Memory, files, and tools stay on the user's device, while cloud models such as OpenAI's are called on only when they are really needed.

Source: TechCrunch. Collage: Hamidun News.

Osaurus has released a macOS application that combines local and cloud AI models in a single interface. The main idea of the service is that the agent's memory, files, keys, and tools remain on the Mac itself, and the user connects to external models only when it's really necessary.

How Osaurus Works

Osaurus positions itself not as yet another chat with a model, but as a "harness" — a management layer over different AI models. Users can run local models directly on Apple Silicon, and if they need more powerful generation or a specific provider, they can switch to cloud services like OpenAI, Anthropic, Gemini, or Grok. Crucially, the sensitive parts of the context never reach an external server: history, memory, plugins, and working tools are stored on the device.

The project grew out of an earlier idea for a desktop AI companion called Dinoki. According to co-founder Terrence Pae, users asked why they should buy an application if they still have to pay separately for model tokens. This pushed the team toward a local scenario: if AI lives on your Mac, it can work with files, system settings, and a browser as a personal assistant, rather than just another chat window.

What the Application Can Do

Osaurus's bet is on a hybrid approach: local models for privacy and control, cloud models for tasks where maximum quality or speed is needed. What matters here is not just the list of models, but the fact that the entire scenario is assembled in a single native macOS application: without a terminal, manual configuration, and fragmented services. As a result, the product tries to occupy a niche between developer-first tools like Ollama and more mainstream AI products focused on a convenient interface.

  • Switching between local and cloud models without changing the working environment
  • Storing memory, chat history, and keys on the Mac itself
  • Support for MCP clients and access to tools through a unified server
  • Over 20 built-in plugins for email, calendar, browser, Git, files, and office formats
  • Voice mode and isolated sandbox environment for running code
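The hybrid idea behind this list can be sketched as a simple routing rule. To be clear, Osaurus's actual routing logic is not public; the request fields, backend names, and policy below are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Request:
    prompt: str
    sensitive: bool = False             # e.g. touches local files, mail, or keys
    needs_frontier_quality: bool = False

def route(req: Request) -> str:
    """Pick a backend: keep private or tool-heavy work on-device,
    and reach for a cloud provider only when top quality is required."""
    if req.sensitive:
        return "local"   # memory, files, and keys never leave the Mac
    if req.needs_frontier_quality:
        return "cloud"   # e.g. OpenAI or Anthropic via the user's API key
    return "local"       # default: an on-device Apple Silicon model

print(route(Request("summarize this contract", sensitive=True)))            # local
print(route(Request("long-form draft", needs_frontier_quality=True)))       # cloud
```

The point of such a policy is that the default is local, and the cloud is an explicit opt-in per request rather than the other way around.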

According to the company, Osaurus already supports MiniMax, Gemma 4, Qwen 3.6, Llama, DeepSeek V4, Apple Foundation Models, and the Liquid AI LFM family, and in the cloud can connect to OpenAI, Anthropic, Gemini, xAI, OpenRouter, LM Studio, and other providers. The project, launched about a year ago, has already exceeded 112,000 downloads. Another important detail: the application works as a full-fledged MCP server, so it can be embedded in a wider stack of AI tools, rather than used only as a standalone desktop client.
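Embedding a local server like this in a wider stack typically means speaking the OpenAI-compatible chat API, so existing clients can point at it unchanged. A minimal sketch of building such a request follows; the host, port, path, and model name are placeholders, not documented Osaurus values.

```python
import json
import urllib.request

# Assumption for illustration: a local OpenAI-compatible endpoint.
BASE_URL = "http://127.0.0.1:1337/v1"

def chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build a standard OpenAI-style /chat/completions POST request
    aimed at a local server instead of a cloud provider."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Model name here is illustrative, not a confirmed Osaurus identifier.
req = chat_request("llama-3.2-3b", "Summarize today's calendar")
print(req.full_url)
```

Because the wire format matches the cloud APIs, the same client code can target either a local or a remote model by swapping the base URL.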

Where the Bottlenecks Are

The main limitation is, frankly, hardware. To run models locally, Osaurus recommends a system with a minimum of 64 GB of RAM, and for large models like DeepSeek V4 — around 128 GB. In other words, this is not yet for a basic MacBook Air and the average user, but rather for a Mac Studio, a high-memory MacBook Pro, and other machines with enough RAM. Additionally, the platform is limited to the Apple ecosystem.

The second problem is that local AI is still in an early stage of development. But it's precisely here that the founder sees rapid progress: a year ago, local models struggled to finish sentences, and now they can already call tools, write code, and work with a browser. Osaurus is trying to package this leap into a product for regular users, adding a virtual sandbox for security and an interface without a terminal.

Looking ahead, the team is also looking at B2B scenarios — for example, law firms and medicine, where local deployment could reduce privacy risks for sensitive data.

What This Means

The AI market is gradually shifting from a race for models to competition for a management layer on top of them. Osaurus is betting that users don't just need access to Claude or GPT, but a controlled environment where they can combine local and cloud models without losing memory, tools, and privacy. If this approach takes hold, Mac could become not just a client for AI, but the main platform where a personal agent actually lives.

Hamidun News
AI news without the noise. A daily editorial selection from 400+ sources. A product of Jemal Hamidun, Head of AI at Alpina Digital.