Clippy has returned – but this time it’s not a Microsoft product: it’s a front-end for locally run LLMs.
Clippy is finally useful, ish. It’s now a small app that lets users chat with a variety of AI models running locally. Gemma 3, Qwen3, Phi-4 Mini, and Llama 3 are all available as ready-to-download options, and Clippy can also run any other local LLM from a GGUF file.
The app is the work of San Francisco-based developer Felix Rieseberg, who has featured in The Register before for his work maintaining the cross-platform development framework Electron. The unofficial app’s GitHub site describes it as a “love letter” to Clippy.
The Nu-Clippy had nothing but praise for our “critical voice in the business and tech world”… Thanks, buddy. It got our business model half wrong, though: we don’t offer subscriptions. We’re paid directly by advertisers and their ad agencies.
“Consider it software art,” Rieseberg said. “If you don’t like it, consider it software satire.”
Rieseberg added on the app’s About page that he doesn’t mean high art. “I mean ‘art’ in the sense that I’ve made it like other people do watercolors or pottery – I made it because building it was fun for me.” Rieseberg is one of the maintainers of Electron, the framework that pairs a Chromium engine with Node.js so that web apps built with HTML, CSS, and JavaScript run as desktop applications regardless of the underlying platform – which is exactly what the latest Clippy demonstrates.
The Nu-Clippy was also designed to be a reference implementation for Electron’s LLM module, Rieseberg wrote in the GitHub documentation.
The unofficial AI version of Clippy (Clippy 2.0? 3.0?) may be more powerful than its predecessor, but that doesn’t mean it’s packed with features. Compared to a platform such as LM Studio, which lets users chat with and modify local LLMs, Clippy is simply a chat interface: a user talks to a local LLM much as they would to one that lives in a datacenter.
It’s definitely a privacy upgrade compared to ChatGPT, Gemini, and their relatives, which are perpetually trained on user data. Rieseberg stated in the documentation that Clippy doesn’t go online for anything.
“The only network request Clippy makes is to check for updates (which you can disable),” Rieseberg noted.
AI Clippy is easy to get running. This vulture tested it on his MacBook Pro: download the package for Apple Silicon, unzip it, let it fetch the default model (Gemma 3 with 1 billion parameters), and start asking questions. When the chat window is closed, Clippy remains on the desktop in Windows 95 style; a click brings the window back up for a fresh round of questions.
Asked what AI Clippy could do if he had the time to build it out, Rieseberg told The Register that node-llama-cpp – the Node.js bindings for llama.cpp that Clippy uses – could give Clippy access to all the llama.cpp features available to other locally run AIs. Those options just aren’t exposed, he said, apart from temperature, top-k, and the system prompt.
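The sampling options Rieseberg mentions are ones node-llama-cpp’s chat API accepts directly. A rough sketch of what wiring them up might look like – the helper function and the model filename below are hypothetical illustrations, not taken from Clippy’s code:

```javascript
// Sketch only: a hypothetical helper that normalizes Clippy-style UI
// settings before they are handed to node-llama-cpp, which accepts
// sampling options such as temperature and topK on its chat prompts.
function samplingOptions(settings = {}) {
  const { temperature = 0.7, topK = 40 } = settings;
  return {
    // Keep temperature in a sensible range (0 = deterministic, higher = more random)
    temperature: Math.min(Math.max(temperature, 0), 2),
    // top-k must be a positive integer
    topK: Math.max(1, Math.floor(topK)),
  };
}

// How it could plug into node-llama-cpp (assumes a local GGUF model path):
//   import { getLlama, LlamaChatSession } from "node-llama-cpp";
//   const llama = await getLlama();
//   const model = await llama.loadModel({ modelPath: "gemma-3-1b.gguf" });
//   const context = await model.createContext();
//   const session = new LlamaChatSession({ contextSequence: context.getSequence() });
//   const answer = await session.prompt("Hello!", samplingOptions({ temperature: 0.9 }));
```

The system prompt, the third option Clippy does expose, is set when the chat session is created rather than per prompt.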
“That’s just a matter of me being lazy, though. The code to expose all those options is there,” Rieseberg added. He won’t get to it anytime soon, as he is scheduled to join Anthropic next week to work on Claude, which will keep him busy with other AI projects. Rieseberg said that while he isn’t worried about Microsoft suing him for borrowing its desktop companion, he wouldn’t fight if it did.
“The moment they tell me to stop, shut it down, and hand over all my code, I will,” Rieseberg told us. He doesn’t believe Microsoft would want to do anything with Clippy and AI, though. “With Cortana and Copilot they have probably much better characters available,” he said.
Clippy’s new version is available on Windows, macOS and Linux. This makes sense, given the developer’s background in cross-platform development.