Major Update

Jun 18, 2024

Introducing Local III



In the same way the automobile granted us personal freedom to explore, Local III starts our journey towards a new freedom — personal, private access to machine intelligence.

Developed by over 100 contributors spanning every timezone, this update includes an easy-to-use local model explorer, deep integrations with inference engines like Ollama, custom profiles for open models like Llama3, Moondream, and Codestral, and a suite of settings to make offline code interpretation more reliable.

Local III also introduces a free, hosted, opt-in model via interpreter --model i. Conversations with the i model will be used to train our own open-source language model for computer control.


The Local Explorer

Local III makes it easier than ever to use local models. An interactive setup lets you choose an inference provider, pick a model, download new models, and more.

Want to add an inference engine? Please make a PR into the local explorer here.

The following flag starts the local explorer:

interpreter --local
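
If you prefer to skip the interactive wizard, the same choices can be made from Python. This is a minimal sketch, assuming the pre-existing Python API (interpreter.offline, interpreter.llm.model) rather than anything new in Local III:

```python
from interpreter import interpreter

# Rough, non-interactive equivalent of what the local explorer sets up.
# Attribute names come from the existing Python API and are assumptions here.
interpreter.offline = True                 # keep everything on-device
interpreter.llm.model = "ollama/llama3"    # a provider/model pair the explorer could have picked

interpreter.chat("What operating system am I running?")
```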


The i Model

Local III also introduces a free language model endpoint serving Llama3-70B. The endpoint gives users a setup-free experience, and conversations with it contribute to the training of a small, locally running language model.

We will remove personally identifiable information before open-sourcing the model and the training set.

By engaging with this model, you become an active participant in shaping the future of open-source AI:

interpreter --model i
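
If you script Open Interpreter from Python, the same model should be selectable by name. A small sketch, assuming the existing interpreter.llm.model attribute accepts the same i shorthand as the CLI flag:

```python
from interpreter import interpreter

# Assumption: the hosted, opt-in "i" model is selected by name, mirroring `interpreter --model i`.
interpreter.llm.model = "i"
interpreter.chat("Summarize the files in my current directory.")
```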


Deep Ollama Integration

To give any Ollama model access to a code interpreter, simply run:

interpreter --model ollama/<model>

Where <model> is any model from Ollama's model library. This single command abstracts away all of the model setup steps, and it downloads the model only if you haven't downloaded it before.
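
The same routing works from Python. A sketch, assuming the litellm-style ollama/<model> naming and Ollama's default local endpoint (both are assumptions about the wiring rather than anything documented above):

```python
from interpreter import interpreter

# Assumptions: "ollama/" routes requests to the local Ollama server, which is
# listening on its default port. Swap in any model from Ollama's library.
interpreter.offline = True
interpreter.llm.model = "ollama/codestral"
interpreter.llm.api_base = "http://localhost:11434"

interpreter.chat("Write and run a script that lists the ten largest files here.")
```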


Optimized Profiles

We have experimented extensively with state-of-the-art local language models such as Codestral, Llama3, and Qwen. You can configure Open Interpreter to use our recommended settings for each with the following flags:

interpreter --profile codestral.py # Sets optimal settings for Codestral
interpreter --profile llama3.py # Sets optimal settings for Llama3
interpreter --profile qwen.py # Sets optimal settings for Qwen

Note: The profile flag will load settings from files in the profiles directory, which you can open by running:

interpreter --profiles

If you find optimal settings for other local language models, please contribute a PR into the default profiles folder. Simply duplicate a file like codestral.py and configure the model setup, system message, few-shot examples, etc.
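
For orientation, a profile is just a Python file that configures the shared interpreter object before the chat starts. The sketch below shows the kind of settings a duplicated profile might contain; the attribute names follow the existing Python API, and every value is an illustrative assumption rather than a shipped default:

```python
# my_local_model.py — a hypothetical profile; run it with `interpreter --profile my_local_model.py`.
# Copy a shipped profile such as codestral.py for the real recommended settings.
from interpreter import interpreter

interpreter.offline = True
interpreter.llm.model = "ollama/codestral"     # model setup
interpreter.llm.context_window = 32000         # assumed value; tune for your hardware
interpreter.llm.max_tokens = 1200              # assumed value
interpreter.llm.supports_functions = False     # many local models behave better without function calling
interpreter.system_message = "You are a capable coding assistant running fully offline."  # system message
```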

You can learn more about profiles here.


Local Vision

Images sent to local models are rendered as text: a description generated by Moondream, a tiny vision model, plus any text extracted from the image via OCR.

interpreter --local --vision
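
For intuition, here is a purely conceptual sketch of that flow. The two helper functions are hypothetical placeholders, not Open Interpreter APIs; only the overall shape (a Moondream caption plus OCR text, flattened into a message) comes from the description above:

```python
# Conceptual sketch of local vision preprocessing. Both helpers are hypothetical
# placeholders standing in for the internal Moondream captioner and OCR pass.

def describe_with_moondream(image_path: str) -> str:
    """Placeholder: a tiny vision model produces a short caption."""
    raise NotImplementedError

def extract_text_with_ocr(image_path: str) -> str:
    """Placeholder: any text found in the image."""
    raise NotImplementedError

def render_image_for_local_model(image_path: str) -> str:
    caption = describe_with_moondream(image_path)
    ocr_text = extract_text_with_ocr(image_path)
    return (
        "The user attached an image.\n"
        f"Description (from a small vision model): {caption}\n"
        f"Text found in the image (OCR): {ocr_text}"
    )
```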


Experimental Local OS Mode

Because local vision is now available, Local III also ships experimental support for local OS mode.

In this mode, Open Interpreter can control your mouse and keyboard and see your screen. The LLM can interact with your computer by clicking icons identified by our open-source Point model.

interpreter --local --os
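
From Python, the same mode should be reachable by toggling the corresponding attributes. A sketch, assuming interpreter.os and interpreter.offline mirror the --os and --local flags (the attribute names are assumptions):

```python
from interpreter import interpreter

# Assumed Python equivalents of `interpreter --local --os`.
interpreter.offline = True
interpreter.os = True        # mouse, keyboard, and screen access; experimental with local models
interpreter.chat("Open my browser and go to github.com/OpenInterpreter/open-interpreter.")
```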



Why Local?

If this revolution is to broadly distribute its benefits, it must belong to the people. In classical computing, society transitioned away from the mainframe era toward the personal computer, helping ensure a destiny for computers which we could control.

Now, an oligopoly of language model providers stands to control the intelligence age. Open Interpreter is a balancing force against that. Our community is rapidly developing a response to ensure our collective freedom: private, local access to powerful AI agents.

Local III is a step towards a new destiny which we, the people, control.


Top Contributors

Special thanks to Ty Fiero, Anton, and CyanideByte for their excellent contributions this release cycle!


All Updates

* Fix Jupyter logging on shutdown by @tyfiero in https://github.com/OpenInterpreter/open-interpreter/pull/1145

* Check for GPU or MPS availability before using CPU by @jcp in https://github.com/OpenInterpreter/open-interpreter/pull/1132

* Add DevContainer Support by @weihongliang233 in https://github.com/OpenInterpreter/open-interpreter/pull/1142

* Updated to address comment regarding pip installer not working by changing from bash to zsh. by @MartinLBeacham in https://github.com/OpenInterpreter/open-interpreter/pull/1130

* Added 'py' alias for Python/JupyterLanguage by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1122

* [FIX] Broken Link to Setup Fixed by @Sandeepsuresh1998 in https://github.com/OpenInterpreter/open-interpreter/pull/1121

* Feature/ruby support by @bars0um in https://github.com/OpenInterpreter/open-interpreter/pull/1105

* Fixed task completion message looping by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1154

* Generate history conversation filenames in Chinese properly. by @Steve235lab in https://github.com/OpenInterpreter/open-interpreter/pull/1150

* Display Link to Docs with Unrecognized Flags by @benxu3 in https://github.com/OpenInterpreter/open-interpreter/pull/1160

* Add offline doc how-to in README by @dheavy in https://github.com/OpenInterpreter/open-interpreter/pull/1163

* Fixed bug, may have broken something else but I don't think so by @imapersonman in https://github.com/OpenInterpreter/open-interpreter/pull/1174

* Add argument minor refactor by @MikeBirdTech in https://github.com/OpenInterpreter/open-interpreter/pull/1178

* Added flag reset to re-import `computer` instance by @meawal in https://github.com/OpenInterpreter/open-interpreter/pull/1167

* Modify API key storage user recommendation by @rustom in https://github.com/OpenInterpreter/open-interpreter/pull/1165

* Fix/profile disable_telemetry not working by @LucienShui in https://github.com/OpenInterpreter/open-interpreter/pull/1159

* Fix default variable issue.  by @tyfiero in https://github.com/OpenInterpreter/open-interpreter/pull/1187

* Remove pydantic warnings by @imapersonman in https://github.com/OpenInterpreter/open-interpreter/pull/1184

* Multiple display support by @Amazingct in https://github.com/OpenInterpreter/open-interpreter/pull/1161

* Updated litellm now that they fixed pydantic warning by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1195

* Fix computer.calendar dates issue by @supersational in https://github.com/OpenInterpreter/open-interpreter/pull/1198

* Optimize rendering of dynamic messages in render_message.py by @kooroshkz in https://github.com/OpenInterpreter/open-interpreter/pull/1200

* Ignore empty messages by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1203

* Fix optional import crash and error by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1194

* Add function to contribute conversations by @tyfiero in https://github.com/OpenInterpreter/open-interpreter/pull/1230

* Add Ollama with llama3 as Default by @tyfiero in https://github.com/OpenInterpreter/open-interpreter/pull/1222

* Segmented default.yaml into sections so it is clearer how to nest them by @zdaar in https://github.com/OpenInterpreter/open-interpreter/pull/1223

* Remove config from docs by @MikeBirdTech in https://github.com/OpenInterpreter/open-interpreter/pull/1221

* Fix Llama3 backtick hallucination in code blocks by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1217

* Fix %% magic command by @Notnaton in https://github.com/OpenInterpreter/open-interpreter/pull/1219

* Refine documentation formatting and style for clarity by @RateteApple in https://github.com/OpenInterpreter/open-interpreter/pull/1212

* Bump version of tiktoken by @minamorl in https://github.com/OpenInterpreter/open-interpreter/pull/1204

* Update llm.py to use litellm.support_function_calling() by @Notnaton in https://github.com/OpenInterpreter/open-interpreter/pull/1215

* update local mode system message by @MikeBirdTech in https://github.com/OpenInterpreter/open-interpreter/pull/1229

* Update local profile so it doesn't use function calling by @Notnaton in https://github.com/OpenInterpreter/open-interpreter/pull/1213

* Add local OS profile for local OS control by @MikeBirdTech in https://github.com/OpenInterpreter/open-interpreter/pull/1235

* Update litellm for namespace conflict warning fix by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1248

* Spanish readme translation by palnever from discord by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1251

* docs: update streaming-response.mdx by @eltociear in https://github.com/OpenInterpreter/open-interpreter/pull/1241

* Updated README_ZH.md / Updated README.md by @KPCOFGS in https://github.com/OpenInterpreter/open-interpreter/pull/1247

* Contributing interaction and sending command probably by @imapersonman in https://github.com/OpenInterpreter/open-interpreter/pull/1232

* Added batch, bat aliases for shell language by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1242

* Local update tons of fixes and new llamafiles by @CyanideByte in https://github.com/OpenInterpreter/open-interpreter/pull/1253

* Fix llama 3 code hallucination by @Notnaton in https://github.com/OpenInterpreter/open-interpreter/pull/1250

* Fixed linux installer by @okineadev in https://github.com/OpenInterpreter/open-interpreter/pull/1269

* Updated installation scripts by @okineadev in https://github.com/OpenInterpreter/open-interpreter/pull/1266

* fix typos by @RainRat in https://github.com/OpenInterpreter/open-interpreter/pull/1254


New Contributors

* @jcp made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1132

* @weihongliang233 made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1142

* @MartinLBeacham made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1130

* @Sandeepsuresh1998 made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1121

* @benxu3 made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1160

* @dheavy made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1163

* @imapersonman made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1174

* @meawal made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1167

* @rustom made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1165

* @LucienShui made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1159

* @Amazingct made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1161

* @supersational made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1198

* @kooroshkz made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1200

* @zdaar made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1223

* @RateteApple made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1212

* @minamorl made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1204

* @KPCOFGS made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1247

* @okineadev made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1269

* @RainRat made their first contribution in https://github.com/OpenInterpreter/open-interpreter/pull/1254


**Full Changelog**: https://github.com/OpenInterpreter/open-interpreter/compare/v0.2.4...v0.3.0

