Ollama is the easiest way to run open-source LLMs locally on your computer.
* To use Ollama, first you'll need to install it on your computer.
* Then you'll need to set an environment variable on your computer to allow this add-in to communicate with Ollama:
* Set: OLLAMA_ORIGINS=*
* Restart Ollama (or restart your computer)
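How you set the variable depends on your operating system; a sketch of the common approaches (restart Ollama afterwards in each case):

```shell
# macOS: set the variable for GUI apps, then quit and reopen the Ollama app
launchctl setenv OLLAMA_ORIGINS "*"

# Linux (Ollama installed as a systemd service):
# run `sudo systemctl edit ollama.service` and add
#   [Service]
#   Environment="OLLAMA_ORIGINS=*"
# then restart the service:
sudo systemctl restart ollama

# Windows (Command Prompt): set it persistently, then restart Ollama
# setx OLLAMA_ORIGINS "*"
```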
* You will also need to create an ngrok account and install the ngrok client
* Next, go to this page and get your domain (you get one for free with ngrok)
* Now run ngrok in the console (replace MY_DOMAIN with the domain from the previous step):
ngrok http 11434 --domain=MY_DOMAIN.ngrok-free.app --host-header="localhost:11434"
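To confirm the tunnel is working, you can query Ollama's version endpoint through your ngrok domain (MY_DOMAIN is a placeholder for your own domain):

```shell
# If the tunnel and CORS setup are correct, this returns Ollama's
# version as JSON, e.g. {"version":"0.1.32"}
curl https://MY_DOMAIN.ngrok-free.app/api/version
```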
* Go to the Configure dialog and enter your domain as the base URL under Ollama. For example:
http://MY_DOMAIN.ngrok-free.app
* Now you can easily DOWNLOAD and use any of the models available in Ollama (your downloaded models will appear in the list when you open the models dropdown).
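For example, downloading a model from the console makes it appear in the dropdown (the model name below is just an example; pick any model from the Ollama library):

```shell
# Download a model to your local Ollama installation
ollama pull llama3

# List the models you have downloaded; these are what the
# models dropdown will show
ollama list
```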