
1.1.3

@Sparky4567 Sparky4567 released this 30 Jun 08:09
· 2 commits to main since this release

Dear Users,

We are thrilled to announce our biggest release yet! This version of the plugin introduces support for custom endpoints: simply specify the endpoint's IP address in the settings tab.

Please ensure that the default Ollama port (11434) is not blocked, as we have implemented a custom connection-check function to work around limitations in the Ollama-node library.

We have not tested this with remote servers, but since many of you are tech enthusiasts or academics who rent servers to host your models, you may want to give it a try.

If the connection check fails, two notifications will be shown and the plugin will revert to the default endpoint, 127.0.0.1.
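For the curious, the fallback behavior can be sketched roughly as follows. This is an illustrative TypeScript snippet, not the plugin's actual code; the function name `resolveEndpoint` and the normalization details are assumptions, the only facts taken from the release notes being the default address 127.0.0.1 and the default Ollama port 11434.

```typescript
// Illustrative sketch only: normalize a user-supplied endpoint string,
// reverting to the default Ollama address when the input cannot be parsed.
const DEFAULT_ENDPOINT = "http://127.0.0.1:11434";

function resolveEndpoint(userInput: string): string {
  const trimmed = userInput.trim();
  if (trimmed === "") {
    return DEFAULT_ENDPOINT;
  }
  try {
    // Accept bare IPs/hostnames by assuming an http:// scheme.
    const url = new URL(trimmed.includes("://") ? trimmed : `http://${trimmed}`);
    // Assume the default Ollama port when none is given.
    if (url.port === "") {
      url.port = "11434";
    }
    return url.origin;
  } catch {
    // Unparseable input: revert to the default endpoint,
    // as the plugin does after a failed connection check.
    return DEFAULT_ENDPOINT;
  }
}

console.log(resolveEndpoint("192.168.1.50")); // http://192.168.1.50:11434
console.log(resolveEndpoint(""));             // http://127.0.0.1:11434
```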

After changing the endpoint to your desired address, please run the "Reload llm plugin" command to apply the changes and reload the Obsidian app.

  • Check for updates: a few more tweaks were added shortly after this release.