In this update, we have added a new app called "OpenAI API".
This app allows changing the inference engine behind Layla to any OpenAI-compatible API. You can configure your own API endpoint, API key, and OpenAI model to use.
Once the app is installed, Layla calls the configured API instead of running inference locally on your device.
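To illustrate what such a call looks like, here is a minimal sketch of building a request against an OpenAI-compatible `/v1/chat/completions` endpoint using only the Python standard library. The base URL, API key, and model name below are placeholder assumptions — substitute whatever you configure in the OpenAI API app.

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, user_message):
    """Build a POST request for an OpenAI-compatible chat endpoint.

    base_url, api_key and model are hypothetical placeholders --
    use the values you configured in the OpenAI API app.
    """
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Example: a hypothetical inference server on your local network.
req = build_chat_request(
    "http://192.168.1.50:5000", "sk-local", "my-70b-model", "Hello!"
)
print(req.full_url)  # http://192.168.1.50:5000/v1/chat/completions
# To actually send it: urllib.request.urlopen(req)
```

Any server that speaks this request/response shape — the same one the official OpenAI API uses — should work as a backend.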
The main use case is connecting to a powerful inference engine running on a computer on your local network that exposes an OpenAI-compatible API. For example, you can run a large 70B model on your PC, set up API access, and connect to it from Layla on your phone.
Additionally, with a bit of technical tinkering, you can even set up port forwarding on your home router so you can reach the LLMs running on your PC while you are away from home!
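In that remote setup, the only thing that changes is the base URL you point the app at: a LAN address at home, a public address (your router's hostname or IP plus the forwarded port) when away. The sketch below uses hypothetical addresses; `GET /v1/models` is a standard OpenAI-compatible endpoint that makes a convenient reachability check.

```python
# Hypothetical addresses -- replace with your own.
LAN_BASE = "http://192.168.1.50:5000"            # PC on your home network
PUBLIC_BASE = "http://my-home.example.com:5000"  # via router port forwarding

def models_endpoint(base_url: str) -> str:
    # GET on this URL lists available models on an
    # OpenAI-compatible server, so it doubles as a health check.
    return base_url.rstrip("/") + "/v1/models"

print(models_endpoint(LAN_BASE))     # http://192.168.1.50:5000/v1/models
print(models_endpoint(PUBLIC_BASE))  # http://my-home.example.com:5000/v1/models
```

If you expose your server this way, consider requiring an API key and using HTTPS, since the endpoint is reachable from the open internet.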
A short tutorial demonstrating how to do this with the Oobabooga TextGen GUI is shown here:
Changelog:
New features:
New app: OpenAI API app allows Layla to connect to any OpenAI-compatible endpoint!
Improvements:
adjusted all preset characters to use the TavernPNG format
new characters are uploaded to the Personality Hub!