Looking to host Lamini on prem? Check out the installer instructions 🔗.
Lamini can be installed using pip, the package manager for Python. To install Lamini, open a command prompt and type:
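The package name matches the module you import later, so the command is simply:

```shell
# Install the Lamini Python package from PyPI
pip install lamini
```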
This will download and install the latest version of Lamini and its dependencies.
Verify that the installation worked by importing the LlamaV2Runner in your Python interpreter. Fun fact: Lamini is the tribe to which llamas belong, which is why you import the module
lamini to work with the LLM engine.
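A quick sanity check, assuming the class lives at the top level of the `lamini` module as the fun fact suggests:

```python
# If this import succeeds, Lamini and its dependencies installed correctly.
from lamini import LlamaV2Runner

print(LlamaV2Runner)  # prints the class object, confirming it is importable
```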
Set up your keys
Go to https://lamini.ai. Log in to get your API key and purchase credits (under the Account tab).
Create the config file ~/.powerml/configure_llama.yaml and put your key in it.
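A sketch of what that file might look like; the exact schema (a `production` section with a `key` field) is an assumption, so check the Authentication page for the authoritative format:

```yaml
# ~/.powerml/configure_llama.yaml (assumed schema)
production:
    key: "<YOUR-API-KEY-HERE>"
```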
Another option is to pass your production key to the config parameter of the LlamaV2Runner.
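A minimal sketch of that option; the `config` dictionary shape shown here (a `production.key` entry) is assumed rather than confirmed API:

```python
from lamini import LlamaV2Runner

# Assumed config shape: pass the production key directly instead of
# reading it from ~/.powerml/configure_llama.yaml.
model = LlamaV2Runner(config={"production.key": "<YOUR-API-KEY-HERE>"})
```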
See the Authentication page 🔗 for more advanced options.
Run the LLM engine with a basic test to see if installation and authentication were set up correctly.
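A basic end-to-end test might look like the following; it requires a valid API key and network access, and the call style (invoking the runner directly with a prompt) is an assumption about the client API:

```python
from lamini import LlamaV2Runner

# Requires authentication to be set up as described above.
model = LlamaV2Runner()
answer = model("How are you?")  # assumed call style; returns generated text
print(answer)
```

If this prints a model response without errors, both installation and authentication are working.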
Now you're on your way to building your own LLM for your specific use case!
To play with different model types in an interface, you can log in at https://lamini.ai and use the playground there.
In addition to a REST API and Python package, we also have a web application to help streamline model training and evaluation. Go to https://app.lamini.ai/ to view your training jobs, see model evaluation, play around with finetuned models in a playground, generate API keys, and monitor usage.