Quick Start

Start using LLMs in just 2 steps with Lamini!

1. Authenticate

First, get <YOUR-LAMINI-API-KEY> at https://app.lamini.ai/account. Need help? Check out API Auth.
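The cURL example below reads the key from the LAMINI_API_KEY environment variable, so one convenient way to make it available in your shell is to export it (substitute your actual key):

export LAMINI_API_KEY="<YOUR-LAMINI-API-KEY>"

The Python SDK can also take the key directly via lamini.api_key, as shown in the next step.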

2. Run inference

Next, run Lamini:

Install the Python SDK.

pip install --upgrade lamini

Run an LLM with a few lines of code.

# code/quick_start.py

import lamini

# Uncomment to set your API key in code if it isn't configured elsewhere.
# lamini.api_key = "<YOUR-LAMINI-API-KEY>"

llm = lamini.Lamini("meta-llama/Meta-Llama-3.1-8B-Instruct")

# The prompt follows the Llama 3.1 chat template: a user turn, then the start of the assistant turn.
print(llm.generate("""<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\nHow are you?<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"""))
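Typing the Llama 3.1 special tokens by hand gets error-prone as prompts grow. A small helper like the sketch below can assemble the template for you (illustration only; make_llama3_prompt is a hypothetical function, not part of the Lamini SDK):

# prompt_template.py (hypothetical helper, not part of the Lamini SDK)
def make_llama3_prompt(user_message: str, system_message: str = "") -> str:
    """Wrap a message in the Llama 3.1 chat template used above."""
    prompt = "<|begin_of_text|>"
    if system_message:
        prompt += f"<|start_header_id|>system<|end_header_id|>\n\n{system_message}<|eot_id|>"
    prompt += f"<|start_header_id|>user<|end_header_id|>\n\n{user_message}<|eot_id|>"
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

# Usage with the llm object from the quick-start example:
# print(llm.generate(make_llama3_prompt("How are you?")))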

Or run an LLM with a single cURL command.

curl --location "https://api.lamini.ai/v1/completions" \
    --header "Authorization: Bearer $LAMINI_API_KEY" \
    --header "Content-Type: application/json" \
    --data '{
        "model_name": "meta-llama/Meta-Llama-3.1-8B-Instruct",
        "prompt": "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\nHow are you?<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    }'

Expected output:
"I'm just a language model, I don't have feelings or emotions like humans do, but I'm functioning properly and ready to help with any questions or tasks you have! How can I assist you today?"

That's it! 🎉

Now you can try: