Using Recursal Models on the Command Line

Recursal provides an OpenAI-compatible interface to their hosted RWKV models.

These can easily be made accessible and queryable through llm.

Specifically, suppose the Recursal dashboard suggests you use the following in your Python code:

from openai import OpenAI

client = OpenAI(
  base_url="https://router.recursal.com/hVtcCcQK8xF6CPkb4qT2W",
  api_key="API_KEY_REDACTED",
)

response = client.chat.completions.create(
  model='gpt-3.5-turbo',
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ],
)
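
Because the interface is OpenAI-compatible, the same request can be made straight from the command line. This is a sketch, assuming the router exposes the standard /chat/completions path beneath that base URL and that your key is in a RECURSAL_API_KEY environment variable:

# hypothetical direct call; mirrors what the openai client above sends
curl "https://router.recursal.com/hVtcCcQK8xF6CPkb4qT2W/chat/completions" \
  -H "Authorization: Bearer $RECURSAL_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-3.5-turbo",
        "messages": [
          {"role": "system", "content": "You are a helpful assistant."},
          {"role": "user", "content": "Hello!"}
        ]
      }'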

Suppose further that you have llm installed.

  1. Give llm your Recursal key via llm keys set recursal.
  2. Create a file titled extra-openai-models.yaml in the directory given by dirname "$(llm logs path)" and insert the following content (a concrete sketch follows the YAML):
- model_id: recursal-hvtc              # how you reference this model from the CLI (e.g. llm -m recursal-hvtc "Your prompt")
  aliases: ["my-recursal-fine-tune"]   # alternate CLI references
  model_name: gpt-3.5-turbo            # passed through to the API; in Recursal's case, the teacher/pass-through model
  api_base: "https://router.recursal.com/hVtcCcQK8xF6CPkb4qT2W" # URL of a Recursal project
  api_key_name: recursal               # the key set in the previous step
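
As a concrete sketch of step 2 on a Unix-like shell (llm logs path and llm models are both built-in llm subcommands):

cd "$(dirname "$(llm logs path)")"   # llm's configuration directory
"$EDITOR" extra-openai-models.yaml   # paste the YAML above into this file
llm models                           # the new model id and its alias should now be listed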

You will now be able to interact with Recursal from your CLI, e.g.

llm -m my-recursal-fine-tune "What can you tell me about president Abraham Lincoln?"
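
To reproduce the system + user exchange from the Python snippet, llm's -s/--system option carries the system prompt:

llm -m my-recursal-fine-tune -s "You are a helpful assistant." "Hello!"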

Via OpenRouter

Note that at the time of writing (March 13th), Recursal's 7B Eagle model is available via OpenRouter; see this blog post.

Install the llm-openrouter plugin (e.g. via pip), make an OpenRouter API key available to llm via llm keys set openrouter, and then run:

llm -m openrouter/recursal/eagle-7b "Your prompt goes here"
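
If the model name does not resolve, two quick sanity checks (both subcommands are built into llm):

llm plugins                      # confirm llm-openrouter appears among the installed plugins
llm models | grep -i eagle       # confirm the plugin registered the Eagle model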