Gist aantix/6b255b8ecf590e8dde41d5e163ca01e2 (last active April 2, 2024)
Easy LLM setup for Ruby on Rails application
require 'json'
require 'net/http'
require 'uri'

# Minimal client for a local Ollama server's /api/chat endpoint.
class Ollama
  URL = 'http://localhost:11434/api/chat'

  def initialize
    @uri = URI.parse(URL)
  end

  # Sends a single non-streaming chat request and returns the parsed JSON response.
  def prompt(question)
    response = Net::HTTP.start(@uri.hostname, @uri.port, use_ssl: @uri.scheme == 'https') do |http|
      # Net::HTTP#post expects a path, not a URI object.
      http.post(@uri.request_uri, JSON.dump({
        messages: [{ role: 'user', content: question }],
        model: 'mistral',
        stream: false
      }), 'Content-Type' => 'application/json')
    end

    JSON.parse(response.body)
  end
end
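The prompt method returns the full parsed response hash; in practice you usually want only the assistant's text. A small sketch of unwrapping it (the assistant_content helper is hypothetical, not part of the gist):

```ruby
require 'json'

# Hypothetical helper: pull the assistant's text out of a parsed /api/chat response.
# Returns an empty string if the expected keys are missing.
def assistant_content(parsed)
  parsed.fetch('message', {}).fetch('content', '')
end

sample = JSON.parse('{"model":"mistral","message":{"role":"assistant","content":"Hello!"},"done":true}')
assistant_content(sample) # => "Hello!"
```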
[1] pry(main)> client = Ollama.new
=> #<Ollama:0x000000012fd10260 @uri=#<URI::HTTP http://localhost:11434/api/chat>>
[2] pry(main)> client.prompt("What is your name?")
=> {"model"=>"mistral",
 "created_at"=>"2024-03-28T04:21:22.679528Z",
 "message"=>{"role"=>"assistant", "content"=>" I don't have a name. I'm just an artificial intelligence programmed to assist with information and answer questions to the best of my ability."},
 "done"=>true,
 "total_duration"=>756771750,
 "load_duration"=>585917,
 "prompt_eval_count"=>9,
 "prompt_eval_duration"=>242521000,
 "eval_count"=>32,
 "eval_duration"=>512828000}
#!/bin/bash
# Install Ollama via Homebrew, run it as a background service, and download the Mistral model.
brew install ollama
brew services start ollama
ollama pull mistral