@jeremedia
Created July 20, 2023 21:32
Rails Ollama Service Example
class OllamaService
  extend Dry::Initializer

  option :model, default: proc { "llama2:13b" }
  option :prompt, default: proc { "Why do I have fingers?" }
  option :url, default: proc { "http://localhost:11434" }
  option :temperature, default: proc { 0.7 }
  # Optional record updated with the partial response as chunks arrive
  # (the original referenced request_model without defining it).
  option :request_model, default: proc { nil }

  def call
    streamed = []
    bot_response = ""
    # Ollama's /api/generate expects sampling parameters under "options".
    payload = { model: model, prompt: prompt, options: { temperature: temperature } }

    conn = Faraday.new(url: url) do |faraday|
      faraday.response :logger
      faraday.adapter Faraday.default_adapter
      # Ollama streams newline-delimited JSON; parse each line as it arrives.
      faraday.options.on_data = proc do |chunk, _overall_received_bytes, _env|
        chunk.each_line do |line|
          next if line.strip.empty?
          chunk_parsed = JSON.parse(line)
          bot_response += chunk_parsed["response"] if chunk_parsed["response"]
          streamed << chunk_parsed
          # Persist the partial response so callers can poll for progress.
          request_model&.update(bot_response: bot_response)
        rescue JSON::ParserError
          next # ignore a JSON line split across two chunks
        end
      end
    end

    conn.post do |req|
      req.url "/api/generate"
      req.headers["Content-Type"] = "application/json"
      req.body = payload.to_json
    end

    { response: bot_response, response_data: streamed }
  end
end
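Ollama streams its completion as newline-delimited JSON, one object per line, which is what the `on_data` callback accumulates. The accumulation logic can be sketched standalone like this; the chunk contents below are illustrative, not captured from a real server:

```ruby
require "json"

# Simulated stream of Ollama /api/generate chunks (NDJSON).
chunks = [
  '{"model":"llama2:13b","response":"Hello","done":false}',
  '{"model":"llama2:13b","response":", world","done":false}',
  '{"model":"llama2:13b","done":true}'
]

bot_response = ""
streamed = []

chunks.each do |chunk|
  parsed = JSON.parse(chunk)
  # Intermediate chunks carry a "response" fragment; the final one may not.
  bot_response += parsed["response"] if parsed["response"]
  streamed << parsed
end

puts bot_response # => "Hello, world"
```

Against a live server the same accumulation happens inside the Faraday `on_data` proc, and the final chunk's `"done": true` signals the end of the stream.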