@alexrudall
Last active April 29, 2024 00:26
ChatGPT streaming with ruby-openai, Rails 7, Hotwire, Turbo Streams, Sidekiq and Tailwind!

How to add ChatGPT streaming to your Ruby on Rails 7 app!

This guide walks you through adding a ChatGPT-style streaming message UI to your Ruby on Rails 7 app using ruby-openai, Hotwire, Turbo Streams, Sidekiq and Tailwind. All the code is included below!


First, add the ruby-openai gem! It needs to be at least version 4. Add Sidekiq too.

# Gemfile
gem "ruby-openai", "~> 4.0.0"

# Simple, efficient background processing using Redis.
# https://github.com/sidekiq/sidekiq
gem "sidekiq", "~> 7.0.9"

Install Redis on your machine.

brew install redis

Add Redis and Sidekiq to your Procfile so they run when you run bin/dev.

# Procfile.dev
web: bin/rails server -p 3000
css: bin/rails tailwindcss:watch
sidekiq: bundle exec sidekiq -c 2
queue: redis-server

Add your secret OpenAI API token to your .env file. You can create one in your OpenAI account settings.

OPENAI_ACCESS_TOKEN=abc123
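ruby-openai doesn't pick up the environment variable on its own, so pass it through the gem's configuration block. A minimal sketch (the initializer filename is a convention, not required):

```ruby
# config/initializers/openai.rb
# Pass the token from .env through to the ruby-openai client, so
# OpenAI::Client.new can be called later with no arguments.
OpenAI.configure do |config|
  config.access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
end
```

Using `ENV.fetch` (rather than `ENV[]`) makes the app fail fast at boot if the token is missing.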

Add the new routes:

# config/routes.rb
resources :chats, only: %i[create show] do
  resources :messages, only: %i[create]
end

Generate the migrations:

bin/rails generate migration CreateChats user:references
bin/rails generate migration CreateMessages chat:references role:integer content:string

Add the rest of the code, full example files below!

# Controllers.
app/controllers/chats_controller.rb
app/controllers/messages_controller.rb

# Sidekiq job to stream the data from the OpenAI API.
app/jobs/get_ai_response.rb

# Migrations
db/migrate/20230427131800_create_chats.rb
db/migrate/20230427131900_create_messages.rb

# Models
app/models/chat.rb
app/models/message.rb

# Views
app/views/chats/show.html.erb
app/views/messages/_form.html.erb
app/views/messages/_message.html.erb
app/views/messages/create.turbo_stream.erb
# db/migrate/20230427131800_create_chats.rb
# bin/rails generate migration CreateChats user:references
class CreateChats < ActiveRecord::Migration[7.0]
  def change
    create_table :chats do |t|
      t.references :user, null: false, foreign_key: true

      t.timestamps
    end
  end
end

# db/migrate/20230427131900_create_messages.rb
# bin/rails generate migration CreateMessages chat:references role:integer content:string
class CreateMessages < ActiveRecord::Migration[7.0]
  def change
    create_table :messages do |t|
      t.references :chat, foreign_key: true
      t.integer :role, null: false, default: 0
      t.string :content, null: false
      t.integer :response_number, null: false, default: 0

      t.timestamps
    end
  end
end
# app/models/chat.rb
class Chat < ApplicationRecord
  belongs_to :user
  has_many :messages, dependent: :destroy
end
# app/controllers/chats_controller.rb
# Note: respond_with is provided by the responders gem.
class ChatsController < ApplicationController
  before_action :authenticate_user!
  before_action :set_chat, only: %i[show]

  def show
    respond_with(@chat)
  end

  def create
    @chat = Chat.create(user: current_user)
    respond_with(@chat)
  end

  private

  def set_chat
    @chat = Chat.find(params[:id])
  end
end
# app/jobs/get_ai_response.rb
# Assumes an application base class that includes Sidekiq::Job,
# e.g. class SidekiqJob; include Sidekiq::Job; end
class GetAiResponse < SidekiqJob
  RESPONSES_PER_MESSAGE = 1

  def perform(chat_id)
    chat = Chat.find(chat_id)
    call_openai(chat: chat)
  end

  private

  def call_openai(chat:)
    OpenAI::Client.new.chat(
      parameters: {
        model: "gpt-3.5-turbo",
        messages: Message.for_openai(chat.messages),
        temperature: 0.8,
        stream: stream_proc(chat: chat),
        n: RESPONSES_PER_MESSAGE
      }
    )
  end

  # Create one empty assistant message per expected response, so the
  # stream proc has records to append content to as chunks arrive.
  def create_messages(chat:)
    messages = []
    RESPONSES_PER_MESSAGE.times do |i|
      message = chat.messages.create(role: "assistant", content: "", response_number: i)
      message.broadcast_created
      messages << message
    end
    messages
  end

  def stream_proc(chat:)
    messages = create_messages(chat: chat)
    proc do |chunk, _bytesize|
      new_content = chunk.dig("choices", 0, "delta", "content")
      message = messages.find { |m| m.response_number == chunk.dig("choices", 0, "index") }
      message.update(content: message.content + new_content) if new_content
    end
  end
end
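To see what `stream_proc` is doing, here is a standalone sketch of how streamed content deltas get concatenated into the full reply. The chunk hashes below are illustrative examples of the shape the API streams back, not real responses:

```ruby
# Each streamed chunk carries a small "delta" of the reply; concatenating
# the deltas in order rebuilds the full message.
chunks = [
  { "choices" => [{ "index" => 0, "delta" => { "content" => "Hel" } }] },
  { "choices" => [{ "index" => 0, "delta" => { "content" => "lo!" } }] },
  { "choices" => [{ "index" => 0, "delta" => {} }] } # final chunk: no content key
]

content = ""
chunks.each do |chunk|
  new_content = chunk.dig("choices", 0, "delta", "content")
  content += new_content if new_content
end

content # => "Hello!"
```

In the real job, each delta triggers a `message.update`, and the `after_update_commit` callback re-broadcasts the partial, which is what produces the typewriter effect in the browser.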
# app/models/message.rb
class Message < ApplicationRecord
  include ActionView::RecordIdentifier # for dom_id

  enum role: { system: 0, assistant: 10, user: 20 }

  belongs_to :chat

  after_create_commit -> { broadcast_created }
  after_update_commit -> { broadcast_updated }

  def broadcast_created
    broadcast_append_later_to(
      "#{dom_id(chat)}_messages",
      partial: "messages/message",
      locals: { message: self, scroll_to: true },
      target: "#{dom_id(chat)}_messages"
    )
  end

  def broadcast_updated
    # Turbo removes an existing element with the same DOM id before
    # appending, so re-appending the partial re-renders the message in place.
    broadcast_append_to(
      "#{dom_id(chat)}_messages",
      partial: "messages/message",
      locals: { message: self, scroll_to: true },
      target: "#{dom_id(chat)}_messages"
    )
  end

  def self.for_openai(messages)
    messages.map { |message| { role: message.role, content: message.content } }
  end
end
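`Message.for_openai` just maps records into the role/content hashes the Chat Completions API expects. A quick standalone sketch, using a stand-in Struct (hypothetical, in place of the ActiveRecord model):

```ruby
# Stand-in for the Message model, for illustration only.
FakeMessage = Struct.new(:role, :content)

# Same mapping as Message.for_openai in the model above.
def for_openai(messages)
  messages.map { |message| { role: message.role, content: message.content } }
end

history = [
  FakeMessage.new("user", "Hi!"),
  FakeMessage.new("assistant", "Hello! How can I help?")
]

for_openai(history)
# => [{ role: "user", content: "Hi!" },
#     { role: "assistant", content: "Hello! How can I help?" }]
```

Because the whole `chat.messages` collection is passed on every request, the model sees the full conversation history each time; trimming old messages to stay under the model's context limit is left as an exercise.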
# app/controllers/messages_controller.rb
class MessagesController < ApplicationController
  include ActionView::RecordIdentifier

  before_action :authenticate_user!

  def create
    @message = Message.create(message_params.merge(chat_id: params[:chat_id], role: "user"))
    GetAiResponse.perform_async(@message.chat_id)
    respond_to do |format|
      format.turbo_stream
    end
  end

  private

  def message_params
    params.require(:message).permit(:content)
  end
end
# app/views/chats/show.html.erb
<div class="mx-auto w-full flex">
  <div class="mx-auto">
    <div class="bg-white py-8">
      <div class="mx-auto max-w-lg px-6">
        <ul role="list" class="overflow-y-auto max-h-[48vh] flex flex-col-reverse">
          <%= turbo_stream_from "#{dom_id(@chat)}_messages" %>
          <div id="<%= dom_id(@chat) %>_messages">
            <%= render @chat.messages %>
          </div>
        </ul>
        <%= render partial: "messages/form", locals: { chat: @chat } %>
      </div>
    </div>
  </div>
</div>
# app/views/messages/create.turbo_stream.erb
<%= turbo_stream.append "#{dom_id(@message.chat)}_messages" do %>
  <%= render "message", message: @message, scroll_to: true %>
<% end %>

<%= turbo_stream.replace "#{dom_id(@message.chat)}_message_form" do %>
  <%= render "form", chat: @message.chat %>
<% end %>
# app/views/messages/_form.html.erb
<%= turbo_frame_tag "#{dom_id(chat)}_message_form" do %>
  <%= form_with(model: Message.new, url: [chat, chat.messages.new]) do |form| %>
    <div class="my-5">
      <%= form.text_area :content, rows: 4, class: "block shadow rounded-md border border-gray-200 outline-none px-3 py-2 mt-2 w-full", autofocus: true, "x-on:keydown.cmd.enter" => "$event.target.form.requestSubmit();" %>
    </div>
    <div class="grid justify-items-end">
      <%= form.button type: :submit, class: "rounded-lg py-3 px-5 bg-blue-600 text-white inline-block font-medium cursor-pointer" do %>
        <i class="fas fa-paper-plane"></i>
        <span class="pl-2">Send</span>
      <% end %>
    </div>
  <% end %>
<% end %>
# app/views/messages/_message.html.erb
# Thanks to github.com/fanahova for this template!
<div id="<%= dom_id(message) %>_messages">
  <% if message.user? %>
    <div class="bg-sky-400 rounded-lg m-8 text-white p-4">
      <%= message.content %>
    </div>
  <% else %>
    <div class="bg-gray-200 rounded-lg m-8 p-4">
      <%= message.content %>
    </div>
  <% end %>
</div>
@nickkirt

nickkirt commented Feb 5, 2024

Wow, I fixed it!!!

In the Messages Controller, there's the following line:
GetAiResponse.perform_async(@message.chat_id)

First I swapped it for:
GetAiResponseJob.perform_now(@message.id)

Then I replaced that with:
GetAiResponseJob.perform_later(@message.id)

Now it works fine!

I'm surprised that's the problem. But I'm happy that it's working.

@OseasSon

Did anyone manage to create a run and stream the response? I'm trying to implement streaming for assistants. I would appreciate any insights 🙏

@joshuachestang

Did anyone manage to create a run and stream the response? I'm trying to implement streaming for assistants. I would appreciate any insights 🙏

+1

@ShawnAukstak

Did anyone manage to create a run and stream the response? I'm trying to implement streaming for assistants. I would appreciate any insights 🙏

I used the basic structure mentioned here, with a number of changes, to create one, and it seems to be working well so far at limited scale. If you have any specific questions I'll try to follow up and help!

Some other resources that might be helpful:
Drifting Ruby - streaming llm response
How To Integrate Chatgpt With Rails 7: Step-by-step Tutorial
