- You MUST NOT try to generate a Rails app from scratch on your own by generating each file. For a NEW app you MUST use `rails new` first to generate all of the boilerplate files necessary.
- Create an app in the current directory with `rails new .`
- Use Tailwind CSS for styling. Use `--css tailwind` as an option on the `rails new` call to do this automatically.
- Use Ruby 3.2+ and Rails 8.0+ practices.
- Use the default Minitest approach for testing; do not use RSpec.
- Default to using SQLite in development. `rails new` will do this automatically, but take care that any custom SQL you write is SQLite compatible.
- An app can be built with a devcontainer, e.g. `rails new myapp --devcontainer`, but only do this if requested directly.
- Rails apps have a lot of directories to consider, such as `app`, `config`, `db`, etc.
- Adhere to MVC conventions: singular model names (e.g., Product) map to plural tables (products); controllers are plural.
- Guard against incapable browsers accessing controllers with `allow_browser versions: :modern` in `ApplicationController`.
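As a quick illustration of the naming conventions above, here is a plain-Ruby sketch (not Rails itself — Rails uses ActiveSupport's full inflector, and the helper names here are made up):

```ruby
# Toy version of Rails' singular-model / plural-table / plural-controller
# convention. It only appends "s", which is enough for regular nouns
# like "Product"; real Rails handles irregular plurals too.
def table_name_for(model_name)
  model_name.downcase + "s"            # Product -> products
end

def controller_name_for(model_name)
  model_name + "sController"           # Product -> ProductsController
end

puts table_name_for("Product")
puts controller_name_for("Product")
```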
```ruby
require "openai"

client = OpenAI::Client.new(
  api_key: ENV.fetch("OPENROUTER_API_KEY"),
  base_url: "https://openrouter.ai/api/v1"
)

models = %w{
  google/gemma-2b-it
  anthropic/claude-sonnet-4
}
```
```ruby
# Example of using SQLite VSS with OpenAI's text embedding API
# from Ruby.
# Note: install/bundle the sqlite3, sqlite_vss, and ruby-openai gems first.
# OPENAI_API_KEY must also be set in the environment.
# Other embeddings can be used, but this is the easiest for a quick demo.
# More on the topic at:
#   https://observablehq.com/@asg017/introducing-sqlite-vss
#   https://observablehq.com/@asg017/making-sqlite-extension-gem-installable
```
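To show the idea the sqlite-vss extension implements — nearest-neighbour search over embedding vectors — here is a gem-free sketch in plain Ruby. The vectors are made-up 3-D stand-ins for real embeddings, which have hundreds of dimensions:

```ruby
# Rank stored vectors by cosine similarity to a query vector; this is
# what a vector search extension does, just without the SQL interface
# or the indexing that makes it fast at scale.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  mag = ->(v) { Math.sqrt(v.sum { |x| x * x }) }
  dot / (mag.call(a) * mag.call(b))
end

docs = {
  "cats"    => [0.9, 0.1, 0.0],   # placeholder embeddings
  "dogs"    => [0.7, 0.3, 0.1],
  "finance" => [0.0, 0.1, 0.9],
}

query = [0.85, 0.15, 0.05]
best = docs.max_by { |_name, vec| cosine_similarity(query, vec) }
puts best.first   # the document nearest to the query vector
```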
```ruby
# Standalone example, since the one in the docs doesn't seem to work
# outside of a Rails app.
require 'ruby_llm'

RubyLLM.configure do |config|
  config.gemini_api_key = ENV['GEMINI_API_KEY']
end

chat = RubyLLM.chat(model: "gemini-2.5-flash-image")
reply = chat.ask("Cyberpunk city street at night, raining, neon signs")
```
Let's say you want to use Ruby for the backend of a basic webapp but React on the frontend. Here's how.
(Note: All tested on January 13, 2025 with Ruby 3.3, Sinatra 4.1.1, and React 18.3. Configs may change over time.)
First, create the app folder and set up Sinatra:
mkdir my-sinatra-react-app
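Inside that folder, a minimal Gemfile for the Ruby side might look like the sketch below. The gem list is an assumption on my part: Sinatra 4 no longer bundles a server, so the rackup and puma gems are listed alongside it, and the Sinatra version matches the note above.

```ruby
# Gemfile — minimal sketch for the Sinatra backend.
source "https://rubygems.org"

gem "sinatra", "~> 4.1"
gem "rackup"   # provides the `rackup` command under Rack 3
gem "puma"     # the web server rackup will boot
```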
```ruby
source "https://rubygems.org"

gemspec
```
```ruby
require "openai"

client = OpenAI::Client.new # assumes a valid key in OPENAI_API_KEY

begin
  resp = client.chat.completions.create(
    model: :"gpt-5-nano",
    reasoning_effort: :high,
    verbosity: :low,
    messages: [{ role: "user", content: "Tell me a joke" }]
  )
  puts resp.choices.first.message.content
rescue StandardError => e
  warn "Request failed: #{e.message}"
end
```
```python
import sys

import requests
import replicate
from Pylette import extract_colors

# YOU NEED A REPLICATE API TOKEN IN 'REPLICATE_API_TOKEN'
prompt = sys.argv[1] if len(sys.argv) > 1 else "sandy beach"

output = replicate.run(
```
```javascript
// Open a supplied HTML file and record whatever's going on to an MP4.
//
// Usage: node recorder.cjs <path_to_html_file>
// Dependencies: npm install puppeteer fluent-ffmpeg
// (and yes, you need ffmpeg installed)
//
// It expects a <canvas> element to be on the page as it waits for
// that to load in first, but you can edit the code below if you
// don't want that.
//
```