@sgoedecke
Created March 8, 2022 01:21
# frozen_string_literal: true
module ActiveRecord
module Validations
class AbsenceValidator < ActiveModel::Validations::AbsenceValidator # :nodoc:
def validate_each(record, attribute, association_or_value)
if record.class._reflect_on_association(attribute)
association_or_value = Array.wrap(association_or_value).reject(&:marked_for_destruction?)
end
super
end
end
module ClassMethods
# Validates that the specified attributes are not present (as defined by
# Object#present?). If the attribute is an association, the associated object
# is considered absent if it was marked for destruction.
#
# See ActiveModel::Validations::HelperMethods.validates_absence_of for more information.
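#
# A minimal illustrative sketch (the model and attribute names are hypothetical):
#
#   class LegacyAccount < ActiveRecord::Base
#     validates_absence_of :deprecated_token
#   end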
def validates_absence_of(*attr_names)
validates_with AbsenceValidator, _merge_attributes(attr_names)
end
end
end
end
# frozen_string_literal: true
require "set"
require "active_record/connection_adapters/sql_type_metadata"
require "active_record/connection_adapters/abstract/schema_dumper"
require "active_record/connection_adapters/abstract/schema_creation"
require "active_support/concurrency/load_interlock_aware_monitor"
require "arel/collectors/bind"
require "arel/collectors/composite"
require "arel/collectors/sql_string"
require "arel/collectors/substitute_binds"
module ActiveRecord
module ConnectionAdapters # :nodoc:
# Active Record supports multiple database systems. AbstractAdapter and
# related classes form the abstraction layer which makes this possible.
# An AbstractAdapter represents a connection to a database, and provides an
# abstract interface for database-specific functionality such as establishing
# a connection, escaping values, building the right SQL fragments for +:offset+
# and +:limit+ options, etc.
#
# All the concrete database adapters follow the interface laid down in this class.
# {ActiveRecord::Base.connection}[rdoc-ref:ConnectionHandling#connection] returns an AbstractAdapter object, which
# you can use.
#
# Most of the methods in the adapter are useful during migrations; the
# instance methods provided by SchemaStatements are especially notable.
class AbstractAdapter
ADAPTER_NAME = "Abstract"
include ActiveSupport::Callbacks
define_callbacks :checkout, :checkin
include Quoting, DatabaseStatements, SchemaStatements
include DatabaseLimits
include QueryCache
include Savepoints
SIMPLE_INT = /\A\d+\z/
COMMENT_REGEX = %r{(?:--.*\n)*|/\*(?:[^*]|\*[^/])*\*/}m
attr_accessor :pool
attr_reader :visitor, :owner, :logger, :lock
alias :in_use? :owner
set_callback :checkin, :after, :enable_lazy_transactions!
def self.type_cast_config_to_integer(config)
if config.is_a?(Integer)
config
elsif SIMPLE_INT.match?(config)
config.to_i
else
config
end
end
def self.type_cast_config_to_boolean(config)
if config == "false"
false
else
config
end
end
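# Illustrative coercions performed by the two config helpers above
# (the input values are hypothetical):
#
#   type_cast_config_to_integer("5")         # => 5
#   type_cast_config_to_integer("unlimited") # => "unlimited" (not a simple integer)
#   type_cast_config_to_boolean("false")     # => false
#   type_cast_config_to_boolean(true)        # => true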
def self.validate_default_timezone(config)
case config
when nil
when "utc", "local"
config.to_sym
else
raise ArgumentError, "default_timezone must be either 'utc' or 'local'"
end
end
DEFAULT_READ_QUERY = [:begin, :commit, :explain, :release, :rollback, :savepoint, :select, :with] # :nodoc:
private_constant :DEFAULT_READ_QUERY
def self.build_read_query_regexp(*parts) # :nodoc:
parts += DEFAULT_READ_QUERY
parts = parts.map { |part| /#{part}/i }
/\A(?:[(\s]|#{COMMENT_REGEX})*#{Regexp.union(*parts)}/
end
def self.quoted_column_names # :nodoc:
@quoted_column_names ||= {}
end
def self.quoted_table_names # :nodoc:
@quoted_table_names ||= {}
end
def initialize(connection, logger = nil, config = {}) # :nodoc:
super()
@raw_connection = connection
@owner = nil
@instrumenter = ActiveSupport::Notifications.instrumenter
@logger = logger
@config = config
@pool = ActiveRecord::ConnectionAdapters::NullPool.new
@idle_since = Process.clock_gettime(Process::CLOCK_MONOTONIC)
@visitor = arel_visitor
@statements = build_statement_pool
@lock = ActiveSupport::Concurrency::LoadInterlockAwareMonitor.new
@prepared_statements = self.class.type_cast_config_to_boolean(
config.fetch(:prepared_statements, true)
)
@advisory_locks_enabled = self.class.type_cast_config_to_boolean(
config.fetch(:advisory_locks, true)
)
@default_timezone = self.class.validate_default_timezone(config[:default_timezone])
@raw_connection_dirty = false
configure_connection
end
EXCEPTION_NEVER = { Exception => :never }.freeze # :nodoc:
EXCEPTION_IMMEDIATE = { Exception => :immediate }.freeze # :nodoc:
private_constant :EXCEPTION_NEVER, :EXCEPTION_IMMEDIATE
def with_instrumenter(instrumenter, &block) # :nodoc:
Thread.handle_interrupt(EXCEPTION_NEVER) do
previous_instrumenter = @instrumenter
@instrumenter = instrumenter
Thread.handle_interrupt(EXCEPTION_IMMEDIATE, &block)
ensure
@instrumenter = previous_instrumenter
end
end
def check_if_write_query(sql) # :nodoc:
if preventing_writes? && write_query?(sql)
raise ActiveRecord::ReadOnlyError, "Write query attempted while in readonly mode: #{sql}"
end
end
def replica?
@config[:replica] || false
end
def use_metadata_table?
@config.fetch(:use_metadata_table, true)
end
def default_timezone
@default_timezone || ActiveRecord.default_timezone
end
# Determines whether writes are currently being prevented.
#
# Returns true if the connection is a replica.
#
# If the application is using legacy handling, returns
# true if +connection_handler.prevent_writes+ is set.
#
# If the application is using the new connection handling, this will
# return true based on +current_preventing_writes+.
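#
# An illustrative sketch of how this surfaces to application code (the model
# is hypothetical; assumes +ActiveRecord::Base.while_preventing_writes+ is
# available, as in recent Rails versions):
#
#   ActiveRecord::Base.while_preventing_writes do
#     Post.create!(title: "x") # raises ActiveRecord::ReadOnlyError
#   end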
def preventing_writes?
return true if replica?
return ActiveRecord::Base.connection_handler.prevent_writes if ActiveRecord.legacy_connection_handling
return false if connection_class.nil?
connection_class.current_preventing_writes
end
def migrations_paths # :nodoc:
@config[:migrations_paths] || Migrator.migrations_paths
end
def migration_context # :nodoc:
MigrationContext.new(migrations_paths, schema_migration)
end
def schema_migration # :nodoc:
@schema_migration ||= begin
conn = self
spec_name = conn.pool.pool_config.connection_specification_name
return ActiveRecord::SchemaMigration if spec_name == "ActiveRecord::Base"
schema_migration_name = "#{spec_name}::SchemaMigration"
Class.new(ActiveRecord::SchemaMigration) do
define_singleton_method(:name) { schema_migration_name }
define_singleton_method(:to_s) { schema_migration_name }
self.connection_specification_name = spec_name
end
end
end
def prepared_statements?
@prepared_statements && !prepared_statements_disabled_cache.include?(object_id)
end
alias :prepared_statements :prepared_statements?
def prepared_statements_disabled_cache # :nodoc:
ActiveSupport::IsolatedExecutionState[:active_record_prepared_statements_disabled_cache] ||= Set.new
end
class Version
include Comparable
attr_reader :full_version_string
def initialize(version_string, full_version_string = nil)
@version = version_string.split(".").map(&:to_i)
@full_version_string = full_version_string
end
def <=>(version_string)
@version <=> version_string.split(".").map(&:to_i)
end
def to_s
@version.join(".")
end
end
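# Illustrative comparisons (version strings are hypothetical); Version mixes in
# Comparable against dotted version strings:
#
#   Version.new("8.0.21") >= "8.0.1" # => true
#   Version.new("5.7.33") < "8.0"    # => true
#   Version.new("10.6.4", "10.6.4-MariaDB").full_version_string # => "10.6.4-MariaDB"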
def valid_type?(type) # :nodoc:
!native_database_types[type].nil?
end
# this method must only be called while holding connection pool's mutex
def lease
if in_use?
msg = +"Cannot lease connection, "
if @owner == ActiveSupport::IsolatedExecutionState.context
msg << "it is already leased by the current thread."
else
msg << "it is already in use by a different thread: #{@owner}. " \
"Current thread: #{ActiveSupport::IsolatedExecutionState.context}."
end
raise ActiveRecordError, msg
end
@owner = ActiveSupport::IsolatedExecutionState.context
end
def connection_class # :nodoc:
@pool.connection_class
end
# The role (e.g. +:writing+) for the current connection. In a
# non-multi role application, +:writing+ is returned.
def role
@pool.role
end
# The shard (e.g. +:default+) for the current connection. In
# a non-sharded application, +:default+ is returned.
def shard
@pool.shard
end
def schema_cache
@pool.get_schema_cache(self)
end
def schema_cache=(cache)
cache.connection = self
@pool.set_schema_cache(cache)
end
# this method must only be called while holding connection pool's mutex
def expire
if in_use?
if @owner != ActiveSupport::IsolatedExecutionState.context
raise ActiveRecordError, "Cannot expire connection, " \
"it is owned by a different thread: #{@owner}. " \
"Current thread: #{ActiveSupport::IsolatedExecutionState.context}."
end
@idle_since = Process.clock_gettime(Process::CLOCK_MONOTONIC)
@owner = nil
else
raise ActiveRecordError, "Cannot expire connection, it is not currently leased."
end
end
# this method must only be called while holding connection pool's mutex (and a desire for segfaults)
def steal! # :nodoc:
if in_use?
if @owner != ActiveSupport::IsolatedExecutionState.context
pool.send :remove_connection_from_thread_cache, self, @owner
@owner = ActiveSupport::IsolatedExecutionState.context
end
else
raise ActiveRecordError, "Cannot steal connection, it is not currently leased."
end
end
# Seconds since this connection was returned to the pool
def seconds_idle # :nodoc:
return 0 if in_use?
Process.clock_gettime(Process::CLOCK_MONOTONIC) - @idle_since
end
def unprepared_statement
cache = prepared_statements_disabled_cache.add?(object_id) if @prepared_statements
yield
ensure
cache&.delete(object_id)
end
# Returns the human-readable name of the adapter. Use mixed case - one
# can always use downcase if needed.
def adapter_name
self.class::ADAPTER_NAME
end
# Does the database for this adapter exist?
def self.database_exists?(config)
raise NotImplementedError
end
# Does this adapter support DDL rollbacks in transactions? That is, would
# CREATE TABLE or ALTER TABLE get rolled back by a transaction?
def supports_ddl_transactions?
false
end
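# Illustrative feature detection built on the capability flag above
# (the migration body is hypothetical):
#
#   if ActiveRecord::Base.connection.supports_ddl_transactions?
#     # CREATE TABLE / ALTER TABLE statements here roll back with the transaction
#   end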
def supports_bulk_alter?
false
end
# Does this adapter support savepoints?
def supports_savepoints?
false
end
# Do TransactionRollbackErrors on savepoints affect the parent
# transaction?
def savepoint_errors_invalidate_transactions?
false
end
def supports_restart_db_transaction?
false
end
# Does this adapter support application-enforced advisory locking?
def supports_advisory_locks?
false
end
# Should primary key values be selected from their corresponding
# sequence before the insert statement? If true, next_sequence_value
# is called before each insert to set the record's primary key.
def prefetch_primary_key?(table_name = nil)
false
end
def supports_partitioned_indexes?
false
end
# Does this adapter support index sort order?
def supports_index_sort_order?
false
end
# Does this adapter support partial indices?
def supports_partial_index?
false
end
# Does this adapter support expression indices?
def supports_expression_index?
false
end
# Does this adapter support explain?
def supports_explain?
false
end
# Does this adapter support setting the isolation level for a transaction?
def supports_transaction_isolation?
false
end
# Does this adapter support database extensions?
def supports_extensions?
false
end
# Does this adapter support creating indexes in the same statement as
# creating the table?
def supports_indexes_in_create?
false
end
# Does this adapter support creating foreign key constraints?
def supports_foreign_keys?
false
end
# Does this adapter support creating invalid constraints?
def supports_validate_constraints?
false
end
# Does this adapter support creating deferrable constraints?
def supports_deferrable_constraints?
false
end
# Does this adapter support creating check constraints?
def supports_check_constraints?
false
end
# Does this adapter support views?
def supports_views?
false
end
# Does this adapter support materialized views?
def supports_materialized_views?
false
end
# Does this adapter support datetime with precision?
def supports_datetime_with_precision?
false
end
# Does this adapter support json data type?
def supports_json?
false
end
# Does this adapter support metadata comments on database objects (tables, columns, indexes)?
def supports_comments?
false
end
# Can comments for tables, columns, and indexes be specified in create/alter table statements?
def supports_comments_in_create?
false
end
# Does this adapter support virtual columns?
def supports_virtual_columns?
false
end
# Does this adapter support foreign/external tables?
def supports_foreign_tables?
false
end
# Does this adapter support optimizer hints?
def supports_optimizer_hints?
false
end
def supports_common_table_expressions?
false
end
def supports_lazy_transactions?
false
end
def supports_insert_returning?
false
end
def supports_insert_on_duplicate_skip?
false
end
def supports_insert_on_duplicate_update?
false
end
def supports_insert_conflict_target?
false
end
def supports_concurrent_connections?
true
end
def async_enabled? # :nodoc:
supports_concurrent_connections? &&
!ActiveRecord.async_query_executor.nil? && !pool.async_executor.nil?
end
# This is meant to be implemented by the adapters that support extensions
def disable_extension(name)
end
# This is meant to be implemented by the adapters that support extensions
def enable_extension(name)
end
# This is meant to be implemented by the adapters that support custom enum types
def create_enum(*) # :nodoc:
end
def advisory_locks_enabled? # :nodoc:
supports_advisory_locks? && @advisory_locks_enabled
end
# This is meant to be implemented by the adapters that support advisory
# locks
#
# Return true if we got the lock, otherwise false
def get_advisory_lock(lock_id) # :nodoc:
end
# This is meant to be implemented by the adapters that support advisory
# locks.
#
# Return true if we released the lock, otherwise false
def release_advisory_lock(lock_id) # :nodoc:
end
# A list of extensions, to be filled in by adapters that support them.
def extensions
[]
end
# A list of index algorithms, to be filled by adapters that support them.
def index_algorithms
{}
end
# REFERENTIAL INTEGRITY ====================================
# Override to turn off referential integrity while executing <tt>&block</tt>.
def disable_referential_integrity
yield
end
# Override to check all foreign key constraints in a database.
def all_foreign_keys_valid?
true
end
# CONNECTION MANAGEMENT ====================================
# Checks whether the connection to the database is still active. This includes
# checking whether the database is actually capable of responding, i.e. whether
# the connection isn't stale.
def active?
end
# Disconnects from the database if already connected, and establishes a
# new connection with the database. Implementors should call super
# immediately after establishing the new connection (and while still
# holding @lock).
def reconnect!(restore_transactions: false)
reset_transaction(restore: restore_transactions) do
clear_cache!(new_connection: true)
configure_connection
end
end
# Disconnects from the database if already connected. Otherwise, this
# method does nothing.
def disconnect!
clear_cache!(new_connection: true)
reset_transaction
end
# Immediately forget this connection ever existed. Unlike disconnect!,
# this will not communicate with the server.
#
# After calling this method, the behavior of all other methods becomes
# undefined. This is called internally just before a forked process gets
# rid of a connection that belonged to its parent.
def discard!
# This should be overridden by concrete adapters.
#
# Prevent @raw_connection's finalizer from touching the socket, or
# otherwise communicating with its server, when it is collected.
if schema_cache.connection == self
schema_cache.connection = nil
end
end
# Reset the state of this connection, directing the DBMS to clear
# transactions and other connection-related server-side state. Usually a
# database-dependent operation.
#
# If a database driver or protocol does not support such a feature,
# implementors may alias this to #reconnect!. Otherwise, implementors
# should call super immediately after resetting the connection (and while
# still holding @lock).
def reset!
clear_cache!(new_connection: true)
reset_transaction
configure_connection
end
# Removes the connection from the pool and disconnect it.
def throw_away!
pool.remove self
disconnect!
end
# Clear any caching the database adapter may be doing.
def clear_cache!(new_connection: false)
if @statements
@lock.synchronize do
if new_connection
@statements.reset
else
@statements.clear
end
end
end
end
# Returns true if the connection needs to be reloaded between requests, as in development mode.
def requires_reloading?
false
end
# Checks whether the connection to the database is still active (i.e. not stale).
# This is done under the hood by calling #active?. If the connection
# is no longer active, then this method will reconnect to the database.
def verify!
reconnect! unless active?
end
# Provides access to the underlying database driver for this adapter. For
# example, this method returns a Mysql2::Client object in case of Mysql2Adapter,
# and a PG::Connection object in case of PostgreSQLAdapter.
#
# This is useful for when you need to call a proprietary method such as
# PostgreSQL's lo_* methods.
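#
# An illustrative sketch, assuming the PostgreSQL adapter is in use
# (PG::Connection#server_version comes from the pg gem):
#
#   pg_conn = ActiveRecord::Base.connection.raw_connection
#   pg_conn.server_version # => e.g. 140002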
def raw_connection
disable_lazy_transactions!
@raw_connection_dirty = true
@raw_connection
end
def default_uniqueness_comparison(attribute, value) # :nodoc:
attribute.eq(value)
end
def case_sensitive_comparison(attribute, value) # :nodoc:
attribute.eq(value)
end
def case_insensitive_comparison(attribute, value) # :nodoc:
column = column_for_attribute(attribute)
if can_perform_case_insensitive_comparison_for?(column)
attribute.lower.eq(attribute.relation.lower(value))
else
attribute.eq(value)
end
end
def can_perform_case_insensitive_comparison_for?(column)
true
end
private :can_perform_case_insensitive_comparison_for?
# Check the connection back in to the connection pool
def close
pool.checkin self
end
def default_index_type?(index) # :nodoc:
index.using.nil?
end
# Called by ActiveRecord::InsertAll and passed an instance of
# ActiveRecord::InsertAll::Builder. This method implements standard bulk
# inserts for all databases, but should be overridden by adapters to
# implement common features with non-standard syntax, like handling
# duplicates or returning values.
def build_insert_sql(insert) # :nodoc:
if insert.skip_duplicates? || insert.update_duplicates?
raise NotImplementedError, "#{self.class} should define `build_insert_sql` to implement adapter-specific logic for handling duplicates during INSERT"
end
"INSERT #{insert.into} #{insert.values_list}"
end
def get_database_version # :nodoc:
end
def database_version # :nodoc:
schema_cache.database_version
end
def check_version # :nodoc:
end
# Returns the version identifier of the schema currently available in
# the database. This is generally equal to the number of the highest-
# numbered migration that has been executed, or 0 if no schema
# information is present / the database is empty.
def schema_version
migration_context.current_version
end
def field_ordered_value(column, values) # :nodoc:
node = Arel::Nodes::Case.new(column)
values.each.with_index(1) do |value, order|
node.when(value).then(order)
end
Arel::Nodes::Ascending.new(node.else(values.length + 1))
end
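# Roughly, for a column +id+ and values [3, 1, 2] the node built above compiles
# to SQL shaped like (exact quoting depends on the adapter):
#
#   CASE "id" WHEN 3 THEN 1 WHEN 1 THEN 2 WHEN 2 THEN 3 ELSE 4 END ASC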
class << self
def register_class_with_precision(mapping, key, klass, **kwargs) # :nodoc:
mapping.register_type(key) do |*args|
precision = extract_precision(args.last)
klass.new(precision: precision, **kwargs)
end
end
def extended_type_map(default_timezone:) # :nodoc:
Type::TypeMap.new(self::TYPE_MAP).tap do |m|
register_class_with_precision m, %r(\A[^\(]*time)i, Type::Time, timezone: default_timezone
register_class_with_precision m, %r(\A[^\(]*datetime)i, Type::DateTime, timezone: default_timezone
m.alias_type %r(\A[^\(]*timestamp)i, "datetime"
end
end
private
def initialize_type_map(m)
register_class_with_limit m, %r(boolean)i, Type::Boolean
register_class_with_limit m, %r(char)i, Type::String
register_class_with_limit m, %r(binary)i, Type::Binary
register_class_with_limit m, %r(text)i, Type::Text
register_class_with_precision m, %r(date)i, Type::Date
register_class_with_precision m, %r(time)i, Type::Time
register_class_with_precision m, %r(datetime)i, Type::DateTime
register_class_with_limit m, %r(float)i, Type::Float
register_class_with_limit m, %r(int)i, Type::Integer
m.alias_type %r(blob)i, "binary"
m.alias_type %r(clob)i, "text"
m.alias_type %r(timestamp)i, "datetime"
m.alias_type %r(numeric)i, "decimal"
m.alias_type %r(number)i, "decimal"
m.alias_type %r(double)i, "float"
m.register_type %r(^json)i, Type::Json.new
m.register_type(%r(decimal)i) do |sql_type|
scale = extract_scale(sql_type)
precision = extract_precision(sql_type)
if scale == 0
# FIXME: Remove this class as well
Type::DecimalWithoutScale.new(precision: precision)
else
Type::Decimal.new(precision: precision, scale: scale)
end
end
end
def register_class_with_limit(mapping, key, klass)
mapping.register_type(key) do |*args|
limit = extract_limit(args.last)
klass.new(limit: limit)
end
end
def extract_scale(sql_type)
case sql_type
when /\((\d+)\)/ then 0
when /\((\d+)(,(\d+))\)/ then $3.to_i
end
end
def extract_precision(sql_type)
$1.to_i if sql_type =~ /\((\d+)(,\d+)?\)/
end
def extract_limit(sql_type)
$1.to_i if sql_type =~ /\((.*)\)/
end
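# Illustrative extractions for a few hypothetical SQL type strings:
#
#   extract_limit("varchar(255)")      # => 255
#   extract_precision("decimal(10,2)") # => 10
#   extract_scale("decimal(10,2)")     # => 2
#   extract_scale("decimal(10)")       # => 0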
end
TYPE_MAP = Type::TypeMap.new.tap { |m| initialize_type_map(m) }
EXTENDED_TYPE_MAPS = Concurrent::Map.new
private
def reconnect_can_restore_state?
transaction_manager.restorable? && !@raw_connection_dirty
end
def extended_type_map_key
if @default_timezone
{ default_timezone: @default_timezone }
end
end
def type_map
if key = extended_type_map_key
self.class::EXTENDED_TYPE_MAPS.compute_if_absent(key) do
self.class.extended_type_map(**key)
end
else
self.class::TYPE_MAP
end
end
def translate_exception_class(e, sql, binds)
message = "#{e.class.name}: #{e.message}"
exception = translate_exception(
e, message: message, sql: sql, binds: binds
)
exception.set_backtrace e.backtrace
exception
end
def log(sql, name = "SQL", binds = [], type_casted_binds = [], statement_name = nil, async: false, &block) # :doc:
@instrumenter.instrument(
"sql.active_record",
sql: sql,
name: name,
binds: binds,
type_casted_binds: type_casted_binds,
statement_name: statement_name,
async: async,
connection: self) do
@lock.synchronize(&block)
rescue => e
raise translate_exception_class(e, sql, binds)
end
end
def transform_query(sql)
ActiveRecord.query_transformers.each do |transformer|
sql = transformer.call(sql)
end
sql
end
def translate_exception(exception, message:, sql:, binds:)
# override in derived class
case exception
when RuntimeError
exception
else
ActiveRecord::StatementInvalid.new(message, sql: sql, binds: binds)
end
end
def without_prepared_statement?(binds)
!prepared_statements || binds.empty?
end
def column_for(table_name, column_name)
column_name = column_name.to_s
columns(table_name).detect { |c| c.name == column_name } ||
raise(ActiveRecordError, "No such column: #{table_name}.#{column_name}")
end
def column_for_attribute(attribute)
table_name = attribute.relation.name
schema_cache.columns_hash(table_name)[attribute.name.to_s]
end
def collector
if prepared_statements
Arel::Collectors::Composite.new(
Arel::Collectors::SQLString.new,
Arel::Collectors::Bind.new,
)
else
Arel::Collectors::SubstituteBinds.new(
self,
Arel::Collectors::SQLString.new,
)
end
end
def arel_visitor
Arel::Visitors::ToSql.new(self)
end
def build_statement_pool
end
# Builds the result object.
#
# This is an internal hook to make possible connection adapters to build
# custom result objects with connection-specific data.
def build_result(columns:, rows:, column_types: {})
ActiveRecord::Result.new(columns, rows, column_types)
end
# Perform any necessary initialization upon the newly-established
# @raw_connection -- this is the place to modify the adapter's
# connection settings, run queries to configure any application-global
# "session" variables, etc.
#
# Implementations may assume this method will only be called while
# holding @lock (or from #initialize).
def configure_connection
end
end
end
end
# frozen_string_literal: true
require "active_record/connection_adapters/abstract_adapter"
require "active_record/connection_adapters/statement_pool"
require "active_record/connection_adapters/mysql/column"
require "active_record/connection_adapters/mysql/explain_pretty_printer"
require "active_record/connection_adapters/mysql/quoting"
require "active_record/connection_adapters/mysql/schema_creation"
require "active_record/connection_adapters/mysql/schema_definitions"
require "active_record/connection_adapters/mysql/schema_dumper"
require "active_record/connection_adapters/mysql/schema_statements"
require "active_record/connection_adapters/mysql/type_metadata"
module ActiveRecord
module ConnectionAdapters
class AbstractMysqlAdapter < AbstractAdapter
include MySQL::Quoting
include MySQL::SchemaStatements
##
# :singleton-method:
# By default, the Mysql2Adapter will consider all columns of type <tt>tinyint(1)</tt>
# as boolean. If you wish to disable this emulation you can add the following line
# to your application.rb file:
#
# ActiveRecord::ConnectionAdapters::Mysql2Adapter.emulate_booleans = false
class_attribute :emulate_booleans, default: true
NATIVE_DATABASE_TYPES = {
primary_key: "bigint auto_increment PRIMARY KEY",
string: { name: "varchar", limit: 255 },
text: { name: "text" },
integer: { name: "int", limit: 4 },
bigint: { name: "bigint" },
float: { name: "float", limit: 24 },
decimal: { name: "decimal" },
datetime: { name: "datetime" },
timestamp: { name: "timestamp" },
time: { name: "time" },
date: { name: "date" },
binary: { name: "blob" },
blob: { name: "blob" },
boolean: { name: "tinyint", limit: 1 },
json: { name: "json" },
}
class StatementPool < ConnectionAdapters::StatementPool # :nodoc:
private
def dealloc(stmt)
stmt.close
end
end
def initialize(connection, logger, connection_options, config)
super(connection, logger, config)
end
def get_database_version # :nodoc:
full_version_string = get_full_version
version_string = version_string(full_version_string)
Version.new(version_string, full_version_string)
end
def mariadb? # :nodoc:
/mariadb/i.match?(full_version)
end
def supports_bulk_alter?
true
end
def supports_index_sort_order?
!mariadb? && database_version >= "8.0.1"
end
def supports_expression_index?
!mariadb? && database_version >= "8.0.13"
end
def supports_transaction_isolation?
true
end
def supports_restart_db_transaction?
true
end
def supports_explain?
true
end
def supports_indexes_in_create?
true
end
def supports_foreign_keys?
true
end
def supports_check_constraints?
if mariadb?
database_version >= "10.2.1"
else
database_version >= "8.0.16"
end
end
def supports_views?
true
end
def supports_datetime_with_precision?
mariadb? || database_version >= "5.6.4"
end
def supports_virtual_columns?
mariadb? || database_version >= "5.7.5"
end
# See https://dev.mysql.com/doc/refman/en/optimizer-hints.html for more details.
def supports_optimizer_hints?
!mariadb? && database_version >= "5.7.7"
end
def supports_common_table_expressions?
if mariadb?
database_version >= "10.2.1"
else
database_version >= "8.0.1"
end
end
def supports_advisory_locks?
true
end
def supports_insert_on_duplicate_skip?
true
end
def supports_insert_on_duplicate_update?
true
end
def field_ordered_value(column, values) # :nodoc:
field = Arel::Nodes::NamedFunction.new("FIELD", [column, values.reverse])
Arel::Nodes::Descending.new(field)
end
def get_advisory_lock(lock_name, timeout = 0) # :nodoc:
query_value("SELECT GET_LOCK(#{quote(lock_name.to_s)}, #{timeout})") == 1
end
def release_advisory_lock(lock_name) # :nodoc:
query_value("SELECT RELEASE_LOCK(#{quote(lock_name.to_s)})") == 1
end
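# An illustrative round trip through the two methods above (the lock name is
# hypothetical):
#
#   conn = ActiveRecord::Base.connection
#   conn.get_advisory_lock("reports_rebuild")     # => true
#   conn.release_advisory_lock("reports_rebuild") # => true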
def native_database_types
NATIVE_DATABASE_TYPES
end
def index_algorithms
{
default: "ALGORITHM = DEFAULT",
copy: "ALGORITHM = COPY",
inplace: "ALGORITHM = INPLACE",
instant: "ALGORITHM = INSTANT",
}
end
# HELPER METHODS ===========================================
# The two drivers have slightly different ways of yielding hashes of results, so
# this method must be implemented to provide a uniform interface.
def each_hash(result) # :nodoc:
raise NotImplementedError
end
# Must return the MySQL error number from the exception, if the exception has an
# error number.
def error_number(exception) # :nodoc:
raise NotImplementedError
end
# REFERENTIAL INTEGRITY ====================================
def disable_referential_integrity # :nodoc:
old = query_value("SELECT @@FOREIGN_KEY_CHECKS")
begin
update("SET FOREIGN_KEY_CHECKS = 0")
yield
ensure
update("SET FOREIGN_KEY_CHECKS = #{old}")
end
end
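# Illustrative use, e.g. when bulk-loading rows in arbitrary order (the block
# body is hypothetical):
#
#   ActiveRecord::Base.connection.disable_referential_integrity do
#     # inserts here run with FOREIGN_KEY_CHECKS = 0
#   end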
#--
# DATABASE STATEMENTS ======================================
#++
# Executes the SQL statement in the context of this connection.
def execute(sql, name = nil, async: false)
raw_execute(sql, name, async: async)
end
# Mysql2Adapter doesn't have to free a result after using it, but we use this method
# to write stuff in an abstract way without concerning ourselves about whether it
# needs to be explicitly freed or not.
def execute_and_free(sql, name = nil, async: false) # :nodoc:
yield execute(sql, name, async: async)
end
def begin_db_transaction # :nodoc:
execute("BEGIN", "TRANSACTION")
end
def begin_isolated_db_transaction(isolation) # :nodoc:
execute "SET TRANSACTION ISOLATION LEVEL #{transaction_isolation_levels.fetch(isolation)}"
begin_db_transaction
end
def commit_db_transaction # :nodoc:
execute("COMMIT", "TRANSACTION")
end
def exec_rollback_db_transaction # :nodoc:
execute("ROLLBACK", "TRANSACTION")
end
def exec_restart_db_transaction # :nodoc:
execute("ROLLBACK AND CHAIN", "TRANSACTION")
end
def empty_insert_statement_value(primary_key = nil) # :nodoc:
"VALUES ()"
end
# SCHEMA STATEMENTS ========================================
# Drops the database specified by +name+
# and creates it again using the provided +options+.
def recreate_database(name, options = {})
drop_database(name)
sql = create_database(name, options)
reconnect!
sql
end
# Create a new MySQL database with optional <tt>:charset</tt> and <tt>:collation</tt>.
# Charset defaults to utf8mb4.
#
# Example:
# create_database 'charset_test', charset: 'latin1', collation: 'latin1_bin'
# create_database 'matt_development'
# create_database 'matt_development', charset: :big5
def create_database(name, options = {})
if options[:collation]
execute "CREATE DATABASE #{quote_table_name(name)} DEFAULT COLLATE #{quote_table_name(options[:collation])}"
elsif options[:charset]
execute "CREATE DATABASE #{quote_table_name(name)} DEFAULT CHARACTER SET #{quote_table_name(options[:charset])}"
elsif row_format_dynamic_by_default?
execute "CREATE DATABASE #{quote_table_name(name)} DEFAULT CHARACTER SET `utf8mb4`"
else
raise "Configure a supported :charset and ensure innodb_large_prefix is enabled to support indexes on varchar(255) string columns."
end
end
# Drops a MySQL database.
#
# Example:
# drop_database('sebastian_development')
def drop_database(name) # :nodoc:
execute "DROP DATABASE IF EXISTS #{quote_table_name(name)}"
end
def current_database
query_value("SELECT database()", "SCHEMA")
end
# Returns the database character set.
def charset
show_variable "character_set_database"
end
# Returns the database collation strategy.
def collation
show_variable "collation_database"
end
def table_comment(table_name) # :nodoc:
scope = quoted_scope(table_name)
query_value(<<~SQL, "SCHEMA").presence
SELECT table_comment
FROM information_schema.tables
WHERE table_schema = #{scope[:schema]}
AND table_name = #{scope[:name]}
SQL
end
def change_table_comment(table_name, comment_or_changes) # :nodoc:
comment = extract_new_comment_value(comment_or_changes)
comment = "" if comment.nil?
execute("ALTER TABLE #{quote_table_name(table_name)} COMMENT #{quote(comment)}")
end
# Renames a table.
#
# Example:
# rename_table('octopuses', 'octopi')
def rename_table(table_name, new_name)
schema_cache.clear_data_source_cache!(table_name.to_s)
schema_cache.clear_data_source_cache!(new_name.to_s)
execute "RENAME TABLE #{quote_table_name(table_name)} TO #{quote_table_name(new_name)}"
rename_table_indexes(table_name, new_name)
end
# Drops a table from the database.
#
# [<tt>:force</tt>]
# Set to +:cascade+ to drop dependent objects as well.
# Defaults to false.
# [<tt>:if_exists</tt>]
# Set to +true+ to only drop the table if it exists.
# Defaults to false.
# [<tt>:temporary</tt>]
# Set to +true+ to drop temporary table.
# Defaults to false.
#
# Although this command ignores most +options+ and the block if one is given,
# it can be helpful to provide these in a migration's +change+ method so it can be reverted.
# In that case, +options+ and the block will be used by create_table.
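#
# Illustrative calls (the table name is hypothetical):
#
#   drop_table :accounts
#   drop_table :accounts, if_exists: true
#   drop_table :accounts, force: :cascade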
def drop_table(table_name, **options)
schema_cache.clear_data_source_cache!(table_name.to_s)
execute "DROP#{' TEMPORARY' if options[:temporary]} TABLE#{' IF EXISTS' if options[:if_exists]} #{quote_table_name(table_name)}#{' CASCADE' if options[:force] == :cascade}"
end
def rename_index(table_name, old_name, new_name)
if supports_rename_index?
validate_index_length!(table_name, new_name)
execute "ALTER TABLE #{quote_table_name(table_name)} RENAME INDEX #{quote_table_name(old_name)} TO #{quote_table_name(new_name)}"
else
super
end
end
def change_column_default(table_name, column_name, default_or_changes) # :nodoc:
default = extract_new_default_value(default_or_changes)
change_column table_name, column_name, nil, default: default
end
def change_column_null(table_name, column_name, null, default = nil) # :nodoc:
unless null || default.nil?
execute("UPDATE #{quote_table_name(table_name)} SET #{quote_column_name(column_name)}=#{quote(default)} WHERE #{quote_column_name(column_name)} IS NULL")
end
change_column table_name, column_name, nil, null: null
end
def change_column_comment(table_name, column_name, comment_or_changes) # :nodoc:
comment = extract_new_comment_value(comment_or_changes)
change_column table_name, column_name, nil, comment: comment
end
def change_column(table_name, column_name, type, **options) # :nodoc:
execute("ALTER TABLE #{quote_table_name(table_name)} #{change_column_for_alter(table_name, column_name, type, **options)}")
end
def rename_column(table_name, column_name, new_column_name) # :nodoc:
execute("ALTER TABLE #{quote_table_name(table_name)} #{rename_column_for_alter(table_name, column_name, new_column_name)}")
rename_column_indexes(table_name, column_name, new_column_name)
end
def add_index(table_name, column_name, **options) # :nodoc:
index, algorithm, if_not_exists = add_index_options(table_name, column_name, **options)
return if if_not_exists && index_exists?(table_name, column_name, name: index.name)
create_index = CreateIndexDefinition.new(index, algorithm)
execute schema_creation.accept(create_index)
end
def add_sql_comment!(sql, comment) # :nodoc:
sql << " COMMENT #{quote(comment)}" if comment.present?
sql
end
def foreign_keys(table_name)
raise ArgumentError unless table_name.present?
scope = quoted_scope(table_name)
fk_info = exec_query(<<~SQL, "SCHEMA")
SELECT fk.referenced_table_name AS 'to_table',
fk.referenced_column_name AS 'primary_key',
fk.column_name AS 'column',
fk.constraint_name AS 'name',
rc.update_rule AS 'on_update',
rc.delete_rule AS 'on_delete'
FROM information_schema.referential_constraints rc
JOIN information_schema.key_column_usage fk
USING (constraint_schema, constraint_name)
WHERE fk.referenced_column_name IS NOT NULL
AND fk.table_schema = #{scope[:schema]}
AND fk.table_name = #{scope[:name]}
AND rc.constraint_schema = #{scope[:schema]}
AND rc.table_name = #{scope[:name]}
SQL
fk_info.map do |row|
options = {
column: row["column"],
name: row["name"],
primary_key: row["primary_key"]
}
options[:on_update] = extract_foreign_key_action(row["on_update"])
options[:on_delete] = extract_foreign_key_action(row["on_delete"])
ForeignKeyDefinition.new(table_name, row["to_table"], options)
end
end
def check_constraints(table_name)
if supports_check_constraints?
scope = quoted_scope(table_name)
sql = <<~SQL
SELECT cc.constraint_name AS 'name',
cc.check_clause AS 'expression'
FROM information_schema.check_constraints cc
JOIN information_schema.table_constraints tc
USING (constraint_schema, constraint_name)
WHERE tc.table_schema = #{scope[:schema]}
AND tc.table_name = #{scope[:name]}
AND cc.constraint_schema = #{scope[:schema]}
SQL
sql += " AND cc.table_name = #{scope[:name]}" if mariadb?
chk_info = exec_query(sql, "SCHEMA")
chk_info.map do |row|
options = {
name: row["name"]
}
expression = row["expression"]
expression = expression[1..-2] unless mariadb? # remove parentheses added by mysql
CheckConstraintDefinition.new(table_name, expression, options)
end
else
raise NotImplementedError
end
end
def table_options(table_name) # :nodoc:
create_table_info = create_table_info(table_name)
# strip create_definitions and partition_options
# Be aware that `create_table_info` might not include any table options due to `NO_TABLE_OPTIONS` sql mode.
raw_table_options = create_table_info.sub(/\A.*\n\) ?/m, "").sub(/\n\/\*!.*\*\/\n\z/m, "").strip
return if raw_table_options.empty?
table_options = {}
if / DEFAULT CHARSET=(?<charset>\w+)(?: COLLATE=(?<collation>\w+))?/ =~ raw_table_options
raw_table_options = $` + $' # before part + after part
table_options[:charset] = charset
table_options[:collation] = collation if collation
end
# strip AUTO_INCREMENT
raw_table_options.sub!(/(ENGINE=\w+)(?: AUTO_INCREMENT=\d+)/, '\1')
# strip COMMENT
if raw_table_options.sub!(/ COMMENT='.+'/, "")
table_options[:comment] = table_comment(table_name)
end
table_options[:options] = raw_table_options unless raw_table_options == "ENGINE=InnoDB"
table_options
end
# SHOW VARIABLES LIKE 'name'
def show_variable(name)
query_value("SELECT @@#{name}", "SCHEMA")
rescue ActiveRecord::StatementInvalid
nil
end
def primary_keys(table_name) # :nodoc:
raise ArgumentError unless table_name.present?
scope = quoted_scope(table_name)
query_values(<<~SQL, "SCHEMA")
SELECT column_name
FROM information_schema.statistics
WHERE index_name = 'PRIMARY'
AND table_schema = #{scope[:schema]}
AND table_name = #{scope[:name]}
ORDER BY seq_in_index
SQL
end
def case_sensitive_comparison(attribute, value) # :nodoc:
column = column_for_attribute(attribute)
if column.collation && !column.case_sensitive?
attribute.eq(Arel::Nodes::Bin.new(value))
else
super
end
end
def can_perform_case_insensitive_comparison_for?(column)
column.case_sensitive?
end
private :can_perform_case_insensitive_comparison_for?
# In MySQL 5.7.5 and up, ONLY_FULL_GROUP_BY affects handling of queries that use
# DISTINCT and ORDER BY. It requires the ORDER BY columns in the select list for
# distinct queries, and requires that the ORDER BY include the distinct column.
# See https://dev.mysql.com/doc/refman/en/group-by-handling.html
def columns_for_distinct(columns, orders) # :nodoc:
order_columns = orders.compact_blank.map { |s|
# Convert Arel node to string
s = visitor.compile(s) unless s.is_a?(String)
# Remove any ASC/DESC modifiers
s.gsub(/\s+(?:ASC|DESC)\b/i, "")
}.compact_blank.map.with_index { |column, i| "#{column} AS alias_#{i}" }
(order_columns << super).join(", ")
end
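# Illustrative result (column and order names are hypothetical): for columns
# "posts.id" and orders ["posts.created_at DESC"], this returns
# "posts.created_at AS alias_0, posts.id".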
def strict_mode?
self.class.type_cast_config_to_boolean(@config.fetch(:strict, true))
end
def default_index_type?(index) # :nodoc:
index.using == :btree || super
end
def build_insert_sql(insert) # :nodoc:
sql = +"INSERT #{insert.into} #{insert.values_list}"
if insert.skip_duplicates?
no_op_column = quote_column_name(insert.keys.first)
sql << " ON DUPLICATE KEY UPDATE #{no_op_column}=#{no_op_column}"
elsif insert.update_duplicates?
sql << " ON DUPLICATE KEY UPDATE "
if insert.raw_update_sql?
sql << insert.raw_update_sql
else
sql << insert.touch_model_timestamps_unless { |column| "#{column}<=>VALUES(#{column})" }
sql << insert.updatable_columns.map { |column| "#{column}=VALUES(#{column})" }.join(",")
end
end
sql
end
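# Illustrative shape of the SQL generated above when skip_duplicates? is set
# (the table and columns are hypothetical):
#
#   INSERT INTO `posts` (`id`,`title`) VALUES (1,'a') ON DUPLICATE KEY UPDATE `id`=`id`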
def check_version # :nodoc:
if database_version < "5.5.8"
raise "Your version of MySQL (#{database_version}) is too old. Active Record supports MySQL >= 5.5.8."
end
end
class << self
def extended_type_map(default_timezone: nil, emulate_booleans:) # :nodoc:
super(default_timezone: default_timezone).tap do |m|
if emulate_booleans
m.register_type %r(^tinyint\(1\))i, Type::Boolean.new
end
end
end
private
def initialize_type_map(m)
super
m.register_type(%r(char)i) do |sql_type|
limit = extract_limit(sql_type)
Type.lookup(:string, adapter: :mysql2, limit: limit)
end
m.register_type %r(tinytext)i, Type::Text.new(limit: 2**8 - 1)
m.register_type %r(tinyblob)i, Type::Binary.new(limit: 2**8 - 1)
m.register_type %r(text)i, Type::Text.new(limit: 2**16 - 1)
m.register_type %r(blob)i, Type::Binary.new(limit: 2**16 - 1)
m.register_type %r(mediumtext)i, Type::Text.new(limit: 2**24 - 1)
m.register_type %r(mediumblob)i, Type::Binary.new(limit: 2**24 - 1)
m.register_type %r(longtext)i, Type::Text.new(limit: 2**32 - 1)
m.register_type %r(longblob)i, Type::Binary.new(limit: 2**32 - 1)
m.register_type %r(^float)i, Type::Float.new(limit: 24)
m.register_type %r(^double)i, Type::Float.new(limit: 53)
register_integer_type m, %r(^bigint)i, limit: 8
register_integer_type m, %r(^int)i, limit: 4
register_integer_type m, %r(^mediumint)i, limit: 3
register_integer_type m, %r(^smallint)i, limit: 2
register_integer_type m, %r(^tinyint)i, limit: 1
m.alias_type %r(year)i, "integer"
m.alias_type %r(bit)i, "binary"
m.register_type %r(^enum)i, Type.lookup(:string, adapter: :mysql2)
m.register_type %r(^set)i, Type.lookup(:string, adapter: :mysql2)
end
def register_integer_type(mapping, key, **options)
mapping.register_type(key) do |sql_type|
if /\bunsigned\b/.match?(sql_type)
Type::UnsignedInteger.new(**options)
else
Type::Integer.new(**options)
end
end
end
def extract_precision(sql_type)
if /\A(?:date)?time(?:stamp)?\b/.match?(sql_type)
super || 0
else
super
end
end
end
TYPE_MAP = Type::TypeMap.new.tap { |m| initialize_type_map(m) }
EXTENDED_TYPE_MAPS = Concurrent::Map.new
EMULATE_BOOLEANS_TRUE = { emulate_booleans: true }.freeze
private
def extended_type_map_key
if @default_timezone
{ default_timezone: @default_timezone, emulate_booleans: emulate_booleans }
elsif emulate_booleans
EMULATE_BOOLEANS_TRUE
end
end
def raw_execute(sql, name, async: false)
materialize_transactions
mark_transaction_written_if_write(sql)
log(sql, name, async: async) do
ActiveSupport::Dependencies.interlock.permit_concurrent_loads do
@raw_connection.query(sql)
end
end
end
# See https://dev.mysql.com/doc/mysql-errors/en/server-error-reference.html
ER_DB_CREATE_EXISTS = 1007
ER_FILSORT_ABORT = 1028
ER_DUP_ENTRY = 1062
ER_NOT_NULL_VIOLATION = 1048
ER_NO_REFERENCED_ROW = 1216
ER_ROW_IS_REFERENCED = 1217
ER_DO_NOT_HAVE_DEFAULT = 1364
ER_ROW_IS_REFERENCED_2 = 1451
ER_NO_REFERENCED_ROW_2 = 1452
ER_DATA_TOO_LONG = 1406
ER_OUT_OF_RANGE = 1264
ER_LOCK_DEADLOCK = 1213
ER_CANNOT_ADD_FOREIGN = 1215
ER_CANNOT_CREATE_TABLE = 1005
ER_LOCK_WAIT_TIMEOUT = 1205
ER_QUERY_INTERRUPTED = 1317
ER_QUERY_TIMEOUT = 3024
ER_FK_INCOMPATIBLE_COLUMNS = 3780
def translate_exception(exception, message:, sql:, binds:)
case error_number(exception)
when nil
if exception.message.match?(/MySQL client is not connected/i)
ConnectionNotEstablished.new(exception)
else
super
end
when ER_DB_CREATE_EXISTS
DatabaseAlreadyExists.new(message, sql: sql, binds: binds)
when ER_DUP_ENTRY
RecordNotUnique.new(message, sql: sql, binds: binds)
when ER_NO_REFERENCED_ROW, ER_ROW_IS_REFERENCED, ER_ROW_IS_REFERENCED_2, ER_NO_REFERENCED_ROW_2
InvalidForeignKey.new(message, sql: sql, binds: binds)
when ER_CANNOT_ADD_FOREIGN, ER_FK_INCOMPATIBLE_COLUMNS
mismatched_foreign_key(message, sql: sql, binds: binds)
when ER_CANNOT_CREATE_TABLE
if message.include?("errno: 150")
mismatched_foreign_key(message, sql: sql, binds: binds)
else
super
end
when ER_DATA_TOO_LONG
ValueTooLong.new(message, sql: sql, binds: binds)
when ER_OUT_OF_RANGE
RangeError.new(message, sql: sql, binds: binds)
when ER_NOT_NULL_VIOLATION, ER_DO_NOT_HAVE_DEFAULT
NotNullViolation.new(message, sql: sql, binds: binds)
when ER_LOCK_DEADLOCK
Deadlocked.new(message, sql: sql, binds: binds)
when ER_LOCK_WAIT_TIMEOUT
LockWaitTimeout.new(message, sql: sql, binds: binds)
when ER_QUERY_TIMEOUT, ER_FILSORT_ABORT
StatementTimeout.new(message, sql: sql, binds: binds)
when ER_QUERY_INTERRUPTED
QueryCanceled.new(message, sql: sql, binds: binds)
else
super
end
end
def change_column_for_alter(table_name, column_name, type, **options)
column = column_for(table_name, column_name)
type ||= column.sql_type
unless options.key?(:default)
options[:default] = column.default
end
unless options.key?(:null)
options[:null] = column.null
end
unless options.key?(:comment)
options[:comment] = column.comment
end
unless options.key?(:auto_increment)
options[:auto_increment] = column.auto_increment?
end
td = create_table_definition(table_name)
cd = td.new_column_definition(column.name, type, **options)
schema_creation.accept(ChangeColumnDefinition.new(cd, column.name))
end
def rename_column_for_alter(table_name, column_name, new_column_name)
return rename_column_sql(table_name, column_name, new_column_name) if supports_rename_column?
column = column_for(table_name, column_name)
options = {
default: column.default,
null: column.null,
auto_increment: column.auto_increment?,
comment: column.comment
}
current_type = exec_query("SHOW COLUMNS FROM #{quote_table_name(table_name)} LIKE #{quote(column_name)}", "SCHEMA").first["Type"]
td = create_table_definition(table_name)
cd = td.new_column_definition(new_column_name, current_type, **options)
schema_creation.accept(ChangeColumnDefinition.new(cd, column.name))
end
def add_index_for_alter(table_name, column_name, **options)
index, algorithm, _ = add_index_options(table_name, column_name, **options)
algorithm = ", #{algorithm}" if algorithm
"ADD #{schema_creation.accept(index)}#{algorithm}"
end
def remove_index_for_alter(table_name, column_name = nil, **options)
index_name = index_name_for_remove(table_name, column_name, options)
"DROP INDEX #{quote_column_name(index_name)}"
end
def supports_rename_index?
if mariadb?
database_version >= "10.5.2"
else
database_version >= "5.7.6"
end
end
def supports_rename_column?
if mariadb?
database_version >= "10.5.2"
else
database_version >= "8.0.3"
end
end
def configure_connection
variables = @config.fetch(:variables, {}).stringify_keys
# By default, MySQL 'where id is null' selects the last inserted id; turn this off.
variables["sql_auto_is_null"] = 0
# Increase timeout so the server doesn't disconnect us.
wait_timeout = self.class.type_cast_config_to_integer(@config[:wait_timeout])
wait_timeout = 2147483 unless wait_timeout.is_a?(Integer)
variables["wait_timeout"] = wait_timeout
defaults = [":default", :default].to_set
# Make MySQL reject illegal values rather than truncating or blanking them, see
# https://dev.mysql.com/doc/refman/en/sql-mode.html#sqlmode_strict_all_tables
# If the user has provided another value for sql_mode, don't replace it.
if sql_mode = variables.delete("sql_mode")
sql_mode = quote(sql_mode)
elsif !defaults.include?(strict_mode?)
if strict_mode?
sql_mode = "CONCAT(@@sql_mode, ',STRICT_ALL_TABLES')"
else
sql_mode = "REPLACE(@@sql_mode, 'STRICT_TRANS_TABLES', '')"
sql_mode = "REPLACE(#{sql_mode}, 'STRICT_ALL_TABLES', '')"
sql_mode = "REPLACE(#{sql_mode}, 'TRADITIONAL', '')"
end
sql_mode = "CONCAT(#{sql_mode}, ',NO_AUTO_VALUE_ON_ZERO')"
end
sql_mode_assignment = "@@SESSION.sql_mode = #{sql_mode}, " if sql_mode
# NAMES does not have an equals sign, see
# https://dev.mysql.com/doc/refman/en/set-names.html
# (trailing comma because variable_assignments will always have content)
if @config[:encoding]
encoding = +"NAMES #{@config[:encoding]}"
encoding << " COLLATE #{@config[:collation]}" if @config[:collation]
encoding << ", "
end
# Gather up all of the SET variables...
variable_assignments = variables.filter_map do |k, v|
if defaults.include?(v)
"@@SESSION.#{k} = DEFAULT" # Sets the value to the global or compile default
elsif !v.nil?
"@@SESSION.#{k} = #{quote(v)}"
end
end.join(", ")
# ...and send them all in one query
execute("SET #{encoding} #{sql_mode_assignment} #{variable_assignments}", "SCHEMA")
end
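# An illustrative database.yml sketch feeding the handling above (the values,
# and the max_execution_time variable, are hypothetical):
#
#   production:
#     adapter: mysql2
#     encoding: utf8mb4
#     collation: utf8mb4_general_ci
#     wait_timeout: 28800
#     variables:
#       sql_mode: TRADITIONAL
#       max_execution_time: 2000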
def column_definitions(table_name) # :nodoc:
execute_and_free("SHOW FULL FIELDS FROM #{quote_table_name(table_name)}", "SCHEMA") do |result|
each_hash(result)
end
end
def create_table_info(table_name) # :nodoc:
exec_query("SHOW CREATE TABLE #{quote_table_name(table_name)}", "SCHEMA").first["Create Table"]
end
def arel_visitor
Arel::Visitors::MySQL.new(self)
end
def build_statement_pool
StatementPool.new(self.class.type_cast_config_to_integer(@config[:statement_limit]))
end
def mismatched_foreign_key(message, sql:, binds:)
match = %r/
(?:CREATE|ALTER)\s+TABLE\s*(?:`?\w+`?\.)?`?(?<table>\w+)`?.+?
FOREIGN\s+KEY\s*\(`?(?<foreign_key>\w+)`?\)\s*
REFERENCES\s*(`?(?<target_table>\w+)`?)\s*\(`?(?<primary_key>\w+)`?\)
/xmi.match(sql)
options = {
message: message,
sql: sql,
binds: binds,
}
if match
options[:table] = match[:table]
options[:foreign_key] = match[:foreign_key]
options[:target_table] = match[:target_table]
options[:primary_key] = match[:primary_key]
options[:primary_key_column] = column_for(match[:target_table], match[:primary_key])
end
MismatchedForeignKey.new(**options)
end
def version_string(full_version_string)
full_version_string.match(/^(?:5\.5\.5-)?(\d+\.\d+\.\d+)/)[1]
end
ActiveRecord::Type.register(:immutable_string, adapter: :mysql2) do |_, **args|
Type::ImmutableString.new(true: "1", false: "0", **args)
end
ActiveRecord::Type.register(:string, adapter: :mysql2) do |_, **args|
Type::String.new(true: "1", false: "0", **args)
end
ActiveRecord::Type.register(:unsigned_integer, Type::UnsignedInteger, adapter: :mysql2)
end
end
end
# frozen_string_literal: true
require "rails/generators/named_base"
require "rails/generators/active_model"
require "rails/generators/active_record/migration"
require "active_record"
module ActiveRecord
module Generators # :nodoc:
class Base < Rails::Generators::NamedBase # :nodoc:
include ActiveRecord::Generators::Migration
# Set the current directory as base for the inherited generators.
def self.base_root
__dir__
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
# :stopdoc:
module Type
class AdapterSpecificRegistry # :nodoc:
def initialize
@registrations = []
end
def initialize_copy(other)
@registrations = @registrations.dup
end
def add_modifier(options, klass, **args)
registrations << DecorationRegistration.new(options, klass, **args)
end
def register(type_name, klass = nil, **options, &block)
unless block_given?
block = proc { |_, *args| klass.new(*args) }
block.ruby2_keywords if block.respond_to?(:ruby2_keywords)
end
registrations << Registration.new(type_name, block, **options)
end
def lookup(symbol, *args, **kwargs)
registration = find_registration(symbol, *args, **kwargs)
if registration
registration.call(self, symbol, *args, **kwargs)
else
raise ArgumentError, "Unknown type #{symbol.inspect}"
end
end
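# Illustrative flow through the public API backed by this registry (the :money
# type and MoneyType class are hypothetical):
#
#   ActiveRecord::Type.register(:money, MoneyType, adapter: :postgresql)
#   ActiveRecord::Type.lookup(:money, adapter: :postgresql) # => a MoneyType instance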
private
attr_reader :registrations
def find_registration(symbol, *args, **kwargs)
registrations
.select { |registration| registration.matches?(symbol, *args, **kwargs) }
.max
end
end
class Registration # :nodoc:
def initialize(name, block, adapter: nil, override: nil)
@name = name
@block = block
@adapter = adapter
@override = override
end
def call(_registry, *args, adapter: nil, **kwargs)
block.call(*args, **kwargs)
end
def matches?(type_name, *args, **kwargs)
type_name == name && matches_adapter?(**kwargs)
end
def <=>(other)
if conflicts_with?(other)
raise TypeConflictError.new("Type #{name} was registered for all
adapters, but shadows a native type with
the same name for #{other.adapter}".squish)
end
priority <=> other.priority
end
protected
attr_reader :name, :block, :adapter, :override
def priority
result = 0
if adapter
result |= 1
end
if override
result |= 2
end
result
end
def priority_except_adapter
priority & 0b111111100
end
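# Illustrative priorities produced by the bit flags above:
#
#   neither adapter nor override # => 0
#   adapter only                 # => 1
#   override only                # => 2
#   adapter and override         # => 3
#   (DecorationRegistration ORs in an extra 4)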
private
def matches_adapter?(adapter: nil, **)
(self.adapter.nil? || adapter == self.adapter)
end
def conflicts_with?(other)
same_priority_except_adapter?(other) &&
has_adapter_conflict?(other)
end
def same_priority_except_adapter?(other)
priority_except_adapter == other.priority_except_adapter
end
def has_adapter_conflict?(other)
(override.nil? && other.adapter) ||
(adapter && other.override.nil?)
end
end
class DecorationRegistration < Registration # :nodoc:
def initialize(options, klass, adapter: nil)
@options = options
@klass = klass
@adapter = adapter
end
def call(registry, *args, **kwargs)
subtype = registry.lookup(*args, **kwargs.except(*options.keys))
klass.new(subtype)
end
def matches?(*args, **kwargs)
matches_adapter?(**kwargs) && matches_options?(**kwargs)
end
def priority
super | 4
end
private
attr_reader :options, :klass
def matches_options?(**kwargs)
options.all? do |key, value|
kwargs[key] == value
end
end
end
end
class TypeConflictError < StandardError # :nodoc:
end
# :startdoc:
end
# frozen_string_literal: true
require "openssl"
module ActiveRecord
module Encryption
class Cipher
# An AES-256-GCM cipher.
#
# By default it will use random initialization vectors. For deterministic encryption, it will use a SHA-256 hash of
# the text to encrypt and the secret.
#
# See +Encryptor+
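#
# An illustrative round trip (the secret is hypothetical and must be
# +key_length+ bytes long):
#
#   secret  = SecureRandom.random_bytes(Aes256Gcm.key_length)
#   cipher  = Aes256Gcm.new(secret)
#   message = cipher.encrypt("sensitive value")
#   cipher.decrypt(message) # => "sensitive value"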
class Aes256Gcm
CIPHER_TYPE = "aes-256-gcm"
class << self
def key_length
OpenSSL::Cipher.new(CIPHER_TYPE).key_len
end
def iv_length
OpenSSL::Cipher.new(CIPHER_TYPE).iv_len
end
end
# When an IV is not provided, a random IV is generated for each encryption
# operation (the default and recommended mode).
def initialize(secret, deterministic: false)
@secret = secret
@deterministic = deterministic
end
def encrypt(clear_text)
# This code is extracted from +ActiveSupport::MessageEncryptor+. Not using it directly because we want to control
# the message format and only serialize things once at the +ActiveRecord::Encryption::Message+ level. Also, this
# cipher is prepared to deal with deterministic/non-deterministic encryption modes.
cipher = OpenSSL::Cipher.new(CIPHER_TYPE)
cipher.encrypt
cipher.key = @secret
iv = generate_iv(cipher, clear_text)
cipher.iv = iv
encrypted_data = clear_text.empty? ? clear_text.dup : cipher.update(clear_text)
encrypted_data << cipher.final
ActiveRecord::Encryption::Message.new(payload: encrypted_data).tap do |message|
message.headers.iv = iv
message.headers.auth_tag = cipher.auth_tag
end
end
def decrypt(encrypted_message)
encrypted_data = encrypted_message.payload
iv = encrypted_message.headers.iv
auth_tag = encrypted_message.headers.auth_tag
# Currently the OpenSSL bindings do not raise an error if auth_tag is
# truncated, which would allow an attacker to easily forge it. See
# https://github.com/ruby/openssl/issues/63
raise ActiveRecord::Encryption::Errors::EncryptedContentIntegrity if auth_tag.nil? || auth_tag.bytes.length != 16
cipher = OpenSSL::Cipher.new(CIPHER_TYPE)
cipher.decrypt
cipher.key = @secret
cipher.iv = iv
cipher.auth_tag = auth_tag
cipher.auth_data = ""
decrypted_data = encrypted_data.empty? ? encrypted_data : cipher.update(encrypted_data)
decrypted_data << cipher.final
decrypted_data
rescue OpenSSL::Cipher::CipherError, TypeError, ArgumentError
raise ActiveRecord::Encryption::Errors::Decryption
end
private
def generate_iv(cipher, clear_text)
if @deterministic
generate_deterministic_iv(clear_text)
else
cipher.random_iv
end
end
def generate_deterministic_iv(clear_text)
OpenSSL::HMAC.digest(OpenSSL::Digest::SHA256.new, @secret, clear_text)[0, ActiveRecord::Encryption.cipher.iv_length]
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
# See ActiveRecord::Aggregations::ClassMethods for documentation
module Aggregations
def initialize_dup(*) # :nodoc:
@aggregation_cache = {}
super
end
def reload(*) # :nodoc:
clear_aggregation_cache
super
end
private
def clear_aggregation_cache
@aggregation_cache.clear if persisted?
end
def init_internals
@aggregation_cache = {}
super
end
# Active Record implements aggregation through a macro-like class method called #composed_of
# for representing attributes as value objects. It expresses relationships like "Account [is]
# composed of Money [among other things]" or "Person [is] composed of [an] address". Each call
# to the macro adds a description of how the value objects are created from the attributes of
# the entity object (when the entity is initialized either as a new object or from finding an
# existing object) and how it can be turned back into attributes (when the entity is saved to
# the database).
#
# class Customer < ActiveRecord::Base
# composed_of :balance, class_name: "Money", mapping: %w(balance amount)
# composed_of :address, mapping: [ %w(address_street street), %w(address_city city) ]
# end
#
# The customer class now has the following methods to manipulate the value objects:
# * <tt>Customer#balance, Customer#balance=(money)</tt>
# * <tt>Customer#address, Customer#address=(address)</tt>
#
# These methods will operate with value objects like the ones described below:
#
# class Money
# include Comparable
# attr_reader :amount, :currency
# EXCHANGE_RATES = { "USD_TO_DKK" => 6 }
#
# def initialize(amount, currency = "USD")
# @amount, @currency = amount, currency
# end
#
# def exchange_to(other_currency)
# exchanged_amount = (amount * EXCHANGE_RATES["#{currency}_TO_#{other_currency}"]).floor
# Money.new(exchanged_amount, other_currency)
# end
#
# def ==(other_money)
# amount == other_money.amount && currency == other_money.currency
# end
#
# def <=>(other_money)
# if currency == other_money.currency
# amount <=> other_money.amount
# else
# amount <=> other_money.exchange_to(currency).amount
# end
# end
# end
#
# class Address
# attr_reader :street, :city
# def initialize(street, city)
# @street, @city = street, city
# end
#
# def close_to?(other_address)
# city == other_address.city
# end
#
# def ==(other_address)
# city == other_address.city && street == other_address.street
# end
# end
#
# Now it's possible to access attributes from the database through the value objects instead. If
# you choose to name the composition the same as the attribute's name, it will be the only way to
# access that attribute. That's the case with our +balance+ attribute. You interact with the value
# objects just like you would with any other attribute:
#
# customer.balance = Money.new(20) # sets the Money value object and the attribute
# customer.balance # => Money value object
# customer.balance.exchange_to("DKK") # => Money.new(120, "DKK")
# customer.balance > Money.new(10) # => true
# customer.balance == Money.new(20) # => true
# customer.balance < Money.new(5) # => false
#
# Value objects can also be composed of multiple attributes, such as the case of Address. The order
# of the mappings will determine the order of the parameters.
#
# customer.address_street = "Hyancintvej"
# customer.address_city = "Copenhagen"
# customer.address # => Address.new("Hyancintvej", "Copenhagen")
#
# customer.address = Address.new("May Street", "Chicago")
# customer.address_street # => "May Street"
# customer.address_city # => "Chicago"
#
# == Writing value objects
#
# Value objects are immutable and interchangeable objects that represent a given value, such as
# a Money object representing $5. Two Money objects both representing $5 should be equal (through
# methods such as <tt>==</tt> and <tt><=></tt> from Comparable if ranking makes sense). This is
# unlike entity objects where equality is determined by identity. An entity class such as Customer can
# easily have two different objects that both have an address on Hyancintvej. Entity identity is
# determined by object or relational unique identifiers (such as primary keys). Normal
# ActiveRecord::Base classes are entity objects.
#
# It's also important to treat the value objects as immutable. Don't allow the Money object to have
# its amount changed after creation. Create a new Money object with the new value instead. The
# <tt>Money#exchange_to</tt> method is an example of this. It returns a new value object instead of changing
# its own values. Active Record won't persist value objects that have been changed through means
# other than the writer method.
#
# The immutable requirement is enforced by Active Record by freezing any object assigned as a value
# object. Attempting to change it afterwards will result in a +RuntimeError+.
#
# Read more about value objects on http://c2.com/cgi/wiki?ValueObject and on the dangers of not
# keeping value objects immutable on http://c2.com/cgi/wiki?ValueObjectsShouldBeImmutable
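#
# For illustration, reusing the +Customer+ and +Money+ classes above:
#
#   customer.balance = Money.new(20)
#   customer.balance.instance_variable_set(:@amount, 50) # raises FrozenError, a RuntimeError
#   customer.balance = Money.new(50)                      # build a new value object instead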
#
# == Custom constructors and converters
#
# By default value objects are initialized by calling the <tt>new</tt> constructor of the value
# class passing each of the mapped attributes, in the order specified by the <tt>:mapping</tt>
# option, as arguments. If the value class doesn't support this convention then #composed_of allows
# a custom constructor to be specified.
#
# When a new value is assigned to the value object, the default assumption is that the new value
# is an instance of the value class. Specifying a custom converter allows the new value to be automatically
# converted to an instance of value class if necessary.
#
# For example, the +NetworkResource+ model has +network_address+ and +cidr_range+ attributes that should be
# aggregated using the +NetAddr::CIDR+ value class (https://www.rubydoc.info/gems/netaddr/1.5.0/NetAddr/CIDR).
# The constructor for the value class is called +create+ and it expects a CIDR address string as a parameter.
# New values can be assigned to the value object using either another +NetAddr::CIDR+ object, a string
# or an array. The <tt>:constructor</tt> and <tt>:converter</tt> options can be used to meet
# these requirements:
#
# class NetworkResource < ActiveRecord::Base
# composed_of :cidr,
# class_name: 'NetAddr::CIDR',
# mapping: [ %w(network_address network), %w(cidr_range bits) ],
# allow_nil: true,
# constructor: Proc.new { |network_address, cidr_range| NetAddr::CIDR.create("#{network_address}/#{cidr_range}") },
# converter: Proc.new { |value| NetAddr::CIDR.create(value.is_a?(Array) ? value.join('/') : value) }
# end
#
# # This calls the :constructor
# network_resource = NetworkResource.new(network_address: '192.168.0.1', cidr_range: 24)
#
# # These assignments will both use the :converter
# network_resource.cidr = [ '192.168.2.1', 8 ]
# network_resource.cidr = '192.168.0.1/24'
#
# # This assignment won't use the :converter as the value is already an instance of the value class
# network_resource.cidr = NetAddr::CIDR.create('192.168.2.1/8')
#
# # Saving and then reloading will use the :constructor on reload
# network_resource.save
# network_resource.reload
#
# == Finding records by a value object
#
# Once a #composed_of relationship is specified for a model, records can be loaded from the database
# by specifying an instance of the value object in the conditions hash. The following example
# finds all customers with +address_street+ equal to "May Street" and +address_city+ equal to "Chicago":
#
# Customer.where(address: Address.new("May Street", "Chicago"))
#
module ClassMethods
# Adds reader and writer methods for manipulating a value object:
# <tt>composed_of :address</tt> adds <tt>address</tt> and <tt>address=(new_address)</tt> methods.
#
# Options are:
# * <tt>:class_name</tt> - Specifies the class name of the association. Use it only if that name
# can't be inferred from the part id. So <tt>composed_of :address</tt> will by default be linked
# to the Address class, but if the real class name is +CompanyAddress+, you'll have to specify it
# with this option.
# * <tt>:mapping</tt> - Specifies the mapping of entity attributes to attributes of the value
# object. Each mapping is represented as an array where the first item is the name of the
# entity attribute and the second item is the name of the attribute in the value object. The
# order in which mappings are defined determines the order in which attributes are sent to the
# value class constructor.
# * <tt>:allow_nil</tt> - Specifies that the value object will not be instantiated when all mapped
# attributes are +nil+. Setting the value object to +nil+ has the effect of writing +nil+ to all
# mapped attributes.
# This defaults to +false+.
# * <tt>:constructor</tt> - A symbol specifying the name of the constructor method or a Proc that
# is called to initialize the value object. The constructor is passed all of the mapped attributes,
#   in the order that they are defined in the <tt>:mapping</tt> option, as arguments and uses them
# to instantiate a <tt>:class_name</tt> object.
# The default is <tt>:new</tt>.
# * <tt>:converter</tt> - A symbol specifying the name of a class method of <tt>:class_name</tt>
# or a Proc that is called when a new value is assigned to the value object. The converter is
# passed the single value that is used in the assignment and is only called if the new value is
# not an instance of <tt>:class_name</tt>. If <tt>:allow_nil</tt> is set to true, the converter
# can return +nil+ to skip the assignment.
#
# Option examples:
# composed_of :temperature, mapping: %w(reading celsius)
# composed_of :balance, class_name: "Money", mapping: %w(balance amount)
# composed_of :address, mapping: [ %w(address_street street), %w(address_city city) ]
# composed_of :gps_location
# composed_of :gps_location, allow_nil: true
# composed_of :ip_address,
# class_name: 'IPAddr',
# mapping: %w(ip to_i),
# constructor: Proc.new { |ip| IPAddr.new(ip, Socket::AF_INET) },
# converter: Proc.new { |ip| ip.is_a?(Integer) ? IPAddr.new(ip, Socket::AF_INET) : IPAddr.new(ip.to_s) }
#
def composed_of(part_id, options = {})
options.assert_valid_keys(:class_name, :mapping, :allow_nil, :constructor, :converter)
unless self < Aggregations
include Aggregations
end
name = part_id.id2name
class_name = options[:class_name] || name.camelize
mapping = options[:mapping] || [ name, name ]
mapping = [ mapping ] unless mapping.first.is_a?(Array)
allow_nil = options[:allow_nil] || false
constructor = options[:constructor] || :new
converter = options[:converter]
reader_method(name, class_name, mapping, allow_nil, constructor)
writer_method(name, class_name, mapping, allow_nil, converter)
reflection = ActiveRecord::Reflection.create(:composed_of, part_id, nil, options, self)
Reflection.add_aggregate_reflection self, part_id, reflection
end
private
def reader_method(name, class_name, mapping, allow_nil, constructor)
define_method(name) do
if @aggregation_cache[name].nil? && (!allow_nil || mapping.any? { |key, _| !read_attribute(key).nil? })
attrs = mapping.collect { |key, _| read_attribute(key) }
object = constructor.respond_to?(:call) ?
constructor.call(*attrs) :
class_name.constantize.send(constructor, *attrs)
@aggregation_cache[name] = object
end
@aggregation_cache[name]
end
end
def writer_method(name, class_name, mapping, allow_nil, converter)
define_method("#{name}=") do |part|
klass = class_name.constantize
unless part.is_a?(klass) || converter.nil? || part.nil?
part = converter.respond_to?(:call) ? converter.call(part) : klass.send(converter, part)
end
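# Multiparameter assignment (e.g. from date/time form helpers) arrives as a hash
# with 1-based integer keys; rebuild the value object from the positional values.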
hash_from_multiparameter_assignment = part.is_a?(Hash) &&
part.keys.all?(Integer)
if hash_from_multiparameter_assignment
raise ArgumentError unless part.size == part.each_key.max
part = klass.new(*part.sort.map(&:last))
end
if part.nil? && allow_nil
mapping.each { |key, _| write_attribute(key, nil) }
@aggregation_cache[name] = nil
else
mapping.each { |key, value| write_attribute(key, part.send(value)) }
@aggregation_cache[name] = part.freeze
end
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module AliasPredication
def as(other)
Nodes::As.new self, Nodes::SqlLiteral.new(other)
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/string/conversions"
module ActiveRecord
module Associations
# Keeps track of table aliases for ActiveRecord::Associations::JoinDependency
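# A rough sketch of how it is used internally (table and block names are illustrative):
#
#   tracker = AliasTracker.create(connection, "posts", [])
#   tracker.aliased_table_for(Comment.arel_table) { "comments_posts" }
#   # => the comments table itself the first time; on later conflicting calls the
#   #    table is aliased as "comments_posts", then "comments_posts_2", and so on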
class AliasTracker # :nodoc:
def self.create(connection, initial_table, joins, aliases = nil)
if joins.empty?
aliases ||= Hash.new(0)
elsif aliases
default_proc = aliases.default_proc || proc { 0 }
aliases.default_proc = proc { |h, k|
h[k] = initial_count_for(connection, k, joins) + default_proc.call(h, k)
}
else
aliases = Hash.new { |h, k|
h[k] = initial_count_for(connection, k, joins)
}
end
aliases[initial_table] = 1
new(connection, aliases)
end
def self.initial_count_for(connection, name, table_joins)
quoted_name = nil
counts = table_joins.map do |join|
if join.is_a?(Arel::Nodes::StringJoin)
# quoted_name should be matched case-insensitively, as some database adapters (Oracle) return the quoted name in uppercase
quoted_name ||= connection.quote_table_name(name)
# Table names + table aliases
join.left.scan(
/JOIN(?:\s+\w+)?\s+(?:\S+\s+)?(?:#{quoted_name}|#{name})\sON/i
).size
elsif join.is_a?(Arel::Nodes::Join)
join.left.name == name ? 1 : 0
else
raise ArgumentError, "joins list should be initialized by list of Arel::Nodes::Join"
end
end
counts.sum
end
# table_joins is an array of arel joins which might conflict with the aliases we assign here
def initialize(connection, aliases)
@aliases = aliases
@connection = connection
end
def aliased_table_for(arel_table, table_name = nil)
table_name ||= arel_table.name
if aliases[table_name] == 0
# If it's zero, we can have our table_name
aliases[table_name] = 1
arel_table = arel_table.alias(table_name) if arel_table.name != table_name
else
# Otherwise, we need to use an alias
aliased_name = @connection.table_alias_for(yield)
# Update the count
count = aliases[aliased_name] += 1
aliased_name = "#{truncate(aliased_name)}_#{count}" if count > 1
arel_table = arel_table.alias(aliased_name)
end
arel_table
end
attr_reader :aliases
private
def truncate(name)
name.slice(0, @connection.table_alias_length - 2)
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class And < Arel::Nodes::NodeExpression
attr_reader :children
def initialize(children)
super()
@children = children
end
def left
children.first
end
def right
children[1]
end
def hash
children.hash
end
def eql?(other)
self.class == other.class &&
self.children == other.children
end
alias :== :eql?
end
end
end
# frozen_string_literal: true
require "rails/generators/active_record"
module ActiveRecord
module Generators # :nodoc:
class ApplicationRecordGenerator < ::Rails::Generators::Base # :nodoc:
source_root File.expand_path("templates", __dir__)
# FIXME: Change this file to a symlink once RubyGems 2.5.0 is required.
def create_application_record
template "application_record.rb", application_record_file_name
end
private
def application_record_file_name
@application_record_file_name ||=
if namespaced?
"app/models/#{namespaced_path}/application_record.rb"
else
"app/models/application_record.rb"
end
end
end
end
end
# frozen_string_literal: true
require "arel/errors"
require "arel/crud"
require "arel/factory_methods"
require "arel/expressions"
require "arel/predications"
require "arel/filter_predications"
require "arel/window_predications"
require "arel/math"
require "arel/alias_predication"
require "arel/order_predications"
require "arel/table"
require "arel/attributes/attribute"
require "arel/visitors"
require "arel/collectors/sql_string"
require "arel/tree_manager"
require "arel/insert_manager"
require "arel/select_manager"
require "arel/update_manager"
require "arel/delete_manager"
require "arel/nodes"
module Arel
VERSION = "10.0.0"
# Wrap a known-safe SQL string for passing to query methods, e.g.
#
# Post.order(Arel.sql("REPLACE(title, 'misc', 'zzzz') asc")).pluck(:id)
#
# Great caution should be taken to avoid SQL injection vulnerabilities.
# This method should not be used with unsafe values such as request
# parameters or model attributes.
def self.sql(raw_sql)
Arel::Nodes::SqlLiteral.new raw_sql
end
def self.star # :nodoc:
sql "*"
end
def self.arel_node?(value) # :nodoc:
value.is_a?(Arel::Nodes::Node) || value.is_a?(Arel::Attribute) || value.is_a?(Arel::Nodes::SqlLiteral)
end
def self.fetch_attribute(value, &block) # :nodoc:
unless String === value
value.fetch_attribute(&block)
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Array < Type::Value # :nodoc:
include ActiveModel::Type::Helpers::Mutable
Data = Struct.new(:encoder, :values) # :nodoc:
attr_reader :subtype, :delimiter
delegate :type, :user_input_in_time_zone, :limit, :precision, :scale, to: :subtype
def initialize(subtype, delimiter = ",")
@subtype = subtype
@delimiter = delimiter
@pg_encoder = PG::TextEncoder::Array.new name: "#{type}[]", delimiter: delimiter
@pg_decoder = PG::TextDecoder::Array.new name: "#{type}[]", delimiter: delimiter
end
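# Illustrative round trip, assuming an integer subtype (values are examples):
#
#   type = OID::Array.new(ActiveModel::Type::Integer.new)
#   type.deserialize("{1,2,3}") # => [1, 2, 3]
#   type.cast(["1", "2"])       # => [1, 2]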
def deserialize(value)
case value
when ::String
type_cast_array(@pg_decoder.decode(value), :deserialize)
when Data
type_cast_array(value.values, :deserialize)
else
super
end
end
def cast(value)
if value.is_a?(::String)
value = begin
@pg_decoder.decode(value)
rescue TypeError
# The PG 2.0 gem raises TypeError for a malformed array string; rescue it and
# treat the value as [] to keep the behavior consistent across gem versions.
[]
end
end
type_cast_array(value, :cast)
end
def serialize(value)
if value.is_a?(::Array)
casted_values = type_cast_array(value, :serialize)
Data.new(@pg_encoder, casted_values)
else
super
end
end
def ==(other)
other.is_a?(Array) &&
subtype == other.subtype &&
delimiter == other.delimiter
end
def type_cast_for_schema(value)
return super unless value.is_a?(::Array)
"[" + value.map { |v| subtype.type_cast_for_schema(v) }.join(", ") + "]"
end
def map(value, &block)
value.map(&block)
end
def changed_in_place?(raw_old_value, new_value)
deserialize(raw_old_value) != new_value
end
def force_equality?(value)
value.is_a?(::Array)
end
private
def type_cast_array(value, method)
if value.is_a?(::Array)
value.map { |item| type_cast_array(item, method) }
else
@subtype.public_send(method, value)
end
end
end
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/array/extract"
module ActiveRecord
class PredicateBuilder
class ArrayHandler # :nodoc:
def initialize(predicate_builder)
@predicate_builder = predicate_builder
end
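# Turns an array value given to +where+ into a single predicate. A rough sketch of
# the resulting SQL shape (not the exact output):
#
#   Post.where(id: [1, nil, 3..5])
#   # => WHERE (posts.id = 1 OR posts.id IS NULL) OR posts.id BETWEEN 3 AND 5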
def call(attribute, value)
return attribute.in([]) if value.empty?
values = value.map { |x| x.is_a?(Base) ? x.id : x }
nils = values.extract!(&:nil?)
ranges = values.extract! { |v| v.is_a?(Range) }
values_predicate =
case values.length
when 0 then NullPredicate
when 1 then predicate_builder.build(attribute, values.first)
else Arel::Nodes::HomogeneousIn.new(values, attribute, :in)
end
unless nils.empty?
values_predicate = values_predicate.or(attribute.eq(nil))
end
if ranges.empty?
values_predicate
else
array_predicates = ranges.map! { |range| predicate_builder.build(attribute, range) }
array_predicates.inject(values_predicate, &:or)
end
end
private
attr_reader :predicate_builder
module NullPredicate # :nodoc:
def self.or(other)
other
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Ascending < Ordering
def reverse
Descending.new(expr)
end
def direction
:asc
end
def ascending?
true
end
def descending?
false
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Validations
class AssociatedValidator < ActiveModel::EachValidator # :nodoc:
def validate_each(record, attribute, value)
if Array(value).reject { |r| valid_object?(r) }.any?
record.errors.add(attribute, :invalid, **options.merge(value: value))
end
end
private
def valid_object?(record)
(record.respond_to?(:marked_for_destruction?) && record.marked_for_destruction?) || record.valid?
end
end
module ClassMethods
# Validates whether the associated object or objects are all valid.
# Works with any kind of association.
#
# class Book < ActiveRecord::Base
# has_many :pages
# belongs_to :library
#
# validates_associated :pages, :library
# end
#
# WARNING: This validation must not be used on both ends of an association.
# Doing so will lead to a circular dependency and cause infinite recursion.
#
# NOTE: This validation will not fail if the association hasn't been
# assigned. If you want to ensure that the association is both present and
# guaranteed to be valid, you also need to use
# {validates_presence_of}[rdoc-ref:Validations::ClassMethods#validates_presence_of].
#
# Configuration options:
#
# * <tt>:message</tt> - A custom error message (default is: "is invalid").
# * <tt>:on</tt> - Specifies the contexts where this validation is active.
#   Runs in all validation contexts by default (+nil+). You can pass a symbol
# or an array of symbols. (e.g. <tt>on: :create</tt> or
# <tt>on: :custom_validation_context</tt> or
# <tt>on: [:create, :custom_validation_context]</tt>)
# * <tt>:if</tt> - Specifies a method, proc, or string to call to determine
# if the validation should occur (e.g. <tt>if: :allow_validation</tt>,
# or <tt>if: Proc.new { |user| user.signup_step > 2 }</tt>). The method,
# proc or string should return or evaluate to a +true+ or +false+ value.
# * <tt>:unless</tt> - Specifies a method, proc, or string to call to
# determine if the validation should not occur (e.g. <tt>unless: :skip_validation</tt>,
# or <tt>unless: Proc.new { |user| user.signup_step <= 2 }</tt>). The
# method, proc, or string should return or evaluate to a +true+ or +false+
# value.
def validates_associated(*attr_names)
validates_with AssociatedValidator, _merge_attributes(attr_names)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
class Preloader
class Association # :nodoc:
class LoaderQuery
attr_reader :scope, :association_key_name
def initialize(scope, association_key_name)
@scope = scope
@association_key_name = association_key_name
end
def eql?(other)
association_key_name == other.association_key_name &&
scope.table_name == other.scope.table_name &&
scope.values_for_queries == other.scope.values_for_queries
end
def hash
[association_key_name, scope.table_name, scope.values_for_queries].hash
end
def records_for(loaders)
LoaderRecords.new(loaders, self).records
end
def load_records_in_batch(loaders)
raw_records = records_for(loaders)
loaders.each do |loader|
loader.load_records(raw_records)
loader.run
end
end
def load_records_for_keys(keys, &block)
scope.where(association_key_name => keys).load(&block)
end
end
class LoaderRecords
def initialize(loaders, loader_query)
@loader_query = loader_query
@loaders = loaders
@keys_to_load = Set.new
@already_loaded_records_by_key = {}
populate_keys_to_load_and_already_loaded_records
end
def records
load_records + already_loaded_records
end
private
attr_reader :loader_query, :loaders, :keys_to_load, :already_loaded_records_by_key
def populate_keys_to_load_and_already_loaded_records
loaders.each do |loader|
loader.owners_by_key.each do |key, owners|
if loaded_owner = owners.find { |owner| loader.loaded?(owner) }
already_loaded_records_by_key[key] = loader.target_for(loaded_owner)
else
keys_to_load << key
end
end
end
@keys_to_load.subtract(already_loaded_records_by_key.keys)
end
def load_records
loader_query.load_records_for_keys(keys_to_load) do |record|
loaders.each { |l| l.set_inverse(record) }
end
end
def already_loaded_records
already_loaded_records_by_key.values.flatten
end
end
attr_reader :klass
def initialize(klass, owners, reflection, preload_scope, reflection_scope, associate_by_default)
@klass = klass
@owners = owners.uniq(&:__id__)
@reflection = reflection
@preload_scope = preload_scope
@reflection_scope = reflection_scope
@associate = associate_by_default || !preload_scope || preload_scope.empty_scope?
@model = owners.first && owners.first.class
@run = false
end
def table_name
@klass.table_name
end
def future_classes
if run?
[]
else
[@klass]
end
end
def runnable_loaders
[self]
end
def run?
@run
end
def run
return self if run?
@run = true
records = records_by_owner
owners.each do |owner|
associate_records_to_owner(owner, records[owner] || [])
end if @associate
self
end
def records_by_owner
load_records unless defined?(@records_by_owner)
@records_by_owner
end
def preloaded_records
load_records unless defined?(@preloaded_records)
@preloaded_records
end
# The name of the key on the associated records
def association_key_name
reflection.join_primary_key(klass)
end
def loader_query
LoaderQuery.new(scope, association_key_name)
end
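# Groups the owner records by their (type-converted) join key value, so that the
# preloaded records can later be handed back to the owners that reference them.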
def owners_by_key
@owners_by_key ||= owners.each_with_object({}) do |owner, result|
key = convert_key(owner[owner_key_name])
(result[key] ||= []) << owner if key
end
end
def loaded?(owner)
owner.association(reflection.name).loaded?
end
def target_for(owner)
Array.wrap(owner.association(reflection.name).target)
end
def scope
@scope ||= build_scope
end
def set_inverse(record)
if owners = owners_by_key[convert_key(record[association_key_name])]
# Only the first owner is processed, because it's the record
# that gets modified here, not the owner
association = owners.first.association(reflection.name)
association.set_inverse_instance(record)
end
end
def load_records(raw_records = nil)
# owners can be duplicated when a relation has a collection association join
# #compare_by_identity makes such owners different hash keys
@records_by_owner = {}.compare_by_identity
raw_records ||= loader_query.records_for([self])
@preloaded_records = raw_records.select do |record|
assignments = false
owners_by_key[convert_key(record[association_key_name])]&.each do |owner|
entries = (@records_by_owner[owner] ||= [])
if reflection.collection? || entries.empty?
entries << record
assignments = true
end
end
assignments
end
end
def associate_records_from_unscoped(unscoped_records)
return if unscoped_records.nil? || unscoped_records.empty?
return if !reflection_scope.empty_scope?
return if preload_scope && !preload_scope.empty_scope?
return if reflection.collection?
unscoped_records.select { |r| r[association_key_name].present? }.each do |record|
owners = owners_by_key[convert_key(record[association_key_name])]
owners&.each_with_index do |owner, i|
association = owner.association(reflection.name)
association.target = record
if i == 0 # Set inverse on first owner
association.set_inverse_instance(record)
end
end
end
end
private
attr_reader :owners, :reflection, :preload_scope, :model
# The name of the key on the model which declares the association
def owner_key_name
reflection.join_foreign_key
end
def associate_records_to_owner(owner, records)
return if loaded?(owner)
association = owner.association(reflection.name)
if reflection.collection?
association.target = records
else
association.target = records.first
end
end
def key_conversion_required?
unless defined?(@key_conversion_required)
@key_conversion_required = (association_key_type != owner_key_type)
end
@key_conversion_required
end
def convert_key(key)
if key_conversion_required?
key.to_s
else
key
end
end
def association_key_type
@klass.type_for_attribute(association_key_name).type
end
def owner_key_type
@model.type_for_attribute(owner_key_name).type
end
def reflection_scope
@reflection_scope ||= reflection.join_scopes(klass.arel_table, klass.predicate_builder, klass).inject(klass.unscoped, &:merge!)
end
def build_scope
scope = klass.scope_for_association
if reflection.type && !reflection.through_reflection?
scope.where!(reflection.type => model.polymorphic_name)
end
scope.merge!(reflection_scope) unless reflection_scope.empty_scope?
if preload_scope && !preload_scope.empty_scope?
scope.merge!(preload_scope)
end
cascade_strict_loading(scope)
end
def cascade_strict_loading(scope)
preload_scope&.strict_loading_value ? scope.strict_loading : scope
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class PredicateBuilder
class AssociationQueryValue # :nodoc:
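# Resolves the primary key values of an Active Record object, array, or relation
# passed to +where+ for an association. Roughly (class and column names are illustrative):
#
#   Post.where(author: author)
#   # #queries => [{ "author_id" => author.id }]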
def initialize(associated_table, value)
@associated_table = associated_table
@value = value
end
def queries
[ associated_table.join_foreign_key => ids ]
end
private
attr_reader :associated_table, :value
def ids
case value
when Relation
value.select_values.empty? ? value.select(primary_key) : value
when Array
value.map { |v| convert_to_id(v) }
else
convert_to_id(value)
end
end
def primary_key
associated_table.join_primary_key
end
def convert_to_id(value)
if value.respond_to?(primary_key)
value.public_send(primary_key)
else
value
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class AssociationRelation < Relation # :nodoc:
def initialize(klass, association, **)
super(klass)
@association = association
end
def proxy_association
@association
end
def ==(other)
other == records
end
%w(insert insert_all insert! insert_all! upsert upsert_all).each do |method|
class_eval <<~RUBY
def #{method}(attributes, **kwargs)
if @association.reflection.through_reflection?
raise ArgumentError, "Bulk insert or upsert is currently not supported for has_many through association"
end
scoping { klass.#{method}(attributes, **kwargs) }
end
RUBY
end
private
def _new(attributes, &block)
@association.build(attributes, &block)
end
def _create(attributes, &block)
@association.create(attributes, &block)
end
def _create!(attributes, &block)
@association.create!(attributes, &block)
end
def exec_queries
super do |record|
@association.set_inverse_instance_from_queries(record)
yield record if block_given?
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
class AssociationScope # :nodoc:
def self.scope(association)
INSTANCE.scope(association)
end
def self.create(&block)
block ||= lambda { |val| val }
new(block)
end
def initialize(value_transformation)
@value_transformation = value_transformation
end
INSTANCE = create
def scope(association)
klass = association.klass
reflection = association.reflection
scope = klass.unscoped
owner = association.owner
chain = get_chain(reflection, association, scope.alias_tracker)
scope.extending! reflection.extensions
scope = add_constraints(scope, owner, chain)
scope.limit!(1) unless reflection.collection?
scope
end
def self.get_bind_values(owner, chain)
binds = []
last_reflection = chain.last
binds << last_reflection.join_id_for(owner)
if last_reflection.type
binds << owner.class.polymorphic_name
end
chain.each_cons(2).each do |reflection, next_reflection|
if reflection.type
binds << next_reflection.klass.polymorphic_name
end
end
binds
end
private
attr_reader :value_transformation
def join(table, constraint)
Arel::Nodes::LeadingJoin.new(table, Arel::Nodes::On.new(constraint))
end
def last_chain_scope(scope, reflection, owner)
primary_key = reflection.join_primary_key
foreign_key = reflection.join_foreign_key
table = reflection.aliased_table
value = transform_value(owner[foreign_key])
scope = apply_scope(scope, table, primary_key, value)
if reflection.type
polymorphic_type = transform_value(owner.class.polymorphic_name)
scope = apply_scope(scope, table, reflection.type, polymorphic_type)
end
scope
end
def transform_value(value)
value_transformation.call(value)
end
def next_chain_scope(scope, reflection, next_reflection)
primary_key = reflection.join_primary_key
foreign_key = reflection.join_foreign_key
table = reflection.aliased_table
foreign_table = next_reflection.aliased_table
constraint = table[primary_key].eq(foreign_table[foreign_key])
if reflection.type
value = transform_value(next_reflection.klass.polymorphic_name)
scope = apply_scope(scope, table, reflection.type, value)
end
scope.joins!(join(foreign_table, constraint))
end
class ReflectionProxy < SimpleDelegator # :nodoc:
attr_reader :aliased_table
def initialize(reflection, aliased_table)
super(reflection)
@aliased_table = aliased_table
end
def all_includes; nil; end
end
def get_chain(reflection, association, tracker)
name = reflection.name
chain = [Reflection::RuntimeReflection.new(reflection, association)]
reflection.chain.drop(1).each do |refl|
aliased_table = tracker.aliased_table_for(refl.klass.arel_table) do
refl.alias_candidate(name)
end
chain << ReflectionProxy.new(refl, aliased_table)
end
chain
end
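# Builds the association scope by walking the reflection chain: the owner's key
# constrains the last link, consecutive links are connected with joins, and any
# scope blocks declared on the reflections are merged in afterwards.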
def add_constraints(scope, owner, chain)
scope = last_chain_scope(scope, chain.last, owner)
chain.each_cons(2) do |reflection, next_reflection|
scope = next_chain_scope(scope, reflection, next_reflection)
end
chain_head = chain.first
chain.reverse_each do |reflection|
reflection.constraints.each do |scope_chain_item|
item = eval_scope(reflection, scope_chain_item, owner)
if scope_chain_item == chain_head.scope
scope.merge! item.except(:where, :includes, :unscope, :order)
elsif !item.references_values.empty?
scope.merge! item.only(:joins, :left_outer_joins)
associations = item.eager_load_values | item.includes_values
unless associations.empty?
scope.joins! item.construct_join_dependency(associations, Arel::Nodes::OuterJoin)
end
end
reflection.all_includes do
scope.includes_values |= item.includes_values
end
scope.unscope!(*item.unscope_values)
scope.where_clause += item.where_clause
scope.order_values = item.order_values | scope.order_values
end
end
scope
end
def apply_scope(scope, table, key, value)
if scope.table == table
scope.where!(key => value)
else
scope.where!(table.name => { key => value })
end
end
def eval_scope(reflection, scope, owner)
relation = reflection.build_scope(reflection.aliased_table)
relation.instance_exec(owner, &scope) || relation
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class AssociationNotFoundError < ConfigurationError # :nodoc:
attr_reader :record, :association_name
def initialize(record = nil, association_name = nil)
@record = record
@association_name = association_name
if record && association_name
super("Association named '#{association_name}' was not found on #{record.class.name}; perhaps you misspelled it?")
else
super("Association was not found.")
end
end
if defined?(DidYouMean::Correctable) && defined?(DidYouMean::SpellChecker)
include DidYouMean::Correctable
def corrections
if record && association_name
@corrections ||= begin
maybe_these = record.class.reflections.keys
DidYouMean::SpellChecker.new(dictionary: maybe_these).correct(association_name)
end
else
[]
end
end
end
end
class InverseOfAssociationNotFoundError < ActiveRecordError # :nodoc:
attr_reader :reflection, :associated_class
def initialize(reflection = nil, associated_class = nil)
if reflection
@reflection = reflection
@associated_class = associated_class.nil? ? reflection.klass : associated_class
super("Could not find the inverse association for #{reflection.name} (#{reflection.options[:inverse_of].inspect} in #{associated_class.nil? ? reflection.class_name : associated_class.name})")
else
super("Could not find the inverse association.")
end
end
if defined?(DidYouMean::Correctable) && defined?(DidYouMean::SpellChecker)
include DidYouMean::Correctable
def corrections
if reflection && associated_class
@corrections ||= begin
maybe_these = associated_class.reflections.keys
DidYouMean::SpellChecker.new(dictionary: maybe_these).correct(reflection.options[:inverse_of].to_s)
end
else
[]
end
end
end
end
class InverseOfAssociationRecursiveError < ActiveRecordError # :nodoc:
attr_reader :reflection
def initialize(reflection = nil)
if reflection
@reflection = reflection
super("Inverse association #{reflection.name} (#{reflection.options[:inverse_of].inspect} in #{reflection.class_name}) is recursive.")
else
super("Inverse association is recursive.")
end
end
end
class HasManyThroughAssociationNotFoundError < ActiveRecordError # :nodoc:
attr_reader :owner_class, :reflection
def initialize(owner_class = nil, reflection = nil)
if owner_class && reflection
@owner_class = owner_class
@reflection = reflection
super("Could not find the association #{reflection.options[:through].inspect} in model #{owner_class.name}")
else
super("Could not find the association.")
end
end
if defined?(DidYouMean::Correctable) && defined?(DidYouMean::SpellChecker)
include DidYouMean::Correctable
def corrections
if owner_class && reflection
@corrections ||= begin
maybe_these = owner_class.reflections.keys
maybe_these -= [reflection.name.to_s] # remove failing reflection
DidYouMean::SpellChecker.new(dictionary: maybe_these).correct(reflection.options[:through].to_s)
end
else
[]
end
end
end
end
class HasManyThroughAssociationPolymorphicSourceError < ActiveRecordError # :nodoc:
def initialize(owner_class_name = nil, reflection = nil, source_reflection = nil)
if owner_class_name && reflection && source_reflection
super("Cannot have a has_many :through association '#{owner_class_name}##{reflection.name}' on the polymorphic object '#{source_reflection.class_name}##{source_reflection.name}' without 'source_type'. Try adding 'source_type: \"#{reflection.name.to_s.classify}\"' to 'has_many :through' definition.")
else
super("Cannot have a has_many :through association.")
end
end
end
class HasManyThroughAssociationPolymorphicThroughError < ActiveRecordError # :nodoc:
def initialize(owner_class_name = nil, reflection = nil)
if owner_class_name && reflection
super("Cannot have a has_many :through association '#{owner_class_name}##{reflection.name}' which goes through the polymorphic association '#{owner_class_name}##{reflection.through_reflection.name}'.")
else
super("Cannot have a has_many :through association.")
end
end
end
class HasManyThroughAssociationPointlessSourceTypeError < ActiveRecordError # :nodoc:
def initialize(owner_class_name = nil, reflection = nil, source_reflection = nil)
if owner_class_name && reflection && source_reflection
super("Cannot have a has_many :through association '#{owner_class_name}##{reflection.name}' with a :source_type option if the '#{reflection.through_reflection.class_name}##{source_reflection.name}' is not polymorphic. Try removing :source_type on your association.")
else
super("Cannot have a has_many :through association.")
end
end
end
class HasOneThroughCantAssociateThroughCollection < ActiveRecordError # :nodoc:
def initialize(owner_class_name = nil, reflection = nil, through_reflection = nil)
if owner_class_name && reflection && through_reflection
super("Cannot have a has_one :through association '#{owner_class_name}##{reflection.name}' where the :through association '#{owner_class_name}##{through_reflection.name}' is a collection. Specify a has_one or belongs_to association in the :through option instead.")
else
super("Cannot have a has_one :through association.")
end
end
end
class HasOneAssociationPolymorphicThroughError < ActiveRecordError # :nodoc:
def initialize(owner_class_name = nil, reflection = nil)
if owner_class_name && reflection
super("Cannot have a has_one :through association '#{owner_class_name}##{reflection.name}' which goes through the polymorphic association '#{owner_class_name}##{reflection.through_reflection.name}'.")
else
super("Cannot have a has_one :through association.")
end
end
end
class HasManyThroughSourceAssociationNotFoundError < ActiveRecordError # :nodoc:
def initialize(reflection = nil)
if reflection
through_reflection = reflection.through_reflection
source_reflection_names = reflection.source_reflection_names
source_associations = reflection.through_reflection.klass._reflections.keys
super("Could not find the source association(s) #{source_reflection_names.collect(&:inspect).to_sentence(two_words_connector: ' or ', last_word_connector: ', or ')} in model #{through_reflection.klass}. Try 'has_many #{reflection.name.inspect}, :through => #{through_reflection.name.inspect}, :source => <name>'. Is it one of #{source_associations.to_sentence(two_words_connector: ' or ', last_word_connector: ', or ')}?")
else
super("Could not find the source association(s).")
end
end
end
class HasManyThroughOrderError < ActiveRecordError # :nodoc:
def initialize(owner_class_name = nil, reflection = nil, through_reflection = nil)
if owner_class_name && reflection && through_reflection
super("Cannot have a has_many :through association '#{owner_class_name}##{reflection.name}' which goes through '#{owner_class_name}##{through_reflection.name}' before the through association is defined.")
else
super("Cannot have a has_many :through association before the through association is defined.")
end
end
end
class ThroughCantAssociateThroughHasOneOrManyReflection < ActiveRecordError # :nodoc:
def initialize(owner = nil, reflection = nil)
if owner && reflection
super("Cannot modify association '#{owner.class.name}##{reflection.name}' because the source reflection class '#{reflection.source_reflection.class_name}' is associated to '#{reflection.through_reflection.class_name}' via :#{reflection.source_reflection.macro}.")
else
super("Cannot modify association.")
end
end
end
class AmbiguousSourceReflectionForThroughAssociation < ActiveRecordError # :nodoc:
def initialize(klass, macro, association_name, options, possible_sources)
example_options = options.dup
example_options[:source] = possible_sources.first
super("Ambiguous source reflection for through association. Please " \
"specify a :source directive on your declaration like:\n" \
"\n" \
" class #{klass} < ActiveRecord::Base\n" \
" #{macro} :#{association_name}, #{example_options}\n" \
" end"
)
end
end
class HasManyThroughCantAssociateThroughHasOneOrManyReflection < ThroughCantAssociateThroughHasOneOrManyReflection # :nodoc:
end
class HasOneThroughCantAssociateThroughHasOneOrManyReflection < ThroughCantAssociateThroughHasOneOrManyReflection # :nodoc:
end
class ThroughNestedAssociationsAreReadonly < ActiveRecordError # :nodoc:
def initialize(owner = nil, reflection = nil)
if owner && reflection
super("Cannot modify association '#{owner.class.name}##{reflection.name}' because it goes through more than one other association.")
else
super("Through nested associations are read-only.")
end
end
end
class HasManyThroughNestedAssociationsAreReadonly < ThroughNestedAssociationsAreReadonly # :nodoc:
end
class HasOneThroughNestedAssociationsAreReadonly < ThroughNestedAssociationsAreReadonly # :nodoc:
end
# This error is raised when trying to eager load a polymorphic association using a JOIN.
# Eager loading polymorphic associations is only possible with
# {ActiveRecord::Relation#preload}[rdoc-ref:QueryMethods#preload].
class EagerLoadPolymorphicError < ActiveRecordError
def initialize(reflection = nil)
if reflection
super("Cannot eagerly load the polymorphic association #{reflection.name.inspect}")
else
super("Eager load polymorphic error.")
end
end
end
# This error is raised when trying to destroy a parent instance in N:1 or 1:1 associations
# (has_many, has_one) when there is at least 1 child associated instance.
# e.g., if <tt>@project.tasks.size > 0</tt>, DeleteRestrictionError will be raised when trying to destroy @project
class DeleteRestrictionError < ActiveRecordError # :nodoc:
def initialize(name = nil)
if name
super("Cannot delete record because of dependent #{name}")
else
super("Delete restriction error.")
end
end
end
# See ActiveRecord::Associations::ClassMethods for documentation.
module Associations # :nodoc:
extend ActiveSupport::Autoload
extend ActiveSupport::Concern
# These classes will be loaded when associations are created.
# So there is no need to eager load them.
autoload :Association
autoload :SingularAssociation
autoload :CollectionAssociation
autoload :ForeignAssociation
autoload :CollectionProxy
autoload :ThroughAssociation
module Builder # :nodoc:
autoload :Association, "active_record/associations/builder/association"
autoload :SingularAssociation, "active_record/associations/builder/singular_association"
autoload :CollectionAssociation, "active_record/associations/builder/collection_association"
autoload :BelongsTo, "active_record/associations/builder/belongs_to"
autoload :HasOne, "active_record/associations/builder/has_one"
autoload :HasMany, "active_record/associations/builder/has_many"
autoload :HasAndBelongsToMany, "active_record/associations/builder/has_and_belongs_to_many"
end
eager_autoload do
autoload :BelongsToAssociation
autoload :BelongsToPolymorphicAssociation
autoload :HasManyAssociation
autoload :HasManyThroughAssociation
autoload :HasOneAssociation
autoload :HasOneThroughAssociation
autoload :Preloader
autoload :JoinDependency
autoload :AssociationScope
autoload :DisableJoinsAssociationScope
autoload :AliasTracker
end
def self.eager_load!
super
Preloader.eager_load!
JoinDependency.eager_load!
end
# Returns the association instance for the given name, instantiating it if it doesn't already exist
def association(name) # :nodoc:
association = association_instance_get(name)
if association.nil?
unless reflection = self.class._reflect_on_association(name)
raise AssociationNotFoundError.new(self, name)
end
association = reflection.association_class.new(self, reflection)
association_instance_set(name, association)
end
association
end
def association_cached?(name) # :nodoc:
@association_cache.key?(name)
end
def initialize_dup(*) # :nodoc:
@association_cache = {}
super
end
private
def init_internals
@association_cache = {}
super
end
# Returns the specified association instance if it exists, +nil+ otherwise.
def association_instance_get(name)
@association_cache[name]
end
# Set the specified association instance.
def association_instance_set(name, association)
@association_cache[name] = association
end
# \Associations are a set of macro-like class methods for tying objects together through
# foreign keys. They express relationships like "Project has one Project Manager"
# or "Project belongs to a Portfolio". Each macro adds a number of methods to the
# class which are specialized according to the collection or association symbol and the
# options hash. It works much the same way as Ruby's own <tt>attr*</tt>
# methods.
#
# class Project < ActiveRecord::Base
# belongs_to :portfolio
# has_one :project_manager
# has_many :milestones
# has_and_belongs_to_many :categories
# end
#
# The project class now has the following methods (and more) to ease the traversal and
# manipulation of its relationships:
# * <tt>Project#portfolio</tt>, <tt>Project#portfolio=(portfolio)</tt>, <tt>Project#reload_portfolio</tt>
# * <tt>Project#project_manager</tt>, <tt>Project#project_manager=(project_manager)</tt>, <tt>Project#reload_project_manager</tt>
# * <tt>Project#milestones.empty?</tt>, <tt>Project#milestones.size</tt>, <tt>Project#milestones</tt>, <tt>Project#milestones<<(milestone)</tt>,
# <tt>Project#milestones.delete(milestone)</tt>, <tt>Project#milestones.destroy(milestone)</tt>, <tt>Project#milestones.find(milestone_id)</tt>,
# <tt>Project#milestones.build</tt>, <tt>Project#milestones.create</tt>
# * <tt>Project#categories.empty?</tt>, <tt>Project#categories.size</tt>, <tt>Project#categories</tt>, <tt>Project#categories<<(category1)</tt>,
# <tt>Project#categories.delete(category1)</tt>, <tt>Project#categories.destroy(category1)</tt>
#
# === A word of warning
#
# Don't create associations that have the same name as {instance methods}[rdoc-ref:ActiveRecord::Core] of
# <tt>ActiveRecord::Base</tt>. Since the association adds a method with that name to
#   its model, using an association with the same name as one provided by <tt>ActiveRecord::Base</tt>
#   will override the method inherited through <tt>ActiveRecord::Base</tt> and will break things.
#   For instance, +attributes+ and +connection+ would be bad choices for association names, because
#   those names already exist in the list of <tt>ActiveRecord::Base</tt> instance methods.
#
# == Auto-generated methods
# See also Instance Public methods below for more details.
#
# === Singular associations (one-to-one)
# | | belongs_to |
# generated methods | belongs_to | :polymorphic | has_one
# ----------------------------------+------------+--------------+---------
# other | X | X | X
# other=(other) | X | X | X
# build_other(attributes={}) | X | | X
# create_other(attributes={}) | X | | X
# create_other!(attributes={}) | X | | X
# reload_other | X | X | X
# other_changed? | X | X |
# other_previously_changed? | X | X |
#
# === Collection associations (one-to-many / many-to-many)
# | | | has_many
# generated methods | habtm | has_many | :through
# ----------------------------------+-------+----------+----------
# others | X | X | X
# others=(other,other,...) | X | X | X
# other_ids | X | X | X
# other_ids=(id,id,...) | X | X | X
# others<< | X | X | X
# others.push | X | X | X
# others.concat | X | X | X
# others.build(attributes={}) | X | X | X
# others.create(attributes={}) | X | X | X
# others.create!(attributes={}) | X | X | X
# others.size | X | X | X
# others.length | X | X | X
# others.count | X | X | X
# others.sum(*args) | X | X | X
# others.empty? | X | X | X
# others.clear | X | X | X
# others.delete(other,other,...) | X | X | X
# others.delete_all | X | X | X
# others.destroy(other,other,...) | X | X | X
# others.destroy_all | X | X | X
# others.find(*args) | X | X | X
# others.exists? | X | X | X
# others.distinct | X | X | X
# others.reset | X | X | X
# others.reload | X | X | X
#
# === Overriding generated methods
#
# Association methods are generated in a module included into the model
# class, making overrides easy. The original generated method can thus be
# called with +super+:
#
# class Car < ActiveRecord::Base
# belongs_to :owner
# belongs_to :old_owner
#
# def owner=(new_owner)
# self.old_owner = self.owner
# super
# end
# end
#
# The association methods module is included immediately after the
# generated attributes methods module, meaning an association will
# override the methods for an attribute with the same name.
#
# == Cardinality and associations
#
# Active Record associations can be used to describe one-to-one, one-to-many, and many-to-many
# relationships between models. Each model uses an association to describe its role in
# the relation. The #belongs_to association is always used in the model that has
# the foreign key.
#
# === One-to-one
#
# Use #has_one in the base, and #belongs_to in the associated model.
#
# class Employee < ActiveRecord::Base
# has_one :office
# end
# class Office < ActiveRecord::Base
# belongs_to :employee # foreign key - employee_id
# end
#
# === One-to-many
#
# Use #has_many in the base, and #belongs_to in the associated model.
#
# class Manager < ActiveRecord::Base
# has_many :employees
# end
# class Employee < ActiveRecord::Base
# belongs_to :manager # foreign key - manager_id
# end
#
# === Many-to-many
#
# There are two ways to build a many-to-many relationship.
#
# The first way uses a #has_many association with the <tt>:through</tt> option and a join model, so
# there are two stages of associations.
#
# class Assignment < ActiveRecord::Base
# belongs_to :programmer # foreign key - programmer_id
# belongs_to :project # foreign key - project_id
# end
# class Programmer < ActiveRecord::Base
# has_many :assignments
# has_many :projects, through: :assignments
# end
# class Project < ActiveRecord::Base
# has_many :assignments
# has_many :programmers, through: :assignments
# end
#
# For the second way, use #has_and_belongs_to_many in both models. This requires a join table
# that has no corresponding model or primary key.
#
# class Programmer < ActiveRecord::Base
# has_and_belongs_to_many :projects # foreign keys in the join table
# end
# class Project < ActiveRecord::Base
# has_and_belongs_to_many :programmers # foreign keys in the join table
# end
#
# Choosing which way to build a many-to-many relationship is not always simple.
# If you need to work with the relationship model as its own entity,
# use #has_many <tt>:through</tt>. Use #has_and_belongs_to_many when working with legacy schemas or when
# you never work directly with the relationship itself.
#
# == Is it a #belongs_to or #has_one association?
#
# Both express a 1-1 relationship. The difference is mostly where to place the foreign
# key, which goes on the table for the class declaring the #belongs_to relationship.
#
# class User < ActiveRecord::Base
# # I reference an account.
# belongs_to :account
# end
#
# class Account < ActiveRecord::Base
# # One user references me.
# has_one :user
# end
#
# The tables for these classes could look something like:
#
# CREATE TABLE users (
# id bigint NOT NULL auto_increment,
# account_id bigint default NULL,
# name varchar default NULL,
# PRIMARY KEY (id)
# )
#
# CREATE TABLE accounts (
# id bigint NOT NULL auto_increment,
# name varchar default NULL,
# PRIMARY KEY (id)
# )
#
# == Unsaved objects and associations
#
# You can manipulate objects and associations before they are saved to the database, but
# there is some special behavior you should be aware of, mostly involving the saving of
# associated objects.
#
# You can set the <tt>:autosave</tt> option on a #has_one, #belongs_to,
# #has_many, or #has_and_belongs_to_many association. Setting it
# to +true+ will _always_ save the members, whereas setting it to +false+ will
# _never_ save the members. More details about <tt>:autosave</tt> option is available at
# AutosaveAssociation.
#
# === One-to-one associations
#
# * Assigning an object to a #has_one association automatically saves that object and
# the object being replaced (if there is one), in order to update their foreign
# keys - except if the parent object is unsaved (<tt>new_record? == true</tt>).
# * If either of these saves fail (due to one of the objects being invalid), an
# ActiveRecord::RecordNotSaved exception is raised and the assignment is
# cancelled.
# * If you wish to assign an object to a #has_one association without saving it,
# use the <tt>#build_association</tt> method (documented below). The object being
# replaced will still be saved to update its foreign key.
# * Assigning an object to a #belongs_to association does not save the object, since
# the foreign key field belongs on the parent. It does not save the parent either.
#
# === Collections
#
# * Adding an object to a collection (#has_many or #has_and_belongs_to_many) automatically
# saves that object, except if the parent object (the owner of the collection) is not yet
# stored in the database.
# * If saving any of the objects being added to a collection (via <tt>push</tt> or similar)
# fails, then <tt>push</tt> returns +false+.
# * If saving fails while replacing the collection (via <tt>association=</tt>), an
# ActiveRecord::RecordNotSaved exception is raised and the assignment is
# cancelled.
# * You can add an object to a collection without automatically saving it by using the
# <tt>collection.build</tt> method (documented below).
# * All unsaved (<tt>new_record? == true</tt>) members of the collection are automatically
#   saved when the parent is saved, as sketched below.
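#
# A brief sketch (the class and attribute names are illustrative; see the callback
# examples further down for a similar +Firm+ / clients setup):
#
#   firm = Firm.new(name: "37signals")
#   firm.clients.build(name: "Jamis")  # not yet saved
#   firm.save                          # saves the firm and the unsaved client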
#
# == Customizing the query
#
# \Associations are built from <tt>Relation</tt> objects, and you can use the Relation syntax
# to customize them. For example, to add a condition:
#
# class Blog < ActiveRecord::Base
# has_many :published_posts, -> { where(published: true) }, class_name: 'Post'
# end
#
# Inside the <tt>-> { ... }</tt> block you can use all of the usual Relation methods.
#
# === Accessing the owner object
#
# Sometimes it is useful to have access to the owner object when building the query. The owner
# is passed as a parameter to the block. For example, the following association would find all
# events that occur on the user's birthday:
#
# class User < ActiveRecord::Base
# has_many :birthday_events, ->(user) { where(starts_on: user.birthday) }, class_name: 'Event'
# end
#
# Note: Joining, eager loading, and preloading of these associations is not possible.
# These operations happen before instance creation and the scope will be called with a +nil+ argument.
#
# == Association callbacks
#
# Similar to the normal callbacks that hook into the life cycle of an Active Record object,
# you can also define callbacks that get triggered when you add an object to or remove an
# object from an association collection.
#
# class Firm < ActiveRecord::Base
# has_many :clients,
# dependent: :destroy,
# after_add: :congratulate_client,
# after_remove: :log_after_remove
#
# def congratulate_client(record)
# # ...
# end
#
# def log_after_remove(record)
# # ...
# end
#   end
#
# It's possible to stack callbacks by passing them as an array. Example:
#
# class Firm < ActiveRecord::Base
# has_many :clients,
# dependent: :destroy,
# after_add: [:congratulate_client, -> (firm, record) { firm.log << "after_adding#{record.id}" }],
# after_remove: :log_after_remove
# end
#
# Possible callbacks are: +before_add+, +after_add+, +before_remove+, and +after_remove+.
#
# If any of the +before_add+ callbacks throw an exception, the object will not be
# added to the collection.
#
# Similarly, if any of the +before_remove+ callbacks throw an exception, the object
# will not be removed from the collection.
#
# Note: To trigger remove callbacks, you must use +destroy+ / +destroy_all+ methods. For example:
#
# * <tt>firm.clients.destroy(client)</tt>
# * <tt>firm.clients.destroy(*clients)</tt>
# * <tt>firm.clients.destroy_all</tt>
#
# +delete+ / +delete_all+ methods like the following do *not* trigger remove callbacks:
#
# * <tt>firm.clients.delete(client)</tt>
# * <tt>firm.clients.delete(*clients)</tt>
# * <tt>firm.clients.delete_all</tt>
#
# == Association extensions
#
# The proxy objects that control the access to associations can be extended through anonymous
# modules. This is especially beneficial for adding new finders, creators, and other
# factory-type methods that are only used as part of this association.
#
# class Account < ActiveRecord::Base
# has_many :people do
# def find_or_create_by_name(name)
# first_name, last_name = name.split(" ", 2)
# find_or_create_by(first_name: first_name, last_name: last_name)
# end
# end
# end
#
# person = Account.first.people.find_or_create_by_name("David Heinemeier Hansson")
# person.first_name # => "David"
# person.last_name # => "Heinemeier Hansson"
#
# If you need to share the same extensions between many associations, you can use a named
# extension module.
#
# module FindOrCreateByNameExtension
# def find_or_create_by_name(name)
# first_name, last_name = name.split(" ", 2)
# find_or_create_by(first_name: first_name, last_name: last_name)
# end
# end
#
# class Account < ActiveRecord::Base
# has_many :people, -> { extending FindOrCreateByNameExtension }
# end
#
# class Company < ActiveRecord::Base
# has_many :people, -> { extending FindOrCreateByNameExtension }
# end
#
# Some extensions can only be made to work with knowledge of the association's internals.
# Extensions can access relevant state using the following methods (where +items+ is the
# name of the association):
#
# * <tt>record.association(:items).owner</tt> - Returns the object the association is part of.
# * <tt>record.association(:items).reflection</tt> - Returns the reflection object that describes the association.
# * <tt>record.association(:items).target</tt> - Returns the associated object for #belongs_to and #has_one, or
# the collection of associated objects for #has_many and #has_and_belongs_to_many.
#
# However, inside the actual extension code, you will not have access to the <tt>record</tt> as
# above. In this case, you can access <tt>proxy_association</tt>. For example,
# <tt>record.association(:items)</tt> and <tt>record.items.proxy_association</tt> will return
# the same object, allowing you to make calls like <tt>proxy_association.owner</tt> inside
# association extensions.
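#
# For example, a sketch of an extension that reaches the owner through +proxy_association+
# (the +recent_for_owner+ method and +created_at+ column are assumptions for illustration):
#
# has_many :items do
# def recent_for_owner
# where("created_at > ?", proxy_association.owner.created_at)
# end
# end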
#
# == Association Join Models
#
# Has Many associations can be configured with the <tt>:through</tt> option to use an
# explicit join model to retrieve the data. This operates similarly to a
# #has_and_belongs_to_many association. The advantage is that you're able to add validations,
# callbacks, and extra attributes on the join model. Consider the following schema:
#
# class Author < ActiveRecord::Base
# has_many :authorships
# has_many :books, through: :authorships
# end
#
# class Authorship < ActiveRecord::Base
# belongs_to :author
# belongs_to :book
# end
#
# @author = Author.first
# @author.authorships.collect { |a| a.book } # selects all books that the author's authorships belong to
# @author.books # selects all books by using the Authorship join model
#
# You can also go through a #has_many association on the join model:
#
# class Firm < ActiveRecord::Base
# has_many :clients
# has_many :invoices, through: :clients
# end
#
# class Client < ActiveRecord::Base
# belongs_to :firm
# has_many :invoices
# end
#
# class Invoice < ActiveRecord::Base
# belongs_to :client
# end
#
# @firm = Firm.first
# @firm.clients.flat_map { |c| c.invoices } # select all invoices for all clients of the firm
# @firm.invoices # selects all invoices by going through the Client join model
#
# Similarly you can go through a #has_one association on the join model:
#
# class Group < ActiveRecord::Base
# has_many :users
# has_many :avatars, through: :users
# end
#
# class User < ActiveRecord::Base
# belongs_to :group
# has_one :avatar
# end
#
# class Avatar < ActiveRecord::Base
# belongs_to :user
# end
#
# @group = Group.first
# @group.users.collect { |u| u.avatar }.compact # select all avatars for all users in the group
# @group.avatars # selects all avatars by going through the User join model.
#
# An important caveat with going through #has_one or #has_many associations on the
# join model is that these associations are *read-only*. For example, the following
# would not work following the previous example:
#
# @group.avatars << Avatar.new # this would work if User belonged_to Avatar rather than the other way around
# @group.avatars.delete(@group.avatars.last) # so would this
#
# == Setting Inverses
#
# If you are using a #belongs_to on the join model, it is a good idea to set the
# <tt>:inverse_of</tt> option on the #belongs_to, which will mean that the following example
# works correctly (where <tt>tags</tt> is a #has_many <tt>:through</tt> association):
#
# @post = Post.first
# @tag = @post.tags.build name: "ruby"
# @tag.save
#
# The last line ought to save the through record (a <tt>Tagging</tt>). This will only work if the
# <tt>:inverse_of</tt> is set:
#
# class Tagging < ActiveRecord::Base
# belongs_to :post
# belongs_to :tag, inverse_of: :taggings
# end
#
# If you do not set the <tt>:inverse_of</tt> option, the association will
# do its best to match itself up with the correct inverse. Automatic
# inverse detection only works on #has_many, #has_one, and
# #belongs_to associations.
#
# <tt>:foreign_key</tt> and <tt>:through</tt> options on the associations
# will also prevent the association's inverse from being found automatically,
# as will custom scopes in some cases. See further details in the
# {Active Record Associations guide}[https://guides.rubyonrails.org/association_basics.html#bi-directional-associations].
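#
# For example (hypothetical +writer+ naming), pairing a custom <tt>:foreign_key</tt> with an
# explicit <tt>:inverse_of</tt> on both sides keeps the association bi-directional:
#
# class Post < ActiveRecord::Base
# belongs_to :writer, class_name: "Author", foreign_key: "writer_id", inverse_of: :posts
# end
#
# class Author < ActiveRecord::Base
# has_many :posts, foreign_key: "writer_id", inverse_of: :writer
# end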
#
# The automatic guessing of the inverse association uses a heuristic based
# on the name of the class, so it may not work for all associations,
# especially the ones with non-standard names.
#
# You can turn off the automatic detection of inverse associations by setting
# the <tt>:inverse_of</tt> option to <tt>false</tt> like so:
#
# class Tagging < ActiveRecord::Base
# belongs_to :tag, inverse_of: false
# end
#
# == Nested \Associations
#
# You can actually specify *any* association with the <tt>:through</tt> option, including an
# association which has a <tt>:through</tt> option itself. For example:
#
# class Author < ActiveRecord::Base
# has_many :posts
# has_many :comments, through: :posts
# has_many :commenters, through: :comments
# end
#
# class Post < ActiveRecord::Base
# has_many :comments
# end
#
# class Comment < ActiveRecord::Base
# belongs_to :commenter
# end
#
# @author = Author.first
# @author.commenters # => People who commented on posts written by the author
#
# An equivalent way of setting up this association would be:
#
# class Author < ActiveRecord::Base
# has_many :posts
# has_many :commenters, through: :posts
# end
#
# class Post < ActiveRecord::Base
# has_many :comments
# has_many :commenters, through: :comments
# end
#
# class Comment < ActiveRecord::Base
# belongs_to :commenter
# end
#
# When using a nested association, you will not be able to modify the association because there
# is not enough information to know what modification to make. For example, if you tried to
# add a <tt>Commenter</tt> in the example above, there would be no way to tell how to set up the
# intermediate <tt>Post</tt> and <tt>Comment</tt> objects.
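#
# Continuing the example above, reading works but writing does not:
#
# @author.commenters.first # => fine, records are fetched through posts and comments
# @author.commenters << Commenter.new # raises, because the nested :through association is read-only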
#
# == Polymorphic \Associations
#
# Polymorphic associations on models are not restricted in what types of models they
# can be associated with. Rather, they specify an interface that a #has_many association
# must adhere to.
#
# class Asset < ActiveRecord::Base
# belongs_to :attachable, polymorphic: true
# end
#
# class Post < ActiveRecord::Base
# has_many :assets, as: :attachable # The :as option specifies the polymorphic interface to use.
# end
#
# @asset.attachable = @post
#
# This works by using a type column in addition to a foreign key to specify the associated
# record. In the Asset example, you'd need an +attachable_id+ integer column and an
# +attachable_type+ string column.
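#
# A minimal migration sketch for those columns (<tt>t.references</tt> with <tt>polymorphic: true</tt>
# creates both +attachable_id+ and +attachable_type+):
#
# class CreateAssets < ActiveRecord::Migration[7.1]
# def change
# create_table :assets do |t|
# t.references :attachable, polymorphic: true
# end
# end
# end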
#
# Using polymorphic associations in combination with single table inheritance (STI) is
# a little tricky. In order for the associations to work as expected, ensure that you
# store the base model for the STI models in the type column of the polymorphic
# association. To continue with the asset example above, suppose there are guest posts
# and member posts that use the posts table for STI. In this case, there must be a +type+
# column in the posts table.
#
# Note: The <tt>attachable_type=</tt> method is called when assigning an +attachable+.
# The +class_name+ of the +attachable+ is passed as a String.
#
# class Asset < ActiveRecord::Base
# belongs_to :attachable, polymorphic: true
#
# def attachable_type=(class_name)
# super(class_name.constantize.base_class.to_s)
# end
# end
#
# class Post < ActiveRecord::Base
# # because we store "Post" in attachable_type, dependent: :destroy will now work
# has_many :assets, as: :attachable, dependent: :destroy
# end
#
# class GuestPost < Post
# end
#
# class MemberPost < Post
# end
#
# == Caching
#
# All of the methods are built on a simple caching principle that will keep the result
# of the last query around unless specifically instructed not to. The cache is even
# shared across methods to make it even cheaper to use the macro-added methods without
# worrying too much about performance at the first go.
#
# project.milestones # fetches milestones from the database
# project.milestones.size # uses the milestone cache
# project.milestones.empty? # uses the milestone cache
# project.milestones.reload.size # fetches milestones from the database
# project.milestones # uses the milestone cache
#
# == Eager loading of associations
#
# Eager loading is a way to find objects of a certain class and a number of named associations.
# It is one of the easiest ways to prevent the dreaded N+1 problem in which fetching 100
# posts that each need to display their author triggers 101 database queries. Through the
# use of eager loading, the number of queries will be reduced from 101 to 2.
#
# class Post < ActiveRecord::Base
# belongs_to :author
# has_many :comments
# end
#
# Consider the following loop using the class above:
#
# Post.all.each do |post|
# puts "Post: " + post.title
# puts "Written by: " + post.author.name
# puts "Last comment on: " + post.comments.first.created_on
# end
#
# To iterate over these one hundred posts, we'll generate 201 database queries. Let's
# first just optimize it for retrieving the author:
#
# Post.includes(:author).each do |post|
#
# This references the #belongs_to association declared above by its <tt>:author</tt>
# symbol. After loading the posts, Active Record will collect the +author_id+ from each one and load
# all of the referenced authors with one query. Doing so will cut down the number of queries
# from 201 to 102.
#
# We can improve upon the situation further by referencing both associations in the finder with:
#
# Post.includes(:author, :comments).each do |post|
#
# This will load all comments with a single query. This reduces the total number of queries
# to 3. In general, the number of queries will be 1 plus the number of associations
# named (except if some of the associations are polymorphic #belongs_to - see below).
#
# To include a deep hierarchy of associations, use a hash:
#
# Post.includes(:author, { comments: { author: :gravatar } }).each do |post|
#
# The above code will load all the comments and all of their associated
# authors and gravatars. You can mix and match any combination of symbols,
# arrays, and hashes to retrieve the associations you want to load.
#
# All of this power shouldn't fool you into thinking that you can pull out huge amounts
# of data with no performance penalty just because you've reduced the number of queries.
# The database still needs to send all the data to Active Record and it still needs to
# be processed. So it's no catch-all for performance problems, but it's a great way to
# cut down on the number of queries in a situation like the one described above.
#
# Since only one table is loaded at a time, conditions or orders cannot reference tables
# other than the main one. If this is the case, Active Record falls back to the previously
# used <tt>LEFT OUTER JOIN</tt> based strategy. For example:
#
# Post.includes([:author, :comments]).where(['comments.approved = ?', true])
#
# This will result in a single SQL query with joins along the lines of:
# <tt>LEFT OUTER JOIN comments ON comments.post_id = posts.id</tt> and
# <tt>LEFT OUTER JOIN authors ON authors.id = posts.author_id</tt>. Note that using conditions
# like this can have unintended consequences.
# In the above example, posts with no approved comments are not returned at all because
# the conditions apply to the SQL statement as a whole and not just to the association.
#
# You must disambiguate column references for this fallback to happen, for example
# <tt>order: "author.name DESC"</tt> will work but <tt>order: "name DESC"</tt> will not.
#
# If you want to load all posts (including posts with no approved comments), then write
# your own <tt>LEFT OUTER JOIN</tt> query using <tt>ON</tt>:
#
# Post.joins("LEFT OUTER JOIN comments ON comments.post_id = posts.id AND comments.approved = '1'")
#
# In this case, it is usually more natural to include an association which has conditions defined on it:
#
# class Post < ActiveRecord::Base
# has_many :approved_comments, -> { where(approved: true) }, class_name: 'Comment'
# end
#
# Post.includes(:approved_comments)
#
# This will load posts and eager load the +approved_comments+ association, which contains
# only those comments that have been approved.
#
# If you eager load an association with a specified <tt>:limit</tt> option, it will be ignored,
# returning all the associated objects:
#
# class Picture < ActiveRecord::Base
# has_many :most_recent_comments, -> { order('id DESC').limit(10) }, class_name: 'Comment'
# end
#
# Picture.includes(:most_recent_comments).first.most_recent_comments # => returns all associated comments.
#
# Eager loading is supported with polymorphic associations.
#
# class Address < ActiveRecord::Base
# belongs_to :addressable, polymorphic: true
# end
#
# A call that tries to eager load the addressable model:
#
# Address.includes(:addressable)
#
# This will execute one query to load the addresses and load the addressables with one
# query per addressable type.
# For example, if all the addressables are either of class Person or Company, then a total
# of 3 queries will be executed. The list of addressable types to load is determined from
# the addresses already loaded. This is not supported if Active Record has to fall back
# to the <tt>LEFT OUTER JOIN</tt>-based strategy described above, and will raise ActiveRecord::EagerLoadPolymorphicError.
# The reason is that the parent model's type is a column value so its corresponding table
# name cannot be put in the +FROM+/+JOIN+ clauses of that query.
#
# == Table Aliasing
#
# Active Record uses table aliasing in the case that a table is referenced multiple times
# in a join. If a table is referenced only once, the standard table name is used. The
# second time, the table is aliased as <tt>#{reflection_name}_#{parent_table_name}</tt>.
# Indexes are appended for any more successive uses of the table name.
#
# Post.joins(:comments)
# # => SELECT ... FROM posts INNER JOIN comments ON ...
# Post.joins(:special_comments) # STI
# # => SELECT ... FROM posts INNER JOIN comments ON ... AND comments.type = 'SpecialComment'
# Post.joins(:comments, :special_comments) # special_comments is the reflection name, posts is the parent table name
# # => SELECT ... FROM posts INNER JOIN comments ON ... INNER JOIN comments special_comments_posts
#
# Acts as tree example:
#
# TreeMixin.joins(:children)
# # => SELECT ... FROM mixins INNER JOIN mixins childrens_mixins ...
# TreeMixin.joins(children: :parent)
# # => SELECT ... FROM mixins INNER JOIN mixins childrens_mixins ...
# INNER JOIN parents_mixins ...
# TreeMixin.joins(children: {parent: :children})
# # => SELECT ... FROM mixins INNER JOIN mixins childrens_mixins ...
# INNER JOIN parents_mixins ...
# INNER JOIN mixins childrens_mixins_2
#
# Has and Belongs to Many join tables use the same idea, but add a <tt>_join</tt> suffix:
#
# Post.joins(:categories)
# # => SELECT ... FROM posts INNER JOIN categories_posts ... INNER JOIN categories ...
# Post.joins(categories: :posts)
# # => SELECT ... FROM posts INNER JOIN categories_posts ... INNER JOIN categories ...
# INNER JOIN categories_posts posts_categories_join INNER JOIN posts posts_categories
# Post.joins(categories: {posts: :categories})
# # => SELECT ... FROM posts INNER JOIN categories_posts ... INNER JOIN categories ...
# INNER JOIN categories_posts posts_categories_join INNER JOIN posts posts_categories
# INNER JOIN categories_posts categories_posts_join INNER JOIN categories categories_posts_2
#
# If you wish to specify your own custom joins using the ActiveRecord::QueryMethods#joins method, those table
# names will take precedence over the eager associations:
#
# Post.joins(:comments).joins("inner join comments ...")
# # => SELECT ... FROM posts INNER JOIN comments comments_posts ON ... INNER JOIN comments ...
# Post.joins(:comments, :special_comments).joins("inner join comments ...")
# # => SELECT ... FROM posts INNER JOIN comments comments_posts ON ...
# INNER JOIN comments special_comments_posts ...
# INNER JOIN comments ...
#
# Table aliases are automatically truncated to the maximum table identifier length allowed
# by the specific database.
#
# == Modules
#
# By default, associations will look for objects within the current module scope. Consider:
#
# module MyApplication
# module Business
# class Firm < ActiveRecord::Base
# has_many :clients
# end
#
# class Client < ActiveRecord::Base; end
# end
# end
#
# When <tt>Firm#clients</tt> is called, it will in turn resolve to
# <tt>MyApplication::Business::Client</tt> and run a query similar to <tt>Client.where(firm_id: firm.id)</tt>.
# If you want to associate with a class in another module scope, this can be done by
# specifying the complete class name.
#
# module MyApplication
# module Business
# class Firm < ActiveRecord::Base; end
# end
#
# module Billing
# class Account < ActiveRecord::Base
# belongs_to :firm, class_name: "MyApplication::Business::Firm"
# end
# end
# end
#
# == Bi-directional associations
#
# When you specify an association, there is usually an association on the associated model
# that specifies the same relationship in reverse. For example, with the following models:
#
# class Dungeon < ActiveRecord::Base
# has_many :traps
# has_one :evil_wizard
# end
#
# class Trap < ActiveRecord::Base
# belongs_to :dungeon
# end
#
# class EvilWizard < ActiveRecord::Base
# belongs_to :dungeon
# end
#
# The +traps+ association on +Dungeon+ and the +dungeon+ association on +Trap+ are
# the inverse of each other, and the inverse of the +dungeon+ association on +EvilWizard+
# is the +evil_wizard+ association on +Dungeon+ (and vice-versa). By default,
# Active Record can guess the inverse of the association based on the name
# of the class. The result is the following:
#
# d = Dungeon.first
# t = d.traps.first
# d.object_id == t.dungeon.object_id # => true
#
# The +Dungeon+ instances +d+ and <tt>t.dungeon</tt> in the above example refer to
# the same in-memory instance since the association matches the name of the class.
# The result would be the same if we added +:inverse_of+ to our model definitions:
#
# class Dungeon < ActiveRecord::Base
# has_many :traps, inverse_of: :dungeon
# has_one :evil_wizard, inverse_of: :dungeon
# end
#
# class Trap < ActiveRecord::Base
# belongs_to :dungeon, inverse_of: :traps
# end
#
# class EvilWizard < ActiveRecord::Base
# belongs_to :dungeon, inverse_of: :evil_wizard
# end
#
# For more information, see the documentation for the +:inverse_of+ option and the
# {Active Record Associations guide}[https://guides.rubyonrails.org/association_basics.html#bi-directional-associations].
#
# == Deleting from associations
#
# === Dependent associations
#
# #has_many, #has_one, and #belongs_to associations support the <tt>:dependent</tt> option.
# This allows you to specify that associated records should be deleted when the owner is
# deleted.
#
# For example:
#
# class Author < ActiveRecord::Base
# has_many :posts, dependent: :destroy
# end
# Author.find(1).destroy # => Will destroy all of the author's posts, too
#
# The <tt>:dependent</tt> option can have different values which specify how the deletion
# is done. For more information, see the documentation for this option on the different
# specific association types. When no option is given, the behavior is to do nothing
# with the associated records when destroying a record.
#
# Note that <tt>:dependent</tt> is implemented using Rails' callback
# system, which works by processing callbacks in order. Therefore, other
# callbacks declared either before or after the <tt>:dependent</tt> option
# can affect what it does.
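#
# For example (hypothetical callback names), a +before_destroy+ declared before the association
# still sees the posts, while one declared after runs once they have already been destroyed:
#
# class Author < ActiveRecord::Base
# before_destroy :log_post_count # posts still exist at this point
# has_many :posts, dependent: :destroy
# before_destroy :verify_posts_removed # posts have been destroyed by now
# end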
#
# Note that the <tt>:dependent</tt> option is ignored for #has_one <tt>:through</tt> associations.
#
# === Delete or destroy?
#
# #has_many and #has_and_belongs_to_many associations have the methods <tt>destroy</tt>,
# <tt>delete</tt>, <tt>destroy_all</tt> and <tt>delete_all</tt>.
#
# For #has_and_belongs_to_many, <tt>delete</tt> and <tt>destroy</tt> are the same: they
# cause the records in the join table to be removed.
#
# For #has_many, <tt>destroy</tt> and <tt>destroy_all</tt> will always call the <tt>destroy</tt> method of the
# record(s) being removed so that callbacks are run. However <tt>delete</tt> and <tt>delete_all</tt> will either
# do the deletion according to the strategy specified by the <tt>:dependent</tt> option, or
# if no <tt>:dependent</tt> option is given, then it will follow the default strategy.
# The default strategy is to do nothing (leave the foreign keys with the parent ids set), except for
# #has_many <tt>:through</tt>, where the default strategy is <tt>delete_all</tt> (delete
# the join records, without running their callbacks).
#
# There is also a <tt>clear</tt> method which is the same as <tt>delete_all</tt>, except that
# it returns the association rather than the records which have been deleted.
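#
# A short sketch of the difference, reusing the <tt>Firm</tt>/<tt>Client</tt> example:
#
# firm.clients.destroy(client) # runs the client's destroy callbacks
# firm.clients.delete(client) # follows the :dependent strategy; by default just nullifies client.firm_id
# firm.clients.clear # like delete_all, but returns the association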
#
# === What gets deleted?
#
# There is a potential pitfall here: #has_and_belongs_to_many and #has_many <tt>:through</tt>
# associations have records in join tables, as well as the associated records. So when we
# call one of these deletion methods, what exactly should be deleted?
#
# The answer is that it is assumed that deletion on an association is about removing the
# <i>link</i> between the owner and the associated object(s), rather than necessarily the
# associated objects themselves. So with #has_and_belongs_to_many and #has_many
# <tt>:through</tt>, the join records will be deleted, but the associated records won't.
#
# This makes sense if you think about it: if you were to call <tt>post.tags.delete(Tag.find_by(name: 'food'))</tt>
# you would want the 'food' tag to be unlinked from the post, rather than for the tag itself
# to be removed from the database.
#
# However, there are examples where this strategy doesn't make sense. For example, suppose
# a person has many projects, and each project has many tasks. If we deleted one of a person's
# tasks, we would probably not want the project to be deleted. In this scenario, the delete method
# won't actually work: it can only be used if the association on the join model is a
# #belongs_to. In other situations you are expected to perform operations directly on
# either the associated records or the <tt>:through</tt> association.
#
# With a regular #has_many there is no distinction between the "associated records"
# and the "link", so there is only one choice for what gets deleted.
#
# With #has_and_belongs_to_many and #has_many <tt>:through</tt>, if you want to delete the
# associated records themselves, you can always do something along the lines of
# <tt>person.tasks.each(&:destroy)</tt>.
#
# == Type safety with ActiveRecord::AssociationTypeMismatch
#
# If you attempt to assign an object to an association that doesn't match the inferred
# or specified <tt>:class_name</tt>, you'll get an ActiveRecord::AssociationTypeMismatch.
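#
# For example, assuming a +Post+ that <tt>belongs_to :author</tt> with the +Author+ class:
#
# post.author = Author.first # fine
# post.author = Comment.new # raises ActiveRecord::AssociationTypeMismatch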
#
# == Options
#
# All of the association macros can be specialized through options. This makes it possible
# to handle cases more complex than the simple, guessable defaults.
module ClassMethods
# Specifies a one-to-many association. The following methods for retrieval and query of
# collections of associated objects will be added:
#
# +collection+ is a placeholder for the symbol passed as the +name+ argument, so
# <tt>has_many :clients</tt> would add among others <tt>clients.empty?</tt>.
#
# [collection]
# Returns a Relation of all the associated objects.
# An empty Relation is returned if none are found.
# [collection<<(object, ...)]
# Adds one or more objects to the collection by setting their foreign keys to the collection's primary key.
# Note that this operation instantly fires update SQL without waiting for the save or update call on the
# parent object, unless the parent object is a new record.
# This will also run validations and callbacks of associated object(s).
# [collection.delete(object, ...)]
# Removes one or more objects from the collection by setting their foreign keys to +NULL+.
# Objects will in addition be destroyed if they're associated with <tt>dependent: :destroy</tt>,
# and deleted if they're associated with <tt>dependent: :delete_all</tt>.
#
# If the <tt>:through</tt> option is used, then the join records are deleted (rather than
# nullified) by default, but you can specify <tt>dependent: :destroy</tt> or
# <tt>dependent: :nullify</tt> to override this.
# [collection.destroy(object, ...)]
# Removes one or more objects from the collection by running <tt>destroy</tt> on
# each record, regardless of any dependent option, ensuring callbacks are run.
#
# If the <tt>:through</tt> option is used, then the join records are destroyed
# instead, not the objects themselves.
# [collection=objects]
# Replaces the collection's content by deleting and adding objects as appropriate. If the <tt>:through</tt>
# option is used, callbacks in the join models are triggered except destroy callbacks, since deletion is
# direct by default. You can specify <tt>dependent: :destroy</tt> or
# <tt>dependent: :nullify</tt> to override this.
# [collection_singular_ids]
# Returns an array of the associated objects' ids.
# [collection_singular_ids=ids]
# Replace the collection with the objects identified by the primary keys in +ids+. This
# method loads the models and calls <tt>collection=</tt>. See above.
# [collection.clear]
# Removes every object from the collection. This destroys the associated objects if they
# are associated with <tt>dependent: :destroy</tt>, deletes them directly from the
# database if <tt>dependent: :delete_all</tt>, otherwise sets their foreign keys to +NULL+.
# If the <tt>:through</tt> option is used, no destroy callbacks are invoked on the join models.
# Join models are directly deleted.
# [collection.empty?]
# Returns +true+ if there are no associated objects.
# [collection.size]
# Returns the number of associated objects.
# [collection.find(...)]
# Finds an associated object according to the same rules as ActiveRecord::FinderMethods#find.
# [collection.exists?(...)]
# Checks whether an associated object with the given conditions exists.
# Uses the same rules as ActiveRecord::FinderMethods#exists?.
# [collection.build(attributes = {}, ...)]
# Returns one or more new objects of the collection type that have been instantiated
# with +attributes+ and linked to this object through a foreign key, but have not yet
# been saved.
# [collection.create(attributes = {})]
# Returns a new object of the collection type that has been instantiated
# with +attributes+, linked to this object through a foreign key, and that has already
# been saved (if it passed the validation). *Note*: This only works if the base model
# already exists in the DB, not if it is a new (unsaved) record!
# [collection.create!(attributes = {})]
# Does the same as <tt>collection.create</tt>, but raises ActiveRecord::RecordInvalid
# if the record is invalid.
# [collection.reload]
# Returns a Relation of all of the associated objects, forcing a database read.
# An empty Relation is returned if none are found.
#
# === Example
#
# A <tt>Firm</tt> class declares <tt>has_many :clients</tt>, which will add:
# * <tt>Firm#clients</tt> (similar to <tt>Client.where(firm_id: id)</tt>)
# * <tt>Firm#clients<<</tt>
# * <tt>Firm#clients.delete</tt>
# * <tt>Firm#clients.destroy</tt>
# * <tt>Firm#clients=</tt>
# * <tt>Firm#client_ids</tt>
# * <tt>Firm#client_ids=</tt>
# * <tt>Firm#clients.clear</tt>
# * <tt>Firm#clients.empty?</tt> (similar to <tt>firm.clients.size == 0</tt>)
# * <tt>Firm#clients.size</tt> (similar to <tt>Client.where(firm_id: id).count</tt>)
# * <tt>Firm#clients.find</tt> (similar to <tt>Client.where(firm_id: id).find(id)</tt>)
# * <tt>Firm#clients.exists?(name: 'ACME')</tt> (similar to <tt>Client.exists?(name: 'ACME', firm_id: firm.id)</tt>)
# * <tt>Firm#clients.build</tt> (similar to <tt>Client.new(firm_id: id)</tt>)
# * <tt>Firm#clients.create</tt> (similar to <tt>c = Client.new(firm_id: id); c.save; c</tt>)
# * <tt>Firm#clients.create!</tt> (similar to <tt>c = Client.new(firm_id: id); c.save!</tt>)
# * <tt>Firm#clients.reload</tt>
# The declaration can also include an +options+ hash to specialize the behavior of the association.
#
# === Scopes
#
# You can pass a second argument +scope+ as a callable (i.e. proc or
# lambda) to retrieve a specific set of records or customize the generated
# query when you access the associated collection.
#
# Scope examples:
# has_many :comments, -> { where(author_id: 1) }
# has_many :employees, -> { joins(:address) }
# has_many :posts, ->(blog) { where("max_post_length > ?", blog.max_post_length) }
#
# === Extensions
#
# The +extension+ argument allows you to pass a block into a has_many
# association. This is useful for adding new finders, creators, and other
# factory-type methods to be used as part of the association.
#
# Extension examples:
# has_many :employees do
# def find_or_create_by_name(name)
# first_name, last_name = name.split(" ", 2)
# find_or_create_by(first_name: first_name, last_name: last_name)
# end
# end
#
# === Options
# [:class_name]
# Specify the class name of the association. Use it only if that name can't be inferred
# from the association name. So <tt>has_many :products</tt> will by default be linked
# to the +Product+ class, but if the real class name is +SpecialProduct+, you'll have to
# specify it with this option.
# [:foreign_key]
# Specify the foreign key used for the association. By default this is guessed to be the name
# of this class in lower-case and "_id" suffixed. So a Person class that makes a #has_many
# association will use "person_id" as the default <tt>:foreign_key</tt>.
#
# Setting the <tt>:foreign_key</tt> option prevents automatic detection of the association's
# inverse, so it is generally a good idea to set the <tt>:inverse_of</tt> option as well.
# [:foreign_type]
# Specify the column used to store the associated object's type, if this is a polymorphic
# association. By default this is guessed to be the name of the polymorphic association
# specified on "as" option with a "_type" suffix. So a class that defines a
# <tt>has_many :tags, as: :taggable</tt> association will use "taggable_type" as the
# default <tt>:foreign_type</tt>.
# [:primary_key]
# Specify the name of the column to use as the primary key for the association. By default this is +id+.
# [:dependent]
# Controls what happens to the associated objects when
# their owner is destroyed. Note that these are implemented as
# callbacks, and Rails executes callbacks in order. Therefore, other
# similar callbacks may affect the <tt>:dependent</tt> behavior, and the
# <tt>:dependent</tt> behavior may affect other callbacks.
#
# * <tt>nil</tt> do nothing (default).
# * <tt>:destroy</tt> causes all the associated objects to also be destroyed.
# * <tt>:destroy_async</tt> destroys all the associated objects in a background job. <b>WARNING:</b> Do not use
# this option if the association is backed by foreign key constraints in your database. The foreign key
# constraint actions will occur inside the same transaction that deletes its owner.
# * <tt>:delete_all</tt> causes all the associated objects to be deleted directly from the database (so callbacks will not be executed).
# * <tt>:nullify</tt> causes the foreign keys to be set to +NULL+. Polymorphic type will also be nullified
# on polymorphic associations. Callbacks are not executed.
# * <tt>:restrict_with_exception</tt> causes an <tt>ActiveRecord::DeleteRestrictionError</tt> exception to be raised if there are any associated records.
# * <tt>:restrict_with_error</tt> causes an error to be added to the owner if there are any associated objects.
#
# If using with the <tt>:through</tt> option, the association on the join model must be
# a #belongs_to, and the records which get deleted are the join records, rather than
# the associated records.
#
# If using <tt>dependent: :destroy</tt> on a scoped association, only the scoped objects are destroyed.
# For example, if a Post model defines
# <tt>has_many :comments, -> { where published: true }, dependent: :destroy</tt> and <tt>destroy</tt> is
# called on a post, only published comments are destroyed. This means that any unpublished comments in the
# database would still contain a foreign key pointing to the now deleted post.
# [:counter_cache]
# This option can be used to configure a custom-named <tt>:counter_cache</tt>. You only need this option
# when you customized the name of your <tt>:counter_cache</tt> on the #belongs_to association.
# [:as]
# Specifies a polymorphic interface (See #belongs_to).
# [:through]
# Specifies an association through which to perform the query. This can be any other type
# of association, including other <tt>:through</tt> associations. Options for <tt>:class_name</tt>,
# <tt>:primary_key</tt> and <tt>:foreign_key</tt> are ignored, as the association uses the
# source reflection.
#
# If the association on the join model is a #belongs_to, the collection can be modified
# and the records on the <tt>:through</tt> model will be automatically created and removed
# as appropriate. Otherwise, the collection is read-only, so you should manipulate the
# <tt>:through</tt> association directly.
#
# If you are going to modify the association (rather than just read from it), then it is
# a good idea to set the <tt>:inverse_of</tt> option on the source association on the
# join model. This allows associated records to be built which will automatically create
# the appropriate join model records when they are saved. (See the 'Association Join Models'
# and 'Setting Inverses' sections above.)
# [:disable_joins]
# Specifies whether joins should be skipped for an association. If set to true, two or more queries
# will be generated. Note that in some cases, if order or limit is applied, it will be done in-memory
# due to database limitations. This option is only applicable on <tt>has_many :through</tt> associations as
# +has_many+ alone does not perform a join.
# [:source]
# Specifies the source association name used by #has_many <tt>:through</tt> queries.
# Only use it if the name cannot be inferred from the association.
# <tt>has_many :subscribers, through: :subscriptions</tt> will look for either <tt>:subscribers</tt> or
# <tt>:subscriber</tt> on Subscription, unless a <tt>:source</tt> is given.
# [:source_type]
# Specifies the type of the source association used by #has_many <tt>:through</tt> queries where the source
# association is a polymorphic #belongs_to.
# [:validate]
# When set to +true+, validates new objects added to association when saving the parent object. +true+ by default.
# If you want to ensure associated objects are revalidated on every update, use +validates_associated+.
# [:autosave]
# If true, always save the associated objects or destroy them if marked for destruction,
# when saving the parent object. If false, never save or destroy the associated objects.
# By default, only save associated objects that are new records. This option is implemented as a
# +before_save+ callback. Because callbacks are run in the order they are defined, associated objects
# may need to be explicitly saved in any user-defined +before_save+ callbacks.
#
# Note that NestedAttributes::ClassMethods#accepts_nested_attributes_for sets
# <tt>:autosave</tt> to <tt>true</tt>.
# [:inverse_of]
# Specifies the name of the #belongs_to association on the associated object
# that is the inverse of this #has_many association.
# See ActiveRecord::Associations::ClassMethods's overview on Bi-directional associations for more detail.
# [:extend]
# Specifies a module or array of modules that will be extended into the association object returned.
# Useful for defining methods on associations, especially when they should be shared between multiple
# association objects.
# [:strict_loading]
# When set to +true+, enforces strict loading every time the associated record is loaded through this
# association.
# [:ensuring_owner_was]
# Specifies an instance method to be called on the owner. The method must return true in order for the
# associated records to be deleted in a background job.
#
# Option examples:
# has_many :comments, -> { order("posted_on") }
# has_many :comments, -> { includes(:author) }
# has_many :people, -> { where(deleted: false).order("name") }, class_name: "Person"
# has_many :tracks, -> { order("position") }, dependent: :destroy
# has_many :comments, dependent: :nullify
# has_many :tags, as: :taggable
# has_many :reports, -> { readonly }
# has_many :subscribers, through: :subscriptions, source: :user
# has_many :subscribers, through: :subscriptions, disable_joins: true
# has_many :comments, strict_loading: true
def has_many(name, scope = nil, **options, &extension)
reflection = Builder::HasMany.build(self, name, scope, options, &extension)
Reflection.add_reflection self, name, reflection
end
# Specifies a one-to-one association with another class. This method should only be used
# if the other class contains the foreign key. If the current class contains the foreign key,
# then you should use #belongs_to instead. See also ActiveRecord::Associations::ClassMethods's overview
# on when to use #has_one and when to use #belongs_to.
#
# The following methods for retrieval and query of a single associated object will be added:
#
# +association+ is a placeholder for the symbol passed as the +name+ argument, so
# <tt>has_one :manager</tt> would add among others <tt>manager.nil?</tt>.
#
# [association]
# Returns the associated object. +nil+ is returned if none is found.
# [association=(associate)]
# Assigns the associate object, extracts the primary key, sets it as the foreign key,
# and saves the associate object. To avoid database inconsistencies, permanently deletes an existing
# associated object when assigning a new one, even if the new one isn't saved to the database.
# [build_association(attributes = {})]
# Returns a new object of the associated type that has been instantiated
# with +attributes+ and linked to this object through a foreign key, but has not
# yet been saved.
# [create_association(attributes = {})]
# Returns a new object of the associated type that has been instantiated
# with +attributes+, linked to this object through a foreign key, and that
# has already been saved (if it passed the validation).
# [create_association!(attributes = {})]
# Does the same as <tt>create_association</tt>, but raises ActiveRecord::RecordInvalid
# if the record is invalid.
# [reload_association]
# Returns the associated object, forcing a database read.
#
# === Example
#
# An Account class declares <tt>has_one :beneficiary</tt>, which will add:
# * <tt>Account#beneficiary</tt> (similar to <tt>Beneficiary.where(account_id: id).first</tt>)
# * <tt>Account#beneficiary=(beneficiary)</tt> (similar to <tt>beneficiary.account_id = account.id; beneficiary.save</tt>)
# * <tt>Account#build_beneficiary</tt> (similar to <tt>Beneficiary.new(account_id: id)</tt>)
# * <tt>Account#create_beneficiary</tt> (similar to <tt>b = Beneficiary.new(account_id: id); b.save; b</tt>)
# * <tt>Account#create_beneficiary!</tt> (similar to <tt>b = Beneficiary.new(account_id: id); b.save!; b</tt>)
# * <tt>Account#reload_beneficiary</tt>
#
# === Scopes
#
# You can pass a second argument +scope+ as a callable (i.e. proc or
# lambda) to retrieve a specific record or customize the generated query
# when you access the associated object.
#
# Scope examples:
# has_one :author, -> { where(comment_id: 1) }
# has_one :employer, -> { joins(:company) }
# has_one :latest_post, ->(blog) { where("created_at > ?", blog.enabled_at) }
#
# === Options
#
# The declaration can also include an +options+ hash to specialize the behavior of the association.
#
# Options are:
# [:class_name]
# Specify the class name of the association. Use it only if that name can't be inferred
# from the association name. So <tt>has_one :manager</tt> will by default be linked to the Manager class, but
# if the real class name is Person, you'll have to specify it with this option.
# [:dependent]
# Controls what happens to the associated object when
# its owner is destroyed:
#
# * <tt>nil</tt> do nothing (default).
# * <tt>:destroy</tt> causes the associated object to also be destroyed
# * <tt>:destroy_async</tt> causes the associated object to be destroyed in a background job. <b>WARNING:</b> Do not use
# this option if the association is backed by foreign key constraints in your database. The foreign key
# constraint actions will occur inside the same transaction that deletes its owner.
# * <tt>:delete</tt> causes the associated object to be deleted directly from the database (so callbacks will not execute)
# * <tt>:nullify</tt> causes the foreign key to be set to +NULL+. Polymorphic type column is also nullified
# on polymorphic associations. Callbacks are not executed.
# * <tt>:restrict_with_exception</tt> causes an <tt>ActiveRecord::DeleteRestrictionError</tt> exception to be raised if there is an associated record
# * <tt>:restrict_with_error</tt> causes an error to be added to the owner if there is an associated object
#
# Note that the <tt>:dependent</tt> option is ignored when using the <tt>:through</tt> option.
# [:foreign_key]
# Specify the foreign key used for the association. By default this is guessed to be the name
# of this class in lower-case and "_id" suffixed. So a Person class that makes a #has_one association
# will use "person_id" as the default <tt>:foreign_key</tt>.
#
# Setting the <tt>:foreign_key</tt> option prevents automatic detection of the association's
# inverse, so it is generally a good idea to set the <tt>:inverse_of</tt> option as well.
# [:foreign_type]
# Specify the column used to store the associated object's type, if this is a polymorphic
# association. By default this is guessed to be the name of the polymorphic association
# specified on "as" option with a "_type" suffix. So a class that defines a
# <tt>has_one :tag, as: :taggable</tt> association will use "taggable_type" as the
# default <tt>:foreign_type</tt>.
# [:primary_key]
# Specify the method that returns the primary key used for the association. By default this is +id+.
# [:as]
# Specifies a polymorphic interface (See #belongs_to).
# [:through]
# Specifies a Join Model through which to perform the query. Options for <tt>:class_name</tt>,
# <tt>:primary_key</tt>, and <tt>:foreign_key</tt> are ignored, as the association uses the
# source reflection. You can only use a <tt>:through</tt> query through a #has_one
# or #belongs_to association on the join model.
#
# If the association on the join model is a #belongs_to, the collection can be modified
# and the records on the <tt>:through</tt> model will be automatically created and removed
# as appropriate. Otherwise, the collection is read-only, so you should manipulate the
# <tt>:through</tt> association directly.
#
# If you are going to modify the association (rather than just read from it), then it is
# a good idea to set the <tt>:inverse_of</tt> option on the source association on the
# join model. This allows associated records to be built which will automatically create
# the appropriate join model records when they are saved. (See the 'Association Join Models'
# and 'Setting Inverses' sections above.)
# [:disable_joins]
# Specifies whether joins should be skipped for an association. If set to true, two or more queries
# will be generated. Note that in some cases, if order or limit is applied, it will be done in-memory
# due to database limitations. This option is only applicable on <tt>has_one :through</tt> associations as
# +has_one+ alone does not perform a join.
# [:source]
# Specifies the source association name used by #has_one <tt>:through</tt> queries.
# Only use it if the name cannot be inferred from the association.
# <tt>has_one :favorite, through: :favorites</tt> will look for a
# <tt>:favorite</tt> on Favorite, unless a <tt>:source</tt> is given.
# [:source_type]
# Specifies the type of the source association used by #has_one <tt>:through</tt> queries where the source
# association is a polymorphic #belongs_to.
# [:validate]
# When set to +true+, validates new objects added to association when saving the parent object. +false+ by default.
# If you want to ensure associated objects are revalidated on every update, use +validates_associated+.
# [:autosave]
# If true, always save the associated object or destroy it if marked for destruction,
# when saving the parent object. If false, never save or destroy the associated object.
# By default, only save the associated object if it's a new record.
#
# Note that NestedAttributes::ClassMethods#accepts_nested_attributes_for sets
# <tt>:autosave</tt> to <tt>true</tt>.
# [:inverse_of]
# Specifies the name of the #belongs_to association on the associated object
# that is the inverse of this #has_one association.
# See ActiveRecord::Associations::ClassMethods's overview on Bi-directional associations for more detail.
# [:required]
# When set to +true+, the association will also have its presence validated.
# This will validate the association itself, not the id. You can use
# +:inverse_of+ to avoid an extra query during validation.
# [:strict_loading]
# Enforces strict loading every time the associated record is loaded through this association.
# [:ensuring_owner_was]
# Specifies an instance method to be called on the owner. The method must return true in order for the
# associated records to be deleted in a background job.
#
# Option examples:
# has_one :credit_card, dependent: :destroy # destroys the associated credit card
# has_one :credit_card, dependent: :nullify # updates the associated record's foreign
# # key value to NULL rather than destroying it
# has_one :last_comment, -> { order('posted_on') }, class_name: "Comment"
# has_one :project_manager, -> { where(role: 'project_manager') }, class_name: "Person"
# has_one :attachment, as: :attachable
# has_one :boss, -> { readonly }
# has_one :club, through: :membership
# has_one :club, through: :membership, disable_joins: true
# has_one :primary_address, -> { where(primary: true) }, through: :addressables, source: :addressable
# has_one :credit_card, required: true
# has_one :credit_card, strict_loading: true
def has_one(name, scope = nil, **options)
reflection = Builder::HasOne.build(self, name, scope, options)
Reflection.add_reflection self, name, reflection
end
# Specifies a one-to-one association with another class. This method should only be used
# if this class contains the foreign key. If the other class contains the foreign key,
# then you should use #has_one instead. See also ActiveRecord::Associations::ClassMethods's overview
# on when to use #has_one and when to use #belongs_to.
#
# Methods will be added for retrieval and query for a single associated object, for which
# this object holds an id:
#
# +association+ is a placeholder for the symbol passed as the +name+ argument, so
# <tt>belongs_to :author</tt> would add among others <tt>author.nil?</tt>.
#
# [association]
# Returns the associated object. +nil+ is returned if none is found.
# [association=(associate)]
# Assigns the associate object, extracts the primary key, and sets it as the foreign key.
# No modification or deletion of existing records takes place.
# [build_association(attributes = {})]
# Returns a new object of the associated type that has been instantiated
# with +attributes+ and linked to this object through a foreign key, but has not yet been saved.
# [create_association(attributes = {})]
# Returns a new object of the associated type that has been instantiated
# with +attributes+, linked to this object through a foreign key, and that
# has already been saved (if it passed the validation).
# [create_association!(attributes = {})]
# Does the same as <tt>create_association</tt>, but raises ActiveRecord::RecordInvalid
# if the record is invalid.
# [reload_association]
# Returns the associated object, forcing a database read.
# [association_changed?]
# Returns true if a new associate object has been assigned and the next save will update the foreign key.
# [association_previously_changed?]
# Returns true if the previous save updated the association to reference a new associate object.
#
# === Example
#
# A Post class declares <tt>belongs_to :author</tt>, which will add:
# * <tt>Post#author</tt> (similar to <tt>Author.find(author_id)</tt>)
# * <tt>Post#author=(author)</tt> (similar to <tt>post.author_id = author.id</tt>)
# * <tt>Post#build_author</tt> (similar to <tt>post.author = Author.new</tt>)
# * <tt>Post#create_author</tt> (similar to <tt>post.author = Author.new; post.author.save; post.author</tt>)
# * <tt>Post#create_author!</tt> (similar to <tt>post.author = Author.new; post.author.save!; post.author</tt>)
# * <tt>Post#reload_author</tt>
# * <tt>Post#author_changed?</tt>
# * <tt>Post#author_previously_changed?</tt>
# The declaration can also include an +options+ hash to specialize the behavior of the association.
#
# === Scopes
#
# You can pass a second argument +scope+ as a callable (i.e. proc or
# lambda) to retrieve a specific record or customize the generated query
# when you access the associated object.
#
# Scope examples:
# belongs_to :firm, -> { where(id: 2) }
# belongs_to :user, -> { joins(:friends) }
# belongs_to :level, ->(game) { where("game_level > ?", game.current_level) }
#
# === Options
#
# [:class_name]
# Specify the class name of the association. Use it only if that name can't be inferred
# from the association name. So <tt>belongs_to :author</tt> will by default be linked to the Author class, but
# if the real class name is Person, you'll have to specify it with this option.
# [:foreign_key]
# Specify the foreign key used for the association. By default this is guessed to be the name
# of the association with an "_id" suffix. So a class that defines a <tt>belongs_to :person</tt>
# association will use "person_id" as the default <tt>:foreign_key</tt>. Similarly,
# <tt>belongs_to :favorite_person, class_name: "Person"</tt> will use a foreign key
# of "favorite_person_id".
#
# Setting the <tt>:foreign_key</tt> option prevents automatic detection of the association's
# inverse, so it is generally a good idea to set the <tt>:inverse_of</tt> option as well.
# [:foreign_type]
# Specify the column used to store the associated object's type, if this is a polymorphic
# association. By default this is guessed to be the name of the association with a "_type"
# suffix. So a class that defines a <tt>belongs_to :taggable, polymorphic: true</tt>
# association will use "taggable_type" as the default <tt>:foreign_type</tt>.
# [:primary_key]
# Specify the method that returns the primary key of associated object used for the association.
# By default this is +id+.
# [:dependent]
# If set to <tt>:destroy</tt>, the associated object is destroyed when this object is. If set to
# <tt>:delete</tt>, the associated object is deleted *without* calling its destroy method. If set to
# <tt>:destroy_async</tt>, the associated object is scheduled to be destroyed in a background job.
# This option should not be specified when #belongs_to is used in conjunction with
# a #has_many relationship on another class because of the potential to leave
# orphaned records behind.
# [:counter_cache]
# Caches the number of belonging objects on the associate class through the use of CounterCache::ClassMethods#increment_counter
# and CounterCache::ClassMethods#decrement_counter. The counter cache is incremented when an object of this
# class is created and decremented when it's destroyed. This requires that a column
# named <tt>#{table_name}_count</tt> (such as +comments_count+ for a belonging Comment class)
# is used on the associate class (such as a Post class) - that is, the migration for
# <tt>#{table_name}_count</tt> is created on the associate class (such that <tt>Post#comments_count</tt> will
# return the cached count, see note below). You can also specify a custom counter
# cache column by providing a column name instead of a +true+/+false+ value to this
# option (e.g., <tt>counter_cache: :my_custom_counter</tt>.)
# Note: Specifying a counter cache will add it to that model's list of readonly attributes
# using +attr_readonly+.
# [:polymorphic]
# Specify that this association is a polymorphic association by passing +true+.
# Note: If you've enabled the counter cache, then you may want to add the counter cache attribute
# to the +attr_readonly+ list in the associated classes (e.g. <tt>class Post; attr_readonly :comments_count; end</tt>).
# [:validate]
# When set to +true+, validates new objects added to association when saving the parent object. +false+ by default.
# If you want to ensure associated objects are revalidated on every update, use +validates_associated+.
# [:autosave]
# If true, always save the associated object or destroy it if marked for destruction, when
# saving the parent object.
# If false, never save or destroy the associated object.
# By default, only save the associated object if it's a new record.
#
# Note that NestedAttributes::ClassMethods#accepts_nested_attributes_for
# sets <tt>:autosave</tt> to <tt>true</tt>.
# [:touch]
# If true, the associated object will be touched (the updated_at/on attributes set to current time)
# when this record is either saved or destroyed. If you specify a symbol, that attribute
# will be updated with the current time in addition to the updated_at/on attribute.
# Please note that with touching no validation is performed and only the +after_touch+,
# +after_commit+ and +after_rollback+ callbacks are executed.
# [:inverse_of]
# Specifies the name of the #has_one or #has_many association on the associated
# object that is the inverse of this #belongs_to association.
# See ActiveRecord::Associations::ClassMethods's overview on Bi-directional associations for more detail.
# [:optional]
# When set to +true+, the association will not have its presence validated.
# [:required]
# When set to +true+, the association will also have its presence validated.
# This will validate the association itself, not the id. You can use
# +:inverse_of+ to avoid an extra query during validation.
# NOTE: <tt>required</tt> is set to <tt>true</tt> by default and is deprecated. If
# you don't want to have association presence validated, use <tt>optional: true</tt>.
# [:default]
# Provide a callable (i.e. proc or lambda) to specify that the association should
# be initialized with a particular record before validation.
# [:strict_loading]
# Enforces strict loading every time the associated record is loaded through this association.
# [:ensuring_owner_was]
# Specifies an instance method to be called on the owner. The method must return true in order for the
# associated records to be deleted in a background job.
#
# Option examples:
# belongs_to :firm, foreign_key: "client_of"
# belongs_to :person, primary_key: "name", foreign_key: "person_name"
# belongs_to :author, class_name: "Person", foreign_key: "author_id"
# belongs_to :valid_coupon, ->(o) { where "discounts > ?", o.payments_count },
# class_name: "Coupon", foreign_key: "coupon_id"
# belongs_to :attachable, polymorphic: true
# belongs_to :project, -> { readonly }
# belongs_to :post, counter_cache: true
# belongs_to :comment, touch: true
# belongs_to :company, touch: :employees_last_updated_at
# belongs_to :user, optional: true
# belongs_to :account, default: -> { company.account }
# belongs_to :account, strict_loading: true
def belongs_to(name, scope = nil, **options)
reflection = Builder::BelongsTo.build(self, name, scope, options)
Reflection.add_reflection self, name, reflection
end
# Specifies a many-to-many relationship with another class. This associates two classes via an
# intermediate join table. Unless the join table is explicitly specified as an option, it is
# guessed using the lexical order of the class names. So a join between Developer and Project
# will give the default join table name of "developers_projects" because "D" precedes "P" alphabetically.
# Note that this precedence is calculated using the <tt><</tt> operator for String. This
# means that if the strings are of different lengths, and the strings are equal when compared
# up to the shortest length, then the longer string is considered of higher
# lexical precedence than the shorter one. For example, one would expect the tables "paper_boxes" and "papers"
# to generate a join table name of "papers_paper_boxes" because of the length of the name "paper_boxes",
# but it in fact generates a join table name of "paper_boxes_papers". Be aware of this caveat, and use the
# custom <tt>:join_table</tt> option if you need to.
# If your tables share a common prefix, it will only appear once at the beginning. For example,
# the tables "catalog_categories" and "catalog_products" generate a join table name of "catalog_categories_products".
#
# The join table should not have a primary key or a model associated with it. You must manually generate the
# join table with a migration such as this:
#
# class CreateDevelopersProjectsJoinTable < ActiveRecord::Migration[7.1]
# def change
# create_join_table :developers, :projects
# end
# end
#
# It's also a good idea to add indexes to each of those columns to speed up the joins process.
# However, in MySQL it is advised to add a compound index for both of the columns as MySQL only
# uses one index per table during the lookup.
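#
# A minimal sketch of such a migration (the migration name and index layout are
# illustrative, not prescribed by Rails):
#
#   class AddIndexesToDevelopersProjects < ActiveRecord::Migration[7.1]
#     def change
#       add_index :developers_projects, [:developer_id, :project_id]
#       add_index :developers_projects, :project_id
#     end
#   end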
#
# Adds the following methods for retrieval and query:
#
# +collection+ is a placeholder for the symbol passed as the +name+ argument, so
# <tt>has_and_belongs_to_many :categories</tt> would add among others <tt>categories.empty?</tt>.
#
# [collection]
# Returns a Relation of all the associated objects.
# An empty Relation is returned if none are found.
# [collection<<(object, ...)]
# Adds one or more objects to the collection by creating associations in the join table
# (<tt>collection.push</tt> and <tt>collection.concat</tt> are aliases to this method).
# Note that this operation instantly fires update SQL without waiting for the save or update call on the
# parent object, unless the parent object is a new record.
# [collection.delete(object, ...)]
# Removes one or more objects from the collection by removing their associations from the join table.
# This does not destroy the objects.
# [collection.destroy(object, ...)]
# Removes one or more objects from the collection by running destroy on each association in the join table, overriding any dependent option.
# This does not destroy the objects.
# [collection=objects]
# Replaces the collection's content by deleting and adding objects as appropriate.
# [collection_singular_ids]
# Returns an array of the associated objects' ids.
# [collection_singular_ids=ids]
# Replace the collection by the objects identified by the primary keys in +ids+.
# [collection.clear]
# Removes every object from the collection. This does not destroy the objects.
# [collection.empty?]
# Returns +true+ if there are no associated objects.
# [collection.size]
# Returns the number of associated objects.
# [collection.find(id)]
# Finds an associated object with the given +id+, provided that it
# is associated with this object.
# Uses the same rules as ActiveRecord::FinderMethods#find.
# [collection.exists?(...)]
# Checks whether an associated object with the given conditions exists.
# Uses the same rules as ActiveRecord::FinderMethods#exists?.
# [collection.build(attributes = {})]
# Returns a new object of the collection type that has been instantiated
# with +attributes+ and linked to this object through the join table, but has not yet been saved.
# [collection.create(attributes = {})]
# Returns a new object of the collection type that has been instantiated
# with +attributes+, linked to this object through the join table, and that has already been
# saved (if it passed the validation).
# [collection.reload]
# Returns a Relation of all of the associated objects, forcing a database read.
# An empty Relation is returned if none are found.
#
# === Example
#
# A Developer class declares <tt>has_and_belongs_to_many :projects</tt>, which will add:
# * <tt>Developer#projects</tt>
# * <tt>Developer#projects<<</tt>
# * <tt>Developer#projects.delete</tt>
# * <tt>Developer#projects.destroy</tt>
# * <tt>Developer#projects=</tt>
# * <tt>Developer#project_ids</tt>
# * <tt>Developer#project_ids=</tt>
# * <tt>Developer#projects.clear</tt>
# * <tt>Developer#projects.empty?</tt>
# * <tt>Developer#projects.size</tt>
# * <tt>Developer#projects.find(id)</tt>
# * <tt>Developer#projects.exists?(...)</tt>
# * <tt>Developer#projects.build</tt> (similar to <tt>Project.new(developer_id: id)</tt>)
# * <tt>Developer#projects.create</tt> (similar to <tt>c = Project.new(developer_id: id); c.save; c</tt>)
# * <tt>Developer#projects.reload</tt>
# The declaration may include an +options+ hash to specialize the behavior of the association.
#
# === Scopes
#
# You can pass a second argument +scope+ as a callable (i.e. proc or
# lambda) to retrieve a specific set of records or customize the generated
# query when you access the associated collection.
#
# Scope examples:
# has_and_belongs_to_many :projects, -> { includes(:milestones, :manager) }
# has_and_belongs_to_many :categories, ->(post) {
# where("default_category = ?", post.default_category)
# }
#
# === Extensions
#
# The +extension+ argument allows you to pass a block into a
# has_and_belongs_to_many association. This is useful for adding new
# finders, creators, and other factory-type methods to be used as part of
# the association.
#
# Extension examples:
# has_and_belongs_to_many :contractors do
# def find_or_create_by_name(name)
# first_name, last_name = name.split(" ", 2)
# find_or_create_by(first_name: first_name, last_name: last_name)
# end
# end
#
# === Options
#
# [:class_name]
# Specify the class name of the association. Use it only if that name can't be inferred
# from the association name. So <tt>has_and_belongs_to_many :projects</tt> will by default be linked to the
# Project class, but if the real class name is SuperProject, you'll have to specify it with this option.
# [:join_table]
# Specify the name of the join table if the default based on lexical order isn't what you want.
# <b>WARNING:</b> If you're overwriting the table name of either class, the +table_name+ method
# MUST be declared underneath any #has_and_belongs_to_many declaration in order to work.
# [:foreign_key]
# Specify the foreign key used for the association. By default this is guessed to be the name
# of this class in lower-case and "_id" suffixed. So a Person class that makes
# a #has_and_belongs_to_many association to Project will use "person_id" as the
# default <tt>:foreign_key</tt>.
#
# Setting the <tt>:foreign_key</tt> option prevents automatic detection of the association's
# inverse, so it is generally a good idea to set the <tt>:inverse_of</tt> option as well.
# [:association_foreign_key]
# Specify the foreign key used for the association on the receiving side of the association.
# By default this is guessed to be the name of the associated class in lower-case and "_id" suffixed.
# So if a Person class makes a #has_and_belongs_to_many association to Project,
# the association will use "project_id" as the default <tt>:association_foreign_key</tt>.
# [:validate]
# When set to +true+, validates new objects added to association when saving the parent object. +true+ by default.
# If you want to ensure associated objects are revalidated on every update, use +validates_associated+.
# [:autosave]
# If true, always save the associated objects or destroy them if marked for destruction, when
# saving the parent object.
# If false, never save or destroy the associated objects.
# By default, only save associated objects that are new records.
#
# Note that NestedAttributes::ClassMethods#accepts_nested_attributes_for sets
# <tt>:autosave</tt> to <tt>true</tt>.
# [:strict_loading]
# Enforces strict loading every time an associated record is loaded through this association.
#
# Option examples:
# has_and_belongs_to_many :projects
# has_and_belongs_to_many :projects, -> { includes(:milestones, :manager) }
# has_and_belongs_to_many :nations, class_name: "Country"
# has_and_belongs_to_many :categories, join_table: "prods_cats"
# has_and_belongs_to_many :categories, -> { readonly }
# has_and_belongs_to_many :categories, strict_loading: true
def has_and_belongs_to_many(name, scope = nil, **options, &extension)
habtm_reflection = ActiveRecord::Reflection::HasAndBelongsToManyReflection.new(name, scope, options, self)
builder = Builder::HasAndBelongsToMany.new name, self, options
join_model = builder.through_model
const_set join_model.name, join_model
private_constant join_model.name
middle_reflection = builder.middle_reflection join_model
Builder::HasMany.define_callbacks self, middle_reflection
Reflection.add_reflection self, middle_reflection.name, middle_reflection
middle_reflection.parent_reflection = habtm_reflection
include Module.new {
class_eval <<-RUBY, __FILE__, __LINE__ + 1
def destroy_associations
association(:#{middle_reflection.name}).delete_all(:delete_all)
association(:#{name}).reset
super
end
RUBY
}
hm_options = {}
hm_options[:through] = middle_reflection.name
hm_options[:source] = join_model.right_reflection.name
[:before_add, :after_add, :before_remove, :after_remove, :autosave, :validate, :join_table, :class_name, :extend, :strict_loading].each do |k|
hm_options[k] = options[k] if options.key? k
end
has_many name, scope, **hm_options, &extension
_reflections[name.to_s].parent_reflection = habtm_reflection
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class AsynchronousQueriesTracker # :nodoc:
module NullSession # :nodoc:
class << self
def active?
true
end
def finalize
end
end
end
class Session # :nodoc:
def initialize
@active = true
end
def active?
@active
end
def finalize
@active = false
end
end
class << self
def install_executor_hooks(executor = ActiveSupport::Executor)
executor.register_hook(self)
end
def run
ActiveRecord::Base.asynchronous_queries_tracker.start_session
end
def complete(asynchronous_queries_tracker)
asynchronous_queries_tracker.finalize_session
end
end
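# A rough usage sketch (assuming the standard Rails executor wiring): the
# tracker registers run/complete hooks so each unit of work gets its own
# session, e.g.
#
#   ActiveRecord::AsynchronousQueriesTracker.install_executor_hooks
#   ActiveSupport::Executor.wrap do
#     # async queries scheduled here share one tracker session,
#     # which is finalized when the block completes
#   end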
attr_reader :current_session
def initialize
@current_session = NullSession
end
def start_session
@current_session = Session.new
self
end
def finalize_session
@current_session.finalize
@current_session = NullSession
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Attributes
class Attribute < Struct.new :relation, :name
include Arel::Expressions
include Arel::Predications
include Arel::AliasPredication
include Arel::OrderPredications
include Arel::Math
def type_caster
relation.type_for_attribute(name)
end
###
# Create a node for lowering this attribute
def lower
relation.lower self
end
def type_cast_for_database(value)
relation.type_cast_for_database(name, value)
end
def able_to_type_cast?
relation.able_to_type_cast?
end
end
end
Attribute = Attributes::Attribute
end
# frozen_string_literal: true
require "active_model/forbidden_attributes_protection"
module ActiveRecord
module AttributeAssignment
include ActiveModel::AttributeAssignment
private
# Splits the given attributes into plain, nested (hash-valued), and
# multiparameter (keys containing "(") groups; plain values are assigned
# immediately, the rest are deferred to the helpers below.
def _assign_attributes(attributes)
multi_parameter_attributes = nested_parameter_attributes = nil
attributes.each do |k, v|
key = k.to_s
if key.include?("(")
(multi_parameter_attributes ||= {})[key] = v
elsif v.is_a?(Hash)
(nested_parameter_attributes ||= {})[key] = v
else
_assign_attribute(key, v)
end
end
assign_nested_parameter_attributes(nested_parameter_attributes) if nested_parameter_attributes
assign_multiparameter_attributes(multi_parameter_attributes) if multi_parameter_attributes
end
# Assign any deferred nested attributes after the base attributes have been set.
def assign_nested_parameter_attributes(pairs)
pairs.each { |k, v| _assign_attribute(k, v) }
end
# Instantiates objects for all attribute classes that need more than one constructor parameter. This is done
# by calling new on the column type or aggregation type (through composed_of) object with these parameters.
# So having the pairs written_on(1) = "2004", written_on(2) = "6", written_on(3) = "24", will instantiate
# written_on (a date type) with Date.new("2004", "6", "24"). You can also specify a typecast character in the
# parentheses to have the parameters typecasted before they're used in the constructor. Use i for Integer and
# f for Float. If all the values for a given attribute are empty, the attribute will be set to +nil+.
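# For illustration, with a hypothetical Topic model that has a written_on
# date column, Rails form parameters like these end up as a single Date:
#
#   topic.attributes = {
#     "written_on(1i)" => "2004",
#     "written_on(2i)" => "6",
#     "written_on(3i)" => "24"
#   }
#   topic.written_on.to_s # => "2004-06-24"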
def assign_multiparameter_attributes(pairs)
execute_callstack_for_multiparameter_attributes(
extract_callstack_for_multiparameter_attributes(pairs)
)
end
def execute_callstack_for_multiparameter_attributes(callstack)
errors = []
callstack.each do |name, values_with_empty_parameters|
if values_with_empty_parameters.each_value.all?(NilClass)
values = nil
else
values = values_with_empty_parameters
end
send("#{name}=", values)
rescue => ex
errors << AttributeAssignmentError.new("error on assignment #{values_with_empty_parameters.values.inspect} to #{name} (#{ex.message})", ex, name)
end
unless errors.empty?
error_descriptions = errors.map(&:message).join(",")
raise MultiparameterAssignmentErrors.new(errors), "#{errors.size} error(s) on assignment of multiparameter attributes [#{error_descriptions}]"
end
end
def extract_callstack_for_multiparameter_attributes(pairs)
attributes = {}
pairs.each do |(multiparameter_name, value)|
attribute_name = multiparameter_name.split("(").first
attributes[attribute_name] ||= {}
parameter_value = value.empty? ? nil : type_cast_attribute_value(multiparameter_name, value)
attributes[attribute_name][find_parameter_position(multiparameter_name)] ||= parameter_value
end
attributes
end
def type_cast_attribute_value(multiparameter_name, value)
multiparameter_name =~ /\([0-9]*([if])\)/ ? value.send("to_" + $1) : value
end
def find_parameter_position(multiparameter_name)
multiparameter_name.scan(/\(([0-9]*).*\)/).first.first.to_i
end
end
end
# frozen_string_literal: true
require "mutex_m"
require "active_support/core_ext/enumerable"
module ActiveRecord
# = Active Record Attribute Methods
module AttributeMethods
extend ActiveSupport::Concern
include ActiveModel::AttributeMethods
included do
initialize_generated_modules
include Read
include Write
include BeforeTypeCast
include Query
include PrimaryKey
include TimeZoneConversion
include Dirty
include Serialization
end
RESTRICTED_CLASS_METHODS = %w(private public protected allocate new name parent superclass)
class GeneratedAttributeMethods < Module # :nodoc:
include Mutex_m
end
class << self
def dangerous_attribute_methods # :nodoc:
@dangerous_attribute_methods ||= (
Base.instance_methods +
Base.private_instance_methods -
Base.superclass.instance_methods -
Base.superclass.private_instance_methods
).map { |m| -m.to_s }.to_set.freeze
end
end
module ClassMethods
def inherited(child_class) # :nodoc:
child_class.initialize_generated_modules
super
end
def initialize_generated_modules # :nodoc:
@generated_attribute_methods = const_set(:GeneratedAttributeMethods, GeneratedAttributeMethods.new)
private_constant :GeneratedAttributeMethods
@attribute_methods_generated = false
include @generated_attribute_methods
super
end
# Generates all the attribute-related methods for columns in the database:
# accessors, mutators, and query methods.
def define_attribute_methods # :nodoc:
return false if @attribute_methods_generated
# Use a mutex; we don't want two threads simultaneously trying to define
# attribute methods.
generated_attribute_methods.synchronize do
return false if @attribute_methods_generated
superclass.define_attribute_methods unless base_class?
super(attribute_names)
@attribute_methods_generated = true
end
end
def undefine_attribute_methods # :nodoc:
generated_attribute_methods.synchronize do
super if defined?(@attribute_methods_generated) && @attribute_methods_generated
@attribute_methods_generated = false
end
end
# Raises an ActiveRecord::DangerousAttributeError exception when an
# \Active \Record method is defined in the model, otherwise +false+.
#
# class Person < ActiveRecord::Base
# def save
# 'already defined by Active Record'
# end
# end
#
# Person.instance_method_already_implemented?(:save)
# # => ActiveRecord::DangerousAttributeError: save is defined by Active Record. Check to make sure that you don't have an attribute or method with the same name.
#
# Person.instance_method_already_implemented?(:name)
# # => false
def instance_method_already_implemented?(method_name)
if dangerous_attribute_method?(method_name)
raise DangerousAttributeError, "#{method_name} is defined by Active Record. Check to make sure that you don't have an attribute or method with the same name."
end
if superclass == Base
super
else
# If ThisClass < ... < SomeSuperClass < ... < Base and SomeSuperClass
# defines its own attribute method, then we don't want to override that.
defined = method_defined_within?(method_name, superclass, Base) &&
! superclass.instance_method(method_name).owner.is_a?(GeneratedAttributeMethods)
defined || super
end
end
# A method name is 'dangerous' if it is already (re)defined by Active Record, but
# not by any ancestors. (So 'puts' is not dangerous but 'save' is.)
def dangerous_attribute_method?(name) # :nodoc:
::ActiveRecord::AttributeMethods.dangerous_attribute_methods.include?(name.to_s)
end
def method_defined_within?(name, klass, superklass = klass.superclass) # :nodoc:
if klass.method_defined?(name) || klass.private_method_defined?(name)
if superklass.method_defined?(name) || superklass.private_method_defined?(name)
klass.instance_method(name).owner != superklass.instance_method(name).owner
else
true
end
else
false
end
end
# A class method is 'dangerous' if it is already (re)defined by Active Record, but
# not by any ancestors. (So 'puts' is not dangerous but 'new' is.)
def dangerous_class_method?(method_name)
return true if RESTRICTED_CLASS_METHODS.include?(method_name.to_s)
if Base.respond_to?(method_name, true)
if Object.respond_to?(method_name, true)
Base.method(method_name).owner != Object.method(method_name).owner
else
true
end
else
false
end
end
# Returns +true+ if +attribute+ is an attribute method and table exists,
# +false+ otherwise.
#
# class Person < ActiveRecord::Base
# end
#
# Person.attribute_method?('name') # => true
# Person.attribute_method?(:age=) # => true
# Person.attribute_method?(:nothing) # => false
def attribute_method?(attribute)
super || (table_exists? && column_names.include?(attribute.to_s.delete_suffix("=")))
end
# Returns an array of column names as strings if it's not an abstract class and
# table exists. Otherwise it returns an empty array.
#
# class Person < ActiveRecord::Base
# end
#
# Person.attribute_names
# # => ["id", "created_at", "updated_at", "name", "age"]
def attribute_names
@attribute_names ||= if !abstract_class? && table_exists?
attribute_types.keys
else
[]
end.freeze
end
# Returns true if the given attribute exists, otherwise false.
#
# class Person < ActiveRecord::Base
# alias_attribute :new_name, :name
# end
#
# Person.has_attribute?('name') # => true
# Person.has_attribute?('new_name') # => true
# Person.has_attribute?(:age) # => true
# Person.has_attribute?(:nothing) # => false
def has_attribute?(attr_name)
attr_name = attr_name.to_s
attr_name = attribute_aliases[attr_name] || attr_name
attribute_types.key?(attr_name)
end
def _has_attribute?(attr_name) # :nodoc:
attribute_types.key?(attr_name)
end
end
# A Person object with a name attribute can ask <tt>person.respond_to?(:name)</tt>,
# <tt>person.respond_to?(:name=)</tt>, and <tt>person.respond_to?(:name?)</tt>
# which will all return +true+. It also defines the attribute methods if they have
# not been generated.
#
# class Person < ActiveRecord::Base
# end
#
# person = Person.new
# person.respond_to?(:name) # => true
# person.respond_to?(:name=) # => true
# person.respond_to?(:name?) # => true
# person.respond_to?('age') # => true
# person.respond_to?('age=') # => true
# person.respond_to?('age?') # => true
# person.respond_to?(:nothing) # => false
def respond_to?(name, include_private = false)
return false unless super
# If the result is true then check for the select case.
# For queries selecting a subset of columns, return false for unselected columns.
# We check defined?(@attributes) not to issue warnings if called on objects that
# have been allocated but not yet initialized.
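# For example (assuming a Person model with a name column):
#   Person.select(:id).first.respond_to?(:name) # => false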
if defined?(@attributes)
if name = self.class.symbol_column_to_string(name.to_sym)
return _has_attribute?(name)
end
end
true
end
# Returns +true+ if the given attribute is in the attributes hash, otherwise +false+.
#
# class Person < ActiveRecord::Base
# alias_attribute :new_name, :name
# end
#
# person = Person.new
# person.has_attribute?(:name) # => true
# person.has_attribute?(:new_name) # => true
# person.has_attribute?('age') # => true
# person.has_attribute?(:nothing) # => false
def has_attribute?(attr_name)
attr_name = attr_name.to_s
attr_name = self.class.attribute_aliases[attr_name] || attr_name
@attributes.key?(attr_name)
end
def _has_attribute?(attr_name) # :nodoc:
@attributes.key?(attr_name)
end
# Returns an array of names for the attributes available on this object.
#
# class Person < ActiveRecord::Base
# end
#
# person = Person.new
# person.attribute_names
# # => ["id", "created_at", "updated_at", "name", "age"]
def attribute_names
@attributes.keys
end
# Returns a hash of all the attributes with their names as keys and the values of the attributes as values.
#
# class Person < ActiveRecord::Base
# end
#
# person = Person.create(name: 'Francesco', age: 22)
# person.attributes
# # => {"id"=>3, "created_at"=>Sun, 21 Oct 2012 04:53:04, "updated_at"=>Sun, 21 Oct 2012 04:53:04, "name"=>"Francesco", "age"=>22}
def attributes
@attributes.to_hash
end
# Returns an <tt>#inspect</tt>-like string for the value of the
# attribute +attr_name+. String attributes are truncated up to 50
# characters. Other attributes return the value of <tt>#inspect</tt>
# without modification.
#
# person = Person.create!(name: 'David Heinemeier Hansson ' * 3)
#
# person.attribute_for_inspect(:name)
# # => "\"David Heinemeier Hansson David Heinemeier Hansson ...\""
#
# person.attribute_for_inspect(:created_at)
# # => "\"2012-10-22 00:15:07.000000000 +0000\""
#
# person.attribute_for_inspect(:tag_ids)
# # => "[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]"
def attribute_for_inspect(attr_name)
attr_name = attr_name.to_s
attr_name = self.class.attribute_aliases[attr_name] || attr_name
value = _read_attribute(attr_name)
format_for_inspect(attr_name, value)
end
# Returns +true+ if the specified +attribute+ has been set by the user or by a
# database load and is neither +nil+ nor <tt>empty?</tt> (the latter only applies
# to objects that respond to <tt>empty?</tt>, most notably Strings). Otherwise, +false+.
# Note that it always returns +true+ with boolean attributes.
#
# class Task < ActiveRecord::Base
# end
#
# task = Task.new(title: '', is_done: false)
# task.attribute_present?(:title) # => false
# task.attribute_present?(:is_done) # => true
# task.title = 'Buy milk'
# task.is_done = true
# task.attribute_present?(:title) # => true
# task.attribute_present?(:is_done) # => true
def attribute_present?(attr_name)
attr_name = attr_name.to_s
attr_name = self.class.attribute_aliases[attr_name] || attr_name
value = _read_attribute(attr_name)
!value.nil? && !(value.respond_to?(:empty?) && value.empty?)
end
# Returns the value of the attribute identified by <tt>attr_name</tt> after it has been typecast (for example,
# "2004-12-12" in a date column is cast to a date object, like Date.new(2004, 12, 12)). It raises
# <tt>ActiveModel::MissingAttributeError</tt> if the identified attribute is missing.
#
# Note: +:id+ is always present.
#
# class Person < ActiveRecord::Base
# belongs_to :organization
# end
#
# person = Person.new(name: 'Francesco', age: '22')
# person[:name] # => "Francesco"
# person[:age] # => 22
#
# person = Person.select('id').first
# person[:name] # => ActiveModel::MissingAttributeError: missing attribute: name
# person[:organization_id] # => ActiveModel::MissingAttributeError: missing attribute: organization_id
def [](attr_name)
read_attribute(attr_name) { |n| missing_attribute(n, caller) }
end
# Updates the attribute identified by <tt>attr_name</tt> with the specified +value+.
# (Alias for the protected #write_attribute method).
#
# class Person < ActiveRecord::Base
# end
#
# person = Person.new
# person[:age] = '22'
# person[:age] # => 22
# person[:age].class # => Integer
def []=(attr_name, value)
write_attribute(attr_name, value)
end
# Returns the name of all database fields which have been read from this
# model. This can be useful in development mode to determine which fields
# need to be selected. For performance critical pages, selecting only the
# required fields can be an easy performance win (assuming you aren't using
# all of the fields on the model).
#
# For example:
#
# class PostsController < ActionController::Base
# after_action :print_accessed_fields, only: :index
#
# def index
# @posts = Post.all
# end
#
# private
#
# def print_accessed_fields
# p @posts.first.accessed_fields
# end
# end
#
# Which allows you to quickly change your code to:
#
# class PostsController < ActionController::Base
# def index
# @posts = Post.select(:id, :title, :author_id, :updated_at)
# end
# end
def accessed_fields
@attributes.accessed
end
private
def attribute_method?(attr_name)
# We check defined? because Syck calls respond_to? before actually calling initialize.
defined?(@attributes) && @attributes.key?(attr_name)
end
def attributes_with_values(attribute_names)
attribute_names.index_with { |name| @attributes[name] }
end
# Filters the primary keys, readonly attributes and virtual columns from the attribute names.
def attributes_for_update(attribute_names)
attribute_names &= self.class.column_names
attribute_names.delete_if do |name|
self.class.readonly_attribute?(name) ||
column_for_attribute(name).virtual?
end
end
# Filters out the virtual columns and also primary keys, from the attribute names, when the primary
# key is to be generated (e.g. the id attribute has no value).
def attributes_for_create(attribute_names)
attribute_names &= self.class.column_names
attribute_names.delete_if do |name|
(pk_attribute?(name) && id.nil?) ||
column_for_attribute(name).virtual?
end
end
def format_for_inspect(name, value)
if value.nil?
value.inspect
else
inspected_value = if value.is_a?(String) && value.length > 50
"#{value[0, 50]}...".inspect
elsif value.is_a?(Date) || value.is_a?(Time)
%("#{value.to_fs(:inspect)}")
else
value.inspect
end
inspection_filter.filter_param(name, inspected_value)
end
end
def pk_attribute?(name)
name == @primary_key
end
end
end
# frozen_string_literal: true
require "active_model/attribute/user_provided_default"
module ActiveRecord
# See ActiveRecord::Attributes::ClassMethods for documentation
module Attributes
extend ActiveSupport::Concern
included do
class_attribute :attributes_to_define_after_schema_loads, instance_accessor: false, default: {} # :internal:
end
module ClassMethods
# Defines an attribute with a type on this model. It will override the
# type of existing attributes if needed. This allows control over how
# values are converted to and from SQL when assigned to a model. It also
# changes the behavior of values passed to
# {ActiveRecord::Base.where}[rdoc-ref:QueryMethods#where]. This will let you use
# your domain objects across much of Active Record, without having to
# rely on implementation details or monkey patching.
#
# +name+ The name of the attribute to define. Attribute methods are generated
# with this name, and it is also the column which this will persist to.
#
# +cast_type+ A symbol such as +:string+ or +:integer+, or a type object
# to be used for this attribute. See the examples below for more
# information about providing custom type objects.
#
# ==== Options
#
# The following options are accepted:
#
# +default+ The default value to use when no value is provided. If this option
# is not passed, the previous default value (if any) will be used.
# Otherwise, the default will be +nil+.
#
# +array+ (PostgreSQL only) specifies that the type should be an array (see the
# examples below).
#
# +range+ (PostgreSQL only) specifies that the type should be a range (see the
# examples below).
#
# When using a symbol for +cast_type+, extra options are forwarded to the
# constructor of the type object.
#
# ==== Examples
#
# The type detected by Active Record can be overridden.
#
# # db/schema.rb
# create_table :store_listings, force: true do |t|
# t.decimal :price_in_cents
# end
#
# # app/models/store_listing.rb
# class StoreListing < ActiveRecord::Base
# end
#
# store_listing = StoreListing.new(price_in_cents: '10.1')
#
# # before
# store_listing.price_in_cents # => BigDecimal(10.1)
#
# class StoreListing < ActiveRecord::Base
# attribute :price_in_cents, :integer
# end
#
# # after
# store_listing.price_in_cents # => 10
#
# A default can also be provided.
#
# # db/schema.rb
# create_table :store_listings, force: true do |t|
# t.string :my_string, default: "original default"
# end
#
# StoreListing.new.my_string # => "original default"
#
# # app/models/store_listing.rb
# class StoreListing < ActiveRecord::Base
# attribute :my_string, :string, default: "new default"
# end
#
# StoreListing.new.my_string # => "new default"
#
# class Product < ActiveRecord::Base
# attribute :my_default_proc, :datetime, default: -> { Time.now }
# end
#
# Product.new.my_default_proc # => 2015-05-30 11:04:48 -0600
# sleep 1
# Product.new.my_default_proc # => 2015-05-30 11:04:49 -0600
#
# \Attributes do not need to be backed by a database column.
#
# # app/models/my_model.rb
# class MyModel < ActiveRecord::Base
# attribute :my_string, :string
# attribute :my_int_array, :integer, array: true
# attribute :my_float_range, :float, range: true
# end
#
# model = MyModel.new(
# my_string: "string",
# my_int_array: ["1", "2", "3"],
# my_float_range: "[1,3.5]",
# )
# model.attributes
# # =>
# {
# my_string: "string",
# my_int_array: [1, 2, 3],
# my_float_range: 1.0..3.5
# }
#
# Passing options to the type constructor
#
# # app/models/my_model.rb
# class MyModel < ActiveRecord::Base
# attribute :small_int, :integer, limit: 2
# end
#
# MyModel.create(small_int: 65537)
# # => Error: 65537 is out of range for the limit of two bytes
#
# ==== Creating Custom Types
#
# Users may also define their own custom types, as long as they respond
# to the methods defined on the value type. The method +deserialize+ or
# +cast+ will be called on your type object, with raw input from the
# database or from your controllers. See ActiveModel::Type::Value for the
# expected API. It is recommended that your type objects inherit from an
# existing type, or from ActiveRecord::Type::Value.
#
# class MoneyType < ActiveRecord::Type::Integer
# def cast(value)
# if !value.kind_of?(Numeric) && value.include?('$')
# price_in_dollars = value.gsub(/\$/, '').to_f
# super(price_in_dollars * 100)
# else
# super
# end
# end
# end
#
# # config/initializers/types.rb
# ActiveRecord::Type.register(:money, MoneyType)
#
# # app/models/store_listing.rb
# class StoreListing < ActiveRecord::Base
# attribute :price_in_cents, :money
# end
#
# store_listing = StoreListing.new(price_in_cents: '$10.00')
# store_listing.price_in_cents # => 1000
#
# For more details on creating custom types, see the documentation for
# ActiveModel::Type::Value. For more details on registering your types
# to be referenced by a symbol, see ActiveRecord::Type.register. You can
# also pass a type object directly, in place of a symbol.
#
# ==== \Querying
#
# When {ActiveRecord::Base.where}[rdoc-ref:QueryMethods#where] is called, it will
# use the type defined by the model class to convert the value to SQL,
# calling +serialize+ on your type object. For example:
#
# class Money < Struct.new(:amount, :currency)
# end
#
# class MoneyType < ActiveRecord::Type::Value
# def initialize(currency_converter:)
# @currency_converter = currency_converter
# end
#
# # value will be the result of +deserialize+ or
# # +cast+. Assumed to be an instance of +Money+ in
# # this case.
# def serialize(value)
# value_in_bitcoins = @currency_converter.convert_to_bitcoins(value)
# value_in_bitcoins.amount
# end
# end
#
# # config/initializers/types.rb
# ActiveRecord::Type.register(:money, MoneyType)
#
# # app/models/product.rb
# class Product < ActiveRecord::Base
# currency_converter = ConversionRatesFromTheInternet.new
# attribute :price_in_bitcoins, :money, currency_converter: currency_converter
# end
#
# Product.where(price_in_bitcoins: Money.new(5, "USD"))
# # => SELECT * FROM products WHERE price_in_bitcoins = 0.02230
#
# Product.where(price_in_bitcoins: Money.new(5, "GBP"))
# # => SELECT * FROM products WHERE price_in_bitcoins = 0.03412
#
# ==== Dirty Tracking
#
# The type of an attribute is given the opportunity to change how dirty
# tracking is performed. The methods +changed?+ and +changed_in_place?+
# will be called from ActiveModel::Dirty. See the documentation for those
# methods in ActiveModel::Type::Value for more details.
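#
# A hypothetical sketch of a type customizing in-place change detection
# (the class name is illustrative):
#
#   class AuditedJsonType < ActiveRecord::Type::Value
#     def changed_in_place?(raw_old_value, new_value)
#       deserialize(raw_old_value) != new_value
#     end
#   end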
def attribute(name, cast_type = nil, default: NO_DEFAULT_PROVIDED, **options)
name = name.to_s
name = attribute_aliases[name] || name
reload_schema_from_cache
case cast_type
when Symbol
cast_type = Type.lookup(cast_type, **options, adapter: Type.adapter_name_from(self))
when nil
if (prev_cast_type, prev_default = attributes_to_define_after_schema_loads[name])
default = prev_default if default == NO_DEFAULT_PROVIDED
else
prev_cast_type = -> subtype { subtype }
end
cast_type = if block_given?
-> subtype { yield Proc === prev_cast_type ? prev_cast_type[subtype] : prev_cast_type }
else
prev_cast_type
end
end
self.attributes_to_define_after_schema_loads =
attributes_to_define_after_schema_loads.merge(name => [cast_type, default])
end
# This is the low level API which sits beneath +attribute+. It only
# accepts type objects, and will do its work immediately instead of
# waiting for the schema to load. Automatic schema detection and
# ClassMethods#attribute both call this under the hood. While this method
# is provided so it can be used by plugin authors, application code
# should probably use ClassMethods#attribute.
#
# +name+ The name of the attribute being defined. Expected to be a +String+.
#
# +cast_type+ The type object to use for this attribute.
#
# +default+ The default value to use when no value is provided. If this option
# is not passed, the previous default value (if any) will be used.
# Otherwise, the default will be +nil+. A proc can also be passed, and
# will be called once each time a new value is needed.
#
# +user_provided_default+ Whether the default value should be cast using
# +cast+ or +deserialize+.
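#
# A minimal sketch of a direct call (model and attribute names are
# illustrative; most application code should prefer +attribute+):
#
#   class StoreListing < ActiveRecord::Base
#     define_attribute("secret_number", ActiveModel::Type::Integer.new, default: 0)
#   end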
def define_attribute(
name,
cast_type,
default: NO_DEFAULT_PROVIDED,
user_provided_default: true
)
attribute_types[name] = cast_type
define_default_attribute(name, default, cast_type, from_user: user_provided_default)
end
def load_schema! # :nodoc:
super
attributes_to_define_after_schema_loads.each do |name, (cast_type, default)|
cast_type = cast_type[type_for_attribute(name)] if Proc === cast_type
define_attribute(name, cast_type, default: default)
end
end
private
NO_DEFAULT_PROVIDED = Object.new # :nodoc:
private_constant :NO_DEFAULT_PROVIDED
def define_default_attribute(name, value, type, from_user:)
if value == NO_DEFAULT_PROVIDED
default_attribute = _default_attributes[name].with_type(type)
elsif from_user
default_attribute = ActiveModel::Attribute::UserProvidedDefault.new(
name,
value,
type,
_default_attributes.fetch(name.to_s) { nil },
)
else
default_attribute = ActiveModel::Attribute.from_database(name, value, type)
end
_default_attributes[name] = default_attribute
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
# = Active Record Autosave Association
#
# AutosaveAssociation is a module that takes care of automatically saving
# associated records when their parent is saved. In addition to saving, it
# also destroys any associated records that were marked for destruction.
# (See #mark_for_destruction and #marked_for_destruction?).
#
# Saving of the parent, its associations, and the destruction of marked
# associations, all happen inside a transaction. This should never leave the
# database in an inconsistent state.
#
# If validations for any of the associations fail, their error messages will
# be applied to the parent.
#
# Note that it also means that associations marked for destruction won't
# be destroyed directly. They will however still be marked for destruction.
#
# Note that <tt>autosave: false</tt> is not the same as not declaring <tt>:autosave</tt>.
# When the <tt>:autosave</tt> option is not present then new association records are
# saved but the updated association records are not saved.
#
# == Validation
#
# Child records are validated unless <tt>:validate</tt> is +false+.
#
# == Callbacks
#
# An association with the autosave option defines several callbacks on your
# model (around_save, before_save, after_create, after_update). Please note that
# callbacks are executed in the order they were defined in the
# model. You should avoid modifying the association content before
# autosave callbacks are executed. Placing your callbacks after
# associations is usually a good practice.
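#
# A minimal sketch of that advice (model and callback names are illustrative):
#
#   class Post < ActiveRecord::Base
#     has_one :author, autosave: true
#     before_save :normalize_title # declared after the association
#   end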
#
# === One-to-one Example
#
# class Post < ActiveRecord::Base
# has_one :author, autosave: true
# end
#
# Saving changes to the parent and its associated model can now be performed
# automatically _and_ atomically:
#
# post = Post.find(1)
# post.title # => "The current global position of migrating ducks"
# post.author.name # => "alloy"
#
# post.title = "On the migration of ducks"
# post.author.name = "Eloy Duran"
#
# post.save
# post.reload
# post.title # => "On the migration of ducks"
# post.author.name # => "Eloy Duran"
#
# Destroying an associated model, as part of the parent's save action, is as
# simple as marking it for destruction:
#
# post.author.mark_for_destruction
# post.author.marked_for_destruction? # => true
#
# Note that the model is _not_ yet removed from the database:
#
# id = post.author.id
# Author.find_by(id: id).nil? # => false
#
# post.save
# post.reload.author # => nil
#
# Now it _is_ removed from the database:
#
# Author.find_by(id: id).nil? # => true
#
# === One-to-many Example
#
# When <tt>:autosave</tt> is not declared new children are saved when their parent is saved:
#
# class Post < ActiveRecord::Base
# has_many :comments # :autosave option is not declared
# end
#
# post = Post.new(title: 'ruby rocks')
# post.comments.build(body: 'hello world')
# post.save # => saves both post and comment
#
# post = Post.create(title: 'ruby rocks')
# post.comments.build(body: 'hello world')
# post.save # => saves both post and comment
#
# post = Post.create(title: 'ruby rocks')
# comment = post.comments.create(body: 'hello world')
# comment.body = 'hi everyone'
# post.save # => saves post, but not comment
#
# When <tt>:autosave</tt> is true all children are saved, no matter whether they
# are new records or not:
#
# class Post < ActiveRecord::Base
# has_many :comments, autosave: true
# end
#
# post = Post.create(title: 'ruby rocks')
# comment = post.comments.create(body: 'hello world')
# comment.body = 'hi everyone'
# post.comments.build(body: "good morning.")
# post.save # => saves post and both comments.
#
# Destroying one of the associated models as part of the parent's save action
# is as simple as marking it for destruction:
#
# post.comments # => [#<Comment id: 1, ...>, #<Comment id: 2, ...]>
# post.comments[1].mark_for_destruction
# post.comments[1].marked_for_destruction? # => true
# post.comments.length # => 2
#
# Note that the model is _not_ yet removed from the database:
#
# id = post.comments.last.id
# Comment.find_by(id: id).nil? # => false
#
# post.save
# post.reload.comments.length # => 1
#
# Now it _is_ removed from the database:
#
# Comment.find_by(id: id).nil? # => true
#
# === Caveats
#
# Note that autosave will only trigger for already-persisted association records
# if the records themselves have been changed. This is to protect against
# <tt>SystemStackError</tt> caused by circular association validations. The one
# exception is if a custom validation context is used, in which case the validations
# will always fire on the associated records.
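#
# For example (a hedged sketch; <tt>:publish</tt> is an arbitrary context):
#
#   post = Post.find(1)
#   post.save                    # unchanged persisted comments are not re-validated
#   post.save(context: :publish) # a custom context re-validates the loaded comments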
module AutosaveAssociation
extend ActiveSupport::Concern
module AssociationBuilderExtension # :nodoc:
def self.build(model, reflection)
model.send(:add_autosave_association_callbacks, reflection)
end
def self.valid_options
[ :autosave ]
end
end
included do
Associations::Builder::Association.extensions << AssociationBuilderExtension
end
module ClassMethods # :nodoc:
private
def define_non_cyclic_method(name, &block)
return if method_defined?(name, false)
define_method(name) do |*args|
result = true; @_already_called ||= {}
# Loop prevention for validation of associations
unless @_already_called[name]
begin
@_already_called[name] = true
result = instance_eval(&block)
ensure
@_already_called[name] = false
end
end
result
end
end
# Adds validation and save callbacks for the association as specified by
# the +reflection+.
#
# For performance reasons, we don't check whether to validate at runtime.
# However the validation and callback methods are lazy and those methods
# get created when they are invoked for the very first time. However,
# this can change, for instance, when using nested attributes, which is
# called _after_ the association has been defined. Since we don't want
# the callbacks to get defined multiple times, there are guards that
# check if the save or validation methods have already been defined
# before actually defining them.
def add_autosave_association_callbacks(reflection)
save_method = :"autosave_associated_records_for_#{reflection.name}"
if reflection.collection?
around_save :around_save_collection_association
define_non_cyclic_method(save_method) { save_collection_association(reflection) }
# Doesn't use after_save as that would save associations added in after_create/after_update twice
after_create save_method
after_update save_method
elsif reflection.has_one?
define_non_cyclic_method(save_method) { save_has_one_association(reflection) }
# Configures two callbacks instead of a single after_save so that
# the model may rely on their execution order relative to its
# own callbacks.
#
# For example, given that after_creates run before after_saves, if
# we configured instead an after_save there would be no way to fire
# a custom after_create callback after the child association gets
# created.
after_create save_method
after_update save_method
else
define_non_cyclic_method(save_method) { throw(:abort) if save_belongs_to_association(reflection) == false }
before_save save_method
end
define_autosave_validation_callbacks(reflection)
end
def define_autosave_validation_callbacks(reflection)
validation_method = :"validate_associated_records_for_#{reflection.name}"
if reflection.validate? && !method_defined?(validation_method)
if reflection.collection?
method = :validate_collection_association
else
method = :validate_single_association
end
define_non_cyclic_method(validation_method) { send(method, reflection) }
validate validation_method
after_validation :_ensure_no_duplicate_errors
end
end
end
# Reloads the attributes of the object as usual and clears <tt>marked_for_destruction</tt> flag.
def reload(options = nil)
@marked_for_destruction = false
@destroyed_by_association = nil
super
end
# Marks this record to be destroyed as part of the parent's save transaction.
# This does _not_ actually destroy the record instantly; rather, the child record will be destroyed
# when <tt>parent.save</tt> is called.
#
# Only useful if the <tt>:autosave</tt> option on the parent is enabled for this associated model.
def mark_for_destruction
@marked_for_destruction = true
end
# Returns whether or not this record will be destroyed as part of the parent's save transaction.
#
# Only useful if the <tt>:autosave</tt> option on the parent is enabled for this associated model.
def marked_for_destruction?
@marked_for_destruction
end
# Records the association that is being destroyed and destroying this
# record in the process.
def destroyed_by_association=(reflection)
@destroyed_by_association = reflection
end
# Returns the association for the parent being destroyed.
#
# Used to avoid updating the counter cache unnecessarily.
def destroyed_by_association
@destroyed_by_association
end
# Returns whether or not this record has been changed in any way (including whether
# any of its nested autosave associations are likewise changed)
def changed_for_autosave?
new_record? || has_changes_to_save? || marked_for_destruction? || nested_records_changed_for_autosave?
end
private
# Returns the record for an association collection that should be validated
# or saved. If +autosave+ is +false+ only new records will be returned,
# unless the parent is/was a new record itself.
def associated_records_to_validate_or_save(association, new_record, autosave)
if new_record || custom_validation_context?
association && association.target
elsif autosave
association.target.find_all(&:changed_for_autosave?)
else
association.target.find_all(&:new_record?)
end
end
# Go through nested autosave associations that are loaded in memory (without loading
# any new ones), and return true if any are changed for autosave.
# Returns false if already called to prevent an infinite loop.
def nested_records_changed_for_autosave?
@_nested_records_changed_for_autosave_already_called ||= false
return false if @_nested_records_changed_for_autosave_already_called
begin
@_nested_records_changed_for_autosave_already_called = true
self.class._reflections.values.any? do |reflection|
if reflection.options[:autosave]
association = association_instance_get(reflection.name)
association && Array.wrap(association.target).any?(&:changed_for_autosave?)
end
end
ensure
@_nested_records_changed_for_autosave_already_called = false
end
end
# Validate the association if <tt>:validate</tt> or <tt>:autosave</tt> is
# turned on for the association.
def validate_single_association(reflection)
association = association_instance_get(reflection.name)
record = association && association.reader
association_valid?(reflection, record) if record && (record.changed_for_autosave? || custom_validation_context?)
end
# Validate the associated records if <tt>:validate</tt> or
# <tt>:autosave</tt> is turned on for the association specified by
# +reflection+.
def validate_collection_association(reflection)
if association = association_instance_get(reflection.name)
if records = associated_records_to_validate_or_save(association, new_record?, reflection.options[:autosave])
records.each_with_index { |record, index| association_valid?(reflection, record, index) }
end
end
end
# Returns whether or not the association is valid and applies any errors to
# the parent, <tt>self</tt>, if it wasn't. Skips any <tt>:autosave</tt>
# enabled records if they're marked_for_destruction? or destroyed.
def association_valid?(reflection, record, index = nil)
return true if record.destroyed? || (reflection.options[:autosave] && record.marked_for_destruction?)
context = validation_context if custom_validation_context?
unless valid = record.valid?(context)
if reflection.options[:autosave]
indexed_attribute = !index.nil? && (reflection.options[:index_errors] || ActiveRecord.index_nested_attribute_errors)
record.errors.group_by_attribute.each { |attribute, errors|
attribute = normalize_reflection_attribute(indexed_attribute, reflection, index, attribute)
errors.each { |error|
self.errors.import(
error,
attribute: attribute
)
}
}
else
errors.add(reflection.name)
end
end
valid
end
def normalize_reflection_attribute(indexed_attribute, reflection, index, attribute)
if indexed_attribute
"#{reflection.name}[#{index}].#{attribute}"
else
"#{reflection.name}.#{attribute}"
end
end
# Is used as an around_save callback to check while saving a collection
# association whether or not the parent was a new record before saving.
def around_save_collection_association
previously_new_record_before_save = (@new_record_before_save ||= false)
@new_record_before_save = !previously_new_record_before_save && new_record?
yield
ensure
@new_record_before_save = previously_new_record_before_save
end
# Saves any new associated records, or all loaded autosave associations if
# <tt>:autosave</tt> is enabled on the association.
#
# In addition, it destroys all children that were marked for destruction
# with #mark_for_destruction.
#
# This all happens inside a transaction, _if_ the Transactions module is included into
# ActiveRecord::Base after the AutosaveAssociation module, which it does by default.
def save_collection_association(reflection)
if association = association_instance_get(reflection.name)
autosave = reflection.options[:autosave]
# By saving the instance variable in a local variable,
# we make the whole callback re-entrant.
new_record_before_save = @new_record_before_save
# reconstruct the scope now that we know the owner's id
association.reset_scope
if records = associated_records_to_validate_or_save(association, new_record_before_save, autosave)
if autosave
records_to_destroy = records.select(&:marked_for_destruction?)
records_to_destroy.each { |record| association.destroy(record) }
records -= records_to_destroy
end
records.each do |record|
next if record.destroyed?
saved = true
if autosave != false && (new_record_before_save || record.new_record?)
association.set_inverse_instance(record)
if autosave
saved = association.insert_record(record, false)
elsif !reflection.nested?
association_saved = association.insert_record(record)
if reflection.validate?
errors.add(reflection.name) unless association_saved
saved = association_saved
end
end
elsif autosave
saved = record.save(validate: false)
end
raise(RecordInvalid.new(association.owner)) unless saved
end
end
end
end
# Saves the associated record if it's new or <tt>:autosave</tt> is enabled
# on the association.
#
# In addition, it will destroy the association if it was marked for
# destruction with #mark_for_destruction.
#
# This all happens inside a transaction, _if_ the Transactions module is included into
# ActiveRecord::Base after the AutosaveAssociation module, which it does by default.
def save_has_one_association(reflection)
association = association_instance_get(reflection.name)
record = association && association.load_target
if record && !record.destroyed?
autosave = reflection.options[:autosave]
if autosave && record.marked_for_destruction?
record.destroy
elsif autosave != false
key = reflection.options[:primary_key] ? public_send(reflection.options[:primary_key]) : id
if (autosave && record.changed_for_autosave?) || _record_changed?(reflection, record, key)
unless reflection.through_reflection
record[reflection.foreign_key] = key
association.set_inverse_instance(record)
end
saved = record.save(validate: !autosave)
raise ActiveRecord::Rollback if !saved && autosave
saved
end
end
end
end
# If the record is new or it has changed, returns true.
def _record_changed?(reflection, record, key)
record.new_record? ||
association_foreign_key_changed?(reflection, record, key) ||
record.will_save_change_to_attribute?(reflection.foreign_key)
end
def association_foreign_key_changed?(reflection, record, key)
return false if reflection.through_reflection?
record._has_attribute?(reflection.foreign_key) && record._read_attribute(reflection.foreign_key) != key
end
# Saves the associated record if it's new or <tt>:autosave</tt> is enabled.
#
# In addition, it will destroy the association if it was marked for destruction.
def save_belongs_to_association(reflection)
association = association_instance_get(reflection.name)
return unless association && association.loaded? && !association.stale_target?
record = association.load_target
if record && !record.destroyed?
autosave = reflection.options[:autosave]
if autosave && record.marked_for_destruction?
self[reflection.foreign_key] = nil
record.destroy
elsif autosave != false
saved = record.save(validate: !autosave) if record.new_record? || (autosave && record.changed_for_autosave?)
if association.updated?
association_id = record.public_send(reflection.options[:primary_key] || :id)
self[reflection.foreign_key] = association_id
association.loaded!
end
saved if autosave
end
end
end
def custom_validation_context?
validation_context && [:create, :update].exclude?(validation_context)
end
def _ensure_no_duplicate_errors
errors.uniq!
end
end
end
# frozen_string_literal: true
require "active_support/benchmarkable"
require "active_support/dependencies"
require "active_support/descendants_tracker"
require "active_support/time"
require "active_support/core_ext/class/subclasses"
require "active_record/log_subscriber"
require "active_record/explain_subscriber"
require "active_record/relation/delegation"
require "active_record/attributes"
require "active_record/type_caster"
require "active_record/database_configurations"
module ActiveRecord # :nodoc:
# = Active Record
#
# Active Record objects don't specify their attributes directly, but rather infer them from
# the table definition with which they're linked. Adding, removing, and changing attributes
# and their type is done directly in the database. Any change is instantly reflected in the
# Active Record objects. The mapping that binds a given Active Record class to a certain
# database table will happen automatically in most common cases, but can be overwritten for the uncommon ones.
#
# See the mapping rules in table_name and the full example in link:files/activerecord/README_rdoc.html for more insight.
#
# == Creation
#
# Active Records accept constructor parameters either in a hash or as a block. The hash
# method is especially useful when you're receiving the data from somewhere else, like an
# HTTP request. It works like this:
#
# user = User.new(name: "David", occupation: "Code Artist")
# user.name # => "David"
#
# You can also use block initialization:
#
# user = User.new do |u|
# u.name = "David"
# u.occupation = "Code Artist"
# end
#
# And of course you can just create a bare object and specify the attributes after the fact:
#
# user = User.new
# user.name = "David"
# user.occupation = "Code Artist"
#
# == Conditions
#
# Conditions can either be specified as a string, array, or hash representing the WHERE-part of an SQL statement.
# The array form is to be used when the condition input is tainted and requires sanitization. The string form can
# be used for statements that don't involve tainted data. The hash form works much like the array form, except
# only equality and range is possible. Examples:
#
# class User < ActiveRecord::Base
# def self.authenticate_unsafely(user_name, password)
# where("user_name = '#{user_name}' AND password = '#{password}'").first
# end
#
# def self.authenticate_safely(user_name, password)
# where("user_name = ? AND password = ?", user_name, password).first
# end
#
# def self.authenticate_safely_simply(user_name, password)
# where(user_name: user_name, password: password).first
# end
# end
#
# The <tt>authenticate_unsafely</tt> method inserts the parameters directly into the query
# and is thus susceptible to SQL-injection attacks if the <tt>user_name</tt> and +password+
# parameters come directly from an HTTP request. The <tt>authenticate_safely</tt> and
# <tt>authenticate_safely_simply</tt> both will sanitize the <tt>user_name</tt> and +password+
# before inserting them in the query, which will ensure that an attacker can't escape the
# query and fake the login (or worse).
#
# When using multiple parameters in the conditions, it can easily become hard to read exactly
# what the fourth or fifth question mark is supposed to represent. In those cases, you can
# resort to named bind variables instead. That's done by replacing the question marks with
# symbols and supplying a hash with values for the matching symbol keys:
#
# Company.where(
# "id = :id AND name = :name AND division = :division AND created_at > :accounting_date",
# { id: 3, name: "37signals", division: "First", accounting_date: '2005-01-01' }
# ).first
#
# Similarly, a simple hash without a statement will generate conditions based on equality with the SQL AND
# operator. For instance:
#
# Student.where(first_name: "Harvey", status: 1)
# Student.where(params[:student])
#
# A range may be used in the hash to use the SQL BETWEEN operator:
#
# Student.where(grade: 9..12)
#
# An array may be used in the hash to use the SQL IN operator:
#
# Student.where(grade: [9,11,12])
#
# When joining tables, nested hashes or keys written in the form 'table_name.column_name'
# can be used to qualify the table name of a particular condition. For instance:
#
# Student.joins(:schools).where(schools: { category: 'public' })
# Student.joins(:schools).where('schools.category' => 'public' )
#
# == Overwriting default accessors
#
# All column values are automatically available through basic accessors on the Active Record
# object, but sometimes you want to specialize this behavior. This can be done by overwriting
# the default accessors (using the same name as the attribute) and calling
# +super+ to actually change things.
#
# class Song < ActiveRecord::Base
# # Uses an integer of seconds to hold the length of the song
#
# def length=(minutes)
# super(minutes.to_i * 60)
# end
#
# def length
# super / 60
# end
# end
#
# == Attribute query methods
#
# In addition to the basic accessors, query methods are also automatically available on the Active Record object.
# Query methods allow you to test whether an attribute value is present.
# Additionally, when dealing with numeric values, a query method will return false if the value is zero.
#
# For example, an Active Record User with the <tt>name</tt> attribute has a <tt>name?</tt> method that you can call
# to determine whether the user has a name:
#
# user = User.new(name: "David")
# user.name? # => true
#
# anonymous = User.new(name: "")
# anonymous.name? # => false
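#
#   # With a numeric attribute, zero also counts as blank. (+age+ is an
#   # illustrative integer column, not one defined in the examples above.)
#   User.new(age: 0).age? # => false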
#
# Query methods will also respect any overrides of default accessors:
#
# class User < ActiveRecord::Base
# # Has admin boolean column
# def admin
# false
# end
# end
#
# user = User.first
# user.update(admin: true)
#
# user.read_attribute(:admin) # => true, gets the column value
# user[:admin] # => true, also gets the column value
#
# user.admin # => false, due to the getter override
# user.admin? # => false, due to the getter override
#
# == Accessing attributes before they have been typecasted
#
# Sometimes you want to be able to read the raw attribute data without having the column-determined
# typecast run its course first. That can be done by using the <tt><attribute>_before_type_cast</tt>
# accessors that all attributes have. For example, if your Account model has a <tt>balance</tt> attribute,
# you can call <tt>account.balance_before_type_cast</tt> or <tt>account.id_before_type_cast</tt>.
#
# This is especially useful in validation situations where the user might supply a string for an
# integer field and you want to display the original string back in an error message. Accessing the
# attribute normally would typecast the string to 0, which isn't what you want.
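#
# A minimal sketch of that idea (the validation shown here is illustrative, not
# something Active Record provides):
#
#   class Account < ActiveRecord::Base
#     validate :balance_looks_numeric
#
#     private
#       def balance_looks_numeric
#         raw = balance_before_type_cast
#         return if raw.blank? || /\A-?\d+(\.\d+)?\z/.match?(raw.to_s)
#         errors.add(:balance, "is not a number (got #{raw.inspect})")
#       end
#   end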
#
# == Dynamic attribute-based finders
#
# Dynamic attribute-based finders are a mildly deprecated way of getting (and/or creating) objects
# by simple queries without turning to SQL. They work by appending the name of an attribute
# to <tt>find_by_</tt> like <tt>Person.find_by_user_name</tt>.
# Instead of writing <tt>Person.find_by(user_name: user_name)</tt>, you can use
# <tt>Person.find_by_user_name(user_name)</tt>.
#
# It's possible to add an exclamation point (!) on the end of the dynamic finders to get them to raise an
# ActiveRecord::RecordNotFound error if they do not return any records,
# like <tt>Person.find_by_last_name!</tt>.
#
# It's also possible to use multiple attributes in the same <tt>find_by_</tt> by separating them with
# "_and_".
#
# Person.find_by(user_name: user_name, password: password)
# Person.find_by_user_name_and_password(user_name, password) # with dynamic finder
#
# It's even possible to call these dynamic finder methods on relations and named scopes.
#
# Payment.order("created_on").find_by_amount(50)
#
# == Saving arrays, hashes, and other non-mappable objects in text columns
#
# Active Record can serialize any object in text columns using YAML. To do so, you must
# specify this with a call to the class method
# {serialize}[rdoc-ref:AttributeMethods::Serialization::ClassMethods#serialize].
# This makes it possible to store arrays, hashes, and other non-mappable objects without doing
# any additional work.
#
# class User < ActiveRecord::Base
# serialize :preferences
# end
#
# user = User.create(preferences: { "background" => "black", "display" => "large" })
# User.find(user.id).preferences # => { "background" => "black", "display" => "large" }
#
# You can also specify a class option as the second parameter that'll raise an exception
# if a serialized object is retrieved as a descendant of a class not in the hierarchy.
#
# class User < ActiveRecord::Base
# serialize :preferences, Hash
# end
#
# user = User.create(preferences: %w( one two three ))
# User.find(user.id).preferences # raises SerializationTypeMismatch
#
# When you specify a class option, the default value for that attribute will be a new
# instance of that class.
#
# class User < ActiveRecord::Base
# serialize :preferences, OpenStruct
# end
#
# user = User.new
# user.preferences.theme_color = "red"
#
#
# == Single table inheritance
#
# Active Record allows inheritance by storing the name of the class in a
# column that is named "type" by default. See ActiveRecord::Inheritance for
# more details.
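#
# A minimal sketch (the Company/Firm hierarchy and its +companies+ table are
# illustrative):
#
#   class Company < ActiveRecord::Base; end
#   class Firm < Company; end
#
#   firm = Firm.create!(name: "37signals")
#   Company.find(firm.id) # => a Firm instance, because the "type" column stores "Firm"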
#
# == Connection to multiple databases in different models
#
# Connections are usually created through
# {ActiveRecord::Base.establish_connection}[rdoc-ref:ConnectionHandling#establish_connection] and retrieved
# by ActiveRecord::Base.connection. All classes inheriting from ActiveRecord::Base will use this
# connection. But you can also set a class-specific connection. For example, if Course is an
# ActiveRecord::Base, but resides in a different database, you can just say <tt>Course.establish_connection</tt>
# and Course and all of its subclasses will use this connection instead.
#
# This feature is implemented by keeping a connection pool in ActiveRecord::Base that is
# a hash indexed by the class. If a connection is requested, the
# {ActiveRecord::Base.retrieve_connection}[rdoc-ref:ConnectionHandling#retrieve_connection] method
# will go up the class-hierarchy until a connection is found in the connection pool.
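#
# A minimal sketch (assuming a "courses" entry exists in the database configuration):
#
#   class Course < ActiveRecord::Base
#     establish_connection :courses
#   end
#
#   Course.connection  # uses the "courses" configuration
#   Student.connection # still uses the default connection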
#
# == Exceptions
#
# * ActiveRecordError - Generic error class and superclass of all other errors raised by Active Record.
# * AdapterNotSpecified - The configuration hash used in
# {ActiveRecord::Base.establish_connection}[rdoc-ref:ConnectionHandling#establish_connection]
# didn't include an <tt>:adapter</tt> key.
# * AdapterNotFound - The <tt>:adapter</tt> key used in
# {ActiveRecord::Base.establish_connection}[rdoc-ref:ConnectionHandling#establish_connection]
# specified a non-existent adapter
# (or a bad spelling of an existing one).
# * AssociationTypeMismatch - The object assigned to the association wasn't of the type
# specified in the association definition.
# * AttributeAssignmentError - An error occurred while doing a mass assignment through the
# {ActiveRecord::Base#attributes=}[rdoc-ref:AttributeAssignment#attributes=] method.
# You can inspect the +attribute+ property of the exception object to determine which attribute
# triggered the error.
# * ConnectionNotEstablished - No connection has been established.
# Use {ActiveRecord::Base.establish_connection}[rdoc-ref:ConnectionHandling#establish_connection] before querying.
# * MultiparameterAssignmentErrors - Collection of errors that occurred during a mass assignment using the
# {ActiveRecord::Base#attributes=}[rdoc-ref:AttributeAssignment#attributes=] method.
# The +errors+ property of this exception contains an array of
# AttributeAssignmentError
# objects that should be inspected to determine which attributes triggered the errors.
# * RecordInvalid - raised by {ActiveRecord::Base#save!}[rdoc-ref:Persistence#save!] and
# {ActiveRecord::Base.create!}[rdoc-ref:Persistence::ClassMethods#create!]
# when the record is invalid.
# * RecordNotFound - No record responded to the {ActiveRecord::Base.find}[rdoc-ref:FinderMethods#find] method.
# Either the row with the given ID doesn't exist or the row didn't meet the additional restrictions.
# Some {ActiveRecord::Base.find}[rdoc-ref:FinderMethods#find] calls do not raise this exception to signal
# nothing was found; please check their documentation for further details.
# * SerializationTypeMismatch - The serialized object wasn't of the class specified as the second parameter.
# * StatementInvalid - The database server rejected the SQL statement. The precise error is added in the message.
#
# *Note*: The attributes listed are class-level attributes (accessible from both the class and instance level).
# So it's possible to assign a logger to the class through <tt>Base.logger=</tt> which will then be used by all
# instances in the current object space.
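#
# For example, to share one logger (Ruby's standard Logger) across every model:
#
#   ActiveRecord::Base.logger = Logger.new(STDOUT)
#   User.logger # => the logger assigned above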
class Base
extend ActiveModel::Naming
extend ActiveSupport::Benchmarkable
extend ActiveSupport::DescendantsTracker
extend ConnectionHandling
extend QueryCache::ClassMethods
extend Querying
extend Translation
extend DynamicMatchers
extend DelegatedType
extend Explain
extend Enum
extend Delegation::DelegateCache
extend Aggregations::ClassMethods
include Core
include Persistence
include ReadonlyAttributes
include ModelSchema
include Inheritance
include Scoping
include Sanitization
include AttributeAssignment
include ActiveModel::Conversion
include Integration
include Validations
include CounterCache
include Attributes
include Locking::Optimistic
include Locking::Pessimistic
include AttributeMethods
include Callbacks
include Timestamp
include Associations
include SecurePassword
include AutosaveAssociation
include NestedAttributes
include Transactions
include TouchLater
include NoTouching
include Reflection
include Serialization
include Store
include SecureToken
include SignedId
include Suppressor
include Encryption::EncryptableRecord
end
ActiveSupport.run_load_hooks(:active_record, Base)
end
# frozen_string_literal: true
module ActiveRecord
class PredicateBuilder
class BasicObjectHandler # :nodoc:
def initialize(predicate_builder)
@predicate_builder = predicate_builder
end
def call(attribute, value)
bind = predicate_builder.build_bind_attribute(attribute.name, value)
attribute.eq(bind)
end
private
attr_reader :predicate_builder
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
class Preloader
class Batch # :nodoc:
def initialize(preloaders, available_records:)
@preloaders = preloaders.reject(&:empty?)
@available_records = available_records.flatten.group_by { |r| r.class.base_class }
end
def call
branches = @preloaders.flat_map(&:branches)
until branches.empty?
loaders = branches.flat_map(&:runnable_loaders)
loaders.each { |loader| loader.associate_records_from_unscoped(@available_records[loader.klass.base_class]) }
if loaders.any?
future_tables = branches.flat_map do |branch|
branch.future_classes - branch.runnable_loaders.map(&:klass)
end.map(&:table_name).uniq
target_loaders = loaders.reject { |l| future_tables.include?(l.table_name) }
target_loaders = loaders if target_loaders.empty?
group_and_load_similar(target_loaders)
target_loaders.each(&:run)
end
finished, in_progress = branches.partition(&:done?)
branches = in_progress + finished.flat_map(&:children)
end
end
private
attr_reader :loaders
def group_and_load_similar(loaders)
loaders.grep_v(ThroughAssociation).group_by(&:loader_query).each_pair do |query, similar_loaders|
query.load_records_in_batch(similar_loaders)
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Batches
class BatchEnumerator
include Enumerable
def initialize(of: 1000, start: nil, finish: nil, relation:) # :nodoc:
@of = of
@relation = relation
@start = start
@finish = finish
end
# The primary key value from which the BatchEnumerator starts, inclusive of the value.
attr_reader :start
# The primary key value at which the BatchEnumerator ends, inclusive of the value.
attr_reader :finish
# The relation from which the BatchEnumerator yields batches.
attr_reader :relation
# The size of the batches yielded by the BatchEnumerator.
def batch_size
@of
end
# Looping through a collection of records from the database (using the
# +all+ method, for example) is very inefficient since it will try to
# instantiate all the objects at once.
#
# In that case, batch processing methods allow you to work with the
# records in batches, thereby greatly reducing memory consumption.
#
# Person.in_batches.each_record do |person|
# person.do_awesome_stuff
# end
#
# Person.where("age > 21").in_batches(of: 10).each_record do |person|
# person.party_all_night!
# end
#
# If you do not provide a block to #each_record, it will return an Enumerator
# for chaining with other methods:
#
# Person.in_batches.each_record.with_index do |person, index|
# person.award_trophy(index + 1)
# end
def each_record(&block)
return to_enum(:each_record) unless block_given?
@relation.to_enum(:in_batches, of: @of, start: @start, finish: @finish, load: true).each do |relation|
relation.records.each(&block)
end
end
# Deletes records in batches. Returns the total number of rows affected.
#
# Person.in_batches.delete_all
#
# See Relation#delete_all for details of how each batch is deleted.
def delete_all
sum(&:delete_all)
end
# Updates records in batches. Returns the total number of rows affected.
#
# Person.in_batches.update_all("age = age + 1")
#
# See Relation#update_all for details of how each batch is updated.
def update_all(updates)
sum do |relation|
relation.update_all(updates)
end
end
# Destroys records in batches.
#
# Person.where("age < 10").in_batches.destroy_all
#
# See Relation#destroy_all for details of how each batch is destroyed.
def destroy_all
each(&:destroy_all)
end
# Yields an ActiveRecord::Relation object for each batch of records.
#
# Person.in_batches.each do |relation|
# relation.update_all(awesome: true)
# end
def each(&block)
enum = @relation.to_enum(:in_batches, of: @of, start: @start, finish: @finish, load: false)
return enum.each(&block) if block_given?
enum
end
end
end
end
# frozen_string_literal: true
require "active_record/relation/batches/batch_enumerator"
module ActiveRecord
module Batches
ORDER_IGNORE_MESSAGE = "Scoped order is ignored, it's forced to be batch order."
# Looping through a collection of records from the database
# (using the Scoping::Named::ClassMethods.all method, for example)
# is very inefficient since it will try to instantiate all the objects at once.
#
# In that case, batch processing methods allow you to work
# with the records in batches, thereby greatly reducing memory consumption.
#
# The #find_each method uses #find_in_batches with a batch size of 1000 (or as
# specified by the +:batch_size+ option).
#
# Person.find_each do |person|
# person.do_awesome_stuff
# end
#
# Person.where("age > 21").find_each do |person|
# person.party_all_night!
# end
#
# If you do not provide a block to #find_each, it will return an Enumerator
# for chaining with other methods:
#
# Person.find_each.with_index do |person, index|
# person.award_trophy(index + 1)
# end
#
# ==== Options
# * <tt>:batch_size</tt> - Specifies the size of the batch. Defaults to 1000.
# * <tt>:start</tt> - Specifies the primary key value to start from, inclusive of the value.
# * <tt>:finish</tt> - Specifies the primary key value to end at, inclusive of the value.
# * <tt>:error_on_ignore</tt> - Overrides the application config to specify if an error should be raised when
# an order is present in the relation.
# * <tt>:order</tt> - Specifies the primary key order (can be +:asc+ or +:desc+). Defaults to +:asc+.
#
# Limits are honored, and if present there is no requirement for the batch
# size: it can be less than, equal to, or greater than the limit.
#
# The options +start+ and +finish+ are especially useful if you want
# multiple workers dealing with the same processing queue. You can make
# worker 1 handle all the records between id 1 and 9999 and worker 2
# handle from 10000 and beyond by setting the +:start+ and +:finish+
# option on each worker.
#
# # In worker 1, let's process until 9999 records.
# Person.find_each(finish: 9_999) do |person|
# person.party_all_night!
# end
#
# # In worker 2, let's process from record 10_000 and onwards.
# Person.find_each(start: 10_000) do |person|
# person.party_all_night!
# end
#
# NOTE: Order can be ascending (:asc) or descending (:desc). It is automatically set to
# ascending on the primary key ("id ASC").
# This also means that this method only works when the primary key is
# orderable (e.g. an integer or string).
#
# NOTE: By its nature, batch processing is subject to race conditions if
# other processes are modifying the database.
def find_each(start: nil, finish: nil, batch_size: 1000, error_on_ignore: nil, order: :asc, &block)
if block_given?
find_in_batches(start: start, finish: finish, batch_size: batch_size, error_on_ignore: error_on_ignore, order: order) do |records|
records.each(&block)
end
else
enum_for(:find_each, start: start, finish: finish, batch_size: batch_size, error_on_ignore: error_on_ignore, order: order) do
relation = self
apply_limits(relation, start, finish, order).size
end
end
end
# Yields each batch of records that was found by the find options as
# an array.
#
# Person.where("age > 21").find_in_batches do |group|
# sleep(50) # Make sure it doesn't get too crowded in there!
# group.each { |person| person.party_all_night! }
# end
#
# If you do not provide a block to #find_in_batches, it will return an Enumerator
# for chaining with other methods:
#
# Person.find_in_batches.with_index do |group, batch|
# puts "Processing group ##{batch}"
# group.each(&:recover_from_last_night!)
# end
#
# To have each record yielded one by one, use #find_each instead.
#
# ==== Options
# * <tt>:batch_size</tt> - Specifies the size of the batch. Defaults to 1000.
# * <tt>:start</tt> - Specifies the primary key value to start from, inclusive of the value.
# * <tt>:finish</tt> - Specifies the primary key value to end at, inclusive of the value.
# * <tt>:error_on_ignore</tt> - Overrides the application config to specify if an error should be raised when
# an order is present in the relation.
# * <tt>:order</tt> - Specifies the primary key order (can be +:asc+ or +:desc+). Defaults to +:asc+.
#
# Limits are honored, and if present there is no requirement for the batch
# size: it can be less than, equal to, or greater than the limit.
#
# The options +start+ and +finish+ are especially useful if you want
# multiple workers dealing with the same processing queue. You can make
# worker 1 handle all the records between id 1 and 9999 and worker 2
# handle from 10000 and beyond by setting the +:start+ and +:finish+
# option on each worker.
#
# # Let's process from record 10_000 on.
# Person.find_in_batches(start: 10_000) do |group|
# group.each { |person| person.party_all_night! }
# end
#
# NOTE: Order can be ascending (:asc) or descending (:desc). It is automatically set to
# ascending on the primary key ("id ASC").
# This also means that this method only works when the primary key is
# orderable (e.g. an integer or string).
#
# NOTE: By its nature, batch processing is subject to race conditions if
# other processes are modifying the database.
def find_in_batches(start: nil, finish: nil, batch_size: 1000, error_on_ignore: nil, order: :asc)
relation = self
unless block_given?
return to_enum(:find_in_batches, start: start, finish: finish, batch_size: batch_size, error_on_ignore: error_on_ignore, order: order) do
total = apply_limits(relation, start, finish, order).size
(total - 1).div(batch_size) + 1
end
end
in_batches(of: batch_size, start: start, finish: finish, load: true, error_on_ignore: error_on_ignore, order: order) do |batch|
yield batch.to_a
end
end
# Yields ActiveRecord::Relation objects to work with a batch of records.
#
# Person.where("age > 21").in_batches do |relation|
# relation.delete_all
# sleep(10) # Throttle the delete queries
# end
#
# If you do not provide a block to #in_batches, it will return a
# BatchEnumerator which is enumerable.
#
# Person.in_batches.each_with_index do |relation, batch_index|
# puts "Processing relation ##{batch_index}"
# relation.delete_all
# end
#
# Examples of calling methods on the returned BatchEnumerator object:
#
# Person.in_batches.delete_all
# Person.in_batches.update_all(awesome: true)
# Person.in_batches.each_record(&:party_all_night!)
#
# ==== Options
# * <tt>:of</tt> - Specifies the size of the batch. Defaults to 1000.
# * <tt>:load</tt> - Specifies if the relation should be loaded. Defaults to false.
# * <tt>:start</tt> - Specifies the primary key value to start from, inclusive of the value.
# * <tt>:finish</tt> - Specifies the primary key value to end at, inclusive of the value.
# * <tt>:error_on_ignore</tt> - Overrides the application config to specify if an error should be raised when
# an order is present in the relation.
# * <tt>:order</tt> - Specifies the primary key order (can be +:asc+ or +:desc+). Defaults to +:asc+.
#
# Limits are honored, and if present there is no requirement for the batch
# size: it can be less than, equal to, or greater than the limit.
#
# The options +start+ and +finish+ are especially useful if you want
# multiple workers dealing with the same processing queue. You can make
# worker 1 handle all the records between id 1 and 9999 and worker 2
# handle from 10000 and beyond by setting the +:start+ and +:finish+
# option on each worker.
#
# # Let's process from record 10_000 on.
# Person.in_batches(start: 10_000).update_all(awesome: true)
#
# An example of calling the +where+ query method on the relation:
#
# Person.in_batches.each do |relation|
# relation.update_all('age = age + 1')
# relation.where('age > 21').update_all(should_party: true)
# relation.where('age <= 21').delete_all
# end
#
# NOTE: If you are going to iterate through each record, you should call
# #each_record on the yielded BatchEnumerator:
#
# Person.in_batches.each_record(&:party_all_night!)
#
# NOTE: Order can be ascending (:asc) or descending (:desc). It is automatically set to
# ascending on the primary key ("id ASC").
# This also means that this method only works when the primary key is
# orderable (e.g. an integer or string).
#
# NOTE: By its nature, batch processing is subject to race conditions if
# other processes are modifying the database.
def in_batches(of: 1000, start: nil, finish: nil, load: false, error_on_ignore: nil, order: :asc)
relation = self
unless block_given?
return BatchEnumerator.new(of: of, start: start, finish: finish, relation: self)
end
unless [:asc, :desc].include?(order)
raise ArgumentError, ":order must be :asc or :desc, got #{order.inspect}"
end
if arel.orders.present?
act_on_ignored_order(error_on_ignore)
end
batch_limit = of
if limit_value
remaining = limit_value
batch_limit = remaining if remaining < batch_limit
end
relation = relation.reorder(batch_order(order)).limit(batch_limit)
relation = apply_limits(relation, start, finish, order)
relation.skip_query_cache! # Retaining the results in the query cache would undermine the point of batching
batch_relation = relation
loop do
if load
records = batch_relation.records
ids = records.map(&:id)
yielded_relation = where(primary_key => ids)
yielded_relation.load_records(records)
else
ids = batch_relation.pluck(primary_key)
yielded_relation = where(primary_key => ids)
end
break if ids.empty?
primary_key_offset = ids.last
raise ArgumentError.new("Primary key not included in the custom select clause") unless primary_key_offset
yield yielded_relation
break if ids.length < batch_limit
if limit_value
remaining -= ids.length
if remaining == 0
# Saves a useless iteration when the limit is a multiple of the
# batch size.
break
elsif remaining < batch_limit
relation = relation.limit(remaining)
end
end
batch_relation = relation.where(
predicate_builder[primary_key, primary_key_offset, order == :desc ? :lt : :gt]
)
end
end
private
def apply_limits(relation, start, finish, order)
relation = apply_start_limit(relation, start, order) if start
relation = apply_finish_limit(relation, finish, order) if finish
relation
end
def apply_start_limit(relation, start, order)
relation.where(predicate_builder[primary_key, start, order == :desc ? :lteq : :gteq])
end
def apply_finish_limit(relation, finish, order)
relation.where(predicate_builder[primary_key, finish, order == :desc ? :gteq : :lteq])
end
def batch_order(order)
table[primary_key].public_send(order)
end
def act_on_ignored_order(error_on_ignore)
raise_error = (error_on_ignore.nil? ? ActiveRecord.error_on_ignored_order : error_on_ignore)
if raise_error
raise ArgumentError.new(ORDER_IGNORE_MESSAGE)
elsif logger
logger.warn(ORDER_IGNORE_MESSAGE)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module AttributeMethods
# = Active Record Attribute Methods Before Type Cast
#
# ActiveRecord::AttributeMethods::BeforeTypeCast provides a way to
# read the value of the attributes before typecasting and deserialization.
#
# class Task < ActiveRecord::Base
# end
#
# task = Task.new(id: '1', completed_on: '2012-10-21')
# task.id # => 1
# task.completed_on # => Sun, 21 Oct 2012
#
# task.attributes_before_type_cast
# # => {"id"=>"1", "completed_on"=>"2012-10-21", ... }
# task.read_attribute_before_type_cast('id') # => "1"
# task.read_attribute_before_type_cast('completed_on') # => "2012-10-21"
#
# In addition to #read_attribute_before_type_cast and #attributes_before_type_cast,
# it declares a method for all attributes with the <tt>*_before_type_cast</tt>
# suffix.
#
# task.id_before_type_cast # => "1"
# task.completed_on_before_type_cast # => "2012-10-21"
module BeforeTypeCast
extend ActiveSupport::Concern
included do
attribute_method_suffix "_before_type_cast", "_for_database", parameters: false
attribute_method_suffix "_came_from_user?", parameters: false
end
# Returns the value of the attribute identified by +attr_name+ before
# typecasting and deserialization.
#
# class Task < ActiveRecord::Base
# end
#
# task = Task.new(id: '1', completed_on: '2012-10-21')
# task.read_attribute('id') # => 1
# task.read_attribute_before_type_cast('id') # => '1'
# task.read_attribute('completed_on') # => Sun, 21 Oct 2012
# task.read_attribute_before_type_cast('completed_on') # => "2012-10-21"
# task.read_attribute_before_type_cast(:completed_on) # => "2012-10-21"
def read_attribute_before_type_cast(attr_name)
name = attr_name.to_s
name = self.class.attribute_aliases[name] || name
attribute_before_type_cast(name)
end
# Returns a hash of attributes before typecasting and deserialization.
#
# class Task < ActiveRecord::Base
# end
#
# task = Task.new(title: nil, is_done: true, completed_on: '2012-10-21')
# task.attributes
# # => {"id"=>nil, "title"=>nil, "is_done"=>true, "completed_on"=>Sun, 21 Oct 2012, "created_at"=>nil, "updated_at"=>nil}
# task.attributes_before_type_cast
# # => {"id"=>nil, "title"=>nil, "is_done"=>true, "completed_on"=>"2012-10-21", "created_at"=>nil, "updated_at"=>nil}
def attributes_before_type_cast
@attributes.values_before_type_cast
end
# Returns a hash of attributes for assignment to the database.
def attributes_for_database
@attributes.values_for_database
end
private
# Dispatch target for <tt>*_before_type_cast</tt> attribute methods.
def attribute_before_type_cast(attr_name)
@attributes[attr_name].value_before_type_cast
end
def attribute_for_database(attr_name)
@attributes[attr_name].value_for_database
end
def attribute_came_from_user?(attr_name)
@attributes[attr_name].came_from_user?
end
end
end
end
# frozen_string_literal: true
module ActiveRecord::Associations::Builder # :nodoc:
class BelongsTo < SingularAssociation # :nodoc:
def self.macro
:belongs_to
end
def self.valid_options(options)
valid = super + [:polymorphic, :counter_cache, :optional, :default]
valid += [:foreign_type] if options[:polymorphic]
valid += [:ensuring_owner_was] if options[:dependent] == :destroy_async
valid
end
def self.valid_dependent_options
[:destroy, :delete, :destroy_async]
end
def self.define_callbacks(model, reflection)
super
add_counter_cache_callbacks(model, reflection) if reflection.options[:counter_cache]
add_touch_callbacks(model, reflection) if reflection.options[:touch]
add_default_callbacks(model, reflection) if reflection.options[:default]
end
def self.add_counter_cache_callbacks(model, reflection)
cache_column = reflection.counter_cache_column
model.after_update lambda { |record|
association = association(reflection.name)
if association.saved_change_to_target?
association.increment_counters
association.decrement_counters_before_last_save
end
}
klass = reflection.class_name.safe_constantize
klass.attr_readonly cache_column if klass && klass.respond_to?(:attr_readonly)
end
def self.touch_record(o, changes, foreign_key, name, touch, touch_method) # :nodoc:
old_foreign_id = changes[foreign_key] && changes[foreign_key].first
if old_foreign_id
association = o.association(name)
reflection = association.reflection
if reflection.polymorphic?
foreign_type = reflection.foreign_type
klass = changes[foreign_type] && changes[foreign_type].first || o.public_send(foreign_type)
klass = o.class.polymorphic_class_for(klass)
else
klass = association.klass
end
primary_key = reflection.association_primary_key(klass)
old_record = klass.find_by(primary_key => old_foreign_id)
if old_record
if touch != true
old_record.public_send(touch_method, touch)
else
old_record.public_send(touch_method)
end
end
end
record = o.public_send name
if record && record.persisted?
if touch != true
record.public_send(touch_method, touch)
else
record.public_send(touch_method)
end
end
end
def self.add_touch_callbacks(model, reflection)
foreign_key = reflection.foreign_key
name = reflection.name
touch = reflection.options[:touch]
callback = lambda { |changes_method| lambda { |record|
BelongsTo.touch_record(record, record.send(changes_method), foreign_key, name, touch, belongs_to_touch_method)
}}
if reflection.counter_cache_column
touch_callback = callback.(:saved_changes)
update_callback = lambda { |record|
instance_exec(record, &touch_callback) unless association(reflection.name).saved_change_to_target?
}
model.after_update update_callback, if: :saved_changes?
else
model.after_create callback.(:saved_changes), if: :saved_changes?
model.after_update callback.(:saved_changes), if: :saved_changes?
model.after_destroy callback.(:changes_to_save)
end
model.after_touch callback.(:changes_to_save)
end
def self.add_default_callbacks(model, reflection)
model.before_validation lambda { |o|
o.association(reflection.name).default(&reflection.options[:default])
}
end
def self.add_destroy_callbacks(model, reflection)
model.after_destroy lambda { |o| o.association(reflection.name).handle_dependency }
end
def self.define_validations(model, reflection)
if reflection.options.key?(:required)
reflection.options[:optional] = !reflection.options.delete(:required)
end
if reflection.options[:optional].nil?
required = model.belongs_to_required_by_default
else
required = !reflection.options[:optional]
end
super
if required
model.validates_presence_of reflection.name, message: :required
end
end
def self.define_change_tracking_methods(model, reflection)
model.generated_association_methods.class_eval <<-CODE, __FILE__, __LINE__ + 1
def #{reflection.name}_changed?
association(:#{reflection.name}).target_changed?
end
def #{reflection.name}_previously_changed?
association(:#{reflection.name}).target_previously_changed?
end
CODE
end
private_class_method :macro, :valid_options, :valid_dependent_options, :define_callbacks,
:define_validations, :define_change_tracking_methods, :add_counter_cache_callbacks,
:add_touch_callbacks, :add_default_callbacks, :add_destroy_callbacks
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
# = Active Record Belongs To Association
class BelongsToAssociation < SingularAssociation # :nodoc:
def handle_dependency
return unless load_target
case options[:dependent]
when :destroy
raise ActiveRecord::Rollback unless target.destroy
when :destroy_async
id = owner.public_send(reflection.foreign_key.to_sym)
primary_key_column = reflection.active_record_primary_key.to_sym
enqueue_destroy_association(
owner_model_name: owner.class.to_s,
owner_id: owner.id,
association_class: reflection.klass.to_s,
association_ids: [id],
association_primary_key_column: primary_key_column,
ensuring_owner_was_method: options.fetch(:ensuring_owner_was, nil)
)
else
target.public_send(options[:dependent])
end
end
def inversed_from(record)
replace_keys(record)
super
end
def default(&block)
writer(owner.instance_exec(&block)) if reader.nil?
end
def reset
super
@updated = false
end
def updated?
@updated
end
def decrement_counters
update_counters(-1)
end
def increment_counters
update_counters(1)
end
def decrement_counters_before_last_save
if reflection.polymorphic?
model_type_was = owner.attribute_before_last_save(reflection.foreign_type)
model_was = owner.class.polymorphic_class_for(model_type_was) if model_type_was
else
model_was = klass
end
foreign_key_was = owner.attribute_before_last_save(reflection.foreign_key)
if foreign_key_was && model_was < ActiveRecord::Base
update_counters_via_scope(model_was, foreign_key_was, -1)
end
end
def target_changed?
owner.attribute_changed?(reflection.foreign_key) || (!foreign_key_present? && target&.new_record?)
end
def target_previously_changed?
owner.attribute_previously_changed?(reflection.foreign_key)
end
def saved_change_to_target?
owner.saved_change_to_attribute?(reflection.foreign_key)
end
private
def replace(record)
if record
raise_on_type_mismatch!(record)
set_inverse_instance(record)
@updated = true
elsif target
remove_inverse_instance(target)
end
replace_keys(record, force: true)
self.target = record
end
def update_counters(by)
if require_counter_update? && foreign_key_present?
if target && !stale_target?
target.increment!(reflection.counter_cache_column, by, touch: reflection.options[:touch])
else
update_counters_via_scope(klass, owner._read_attribute(reflection.foreign_key), by)
end
end
end
def update_counters_via_scope(klass, foreign_key, by)
scope = klass.unscoped.where!(primary_key(klass) => foreign_key)
scope.update_counters(reflection.counter_cache_column => by, touch: reflection.options[:touch])
end
def find_target?
!loaded? && foreign_key_present? && klass
end
def require_counter_update?
reflection.counter_cache_column && owner.persisted?
end
def replace_keys(record, force: false)
target_key = record ? record._read_attribute(primary_key(record.class)) : nil
if force || owner._read_attribute(reflection.foreign_key) != target_key
owner[reflection.foreign_key] = target_key
end
end
def primary_key(klass)
reflection.association_primary_key(klass)
end
def foreign_key_present?
owner._read_attribute(reflection.foreign_key)
end
def invertible_for?(record)
inverse = inverse_reflection_for(record)
inverse && (inverse.has_one? || inverse.klass.has_many_inversing)
end
def stale_state
result = owner._read_attribute(reflection.foreign_key) { |n| owner.send(:missing_attribute, n, caller) }
result && result.to_s
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
# = Active Record Belongs To Polymorphic Association
class BelongsToPolymorphicAssociation < BelongsToAssociation # :nodoc:
def klass
type = owner[reflection.foreign_type]
type.presence && owner.class.polymorphic_class_for(type)
end
def target_changed?
super || owner.attribute_changed?(reflection.foreign_type)
end
def target_previously_changed?
super || owner.attribute_previously_changed?(reflection.foreign_type)
end
def saved_change_to_target?
super || owner.saved_change_to_attribute?(reflection.foreign_type)
end
private
def replace_keys(record, force: false)
super
target_type = record ? record.class.polymorphic_name : nil
if force || owner._read_attribute(reflection.foreign_type) != target_type
owner[reflection.foreign_type] = target_type
end
end
def inverse_reflection_for(record)
reflection.polymorphic_inverse_of(record.class)
end
def raise_on_type_mismatch!(record)
# A polymorphic association cannot have a type mismatch, by definition
end
def stale_state
foreign_key = super
foreign_key && [foreign_key.to_s, owner[reflection.foreign_type].to_s]
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Binary < Arel::Nodes::NodeExpression
attr_accessor :left, :right
def initialize(left, right)
super()
@left = left
@right = right
end
def initialize_copy(other)
super
@left = @left.clone if @left
@right = @right.clone if @right
end
def hash
[self.class, @left, @right].hash
end
def eql?(other)
self.class == other.class &&
self.left == other.left &&
self.right == other.right
end
alias :== :eql?
end
module FetchAttribute
def fetch_attribute
if left.is_a?(Arel::Attributes::Attribute)
yield left
elsif right.is_a?(Arel::Attributes::Attribute)
yield right
end
end
end
class Between < Binary; include FetchAttribute; end
class GreaterThan < Binary
include FetchAttribute
def invert
Arel::Nodes::LessThanOrEqual.new(left, right)
end
end
class GreaterThanOrEqual < Binary
include FetchAttribute
def invert
Arel::Nodes::LessThan.new(left, right)
end
end
class LessThan < Binary
include FetchAttribute
def invert
Arel::Nodes::GreaterThanOrEqual.new(left, right)
end
end
class LessThanOrEqual < Binary
include FetchAttribute
def invert
Arel::Nodes::GreaterThan.new(left, right)
end
end
class IsDistinctFrom < Binary
include FetchAttribute
def invert
Arel::Nodes::IsNotDistinctFrom.new(left, right)
end
end
class IsNotDistinctFrom < Binary
include FetchAttribute
def invert
Arel::Nodes::IsDistinctFrom.new(left, right)
end
end
class NotEqual < Binary
include FetchAttribute
def invert
Arel::Nodes::Equality.new(left, right)
end
end
class NotIn < Binary
include FetchAttribute
def invert
Arel::Nodes::In.new(left, right)
end
end
class Or < Binary
def fetch_attribute(&block)
left.fetch_attribute(&block) && right.fetch_attribute(&block)
end
end
%w{
As
Assignment
Join
Union
UnionAll
Intersect
Except
}.each do |name|
const_set name, Class.new(Binary)
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Collectors
class Bind
def initialize
@binds = []
end
def <<(str)
self
end
def add_bind(bind)
@binds << bind
self
end
def add_binds(binds, proc_for_binds = nil)
@binds.concat proc_for_binds ? binds.map(&proc_for_binds) : binds
self
end
def value
@binds
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class BindParam < Node
attr_reader :value
def initialize(value)
@value = value
super()
end
def hash
[self.class, self.value].hash
end
def eql?(other)
other.is_a?(BindParam) &&
value == other.value
end
alias :== :eql?
def nil?
value.nil?
end
def value_before_type_cast
if value.respond_to?(:value_before_type_cast)
value.value_before_type_cast
else
value
end
end
def infinite?
value.respond_to?(:infinite?) && value.infinite?
end
def unboundable?
value.respond_to?(:unboundable?) && value.unboundable?
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Bit < Type::Value # :nodoc:
def type
:bit
end
def cast_value(value)
if ::String === value
case value
when /^0x/i
value[2..-1].hex.to_s(2) # Hexadecimal notation
else
value # Bit-string notation
end
else
value.to_s
end
end
def serialize(value)
Data.new(super) if value
end
class Data
def initialize(value)
@value = value
end
def to_s
value
end
def binary?
/\A[01]*\Z/.match?(value)
end
def hex?
/\A[0-9A-F]*\Z/i.match?(value)
end
private
attr_reader :value
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class BitVarying < OID::Bit # :nodoc:
def type
:bit_varying
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
class Preloader
class Branch # :nodoc:
attr_reader :association, :children, :parent
attr_reader :scope, :associate_by_default
attr_writer :preloaded_records
def initialize(association:, children:, parent:, associate_by_default:, scope:)
@association = association
@parent = parent
@scope = scope
@associate_by_default = associate_by_default
@children = build_children(children)
@loaders = nil
end
def future_classes
(immediate_future_classes + children.flat_map(&:future_classes)).uniq
end
def immediate_future_classes
if parent.done?
loaders.flat_map(&:future_classes).uniq
else
likely_reflections.reject(&:polymorphic?).flat_map do |reflection|
reflection.
chain.
map(&:klass)
end.uniq
end
end
def target_classes
if done?
preloaded_records.map(&:klass).uniq
elsif parent.done?
loaders.map(&:klass).uniq
else
likely_reflections.reject(&:polymorphic?).map(&:klass).uniq
end
end
def likely_reflections
parent_classes = parent.target_classes
parent_classes.filter_map do |parent_klass|
parent_klass._reflect_on_association(@association)
end
end
def root?
parent.nil?
end
def source_records
@parent.preloaded_records
end
def preloaded_records
@preloaded_records ||= loaders.flat_map(&:preloaded_records)
end
def done?
root? || (@loaders && @loaders.all?(&:run?))
end
def runnable_loaders
loaders.flat_map(&:runnable_loaders).reject(&:run?)
end
def grouped_records
h = {}
polymorphic_parent = !root? && parent.polymorphic?
source_records.each do |record|
reflection = record.class._reflect_on_association(association)
next if polymorphic_parent && !reflection || !record.association(association).klass
(h[reflection] ||= []) << record
end
h
end
def preloaders_for_reflection(reflection, reflection_records)
reflection_records.group_by do |record|
klass = record.association(association).klass
if reflection.scope && reflection.scope.arity != 0
# For instance dependent scopes, the scope is potentially
# different for each record. To allow this we'll group each
# object separately into its own preloader
reflection_scope = reflection.join_scopes(klass.arel_table, klass.predicate_builder, klass, record).inject(&:merge!)
end
[klass, reflection_scope]
end.map do |(rhs_klass, reflection_scope), rs|
preloader_for(reflection).new(rhs_klass, rs, reflection, scope, reflection_scope, associate_by_default)
end
end
def polymorphic?
return false if root?
return @polymorphic if defined?(@polymorphic)
@polymorphic = source_records.any? do |record|
reflection = record.class._reflect_on_association(association)
reflection && reflection.options[:polymorphic]
end
end
def loaders
@loaders ||=
grouped_records.flat_map do |reflection, reflection_records|
preloaders_for_reflection(reflection, reflection_records)
end
end
private
def build_children(children)
Array.wrap(children).flat_map { |association|
Array(association).flat_map { |parent, child|
Branch.new(
parent: self,
association: parent,
children: child,
associate_by_default: associate_by_default,
scope: scope
)
}
}
end
# Returns a class containing the logic needed to preload the data
# and attach it to a relation. The class returned implements a `run` method
# that accepts a preloader.
def preloader_for(reflection)
if reflection.options[:through]
ThroughAssociation
else
Association
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Bytea < Type::Binary # :nodoc:
def deserialize(value)
return if value.nil?
return value.to_s if value.is_a?(Type::Binary::Data)
PG::Connection.unescape_bytea(super)
end
end
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/enumerable"
module ActiveRecord
module Calculations
# Count the records.
#
# Person.count
# # => the total count of all people
#
# Person.count(:age)
# # => returns the total count of all people whose age is present in database
#
# Person.count(:all)
# # => performs a COUNT(*) (:all is an alias for '*')
#
# Person.distinct.count(:age)
# # => counts the number of different age values
#
# If #count is used with {Relation#group}[rdoc-ref:QueryMethods#group],
# it returns a Hash whose keys represent the aggregated column,
# and the values are the respective amounts:
#
# Person.group(:city).count
# # => { 'Rome' => 5, 'Paris' => 3 }
#
# If #count is used with {Relation#group}[rdoc-ref:QueryMethods#group] for multiple columns, it returns a Hash whose
# keys are an array containing the individual values of each column and the value
# of each key would be the #count.
#
# Article.group(:status, :category).count
# # => {["draft", "business"]=>10, ["draft", "technology"]=>4,
# # ["published", "business"]=>0, ["published", "technology"]=>2}
#
# If #count is used with {Relation#select}[rdoc-ref:QueryMethods#select], it will count the selected columns:
#
# Person.select(:age).count
# # => counts the number of different age values
#
# Note: not all valid {Relation#select}[rdoc-ref:QueryMethods#select] expressions are valid #count expressions. The specifics differ
# between databases. In invalid cases, an error from the database is raised.
def count(column_name = nil)
if block_given?
unless column_name.nil?
raise ArgumentError, "Column name argument is not supported when a block is passed."
end
super()
else
calculate(:count, column_name)
end
end
# Calculates the average value on a given column. Returns +nil+ if there's
# no row. See #calculate for examples with options.
#
# Person.average(:age) # => 35.8
def average(column_name)
calculate(:average, column_name)
end
# Calculates the minimum value on a given column. The value is returned
# with the same data type of the column, or +nil+ if there's no row. See
# #calculate for examples with options.
#
# Person.minimum(:age) # => 7
def minimum(column_name)
calculate(:minimum, column_name)
end
# Calculates the maximum value on a given column. The value is returned
# with the same data type of the column, or +nil+ if there's no row. See
# #calculate for examples with options.
#
# Person.maximum(:age) # => 93
def maximum(column_name)
calculate(:maximum, column_name)
end
# Calculates the sum of values on a given column. The value is returned
# with the same data type of the column, +0+ if there's no row. See
# #calculate for examples with options.
#
# Person.sum(:age) # => 4562
def sum(identity_or_column = nil, &block)
if block_given?
values = map(&block)
if identity_or_column.nil? && (values.first.is_a?(Numeric) || values.first(1) == [])
identity_or_column = 0
end
if identity_or_column.nil?
ActiveSupport::Deprecation.warn(<<-MSG.squish)
Rails 7.0 has deprecated Enumerable.sum in favor of Ruby's native implementation available since 2.4.
Sum of non-numeric elements requires an initial argument.
MSG
values.inject(:+) || 0
else
values.sum(identity_or_column)
end
else
calculate(:sum, identity_or_column)
end
end
# This calculates aggregate values in the given column. Methods for #count, #sum, #average,
# #minimum, and #maximum have been added as shortcuts.
#
# Person.calculate(:count, :all) # The same as Person.count
# Person.average(:age) # SELECT AVG(age) FROM people...
#
# # Selects the minimum age for any family without any minors
# Person.group(:last_name).having("min(age) > 17").minimum(:age)
#
# Person.sum("2 * age")
#
# There are two basic forms of output:
#
# * Single aggregate value: The single value is type cast to Integer for COUNT, Float
# for AVG, and the given column's type for everything else.
#
# * Grouped values: This returns an ordered hash of the values and groups them. It
# takes either a column name, or the name of a belongs_to association.
#
# values = Person.group('last_name').maximum(:age)
# puts values["Drake"]
# # => 43
#
# drake = Family.find_by(last_name: 'Drake')
# values = Person.group(:family).maximum(:age) # Person belongs_to :family
# puts values[drake]
# # => 43
#
# values.each do |family, max_age|
# ...
# end
def calculate(operation, column_name)
if has_include?(column_name)
relation = apply_join_dependency
if operation.to_s.downcase == "count"
unless distinct_value || distinct_select?(column_name || select_for_count)
relation.distinct!
relation.select_values = [ klass.primary_key || table[Arel.star] ]
end
# PostgreSQL: ORDER BY expressions must appear in SELECT list when using DISTINCT
relation.order_values = [] if group_values.empty?
end
relation.calculate(operation, column_name)
else
perform_calculation(operation, column_name)
end
end
# Use #pluck as a shortcut to select one or more attributes without
# loading an entire record object per row.
#
# Person.pluck(:name)
#
# instead of
#
# Person.all.map(&:name)
#
# Pluck returns an Array of attribute values type-casted to match
# the plucked column names, if they can be deduced. Plucking an SQL fragment
# returns String values by default.
#
# Person.pluck(:name)
# # SELECT people.name FROM people
# # => ['David', 'Jeremy', 'Jose']
#
# Person.pluck(:id, :name)
# # SELECT people.id, people.name FROM people
# # => [[1, 'David'], [2, 'Jeremy'], [3, 'Jose']]
#
# Person.distinct.pluck(:role)
# # SELECT DISTINCT role FROM people
# # => ['admin', 'member', 'guest']
#
# Person.where(age: 21).limit(5).pluck(:id)
# # SELECT people.id FROM people WHERE people.age = 21 LIMIT 5
# # => [2, 3]
#
# Person.pluck(Arel.sql('DATEDIFF(updated_at, created_at)'))
# # SELECT DATEDIFF(updated_at, created_at) FROM people
# # => ['0', '27761', '173']
#
# See also #ids.
#
def pluck(*column_names)
if loaded? && all_attributes?(column_names)
return records.pluck(*column_names)
end
if has_include?(column_names.first)
relation = apply_join_dependency
relation.pluck(*column_names)
else
klass.disallow_raw_sql!(column_names)
columns = arel_columns(column_names)
relation = spawn
relation.select_values = columns
result = skip_query_cache_if_necessary do
if where_clause.contradiction?
ActiveRecord::Result.empty
else
klass.connection.select_all(relation.arel, "#{klass.name} Pluck")
end
end
type_cast_pluck_values(result, columns)
end
end
# Pick the value(s) from the named column(s) in the current relation.
# This is short-hand for <tt>relation.limit(1).pluck(*column_names).first</tt>, and is primarily useful
# when you have a relation that's already narrowed down to a single row.
#
# Just like #pluck, #pick will only load the actual value, not the entire record object, so it's also
# more efficient. The value is, again like with pluck, typecast by the column type.
#
# Person.where(id: 1).pick(:name)
# # SELECT people.name FROM people WHERE id = 1 LIMIT 1
# # => 'David'
#
# Person.where(id: 1).pick(:name, :email_address)
# # SELECT people.name, people.email_address FROM people WHERE id = 1 LIMIT 1
# # => [ 'David', 'david@loudthinking.com' ]
def pick(*column_names)
if loaded? && all_attributes?(column_names)
return records.pick(*column_names)
end
limit(1).pluck(*column_names).first
end
# Pluck all the IDs for the relation using the table's primary key.
#
# Person.ids # SELECT people.id FROM people
# Person.joins(:companies).ids # SELECT people.id FROM people INNER JOIN companies ON companies.person_id = people.id
def ids
pluck primary_key
end
private
def all_attributes?(column_names)
(column_names.map(&:to_s) - @klass.attribute_names - @klass.attribute_aliases.keys).empty?
end
def has_include?(column_name)
eager_loading? || (includes_values.present? && column_name && column_name != :all)
end
def perform_calculation(operation, column_name)
operation = operation.to_s.downcase
# If #count is used with #distinct (i.e. `relation.distinct.count`) it is
# considered distinct.
distinct = distinct_value
if operation == "count"
column_name ||= select_for_count
if column_name == :all
if !distinct
distinct = distinct_select?(select_for_count) if group_values.empty?
elsif group_values.any? || select_values.empty? && order_values.empty?
column_name = primary_key
end
elsif distinct_select?(column_name)
distinct = nil
end
end
if group_values.any?
execute_grouped_calculation(operation, column_name, distinct)
else
execute_simple_calculation(operation, column_name, distinct)
end
end
def distinct_select?(column_name)
column_name.is_a?(::String) && /\bDISTINCT[\s(]/i.match?(column_name)
end
def aggregate_column(column_name)
return column_name if Arel::Expressions === column_name
arel_column(column_name.to_s) do |name|
Arel.sql(column_name == :all ? "*" : name)
end
end
def operation_over_aggregate_column(column, operation, distinct)
operation == "count" ? column.count(distinct) : column.public_send(operation)
end
def execute_simple_calculation(operation, column_name, distinct) # :nodoc:
if operation == "count" && (column_name == :all && distinct || has_limit_or_offset?)
# Shortcut when limit is zero.
return 0 if limit_value == 0
query_builder = build_count_subquery(spawn, column_name, distinct)
else
# PostgreSQL doesn't like ORDER BY when there are no GROUP BY
relation = unscope(:order).distinct!(false)
column = aggregate_column(column_name)
select_value = operation_over_aggregate_column(column, operation, distinct)
select_value.distinct = true if operation == "sum" && distinct
relation.select_values = [select_value]
query_builder = relation.arel
end
result = skip_query_cache_if_necessary { @klass.connection.select_all(query_builder, "#{@klass.name} #{operation.capitalize}") }
if operation != "count"
type = column.try(:type_caster) ||
lookup_cast_type_from_join_dependencies(column_name.to_s) || Type.default_value
type = type.subtype if Enum::EnumType === type
end
type_cast_calculated_value(result.cast_values.first, operation, type)
end
def execute_grouped_calculation(operation, column_name, distinct) # :nodoc:
group_fields = group_values
group_fields = group_fields.uniq if group_fields.size > 1
if group_fields.size == 1 && group_fields.first.respond_to?(:to_sym)
association = klass._reflect_on_association(group_fields.first)
associated = association && association.belongs_to? # only count belongs_to associations
group_fields = Array(association.foreign_key) if associated
end
group_fields = arel_columns(group_fields)
group_aliases = group_fields.map { |field|
field = connection.visitor.compile(field) if Arel.arel_node?(field)
column_alias_for(field.to_s.downcase)
}
group_columns = group_aliases.zip(group_fields)
column = aggregate_column(column_name)
column_alias = column_alias_for("#{operation} #{column_name.to_s.downcase}")
select_value = operation_over_aggregate_column(column, operation, distinct)
select_value.as(connection.quote_column_name(column_alias))
select_values = [select_value]
select_values += self.select_values unless having_clause.empty?
select_values.concat group_columns.map { |aliaz, field|
aliaz = connection.quote_column_name(aliaz)
if field.respond_to?(:as)
field.as(aliaz)
else
"#{field} AS #{aliaz}"
end
}
relation = except(:group).distinct!(false)
relation.group_values = group_fields
relation.select_values = select_values
calculated_data = skip_query_cache_if_necessary { @klass.connection.select_all(relation.arel, "#{@klass.name} #{operation.capitalize}") }
if association
key_ids = calculated_data.collect { |row| row[group_aliases.first] }
key_records = association.klass.base_class.where(association.klass.base_class.primary_key => key_ids)
key_records = key_records.index_by(&:id)
end
key_types = group_columns.each_with_object({}) do |(aliaz, col_name), types|
types[aliaz] = type_for(col_name) do
calculated_data.column_types.fetch(aliaz, Type.default_value)
end
end
hash_rows = calculated_data.cast_values(key_types).map! do |row|
calculated_data.columns.each_with_object({}).with_index do |(col_name, hash), i|
hash[col_name] = row[i]
end
end
if operation != "count"
type = column.try(:type_caster) ||
lookup_cast_type_from_join_dependencies(column_name.to_s) || Type.default_value
type = type.subtype if Enum::EnumType === type
end
hash_rows.each_with_object({}) do |row, result|
key = group_aliases.map { |aliaz| row[aliaz] }
key = key.first if key.size == 1
key = key_records[key] if associated
result[key] = type_cast_calculated_value(row[column_alias], operation, type)
end
end
# Converts the given field to the value that the database adapter returns as
# a usable column name:
#
# column_alias_for("users.id") # => "users_id"
# column_alias_for("sum(id)") # => "sum_id"
# column_alias_for("count(distinct users.id)") # => "count_distinct_users_id"
# column_alias_for("count(*)") # => "count_all"
def column_alias_for(field)
column_alias = +field
column_alias.gsub!(/\*/, "all")
column_alias.gsub!(/\W+/, " ")
column_alias.strip!
column_alias.gsub!(/ +/, "_")
connection.table_alias_for(column_alias)
end
def type_for(field, &block)
field_name = field.respond_to?(:name) ? field.name.to_s : field.to_s.split(".").last
@klass.type_for_attribute(field_name, &block)
end
def lookup_cast_type_from_join_dependencies(name, join_dependencies = build_join_dependencies)
each_join_dependencies(join_dependencies) do |join|
type = join.base_klass.attribute_types.fetch(name, nil)
return type if type
end
nil
end
def type_cast_pluck_values(result, columns)
cast_types = if result.columns.size != columns.size
klass.attribute_types
else
join_dependencies = nil
columns.map.with_index do |column, i|
column.try(:type_caster) ||
klass.attribute_types.fetch(name = result.columns[i]) do
join_dependencies ||= build_join_dependencies
lookup_cast_type_from_join_dependencies(name, join_dependencies) ||
result.column_types[name] || Type.default_value
end
end
end
result.cast_values(cast_types)
end
def type_cast_calculated_value(value, operation, type)
case operation
when "count"
value.to_i
when "sum"
type.deserialize(value || 0)
when "average"
case type.type
when :integer, :decimal
value&.to_d
else
type.deserialize(value)
end
else # "minimum", "maximum"
type.deserialize(value)
end
end
def select_for_count
if select_values.present?
return select_values.first if select_values.one?
select_values.join(", ")
else
:all
end
end
def build_count_subquery(relation, column_name, distinct)
if column_name == :all
column_alias = Arel.star
relation.select_values = [ Arel.sql(FinderMethods::ONE_AS_ONE) ] unless distinct
else
column_alias = Arel.sql("count_column")
relation.select_values = [ aggregate_column(column_name).as(column_alias) ]
end
subquery_alias = Arel.sql("subquery_for_count")
select_value = operation_over_aggregate_column(column_alias, "count", false)
relation.build_subquery(subquery_alias, select_value)
end
end
end
# frozen_string_literal: true
module ActiveRecord
# = Active Record \Callbacks
#
# \Callbacks are hooks into the life cycle of an Active Record object that allow you to trigger logic
# before or after a change in the object state. This can be used to make sure that associated and
# dependent objects are deleted when {ActiveRecord::Base#destroy}[rdoc-ref:Persistence#destroy] is called (by overwriting +before_destroy+) or
# to massage attributes before they're validated (by overwriting +before_validation+).
# As an example of the callbacks initiated, consider the {ActiveRecord::Base#save}[rdoc-ref:Persistence#save] call for a new record:
#
# * (-) <tt>save</tt>
# * (-) <tt>valid</tt>
# * (1) <tt>before_validation</tt>
# * (-) <tt>validate</tt>
# * (2) <tt>after_validation</tt>
# * (3) <tt>before_save</tt>
# * (4) <tt>before_create</tt>
# * (-) <tt>create</tt>
# * (5) <tt>after_create</tt>
# * (6) <tt>after_save</tt>
# * (7) <tt>after_commit</tt>
#
# Also, an <tt>after_rollback</tt> callback can be configured to be triggered whenever a rollback is issued.
# Check out ActiveRecord::Transactions for more details about <tt>after_commit</tt> and
# <tt>after_rollback</tt>.
#
# Additionally, an <tt>after_touch</tt> callback is triggered whenever an
# object is touched.
#
# Lastly an <tt>after_find</tt> and <tt>after_initialize</tt> callback is triggered for each object that
# is found and instantiated by a finder, with <tt>after_initialize</tt> being triggered after new objects
# are instantiated as well.
#
# There are nineteen callbacks in total, which give a lot of control over how to react and prepare for each state in the
# Active Record life cycle. The sequence for calling {ActiveRecord::Base#save}[rdoc-ref:Persistence#save] for an existing record is similar,
# except that each <tt>_create</tt> callback is replaced by the corresponding <tt>_update</tt> callback.
#
# Examples:
# class CreditCard < ActiveRecord::Base
# # Strip everything but digits, so the user can specify "555 234 34" or
# # "5552-3434" and both will mean "55523434"
# before_validation(on: :create) do
# self.number = number.gsub(/[^0-9]/, "") if attribute_present?("number")
# end
# end
#
# class Subscription < ActiveRecord::Base
# before_create :record_signup
#
# private
# def record_signup
# self.signed_up_on = Date.today
# end
# end
#
# class Firm < ActiveRecord::Base
# # Disables access to the system, for associated clients and people when the firm is destroyed
# before_destroy { |record| Person.where(firm_id: record.id).update_all(access: 'disabled') }
# before_destroy { |record| Client.where(client_of: record.id).update_all(access: 'disabled') }
# end
#
# == Inheritable callback queues
#
# Besides the overwritable callback methods, it's also possible to register callbacks through the
# use of the callback macros. Their main advantage is that the macros add behavior into a callback
# queue that is kept intact through an inheritance hierarchy.
#
# class Topic < ActiveRecord::Base
# before_destroy :destroy_author
# end
#
# class Reply < Topic
# before_destroy :destroy_readers
# end
#
# When <tt>Topic#destroy</tt> is run only +destroy_author+ is called. When <tt>Reply#destroy</tt> is
# run, both +destroy_author+ and +destroy_readers+ are called.
#
# *IMPORTANT:* In order for inheritance to work for the callback queues, you must specify the
# callbacks before specifying the associations. Otherwise, you might trigger the loading of a
# child before the parent has registered the callbacks and they won't be inherited.
#
# == Types of callbacks
#
# There are three types of callbacks accepted by the callback macros: method references (symbol), callback objects,
# and inline methods (using a proc). Method references and callback objects are the recommended approaches;
# inline methods using a proc are sometimes appropriate (such as for creating mix-ins).
#
# The method reference callbacks work by specifying a protected or private method available in the object, like this:
#
# class Topic < ActiveRecord::Base
# before_destroy :delete_parents
#
# private
# def delete_parents
# self.class.delete_by(parent_id: id)
# end
# end
#
# The callback objects have methods named after the callback called with the record as the only parameter, such as:
#
# class BankAccount < ActiveRecord::Base
# before_save EncryptionWrapper.new
# after_save EncryptionWrapper.new
# after_initialize EncryptionWrapper.new
# end
#
# class EncryptionWrapper
# def before_save(record)
# record.credit_card_number = encrypt(record.credit_card_number)
# end
#
# def after_save(record)
# record.credit_card_number = decrypt(record.credit_card_number)
# end
#
# alias_method :after_initialize, :after_save
#
# private
# def encrypt(value)
# # Secrecy is committed
# end
#
# def decrypt(value)
# # Secrecy is unveiled
# end
# end
#
# So you specify the object you want to be messaged on a given callback. When that callback is triggered, the method
# named after the callback is called on that object. You can make these callbacks more flexible by passing in other
# initialization data such as the name of the attribute to work with:
#
# class BankAccount < ActiveRecord::Base
# before_save EncryptionWrapper.new("credit_card_number")
# after_save EncryptionWrapper.new("credit_card_number")
# after_initialize EncryptionWrapper.new("credit_card_number")
# end
#
# class EncryptionWrapper
# def initialize(attribute)
# @attribute = attribute
# end
#
# def before_save(record)
# record.send("#{@attribute}=", encrypt(record.send("#{@attribute}")))
# end
#
# def after_save(record)
# record.send("#{@attribute}=", decrypt(record.send("#{@attribute}")))
# end
#
# alias_method :after_initialize, :after_save
#
# private
# def encrypt(value)
# # Secrecy is committed
# end
#
# def decrypt(value)
# # Secrecy is unveiled
# end
# end
#
# == <tt>before_validation*</tt> returning statements
#
# If the +before_validation+ callback throws +:abort+, the process will be
# aborted and {ActiveRecord::Base#save}[rdoc-ref:Persistence#save] will return +false+.
# If {ActiveRecord::Base#save!}[rdoc-ref:Persistence#save!] is called it will raise an ActiveRecord::RecordInvalid exception.
# Nothing will be appended to the errors object.
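#
# For illustration, a minimal sketch (the +archived?+ predicate here is hypothetical):
#
#   class Topic < ActiveRecord::Base
#     before_validation do
#       throw :abort if archived?
#     end
#   end
#
#   topic.save  # => false
#   topic.save! # raises ActiveRecord::RecordInvalid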
#
# == Canceling callbacks
#
# If a <tt>before_*</tt> callback throws +:abort+, all the later callbacks and
# the associated action are cancelled.
# Callbacks are generally run in the order they are defined, with the exception of callbacks defined as
# methods on the model, which are called last.
#
# == Ordering callbacks
#
# Sometimes application code requires that callbacks execute in a specific order. For example, a +before_destroy+
# callback (+log_children+ in this case) should be executed before records in the +children+ association are destroyed by the
# <tt>dependent: :destroy</tt> option.
#
# Let's look at the code below:
#
# class Topic < ActiveRecord::Base
# has_many :children, dependent: :destroy
#
# before_destroy :log_children
#
# private
# def log_children
# # Child processing
# end
# end
#
# In this case, the problem is that when the +before_destroy+ callback is executed, records in the +children+ association no
# longer exist because the {ActiveRecord::Base#destroy}[rdoc-ref:Persistence#destroy] callback was executed first.
# You can use the +prepend+ option on the +before_destroy+ callback to avoid this.
#
# class Topic < ActiveRecord::Base
# has_many :children, dependent: :destroy
#
# before_destroy :log_children, prepend: true
#
# private
# def log_children
# # Child processing
# end
# end
#
# This way, the +before_destroy+ is executed before the <tt>dependent: :destroy</tt> is called, and the data is still available.
#
# Also, there are cases when you want several callbacks of the same type to
# be executed in order.
#
# For example:
#
# class Topic < ActiveRecord::Base
# has_many :children
#
# after_save :log_children
# after_save :do_something_else
#
# private
#
# def log_children
# # Child processing
# end
#
# def do_something_else
# # Something else
# end
# end
#
# In this case the +log_children+ is executed before +do_something_else+.
# The same applies to all non-transactional callbacks.
#
# As seen below, in case there are multiple transactional callbacks the order
# is reversed.
#
# For example:
#
# class Topic < ActiveRecord::Base
# has_many :children
#
# after_commit :log_children
# after_commit :do_something_else
#
# private
#
# def log_children
# # Child processing
# end
#
# def do_something_else
# # Something else
# end
# end
#
# In this case the +do_something_else+ is executed before +log_children+.
#
# == \Transactions
#
# The entire callback chain of a {#save}[rdoc-ref:Persistence#save], {#save!}[rdoc-ref:Persistence#save!],
# or {#destroy}[rdoc-ref:Persistence#destroy] call runs within a transaction. That includes <tt>after_*</tt> hooks.
# If everything goes fine a +COMMIT+ is executed once the chain has been completed.
#
# If a <tt>before_*</tt> callback cancels the action a +ROLLBACK+ is issued. You
# can also trigger a +ROLLBACK+ raising an exception in any of the callbacks,
# including <tt>after_*</tt> hooks. Note, however, that in that case the client
# needs to be aware of it because an ordinary {#save}[rdoc-ref:Persistence#save] will raise such exception
# instead of quietly returning +false+.
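#
# As a rough sketch of the last point (the failing hook below is purely illustrative):
#
#   class Topic < ActiveRecord::Base
#     after_save { raise "boom" }
#   end
#
#   topic.save # raises RuntimeError, and the INSERT/UPDATE is rolled back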
#
# == Debugging callbacks
#
# The callback chain is accessible via the <tt>_*_callbacks</tt> method on an object. Active Model \Callbacks support
# <tt>:before</tt>, <tt>:after</tt> and <tt>:around</tt> as values for the <tt>kind</tt> property. The <tt>kind</tt> property
# defines what part of the chain the callback runs in.
#
# To find all callbacks in the +before_save+ callback chain:
#
# Topic._save_callbacks.select { |cb| cb.kind.eql?(:before) }
#
# Returns an array of callback objects that form the +before_save+ chain.
#
# To further check if the before_save chain contains a proc defined as <tt>rest_when_dead</tt> use the <tt>filter</tt> property of the callback object:
#
# Topic._save_callbacks.select { |cb| cb.kind.eql?(:before) }.collect(&:filter).include?(:rest_when_dead)
#
# Returns true or false depending on whether the proc is contained in the +before_save+ callback chain on a Topic model.
#
module Callbacks
extend ActiveSupport::Concern
CALLBACKS = [
:after_initialize, :after_find, :after_touch, :before_validation, :after_validation,
:before_save, :around_save, :after_save, :before_create, :around_create,
:after_create, :before_update, :around_update, :after_update,
:before_destroy, :around_destroy, :after_destroy, :after_commit, :after_rollback
]
module ClassMethods
include ActiveModel::Callbacks
##
# :method: after_initialize
#
# :call-seq: after_initialize(*args, &block)
#
# Registers a callback to be called after a record is instantiated. See
# ActiveRecord::Callbacks for more information.
##
# :method: after_find
#
# :call-seq: after_find(*args, &block)
#
# Registers a callback to be called after a record is instantiated
# via a finder. See ActiveRecord::Callbacks for more information.
##
# :method: after_touch
#
# :call-seq: after_touch(*args, &block)
#
# Registers a callback to be called after a record is touched. See
# ActiveRecord::Callbacks for more information.
##
# :method: before_save
#
# :call-seq: before_save(*args, &block)
#
# Registers a callback to be called before a record is saved. See
# ActiveRecord::Callbacks for more information.
##
# :method: around_save
#
# :call-seq: around_save(*args, &block)
#
# Registers a callback to be called around the save of a record. See
# ActiveRecord::Callbacks for more information.
##
# :method: after_save
#
# :call-seq: after_save(*args, &block)
#
# Registers a callback to be called after a record is saved. See
# ActiveRecord::Callbacks for more information.
##
# :method: before_create
#
# :call-seq: before_create(*args, &block)
#
# Registers a callback to be called before a record is created. See
# ActiveRecord::Callbacks for more information.
##
# :method: around_create
#
# :call-seq: around_create(*args, &block)
#
# Registers a callback to be called around the creation of a record. See
# ActiveRecord::Callbacks for more information.
##
# :method: after_create
#
# :call-seq: after_create(*args, &block)
#
# Registers a callback to be called after a record is created. See
# ActiveRecord::Callbacks for more information.
##
# :method: before_update
#
# :call-seq: before_update(*args, &block)
#
# Registers a callback to be called before a record is updated. See
# ActiveRecord::Callbacks for more information.
##
# :method: around_update
#
# :call-seq: around_update(*args, &block)
#
# Registers a callback to be called around the update of a record. See
# ActiveRecord::Callbacks for more information.
##
# :method: after_update
#
# :call-seq: after_update(*args, &block)
#
# Registers a callback to be called after a record is updated. See
# ActiveRecord::Callbacks for more information.
##
# :method: before_destroy
#
# :call-seq: before_destroy(*args, &block)
#
# Registers a callback to be called before a record is destroyed. See
# ActiveRecord::Callbacks for more information.
##
# :method: around_destroy
#
# :call-seq: around_destroy(*args, &block)
#
# Registers a callback to be called around the destruction of a record.
# See ActiveRecord::Callbacks for more information.
##
# :method: after_destroy
#
# :call-seq: after_destroy(*args, &block)
#
# Registers a callback to be called after a record is destroyed. See
# ActiveRecord::Callbacks for more information.
end
included do
include ActiveModel::Validations::Callbacks
define_model_callbacks :initialize, :find, :touch, only: :after
define_model_callbacks :save, :create, :update, :destroy
end
def destroy # :nodoc:
@_destroy_callback_already_called ||= false
return if @_destroy_callback_already_called
@_destroy_callback_already_called = true
_run_destroy_callbacks { super }
rescue RecordNotDestroyed => e
@_association_destroy_exception = e
false
ensure
@_destroy_callback_already_called = false
end
def touch(*, **) # :nodoc:
_run_touch_callbacks { super }
end
def increment!(attribute, by = 1, touch: nil) # :nodoc:
touch ? _run_touch_callbacks { super } : super
end
private
def create_or_update(**)
_run_save_callbacks { super }
end
def _create_record
_run_create_callbacks { super }
end
def _update_record
_run_update_callbacks { super }
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
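# Represents a SQL +CASE+ expression built through a small fluent interface.
# A rough usage sketch (the +users+ table and +state+ column are illustrative):
#
#   users = Arel::Table.new(:users)
#   node  = Arel::Nodes::Case.new(users[:state])
#             .when("draft").then(0)
#             .when("published").then(1)
#             .else(2)
#   # roughly compiles to:
#   #   CASE "users"."state" WHEN 'draft' THEN 0 WHEN 'published' THEN 1 ELSE 2 END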
class Case < Arel::Nodes::NodeExpression
attr_accessor :case, :conditions, :default
def initialize(expression = nil, default = nil)
@case = expression
@conditions = []
@default = default
end
def when(condition, expression = nil)
@conditions << When.new(Nodes.build_quoted(condition), expression)
self
end
def then(expression)
@conditions.last.right = Nodes.build_quoted(expression)
self
end
def else(expression)
@default = Else.new Nodes.build_quoted(expression)
self
end
def initialize_copy(other)
super
@case = @case.clone if @case
@conditions = @conditions.map { |x| x.clone }
@default = @default.clone if @default
end
def hash
[@case, @conditions, @default].hash
end
def eql?(other)
self.class == other.class &&
self.case == other.case &&
self.conditions == other.conditions &&
self.default == other.default
end
alias :== :eql?
end
class When < Binary # :nodoc:
end
class Else < Unary # :nodoc:
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Casted < Arel::Nodes::NodeExpression # :nodoc:
attr_reader :value, :attribute
alias :value_before_type_cast :value
def initialize(value, attribute)
@value = value
@attribute = attribute
super()
end
def nil?; value.nil?; end
def value_for_database
if attribute.able_to_type_cast?
attribute.type_cast_for_database(value)
else
value
end
end
def hash
[self.class, value, attribute].hash
end
def eql?(other)
self.class == other.class &&
self.value == other.value &&
self.attribute == other.attribute
end
alias :== :eql?
end
class Quoted < Arel::Nodes::Unary # :nodoc:
alias :value_for_database :value
alias :value_before_type_cast :value
def nil?; value.nil?; end
def infinite?
value.respond_to?(:infinite?) && value.infinite?
end
end
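# A rough sketch of +build_quoted+: plain Ruby values are wrapped so the visitor
# can quote them, while Arel nodes pass through untouched (the +users+ table
# below is illustrative).
#
#   users = Arel::Table.new(:users)
#   Arel::Nodes.build_quoted("foo")               # => Arel::Nodes::Quoted
#   Arel::Nodes.build_quoted("foo", users[:name]) # => Arel::Nodes::Casted
#   Arel::Nodes.build_quoted(users[:name])        # returned as-is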
def self.build_quoted(other, attribute = nil)
case other
when Arel::Nodes::Node, Arel::Attributes::Attribute, Arel::Table, Arel::SelectManager, Arel::Nodes::SqlLiteral, ActiveModel::Attribute
other
else
case attribute
when Arel::Attributes::Attribute
Casted.new other, attribute
else
Quoted.new other
end
end
end
end
end
# frozen_string_literal: true
require "ipaddr"
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
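# A rough behavior sketch for the +cidr+ type (values below are illustrative):
#
#   type = ActiveRecord::ConnectionAdapters::PostgreSQL::OID::Cidr.new
#   type.cast("192.168.0.0/24")                  # => IPAddr for 192.168.0.0/24
#   type.cast("not an address")                  # => nil
#   type.serialize(IPAddr.new("192.168.0.0/24")) # => "192.168.0.0/24"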
class Cidr < Type::Value # :nodoc:
def type
:cidr
end
def type_cast_for_schema(value)
# If the subnet mask is equal to /32, don't output it
if value.prefix == 32
"\"#{value}\""
else
"\"#{value}/#{value.prefix}\""
end
end
def serialize(value)
if IPAddr === value
"#{value}/#{value.prefix}"
else
value
end
end
def cast_value(value)
if value.nil?
nil
elsif String === value
begin
IPAddr.new(value)
rescue ArgumentError
nil
end
else
value
end
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# The algorithm used for encrypting and decrypting +Message+ objects.
#
# It uses AES-256-GCM. It will generate a random IV for non-deterministic encryption (default)
# or derive an initialization vector from the encrypted content for deterministic encryption.
#
# See +Cipher::Aes256Gcm+.
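#
# A rough usage sketch (the key below is generated ad hoc, for illustration only):
#
#   cipher  = ActiveRecord::Encryption::Cipher.new
#   key     = SecureRandom.random_bytes(cipher.key_length)
#   message = cipher.encrypt("secret text", key: key)
#   cipher.decrypt(message, key: key) # => "secret text"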
class Cipher
DEFAULT_ENCODING = Encoding::UTF_8
# Encrypts the provided text and returns an encrypted +Message+.
def encrypt(clean_text, key:, deterministic: false)
cipher_for(key, deterministic: deterministic).encrypt(clean_text).tap do |message|
message.headers.encoding = clean_text.encoding.name unless clean_text.encoding == DEFAULT_ENCODING
end
end
# Decrypts the provided +Message+.
#
# When +key+ is an Array, it will try all the keys raising a
# +ActiveRecord::Encryption::Errors::Decryption+ if none works.
def decrypt(encrypted_message, key:)
try_to_decrypt_with_each(encrypted_message, keys: Array(key)).tap do |decrypted_text|
decrypted_text.force_encoding(encrypted_message.headers.encoding || DEFAULT_ENCODING)
end
end
def key_length
Aes256Gcm.key_length
end
def iv_length
Aes256Gcm.iv_length
end
private
def try_to_decrypt_with_each(encrypted_text, keys:)
keys.each.with_index do |key, index|
return cipher_for(key).decrypt(encrypted_text)
rescue ActiveRecord::Encryption::Errors::Decryption
raise if index == keys.length - 1
end
end
def cipher_for(secret, deterministic: false)
Aes256Gcm.new(secret, deterministic: deterministic)
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/enumerable"
module ActiveRecord
module Associations
# = Active Record Association Collection
#
# CollectionAssociation is an abstract class that provides common stuff to
# ease the implementation of association proxies that represent
# collections. See the class hierarchy in Association.
#
# CollectionAssociation:
# HasManyAssociation => has_many
# HasManyThroughAssociation + ThroughAssociation => has_many :through
#
# The CollectionAssociation class provides common methods to the collections
# defined by +has_and_belongs_to_many+, +has_many+ or +has_many+ with
# the +:through+ option.
#
# You need to be careful with assumptions regarding the target: The proxy
# does not fetch records from the database until it needs them, but new
# ones created with +build+ are added to the target. So, the target may be
# non-empty and still lack children waiting to be read from the database.
# If you look directly at the database you cannot assume that's the entire
# collection because new records may have been added to the target, etc.
#
# If you need to work on all current children, new and existing records,
# +load_target+ and the +loaded+ flag are your friends.
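#
# A small sketch (assuming a hypothetical Post model with <tt>has_many :comments</tt>):
#
#   post.comments.build(body: "draft") # added to the target, not yet persisted
#   post.comments.size                 # includes the unsaved comment
#   Comment.where(post: post).count    # the database does not know about it yet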
class CollectionAssociation < Association # :nodoc:
# Implements the reader method, e.g. foo.items for Foo.has_many :items
def reader
ensure_klass_exists!
if stale_target?
reload
end
@proxy ||= CollectionProxy.create(klass, self)
@proxy.reset_scope
end
# Implements the writer method, e.g. foo.items= for Foo.has_many :items
def writer(records)
replace(records)
end
# Implements the ids reader method, e.g. foo.item_ids for Foo.has_many :items
def ids_reader
if loaded?
target.pluck(reflection.association_primary_key)
elsif !target.empty?
load_target.pluck(reflection.association_primary_key)
else
@association_ids ||= scope.pluck(reflection.association_primary_key)
end
end
# Implements the ids writer method, e.g. foo.item_ids= for Foo.has_many :items
def ids_writer(ids)
primary_key = reflection.association_primary_key
pk_type = klass.type_for_attribute(primary_key)
ids = Array(ids).compact_blank
ids.map! { |i| pk_type.cast(i) }
records = klass.where(primary_key => ids).index_by do |r|
r.public_send(primary_key)
end.values_at(*ids).compact
if records.size != ids.size
found_ids = records.map { |record| record.public_send(primary_key) }
not_found_ids = ids - found_ids
klass.all.raise_record_not_found_exception!(ids, records.size, ids.size, primary_key, not_found_ids)
else
replace(records)
end
end
def reset
super
@target = []
@replaced_or_added_targets = Set.new
@association_ids = nil
end
def find(*args)
if options[:inverse_of] && loaded?
args_flatten = args.flatten
model = scope.klass
if args_flatten.blank?
error_message = "Couldn't find #{model.name} without an ID"
raise RecordNotFound.new(error_message, model.name, model.primary_key, args)
end
result = find_by_scan(*args)
result_size = Array(result).size
if !result || result_size != args_flatten.size
scope.raise_record_not_found_exception!(args_flatten, result_size, args_flatten.size)
else
result
end
else
scope.find(*args)
end
end
def build(attributes = nil, &block)
if attributes.is_a?(Array)
attributes.collect { |attr| build(attr, &block) }
else
add_to_target(build_record(attributes, &block), replace: true)
end
end
# Add +records+ to this association. Since +<<+ flattens its argument list
# and inserts each record, +push+ and +concat+ behave identically.
def concat(*records)
records = records.flatten
if owner.new_record?
load_target
concat_records(records)
else
transaction { concat_records(records) }
end
end
# Removes all records from the association without calling callbacks
# on the associated records. It honors the +:dependent+ option. However
# if the +:dependent+ value is +:destroy+ then in that case the +:delete_all+
# deletion strategy for the association is applied.
#
# You can force a particular deletion strategy by passing a parameter.
#
# Example:
#
# @author.books.delete_all(:nullify)
# @author.books.delete_all(:delete_all)
#
# See delete for more info.
def delete_all(dependent = nil)
if dependent && ![:nullify, :delete_all].include?(dependent)
raise ArgumentError, "Valid values are :nullify or :delete_all"
end
dependent = if dependent
dependent
elsif options[:dependent] == :destroy
:delete_all
else
options[:dependent]
end
delete_or_nullify_all_records(dependent).tap do
reset
loaded!
end
end
# Destroy all the records from this association.
#
# See destroy for more info.
def destroy_all
destroy(load_target).tap do
reset
loaded!
end
end
# Removes +records+ from this association calling +before_remove+ and
# +after_remove+ callbacks.
#
# This method is abstract in the sense that +delete_records+ has to be
# provided by descendants. Note this method does not imply the records
# are actually removed from the database, that depends precisely on
# +delete_records+. They are in any case removed from the collection.
def delete(*records)
delete_or_destroy(records, options[:dependent])
end
# Deletes the +records+ and removes them from this association calling
# +before_remove+, +after_remove+, +before_destroy+ and +after_destroy+ callbacks.
#
# Note that this method removes records from the database ignoring the
# +:dependent+ option.
def destroy(*records)
delete_or_destroy(records, :destroy)
end
# Returns the size of the collection by executing a SELECT COUNT(*)
# query if the collection hasn't been loaded, and calling
# <tt>collection.size</tt> if it has.
#
# If the collection has already been loaded, +size+ and +length+ are
# equivalent. If not, and you are going to need the records anyway,
# +length+ will take one less query. Otherwise +size+ is more efficient.
#
# This method is abstract in the sense that it relies on
# +count_records+, which is a method descendants have to provide.
def size
if !find_target? || loaded?
target.size
elsif @association_ids
@association_ids.size
elsif !association_scope.group_values.empty?
load_target.size
elsif !association_scope.distinct_value && !target.empty?
unsaved_records = target.select(&:new_record?)
unsaved_records.size + count_records
else
count_records
end
end
# Returns true if the collection is empty.
#
# If the collection has been loaded
# it is equivalent to <tt>collection.size.zero?</tt>. If the
# collection has not been loaded, it is equivalent to
# <tt>!collection.exists?</tt>. If the collection has not already been
# loaded and you are going to fetch the records anyway it is better to
# check <tt>collection.length.zero?</tt>.
def empty?
if loaded? || @association_ids || reflection.has_cached_counter?
size.zero?
else
target.empty? && !scope.exists?
end
end
# Replace this collection with +other_array+. This will perform a diff
# and delete/add only records that have changed.
def replace(other_array)
other_array.each { |val| raise_on_type_mismatch!(val) }
original_target = load_target.dup
if owner.new_record?
replace_records(other_array, original_target)
else
replace_common_records_in_memory(other_array, original_target)
if other_array != original_target
transaction { replace_records(other_array, original_target) }
else
other_array
end
end
end
def include?(record)
if record.is_a?(reflection.klass)
if record.new_record?
include_in_memory?(record)
else
loaded? ? target.include?(record) : scope.exists?(record.id)
end
else
false
end
end
def load_target
if find_target?
@target = merge_target_lists(find_target, target)
end
loaded!
target
end
def add_to_target(record, skip_callbacks: false, replace: false, &block)
replace_on_target(record, skip_callbacks, replace: replace || association_scope.distinct_value, &block)
end
def target=(record)
return super unless reflection.klass.has_many_inversing
case record
when nil
# It's not possible to remove the record from the inverse association.
when Array
super
else
replace_on_target(record, true, replace: true, inversing: true)
end
end
def scope
scope = super
scope.none! if null_scope?
scope
end
def null_scope?
owner.new_record? && !foreign_key_present?
end
def find_from_target?
loaded? ||
owner.strict_loading? ||
reflection.strict_loading? ||
owner.new_record? ||
target.any? { |record| record.new_record? || record.changed? }
end
private
def transaction(&block)
reflection.klass.transaction(&block)
end
# We have some records loaded from the database (persisted) and some that are
# in-memory (memory). The same record may be represented in the persisted array
# and in the memory array.
#
# So the task of this method is to merge them according to the following rules:
#
# * The final array must not have duplicates
# * The order of the persisted array is to be preserved
# * Any changes made to attributes on objects in the memory array are to be preserved
# * Otherwise, attributes should have the value found in the database
def merge_target_lists(persisted, memory)
return persisted if memory.empty?
return memory if persisted.empty?
persisted.map! do |record|
if mem_record = memory.delete(record)
((record.attribute_names & mem_record.attribute_names) - mem_record.changed_attribute_names_to_save).each do |name|
mem_record[name] = record[name]
end
mem_record
else
record
end
end
persisted + memory.reject(&:persisted?)
end
def _create_record(attributes, raise = false, &block)
unless owner.persisted?
raise ActiveRecord::RecordNotSaved.new("You cannot call create unless the parent is saved", owner)
end
if attributes.is_a?(Array)
attributes.collect { |attr| _create_record(attr, raise, &block) }
else
record = build_record(attributes, &block)
transaction do
result = nil
add_to_target(record) do
result = insert_record(record, true, raise) {
@_was_loaded = loaded?
}
end
raise ActiveRecord::Rollback unless result
end
record
end
end
# Do the relevant stuff to insert the given record into the association collection.
def insert_record(record, validate = true, raise = false, &block)
if raise
record.save!(validate: validate, &block)
else
record.save(validate: validate, &block)
end
end
def delete_or_destroy(records, method)
return if records.empty?
records = find(records) if records.any? { |record| record.kind_of?(Integer) || record.kind_of?(String) }
records = records.flatten
records.each { |record| raise_on_type_mismatch!(record) }
existing_records = records.reject(&:new_record?)
if existing_records.empty?
remove_records(existing_records, records, method)
else
transaction { remove_records(existing_records, records, method) }
end
end
def remove_records(existing_records, records, method)
catch(:abort) do
records.each { |record| callback(:before_remove, record) }
end || return
delete_records(existing_records, method) if existing_records.any?
@target -= records
@association_ids = nil
records.each { |record| callback(:after_remove, record) }
end
# Delete the given records from the association,
# using one of the methods +:destroy+, +:delete_all+
# or +:nullify+ (or +nil+, in which case a default is used).
def delete_records(records, method)
raise NotImplementedError
end
def replace_records(new_target, original_target)
delete(difference(target, new_target))
unless concat(difference(new_target, target))
@target = original_target
raise RecordNotSaved, "Failed to replace #{reflection.name} because one or more of the " \
"new records could not be saved."
end
target
end
def replace_common_records_in_memory(new_target, original_target)
common_records = intersection(new_target, original_target)
common_records.each do |record|
skip_callbacks = true
replace_on_target(record, skip_callbacks, replace: true)
end
end
def concat_records(records, raise = false)
result = true
records.each do |record|
raise_on_type_mismatch!(record)
add_to_target(record) do
unless owner.new_record?
result &&= insert_record(record, true, raise) {
@_was_loaded = loaded?
}
end
end
end
raise ActiveRecord::Rollback unless result
records
end
def replace_on_target(record, skip_callbacks, replace:, inversing: false)
if replace && (!record.new_record? || @replaced_or_added_targets.include?(record))
index = @target.index(record)
end
catch(:abort) do
callback(:before_add, record)
end || return unless skip_callbacks
set_inverse_instance(record)
@_was_loaded = true
yield(record) if block_given?
if !index && @replaced_or_added_targets.include?(record)
index = @target.index(record)
end
@replaced_or_added_targets << record if inversing || index || record.new_record?
if index
target[index] = record
elsif @_was_loaded || !loaded?
@association_ids = nil
target << record
end
callback(:after_add, record) unless skip_callbacks
record
ensure
@_was_loaded = nil
end
def callback(method, record)
callbacks_for(method).each do |callback|
callback.call(method, owner, record)
end
end
def callbacks_for(callback_name)
full_callback_name = "#{callback_name}_for_#{reflection.name}"
if owner.class.respond_to?(full_callback_name)
owner.class.send(full_callback_name)
else
[]
end
end
def include_in_memory?(record)
if reflection.is_a?(ActiveRecord::Reflection::ThroughReflection)
assoc = owner.association(reflection.through_reflection.name)
assoc.reader.any? { |source|
target_reflection = source.send(reflection.source_reflection.name)
target_reflection.respond_to?(:include?) ? target_reflection.include?(record) : target_reflection == record
} || target.include?(record)
else
target.include?(record)
end
end
# If the :inverse_of option has been
# specified, then #find scans the entire collection.
def find_by_scan(*args)
expects_array = args.first.kind_of?(Array)
ids = args.flatten.compact.map(&:to_s).uniq
if ids.size == 1
id = ids.first
record = load_target.detect { |r| id == r.id.to_s }
expects_array ? [ record ] : record
else
load_target.select { |r| ids.include?(r.id.to_s) }
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
# Collection proxies in Active Record are middlemen between an
# <tt>association</tt>, and its <tt>target</tt> result set.
#
# For example, given
#
# class Blog < ActiveRecord::Base
# has_many :posts
# end
#
# blog = Blog.first
#
# The collection proxy returned by <tt>blog.posts</tt> is built from a
# <tt>:has_many</tt> <tt>association</tt>, and delegates to a collection
# of posts as the <tt>target</tt>.
#
# This class delegates unknown methods to the <tt>association</tt>'s
# relation class via a delegate cache.
#
# The <tt>target</tt> result set is not loaded until needed. For example,
#
# blog.posts.count
#
# is computed directly through SQL and does not trigger by itself the
# instantiation of the actual post records.
class CollectionProxy < Relation
def initialize(klass, association, **) # :nodoc:
@association = association
super klass
extensions = association.extensions
extend(*extensions) if extensions.any?
end
def target
@association.target
end
def load_target
@association.load_target
end
# Returns +true+ if the association has been loaded, otherwise +false+.
#
# person.pets.loaded? # => false
# person.pets.records
# person.pets.loaded? # => true
def loaded?
@association.loaded?
end
alias :loaded :loaded?
##
# :method: select
#
# :call-seq:
# select(*fields, &block)
#
# Works in two ways.
#
# *First:* Specify a subset of fields to be selected from the result set.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.select(:name)
# # => [
# # #<Pet id: nil, name: "Fancy-Fancy">,
# # #<Pet id: nil, name: "Spook">,
# # #<Pet id: nil, name: "Choo-Choo">
# # ]
#
# person.pets.select(:id, :name)
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy">,
# # #<Pet id: 2, name: "Spook">,
# # #<Pet id: 3, name: "Choo-Choo">
# # ]
#
# Be careful because this also means you're initializing a model
# object with only the fields that you've selected. If you attempt
# to access a field other than +id+ that is not in the initialized record you'll
# receive:
#
# person.pets.select(:name).first.person_id
# # => ActiveModel::MissingAttributeError: missing attribute: person_id
#
# *Second:* You can pass a block so it can be used just like Array#select.
# This builds an array of objects from the database for the scope,
# converting them into an array and iterating through them using
# Array#select.
#
# person.pets.select { |pet| /oo/.match?(pet.name) }
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
# Finds an object in the collection responding to the +id+. Uses the same
# rules as ActiveRecord::Base.find. Raises an ActiveRecord::RecordNotFound
# error if the object cannot be found.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.find(1) # => #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>
# person.pets.find(4) # => ActiveRecord::RecordNotFound: Couldn't find Pet with 'id'=4
#
# person.pets.find(2) { |pet| pet.name.downcase! }
# # => #<Pet id: 2, name: "fancy-fancy", person_id: 1>
#
# person.pets.find(2, 3)
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
def find(*args)
return super if block_given?
@association.find(*args)
end
##
# :method: first
#
# :call-seq:
# first(limit = nil)
#
# Returns the first record, or the first +n+ records, from the collection.
# If the collection is empty, the first form returns +nil+, and the second
# form returns an empty array.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.first # => #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>
#
# person.pets.first(2)
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>
# # ]
#
# another_person_without.pets # => []
# another_person_without.pets.first # => nil
# another_person_without.pets.first(3) # => []
##
# :method: second
#
# :call-seq:
# second()
#
# Same as #first except returns only the second record.
##
# :method: third
#
# :call-seq:
# third()
#
# Same as #first except returns only the third record.
##
# :method: fourth
#
# :call-seq:
# fourth()
#
# Same as #first except returns only the fourth record.
##
# :method: fifth
#
# :call-seq:
# fifth()
#
# Same as #first except returns only the fifth record.
##
# :method: forty_two
#
# :call-seq:
# forty_two()
#
# Same as #first except returns only the forty second record.
# Also known as accessing "the reddit".
##
# :method: third_to_last
#
# :call-seq:
# third_to_last()
#
# Same as #first except returns only the third-to-last record.
##
# :method: second_to_last
#
# :call-seq:
# second_to_last()
#
# Same as #first except returns only the second-to-last record.
# Returns the last record, or the last +n+ records, from the collection.
# If the collection is empty, the first form returns +nil+, and the second
# form returns an empty array.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.last # => #<Pet id: 3, name: "Choo-Choo", person_id: 1>
#
# person.pets.last(2)
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# another_person_without.pets # => []
# another_person_without.pets.last # => nil
# another_person_without.pets.last(3) # => []
def last(limit = nil)
load_target if find_from_target?
super
end
# Gives a record (or N records if a parameter is supplied) from the collection
# using the same rules as <tt>ActiveRecord::Base.take</tt>.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.take # => #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>
#
# person.pets.take(2)
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>
# # ]
#
# another_person_without.pets # => []
# another_person_without.pets.take # => nil
# another_person_without.pets.take(2) # => []
def take(limit = nil)
load_target if find_from_target?
super
end
# Returns a new object of the collection type that has been instantiated
# with +attributes+ and linked to this object, but has not yet been saved.
# You can pass an array of attribute hashes; this will return an array
# with the new objects.
#
# class Person
# has_many :pets
# end
#
# person.pets.build
# # => #<Pet id: nil, name: nil, person_id: 1>
#
# person.pets.build(name: 'Fancy-Fancy')
# # => #<Pet id: nil, name: "Fancy-Fancy", person_id: 1>
#
# person.pets.build([{name: 'Spook'}, {name: 'Choo-Choo'}, {name: 'Brain'}])
# # => [
# # #<Pet id: nil, name: "Spook", person_id: 1>,
# # #<Pet id: nil, name: "Choo-Choo", person_id: 1>,
# # #<Pet id: nil, name: "Brain", person_id: 1>
# # ]
#
# person.pets.size # => 5 # size of the collection
# person.pets.count # => 0 # count from database
def build(attributes = {}, &block)
@association.build(attributes, &block)
end
alias_method :new, :build
# Returns a new object of the collection type that has been instantiated with
# attributes, linked to this object and that has already been saved (if it
# passes the validations).
#
# class Person
# has_many :pets
# end
#
# person.pets.create(name: 'Fancy-Fancy')
# # => #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>
#
# person.pets.create([{name: 'Spook'}, {name: 'Choo-Choo'}])
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.size # => 3
# person.pets.count # => 3
#
# person.pets.find(1, 2, 3)
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
def create(attributes = {}, &block)
@association.create(attributes, &block)
end
# Like #create, except that if the record is invalid, raises an exception.
#
# class Person
# has_many :pets
# end
#
# class Pet
# validates :name, presence: true
# end
#
# person.pets.create!(name: nil)
# # => ActiveRecord::RecordInvalid: Validation failed: Name can't be blank
def create!(attributes = {}, &block)
@association.create!(attributes, &block)
end
# Replaces this collection with +other_array+. This will perform a diff
# and delete/add only records that have changed.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets
# # => [#<Pet id: 1, name: "Gorby", group: "cats", person_id: 1>]
#
# other_pets = [Pet.new(name: 'Puff', group: 'celebrities')]
#
# person.pets.replace(other_pets)
#
# person.pets
# # => [#<Pet id: 2, name: "Puff", group: "celebrities", person_id: 1>]
#
# If the supplied array has an incorrect association type, it raises
# an <tt>ActiveRecord::AssociationTypeMismatch</tt> error:
#
# person.pets.replace(["doo", "ggie", "gaga"])
# # => ActiveRecord::AssociationTypeMismatch: Pet expected, got String
def replace(other_array)
@association.replace(other_array)
end
# Deletes all the records from the collection according to the strategy
# specified by the +:dependent+ option. If no +:dependent+ option is given,
# then it will follow the default strategy.
#
# For <tt>has_many :through</tt> associations, the default deletion strategy is
# +:delete_all+.
#
# For +has_many+ associations, the default deletion strategy is +:nullify+.
# This sets the foreign keys to +NULL+.
#
# class Person < ActiveRecord::Base
# has_many :pets # dependent: :nullify option by default
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.delete_all
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.size # => 0
# person.pets # => []
#
# Pet.find(1, 2, 3)
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: nil>,
# # #<Pet id: 2, name: "Spook", person_id: nil>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: nil>
# # ]
#
# Both +has_many+ and <tt>has_many :through</tt> dependencies default to the
# +:delete_all+ strategy if the +:dependent+ option is set to +:destroy+.
# Records are not instantiated and callbacks will not be fired.
#
# class Person < ActiveRecord::Base
# has_many :pets, dependent: :destroy
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.delete_all
#
# Pet.find(1, 2, 3)
# # => ActiveRecord::RecordNotFound: Couldn't find all Pets with 'id': (1, 2, 3)
#
# If it is set to <tt>:delete_all</tt>, all the objects are deleted
# *without* calling their +destroy+ method.
#
# class Person < ActiveRecord::Base
# has_many :pets, dependent: :delete_all
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.delete_all
#
# Pet.find(1, 2, 3)
# # => ActiveRecord::RecordNotFound: Couldn't find all Pets with 'id': (1, 2, 3)
def delete_all(dependent = nil)
@association.delete_all(dependent).tap { reset_scope }
end
# Deletes the records of the collection directly from the database
# ignoring the +:dependent+ option. Records are instantiated and it
# invokes +before_remove+, +after_remove+, +before_destroy+, and
# +after_destroy+ callbacks.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.destroy_all
#
# person.pets.size # => 0
# person.pets # => []
#
# Pet.find(1) # => Couldn't find Pet with id=1
def destroy_all
@association.destroy_all.tap { reset_scope }
end
# Deletes the +records+ supplied from the collection according to the strategy
# specified by the +:dependent+ option. If no +:dependent+ option is given,
# then it will follow the default strategy. Returns an array with the
# deleted records.
#
# For <tt>has_many :through</tt> associations, the default deletion strategy is
# +:delete_all+.
#
# For +has_many+ associations, the default deletion strategy is +:nullify+.
# This sets the foreign keys to +NULL+.
#
# class Person < ActiveRecord::Base
# has_many :pets # dependent: :nullify option by default
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.delete(Pet.find(1))
# # => [#<Pet id: 1, name: "Fancy-Fancy", person_id: 1>]
#
# person.pets.size # => 2
# person.pets
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# Pet.find(1)
# # => #<Pet id: 1, name: "Fancy-Fancy", person_id: nil>
#
# If it is set to <tt>:destroy</tt> all the +records+ are removed by calling
# their +destroy+ method. See +destroy+ for more information.
#
# class Person < ActiveRecord::Base
# has_many :pets, dependent: :destroy
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.delete(Pet.find(1), Pet.find(3))
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.size # => 1
# person.pets
# # => [#<Pet id: 2, name: "Spook", person_id: 1>]
#
# Pet.find(1, 3)
# # => ActiveRecord::RecordNotFound: Couldn't find all Pets with 'id': (1, 3)
#
# If it is set to <tt>:delete_all</tt>, all the +records+ are deleted
# *without* calling their +destroy+ method.
#
# class Person < ActiveRecord::Base
# has_many :pets, dependent: :delete_all
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.delete(Pet.find(1))
# # => [#<Pet id: 1, name: "Fancy-Fancy", person_id: 1>]
#
# person.pets.size # => 2
# person.pets
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# Pet.find(1)
# # => ActiveRecord::RecordNotFound: Couldn't find Pet with 'id'=1
#
# You can pass +Integer+ or +String+ values; it finds the records
# responding to the +id+ and executes delete on them.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.delete("1")
# # => [#<Pet id: 1, name: "Fancy-Fancy", person_id: 1>]
#
# person.pets.delete(2, 3)
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
def delete(*records)
@association.delete(*records).tap { reset_scope }
end
# Destroys the +records+ supplied and removes them from the collection.
# This method will _always_ remove the records from the database, ignoring
# the +:dependent+ option. Returns an array with the removed records.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.destroy(Pet.find(1))
# # => [#<Pet id: 1, name: "Fancy-Fancy", person_id: 1>]
#
# person.pets.size # => 2
# person.pets
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.destroy(Pet.find(2), Pet.find(3))
# # => [
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.size # => 0
# person.pets # => []
#
# Pet.find(1, 2, 3) # => ActiveRecord::RecordNotFound: Couldn't find all Pets with 'id': (1, 2, 3)
#
# You can pass +Integer+ or +String+ values; it finds the records
# responding to the +id+ and then deletes them from the database.
#
# person.pets.size # => 3
# person.pets
# # => [
# # #<Pet id: 4, name: "Benny", person_id: 1>,
# # #<Pet id: 5, name: "Brain", person_id: 1>,
# # #<Pet id: 6, name: "Boss", person_id: 1>
# # ]
#
# person.pets.destroy("4")
# # => #<Pet id: 4, name: "Benny", person_id: 1>
#
# person.pets.size # => 2
# person.pets
# # => [
# # #<Pet id: 5, name: "Brain", person_id: 1>,
# # #<Pet id: 6, name: "Boss", person_id: 1>
# # ]
#
# person.pets.destroy(5, 6)
# # => [
# # #<Pet id: 5, name: "Brain", person_id: 1>,
# # #<Pet id: 6, name: "Boss", person_id: 1>
# # ]
#
# person.pets.size # => 0
# person.pets # => []
#
# Pet.find(4, 5, 6) # => ActiveRecord::RecordNotFound: Couldn't find all Pets with 'id': (4, 5, 6)
def destroy(*records)
@association.destroy(*records).tap { reset_scope }
end
##
# :method: distinct
#
# :call-seq:
# distinct(value = true)
#
# Specifies whether the records should be unique or not.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.select(:name)
# # => [
# # #<Pet name: "Fancy-Fancy">,
# # #<Pet name: "Fancy-Fancy">
# # ]
#
# person.pets.select(:name).distinct
# # => [#<Pet name: "Fancy-Fancy">]
#
# person.pets.select(:name).distinct.distinct(false)
# # => [
# # #<Pet name: "Fancy-Fancy">,
# # #<Pet name: "Fancy-Fancy">
# # ]
#--
def calculate(operation, column_name)
null_scope? ? scope.calculate(operation, column_name) : super
end
def pluck(*column_names)
null_scope? ? scope.pluck(*column_names) : super
end
##
# :method: count
#
# :call-seq:
# count(column_name = nil, &block)
#
# Count all records.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# # This will perform the count using SQL.
# person.pets.count # => 3
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# Passing a block will select all of a person's pets in SQL and then
# perform the count using Ruby.
#
# person.pets.count { |pet| pet.name.include?('-') } # => 2
# Returns the size of the collection. If the collection hasn't been loaded,
# it executes a <tt>SELECT COUNT(*)</tt> query; otherwise it calls <tt>collection.size</tt>.
#
# If the collection has already been loaded, +size+ and +length+ are
# equivalent. If not, and you are going to need the records anyway,
# +length+ will take one less query. Otherwise +size+ is more efficient.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.size # => 3
# # executes something like SELECT COUNT(*) FROM "pets" WHERE "pets"."person_id" = 1
#
# person.pets # This will execute a SELECT * FROM query
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
#
# person.pets.size # => 3
# # Because the collection is already loaded, this will behave like
# # collection.size and no SQL count query is executed.
def size
@association.size
end
##
# :method: length
#
# :call-seq:
# length()
#
# Returns the size of the collection by calling +size+ on the target.
# If the collection has already been loaded, +length+ and +size+ are
# equivalent. If not, and you are going to need the records anyway, this
# method will take one less query. Otherwise +size+ is more efficient.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.length # => 3
# # executes something like SELECT "pets".* FROM "pets" WHERE "pets"."person_id" = 1
#
# # Because the collection is loaded, you can
# # call the collection with no additional queries:
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
# Returns +true+ if the collection is empty. If the collection has been
# loaded it is equivalent
# to <tt>collection.size.zero?</tt>. If the collection has not been loaded,
# it is equivalent to <tt>!collection.exists?</tt>. If the collection has
# not already been loaded and you are going to fetch the records anyway it
# is better to check <tt>collection.load.empty?</tt>.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.count # => 1
# person.pets.empty? # => false
#
# person.pets.delete_all
#
# person.pets.count # => 0
# person.pets.empty? # => true
def empty?
@association.empty?
end
##
# :method: any?
#
# :call-seq:
# any?()
#
# Returns +true+ if the collection is not empty.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.count # => 0
# person.pets.any? # => false
#
# person.pets << Pet.new(name: 'Snoop')
# person.pets.count # => 1
# person.pets.any? # => true
#
# Calling it without a block when the collection is not yet
# loaded is equivalent to <tt>collection.exists?</tt>.
# If you're going to load the collection anyway, it is better
# to call <tt>collection.load.any?</tt> to avoid an extra query.
#
# You can also pass a +block+ to define criteria. The behavior
# is the same, it returns true if the collection based on the
# criteria is not empty.
#
# person.pets
# # => [#<Pet name: "Snoop", group: "dogs">]
#
# person.pets.any? do |pet|
# pet.group == 'cats'
# end
# # => false
#
# person.pets.any? do |pet|
# pet.group == 'dogs'
# end
# # => true
##
# :method: many?
#
# :call-seq:
# many?()
#
# Returns true if the collection has more than one record.
# Equivalent to <tt>collection.size > 1</tt>.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.count # => 1
# person.pets.many? # => false
#
# person.pets << Pet.new(name: 'Snoopy')
# person.pets.count # => 2
# person.pets.many? # => true
#
# You can also pass a +block+ to define criteria. The
# behavior is the same, it returns true if the collection
# based on the criteria has more than one record.
#
# person.pets
# # => [
# # #<Pet name: "Gorby", group: "cats">,
# # #<Pet name: "Puff", group: "cats">,
# # #<Pet name: "Snoop", group: "dogs">
# # ]
#
# person.pets.many? do |pet|
# pet.group == 'dogs'
# end
# # => false
#
# person.pets.many? do |pet|
# pet.group == 'cats'
# end
# # => true
# Returns +true+ if the given +record+ is present in the collection.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets # => [#<Pet id: 20, name: "Snoop">]
#
# person.pets.include?(Pet.find(20)) # => true
# person.pets.include?(Pet.find(21)) # => false
def include?(record)
!!@association.include?(record)
end
def proxy_association # :nodoc:
@association
end
# Returns a <tt>Relation</tt> object for the records in this association
def scope
@scope ||= @association.scope
end
# Equivalent to <tt>Array#==</tt>. Returns +true+ if the two arrays
# contain the same number of elements and if each element is equal
# to the corresponding element in the +other+ array, otherwise returns
# +false+.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>
# # ]
#
# other = person.pets.to_ary
#
# person.pets == other
# # => true
#
# other = [Pet.new(id: 1), Pet.new(id: 2)]
#
# person.pets == other
# # => false
def ==(other)
load_target == other
end
##
# :method: to_ary
#
# :call-seq:
# to_ary()
#
# Returns a new array of objects from the collection. If the collection
# hasn't been loaded, it fetches the records from the database.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets
# # => [
# # #<Pet id: 4, name: "Benny", person_id: 1>,
# # #<Pet id: 5, name: "Brain", person_id: 1>,
# # #<Pet id: 6, name: "Boss", person_id: 1>
# # ]
#
# other_pets = person.pets.to_ary
# # => [
# # #<Pet id: 4, name: "Benny", person_id: 1>,
# # #<Pet id: 5, name: "Brain", person_id: 1>,
# # #<Pet id: 6, name: "Boss", person_id: 1>
# # ]
#
# other_pets.replace([Pet.new(name: 'BooGoo')])
#
# other_pets
# # => [#<Pet id: nil, name: "BooGoo", person_id: 1>]
#
# person.pets
# # This is not affected by replace
# # => [
# # #<Pet id: 4, name: "Benny", person_id: 1>,
# # #<Pet id: 5, name: "Brain", person_id: 1>,
# # #<Pet id: 6, name: "Boss", person_id: 1>
# # ]
def records # :nodoc:
load_target
end
# Adds one or more +records+ to the collection by setting their foreign keys
# to the association's primary key. Since <tt><<</tt> flattens its argument list and
# inserts each record, +push+ and +concat+ behave identically. Returns +self+
# so several appends may be chained together.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets.size # => 0
# person.pets << Pet.new(name: 'Fancy-Fancy')
# person.pets << [Pet.new(name: 'Spook'), Pet.new(name: 'Choo-Choo')]
# person.pets.size # => 3
#
# person.id # => 1
# person.pets
# # => [
# # #<Pet id: 1, name: "Fancy-Fancy", person_id: 1>,
# # #<Pet id: 2, name: "Spook", person_id: 1>,
# # #<Pet id: 3, name: "Choo-Choo", person_id: 1>
# # ]
def <<(*records)
proxy_association.concat(records) && self
end
alias_method :push, :<<
alias_method :append, :<<
alias_method :concat, :<<
def prepend(*args) # :nodoc:
raise NoMethodError, "prepend on association is not defined. Please use <<, push or append"
end
# Equivalent to +delete_all+. The difference is that it returns +self+ instead
# of an array of the deleted objects, so calls can be chained. See
# +delete_all+ for more information.
# Note that because +delete_all+ removes records by running an SQL query
# directly against the database, the +updated_at+ column of the affected
# records is not changed.
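# A minimal sketch (+person+ and +pets+ as in the examples above):
#
# person.pets.size # => 3
# person.pets.clear
# person.pets.size # => 0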
def clear
delete_all
self
end
# Reloads the collection from the database. Returns +self+.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets # fetches pets from the database
# # => [#<Pet id: 1, name: "Snoop", group: "dogs", person_id: 1>]
#
# person.pets # uses the pets cache
# # => [#<Pet id: 1, name: "Snoop", group: "dogs", person_id: 1>]
#
# person.pets.reload # fetches pets from the database
# # => [#<Pet id: 1, name: "Snoop", group: "dogs", person_id: 1>]
def reload
proxy_association.reload(true)
reset_scope
end
# Unloads the association. Returns +self+.
#
# class Person < ActiveRecord::Base
# has_many :pets
# end
#
# person.pets # fetches pets from the database
# # => [#<Pet id: 1, name: "Snoop", group: "dogs", person_id: 1>]
#
# person.pets # uses the pets cache
# # => [#<Pet id: 1, name: "Snoop", group: "dogs", person_id: 1>]
#
# person.pets.reset # clears the pets cache
#
# person.pets # fetches pets from the database
# # => [#<Pet id: 1, name: "Snoop", group: "dogs", person_id: 1>]
def reset
proxy_association.reset
proxy_association.reset_scope
reset_scope
end
def reset_scope # :nodoc:
@offsets = @take = nil
@scope = nil
self
end
def inspect # :nodoc:
load_target if find_from_target?
super
end
delegate_methods = [
QueryMethods,
SpawnMethods,
].flat_map { |klass|
klass.public_instance_methods(false)
} - self.public_instance_methods(false) - [:select] + [
:scoping, :values, :insert, :insert_all, :insert!, :insert_all!, :upsert, :upsert_all
]
delegate(*delegate_methods, to: :scope)
private
def find_nth_with_limit(index, limit)
load_target if find_from_target?
super
end
def find_nth_from_last(index)
load_target if find_from_target?
super
end
def null_scope?
@association.null_scope?
end
def find_from_target?
@association.find_from_target?
end
def exec_queries
load_target
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/object/blank"
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
class Column < ConnectionAdapters::Column # :nodoc:
delegate :oid, :fmod, to: :sql_type_metadata
def initialize(*, serial: nil, generated: nil, **)
super
@serial = serial
@generated = generated
end
def serial?
@serial
end
def virtual?
# We assume every generated column is virtual, no matter the concrete type
@generated.present?
end
def has_default?
super && !virtual?
end
def array
sql_type_metadata.sql_type.end_with?("[]")
end
alias :array? :array
def enum?
type == :enum
end
def sql_type
super.delete_suffix("[]")
end
def init_with(coder)
@serial = coder["serial"]
super
end
def encode_with(coder)
coder["serial"] = @serial
super
end
def ==(other)
other.is_a?(Column) &&
super &&
serial? == other.serial?
end
alias :eql? :==
def hash
Column.hash ^
super.hash ^
serial?.hash
end
end
end
PostgreSQLColumn = PostgreSQL::Column # :nodoc:
end
end
# frozen_string_literal: true
module ActiveRecord
class Migration
# <tt>ActiveRecord::Migration::CommandRecorder</tt> records commands done during
# a migration and knows how to reverse those commands. The CommandRecorder
# knows how to invert the following commands:
#
# * add_column
# * add_foreign_key
# * add_check_constraint
# * add_index
# * add_reference
# * add_timestamps
# * change_column
# * change_column_default (must supply a +:from+ and +:to+ option)
# * change_column_null
# * change_column_comment (must supply a +:from+ and +:to+ option)
# * change_table_comment (must supply a +:from+ and +:to+ option)
# * create_join_table
# * create_table
# * disable_extension
# * drop_join_table
# * drop_table (must supply a block)
# * enable_extension
# * remove_column (must supply a type)
# * remove_columns (must specify at least one column name)
# * remove_foreign_key (must supply a second table)
# * remove_check_constraint
# * remove_index
# * remove_reference
# * remove_timestamps
# * rename_column
# * rename_index
# * rename_table
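# For example (a hand-built recorder with hypothetical table and column names;
# Rails builds the recorder for you when reverting a +change+ migration):
#
# recorder = ActiveRecord::Migration::CommandRecorder.new
# recorder.inverse_of(:rename_column, [:posts, :title, :name])
# # => [:rename_column, [:posts, :name, :title]]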
class CommandRecorder
ReversibleAndIrreversibleMethods = [
:create_table, :create_join_table, :rename_table, :add_column, :remove_column,
:rename_index, :rename_column, :add_index, :remove_index, :add_timestamps, :remove_timestamps,
:change_column_default, :add_reference, :remove_reference, :transaction,
:drop_join_table, :drop_table, :execute_block, :enable_extension, :disable_extension,
:change_column, :execute, :remove_columns, :change_column_null,
:add_foreign_key, :remove_foreign_key,
:change_column_comment, :change_table_comment,
:add_check_constraint, :remove_check_constraint
]
include JoinTable
attr_accessor :commands, :delegate, :reverting
def initialize(delegate = nil)
@commands = []
@delegate = delegate
@reverting = false
end
# While executing the given block, the recorder will be in reverting mode:
# all commands recorded will be stored in their inverted form
# and in reverse order.
# For example:
#
# recorder.revert{ recorder.record(:rename_table, [:old, :new]) }
# # same effect as recorder.record(:rename_table, [:new, :old])
def revert
@reverting = !@reverting
previous = @commands
@commands = []
yield
ensure
@commands = previous.concat(@commands.reverse)
@reverting = !@reverting
end
# Record +command+. +command+ should be a method name and arguments.
# For example:
#
# recorder.record(:method_name, [:arg1, :arg2])
def record(*command, &block)
if @reverting
@commands << inverse_of(*command, &block)
else
@commands << (command << block)
end
end
# Returns the inverse of the given command. For example:
#
# recorder.inverse_of(:rename_table, [:old, :new])
# # => [:rename_table, [:new, :old]]
#
# If the inverse of a command requires several commands, returns array of commands.
#
# recorder.inverse_of(:remove_columns, [:some_table, :foo, :bar, type: :string])
# # => [[:add_column, :some_table, :foo, :string], [:add_column, :some_table, :bar, :string]]
#
# This method will raise an +IrreversibleMigration+ exception if it cannot
# invert the +command+.
def inverse_of(command, args, &block)
method = :"invert_#{command}"
raise IrreversibleMigration, <<~MSG unless respond_to?(method, true)
This migration uses #{command}, which is not automatically reversible.
To make the migration reversible you can either:
1. Define #up and #down methods in place of the #change method.
2. Use the #reversible method to define reversible behavior.
MSG
send(method, args, &block)
end
ReversibleAndIrreversibleMethods.each do |method|
class_eval <<-EOV, __FILE__, __LINE__ + 1
def #{method}(*args, &block) # def create_table(*args, &block)
record(:"#{method}", args, &block) # record(:create_table, args, &block)
end # end
EOV
ruby2_keywords(method)
end
alias :add_belongs_to :add_reference
alias :remove_belongs_to :remove_reference
def change_table(table_name, **options) # :nodoc:
yield delegate.update_table_definition(table_name, self)
end
def replay(migration)
commands.each do |cmd, args, block|
migration.send(cmd, *args, &block)
end
end
private
module StraightReversions # :nodoc:
private
{
execute_block: :execute_block,
create_table: :drop_table,
create_join_table: :drop_join_table,
add_column: :remove_column,
add_index: :remove_index,
add_timestamps: :remove_timestamps,
add_reference: :remove_reference,
add_foreign_key: :remove_foreign_key,
add_check_constraint: :remove_check_constraint,
enable_extension: :disable_extension
}.each do |cmd, inv|
[[inv, cmd], [cmd, inv]].uniq.each do |method, inverse|
class_eval <<-EOV, __FILE__, __LINE__ + 1
def invert_#{method}(args, &block) # def invert_create_table(args, &block)
[:#{inverse}, args, block] # [:drop_table, args, block]
end # end
EOV
end
end
end
include StraightReversions
def invert_transaction(args, &block)
sub_recorder = CommandRecorder.new(delegate)
sub_recorder.revert(&block)
inversions_proc = proc {
sub_recorder.replay(self)
}
[:transaction, args, inversions_proc]
end
def invert_drop_table(args, &block)
if args.size == 1 && block == nil
raise ActiveRecord::IrreversibleMigration, "To avoid mistakes, drop_table is only reversible if given options or a block (can be empty)."
end
super
end
def invert_rename_table(args)
[:rename_table, args.reverse]
end
def invert_remove_column(args)
raise ActiveRecord::IrreversibleMigration, "remove_column is only reversible if given a type." if args.size <= 2
super
end
def invert_remove_columns(args)
unless args[-1].is_a?(Hash) && args[-1].has_key?(:type)
raise ActiveRecord::IrreversibleMigration, "remove_columns is only reversible if given a type."
end
[:add_columns, args]
end
def invert_rename_index(args)
table_name, old_name, new_name = args
[:rename_index, [table_name, new_name, old_name]]
end
def invert_rename_column(args)
table_name, old_name, new_name = args
[:rename_column, [table_name, new_name, old_name]]
end
def invert_remove_index(args)
options = args.extract_options!
table, columns = args
columns ||= options.delete(:column)
unless columns
raise ActiveRecord::IrreversibleMigration, "remove_index is only reversible if given a :column option."
end
options.delete(:if_exists)
args = [table, columns]
args << options unless options.empty?
[:add_index, args]
end
alias :invert_add_belongs_to :invert_add_reference
alias :invert_remove_belongs_to :invert_remove_reference
def invert_change_column_default(args)
table, column, options = args
unless options.is_a?(Hash) && options.has_key?(:from) && options.has_key?(:to)
raise ActiveRecord::IrreversibleMigration, "change_column_default is only reversible if given a :from and :to option."
end
[:change_column_default, [table, column, from: options[:to], to: options[:from]]]
end
def invert_change_column_null(args)
args[2] = !args[2]
[:change_column_null, args]
end
def invert_remove_foreign_key(args)
options = args.extract_options!
from_table, to_table = args
to_table ||= options.delete(:to_table)
raise ActiveRecord::IrreversibleMigration, "remove_foreign_key is only reversible if given a second table" if to_table.nil?
reversed_args = [from_table, to_table]
reversed_args << options unless options.empty?
[:add_foreign_key, reversed_args]
end
def invert_change_column_comment(args)
table, column, options = args
unless options.is_a?(Hash) && options.has_key?(:from) && options.has_key?(:to)
raise ActiveRecord::IrreversibleMigration, "change_column_comment is only reversible if given a :from and :to option."
end
[:change_column_comment, [table, column, from: options[:to], to: options[:from]]]
end
def invert_change_table_comment(args)
table, options = args
unless options.is_a?(Hash) && options.has_key?(:from) && options.has_key?(:to)
raise ActiveRecord::IrreversibleMigration, "change_table_comment is only reversible if given a :from and :to option."
end
[:change_table_comment, [table, from: options[:to], to: options[:from]]]
end
def invert_remove_check_constraint(args)
raise ActiveRecord::IrreversibleMigration, "remove_check_constraint is only reversible if given an expression." if args.size < 2
super
end
def respond_to_missing?(method, _)
super || delegate.respond_to?(method)
end
# Forwards any missing method call to the \target.
def method_missing(method, *args, &block)
if delegate.respond_to?(method)
delegate.public_send(method, *args, &block)
else
super
end
end
ruby2_keywords(:method_missing)
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Comment < Arel::Nodes::Node
attr_reader :values
def initialize(values)
super()
@values = values
end
def initialize_copy(other)
super
@values = @values.clone
end
def hash
[@values].hash
end
def eql?(other)
self.class == other.class &&
self.values == other.values
end
alias :== :eql?
end
end
end
# frozen_string_literal: true
module ActiveRecord
class Migration
module Compatibility # :nodoc: all
def self.find(version)
version = version.to_s
name = "V#{version.tr('.', '_')}"
unless const_defined?(name)
versions = constants.grep(/\AV[0-9_]+\z/).map { |s| s.to_s.delete("V").tr("_", ".").inspect }
raise ArgumentError, "Unknown migration version #{version.inspect}; expected one of #{versions.sort.join(', ')}"
end
const_get(name)
end
# This file exists to ensure that old migrations run the same way they did before a Rails upgrade.
# e.g. if you write a migration on Rails 6.1, then upgrade to Rails 7, the migration should do the same thing to your
# database as it did when you were running Rails 6.1
#
# "Current" is an alias for `ActiveRecord::Migration`, it represents the current Rails version.
# New migration functionality that will never be backward compatible should be added directly to `ActiveRecord::Migration`.
#
# There are classes for each prior Rails version. Each class descends from the *next* Rails version, so:
# 7.0 < 7.1
# 5.2 < 6.0 < 6.1 < 7.0 < 7.1
#
# If you are introducing new migration functionality that should only apply from Rails 7 onward, then you should
# find the class that immediately precedes it (6.1), and override the relevant migration methods to undo your changes.
#
# For example, Rails 6 added a default value for the `precision` option on datetime columns. So in this file, the `V5_2`
# class sets the value of `precision` to `nil` if it's not explicitly provided. This way, the default value will not apply
# for migrations written for 5.2, but will for migrations written for 6.0.
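#
# As an illustration (hypothetical migration and column names), a migration written
# against Rails 6.1 keeps its 6.1-era behavior after an upgrade by inheriting from
# the versioned class:
#
# class AddPartNumberToProducts < ActiveRecord::Migration[6.1]
# def change
# add_column :products, :part_number, :string
# end
# end
#
# `ActiveRecord::Migration[6.1]` resolves to `V6_1` via +find+ above, so the
# migration runs with the compatibility overrides defined for that version.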
V7_1 = Current
class V7_0 < V7_1
module TableDefinition
private
def raise_on_if_exist_options(options)
end
end
def create_table(table_name, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
def change_table(table_name, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
private
def compatible_table_definition(t)
class << t
prepend TableDefinition
end
t
end
end
class V6_1 < V7_0
class PostgreSQLCompat
def self.compatible_timestamp_type(type, connection)
if connection.adapter_name == "PostgreSQL"
# For Rails <= 6.1, :datetime was aliased to :timestamp
# See: https://github.com/rails/rails/blob/v6.1.3.2/activerecord/lib/active_record/connection_adapters/postgresql_adapter.rb#L108
# From Rails 7 onwards, you can define what :datetime resolves to (the default is still :timestamp)
# See `ActiveRecord::ConnectionAdapters::PostgreSQLAdapter.datetime_type`
type.to_sym == :datetime ? :timestamp : type
else
type
end
end
end
def add_column(table_name, column_name, type, **options)
if type == :datetime
options[:precision] ||= nil
end
type = PostgreSQLCompat.compatible_timestamp_type(type, connection)
super
end
def create_table(table_name, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
def change_table(table_name, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
module TableDefinition
def new_column_definition(name, type, **options)
type = PostgreSQLCompat.compatible_timestamp_type(type, @conn)
super
end
def column(name, type, index: nil, **options)
options[:precision] ||= nil
super
end
private
def raise_on_if_exist_options(options)
end
end
private
def compatible_table_definition(t)
class << t
prepend TableDefinition
end
t
end
end
class V6_0 < V6_1
class ReferenceDefinition < ConnectionAdapters::ReferenceDefinition
def index_options(table_name)
as_options(index)
end
end
module SQLite3
module TableDefinition
def references(*args, **options)
args.each do |ref_name|
ReferenceDefinition.new(ref_name, type: :integer, **options).add_to(self)
end
end
alias :belongs_to :references
def column(name, type, index: nil, **options)
options[:precision] ||= nil
super
end
end
end
module TableDefinition
def references(*args, **options)
args.each do |ref_name|
ReferenceDefinition.new(ref_name, **options).add_to(self)
end
end
alias :belongs_to :references
def column(name, type, index: nil, **options)
options[:precision] ||= nil
super
end
private
def raise_on_if_exist_options(options)
end
end
def create_table(table_name, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
def change_table(table_name, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
def create_join_table(table_1, table_2, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
def add_reference(table_name, ref_name, **options)
if connection.adapter_name == "SQLite"
reference_definition = ReferenceDefinition.new(ref_name, type: :integer, **options)
else
reference_definition = ReferenceDefinition.new(ref_name, **options)
end
reference_definition.add_to(connection.update_table_definition(table_name, self))
end
alias :add_belongs_to :add_reference
private
def compatible_table_definition(t)
class << t
prepend TableDefinition
prepend SQLite3::TableDefinition
end
t
end
end
class V5_2 < V6_0
module TableDefinition
def timestamps(**options)
options[:precision] ||= nil
super
end
def column(name, type, index: nil, **options)
options[:precision] ||= nil
super
end
private
def raise_on_if_exist_options(options)
end
end
module CommandRecorder
def invert_transaction(args, &block)
[:transaction, args, block]
end
def invert_change_column_comment(args)
[:change_column_comment, args]
end
def invert_change_table_comment(args)
[:change_table_comment, args]
end
end
def create_table(table_name, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
def change_table(table_name, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
def create_join_table(table_1, table_2, **options)
if block_given?
super { |t| yield compatible_table_definition(t) }
else
super
end
end
def add_timestamps(table_name, **options)
options[:precision] ||= nil
super
end
private
def compatible_table_definition(t)
class << t
prepend TableDefinition
end
t
end
def command_recorder
recorder = super
class << recorder
prepend CommandRecorder
end
recorder
end
end
class V5_1 < V5_2
def change_column(table_name, column_name, type, **options)
if connection.adapter_name == "PostgreSQL"
super(table_name, column_name, type, **options.except(:default, :null, :comment))
connection.change_column_default(table_name, column_name, options[:default]) if options.key?(:default)
connection.change_column_null(table_name, column_name, options[:null], options[:default]) if options.key?(:null)
connection.change_column_comment(table_name, column_name, options[:comment]) if options.key?(:comment)
else
super
end
end
def create_table(table_name, **options)
if connection.adapter_name == "Mysql2"
super(table_name, options: "ENGINE=InnoDB", **options)
else
super
end
end
end
class V5_0 < V5_1
module TableDefinition
def primary_key(name, type = :primary_key, **options)
type = :integer if type == :primary_key
super
end
def references(*args, **options)
super(*args, type: :integer, **options)
end
alias :belongs_to :references
private
def raise_on_if_exist_options(options)
end
end
def create_table(table_name, **options)
if connection.adapter_name == "PostgreSQL"
if options[:id] == :uuid && !options.key?(:default)
options[:default] = "uuid_generate_v4()"
end
end
unless connection.adapter_name == "Mysql2" && options[:id] == :bigint
if [:integer, :bigint].include?(options[:id]) && !options.key?(:default)
options[:default] = nil
end
end
# Since 5.1, the PostgreSQL adapter uses the bigserial type for primary
# keys by default and MySQL uses bigint. This compat layer makes old migrations use
# the serial/int types instead -- the way it used to work before 5.1.
unless options.key?(:id)
options[:id] = :integer
end
super
end
def create_join_table(table_1, table_2, column_options: {}, **options)
column_options.reverse_merge!(type: :integer)
super
end
def add_column(table_name, column_name, type, **options)
if type == :primary_key
type = :integer
options[:primary_key] = true
elsif type == :datetime
options[:precision] ||= nil
end
super
end
def add_reference(table_name, ref_name, **options)
super(table_name, ref_name, type: :integer, **options)
end
alias :add_belongs_to :add_reference
private
def compatible_table_definition(t)
class << t
prepend TableDefinition
end
super
end
end
class V4_2 < V5_0
module TableDefinition
def references(*, **options)
options[:index] ||= false
super
end
alias :belongs_to :references
def timestamps(**options)
options[:null] = true if options[:null].nil?
super
end
private
def raise_on_if_exist_options(options)
end
end
def add_reference(table_name, ref_name, **options)
options[:index] ||= false
super
end
alias :add_belongs_to :add_reference
def add_timestamps(table_name, **options)
options[:null] = true if options[:null].nil?
super
end
def index_exists?(table_name, column_name, **options)
column_names = Array(column_name).map(&:to_s)
options[:name] =
if options[:name].present?
options[:name].to_s
else
connection.index_name(table_name, column: column_names)
end
super
end
def remove_index(table_name, column_name = nil, **options)
options[:name] = index_name_for_remove(table_name, column_name, options)
super
end
private
def compatible_table_definition(t)
class << t
prepend TableDefinition
end
super
end
def index_name_for_remove(table_name, column_name, options)
index_name = connection.index_name(table_name, column_name || options)
unless connection.index_name_exists?(table_name, index_name)
if options.key?(:name)
options_without_column = options.except(:column)
index_name_without_column = connection.index_name(table_name, options_without_column)
if connection.index_name_exists?(table_name, index_name_without_column)
return index_name_without_column
end
end
raise ArgumentError, "Index name '#{index_name}' on table '#{table_name}' does not exist"
end
index_name
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Collectors
class Composite
attr_accessor :preparable
def initialize(left, right)
@left = left
@right = right
end
def <<(str)
left << str
right << str
self
end
def add_bind(bind, &block)
left.add_bind bind, &block
right.add_bind bind, &block
self
end
def add_binds(binds, proc_for_binds = nil, &block)
left.add_binds(binds, proc_for_binds, &block)
right.add_binds(binds, proc_for_binds, &block)
self
end
def value
[left.value, right.value]
end
private
attr_reader :left, :right
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# Container of configuration options
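# These attributes are usually set through the Rails configuration rather than
# directly. A sketch (the credential names mirror the error message below and
# are illustrative):
#
# config.active_record.encryption.primary_key =
# Rails.application.credentials.dig(:active_record_encryption, :primary_key)
# config.active_record.encryption.deterministic_key =
# Rails.application.credentials.dig(:active_record_encryption, :deterministic_key)
# config.active_record.encryption.key_derivation_salt =
# Rails.application.credentials.dig(:active_record_encryption, :key_derivation_salt)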
class Config
attr_accessor :primary_key, :deterministic_key, :store_key_references, :key_derivation_salt,
:support_unencrypted_data, :encrypt_fixtures, :validate_column_size, :add_to_filter_parameters,
:excluded_from_filter_parameters, :extend_queries, :previous_schemes, :forced_encoding_for_deterministic_encryption
def initialize
set_defaults
end
# Configure previous encryption schemes.
#
# config.active_record.encryption.previous = [ { key_provider: MyOldKeyProvider.new } ]
def previous=(previous_schemes_properties)
previous_schemes_properties.each do |properties|
add_previous_scheme(**properties)
end
end
%w(key_derivation_salt primary_key deterministic_key).each do |key|
silence_redefinition_of_method key
define_method(key) do
instance_variable_get(:"@#{key}").presence or
raise Errors::Configuration, "Missing Active Record encryption credential: active_record_encryption.#{key}"
end
end
private
def set_defaults
self.store_key_references = false
self.support_unencrypted_data = false
self.encrypt_fixtures = false
self.validate_column_size = true
self.add_to_filter_parameters = true
self.excluded_from_filter_parameters = []
self.previous_schemes = []
self.forced_encoding_for_deterministic_encryption = Encoding::UTF_8
# TODO: Setting to false for now as the implementation is a bit experimental
self.extend_queries = false
end
def add_previous_scheme(**properties)
previous_schemes << ActiveRecord::Encryption::Scheme.new(**properties)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# Configuration API for ActiveRecord::Encryption
module Configurable
extend ActiveSupport::Concern
included do
mattr_reader :config, default: Config.new
mattr_accessor :encrypted_attribute_declaration_listeners
end
class_methods do
# Expose getters for context properties
Context::PROPERTIES.each do |name|
delegate name, to: :context
end
def configure(primary_key: nil, deterministic_key: nil, key_derivation_salt: nil, **properties) # :nodoc:
config.primary_key = primary_key
config.deterministic_key = deterministic_key
config.key_derivation_salt = key_derivation_salt
properties.each do |name, value|
[:context, :config].each do |configurable_object_name|
configurable_object = ActiveRecord::Encryption.send(configurable_object_name)
configurable_object.send "#{name}=", value if configurable_object.respond_to?("#{name}=")
end
end
end
# Register callback to be invoked when an encrypted attribute is declared.
#
# === Example:
#
# ActiveRecord::Encryption.on_encrypted_attribute_declared do |klass, attribute_name|
# ...
# end
def on_encrypted_attribute_declared(&block)
self.encrypted_attribute_declaration_listeners ||= Concurrent::Array.new
self.encrypted_attribute_declaration_listeners << block
end
def encrypted_attribute_was_declared(klass, name) # :nodoc:
self.encrypted_attribute_declaration_listeners&.each do |block|
block.call(klass, name)
end
end
def install_auto_filtered_parameters_hook(application) # :nodoc:
ActiveRecord::Encryption.on_encrypted_attribute_declared do |klass, encrypted_attribute_name|
filter_parameter = [("#{klass.model_name.element}" if klass.name), encrypted_attribute_name.to_s].compact.join(".")
application.config.filter_parameters << filter_parameter unless excluded_from_filter_parameters?(filter_parameter)
end
end
private
def excluded_from_filter_parameters?(filter_parameter)
ActiveRecord::Encryption.config.excluded_from_filter_parameters.find { |excluded_filter| excluded_filter.to_s == filter_parameter }
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module TypeCaster
class Connection # :nodoc:
def initialize(klass, table_name)
@klass = klass
@table_name = table_name
end
def type_cast_for_database(attr_name, value)
type = type_for_attribute(attr_name)
type.serialize(value)
end
def type_for_attribute(attr_name)
schema_cache = connection.schema_cache
if schema_cache.data_source_exists?(table_name)
column = schema_cache.columns_hash(table_name)[attr_name.to_s]
type = connection.lookup_cast_type_from_column(column) if column
end
type || Type.default_value
end
delegate :connection, to: :@klass, private: true
private
attr_reader :table_name
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
extend ActiveSupport::Autoload
eager_autoload do
autoload :AbstractAdapter
end
autoload :Column
autoload :PoolConfig
autoload :PoolManager
autoload :LegacyPoolManager
autoload :SchemaCache
autoload :Deduplicable
autoload_at "active_record/connection_adapters/abstract/schema_definitions" do
autoload :IndexDefinition
autoload :ColumnDefinition
autoload :ChangeColumnDefinition
autoload :ForeignKeyDefinition
autoload :CheckConstraintDefinition
autoload :TableDefinition
autoload :Table
autoload :AlterTable
autoload :ReferenceDefinition
end
autoload_under "abstract" do
autoload :SchemaStatements
autoload :DatabaseStatements
autoload :DatabaseLimits
autoload :Quoting
autoload :ConnectionHandler
autoload :QueryCache
autoload :Savepoints
end
autoload_at "active_record/connection_adapters/abstract/connection_pool" do
autoload :ConnectionPool
autoload :NullPool
end
autoload_at "active_record/connection_adapters/abstract/transaction" do
autoload :TransactionManager
autoload :NullTransaction
autoload :RealTransaction
autoload :SavepointTransaction
autoload :TransactionState
end
end
end
# frozen_string_literal: true
require "thread"
require "concurrent/map"
module ActiveRecord
module ConnectionAdapters
# ConnectionHandler is a collection of ConnectionPool objects. It is used
# for keeping separate connection pools that connect to different databases.
#
# For example, suppose that you have 5 models, with the following hierarchy:
#
# class Author < ActiveRecord::Base
# end
#
# class BankAccount < ActiveRecord::Base
# end
#
# class Book < ActiveRecord::Base
# establish_connection :library_db
# end
#
# class ScaryBook < Book
# end
#
# class GoodBook < Book
# end
#
# And a database.yml that looked like this:
#
# development:
# database: my_application
# host: localhost
#
# library_db:
# database: library
# host: some.library.org
#
# Your primary database in the development environment is "my_application"
# but the Book model connects to a separate database called "library_db"
# (this can even be a database on a different machine).
#
# Book, ScaryBook, and GoodBook will all use the same connection pool to
# "library_db" while Author, BankAccount, and any other models you create
# will use the default connection pool to "my_application".
#
# The various connection pools are managed by a single instance of
# ConnectionHandler accessible via ActiveRecord::Base.connection_handler.
# All Active Record models use this handler to determine the connection pool that they
# should use.
#
# The ConnectionHandler class is not coupled with the Active Record models, as it has no knowledge
# about the model. The model needs to pass a connection specification name to the handler,
# in order to look up the correct connection pool.
class ConnectionHandler
FINALIZER = lambda { |_| ActiveSupport::ForkTracker.check! }
private_constant :FINALIZER
class StringConnectionOwner # :nodoc:
attr_reader :name
def initialize(name)
@name = name
end
def primary_class?
false
end
def current_preventing_writes
false
end
end
def initialize
# These caches are keyed by pool_config.connection_specification_name (PoolConfig#connection_specification_name).
@owner_to_pool_manager = Concurrent::Map.new(initial_capacity: 2)
# Backup finalizer: if the forked child skipped Kernel#fork the early discard has not occurred
ObjectSpace.define_finalizer self, FINALIZER
end
def prevent_writes # :nodoc:
ActiveSupport::IsolatedExecutionState[:active_record_prevent_writes]
end
def prevent_writes=(prevent_writes) # :nodoc:
ActiveSupport::IsolatedExecutionState[:active_record_prevent_writes] = prevent_writes
end
# Prevent writing to the database regardless of role.
#
# In some cases you may want to prevent writes to the database
# even if you are on a database that can write. +while_preventing_writes+
# will prevent writes to the database for the duration of the block.
#
# This method does not provide the same protection as a readonly
# user and is meant to be a safeguard against accidental writes.
#
# See +READ_QUERY+ for the queries that are blocked by this
# method.
def while_preventing_writes(enabled = true)
unless ActiveRecord.legacy_connection_handling
raise NotImplementedError, "`while_preventing_writes` is only available on the connection_handler with legacy_connection_handling"
end
original, self.prevent_writes = self.prevent_writes, enabled
yield
ensure
self.prevent_writes = original
end
def connection_pool_names # :nodoc:
owner_to_pool_manager.keys
end
def all_connection_pools
owner_to_pool_manager.values.flat_map { |m| m.pool_configs.map(&:pool) }
end
def connection_pool_list(role = ActiveRecord::Base.current_role)
owner_to_pool_manager.values.flat_map { |m| m.pool_configs(role).map(&:pool) }
end
alias :connection_pools :connection_pool_list
def establish_connection(config, owner_name: Base, role: ActiveRecord::Base.current_role, shard: Base.current_shard)
owner_name = StringConnectionOwner.new(config.to_s) if config.is_a?(Symbol)
pool_config = resolve_pool_config(config, owner_name, role, shard)
db_config = pool_config.db_config
# Protects the connection named `ActiveRecord::Base` from being removed
# if the user calls `establish_connection :primary`.
if owner_to_pool_manager.key?(pool_config.connection_specification_name)
remove_connection_pool(pool_config.connection_specification_name, role: role, shard: shard)
end
message_bus = ActiveSupport::Notifications.instrumenter
payload = {}
if pool_config
payload[:spec_name] = pool_config.connection_specification_name
payload[:shard] = shard
payload[:config] = db_config.configuration_hash
end
if ActiveRecord.legacy_connection_handling
owner_to_pool_manager[pool_config.connection_specification_name] ||= LegacyPoolManager.new
else
owner_to_pool_manager[pool_config.connection_specification_name] ||= PoolManager.new
end
pool_manager = get_pool_manager(pool_config.connection_specification_name)
pool_manager.set_pool_config(role, shard, pool_config)
message_bus.instrument("!connection.active_record", payload) do
pool_config.pool
end
end
# Returns true if there are any active connections among the connection
# pools that the ConnectionHandler is managing.
def active_connections?(role = ActiveRecord::Base.current_role)
connection_pool_list(role).any?(&:active_connection?)
end
# Returns any connections in use by the current thread back to the pool,
# and also returns connections to the pool cached by threads that are no
# longer alive.
def clear_active_connections!(role = ActiveRecord::Base.current_role)
connection_pool_list(role).each(&:release_connection)
end
# Clears the cache which maps classes.
#
# See ConnectionPool#clear_reloadable_connections! for details.
def clear_reloadable_connections!(role = ActiveRecord::Base.current_role)
connection_pool_list(role).each(&:clear_reloadable_connections!)
end
def clear_all_connections!(role = ActiveRecord::Base.current_role)
connection_pool_list(role).each(&:disconnect!)
end
# Disconnects all currently idle connections.
#
# See ConnectionPool#flush! for details.
def flush_idle_connections!(role = ActiveRecord::Base.current_role)
connection_pool_list(role).each(&:flush!)
end
# Locate the connection of the nearest super class. This can be an
# active or defined connection: if it is the latter, it will be
# opened and set as the active connection for the class it was defined
# for (not necessarily the current class).
def retrieve_connection(spec_name, role: ActiveRecord::Base.current_role, shard: ActiveRecord::Base.current_shard) # :nodoc:
pool = retrieve_connection_pool(spec_name, role: role, shard: shard)
unless pool
if shard != ActiveRecord::Base.default_shard
message = "No connection pool for '#{spec_name}' found for the '#{shard}' shard."
elsif ActiveRecord::Base.connection_handler != ActiveRecord::Base.default_connection_handler
message = "No connection pool for '#{spec_name}' found for the '#{ActiveRecord::Base.current_role}' role."
elsif role != ActiveRecord::Base.default_role
message = "No connection pool for '#{spec_name}' found for the '#{role}' role."
else
message = "No connection pool for '#{spec_name}' found."
end
raise ConnectionNotEstablished, message
end
pool.connection
end
# Returns true if a connection that's accessible to this class has
# already been opened.
def connected?(spec_name, role: ActiveRecord::Base.current_role, shard: ActiveRecord::Base.current_shard)
pool = retrieve_connection_pool(spec_name, role: role, shard: shard)
pool && pool.connected?
end
def remove_connection_pool(owner, role: ActiveRecord::Base.current_role, shard: ActiveRecord::Base.current_shard)
if pool_manager = get_pool_manager(owner)
pool_config = pool_manager.remove_pool_config(role, shard)
if pool_config
pool_config.disconnect!
pool_config.db_config
end
end
end
# Retrieving the connection pool happens a lot, so we cache it in @owner_to_pool_manager.
# This makes retrieving the connection pool O(1) once the process is warm.
# When a connection is established or removed, we invalidate the cache.
def retrieve_connection_pool(owner, role: ActiveRecord::Base.current_role, shard: ActiveRecord::Base.current_shard)
pool_config = get_pool_manager(owner)&.get_pool_config(role, shard)
pool_config&.pool
end
private
attr_reader :owner_to_pool_manager
# Returns the pool manager for an owner.
def get_pool_manager(owner)
owner_to_pool_manager[owner]
end
# Returns an instance of PoolConfig for a given adapter.
# Accepts a hash one layer deep that contains all connection information.
#
# == Example
#
# config = { "production" => { "host" => "localhost", "database" => "foo", "adapter" => "sqlite3" } }
# pool_config = Base.configurations.resolve_pool_config(:production)
# pool_config.db_config.configuration_hash
# # => { host: "localhost", database: "foo", adapter: "sqlite3" }
#
def resolve_pool_config(config, owner_name, role, shard)
db_config = Base.configurations.resolve(config)
raise(AdapterNotSpecified, "database configuration does not specify adapter") unless db_config.adapter
# Require the adapter itself and give useful feedback about
# 1. Missing adapter gems and
# 2. Adapter gems' missing dependencies.
path_to_adapter = "active_record/connection_adapters/#{db_config.adapter}_adapter"
begin
require path_to_adapter
rescue LoadError => e
# We couldn't require the adapter itself. Raise an exception that
# points out config typos and missing gems.
if e.path == path_to_adapter
# We can assume that a non-builtin adapter was specified, so it's
# either misspelled or missing from Gemfile.
raise LoadError, "Could not load the '#{db_config.adapter}' Active Record adapter. Ensure that the adapter is spelled correctly in config/database.yml and that you've added the necessary adapter gem to your Gemfile.", e.backtrace
# Bubbled up from the adapter require. Prefix the exception message
# with some guidance about how to address it and reraise.
else
raise LoadError, "Error loading the '#{db_config.adapter}' Active Record adapter. Missing a gem it depends on? #{e.message}", e.backtrace
end
end
unless ActiveRecord::Base.respond_to?(db_config.adapter_method)
raise AdapterNotFound, "database configuration specifies nonexistent #{db_config.adapter} adapter"
end
ConnectionAdapters::PoolConfig.new(owner_name, db_config, role, shard)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionHandling
RAILS_ENV = -> { (Rails.env if defined?(Rails.env)) || ENV["RAILS_ENV"].presence || ENV["RACK_ENV"].presence }
DEFAULT_ENV = -> { RAILS_ENV.call || "default_env" }
# Establishes the connection to the database. Accepts a hash as input where
# the <tt>:adapter</tt> key must be specified with the name of a database adapter (in lower-case).
# Example for regular databases (MySQL, PostgreSQL, etc.):
#
# ActiveRecord::Base.establish_connection(
# adapter: "mysql2",
# host: "localhost",
# username: "myuser",
# password: "mypass",
# database: "somedatabase"
# )
#
# Example for SQLite database:
#
# ActiveRecord::Base.establish_connection(
# adapter: "sqlite3",
# database: "path/to/dbfile"
# )
#
# Also accepts keys as strings (for parsing from YAML for example):
#
# ActiveRecord::Base.establish_connection(
# "adapter" => "sqlite3",
# "database" => "path/to/dbfile"
# )
#
# Or a URL:
#
# ActiveRecord::Base.establish_connection(
# "postgres://myuser:mypass@localhost/somedatabase"
# )
#
# In case {ActiveRecord::Base.configurations}[rdoc-ref:Core.configurations]
# is set (Rails automatically loads the contents of config/database.yml into it),
# a symbol can also be given as argument, representing a key in the
# configuration hash:
#
# ActiveRecord::Base.establish_connection(:production)
#
# The exceptions AdapterNotSpecified, AdapterNotFound, and +ArgumentError+
# may be raised on an error.
def establish_connection(config_or_env = nil)
config_or_env ||= DEFAULT_ENV.call.to_sym
db_config, owner_name = resolve_config_for_connection(config_or_env)
connection_handler.establish_connection(db_config, owner_name: owner_name, role: current_role, shard: current_shard)
end
# Connects a model to the databases specified. The +database+ keyword
# takes a hash consisting of a +role+ and a +database_key+.
#
# This will create a connection handler for switching between connections,
# look up the config hash using the +database_key+, and finally
# establish a connection to that config.
#
# class AnimalsModel < ApplicationRecord
# self.abstract_class = true
#
# connects_to database: { writing: :primary, reading: :primary_replica }
# end
#
# +connects_to+ also supports horizontal sharding. The horizontal sharding API
# also supports read replicas. Connect a model to a list of shards like this:
#
# class AnimalsModel < ApplicationRecord
# self.abstract_class = true
#
# connects_to shards: {
# default: { writing: :primary, reading: :primary_replica },
# shard_two: { writing: :primary_shard_two, reading: :primary_shard_replica_two }
# }
# end
#
# Returns an array of database connections.
def connects_to(database: {}, shards: {})
raise NotImplementedError, "`connects_to` can only be called on ActiveRecord::Base or abstract classes" unless self == Base || abstract_class?
if database.present? && shards.present?
raise ArgumentError, "`connects_to` can only accept a `database` or `shards` argument, but not both arguments."
end
connections = []
database.each do |role, database_key|
db_config, owner_name = resolve_config_for_connection(database_key)
handler = lookup_connection_handler(role.to_sym)
self.connection_class = true
connections << handler.establish_connection(db_config, owner_name: owner_name, role: role)
end
shards.each do |shard, database_keys|
database_keys.each do |role, database_key|
db_config, owner_name = resolve_config_for_connection(database_key)
handler = lookup_connection_handler(role.to_sym)
self.connection_class = true
connections << handler.establish_connection(db_config, owner_name: owner_name, role: role, shard: shard.to_sym)
end
end
connections
end
# Connects to a role (e.g. writing, reading, or a custom role) and/or
# shard for the duration of the block. At the end of the block the
# connection will be returned to the original role / shard.
#
# If only a role is passed, Active Record will look up the connection
# based on the requested role. If a non-established role is requested
# an +ActiveRecord::ConnectionNotEstablished+ error will be raised:
#
# ActiveRecord::Base.connected_to(role: :writing) do
# Dog.create! # creates dog using dog writing connection
# end
#
# ActiveRecord::Base.connected_to(role: :reading) do
# Dog.create! # throws exception because we're on a replica
# end
#
# When swapping to a shard, the role must be passed as well. If a non-existent
# shard is passed, an +ActiveRecord::ConnectionNotEstablished+ error will be
# raised.
#
# When a shard and role are passed, Active Record will first look up the role,
# and then look up the connection by the shard key.
#
# ActiveRecord::Base.connected_to(role: :reading, shard: :shard_one_replica) do
# Dog.first # finds first Dog record stored on the shard one replica
# end
def connected_to(role: nil, shard: nil, prevent_writes: false, &blk)
if ActiveRecord.legacy_connection_handling
if self != Base
raise NotImplementedError, "`connected_to` can only be called on ActiveRecord::Base with legacy connection handling."
end
else
if self != Base && !abstract_class
raise NotImplementedError, "calling `connected_to` is only allowed on ActiveRecord::Base or abstract classes."
end
if name != connection_specification_name && !primary_class?
raise NotImplementedError, "calling `connected_to` is only allowed on the abstract class that established the connection."
end
end
unless role || shard
raise ArgumentError, "must provide a `shard` and/or `role`."
end
with_role_and_shard(role, shard, prevent_writes, &blk)
end
# Connects a role and/or shard to the provided connection names. Optionally +prevent_writes+
# can be passed to block writes on a connection. The +reading+ role will
# automatically set +prevent_writes+ to true.
#
# +connected_to_many+ is an alternative to deeply nested +connected_to+ blocks.
#
# Usage:
#
# ActiveRecord::Base.connected_to_many(AnimalsRecord, MealsRecord, role: :reading) do
# Dog.first # Read from animals replica
# Dinner.first # Read from meals replica
# Person.first # Read from primary writer
# end
def connected_to_many(*classes, role:, shard: nil, prevent_writes: false)
classes = classes.flatten
if ActiveRecord.legacy_connection_handling
raise NotImplementedError, "connected_to_many is not available with legacy connection handling"
end
if self != Base || classes.include?(Base)
raise NotImplementedError, "connected_to_many can only be called on ActiveRecord::Base."
end
prevent_writes = true if role == ActiveRecord.reading_role
append_to_connected_to_stack(role: role, shard: shard, prevent_writes: prevent_writes, klasses: classes)
yield
ensure
connected_to_stack.pop
end
# Use a specified connection.
#
# This method is useful for ensuring that a specific connection is
# being used. For example, when booting a console in readonly mode.
#
# It is not recommended to use this method in a request since it
# does not yield to a block like +connected_to+.
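# A sketch of console-style usage (+Dog+ is the illustrative model used above):
#
# ActiveRecord::Base.connecting_to(role: :reading)
#
# Dog.first # reads from the connection established for the reading role
# Dog.create! # rejected, because the reading role sets prevent_writes to true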
def connecting_to(role: default_role, shard: default_shard, prevent_writes: false)
if ActiveRecord.legacy_connection_handling
raise NotImplementedError, "`connecting_to` is not available with `legacy_connection_handling`."
end
prevent_writes = true if role == ActiveRecord.reading_role
append_to_connected_to_stack(role: role, shard: shard, prevent_writes: prevent_writes, klasses: [self])
end
# Prohibit swapping shards while inside of the passed block.
#
# In some cases you may want to be able to swap shards but not allow a
# nested call to connected_to or connected_to_many to swap again. This
# is useful in cases where you're using sharding to provide per-request
# database isolation.
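# A minimal sketch (shard names are illustrative):
#
# ActiveRecord::Base.prohibit_shard_swapping do
# # any attempt to swap the shard inside this block raises an ArgumentError
# ActiveRecord::Base.connected_to(role: :writing, shard: :shard_two) { Dog.first }
# end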
def prohibit_shard_swapping(enabled = true)
prev_value = ActiveSupport::IsolatedExecutionState[:active_record_prohibit_shard_swapping]
ActiveSupport::IsolatedExecutionState[:active_record_prohibit_shard_swapping] = enabled
yield
ensure
ActiveSupport::IsolatedExecutionState[:active_record_prohibit_shard_swapping] = prev_value
end
# Determine whether or not shard swapping is currently prohibited
def shard_swapping_prohibited?
ActiveSupport::IsolatedExecutionState[:active_record_prohibit_shard_swapping]
end
# Prevent writing to the database regardless of role.
#
# In some cases you may want to prevent writes to the database
# even if you are on a database that can write. +while_preventing_writes+
# will prevent writes to the database for the duration of the block.
#
# This method does not provide the same protection as a readonly
# user and is meant to be a safeguard against accidental writes.
#
# See +READ_QUERY+ for the queries that are blocked by this
# method.
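# A short sketch (+Dog+ as in the examples above):
#
# ActiveRecord::Base.while_preventing_writes do
# Dog.first # reads are still allowed
# Dog.create! # write queries are rejected for the duration of the block
# end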
def while_preventing_writes(enabled = true, &block)
if ActiveRecord.legacy_connection_handling
connection_handler.while_preventing_writes(enabled, &block)
else
connected_to(role: current_role, prevent_writes: enabled, &block)
end
end
# Returns true if role is the current connected role.
#
# ActiveRecord::Base.connected_to(role: :writing) do
# ActiveRecord::Base.connected_to?(role: :writing) #=> true
# ActiveRecord::Base.connected_to?(role: :reading) #=> false
# end
def connected_to?(role:, shard: ActiveRecord::Base.default_shard)
current_role == role.to_sym && current_shard == shard.to_sym
end
def lookup_connection_handler(handler_key) # :nodoc:
if ActiveRecord.legacy_connection_handling
handler_key ||= ActiveRecord.writing_role
connection_handlers[handler_key] ||= ActiveRecord::ConnectionAdapters::ConnectionHandler.new
else
ActiveRecord::Base.connection_handler
end
end
# Clears the query cache for all connections associated with the current thread.
def clear_query_caches_for_current_thread
if ActiveRecord.legacy_connection_handling
ActiveRecord::Base.connection_handlers.each_value do |handler|
clear_on_handler(handler)
end
else
clear_on_handler(ActiveRecord::Base.connection_handler)
end
end
# Returns the connection currently associated with the class. This can
# also be used to "borrow" the connection to do database work unrelated
# to any of the specific Active Records.
def connection
retrieve_connection
end
attr_writer :connection_specification_name
# Return the connection specification name from the current class or its parent.
def connection_specification_name
if !defined?(@connection_specification_name) || @connection_specification_name.nil?
return self == Base ? Base.name : superclass.connection_specification_name
end
@connection_specification_name
end
def primary_class? # :nodoc:
self == Base || application_record_class?
end
# Returns the db_config object from the associated connection:
#
# ActiveRecord::Base.connection_db_config
# #<ActiveRecord::DatabaseConfigurations::HashConfig:0x00007fd1acbded10 @env_name="development",
# @name="primary", @config={pool: 5, timeout: 5000, database: "db/development.sqlite3", adapter: "sqlite3"}>
#
# Use only for reading.
def connection_db_config
connection_pool.db_config
end
def connection_pool
connection_handler.retrieve_connection_pool(connection_specification_name, role: current_role, shard: current_shard) || raise(ConnectionNotEstablished)
end
def retrieve_connection
connection_handler.retrieve_connection(connection_specification_name, role: current_role, shard: current_shard)
end
# Returns +true+ if Active Record is connected.
def connected?
connection_handler.connected?(connection_specification_name, role: current_role, shard: current_shard)
end
def remove_connection(name = nil)
name ||= @connection_specification_name if defined?(@connection_specification_name)
# if removing a connection that has a pool, we reset the
# connection_specification_name so it will use the parent
# pool.
if connection_handler.retrieve_connection_pool(name, role: current_role, shard: current_shard)
self.connection_specification_name = nil
end
connection_handler.remove_connection_pool(name, role: current_role, shard: current_shard)
end
def clear_cache! # :nodoc:
connection.schema_cache.clear!
end
delegate :clear_active_connections!, :clear_reloadable_connections!,
:clear_all_connections!, :flush_idle_connections!, to: :connection_handler
private
def clear_on_handler(handler)
handler.all_connection_pools.each do |pool|
pool.connection.clear_query_cache if pool.active_connection?
end
end
def resolve_config_for_connection(config_or_env)
raise "Anonymous class is not allowed." unless name
owner_name = primary_class? ? Base.name : name
self.connection_specification_name = owner_name
db_config = Base.configurations.resolve(config_or_env)
[db_config, self]
end
def with_handler(handler_key, &blk)
handler = lookup_connection_handler(handler_key)
swap_connection_handler(handler, &blk)
end
def with_role_and_shard(role, shard, prevent_writes)
prevent_writes = true if role == ActiveRecord.reading_role
if ActiveRecord.legacy_connection_handling
with_handler(role.to_sym) do
connection_handler.while_preventing_writes(prevent_writes) do
append_to_connected_to_stack(shard: shard, klasses: [self])
yield
end
end
else
append_to_connected_to_stack(role: role, shard: shard, prevent_writes: prevent_writes, klasses: [self])
return_value = yield
return_value.load if return_value.is_a? ActiveRecord::Relation
return_value
end
ensure
self.connected_to_stack.pop
end
def append_to_connected_to_stack(entry)
if shard_swapping_prohibited? && entry[:shard].present?
raise ArgumentError, "cannot swap `shard` while shard swapping is prohibited."
end
connected_to_stack << entry
end
def swap_connection_handler(handler, &blk) # :nodoc:
old_handler, ActiveRecord::Base.connection_handler = ActiveRecord::Base.connection_handler, handler
return_value = yield
return_value.load if return_value.is_a? ActiveRecord::Relation
return_value
ensure
ActiveRecord::Base.connection_handler = old_handler
end
end
end
# frozen_string_literal: true
require "thread"
require "concurrent/map"
require "monitor"
require "active_record/connection_adapters/abstract/connection_pool/queue"
require "active_record/connection_adapters/abstract/connection_pool/reaper"
module ActiveRecord
module ConnectionAdapters
module AbstractPool # :nodoc:
def get_schema_cache(connection)
self.schema_cache ||= SchemaCache.new(connection)
schema_cache.connection = connection
schema_cache
end
def set_schema_cache(cache)
self.schema_cache = cache
end
def lazily_set_schema_cache
return unless ActiveRecord.lazily_load_schema_cache
cache = SchemaCache.load_from(db_config.lazy_schema_cache_path)
set_schema_cache(cache)
end
end
class NullPool # :nodoc:
include ConnectionAdapters::AbstractPool
attr_accessor :schema_cache
def connection_class; end
def checkin(_); end
def remove(_); end
def async_executor; end
end
# Connection pool base class for managing Active Record database
# connections.
#
# == Introduction
#
# A connection pool synchronizes thread access to a limited number of
# database connections. The basic idea is that each thread checks out a
# database connection from the pool, uses that connection, and checks the
# connection back in. ConnectionPool is completely thread-safe, and will
# ensure that a connection cannot be used by two threads at the same time,
# as long as ConnectionPool's contract is correctly followed. It will also
# handle cases in which there are more threads than connections: if all
# connections have been checked out, and a thread tries to checkout a
# connection anyway, then ConnectionPool will wait until some other thread
# has checked in a connection.
#
# == Obtaining (checking out) a connection
#
# Connections can be obtained and used from a connection pool in several
# ways:
#
# 1. Simply use {ActiveRecord::Base.connection}[rdoc-ref:ConnectionHandling.connection]
# as with Active Record 2.1 and
# earlier (pre-connection-pooling). Eventually, when you're done with
# the connection(s) and wish it to be returned to the pool, you call
# {ActiveRecord::Base.clear_active_connections!}[rdoc-ref:ConnectionAdapters::ConnectionHandler#clear_active_connections!].
# This will be the default behavior for Active Record when used in conjunction with
# Action Pack's request handling cycle.
# 2. Manually check out a connection from the pool with
# {ActiveRecord::Base.connection_pool.checkout}[rdoc-ref:#checkout]. You are responsible for
# returning this connection to the pool when finished by calling
# {ActiveRecord::Base.connection_pool.checkin(connection)}[rdoc-ref:#checkin].
# 3. Use {ActiveRecord::Base.connection_pool.with_connection(&block)}[rdoc-ref:#with_connection], which
# obtains a connection, yields it as the sole argument to the block,
# and returns it to the pool after the block completes.
#
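# For example, a minimal use of the third approach:
#
# ActiveRecord::Base.connection_pool.with_connection do |conn|
# conn.execute("SELECT 1") # conn is returned to the pool when the block exits
# end
#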
# Connections in the pool are actually AbstractAdapter objects (or objects
# compatible with AbstractAdapter's interface).
#
# == Options
#
# There are several connection-pooling-related options that you can add to
# your database connection configuration:
#
# * +pool+: maximum number of connections the pool may manage (default 5).
# * +idle_timeout+: number of seconds that a connection will be kept
# unused in the pool before it is automatically disconnected (default
# 300 seconds). Set this to zero to keep connections forever.
# * +checkout_timeout+: number of seconds to wait for a connection to
# become available before giving up and raising a timeout error (default
# 5 seconds).
#
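#
# For example, a database.yml entry using these options might look like this
# (adapter and values are illustrative):
#
#   production:
#     adapter: postgresql
#     database: my_app_production
#     pool: 10
#     idle_timeout: 300
#     checkout_timeout: 5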
#--
# Synchronization policy:
# * all public methods can be called outside +synchronize+
# * access to these instance variables needs to be in +synchronize+:
# * @connections
# * @now_connecting
# * private methods that require being called in a +synchronize+ block
# are now explicitly documented
class ConnectionPool
include MonitorMixin
include QueryCache::ConnectionPoolConfiguration
include ConnectionAdapters::AbstractPool
attr_accessor :automatic_reconnect, :checkout_timeout
attr_reader :db_config, :size, :reaper, :pool_config, :connection_class, :async_executor, :role, :shard
alias_method :connection_klass, :connection_class
deprecate :connection_klass
delegate :schema_cache, :schema_cache=, to: :pool_config
# Creates a new ConnectionPool object. +pool_config+ is a PoolConfig
# object which describes database connection information (e.g. adapter,
# host name, username, password, etc), as well as the maximum size for
# this ConnectionPool.
#
# The default ConnectionPool maximum size is 5.
def initialize(pool_config)
super()
@pool_config = pool_config
@db_config = pool_config.db_config
@connection_class = pool_config.connection_class
@role = pool_config.role
@shard = pool_config.shard
@checkout_timeout = db_config.checkout_timeout
@idle_timeout = db_config.idle_timeout
@size = db_config.pool
# This variable tracks the cache of threads mapped to reserved connections, with the
# sole purpose of speeding up the +connection+ method. It is not the authoritative
# registry of which thread owns which connection. Connection ownership is tracked by
# the +connection.owner+ attr on each +connection+ instance.
# The invariant works like this: if there is a mapping of <tt>thread => conn</tt>,
# then that +thread+ does indeed own that +conn+. However, the absence of such a
# mapping does not mean that the +thread+ doesn't own that connection. In
# that case the +conn.owner+ attr should be consulted.
# Access and modification of <tt>@thread_cached_conns</tt> does not require
# synchronization.
@thread_cached_conns = Concurrent::Map.new(initial_capacity: @size)
@connections = []
@automatic_reconnect = true
# Connection pool allows for concurrent (outside the main +synchronize+ section)
# establishment of new connections. This variable tracks the number of threads
# currently in the process of independently establishing connections to the DB.
@now_connecting = 0
@threads_blocking_new_connections = 0
@available = ConnectionLeasingQueue.new self
@lock_thread = false
@async_executor = build_async_executor
lazily_set_schema_cache
@reaper = Reaper.new(self, db_config.reaping_frequency)
@reaper.run
end
def lock_thread=(lock_thread)
if lock_thread
@lock_thread = ActiveSupport::IsolatedExecutionState.context
else
@lock_thread = nil
end
end
# Retrieve the connection associated with the current thread, or call
# #checkout to obtain one if necessary.
#
# #connection can be called any number of times; the connection is
# held in a cache keyed by a thread.
def connection
@thread_cached_conns[connection_cache_key(current_thread)] ||= checkout
end
# Returns true if there is an open connection being used for the current thread.
#
# This method only works for connections that have been obtained through
# #connection or #with_connection methods. Connections obtained through
# #checkout will not be detected by #active_connection?
def active_connection?
@thread_cached_conns[connection_cache_key(current_thread)]
end
# Signal that the thread is finished with the current connection.
# #release_connection releases the connection-thread association
# and returns the connection to the pool.
#
# This method only works for connections that have been obtained through
# #connection or #with_connection methods; connections obtained through
# #checkout will not be automatically released.
def release_connection(owner_thread = ActiveSupport::IsolatedExecutionState.context)
if conn = @thread_cached_conns.delete(connection_cache_key(owner_thread))
checkin conn
end
end
# If a connection obtained through #connection or #with_connection methods
# already exists, yield it to the block. If no such connection
# exists, checkout a connection, yield it to the block, and checkin the
# connection when finished.
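#
# A sketch of the reuse behaviour (the query is a placeholder):
#
#   ActiveRecord::Base.connection # leases a connection for the current thread
#   ActiveRecord::Base.connection_pool.with_connection do |conn|
#     conn.select_value("SELECT 1") # yields the already-leased connection
#   end
#   # the connection is still leased here; release it explicitly with
#   # ActiveRecord::Base.connection_pool.release_connection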
def with_connection
unless conn = @thread_cached_conns[connection_cache_key(ActiveSupport::IsolatedExecutionState.context)]
conn = connection
fresh_connection = true
end
yield conn
ensure
release_connection if fresh_connection
end
# Returns true if a connection has already been opened.
def connected?
synchronize { @connections.any? }
end
# Returns an array containing the connections currently in the pool.
# Access to the array does not require synchronization on the pool because
# the array is newly created and not retained by the pool.
#
# However, this method bypasses the ConnectionPool's thread-safe connection
# access pattern. A returned connection may be owned by another thread,
# unowned, or, by happenstance, owned by the calling thread.
#
# Calling methods on a connection without ownership is subject to the
# thread-safety guarantees of the underlying method. Many of the methods
# on connection adapter classes are inherently multi-thread unsafe.
def connections
synchronize { @connections.dup }
end
# Disconnects all connections in the pool, and clears the pool.
#
# Raises:
# - ActiveRecord::ExclusiveConnectionTimeoutError if unable to gain ownership of all
# connections in the pool within a timeout interval (default duration is
# <tt>spec.db_config.checkout_timeout * 2</tt> seconds).
def disconnect(raise_on_acquisition_timeout = true)
with_exclusively_acquired_all_connections(raise_on_acquisition_timeout) do
synchronize do
@connections.each do |conn|
if conn.in_use?
conn.steal!
checkin conn
end
conn.disconnect!
end
@connections = []
@available.clear
end
end
end
# Disconnects all connections in the pool, and clears the pool.
#
# The pool first tries to gain ownership of all connections. If unable to
# do so within a timeout interval (default duration is
# <tt>spec.db_config.checkout_timeout * 2</tt> seconds), then the pool is forcefully
# disconnected without any regard for other connection owning threads.
def disconnect!
disconnect(false)
end
# Discards all connections in the pool (even if they're currently
# leased!), along with the pool itself. Any further interaction with the
# pool (except #spec and #schema_cache) is undefined.
#
# See AbstractAdapter#discard!
def discard! # :nodoc:
synchronize do
return if self.discarded?
@connections.each do |conn|
conn.discard!
end
@connections = @available = @thread_cached_conns = nil
end
end
def discarded? # :nodoc:
@connections.nil?
end
# Clears the cache which maps classes and re-connects connections that
# require reloading.
#
# Raises:
# - ActiveRecord::ExclusiveConnectionTimeoutError if unable to gain ownership of all
# connections in the pool within a timeout interval (default duration is
# <tt>spec.db_config.checkout_timeout * 2</tt> seconds).
def clear_reloadable_connections(raise_on_acquisition_timeout = true)
with_exclusively_acquired_all_connections(raise_on_acquisition_timeout) do
synchronize do
@connections.each do |conn|
if conn.in_use?
conn.steal!
checkin conn
end
conn.disconnect! if conn.requires_reloading?
end
@connections.delete_if(&:requires_reloading?)
@available.clear
end
end
end
# Clears the cache which maps classes and re-connects connections that
# require reloading.
#
# The pool first tries to gain ownership of all connections. If unable to
# do so within a timeout interval (default duration is
# <tt>spec.db_config.checkout_timeout * 2</tt> seconds), then the pool forcefully
# clears the cache and reloads connections without any regard for other
# connection owning threads.
def clear_reloadable_connections!
clear_reloadable_connections(false)
end
# Check-out a database connection from the pool, indicating that you want
# to use it. You should call #checkin when you no longer need this.
#
# This is done by either returning and leasing an existing connection, or by
# creating a new connection and leasing it.
#
# If all connections are leased and the pool is at capacity (meaning the
# number of currently leased connections is greater than or equal to the
# size limit set), an ActiveRecord::ConnectionTimeoutError exception will be raised.
#
# Returns: an AbstractAdapter object.
#
# Raises:
# - ActiveRecord::ConnectionTimeoutError if no connection can be obtained from the pool.
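#
# A minimal usage sketch, pairing #checkout with #checkin in an +ensure+
# block so the connection is always returned (the query is a placeholder):
#
#   pool = ActiveRecord::Base.connection_pool
#   conn = pool.checkout
#   begin
#     conn.execute("SELECT 1")
#   ensure
#     pool.checkin(conn)
#   end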
def checkout(checkout_timeout = @checkout_timeout)
checkout_and_verify(acquire_connection(checkout_timeout))
end
# Check-in a database connection back into the pool, indicating that you
# no longer need this connection.
#
# +conn+: an AbstractAdapter object, which was obtained earlier by
# calling #checkout on this pool.
def checkin(conn)
conn.lock.synchronize do
synchronize do
remove_connection_from_thread_cache conn
conn._run_checkin_callbacks do
conn.expire
end
@available.add conn
end
end
end
# Remove a connection from the connection pool. The connection will
# remain open and active but will no longer be managed by this pool.
def remove(conn)
needs_new_connection = false
synchronize do
remove_connection_from_thread_cache conn
@connections.delete conn
@available.delete conn
# @available.any_waiting? => true means that prior to removing this
# conn, the pool was at its max size (@connections.size == @size).
# This would mean that any threads stuck waiting in the queue wouldn't
# know they could checkout_new_connection, so let's do it for them.
# Because the condition-wait loop is encapsulated in the Queue class
# (which in turn is oblivious to the ConnectionPool implementation), threads
# that are "stuck" there are helpless. They have no way of creating
# new connections and are completely reliant on us feeding available
# connections into the Queue.
needs_new_connection = @available.any_waiting?
end
# This is intentionally done outside of the synchronized section as we
# would like not to hold the main mutex while checking out new connections.
# Thus there is some chance that the needs_new_connection information is now
# stale; we can live with that (bulk_make_new_connections will make
# sure not to exceed the pool's @size limit).
bulk_make_new_connections(1) if needs_new_connection
end
# Recover lost connections for the pool. A lost connection can occur if
# a programmer forgets to checkin a connection at the end of a thread
# or a thread dies unexpectedly.
def reap
stale_connections = synchronize do
return if self.discarded?
@connections.select do |conn|
conn.in_use? && !conn.owner.alive?
end.each do |conn|
conn.steal!
end
end
stale_connections.each do |conn|
if conn.active?
conn.reset!
checkin conn
else
remove conn
end
end
end
# Disconnect all connections that have been idle for at least
# +minimum_idle+ seconds. Connections currently checked out, or that were
# checked in less than +minimum_idle+ seconds ago, are unaffected.
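#
# For example, to disconnect connections that have been idle for at least
# 30 seconds (a sketch):
#
#   ActiveRecord::Base.connection_pool.flush(30)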
def flush(minimum_idle = @idle_timeout)
return if minimum_idle.nil?
idle_connections = synchronize do
return if self.discarded?
@connections.select do |conn|
!conn.in_use? && conn.seconds_idle >= minimum_idle
end.each do |conn|
conn.lease
@available.delete conn
@connections.delete conn
end
end
idle_connections.each do |conn|
conn.disconnect!
end
end
# Disconnect all currently idle connections. Connections currently checked
# out are unaffected.
def flush!
reap
flush(-1)
end
def num_waiting_in_queue # :nodoc:
@available.num_waiting
end
# Returns the connection pool's usage statistics.
# Example:
#
# ActiveRecord::Base.connection_pool.stat # => { size: 15, connections: 1, busy: 1, dead: 0, idle: 0, waiting: 0, checkout_timeout: 5 }
def stat
synchronize do
{
size: size,
connections: @connections.size,
busy: @connections.count { |c| c.in_use? && c.owner.alive? },
dead: @connections.count { |c| c.in_use? && !c.owner.alive? },
idle: @connections.count { |c| !c.in_use? },
waiting: num_waiting_in_queue,
checkout_timeout: checkout_timeout
}
end
end
def schedule_query(future_result) # :nodoc:
@async_executor.post { future_result.execute_or_skip }
Thread.pass
end
private
def build_async_executor
case ActiveRecord.async_query_executor
when :multi_thread_pool
if @db_config.max_threads > 0
Concurrent::ThreadPoolExecutor.new(
min_threads: @db_config.min_threads,
max_threads: @db_config.max_threads,
max_queue: @db_config.max_queue,
fallback_policy: :caller_runs
)
end
when :global_thread_pool
ActiveRecord.global_thread_pool_async_query_executor
end
end
#--
# this is unfortunately not concurrent
def bulk_make_new_connections(num_new_conns_needed)
num_new_conns_needed.times do
# try_to_checkout_new_connection will not exceed pool's @size limit
if new_conn = try_to_checkout_new_connection
# make the new_conn available to the starving threads stuck in the @available Queue
checkin(new_conn)
end
end
end
#--
# From the discussion on GitHub:
# https://github.com/rails/rails/pull/14938#commitcomment-6601951
# This hook-in method allows for easier monkey-patching fixes needed by
# JRuby users that use Fibers.
def connection_cache_key(thread)
thread
end
def current_thread
@lock_thread || ActiveSupport::IsolatedExecutionState.context
end
# Take control of all existing connections so a "group" action such as
# reload/disconnect can be performed safely. It is no longer enough to
# wrap it in +synchronize+ because some of the pool's actions are allowed
# to be performed outside of the main +synchronize+ block.
def with_exclusively_acquired_all_connections(raise_on_acquisition_timeout = true)
with_new_connections_blocked do
attempt_to_checkout_all_existing_connections(raise_on_acquisition_timeout)
yield
end
end
def attempt_to_checkout_all_existing_connections(raise_on_acquisition_timeout = true)
collected_conns = synchronize do
# account for our own connections
@connections.select { |conn| conn.owner == ActiveSupport::IsolatedExecutionState.context }
end
newly_checked_out = []
timeout_time = Process.clock_gettime(Process::CLOCK_MONOTONIC) + (@checkout_timeout * 2)
@available.with_a_bias_for(ActiveSupport::IsolatedExecutionState.context) do
loop do
synchronize do
return if collected_conns.size == @connections.size && @now_connecting == 0
remaining_timeout = timeout_time - Process.clock_gettime(Process::CLOCK_MONOTONIC)
remaining_timeout = 0 if remaining_timeout < 0
conn = checkout_for_exclusive_access(remaining_timeout)
collected_conns << conn
newly_checked_out << conn
end
end
end
rescue ExclusiveConnectionTimeoutError
# <tt>raise_on_acquisition_timeout == false</tt> means we are directed to ignore any
# timeouts and are expected to just give up: we've obtained as many connections
# as possible. Note that in such a case we don't return any of the
# +newly_checked_out+ connections.
if raise_on_acquisition_timeout
release_newly_checked_out = true
raise
end
rescue Exception # if something else went wrong
# this can't be a "naked" rescue, because we should return conns
# even for non-StandardErrors
release_newly_checked_out = true
raise
ensure
if release_newly_checked_out && newly_checked_out
# releasing only those conns that were checked out in this method; conns
# checked out outside this method (before it was called) are not for us to release
newly_checked_out.each { |conn| checkin(conn) }
end
end
#--
# Must be called in a synchronize block.
def checkout_for_exclusive_access(checkout_timeout)
checkout(checkout_timeout)
rescue ConnectionTimeoutError
# this block can't be easily moved into attempt_to_checkout_all_existing_connections's
# rescue block, because doing so would put it outside of the synchronize section;
# without being in a critical section, thread_report might become inaccurate
msg = +"could not obtain ownership of all database connections in #{checkout_timeout} seconds"
thread_report = []
@connections.each do |conn|
unless conn.owner == ActiveSupport::IsolatedExecutionState.context
thread_report << "#{conn} is owned by #{conn.owner}"
end
end
msg << " (#{thread_report.join(', ')})" if thread_report.any?
raise ExclusiveConnectionTimeoutError, msg
end
def with_new_connections_blocked
synchronize do
@threads_blocking_new_connections += 1
end
yield
ensure
num_new_conns_required = 0
synchronize do
@threads_blocking_new_connections -= 1
if @threads_blocking_new_connections.zero?
@available.clear
num_new_conns_required = num_waiting_in_queue
@connections.each do |conn|
next if conn.in_use?
@available.add conn
num_new_conns_required -= 1
end
end
end
bulk_make_new_connections(num_new_conns_required) if num_new_conns_required > 0
end
# Acquire a connection by one of 1) immediately removing one
# from the queue of available connections, 2) creating a new
# connection if the pool is not at capacity, 3) waiting on the
# queue for a connection to become available.
#
# Raises:
# - ActiveRecord::ConnectionTimeoutError if a connection could not be acquired
#
#--
# Implementation detail: the connection returned by +acquire_connection+
# will already be "+connection.lease+ -ed" to the current thread.
def acquire_connection(checkout_timeout)
# NOTE: we rely on <tt>@available.poll</tt> and +try_to_checkout_new_connection+ to
# +conn.lease+ the returned connection (and to do this in a +synchronized+
# section). This is not the cleanest implementation, as ideally we would
# <tt>synchronize { conn.lease }</tt> in this method, but by leaving it to <tt>@available.poll</tt>
# and +try_to_checkout_new_connection+ we can piggyback on +synchronize+ sections
# of those methods and avoid additional +synchronize+ overhead.
if conn = @available.poll || try_to_checkout_new_connection
conn
else
reap
@available.poll(checkout_timeout)
end
end
#--
# if the owner_thread param is omitted, this must be called in a synchronize block
def remove_connection_from_thread_cache(conn, owner_thread = conn.owner)
@thread_cached_conns.delete_pair(connection_cache_key(owner_thread), conn)
end
alias_method :release, :remove_connection_from_thread_cache
def new_connection
Base.public_send(db_config.adapter_method, db_config.configuration_hash).tap do |conn|
conn.check_version
end
end
# If the pool is not at a <tt>@size</tt> limit, establish a new connection. Connecting
# to the DB is done outside the main synchronized section.
#--
# Implementation constraint: a newly established connection returned by this
# method must be in the +.leased+ state.
def try_to_checkout_new_connection
# first, in a synchronized section, check if establishing new conns is allowed
# and increment @now_connecting, to prevent overstepping this pool's @size
# constraint
do_checkout = synchronize do
if @threads_blocking_new_connections.zero? && (@connections.size + @now_connecting) < @size
@now_connecting += 1
end
end
if do_checkout
begin
# if we successfully incremented @now_connecting, establish the new connection
# outside of the synchronized section
conn = checkout_new_connection
ensure
synchronize do
if conn
adopt_connection(conn)
# returned conn needs to be already leased
conn.lease
end
@now_connecting -= 1
end
end
end
end
def adopt_connection(conn)
conn.pool = self
@connections << conn
end
def checkout_new_connection
raise ConnectionNotEstablished unless @automatic_reconnect
new_connection
end
def checkout_and_verify(c)
c._run_checkout_callbacks do
c.verify!
end
c
rescue
remove c
c.disconnect!
raise
end
end
end
end
# frozen_string_literal: true
require "uri"
require "active_support/core_ext/enumerable"
require "active_support/core_ext/hash/reverse_merge"
module ActiveRecord
class DatabaseConfigurations
# Expands a connection string into a hash.
class ConnectionUrlResolver # :nodoc:
# == Example
#
# url = "postgresql://foo:bar@localhost:9000/foo_test?pool=5&timeout=3000"
# ConnectionUrlResolver.new(url).to_hash
# # => {
# adapter: "postgresql",
# host: "localhost",
# port: 9000,
# database: "foo_test",
# username: "foo",
# password: "bar",
# pool: "5",
# timeout: "3000"
# }
def initialize(url)
raise "Database URL cannot be empty" if url.blank?
@uri = uri_parser.parse(url)
@adapter = @uri.scheme && @uri.scheme.tr("-", "_")
@adapter = "postgresql" if @adapter == "postgres"
if @uri.opaque
@uri.opaque, @query = @uri.opaque.split("?", 2)
else
@query = @uri.query
end
end
# Converts the given URL to a full connection hash.
def to_hash
config = raw_config.compact_blank
config.map { |key, value| config[key] = uri_parser.unescape(value) if value.is_a? String }
config
end
private
attr_reader :uri
def uri_parser
@uri_parser ||= URI::Parser.new
end
# Converts the query parameters of the URI into a hash.
#
# "localhost?pool=5&reaping_frequency=2"
# # => { pool: "5", reaping_frequency: "2" }
#
# Returns an empty hash if no query is present.
#
# "localhost"
# # => {}
def query_hash
Hash[(@query || "").split("&").map { |pair| pair.split("=", 2) }].symbolize_keys
end
def raw_config
if uri.opaque
query_hash.merge(
adapter: @adapter,
database: uri.opaque
)
else
query_hash.reverse_merge(
adapter: @adapter,
username: uri.user,
password: uri.password,
port: uri.port,
database: database_from_path,
host: uri.hostname
)
end
end
# Returns the name of the database.
def database_from_path
if @adapter == "sqlite3"
# 'sqlite3:/foo' is absolute, because that makes sense. The
# corresponding relative version, 'sqlite3:foo', is handled
# elsewhere, as an "opaque".
uri.path
else
# Only SQLite uses a filename as the "database" name; for
# anything else, a leading slash would be silly.
uri.path.delete_prefix("/")
end
end
end
end
end
# frozen_string_literal: true
ActiveRecord::ConnectionAdapters::AbstractAdapter.set_callback(:checkout, :after) do
begin_transaction(joinable: false)
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# An encryption context configures the different entities used to perform encryption:
#
# * A key provider
# * A key generator
# * An encryptor, the facade to encrypt data
# * A cipher, the encryption algorithm
# * A message serializer
class Context
PROPERTIES = %i[ key_provider key_generator cipher message_serializer encryptor frozen_encryption ]
attr_accessor(*PROPERTIES)
def initialize
set_defaults
end
alias frozen_encryption? frozen_encryption
silence_redefinition_of_method :key_provider
def key_provider
@key_provider ||= build_default_key_provider
end
private
def set_defaults
self.frozen_encryption = false
self.key_generator = ActiveRecord::Encryption::KeyGenerator.new
self.cipher = ActiveRecord::Encryption::Cipher.new
self.encryptor = ActiveRecord::Encryption::Encryptor.new
self.message_serializer = ActiveRecord::Encryption::MessageSerializer.new
end
def build_default_key_provider
ActiveRecord::Encryption::DerivedSecretKeyProvider.new(ActiveRecord::Encryption.config.primary_key)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# ActiveRecord::Encryption uses encryption contexts to configure the different entities used to
# encrypt/decrypt at a given moment in time.
#
# By default, the library uses a default encryption context. This is the Context that gets configured
# initially via +config.active_record.encryption+ options. Library users can define nested encryption contexts
# when running blocks of code.
#
# See Context.
module Contexts
extend ActiveSupport::Concern
included do
mattr_reader :default_context, default: Context.new
thread_mattr_accessor :custom_contexts
end
class_methods do
# Configures a custom encryption context to use when running the provided block of code.
#
# It supports overriding all the properties defined in +Context+.
#
# Example:
#
# ActiveRecord::Encryption.with_encryption_context(encryptor: ActiveRecord::Encryption::NullEncryptor.new) do
# ...
# end
#
# Encryption contexts can be nested.
def with_encryption_context(properties)
self.custom_contexts ||= []
self.custom_contexts << default_context.dup
properties.each do |key, value|
self.current_custom_context.send("#{key}=", value)
end
yield
ensure
self.custom_contexts.pop
end
# Runs the provided block in an encryption context where encryption is disabled:
#
# * Reading encrypted content will return its ciphertext.
# * Writing encrypted content will write its clear text.
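#
# Example (a sketch; +article+ stands for any record with encrypted attributes):
#
#   ActiveRecord::Encryption.without_encryption do
#     article.title # returns the stored ciphertext instead of the clear text
#   end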
def without_encryption(&block)
with_encryption_context encryptor: ActiveRecord::Encryption::NullEncryptor.new, &block
end
# Runs the provided block in an encryption context where:
#
# * Reading encrypted content will return its ciphertext.
# * Writing encrypted content will fail.
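#
# Example (a sketch; +article+ stands for any record with encrypted attributes):
#
#   ActiveRecord::Encryption.protecting_encrypted_data do
#     article.title                        # returns the ciphertext
#     article.update!(title: "New title")  # fails
#   end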
def protecting_encrypted_data(&block)
with_encryption_context encryptor: ActiveRecord::Encryption::EncryptingOnlyEncryptor.new, frozen_encryption: true, &block
end
# Returns the current context: the innermost custom context if one has been set,
# or the default context otherwise.
def context
self.current_custom_context || self.default_context
end
def current_custom_context
self.custom_contexts&.last
end
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/module/attr_internal"
require "active_record/log_subscriber"
module ActiveRecord
module Railties # :nodoc:
module ControllerRuntime # :nodoc:
extend ActiveSupport::Concern
module ClassMethods # :nodoc:
def log_process_action(payload)
messages, db_runtime = super, payload[:db_runtime]
messages << ("ActiveRecord: %.1fms" % db_runtime.to_f) if db_runtime
messages
end
end
private
attr_internal :db_runtime
def process_action(action, *args)
# We also need to reset the runtime before each action
# because of queries in middleware or in cases where we are streaming,
# as those won't be cleaned up by the method below.
ActiveRecord::LogSubscriber.reset_runtime
super
end
def cleanup_view_runtime
if logger && logger.info? && ActiveRecord::Base.connected?
db_rt_before_render = ActiveRecord::LogSubscriber.reset_runtime
self.db_runtime = (db_runtime || 0) + db_rt_before_render
runtime = super
db_rt_after_render = ActiveRecord::LogSubscriber.reset_runtime
self.db_runtime += db_rt_after_render
runtime - db_rt_after_render
else
super
end
end
def append_info_to_payload(payload)
super
if ActiveRecord::Base.connected?
payload[:db_runtime] = (db_runtime || 0) + ActiveRecord::LogSubscriber.reset_runtime
end
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/enumerable"
require "active_support/core_ext/string/filters"
require "active_support/parameter_filter"
require "concurrent/map"
module ActiveRecord
module Core
extend ActiveSupport::Concern
include ActiveModel::Access
included do
##
# :singleton-method:
#
# Accepts a logger conforming to the interface of Log4r which is then
# passed on to any new database connections made and which can be
# retrieved on both a class and instance level by calling +logger+.
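#
# For example, in an initializer (a common setup, shown as a sketch):
#
#   ActiveRecord::Base.logger = Logger.new(STDOUT)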
class_attribute :logger, instance_writer: false
##
# :singleton-method:
#
# Specifies the job used to destroy associations in the background
class_attribute :destroy_association_async_job, instance_writer: false, instance_predicate: false, default: false
##
# Contains the database configuration - as is typically stored in config/database.yml -
# as an ActiveRecord::DatabaseConfigurations object.
#
# For example, the following database.yml...
#
# development:
# adapter: sqlite3
# database: db/development.sqlite3
#
# production:
# adapter: sqlite3
# database: db/production.sqlite3
#
#   ...would result in ActiveRecord::Base.configurations looking like this:
#
# #<ActiveRecord::DatabaseConfigurations:0x00007fd1acbdf800 @configurations=[
# #<ActiveRecord::DatabaseConfigurations::HashConfig:0x00007fd1acbded10 @env_name="development",
# @name="primary", @config={adapter: "sqlite3", database: "db/development.sqlite3"}>,
# #<ActiveRecord::DatabaseConfigurations::HashConfig:0x00007fd1acbdea90 @env_name="production",
# @name="primary", @config={adapter: "sqlite3", database: "db/production.sqlite3"}>
# ]>
def self.configurations=(config)
@@configurations = ActiveRecord::DatabaseConfigurations.new(config)
end
self.configurations = {}
# Returns the fully resolved ActiveRecord::DatabaseConfigurations object
def self.configurations
@@configurations
end
##
# :singleton-method:
# Force enumeration of all columns in SELECT statements.
# e.g. <tt>SELECT first_name, last_name FROM ...</tt> instead of <tt>SELECT * FROM ...</tt>
# This avoids +PreparedStatementCacheExpired+ errors when a column is added
# to the database while the app is running.
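# It can be enabled globally, for example:
#
#   ActiveRecord::Base.enumerate_columns_in_select_statements = true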
class_attribute :enumerate_columns_in_select_statements, instance_accessor: false, default: false
class_attribute :belongs_to_required_by_default, instance_accessor: false
class_attribute :strict_loading_by_default, instance_accessor: false, default: false
class_attribute :has_many_inversing, instance_accessor: false, default: false
class_attribute :default_connection_handler, instance_writer: false
class_attribute :default_role, instance_writer: false
class_attribute :default_shard, instance_writer: false
class_attribute :shard_selector, instance_accessor: false, default: nil
def self.application_record_class? # :nodoc:
if ActiveRecord.application_record_class
self == ActiveRecord.application_record_class
else
if defined?(ApplicationRecord) && self == ApplicationRecord
true
end
end
end
self.filter_attributes = []
def self.connection_handler
ActiveSupport::IsolatedExecutionState[:active_record_connection_handler] || default_connection_handler
end
def self.connection_handler=(handler)
ActiveSupport::IsolatedExecutionState[:active_record_connection_handler] = handler
end
def self.connection_handlers
if ActiveRecord.legacy_connection_handling
else
raise NotImplementedError, "The new connection handling does not support accessing multiple connection handlers."
end
@@connection_handlers ||= {}
end
def self.connection_handlers=(handlers)
if ActiveRecord.legacy_connection_handling
ActiveSupport::Deprecation.warn(<<~MSG)
Using legacy connection handling is deprecated. Please set
`legacy_connection_handling` to `false` in your application.
The new connection handling does not support `connection_handlers`
getter and setter.
Read more about how to migrate at: https://guides.rubyonrails.org/active_record_multiple_databases.html#migrate-to-the-new-connection-handling
MSG
else
raise NotImplementedError, "The new connection handling does not support multiple connection handlers."
end
@@connection_handlers = handlers
end
def self.asynchronous_queries_session # :nodoc:
asynchronous_queries_tracker.current_session
end
def self.asynchronous_queries_tracker # :nodoc:
ActiveSupport::IsolatedExecutionState[:active_record_asynchronous_queries_tracker] ||= \
AsynchronousQueriesTracker.new
end
# Returns the symbol representing the current connected role.
#
# ActiveRecord::Base.connected_to(role: :writing) do
# ActiveRecord::Base.current_role #=> :writing
# end
#
# ActiveRecord::Base.connected_to(role: :reading) do
# ActiveRecord::Base.current_role #=> :reading
# end
def self.current_role
if ActiveRecord.legacy_connection_handling
connection_handlers.key(connection_handler) || default_role
else
connected_to_stack.reverse_each do |hash|
return hash[:role] if hash[:role] && hash[:klasses].include?(Base)
return hash[:role] if hash[:role] && hash[:klasses].include?(connection_class_for_self)
end
default_role
end
end
# Returns the symbol representing the current connected shard.
#
# ActiveRecord::Base.connected_to(role: :reading) do
# ActiveRecord::Base.current_shard #=> :default
# end
#
# ActiveRecord::Base.connected_to(role: :writing, shard: :one) do
# ActiveRecord::Base.current_shard #=> :one
# end
def self.current_shard
connected_to_stack.reverse_each do |hash|
return hash[:shard] if hash[:shard] && hash[:klasses].include?(Base)
return hash[:shard] if hash[:shard] && hash[:klasses].include?(connection_class_for_self)
end
default_shard
end
# Returns the symbol representing the current setting for
# preventing writes.
#
# ActiveRecord::Base.connected_to(role: :reading) do
# ActiveRecord::Base.current_preventing_writes #=> true
# end
#
# ActiveRecord::Base.connected_to(role: :writing) do
# ActiveRecord::Base.current_preventing_writes #=> false
# end
def self.current_preventing_writes
if ActiveRecord.legacy_connection_handling
connection_handler.prevent_writes
else
connected_to_stack.reverse_each do |hash|
return hash[:prevent_writes] if !hash[:prevent_writes].nil? && hash[:klasses].include?(Base)
return hash[:prevent_writes] if !hash[:prevent_writes].nil? && hash[:klasses].include?(connection_class_for_self)
end
false
end
end
def self.connected_to_stack # :nodoc:
if connected_to_stack = ActiveSupport::IsolatedExecutionState[:active_record_connected_to_stack]
connected_to_stack
else
connected_to_stack = Concurrent::Array.new
ActiveSupport::IsolatedExecutionState[:active_record_connected_to_stack] = connected_to_stack
connected_to_stack
end
end
def self.connection_class=(b) # :nodoc:
@connection_class = b
end
def self.connection_class # :nodoc:
@connection_class ||= false
end
def self.connection_class? # :nodoc:
self.connection_class
end
def self.connection_class_for_self # :nodoc:
klass = self
until klass == Base
break if klass.connection_class?
klass = klass.superclass
end
klass
end
self.default_connection_handler = ConnectionAdapters::ConnectionHandler.new
self.default_role = ActiveRecord.writing_role
self.default_shard = :default
def self.strict_loading_violation!(owner:, reflection:) # :nodoc:
case ActiveRecord.action_on_strict_loading_violation
when :raise
message = "`#{owner}` is marked for strict_loading. The `#{reflection.klass}` association named `:#{reflection.name}` cannot be lazily loaded."
raise ActiveRecord::StrictLoadingViolationError.new(message)
when :log
name = "strict_loading_violation.active_record"
ActiveSupport::Notifications.instrument(name, owner: owner, reflection: reflection)
end
end
end
module ClassMethods
def initialize_find_by_cache # :nodoc:
@find_by_statement_cache = { true => Concurrent::Map.new, false => Concurrent::Map.new }
end
def inherited(child_class) # :nodoc:
# initialize cache at class definition for thread safety
child_class.initialize_find_by_cache
unless child_class.base_class?
klass = self
until klass.base_class?
klass.initialize_find_by_cache
klass = klass.superclass
end
end
super
end
def find(*ids) # :nodoc:
# We don't have cache keys for this stuff yet
return super unless ids.length == 1
return super if block_given? || primary_key.nil? || scope_attributes?
id = ids.first
return super if StatementCache.unsupported_value?(id)
cached_find_by([primary_key], [id]) ||
raise(RecordNotFound.new("Couldn't find #{name} with '#{primary_key}'=#{id}", name, primary_key, id))
end
def find_by(*args) # :nodoc:
return super if scope_attributes?
hash = args.first
return super unless Hash === hash
hash = hash.each_with_object({}) do |(key, value), h|
key = key.to_s
key = attribute_aliases[key] || key
return super if reflect_on_aggregation(key)
reflection = _reflect_on_association(key)
if !reflection
value = value.id if value.respond_to?(:id)
elsif reflection.belongs_to? && !reflection.polymorphic?
key = reflection.join_foreign_key
pkey = reflection.join_primary_key
value = value.public_send(pkey) if value.respond_to?(pkey)
end
if !columns_hash.key?(key) || StatementCache.unsupported_value?(value)
return super
end
h[key] = value
end
cached_find_by(hash.keys, hash.values)
end
def find_by!(*args) # :nodoc:
find_by(*args) || where(*args).raise_record_not_found_exception!
end
%w(
reading_role writing_role legacy_connection_handling default_timezone index_nested_attribute_errors
verbose_query_logs queues warn_on_records_fetched_greater_than maintain_test_schema
application_record_class action_on_strict_loading_violation schema_format error_on_ignored_order
timestamped_migrations dump_schema_after_migration dump_schemas suppress_multiple_database_warning
).each do |attr|
module_eval(<<~RUBY, __FILE__, __LINE__ + 1)
def #{attr}
ActiveSupport::Deprecation.warn(<<~MSG)
ActiveRecord::Base.#{attr} is deprecated and will be removed in Rails 7.1.
Use `ActiveRecord.#{attr}` instead.
MSG
ActiveRecord.#{attr}
end
def #{attr}=(value)
ActiveSupport::Deprecation.warn(<<~MSG)
ActiveRecord::Base.#{attr}= is deprecated and will be removed in Rails 7.1.
Use `ActiveRecord.#{attr}=` instead.
MSG
ActiveRecord.#{attr} = value
end
RUBY
end
def initialize_generated_modules # :nodoc:
generated_association_methods
end
def generated_association_methods # :nodoc:
@generated_association_methods ||= begin
mod = const_set(:GeneratedAssociationMethods, Module.new)
private_constant :GeneratedAssociationMethods
include mod
mod
end
end
# Returns columns which shouldn't be exposed while calling +#inspect+.
def filter_attributes
if defined?(@filter_attributes)
@filter_attributes
else
superclass.filter_attributes
end
end
# Specifies columns which shouldn't be exposed while calling +#inspect+.
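# For example (a sketch; the attribute names are illustrative):
#
#   ActiveRecord::Base.filter_attributes += [:password, :credit_card_number]
#   # #inspect will now render those attributes as [FILTERED]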
def filter_attributes=(filter_attributes)
@inspection_filter = nil
@filter_attributes = filter_attributes
end
def inspection_filter # :nodoc:
if defined?(@filter_attributes)
@inspection_filter ||= begin
mask = InspectionMask.new(ActiveSupport::ParameterFilter::FILTERED)
ActiveSupport::ParameterFilter.new(@filter_attributes, mask: mask)
end
else
superclass.inspection_filter
end
end
# Returns a string like 'Post(id:integer, title:string, body:text)'
def inspect # :nodoc:
if self == Base
super
elsif abstract_class?
"#{super}(abstract)"
elsif !connected?
"#{super} (call '#{super}.connection' to establish a connection)"
elsif table_exists?
attr_list = attribute_types.map { |name, type| "#{name}: #{type.type}" } * ", "
"#{super}(#{attr_list})"
else
"#{super}(Table doesn't exist)"
end
end
# Override the default class equality method to provide support for decorated models.
def ===(object) # :nodoc:
object.is_a?(self)
end
# Returns an instance of <tt>Arel::Table</tt> loaded with the current table name.
def arel_table # :nodoc:
@arel_table ||= Arel::Table.new(table_name, klass: self)
end
def predicate_builder # :nodoc:
@predicate_builder ||= PredicateBuilder.new(table_metadata)
end
def type_caster # :nodoc:
TypeCaster::Map.new(self)
end
def cached_find_by_statement(key, &block) # :nodoc:
cache = @find_by_statement_cache[connection.prepared_statements]
cache.compute_if_absent(key) { StatementCache.create(connection, &block) }
end
private
def relation
relation = Relation.create(self)
if finder_needs_type_condition? && !ignore_default_scope?
relation.where!(type_condition)
else
relation
end
end
def table_metadata
TableMetadata.new(self, arel_table)
end
def cached_find_by(keys, values)
statement = cached_find_by_statement(keys) { |params|
wheres = keys.index_with { params.bind }
where(wheres).limit(1)
}
begin
statement.execute(values, connection).first
rescue TypeError
raise ActiveRecord::StatementInvalid
end
end
end
# New objects can be instantiated as either empty (pass no construction parameter) or pre-set with
# attributes but not yet saved (pass a hash with key names matching the associated table column names).
# In both instances, valid attribute keys are determined by the column names of the associated table --
# hence you can't have attributes that aren't part of the table columns.
#
# ==== Example:
# # Instantiates a single new object
# User.new(first_name: 'Jamie')
def initialize(attributes = nil)
@new_record = true
@attributes = self.class._default_attributes.deep_dup
init_internals
initialize_internals_callback
assign_attributes(attributes) if attributes
yield self if block_given?
_run_initialize_callbacks
end
# Initialize an empty model object from +coder+. +coder+ should be
# the result of previously encoding an Active Record model, using
# #encode_with.
#
# class Post < ActiveRecord::Base
# end
#
# old_post = Post.new(title: "hello world")
# coder = {}
# old_post.encode_with(coder)
#
# post = Post.allocate
# post.init_with(coder)
# post.title # => 'hello world'
def init_with(coder, &block)
coder = LegacyYamlAdapter.convert(coder)
attributes = self.class.yaml_encoder.decode(coder)
init_with_attributes(attributes, coder["new_record"], &block)
end
##
# Initialize an empty model object from +attributes+.
# +attributes+ should be an attributes object, and unlike the
# +initialize+ method, no assignment calls are made per attribute.
def init_with_attributes(attributes, new_record = false) # :nodoc:
@new_record = new_record
@attributes = attributes
init_internals
yield self if block_given?
_run_find_callbacks
_run_initialize_callbacks
self
end
##
# :method: clone
# Identical to Ruby's clone method. This is a "shallow" copy. Be warned that your attributes are not copied.
# That means that modifying attributes of the clone will modify the original, since they will both point to the
# same attributes hash. If you need a copy of your attributes hash, please use the #dup method.
#
# user = User.first
# new_user = user.clone
# user.name # => "Bob"
# new_user.name = "Joe"
# user.name # => "Joe"
#
# user.object_id == new_user.object_id # => false
# user.name.object_id == new_user.name.object_id # => true
#
# user.name.object_id == user.dup.name.object_id # => false
##
# :method: dup
# Duped objects have no id assigned and are treated as new records. Note
# that this is a "shallow" copy as it copies the object's attributes
# only, not its associations. The extent of a "deep" copy is application
# specific and is therefore left to the application to implement according
# to its need.
# The dup method does not preserve the timestamps (created|updated)_(at|on).
##
def initialize_dup(other) # :nodoc:
@attributes = @attributes.deep_dup
@attributes.reset(@primary_key)
_run_initialize_callbacks
@new_record = true
@previously_new_record = false
@destroyed = false
@_start_transaction_state = nil
super
end
# Populate +coder+ with attributes about this record that should be
# serialized. The structure of +coder+ defined in this method is
# guaranteed to match the structure of +coder+ passed to the #init_with
# method.
#
# Example:
#
# class Post < ActiveRecord::Base
# end
# coder = {}
# Post.new.encode_with(coder)
# coder # => {"attributes" => {"id" => nil, ... }}
def encode_with(coder)
self.class.yaml_encoder.encode(@attributes, coder)
coder["new_record"] = new_record?
coder["active_record_yaml_version"] = 2
end
# Returns true if +comparison_object+ is the same exact object, or +comparison_object+
# is of the same type and +self+ has an ID and it is equal to +comparison_object.id+.
#
# Note that new records are different from any other record by definition, unless the
# other record is the receiver itself. Besides, if you fetch existing records with
# +select+ and leave the ID out, you're on your own; this predicate will return false.
#
# Note also that destroying a record preserves its ID in the model instance, so deleted
# models are still comparable.
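#
# For example (a sketch; +Post+ is a hypothetical model):
#
#   Post.find(1) == Post.find(1)  # => true
#   Post.new == Post.new          # => false (new records have no id)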
def ==(comparison_object)
super ||
comparison_object.instance_of?(self.class) &&
!id.nil? &&
comparison_object.id == id
end
alias :eql? :==
# Delegates to id in order to allow two records of the same type and id to work with something like:
# [ Person.find(1), Person.find(2), Person.find(3) ] & [ Person.find(1), Person.find(4) ] # => [ Person.find(1) ]
def hash
id = self.id
if id
self.class.hash ^ id.hash
else
super
end
end
# Clone and freeze the attributes hash such that associations are still
# accessible, even on destroyed records, but cloned models will not be
# frozen.
def freeze
@attributes = @attributes.clone.freeze
self
end
# Returns +true+ if the attributes hash has been frozen.
def frozen?
@attributes.frozen?
end
# Allows sort on objects
def <=>(other_object)
if other_object.is_a?(self.class)
to_key <=> other_object.to_key
else
super
end
end
def present? # :nodoc:
true
end
def blank? # :nodoc:
false
end
# Returns +true+ if the record is read only.
def readonly?
@readonly
end
# Returns +true+ if the record is in strict_loading mode.
def strict_loading?
@strict_loading
end
# Sets the record to strict_loading mode. This will raise an error
# if the record tries to lazily load an association.
#
# user = User.first
# user.strict_loading! # => true
# user.comments
# => ActiveRecord::StrictLoadingViolationError
#
# === Parameters:
#
# * value - Boolean specifying whether to enable or disable strict loading.
# * mode - Symbol specifying strict loading mode. Defaults to :all. Using
# :n_plus_one_only mode will only raise an error if an association
# that will lead to an n plus one query is lazily loaded.
#
# === Example:
#
# user = User.first
# user.strict_loading!(false) # => false
# user.comments
# => #<ActiveRecord::Associations::CollectionProxy>
def strict_loading!(value = true, mode: :all)
unless [:all, :n_plus_one_only].include?(mode)
raise ArgumentError, "The :mode option must be one of [:all, :n_plus_one_only] but #{mode.inspect} was provided."
end
@strict_loading_mode = mode
@strict_loading = value
end
attr_reader :strict_loading_mode
# Returns +true+ if the record uses strict_loading with +:n_plus_one_only+ mode enabled.
def strict_loading_n_plus_one_only?
@strict_loading_mode == :n_plus_one_only
end
# Marks this record as read only.
def readonly!
@readonly = true
end
def connection_handler
self.class.connection_handler
end
# Returns the contents of the record as a nicely formatted string.
def inspect
# We check defined?(@attributes) not to issue warnings if the object is
# allocated but not initialized.
inspection = if defined?(@attributes) && @attributes
self.class.attribute_names.filter_map do |name|
if _has_attribute?(name)
"#{name}: #{attribute_for_inspect(name)}"
end
end.join(", ")
else
"not initialized"
end
"#<#{self.class} #{inspection}>"
end
# Takes a PP and prettily prints this record to it, allowing you to get a nice result from <tt>pp record</tt>
# when pp is required.
def pretty_print(pp)
return super if custom_inspect_method_defined?
pp.object_address_group(self) do
if defined?(@attributes) && @attributes
attr_names = self.class.attribute_names.select { |name| _has_attribute?(name) }
pp.seplist(attr_names, proc { pp.text "," }) do |attr_name|
pp.breakable " "
pp.group(1) do
pp.text attr_name
pp.text ":"
pp.breakable
value = _read_attribute(attr_name)
value = inspection_filter.filter_param(attr_name, value) unless value.nil?
pp.pp value
end
end
else
pp.breakable " "
pp.text "not initialized"
end
end
end
##
# :method: slice
#
# :call-seq: slice(*methods)
#
# Returns a hash of the given methods with their names as keys and returned
# values as values.
#
#--
# Implemented by ActiveModel::Access#slice.
##
# :method: values_at
#
# :call-seq: values_at(*methods)
#
# Returns an array of the values returned by the given methods.
#
#--
# Implemented by ActiveModel::Access#values_at.
private
# +Array#flatten+ will call +#to_ary+ (recursively) on each of the elements of
# the array, and then rescues from the possible +NoMethodError+. If those elements are
# +ActiveRecord::Base+'s, then this triggers the various +method_missing+'s that we have,
# which significantly impacts performance.
#
# So we can avoid the +method_missing+ hit by explicitly defining +#to_ary+ as +nil+ here.
#
# See also https://tenderlovemaking.com/2011/06/28/til-its-ok-to-return-nil-from-to_ary.html
def to_ary
nil
end
def init_internals
@readonly = false
@previously_new_record = false
@destroyed = false
@marked_for_destruction = false
@destroyed_by_association = nil
@_start_transaction_state = nil
klass = self.class
@primary_key = klass.primary_key
@strict_loading = klass.strict_loading_by_default
@strict_loading_mode = :all
klass.define_attribute_methods
end
def initialize_internals_callback
end
def custom_inspect_method_defined?
self.class.instance_method(:inspect).owner != ActiveRecord::Base.instance_method(:inspect).owner
end
class InspectionMask < DelegateClass(::String)
def pretty_print(pp)
pp.text __getobj__
end
end
private_constant :InspectionMask
def inspection_filter
self.class.inspection_filter
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Count < Arel::Nodes::Function
def initialize(expr, distinct = false, aliaz = nil)
super(expr, aliaz)
@distinct = distinct
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
# = Active Record Counter Cache
module CounterCache
extend ActiveSupport::Concern
module ClassMethods
# Resets one or more counter caches to their correct value using an SQL
# count query. This is useful when adding new counter caches, or if the
# counter has been corrupted or modified directly by SQL.
#
# ==== Parameters
#
# * +id+ - The id of the object you wish to reset a counter on.
# * +counters+ - One or more association counters to reset. Association name or counter name can be given.
# * <tt>:touch</tt> - Touch timestamp columns when updating.
# Pass +true+ to touch +updated_at+ and/or +updated_on+. Pass a symbol to
# touch that column or an array of symbols to touch just those ones.
#
# ==== Examples
#
# # For the Post with id #1, reset the comments_count
# Post.reset_counters(1, :comments)
#
# # Like above, but also touch the +updated_at+ and/or +updated_on+
# # attributes.
# Post.reset_counters(1, :comments, touch: true)
def reset_counters(id, *counters, touch: nil)
object = find(id)
counters.each do |counter_association|
has_many_association = _reflect_on_association(counter_association)
unless has_many_association
has_many = reflect_on_all_associations(:has_many)
has_many_association = has_many.find { |association| association.counter_cache_column && association.counter_cache_column.to_sym == counter_association.to_sym }
counter_association = has_many_association.plural_name if has_many_association
end
raise ArgumentError, "'#{name}' has no association called '#{counter_association}'" unless has_many_association
if has_many_association.is_a? ActiveRecord::Reflection::ThroughReflection
has_many_association = has_many_association.through_reflection
end
foreign_key = has_many_association.foreign_key.to_s
child_class = has_many_association.klass
reflection = child_class._reflections.values.find { |e| e.belongs_to? && e.foreign_key.to_s == foreign_key && e.options[:counter_cache].present? }
counter_name = reflection.counter_cache_column
updates = { counter_name => object.send(counter_association).count(:all) }
if touch
names = touch if touch != true
names = Array.wrap(names)
options = names.extract_options!
touch_updates = touch_attributes_with_time(*names, **options)
updates.merge!(touch_updates)
end
unscoped.where(primary_key => object.id).update_all(updates)
end
true
end
# A generic "counter updater" implementation, intended primarily to be
# used by #increment_counter and #decrement_counter, but which may also
# be useful on its own. It simply does a direct SQL update for the record
# with the given ID, altering the given hash of counters by the amount
# given by the corresponding value:
#
# ==== Parameters
#
# * +id+ - The id of the object you wish to update a counter on or an array of ids.
# * +counters+ - A Hash containing the names of the fields
# to update as keys and the amount to update the field by as values.
# * <tt>:touch</tt> option - Touch timestamp columns when updating.
# If attribute names are passed, they are updated along with updated_at/on
# attributes.
#
# ==== Examples
#
# # For the Post with id of 5, decrement the comment_count by 1, and
# # increment the action_count by 1
# Post.update_counters 5, comment_count: -1, action_count: 1
# # Executes the following SQL:
# # UPDATE posts
# # SET comment_count = COALESCE(comment_count, 0) - 1,
# # action_count = COALESCE(action_count, 0) + 1
# # WHERE id = 5
#
# # For the Posts with id of 10 and 15, increment the comment_count by 1
# Post.update_counters [10, 15], comment_count: 1
# # Executes the following SQL:
# # UPDATE posts
# # SET comment_count = COALESCE(comment_count, 0) + 1
# # WHERE id IN (10, 15)
#
# # For the Posts with id of 10 and 15, increment the comment_count by 1
# # and update the updated_at value for each counter.
# Post.update_counters [10, 15], comment_count: 1, touch: true
# # Executes the following SQL:
# # UPDATE posts
# # SET comment_count = COALESCE(comment_count, 0) + 1,
# # `updated_at` = '2016-10-13T09:59:23-05:00'
# # WHERE id IN (10, 15)
def update_counters(id, counters)
unscoped.where!(primary_key => id).update_counters(counters)
end
# Increment a numeric field by one, via a direct SQL update.
#
# This method is used primarily for maintaining counter_cache columns that are
# used to store aggregate values. For example, a +DiscussionBoard+ may cache
# posts_count and comments_count to avoid running an SQL query to calculate the
# number of posts and comments there are, each time it is displayed.
#
# ==== Parameters
#
# * +counter_name+ - The name of the field that should be incremented.
# * +id+ - The id of the object that should be incremented or an array of ids.
# * <tt>:touch</tt> - Touch timestamp columns when updating.
# Pass +true+ to touch +updated_at+ and/or +updated_on+. Pass a symbol to
# touch that column or an array of symbols to touch just those ones.
#
# ==== Examples
#
# # Increment the posts_count column for the record with an id of 5
# DiscussionBoard.increment_counter(:posts_count, 5)
#
# # Increment the posts_count column for the record with an id of 5
# # and update the updated_at value.
# DiscussionBoard.increment_counter(:posts_count, 5, touch: true)
def increment_counter(counter_name, id, touch: nil)
update_counters(id, counter_name => 1, touch: touch)
end
# Decrement a numeric field by one, via a direct SQL update.
#
# This works the same as #increment_counter but reduces the column value by
# 1 instead of increasing it.
#
# ==== Parameters
#
# * +counter_name+ - The name of the field that should be decremented.
# * +id+ - The id of the object that should be decremented or an array of ids.
# * <tt>:touch</tt> - Touch timestamp columns when updating.
# Pass +true+ to touch +updated_at+ and/or +updated_on+. Pass a symbol to
# touch that column or an array of symbols to touch just those ones.
#
# ==== Examples
#
# # Decrement the posts_count column for the record with an id of 5
# DiscussionBoard.decrement_counter(:posts_count, 5)
#
# # Decrement the posts_count column for the record with an id of 5
# # and update the updated_at value.
# DiscussionBoard.decrement_counter(:posts_count, 5, touch: true)
def decrement_counter(counter_name, id, touch: nil)
update_counters(id, counter_name => -1, touch: touch)
end
end
private
def _create_record(attribute_names = self.attribute_names)
id = super
each_counter_cached_associations do |association|
association.increment_counters
end
id
end
def destroy_row
affected_rows = super
if affected_rows > 0
each_counter_cached_associations do |association|
foreign_key = association.reflection.foreign_key.to_sym
unless destroyed_by_association && destroyed_by_association.foreign_key.to_sym == foreign_key
association.decrement_counters
end
end
end
affected_rows
end
def each_counter_cached_associations
_reflections.each do |name, reflection|
yield association(name.to_sym) if reflection.belongs_to? && reflection.counter_cache_column
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
###
# FIXME hopefully we can remove this
module Crud
def compile_insert(values)
im = create_insert
im.insert values
im
end
def create_insert
InsertManager.new
end
def compile_update(
values,
key = nil,
having_clause = nil,
group_values_columns = []
)
um = UpdateManager.new(source)
um.set(values)
um.take(limit)
um.offset(offset)
um.order(*orders)
um.wheres = constraints
um.key = key
um.group(group_values_columns) unless group_values_columns.empty?
um.having(having_clause) unless having_clause.nil?
um
end
def compile_delete(key = nil, having_clause = nil, group_values_columns = [])
dm = DeleteManager.new(source)
dm.take(limit)
dm.offset(offset)
dm.order(*orders)
dm.wheres = constraints
dm.key = key
dm.group(group_values_columns) unless group_values_columns.empty?
dm.having(having_clause) unless having_clause.nil?
dm
end
end
end
# frozen_string_literal: true
module ActiveRecord
class DatabaseConfigurations
# ActiveRecord::Base.configurations will return either a HashConfig or
# a UrlConfig. It will never return a DatabaseConfig object,
# as this is the parent class for the types of database configuration objects.
class DatabaseConfig # :nodoc:
attr_reader :env_name, :name
attr_accessor :owner_name
def initialize(env_name, name)
@env_name = env_name
@name = name
end
def adapter_method
"#{adapter}_connection"
end
def host
raise NotImplementedError
end
def database
raise NotImplementedError
end
def _database=(database)
raise NotImplementedError
end
def adapter
raise NotImplementedError
end
def pool
raise NotImplementedError
end
def min_threads
raise NotImplementedError
end
def max_threads
raise NotImplementedError
end
def max_queue
raise NotImplementedError
end
def checkout_timeout
raise NotImplementedError
end
def reaping_frequency
raise NotImplementedError
end
def idle_timeout
raise NotImplementedError
end
def replica?
raise NotImplementedError
end
def migrations_paths
raise NotImplementedError
end
def for_current_env?
env_name == ActiveRecord::ConnectionHandling::DEFAULT_ENV.call
end
def schema_cache_path
raise NotImplementedError
end
end
end
end
# frozen_string_literal: true
require "uri"
require "active_record/database_configurations/database_config"
require "active_record/database_configurations/hash_config"
require "active_record/database_configurations/url_config"
require "active_record/database_configurations/connection_url_resolver"
module ActiveRecord
# ActiveRecord::DatabaseConfigurations returns an array of DatabaseConfig
# objects (either a HashConfig or UrlConfig) that are constructed from the
# application's database configuration hash or URL string.
class DatabaseConfigurations
class InvalidConfigurationError < StandardError; end
attr_reader :configurations
delegate :any?, to: :configurations
def initialize(configurations = {})
@configurations = build_configs(configurations)
end
# Collects the configs for the environment and optionally the specification
# name passed in. To include replica configurations pass <tt>include_hidden: true</tt>.
#
# If a name is provided a single DatabaseConfig object will be
# returned, otherwise an array of DatabaseConfig objects will be
# returned that corresponds with the environment and type requested.
#
# ==== Options
#
# * <tt>env_name:</tt> The environment name. Defaults to +nil+ which will collect
# configs for all environments.
# * <tt>name:</tt> The db config name (i.e. primary, animals, etc.). Defaults
# to +nil+. If no +env_name+ is specified the config for the default env and the
# passed +name+ will be returned.
# * <tt>include_replicas:</tt> Deprecated. Determines whether to include replicas in
# the returned list. Most of the time we're only iterating over the write
# connection (i.e. migrations don't need to run for the write and read connection).
# Defaults to +false+.
# * <tt>include_hidden:</tt> Determines whether to include replicas and configurations
# hidden by +database_tasks: false+ in the returned list. Most of the time we're only
# iterating over the primary connections (i.e. migrations don't need to run for the
# write and read connection). Defaults to +false+.
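#
# ==== Examples
#
# A couple of illustrative calls (the "animals" config name is hypothetical;
# substitute a name from your own database.yml):
#
# # All configs for the development environment that aren't hidden
# ActiveRecord::Base.configurations.configs_for(env_name: "development")
#
# # The single config named "animals" in the development environment
# ActiveRecord::Base.configurations.configs_for(env_name: "development", name: "animals")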
def configs_for(env_name: nil, name: nil, include_replicas: false, include_hidden: false)
if include_replicas
include_hidden = include_replicas
ActiveSupport::Deprecation.warn("The kwarg `include_replicas` is deprecated in favor of `include_hidden`. When `include_hidden` is passed, configurations with `replica: true` or `database_tasks: false` will be returned. `include_replicas` will be removed in Rails 7.1.")
end
env_name ||= default_env if name
configs = env_with_configs(env_name)
unless include_hidden
configs = configs.select do |db_config|
db_config.database_tasks?
end
end
if name
configs.find do |db_config|
db_config.name == name
end
else
configs
end
end
# Returns a single DatabaseConfig object based on the requested environment.
#
# If the application has multiple databases, +find_db_config+ will return
# the first DatabaseConfig for the environment.
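#
# An illustrative call (assuming a conventional development entry in database.yml):
#
# ActiveRecord::Base.configurations.find_db_config("development")
# # => the DatabaseConfig for the development environment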
def find_db_config(env)
configurations
.sort_by.with_index { |db_config, i| db_config.for_current_env? ? [0, i] : [1, i] }
.find do |db_config|
db_config.env_name == env.to_s ||
(db_config.for_current_env? && db_config.name == env.to_s)
end
end
# A primary configuration is one that is named "primary"; if no configuration
# is named "primary", the first configuration for an environment is treated
# as primary. This is the "default" configuration and is used when the
# application needs to treat one configuration differently. For example,
# when Rails dumps the schema, the primary configuration's schema file is
# named `schema.rb` instead of `primary_schema.rb`.
def primary?(name) # :nodoc:
return true if name == "primary"
first_config = find_db_config(default_env)
first_config && name == first_config.name
end
# Checks if the application's configurations are empty.
#
# Aliased to blank?
def empty?
configurations.empty?
end
alias :blank? :empty?
# Returns a fully resolved database configuration. Accepts a hash, string,
# or symbol and always returns a DatabaseConfigurations::DatabaseConfig.
#
# == Examples
#
# Symbol representing current environment.
#
# DatabaseConfigurations.new("production" => {}).resolve(:production)
# # => DatabaseConfigurations::HashConfig.new(env_name: "production", config: {})
#
# One layer deep hash of connection values.
#
# DatabaseConfigurations.new({}).resolve("adapter" => "sqlite3")
# # => DatabaseConfigurations::HashConfig.new(config: {"adapter" => "sqlite3"})
#
# Connection URL.
#
# DatabaseConfigurations.new({}).resolve("postgresql://localhost/foo")
# # => DatabaseConfigurations::UrlConfig.new(config: {"adapter" => "postgresql", "host" => "localhost", "database" => "foo"})
def resolve(config) # :nodoc:
return config if DatabaseConfigurations::DatabaseConfig === config
case config
when Symbol
resolve_symbol_connection(config)
when Hash, String
build_db_config_from_raw_config(default_env, "primary", config)
else
raise TypeError, "Invalid type for configuration. Expected Symbol, String, or Hash. Got #{config.inspect}"
end
end
private
def default_env
ActiveRecord::ConnectionHandling::DEFAULT_ENV.call.to_s
end
def env_with_configs(env = nil)
if env
configurations.select { |db_config| db_config.env_name == env }
else
configurations
end
end
def build_configs(configs)
return configs.configurations if configs.is_a?(DatabaseConfigurations)
return configs if configs.is_a?(Array)
db_configs = configs.flat_map do |env_name, config|
if config.is_a?(Hash) && config.values.all?(Hash)
walk_configs(env_name.to_s, config)
else
build_db_config_from_raw_config(env_name.to_s, "primary", config)
end
end
unless db_configs.find(&:for_current_env?)
db_configs << environment_url_config(default_env, "primary", {})
end
merge_db_environment_variables(default_env, db_configs.compact)
end
def walk_configs(env_name, config)
config.map do |name, sub_config|
build_db_config_from_raw_config(env_name, name.to_s, sub_config)
end
end
def resolve_symbol_connection(name)
if db_config = find_db_config(name)
db_config
else
raise AdapterNotSpecified, <<~MSG
The `#{name}` database is not configured for the `#{default_env}` environment.
Available database configurations are:
#{build_configuration_sentence}
MSG
end
end
def build_configuration_sentence
configs = configs_for(include_hidden: true)
configs.group_by(&:env_name).map do |env, config|
names = config.map(&:name)
if names.size > 1
"#{env}: #{names.join(", ")}"
else
env
end
end.join("\n")
end
def build_db_config_from_raw_config(env_name, name, config)
case config
when String
build_db_config_from_string(env_name, name, config)
when Hash
build_db_config_from_hash(env_name, name, config.symbolize_keys)
else
raise InvalidConfigurationError, "'{ #{env_name} => #{config} }' is not a valid configuration. Expected '#{config}' to be a URL string or a Hash."
end
end
def build_db_config_from_string(env_name, name, config)
url = config
uri = URI.parse(url)
if uri.scheme
UrlConfig.new(env_name, name, url)
else
raise InvalidConfigurationError, "'{ #{env_name} => #{config} }' is not a valid configuration. Expected '#{config}' to be a URL string or a Hash."
end
end
def build_db_config_from_hash(env_name, name, config)
if config.has_key?(:url)
url = config[:url]
config_without_url = config.dup
config_without_url.delete :url
UrlConfig.new(env_name, name, url, config_without_url)
else
HashConfig.new(env_name, name, config)
end
end
def merge_db_environment_variables(current_env, configs)
configs.map do |config|
next config if config.is_a?(UrlConfig) || config.env_name != current_env
url_config = environment_url_config(current_env, config.name, config.configuration_hash)
url_config || config
end
end
def environment_url_config(env, name, config)
url = environment_value_for(name)
return unless url
UrlConfig.new(env, name, url, config)
end
def environment_value_for(name)
name_env_key = "#{name.upcase}_DATABASE_URL"
url = ENV[name_env_key]
url ||= ENV["DATABASE_URL"] if name == "primary"
url
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters # :nodoc:
module DatabaseLimits
def max_identifier_length # :nodoc:
64
end
# Returns the maximum length of a table alias.
def table_alias_length
max_identifier_length
end
# Returns the maximum length of an index name.
def index_name_length
max_identifier_length
end
private
def bind_params_length
65535
end
end
end
end
# frozen_string_literal: true
require "active_record/middleware/database_selector/resolver"
module ActiveRecord
module Middleware
# The DatabaseSelector Middleware provides a framework for automatically
# swapping from the primary to the replica database connection. Rails
# provides a basic framework to determine when to swap and allows for
# applications to write custom strategy classes to override the default
# behavior.
#
# The resolver class defines when the application should switch (i.e. read
# from the primary if a write occurred less than 2 seconds ago); the resolver
# context class sets a value that helps the resolver class decide when to
# switch.
#
# Rails default middleware uses the request's session to set a timestamp
# that informs the application when to read from a primary or read from a
# replica.
#
# To use the DatabaseSelector in your application with default settings add
# the following options to your environment config:
#
# # This require is only necessary when using `rails new app --minimal`
# require "active_support/core_ext/integer/time"
#
# class Application < Rails::Application
# config.active_record.database_selector = { delay: 2.seconds }
# config.active_record.database_resolver = ActiveRecord::Middleware::DatabaseSelector::Resolver
# config.active_record.database_resolver_context = ActiveRecord::Middleware::DatabaseSelector::Resolver::Session
# end
#
# New applications include these lines, commented out, in production.rb.
#
# The default behavior can be changed by setting the config options to a
# custom class:
#
# config.active_record.database_selector = { delay: 2.seconds }
# config.active_record.database_resolver = MyResolver
# config.active_record.database_resolver_context = MyResolver::MySession
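#
# As the private +select_database+ method below shows, a custom resolver only
# needs to follow the same duck type as the default: the resolver class must
# respond to +call(context, options)+ and return an object that responds to
# +read+ and +write+ (both yielding to the app) and +update_context(response)+;
# the context class must respond to +call(request)+. A minimal illustrative
# sketch (not the built-in implementation) might look like:
#
# class MyResolver
#   def self.call(context, options)
#     new(context, options)
#   end
#
#   def initialize(context, options)
#     @context = context
#     @delay = options[:delay] || 2.seconds
#   end
#
#   # A real strategy would switch roles here, e.g. by wrapping the yield in
#   # ActiveRecord::Base.connected_to(role: :reading) when it is safe to do so.
#   def read
#     yield
#   end
#
#   def write
#     yield
#   end
#
#   def update_context(response)
#     response
#   end
# end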
class DatabaseSelector
def initialize(app, resolver_klass = nil, context_klass = nil, options = {})
@app = app
@resolver_klass = resolver_klass || Resolver
@context_klass = context_klass || Resolver::Session
@options = options
end
attr_reader :resolver_klass, :context_klass, :options
# Middleware that determines which database connection to use in a multiple
# database application.
def call(env)
request = ActionDispatch::Request.new(env)
select_database(request) do
@app.call(env)
end
end
private
def select_database(request, &blk)
context = context_klass.call(request)
resolver = resolver_klass.call(context, options)
response = if reading_request?(request)
resolver.read(&blk)
else
resolver.write(&blk)
end
resolver.update_context(response)
response
end
def reading_request?(request)
request.get? || request.head?
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module SQLite3
module DatabaseStatements
READ_QUERY = ActiveRecord::ConnectionAdapters::AbstractAdapter.build_read_query_regexp(
:pragma
) # :nodoc:
private_constant :READ_QUERY
def write_query?(sql) # :nodoc:
!READ_QUERY.match?(sql)
rescue ArgumentError # Invalid encoding
!READ_QUERY.match?(sql.b)
end
def explain(arel, binds = [])
sql = "EXPLAIN QUERY PLAN #{to_sql(arel, binds)}"
SQLite3::ExplainPrettyPrinter.new.pp(exec_query(sql, "EXPLAIN", []))
end
def execute(sql, name = nil) # :nodoc:
sql = transform_query(sql)
check_if_write_query(sql)
materialize_transactions
mark_transaction_written_if_write(sql)
log(sql, name) do
ActiveSupport::Dependencies.interlock.permit_concurrent_loads do
@raw_connection.execute(sql)
end
end
end
def exec_query(sql, name = nil, binds = [], prepare: false, async: false) # :nodoc:
sql = transform_query(sql)
check_if_write_query(sql)
materialize_transactions
mark_transaction_written_if_write(sql)
type_casted_binds = type_casted_binds(binds)
log(sql, name, binds, type_casted_binds, async: async) do
ActiveSupport::Dependencies.interlock.permit_concurrent_loads do
# Don't cache statements if they are not prepared
unless prepare
stmt = @raw_connection.prepare(sql)
begin
cols = stmt.columns
unless without_prepared_statement?(binds)
stmt.bind_params(type_casted_binds)
end
records = stmt.to_a
ensure
stmt.close
end
else
stmt = @statements[sql] ||= @raw_connection.prepare(sql)
cols = stmt.columns
stmt.reset!
stmt.bind_params(type_casted_binds)
records = stmt.to_a
end
build_result(columns: cols, rows: records)
end
end
end
def exec_delete(sql, name = "SQL", binds = []) # :nodoc:
exec_query(sql, name, binds)
@raw_connection.changes
end
alias :exec_update :exec_delete
def begin_isolated_db_transaction(isolation) # :nodoc:
raise TransactionIsolationError, "SQLite3 only supports the `read_uncommitted` transaction isolation level" if isolation != :read_uncommitted
raise StandardError, "You need to enable the shared-cache mode in SQLite mode before attempting to change the transaction isolation level" unless shared_cache?
ActiveSupport::IsolatedExecutionState[:active_record_read_uncommitted] = @raw_connection.get_first_value("PRAGMA read_uncommitted")
@raw_connection.read_uncommitted = true
begin_db_transaction
end
def begin_db_transaction # :nodoc:
log("begin transaction", "TRANSACTION") { @raw_connection.transaction }
end
def commit_db_transaction # :nodoc:
log("commit transaction", "TRANSACTION") { @raw_connection.commit }
reset_read_uncommitted
end
def exec_rollback_db_transaction # :nodoc:
log("rollback transaction", "TRANSACTION") { @raw_connection.rollback }
reset_read_uncommitted
end
# https://stackoverflow.com/questions/17574784
# https://www.sqlite.org/lang_datefunc.html
HIGH_PRECISION_CURRENT_TIMESTAMP = Arel.sql("STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')").freeze # :nodoc:
private_constant :HIGH_PRECISION_CURRENT_TIMESTAMP
def high_precision_current_timestamp
HIGH_PRECISION_CURRENT_TIMESTAMP
end
private
def reset_read_uncommitted
read_uncommitted = ActiveSupport::IsolatedExecutionState[:active_record_read_uncommitted]
return unless read_uncommitted
@raw_connection.read_uncommitted = read_uncommitted
end
def execute_batch(statements, name = nil)
statements = statements.map { |sql| transform_query(sql) }
sql = combine_multi_statements(statements)
check_if_write_query(sql)
materialize_transactions
mark_transaction_written_if_write(sql)
log(sql, name) do
ActiveSupport::Dependencies.interlock.permit_concurrent_loads do
@raw_connection.execute_batch2(sql)
end
end
end
def last_inserted_id(result)
@raw_connection.last_insert_row_id
end
def build_fixture_statements(fixture_set)
fixture_set.flat_map do |table_name, fixtures|
next if fixtures.empty?
fixtures.map { |fixture| build_fixture_sql([fixture], table_name) }
end.compact
end
def build_truncate_statement(table_name)
"DELETE FROM #{quote_table_name(table_name)}"
end
end
end
end
end
# frozen_string_literal: true
require "active_record/database_configurations"
module ActiveRecord
module Tasks # :nodoc:
class DatabaseNotSupported < StandardError; end # :nodoc:
# ActiveRecord::Tasks::DatabaseTasks is a utility class, which encapsulates
# logic behind common tasks used to manage database and migrations.
#
# The tasks defined here are used with Rails commands provided by Active Record.
#
# In order to use DatabaseTasks, a few config values need to be set. All the needed
# config values are set by Rails already, so you only need to set them yourself if you
# want to change the defaults or if you use Active Record outside of Rails
# (in which case, after configuring the database tasks, you can also use the rake tasks
# defined in Active Record).
#
# The possible config values are:
#
# * +env+: current environment (like Rails.env).
# * +database_configuration+: configuration of your databases (as in +config/database.yml+).
# * +db_dir+: your +db+ directory.
# * +fixtures_path+: a path to fixtures directory.
# * +migrations_paths+: a list of paths to directories with migrations.
# * +seed_loader+: an object which will load seeds, it needs to respond to the +load_seed+ method.
# * +root+: a path to the root of the application.
#
# Example usage of DatabaseTasks outside Rails could look like this:
#
# include ActiveRecord::Tasks
# DatabaseTasks.database_configuration = YAML.load_file('my_database_config.yml')
# DatabaseTasks.db_dir = 'db'
# # other settings...
#
# DatabaseTasks.create_current('production')
module DatabaseTasks
##
# :singleton-method:
# Extra flags passed to the database CLI tool (mysqldump/pg_dump) when calling db:schema:dump.
# It can be set as a string/array (the typical case) or a hash (when you use multiple adapters).
# Example:
# ActiveRecord::Tasks::DatabaseTasks.structure_dump_flags = {
# mysql2: ['--no-defaults', '--skip-add-drop-table'],
# postgres: '--no-tablespaces'
# }
mattr_accessor :structure_dump_flags, instance_accessor: false
##
# :singleton-method:
# Extra flags passed to the database CLI tool when calling db:schema:load.
# It can be set as a string/array (the typical case) or a hash (when you use multiple adapters).
mattr_accessor :structure_load_flags, instance_accessor: false
extend self
attr_writer :db_dir, :migrations_paths, :fixtures_path, :root, :env, :seed_loader
attr_accessor :database_configuration
LOCAL_HOSTS = ["127.0.0.1", "localhost"]
def check_protected_environments!
unless ENV["DISABLE_DATABASE_ENVIRONMENT_CHECK"]
current = ActiveRecord::Base.connection.migration_context.current_environment
stored = ActiveRecord::Base.connection.migration_context.last_stored_environment
if ActiveRecord::Base.connection.migration_context.protected_environment?
raise ActiveRecord::ProtectedEnvironmentError.new(stored)
end
if stored && stored != current
raise ActiveRecord::EnvironmentMismatchError.new(current: current, stored: stored)
end
end
rescue ActiveRecord::NoDatabaseError
end
def register_task(pattern, task)
@tasks ||= {}
@tasks[pattern] = task
end
register_task(/mysql/, "ActiveRecord::Tasks::MySQLDatabaseTasks")
register_task(/postgresql/, "ActiveRecord::Tasks::PostgreSQLDatabaseTasks")
register_task(/sqlite/, "ActiveRecord::Tasks::SQLiteDatabaseTasks")
def db_dir
@db_dir ||= Rails.application.config.paths["db"].first
end
def migrations_paths
@migrations_paths ||= Rails.application.paths["db/migrate"].to_a
end
def fixtures_path
@fixtures_path ||= if ENV["FIXTURES_PATH"]
File.join(root, ENV["FIXTURES_PATH"])
else
File.join(root, "test", "fixtures")
end
end
def root
@root ||= Rails.root
end
def env
@env ||= Rails.env
end
def name
@name ||= "primary"
end
def seed_loader
@seed_loader ||= Rails.application
end
def create(configuration, *arguments)
db_config = resolve_configuration(configuration)
database_adapter_for(db_config, *arguments).create
$stdout.puts "Created database '#{db_config.database}'" if verbose?
rescue DatabaseAlreadyExists
$stderr.puts "Database '#{db_config.database}' already exists" if verbose?
rescue Exception => error
$stderr.puts error
$stderr.puts "Couldn't create '#{db_config.database}' database. Please check your configuration."
raise
end
def create_all
old_pool = ActiveRecord::Base.connection_handler.retrieve_connection_pool(ActiveRecord::Base.connection_specification_name)
each_local_configuration { |db_config| create(db_config) }
if old_pool
ActiveRecord::Base.connection_handler.establish_connection(old_pool.db_config)
end
end
def setup_initial_database_yaml
return {} unless defined?(Rails)
begin
Rails.application.config.load_database_yaml
rescue
unless ActiveRecord.suppress_multiple_database_warning
$stderr.puts "Rails couldn't infer whether you are using multiple databases from your database.yml and can't generate the tasks for the non-primary databases. If you'd like to use this feature, please simplify your ERB."
end
{}
end
end
def for_each(databases)
return {} unless defined?(Rails)
database_configs = ActiveRecord::DatabaseConfigurations.new(databases).configs_for(env_name: Rails.env)
# if this is a single database application we don't want tasks for each primary database
return if database_configs.count == 1
database_configs.each do |db_config|
next unless db_config.database_tasks?
yield db_config.name
end
end
def raise_for_multi_db(environment = env, command:)
db_configs = configs_for(env_name: environment)
if db_configs.count > 1
dbs_list = []
db_configs.each do |db|
dbs_list << "#{command}:#{db.name}"
end
raise "You're using a multiple database application. To use `#{command}` you must run the namespaced task with a VERSION. Available tasks are #{dbs_list.to_sentence}."
end
end
def create_current(environment = env, name = nil)
each_current_configuration(environment, name) { |db_config| create(db_config) }
ActiveRecord::Base.establish_connection(environment.to_sym)
end
def prepare_all
seed = false
configs_for(env_name: env).each do |db_config|
ActiveRecord::Base.establish_connection(db_config)
# Skipped when no database
migrate
if ActiveRecord.dump_schema_after_migration
dump_schema(db_config, ActiveRecord.schema_format)
end
rescue ActiveRecord::NoDatabaseError
create_current(db_config.env_name, db_config.name)
if File.exist?(schema_dump_path(db_config))
load_schema(
db_config,
ActiveRecord.schema_format,
nil
)
else
migrate
end
seed = true
end
ActiveRecord::Base.establish_connection
load_seed if seed
end
def drop(configuration, *arguments)
db_config = resolve_configuration(configuration)
database_adapter_for(db_config, *arguments).drop
$stdout.puts "Dropped database '#{db_config.database}'" if verbose?
rescue ActiveRecord::NoDatabaseError
$stderr.puts "Database '#{db_config.database}' does not exist"
rescue Exception => error
$stderr.puts error
$stderr.puts "Couldn't drop database '#{db_config.database}'"
raise
end
def drop_all
each_local_configuration { |db_config| drop(db_config) }
end
def drop_current(environment = env)
each_current_configuration(environment) { |db_config| drop(db_config) }
end
def truncate_tables(db_config)
ActiveRecord::Base.establish_connection(db_config)
connection = ActiveRecord::Base.connection
connection.truncate_tables(*connection.tables)
end
private :truncate_tables
def truncate_all(environment = env)
configs_for(env_name: environment).each do |db_config|
truncate_tables(db_config)
end
end
def migrate(version = nil)
check_target_version
scope = ENV["SCOPE"]
verbose_was, Migration.verbose = Migration.verbose, verbose?
Base.connection.migration_context.migrate(target_version) do |migration|
if version.blank?
scope.blank? || scope == migration.scope
else
migration.version == version
end
end.tap do |migrations_ran|
Migration.write("No migrations ran. (using #{scope} scope)") if scope.present? && migrations_ran.empty?
end
ActiveRecord::Base.clear_cache!
ensure
Migration.verbose = verbose_was
end
def db_configs_with_versions(db_configs) # :nodoc:
db_configs_with_versions = Hash.new { |h, k| h[k] = [] }
db_configs.each do |db_config|
ActiveRecord::Base.establish_connection(db_config)
versions_to_run = ActiveRecord::Base.connection.migration_context.pending_migration_versions
target_version = ActiveRecord::Tasks::DatabaseTasks.target_version
versions_to_run.each do |version|
next if target_version && target_version != version
db_configs_with_versions[version] << db_config
end
end
db_configs_with_versions
end
def migrate_status
unless ActiveRecord::Base.connection.schema_migration.table_exists?
Kernel.abort "Schema migrations table does not exist yet."
end
# output
puts "\ndatabase: #{ActiveRecord::Base.connection_db_config.database}\n\n"
puts "#{'Status'.center(8)} #{'Migration ID'.ljust(14)} Migration Name"
puts "-" * 50
ActiveRecord::Base.connection.migration_context.migrations_status.each do |status, version, name|
puts "#{status.center(8)} #{version.ljust(14)} #{name}"
end
puts
end
def check_target_version
if target_version && !Migration.valid_version_format?(ENV["VERSION"])
raise "Invalid format of target version: `VERSION=#{ENV['VERSION']}`"
end
end
def target_version
ENV["VERSION"].to_i if ENV["VERSION"] && !ENV["VERSION"].empty?
end
def charset_current(env_name = env, db_name = name)
db_config = configs_for(env_name: env_name, name: db_name)
charset(db_config)
end
def charset(configuration, *arguments)
db_config = resolve_configuration(configuration)
database_adapter_for(db_config, *arguments).charset
end
def collation_current(env_name = env, db_name = name)
db_config = configs_for(env_name: env_name, name: db_name)
collation(db_config)
end
def collation(configuration, *arguments)
db_config = resolve_configuration(configuration)
database_adapter_for(db_config, *arguments).collation
end
def purge(configuration)
db_config = resolve_configuration(configuration)
database_adapter_for(db_config).purge
end
def purge_all
each_local_configuration { |db_config| purge(db_config) }
end
def purge_current(environment = env)
each_current_configuration(environment) { |db_config| purge(db_config) }
ActiveRecord::Base.establish_connection(environment.to_sym)
end
def structure_dump(configuration, *arguments)
db_config = resolve_configuration(configuration)
filename = arguments.delete_at(0)
flags = structure_dump_flags_for(db_config.adapter)
database_adapter_for(db_config, *arguments).structure_dump(filename, flags)
end
def structure_load(configuration, *arguments)
db_config = resolve_configuration(configuration)
filename = arguments.delete_at(0)
flags = structure_load_flags_for(db_config.adapter)
database_adapter_for(db_config, *arguments).structure_load(filename, flags)
end
def load_schema(db_config, format = ActiveRecord.schema_format, file = nil) # :nodoc:
file ||= schema_dump_path(db_config, format)
verbose_was, Migration.verbose = Migration.verbose, verbose? && ENV["VERBOSE"]
check_schema_file(file)
ActiveRecord::Base.establish_connection(db_config)
case format
when :ruby
load(file)
when :sql
structure_load(db_config, file)
else
raise ArgumentError, "unknown format #{format.inspect}"
end
ActiveRecord::InternalMetadata.create_table
ActiveRecord::InternalMetadata[:environment] = db_config.env_name
ActiveRecord::InternalMetadata[:schema_sha1] = schema_sha1(file)
ensure
Migration.verbose = verbose_was
end
def schema_up_to_date?(configuration, format = ActiveRecord.schema_format, file = nil)
db_config = resolve_configuration(configuration)
file ||= schema_dump_path(db_config)
return true unless File.exist?(file)
ActiveRecord::Base.establish_connection(db_config)
return false unless ActiveRecord::InternalMetadata.enabled?
return false unless ActiveRecord::InternalMetadata.table_exists?
ActiveRecord::InternalMetadata[:schema_sha1] == schema_sha1(file)
end
def reconstruct_from_schema(db_config, format = ActiveRecord.schema_format, file = nil) # :nodoc:
file ||= schema_dump_path(db_config, format)
check_schema_file(file)
ActiveRecord::Base.establish_connection(db_config)
if schema_up_to_date?(db_config, format, file)
truncate_tables(db_config)
else
purge(db_config)
load_schema(db_config, format, file)
end
rescue ActiveRecord::NoDatabaseError
create(db_config)
load_schema(db_config, format, file)
end
def dump_schema(db_config, format = ActiveRecord.schema_format) # :nodoc:
require "active_record/schema_dumper"
filename = schema_dump_path(db_config, format)
connection = ActiveRecord::Base.connection
FileUtils.mkdir_p(db_dir)
case format
when :ruby
File.open(filename, "w:utf-8") do |file|
ActiveRecord::SchemaDumper.dump(ActiveRecord::Base.connection, file)
end
when :sql
structure_dump(db_config, filename)
if connection.schema_migration.table_exists?
File.open(filename, "a") do |f|
f.puts connection.dump_schema_information
f.print "\n"
end
end
end
end
def schema_file_type(format = ActiveRecord.schema_format)
case format
when :ruby
"schema.rb"
when :sql
"structure.sql"
end
end
deprecate :schema_file_type
def schema_dump_path(db_config, format = ActiveRecord.schema_format)
return ENV["SCHEMA"] if ENV["SCHEMA"]
filename = db_config.schema_dump(format)
return unless filename
if File.dirname(filename) == ActiveRecord::Tasks::DatabaseTasks.db_dir
filename
else
File.join(ActiveRecord::Tasks::DatabaseTasks.db_dir, filename)
end
end
def cache_dump_filename(db_config_name, schema_cache_path: nil)
filename = if ActiveRecord::Base.configurations.primary?(db_config_name)
"schema_cache.yml"
else
"#{db_config_name}_schema_cache.yml"
end
schema_cache_path || ENV["SCHEMA_CACHE"] || File.join(ActiveRecord::Tasks::DatabaseTasks.db_dir, filename)
end
def load_schema_current(format = ActiveRecord.schema_format, file = nil, environment = env)
each_current_configuration(environment) do |db_config|
load_schema(db_config, format, file)
end
ActiveRecord::Base.establish_connection(environment.to_sym)
end
def check_schema_file(filename)
unless File.exist?(filename)
message = +%{#{filename} doesn't exist yet. Run `bin/rails db:migrate` to create it, then try again.}
message << %{ If you do not intend to use a database, you should instead alter #{Rails.root}/config/application.rb to limit the frameworks that will be loaded.} if defined?(::Rails.root)
Kernel.abort message
end
end
def load_seed
if seed_loader
seed_loader.load_seed
else
raise "You tried to load seed data, but no seed loader is specified. Please specify seed " \
"loader with ActiveRecord::Tasks::DatabaseTasks.seed_loader = your_seed_loader\n" \
"Seed loader should respond to load_seed method"
end
end
# Dumps the schema cache in YAML format for the connection into the file
#
# ==== Examples:
# ActiveRecord::Tasks::DatabaseTasks.dump_schema_cache(ActiveRecord::Base.connection, "tmp/schema_dump.yaml")
def dump_schema_cache(conn, filename)
conn.schema_cache.dump_to(filename)
end
def clear_schema_cache(filename)
FileUtils.rm_f filename, verbose: false
end
private
def configs_for(**options)
Base.configurations.configs_for(**options)
end
def resolve_configuration(configuration)
Base.configurations.resolve(configuration)
end
def verbose?
ENV["VERBOSE"] ? ENV["VERBOSE"] != "false" : true
end
# Create a new instance for the specified db configuration object.
# For classes that have been converted to use db_config objects, pass a
# `DatabaseConfig`; otherwise pass a `Hash`.
def database_adapter_for(db_config, *arguments)
klass = class_for_adapter(db_config.adapter)
converted = klass.respond_to?(:using_database_configurations?) && klass.using_database_configurations?
config = converted ? db_config : db_config.configuration_hash
klass.new(config, *arguments)
end
def class_for_adapter(adapter)
_key, task = @tasks.reverse_each.detect { |pattern, _task| adapter[pattern] }
unless task
raise DatabaseNotSupported, "Rake tasks not supported by '#{adapter}' adapter"
end
task.is_a?(String) ? task.constantize : task
end
def each_current_configuration(environment, name = nil)
environments = [environment]
environments << "test" if environment == "development" && !ENV["SKIP_TEST_DATABASE"] && !ENV["DATABASE_URL"]
environments.each do |env|
configs_for(env_name: env).each do |db_config|
next if name && name != db_config.name
yield db_config
end
end
end
def each_local_configuration
configs_for.each do |db_config|
next unless db_config.database
if local_database?(db_config)
yield db_config
else
$stderr.puts "This task only modifies local databases. #{db_config.database} is on a remote host."
end
end
end
def local_database?(db_config)
host = db_config.host
host.blank? || LOCAL_HOSTS.include?(host)
end
def schema_sha1(file)
OpenSSL::Digest::SHA1.hexdigest(File.read(file))
end
def structure_dump_flags_for(adapter)
if structure_dump_flags.is_a?(Hash)
structure_dump_flags[adapter.to_sym]
else
structure_dump_flags
end
end
def structure_load_flags_for(adapter)
if structure_load_flags.is_a?(Hash)
structure_load_flags[adapter.to_sym]
else
structure_load_flags
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Type
class Date < ActiveModel::Type::Date
include Internal::Timezone
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Type
class DateTime < ActiveModel::Type::DateTime
include Internal::Timezone
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Decimal < Type::Decimal # :nodoc:
def infinity(options = {})
BigDecimal("Infinity") * (options[:negative] ? -1 : 1)
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Type
class DecimalWithoutScale < ActiveModel::Type::BigInteger # :nodoc:
def type
:decimal
end
def type_cast_for_schema(value)
value.to_s.inspect
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters # :nodoc:
module Deduplicable
extend ActiveSupport::Concern
module ClassMethods
def registry
@registry ||= {}
end
def new(*, **)
super.deduplicate
end
end
def deduplicate
self.class.registry[self] ||= deduplicated
end
alias :-@ :deduplicate
private
def deduplicated
freeze
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Scoping
class DefaultScope # :nodoc:
attr_reader :scope, :all_queries
def initialize(scope, all_queries = nil)
@scope = scope
@all_queries = all_queries
end
end
module Default
extend ActiveSupport::Concern
included do
# Stores the default scope for the class.
class_attribute :default_scopes, instance_writer: false, instance_predicate: false, default: []
class_attribute :default_scope_override, instance_writer: false, instance_predicate: false, default: nil
end
module ClassMethods
# Returns a scope for the model without the previously set scopes.
#
# class Post < ActiveRecord::Base
# def self.default_scope
# where(published: true)
# end
# end
#
# Post.all # Fires "SELECT * FROM posts WHERE published = true"
# Post.unscoped.all # Fires "SELECT * FROM posts"
# Post.where(published: false).unscoped.all # Fires "SELECT * FROM posts"
#
# This method also accepts a block. All queries inside the block will
# not use the previously set scopes.
#
# Post.unscoped {
# Post.limit(10) # Fires "SELECT * FROM posts LIMIT 10"
# }
def unscoped(&block)
block_given? ? relation.scoping(&block) : relation
end
# Are there attributes associated with this scope?
def scope_attributes? # :nodoc:
super || default_scopes.any? || respond_to?(:default_scope)
end
def before_remove_const # :nodoc:
self.current_scope = nil
end
# Checks if the model has any default scopes. If all_queries
# is set to true, the method will check if there are any
# default_scopes for the model where +all_queries+ is true.
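#
# Illustrative checks (assuming the Article model from the examples below):
#
# Article.default_scopes?                    # => true if any default scope is defined
# Article.default_scopes?(all_queries: true) # => true only if one was declared with all_queries: true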
def default_scopes?(all_queries: false)
if all_queries
self.default_scopes.any?(&:all_queries)
else
self.default_scopes.any?
end
end
private
# Use this macro in your model to set a default scope for all operations on
# the model.
#
# class Article < ActiveRecord::Base
# default_scope { where(published: true) }
# end
#
# Article.all # => SELECT * FROM articles WHERE published = true
#
# The #default_scope is also applied while creating/building a record.
# It is not applied while updating or deleting a record.
#
# Article.new.published # => true
# Article.create.published # => true
#
# To apply a #default_scope when updating or deleting a record, add
# <tt>all_queries: true</tt>:
#
# class Article < ActiveRecord::Base
# default_scope -> { where(blog_id: 1) }, all_queries: true
# end
#
# Applying a default scope to all queries will ensure that records
# are always queried by the additional conditions. Note that only
# where clauses apply, as it does not make sense to add order to
# queries that return a single object by primary key.
#
# Article.find(1).destroy
# => DELETE ... FROM `articles` where ID = 1 AND blog_id = 1;
#
# (You can also pass any object which responds to +call+ to the
# +default_scope+ macro, and it will be called when building the
# default scope.)
#
# If you use multiple #default_scope declarations in your model then
# they will be merged together:
#
# class Article < ActiveRecord::Base
# default_scope { where(published: true) }
# default_scope { where(rating: 'G') }
# end
#
# Article.all # => SELECT * FROM articles WHERE published = true AND rating = 'G'
#
# This is also the case with inheritance and module includes where the
# parent or module defines a #default_scope and the child or including
# class defines a second one.
#
# If you need to do more complex things with a default scope, you can
# alternatively define it as a class method:
#
# class Article < ActiveRecord::Base
# def self.default_scope
# # Should return a scope, you can call 'super' here etc.
# end
# end
def default_scope(scope = nil, all_queries: nil, &block) # :doc:
scope = block if block_given?
if scope.is_a?(Relation) || !scope.respond_to?(:call)
raise ArgumentError,
"Support for calling #default_scope without a block is removed. For example instead " \
"of `default_scope where(color: 'red')`, please use " \
"`default_scope { where(color: 'red') }`. (Alternatively you can just redefine " \
"self.default_scope.)"
end
default_scope = DefaultScope.new(scope, all_queries)
self.default_scopes += [default_scope]
end
def build_default_scope(relation = relation(), all_queries: nil)
return if abstract_class?
if default_scope_override.nil?
self.default_scope_override = !Base.is_a?(method(:default_scope).owner)
end
if default_scope_override
# The user has defined their own default scope method, so call that
evaluate_default_scope do
relation.scoping { default_scope }
end
elsif default_scopes.any?
evaluate_default_scope do
default_scopes.inject(relation) do |default_scope, scope_obj|
if execute_scope?(all_queries, scope_obj)
scope = scope_obj.scope.respond_to?(:to_proc) ? scope_obj.scope : scope_obj.scope.method(:call)
default_scope.instance_exec(&scope) || default_scope
end
end
end
end
end
# If all_queries is nil, the scope is only applied to select and insert
# queries.
#
# If all_queries is true, the scope is applied only when the default_scope
# object was itself declared with all_queries, in which case it runs on all
# queries: select, insert, update and delete.
def execute_scope?(all_queries, default_scope_obj)
all_queries.nil? || all_queries && default_scope_obj.all_queries
end
def ignore_default_scope?
ScopeRegistry.ignore_default_scope(base_class)
end
def ignore_default_scope=(ignore)
ScopeRegistry.set_ignore_default_scope(base_class, ignore)
end
# The ignore_default_scope flag is used to prevent an infinite recursion
# situation where a default scope references a scope which has a default
# scope which references a scope...
def evaluate_default_scope
return if ignore_default_scope?
begin
self.ignore_default_scope = true
yield
ensure
self.ignore_default_scope = false
end
end
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/string/inquiry"
module ActiveRecord
# == Delegated types
#
# Class hierarchies can map to relational database tables in many ways. Active Record, for example, offers
# purely abstract classes, where the superclass doesn't persist any attributes, and single-table inheritance,
# where all attributes from all levels of the hierarchy are represented in a single table. Both have their
# places, but neither are without their drawbacks.
#
# The problem with purely abstract classes is that all concrete subclasses must persist all the shared
# attributes themselves in their own tables (also known as class-table inheritance). This makes it hard to
# do queries across the hierarchy. For example, imagine you have the following hierarchy:
#
# Entry < ApplicationRecord
# Message < Entry
# Comment < Entry
#
# How do you show a feed that has both +Message+ and +Comment+ records, which can be easily paginated?
# Well, you can't! Messages are backed by a messages table and comments by a comments table. You can't
# pull from both tables at once and use a consistent OFFSET/LIMIT scheme.
#
# You can get around the pagination problem by using single-table inheritance, but now you're forced into
# a single mega table with all the attributes from all subclasses. No matter how divergent. If a Message
# has a subject, but the comment does not, well, now the comment does anyway! So STI works best when there's
# little divergence between the subclasses and their attributes.
#
# But there's a third way: Delegated types. With this approach, the "superclass" is a concrete class
# that is represented by its own table, where all the superclass attributes that are shared amongst all the
# "subclasses" are stored. And then each of the subclasses have their own individual tables for additional
# attributes that are particular to their implementation. This is similar to what's called multi-table
# inheritance in Django, but instead of actual inheritance, this approach uses delegation to form the
# hierarchy and share responsibilities.
#
# Let's look at that entry/message/comment example using delegated types:
#
# # Schema: entries[ id, account_id, creator_id, created_at, updated_at, entryable_type, entryable_id ]
# class Entry < ApplicationRecord
# belongs_to :account
# belongs_to :creator
# delegated_type :entryable, types: %w[ Message Comment ]
# end
#
# module Entryable
# extend ActiveSupport::Concern
#
# included do
# has_one :entry, as: :entryable, touch: true
# end
# end
#
# # Schema: messages[ id, subject, body ]
# class Message < ApplicationRecord
# include Entryable
# end
#
# # Schema: comments[ id, content ]
# class Comment < ApplicationRecord
# include Entryable
# end
#
# As you can see, neither +Message+ nor +Comment+ is meant to stand alone. Crucial metadata for both classes
# resides in the +Entry+ "superclass". But +Entry+ itself absolutely can stand alone, in particular in terms
# of querying capacity. You can now easily do things like:
#
# Account.find(1).entries.order(created_at: :desc).limit(50)
#
# Which is exactly what you want when displaying both comments and messages together. The entry itself can
# be rendered as its delegated type easily, like so:
#
# # entries/_entry.html.erb
# <%= render "entries/entryables/#{entry.entryable_name}", entry: entry %>
#
# # entries/entryables/_message.html.erb
# <div class="message">
# <div class="subject"><%= entry.message.subject %></div>
# <p><%= entry.message.body %></p>
# <i>Posted on <%= entry.created_at %> by <%= entry.creator.name %></i>
# </div>
#
# # entries/entryables/_comment.html.erb
# <div class="comment">
# <%= entry.creator.name %> said: <%= entry.comment.content %>
# </div>
#
# == Sharing behavior with concerns and controllers
#
# The entry "superclass" also serves as a perfect place to put all that shared logic that applies to both
# messages and comments, and which acts primarily on the shared attributes. Imagine:
#
# class Entry < ApplicationRecord
# include Eventable, Forwardable, Redeliverable
# end
#
# Which allows you to have controllers for things like +ForwardsController+ and +RedeliverableController+
# that both act on entries, and thus provide the shared functionality to both messages and comments.
#
# == Creating new records
#
# You create a new record that uses delegated typing by creating the delegator and delegatee at the same time,
# like so:
#
# Entry.create! entryable: Comment.new(content: "Hello!"), creator: Current.user
#
# If you need more complicated composition, or you need to perform dependent validation, you should build a factory
# method or class to take care of the complicated needs. This could be as simple as:
#
# class Entry < ApplicationRecord
# def self.create_with_comment(content, creator: Current.user)
# create! entryable: Comment.new(content: content), creator: creator
# end
# end
#
# == Adding further delegation
#
# The delegated type shouldn't just answer the question of what the underlying class is called. In fact, that's
# an anti-pattern most of the time. The reason you're building this hierarchy is to take advantage of polymorphism.
# So here's a simple example of that:
#
# class Entry < ApplicationRecord
# delegated_type :entryable, types: %w[ Message Comment ]
# delegate :title, to: :entryable
# end
#
# class Message < ApplicationRecord
# def title
# subject
# end
# end
#
# class Comment < ApplicationRecord
# def title
# content.truncate(20)
# end
# end
#
# Now you can list a bunch of entries, call +Entry#title+, and polymorphism will provide you with the answer.
#
# == Nested Attributes
#
# Enabling nested attributes on a delegated_type association allows you to
# create the entry and message in one go:
#
# class Entry < ApplicationRecord
# delegated_type :entryable, types: %w[ Message Comment ]
# accepts_nested_attributes_for :entryable
# end
#
# params = { entry: { entryable_type: 'Message', entryable_attributes: { subject: 'Smiling' } } }
# entry = Entry.create(params[:entry])
# entry.entryable.id # => 2
# entry.entryable.subject # => 'Smiling'
module DelegatedType
# Defines this as a class that'll delegate its type for the passed +role+ to the classes referenced in +types+.
# That'll create a polymorphic +belongs_to+ relationship to that +role+, and it'll add all the delegated
# type convenience methods:
#
# class Entry < ApplicationRecord
# delegated_type :entryable, types: %w[ Message Comment ], dependent: :destroy
# end
#
# Entry#entryable_class # => +Message+ or +Comment+
# Entry#entryable_name # => "message" or "comment"
# Entry.messages # => Entry.where(entryable_type: "Message")
# Entry#message? # => true when entryable_type == "Message"
# Entry#message # => returns the message record, when entryable_type == "Message", otherwise nil
# Entry#message_id # => returns entryable_id, when entryable_type == "Message", otherwise nil
# Entry.comments # => Entry.where(entryable_type: "Comment")
# Entry#comment? # => true when entryable_type == "Comment"
# Entry#comment # => returns the comment record, when entryable_type == "Comment", otherwise nil
# Entry#comment_id # => returns entryable_id, when entryable_type == "Comment", otherwise nil
#
# You can also declare namespaced types:
#
# class Entry < ApplicationRecord
# delegated_type :entryable, types: %w[ Message Comment Access::NoticeMessage ], dependent: :destroy
# end
#
# Entry.access_notice_messages
# entry.access_notice_message
# entry.access_notice_message?
#
# === Options
#
# The +options+ are passed directly to the +belongs_to+ call, so this is where you declare +dependent+ etc.
# The following options can be included to specialize the behavior of the delegated type convenience methods.
#
# [:foreign_key]
# Specify the foreign key used for the convenience methods. By default this is guessed to be the passed
# +role+ with an "_id" suffix. So a class that defines a
# <tt>delegated_type :entryable, types: %w[ Message Comment ]</tt> association will use "entryable_id" as
# the default <tt>:foreign_key</tt>.
# [:primary_key]
# Specify the method that returns the primary key of associated object used for the convenience methods.
# By default this is +id+.
#
# Option examples:
# class Entry < ApplicationRecord
# delegated_type :entryable, types: %w[ Message Comment ], primary_key: :uuid, foreign_key: :entryable_uuid
# end
#
# Entry#message_uuid # => returns entryable_uuid, when entryable_type == "Message", otherwise nil
# Entry#comment_uuid # => returns entryable_uuid, when entryable_type == "Comment", otherwise nil
def delegated_type(role, types:, **options)
belongs_to role, options.delete(:scope), **options.merge(polymorphic: true)
define_delegated_type_methods role, types: types, options: options
end
private
def define_delegated_type_methods(role, types:, options:)
primary_key = options[:primary_key] || "id"
role_type = "#{role}_type"
role_id = options[:foreign_key] || "#{role}_id"
define_method "#{role}_class" do
public_send("#{role}_type").constantize
end
define_method "#{role}_name" do
public_send("#{role}_class").model_name.singular.inquiry
end
define_method "build_#{role}" do |*params|
public_send("#{role}=", public_send("#{role}_class").new(*params))
end
types.each do |type|
scope_name = type.tableize.tr("/", "_")
singular = scope_name.singularize
query = "#{singular}?"
scope scope_name, -> { where(role_type => type) }
define_method query do
public_send(role_type) == type
end
define_method singular do
public_send(role) if public_send(query)
end
define_method "#{singular}_#{primary_key}" do
public_send(role_id) if public_send(query)
end
end
end
end
end
# frozen_string_literal: true
require "mutex_m"
require "active_support/core_ext/module/delegation"
module ActiveRecord
module Delegation # :nodoc:
module DelegateCache # :nodoc:
def relation_delegate_class(klass)
@relation_delegate_cache[klass]
end
def initialize_relation_delegate_cache
@relation_delegate_cache = cache = {}
[
ActiveRecord::Relation,
ActiveRecord::Associations::CollectionProxy,
ActiveRecord::AssociationRelation,
ActiveRecord::DisableJoinsAssociationRelation
].each do |klass|
delegate = Class.new(klass) {
include ClassSpecificRelation
}
include_relation_methods(delegate)
mangled_name = klass.name.gsub("::", "_")
const_set mangled_name, delegate
private_constant mangled_name
cache[klass] = delegate
end
end
def inherited(child_class)
child_class.initialize_relation_delegate_cache
super
end
def generate_relation_method(method)
generated_relation_methods.generate_method(method)
end
protected
def include_relation_methods(delegate)
superclass.include_relation_methods(delegate) unless base_class?
delegate.include generated_relation_methods
end
private
def generated_relation_methods
@generated_relation_methods ||= GeneratedRelationMethods.new.tap do |mod|
const_set(:GeneratedRelationMethods, mod)
private_constant :GeneratedRelationMethods
end
end
end
class GeneratedRelationMethods < Module # :nodoc:
include Mutex_m
def generate_method(method)
synchronize do
return if method_defined?(method)
if /\A[a-zA-Z_]\w*[!?]?\z/.match?(method) && !DELEGATION_RESERVED_METHOD_NAMES.include?(method.to_s)
module_eval <<-RUBY, __FILE__, __LINE__ + 1
def #{method}(...)
scoping { klass.#{method}(...) }
end
RUBY
else
define_method(method) do |*args, &block|
scoping { klass.public_send(method, *args, &block) }
end
ruby2_keywords(method)
end
end
end
end
private_constant :GeneratedRelationMethods
extend ActiveSupport::Concern
# This module creates compiled delegation methods dynamically at runtime, which makes
# subsequent calls to that method faster by avoiding method_missing. The delegations
# may vary depending on the klass of a relation, so we create a subclass of Relation
# for each different klass, and the delegations are compiled into that subclass only.
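# For example, with a hypothetical Post model that defines a +published+
# class method returning a relation, the first call to +Post.all.published+
# goes through +method_missing+ in ClassSpecificRelation, which compiles a
# +published+ delegator onto Post's relation subclass; subsequent calls
# dispatch directly to that compiled method.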
delegate :to_xml, :encode_with, :length, :each, :join,
:[], :&, :|, :+, :-, :sample, :reverse, :rotate, :compact, :in_groups, :in_groups_of,
:to_sentence, :to_fs, :to_formatted_s, :as_json,
:shuffle, :split, :slice, :index, :rindex, to: :records
delegate :primary_key, :connection, to: :klass
module ClassSpecificRelation # :nodoc:
extend ActiveSupport::Concern
module ClassMethods # :nodoc:
def name
superclass.name
end
end
private
def method_missing(method, *args, &block)
if @klass.respond_to?(method)
@klass.generate_relation_method(method)
scoping { @klass.public_send(method, *args, &block) }
else
super
end
end
ruby2_keywords(:method_missing)
end
module ClassMethods # :nodoc:
def create(klass, *args, **kwargs)
relation_class_for(klass).new(klass, *args, **kwargs)
end
private
def relation_class_for(klass)
klass.relation_delegate_class(self)
end
end
private
def respond_to_missing?(method, _)
super || @klass.respond_to?(method)
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
class DeleteManager < Arel::TreeManager
include TreeManager::StatementMethods
def initialize(table = nil)
@ast = Nodes::DeleteStatement.new(table)
end
def from(relation)
@ast.relation = relation
self
end
def group(columns)
columns.each do |column|
column = Nodes::SqlLiteral.new(column) if String === column
column = Nodes::SqlLiteral.new(column.to_s) if Symbol === column
@ast.groups.push Nodes::Group.new column
end
self
end
def having(expr)
@ast.havings << expr
self
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class DeleteStatement < Arel::Nodes::Node
attr_accessor :relation, :wheres, :groups, :havings, :orders, :limit, :offset, :key
def initialize(relation = nil, wheres = [])
super()
@relation = relation
@wheres = wheres
@groups = []
@havings = []
@orders = []
@limit = nil
@offset = nil
@key = nil
end
def initialize_copy(other)
super
@relation = @relation.clone if @relation
@wheres = @wheres.clone if @wheres
end
def hash
[self.class, @relation, @wheres, @orders, @limit, @offset, @key].hash
end
def eql?(other)
self.class == other.class &&
self.relation == other.relation &&
self.wheres == other.wheres &&
self.orders == other.orders &&
self.groups == other.groups &&
self.havings == other.havings &&
self.limit == other.limit &&
self.offset == other.offset &&
self.key == other.key
end
alias :== :eql?
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# A KeyProvider that derives keys from passwords.
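#
# Illustrative use (the password strings are hypothetical):
#
# ActiveRecord::Encryption::DerivedSecretKeyProvider.new(["old-secret", "current-secret"])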
class DerivedSecretKeyProvider < KeyProvider
def initialize(passwords)
super(Array(passwords).collect { |password| Key.derive_from(password) })
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Descending < Ordering
def reverse
Ascending.new(expr)
end
def direction
:desc
end
def ascending?
false
end
def descending?
true
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class DestroyAssociationAsyncError < StandardError
end
# Job to destroy the records associated with a destroyed record in the background.
class DestroyAssociationAsyncJob < ActiveJob::Base
queue_as { ActiveRecord.queues[:destroy] }
discard_on ActiveJob::DeserializationError
def perform(
owner_model_name: nil, owner_id: nil,
association_class: nil, association_ids: nil, association_primary_key_column: nil,
ensuring_owner_was_method: nil
)
association_model = association_class.constantize
owner_class = owner_model_name.constantize
owner = owner_class.find_by(owner_class.primary_key.to_sym => owner_id)
if !owner_destroyed?(owner, ensuring_owner_was_method)
raise DestroyAssociationAsyncError, "owner record not destroyed"
end
association_model.where(association_primary_key_column => association_ids).find_each do |r|
r.destroy
end
end
private
def owner_destroyed?(owner, ensuring_owner_was_method)
!owner || (ensuring_owner_was_method && owner.public_send(ensuring_owner_was_method))
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# A DerivedSecretKeyProvider that accepts a single password. Deterministic
# encryption keys can't be rotated, so only one password is allowed.
class DeterministicKeyProvider < DerivedSecretKeyProvider
def initialize(password)
passwords = Array(password)
raise ActiveRecord::Encryption::Errors::Configuration, "Deterministic encryption keys can't be rotated" if passwords.length > 1
super(passwords)
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/module/attribute_accessors"
module ActiveRecord
module AttributeMethods
module Dirty
extend ActiveSupport::Concern
include ActiveModel::Dirty
included do
if self < ::ActiveRecord::Timestamp
raise "You cannot include Dirty after Timestamp"
end
class_attribute :partial_updates, instance_writer: false, default: true
class_attribute :partial_inserts, instance_writer: false, default: true
# Attribute methods for "changed in last call to save?"
attribute_method_affix(prefix: "saved_change_to_", suffix: "?", parameters: "**options")
attribute_method_prefix("saved_change_to_", parameters: false)
attribute_method_suffix("_before_last_save", parameters: false)
# Attribute methods for "will change if I call save?"
attribute_method_affix(prefix: "will_save_change_to_", suffix: "?", parameters: "**options")
attribute_method_suffix("_change_to_be_saved", "_in_database", parameters: false)
end
module ClassMethods
def partial_writes
ActiveSupport::Deprecation.warn(<<-MSG.squish)
ActiveRecord::Base.partial_writes is deprecated and will be removed in Rails 7.1.
Use `partial_updates` and `partial_inserts` instead.
MSG
partial_updates && partial_inserts
end
def partial_writes?
ActiveSupport::Deprecation.warn(<<-MSG.squish)
`ActiveRecord::Base.partial_writes?` is deprecated and will be removed in Rails 7.1.
Use `partial_updates?` and `partial_inserts?` instead.
MSG
partial_updates? && partial_inserts?
end
def partial_writes=(value)
ActiveSupport::Deprecation.warn(<<-MSG.squish)
`ActiveRecord::Base.partial_writes=` is deprecated and will be removed in Rails 7.1.
Use `partial_updates=` and `partial_inserts=` instead.
MSG
self.partial_updates = self.partial_inserts = value
end
end
# <tt>reload</tt>s the record and clears changed attributes.
def reload(*)
super.tap do
@mutations_before_last_save = nil
@mutations_from_database = nil
end
end
# Did this attribute change when we last saved?
#
# This method is useful in after callbacks to determine if an attribute
# was changed during the save that triggered the callbacks to run. It can
# be invoked as +saved_change_to_name?+ instead of
# <tt>saved_change_to_attribute?("name")</tt>.
#
# ==== Options
#
# +from+ When passed, this method will return false unless the original
# value is equal to the given option
#
# +to+ When passed, this method will return false unless the value was
# changed to the given value
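#
# A minimal usage sketch (hypothetical +person+ record with a +name+ attribute):
#
#   person.update!(name: "Alice")
#   person.saved_change_to_name?                # => true
#   person.saved_change_to_name?(from: "Alice") # => false, unless the previous value was also "Alice"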
def saved_change_to_attribute?(attr_name, **options)
mutations_before_last_save.changed?(attr_name.to_s, **options)
end
# Returns the change to an attribute during the last save. If the
# attribute was changed, the result will be an array containing the
# original value and the saved value.
#
# This method is useful in after callbacks, to see the change in an
# attribute during the save that triggered the callbacks to run. It can be
# invoked as +saved_change_to_name+ instead of
# <tt>saved_change_to_attribute("name")</tt>.
def saved_change_to_attribute(attr_name)
mutations_before_last_save.change_to_attribute(attr_name.to_s)
end
# Returns the original value of an attribute before the last save.
#
# This method is useful in after callbacks to get the original value of an
# attribute before the save that triggered the callbacks to run. It can be
# invoked as +name_before_last_save+ instead of
# <tt>attribute_before_last_save("name")</tt>.
def attribute_before_last_save(attr_name)
mutations_before_last_save.original_value(attr_name.to_s)
end
# Did the last call to +save+ persist any changes?
def saved_changes?
mutations_before_last_save.any_changes?
end
# Returns a hash containing all the changes that were just saved.
def saved_changes
mutations_before_last_save.changes
end
# Will this attribute change the next time we save?
#
# This method is useful in validations and before callbacks to determine
# if the next call to +save+ will change a particular attribute. It can be
# invoked as +will_save_change_to_name?+ instead of
# <tt>will_save_change_to_attribute?("name")</tt>.
#
# ==== Options
#
# +from+ When passed, this method will return false unless the original
# value is equal to the given option
#
# +to+ When passed, this method will return false unless the value will be
# changed to the given value
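#
# A minimal usage sketch (hypothetical +person+ record whose stored +name+ is not "Alice"):
#
#   person.name = "Alice"
#   person.will_save_change_to_name?              # => true
#   person.will_save_change_to_name?(to: "Alice") # => true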
def will_save_change_to_attribute?(attr_name, **options)
mutations_from_database.changed?(attr_name.to_s, **options)
end
# Returns the change to an attribute that will be persisted during the
# next save.
#
# This method is useful in validations and before callbacks, to see the
# change to an attribute that will occur when the record is saved. It can
# be invoked as +name_change_to_be_saved+ instead of
# <tt>attribute_change_to_be_saved("name")</tt>.
#
# If the attribute will change, the result will be an array containing the
# original value and the new value about to be saved.
def attribute_change_to_be_saved(attr_name)
mutations_from_database.change_to_attribute(attr_name.to_s)
end
# Returns the value of an attribute in the database, as opposed to the
# in-memory value that will be persisted the next time the record is
# saved.
#
# This method is useful in validations and before callbacks, to see the
# original value of an attribute prior to any changes about to be
# saved. It can be invoked as +name_in_database+ instead of
# <tt>attribute_in_database("name")</tt>.
def attribute_in_database(attr_name)
mutations_from_database.original_value(attr_name.to_s)
end
# Will the next call to +save+ have any changes to persist?
def has_changes_to_save?
mutations_from_database.any_changes?
end
# Returns a hash containing all the changes that will be persisted during
# the next save.
def changes_to_save
mutations_from_database.changes
end
# Returns an array of the names of any attributes that will change when
# the record is next saved.
def changed_attribute_names_to_save
mutations_from_database.changed_attribute_names
end
# Returns a hash of the attributes that will change when the record is
# next saved.
#
# The hash keys are the attribute names, and the hash values are the
# original attribute values in the database (as opposed to the in-memory
# values about to be saved).
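#
# For example (hypothetical +person+ record whose stored +name+ is "Bob"):
#
#   person.name = "Alice"
#   person.attributes_in_database # => { "name" => "Bob" }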
def attributes_in_database
mutations_from_database.changed_values
end
private
def _touch_row(attribute_names, time)
@_touch_attr_names = Set.new(attribute_names)
affected_rows = super
if @_skip_dirty_tracking ||= false
clear_attribute_changes(@_touch_attr_names)
return affected_rows
end
changes = {}
@attributes.keys.each do |attr_name|
next if @_touch_attr_names.include?(attr_name)
if attribute_changed?(attr_name)
changes[attr_name] = _read_attribute(attr_name)
_write_attribute(attr_name, attribute_was(attr_name))
clear_attribute_change(attr_name)
end
end
changes_applied
changes.each { |attr_name, value| _write_attribute(attr_name, value) }
affected_rows
ensure
@_touch_attr_names, @_skip_dirty_tracking = nil, nil
end
def _update_record(attribute_names = attribute_names_for_partial_updates)
affected_rows = super
changes_applied
affected_rows
end
def _create_record(attribute_names = attribute_names_for_partial_inserts)
id = super
changes_applied
id
end
def attribute_names_for_partial_updates
partial_updates? ? changed_attribute_names_to_save : attribute_names
end
def attribute_names_for_partial_inserts
if partial_inserts?
changed_attribute_names_to_save
else
attribute_names.reject do |attr_name|
if column_for_attribute(attr_name).default_function
!attribute_changed?(attr_name)
end
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class DisableJoinsAssociationRelation < Relation # :nodoc:
attr_reader :ids, :key
def initialize(klass, key, ids)
@ids = ids.uniq
@key = key
super(klass)
end
def limit(value)
records.take(value)
end
def first(limit = nil)
if limit
records.limit(limit).first
else
records.first
end
end
def load
super
records = @records
records_by_id = records.group_by do |record|
record[key]
end
records = ids.flat_map { |id| records_by_id[id.to_i] }
records.compact!
@records = records
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
class DisableJoinsAssociationScope < AssociationScope # :nodoc:
def scope(association)
source_reflection = association.reflection
owner = association.owner
unscoped = association.klass.unscoped
reverse_chain = get_chain(source_reflection, association, unscoped.alias_tracker).reverse
last_reflection, last_ordered, last_join_ids = last_scope_chain(reverse_chain, owner)
add_constraints(last_reflection, last_reflection.join_primary_key, last_join_ids, owner, last_ordered)
end
private
def last_scope_chain(reverse_chain, owner)
first_item = reverse_chain.shift
first_scope = [first_item, false, [owner._read_attribute(first_item.join_foreign_key)]]
reverse_chain.inject(first_scope) do |(reflection, ordered, join_ids), next_reflection|
key = reflection.join_primary_key
records = add_constraints(reflection, key, join_ids, owner, ordered)
foreign_key = next_reflection.join_foreign_key
record_ids = records.pluck(foreign_key)
records_ordered = records && records.order_values.any?
[next_reflection, records_ordered, record_ids]
end
end
def add_constraints(reflection, key, join_ids, owner, ordered)
scope = reflection.build_scope(reflection.aliased_table).where(key => join_ids)
relation = reflection.klass.scope_for_association
scope.merge!(
relation.except(:select, :create_with, :includes, :preload, :eager_load, :joins, :left_outer_joins)
)
scope = reflection.constraints.inject(scope) do |memo, scope_chain_item|
item = eval_scope(reflection, scope_chain_item, owner)
scope.unscope!(*item.unscope_values)
scope.where_clause += item.where_clause
scope.order_values = item.order_values | scope.order_values
scope
end
if scope.order_values.empty? && ordered
split_scope = DisableJoinsAssociationRelation.create(scope.klass, key, join_ids)
split_scope.where_clause += scope.where_clause
split_scope
else
scope
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Visitors
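# Renders an Arel AST as a Graphviz "dot" graph, typically reached through
# Arel::TreeManager#to_dot. A rough sketch of direct usage (assuming +manager+
# is an Arel::SelectManager):
#
#   collector = Arel::Collectors::PlainString.new
#   puts Arel::Visitors::Dot.new.accept(manager.ast, collector).value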
class Dot < Arel::Visitors::Visitor
class Node # :nodoc:
attr_accessor :name, :id, :fields
def initialize(name, id, fields = [])
@name = name
@id = id
@fields = fields
end
end
class Edge < Struct.new :name, :from, :to # :nodoc:
end
def initialize
super()
@nodes = []
@edges = []
@node_stack = []
@edge_stack = []
@seen = {}
end
def accept(object, collector)
visit object
collector << to_dot
end
private
def visit_Arel_Nodes_Function(o)
visit_edge o, "expressions"
visit_edge o, "distinct"
visit_edge o, "alias"
end
def visit_Arel_Nodes_Unary(o)
visit_edge o, "expr"
end
def visit_Arel_Nodes_Binary(o)
visit_edge o, "left"
visit_edge o, "right"
end
def visit_Arel_Nodes_UnaryOperation(o)
visit_edge o, "operator"
visit_edge o, "expr"
end
def visit_Arel_Nodes_InfixOperation(o)
visit_edge o, "operator"
visit_edge o, "left"
visit_edge o, "right"
end
def visit__regexp(o)
visit_edge o, "left"
visit_edge o, "right"
visit_edge o, "case_sensitive"
end
alias :visit_Arel_Nodes_Regexp :visit__regexp
alias :visit_Arel_Nodes_NotRegexp :visit__regexp
def visit_Arel_Nodes_Ordering(o)
visit_edge o, "expr"
end
def visit_Arel_Nodes_TableAlias(o)
visit_edge o, "name"
visit_edge o, "relation"
end
def visit_Arel_Nodes_Count(o)
visit_edge o, "expressions"
visit_edge o, "distinct"
end
def visit_Arel_Nodes_ValuesList(o)
visit_edge o, "rows"
end
def visit_Arel_Nodes_StringJoin(o)
visit_edge o, "left"
end
def visit_Arel_Nodes_Window(o)
visit_edge o, "partitions"
visit_edge o, "orders"
visit_edge o, "framing"
end
def visit_Arel_Nodes_NamedWindow(o)
visit_edge o, "partitions"
visit_edge o, "orders"
visit_edge o, "framing"
visit_edge o, "name"
end
def visit__no_edges(o)
# intentionally left blank
end
alias :visit_Arel_Nodes_CurrentRow :visit__no_edges
alias :visit_Arel_Nodes_Distinct :visit__no_edges
def visit_Arel_Nodes_Extract(o)
visit_edge o, "expressions"
visit_edge o, "alias"
end
def visit_Arel_Nodes_NamedFunction(o)
visit_edge o, "name"
visit_edge o, "expressions"
visit_edge o, "distinct"
visit_edge o, "alias"
end
def visit_Arel_Nodes_InsertStatement(o)
visit_edge o, "relation"
visit_edge o, "columns"
visit_edge o, "values"
visit_edge o, "select"
end
def visit_Arel_Nodes_SelectCore(o)
visit_edge o, "source"
visit_edge o, "projections"
visit_edge o, "wheres"
visit_edge o, "windows"
visit_edge o, "groups"
visit_edge o, "comment"
visit_edge o, "havings"
visit_edge o, "set_quantifier"
visit_edge o, "optimizer_hints"
end
def visit_Arel_Nodes_SelectStatement(o)
visit_edge o, "cores"
visit_edge o, "limit"
visit_edge o, "orders"
visit_edge o, "offset"
visit_edge o, "lock"
visit_edge o, "with"
end
def visit_Arel_Nodes_UpdateStatement(o)
visit_edge o, "relation"
visit_edge o, "wheres"
visit_edge o, "values"
visit_edge o, "orders"
visit_edge o, "limit"
visit_edge o, "offset"
visit_edge o, "key"
end
def visit_Arel_Nodes_DeleteStatement(o)
visit_edge o, "relation"
visit_edge o, "wheres"
visit_edge o, "orders"
visit_edge o, "limit"
visit_edge o, "offset"
visit_edge o, "key"
end
def visit_Arel_Table(o)
visit_edge o, "name"
end
def visit_Arel_Nodes_Casted(o)
visit_edge o, "value"
visit_edge o, "attribute"
end
def visit_Arel_Nodes_HomogeneousIn(o)
visit_edge o, "values"
visit_edge o, "type"
visit_edge o, "attribute"
end
def visit_Arel_Attributes_Attribute(o)
visit_edge o, "relation"
visit_edge o, "name"
end
def visit__children(o)
o.children.each_with_index do |child, i|
edge(i) { visit child }
end
end
alias :visit_Arel_Nodes_And :visit__children
alias :visit_Arel_Nodes_With :visit__children
def visit_String(o)
@node_stack.last.fields << o
end
alias :visit_Time :visit_String
alias :visit_Date :visit_String
alias :visit_DateTime :visit_String
alias :visit_NilClass :visit_String
alias :visit_TrueClass :visit_String
alias :visit_FalseClass :visit_String
alias :visit_Integer :visit_String
alias :visit_BigDecimal :visit_String
alias :visit_Float :visit_String
alias :visit_Symbol :visit_String
alias :visit_Arel_Nodes_SqlLiteral :visit_String
def visit_Arel_Nodes_BindParam(o)
visit_edge(o, "value")
end
def visit_ActiveModel_Attribute(o)
visit_edge(o, "value_before_type_cast")
end
def visit_Hash(o)
o.each_with_index do |pair, i|
edge("pair_#{i}") { visit pair }
end
end
def visit_Array(o)
o.each_with_index do |member, i|
edge(i) { visit member }
end
end
alias :visit_Set :visit_Array
def visit_Arel_Nodes_Comment(o)
visit_edge(o, "values")
end
def visit_Arel_Nodes_Case(o)
visit_edge(o, "case")
visit_edge(o, "conditions")
visit_edge(o, "default")
end
def visit_edge(o, method)
edge(method) { visit o.send(method) }
end
def visit(o)
if node = @seen[o.object_id]
@edge_stack.last.to = node
return
end
node = Node.new(o.class.name, o.object_id)
@seen[node.id] = node
@nodes << node
with_node node do
super
end
end
def edge(name)
edge = Edge.new(name, @node_stack.last)
@edge_stack.push edge
@edges << edge
yield
@edge_stack.pop
end
def with_node(node)
if edge = @edge_stack.last
edge.to = node
end
@node_stack.push node
yield
@node_stack.pop
end
def quote(string)
string.to_s.gsub('"', '\"')
end
def to_dot
"digraph \"Arel\" {\nnode [width=0.375,height=0.25,shape=record];\n" +
@nodes.map { |node|
label = "<f0>#{node.name}"
node.fields.each_with_index do |field, i|
label += "|<f#{i + 1}>#{quote field}"
end
"#{node.id} [label=\"#{label}\"];"
}.join("\n") + "\n" + @edges.map { |edge|
"#{edge.from.id} -> #{edge.to.id} [label=\"#{edge.name}\"];"
}.join("\n") + "\n}"
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module DynamicMatchers # :nodoc:
private
def respond_to_missing?(name, _)
if self == Base
super
else
match = Method.match(self, name)
match && match.valid? || super
end
end
def method_missing(name, *arguments, &block)
match = Method.match(self, name)
if match && match.valid?
match.define
send(name, *arguments, &block)
else
super
end
end
class Method
@matchers = []
class << self
attr_reader :matchers
def match(model, name)
klass = matchers.find { |k| k.pattern.match?(name) }
klass.new(model, name) if klass
end
def pattern
@pattern ||= /\A#{prefix}_([_a-zA-Z]\w*)#{suffix}\Z/
end
def prefix
raise NotImplementedError
end
def suffix
""
end
end
attr_reader :model, :name, :attribute_names
def initialize(model, method_name)
@model = model
@name = method_name.to_s
@attribute_names = @name.match(self.class.pattern)[1].split("_and_")
@attribute_names.map! { |name| @model.attribute_aliases[name] || name }
end
def valid?
attribute_names.all? { |name| model.columns_hash[name] || model.reflect_on_aggregation(name.to_sym) }
end
def define
model.class_eval <<-CODE, __FILE__, __LINE__ + 1
def self.#{name}(#{signature})
#{body}
end
CODE
end
private
def body
"#{finder}(#{attributes_hash})"
end
# The parameters in the signature may be reserved Ruby words; to prevent
# errors, we prefix each parameter name with `_`.
def signature
attribute_names.map { |name| "_#{name}" }.join(", ")
end
# Given that the parameters start with `_`, the finder needs to use the
# same parameter names.
def attributes_hash
"{" + attribute_names.map { |name| ":#{name} => _#{name}" }.join(",") + "}"
end
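# As an illustration (hypothetical +User+ model), +User.find_by_name_and_email+
# would be generated roughly as:
#
#   def self.find_by_name_and_email(_name, _email)
#     find_by({:name => _name,:email => _email})
#   end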
def finder
raise NotImplementedError
end
end
class FindBy < Method
Method.matchers << self
def self.prefix
"find_by"
end
def finder
"find_by"
end
end
class FindByBang < Method
Method.matchers << self
def self.prefix
"find_by"
end
def self.suffix
"!"
end
def finder
"find_by!"
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# This is the concern mixed into Active Record models to make them encryptable. It adds the +encrypts+
# attribute declaration, as well as the API to encrypt and decrypt records.
module EncryptableRecord
extend ActiveSupport::Concern
included do
class_attribute :encrypted_attributes
validate :cant_modify_encrypted_attributes_when_frozen, if: -> { has_encrypted_attributes? && ActiveRecord::Encryption.context.frozen_encryption? }
end
class_methods do
# Encrypts the +name+ attribute.
#
# === Options
#
# * <tt>:key_provider</tt> - A key provider to provide encryption and decryption keys. Defaults to
# +ActiveRecord::Encryption.key_provider+.
# * <tt>:key</tt> - A password to derive the key from. It's a shorthand for a +:key_provider+ that
# serves derived keys. Both options can't be used at the same time.
# * <tt>:deterministic</tt> - By default, encryption is not deterministic. It will use a random
# initialization vector for each encryption operation. This means that encrypting the same content
# with the same key twice will generate different ciphertexts. When set to +true+, it will generate the
# initialization vector based on the encrypted content. This means that the same content will generate
# the same ciphertexts. This enables querying encrypted text with Active Record. Deterministic encryption
# will use the oldest encryption scheme to encrypt new data by default. You can change this by setting
# +deterministic: { fixed: false }+. That will make it use the newest encryption scheme for encrypting new
# data.
# * <tt>:downcase</tt> - When true, it downcases the content automatically before encrypting it. This lets you
# effectively ignore case when querying data. Notice that the original case is lost. Use +:ignore_case+ if you are
# interested in preserving it.
# * <tt>:ignore_case</tt> - When true, it behaves like +:downcase+ but also preserves the original case in a specially
# designated column +original_<name>+. When reading the encrypted content, the version with the original case is
# served, but you can still execute queries that ignore the case. This option can only be used when +:deterministic+
# is true.
# * <tt>:context_properties</tt> - Additional properties that will override +Context+ settings when this attribute is
# encrypted and decrypted. E.g: +encryptor:+, +cipher:+, +message_serializer:+, etc.
# * <tt>:previous</tt> - List of previous encryption schemes. When provided, they will be used in order when trying to read
# the attribute. Each entry of the list can contain the properties supported by #encrypts. Also, when deterministic
# encryption is used, they will be used to generate additional ciphertexts to check in the queries.
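#
# A minimal usage sketch (hypothetical +Article+ model):
#
#   class Article < ActiveRecord::Base
#     encrypts :title
#     encrypts :email_address, deterministic: true, downcase: true
#   end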
def encrypts(*names, key_provider: nil, key: nil, deterministic: false, downcase: false, ignore_case: false, previous: [], **context_properties)
self.encrypted_attributes ||= Set.new # not using :default because the instance would be shared across classes
scheme = scheme_for key_provider: key_provider, key: key, deterministic: deterministic, downcase: downcase, \
ignore_case: ignore_case, previous: previous, **context_properties
names.each do |name|
encrypt_attribute name, scheme
end
end
# Returns the list of deterministic encryptable attributes in the model class.
def deterministic_encrypted_attributes
@deterministic_encrypted_attributes ||= encrypted_attributes&.find_all do |attribute_name|
type_for_attribute(attribute_name).deterministic?
end
end
# Given an attribute name, returns the name of the source attribute when it's a preserved one.
def source_attribute_from_preserved_attribute(attribute_name)
attribute_name.to_s.sub(ORIGINAL_ATTRIBUTE_PREFIX, "") if /^#{ORIGINAL_ATTRIBUTE_PREFIX}/.match?(attribute_name)
end
private
def scheme_for(key_provider: nil, key: nil, deterministic: false, downcase: false, ignore_case: false, previous: [], **context_properties)
ActiveRecord::Encryption::Scheme.new(key_provider: key_provider, key: key, deterministic: deterministic,
downcase: downcase, ignore_case: ignore_case, **context_properties).tap do |scheme|
scheme.previous_schemes = global_previous_schemes_for(scheme) +
Array.wrap(previous).collect { |scheme_config| ActiveRecord::Encryption::Scheme.new(**scheme_config) }
end
end
def global_previous_schemes_for(scheme)
ActiveRecord::Encryption.config.previous_schemes.collect do |previous_scheme|
scheme.merge(previous_scheme)
end
end
def encrypt_attribute(name, attribute_scheme)
encrypted_attributes << name.to_sym
attribute name do |cast_type|
ActiveRecord::Encryption::EncryptedAttributeType.new scheme: attribute_scheme, cast_type: cast_type
end
preserve_original_encrypted(name) if attribute_scheme.ignore_case?
ActiveRecord::Encryption.encrypted_attribute_was_declared(self, name)
end
def preserve_original_encrypted(name)
original_attribute_name = "#{ORIGINAL_ATTRIBUTE_PREFIX}#{name}".to_sym
if !ActiveRecord::Encryption.config.support_unencrypted_data && !column_names.include?(original_attribute_name.to_s)
raise Errors::Configuration, "To use :ignore_case for '#{name}' you must create an additional column named '#{original_attribute_name}'"
end
encrypts original_attribute_name
override_accessors_to_preserve_original name, original_attribute_name
end
def override_accessors_to_preserve_original(name, original_attribute_name)
include(Module.new do
define_method name do
if ((value = super()) && encrypted_attribute?(name)) || !ActiveRecord::Encryption.config.support_unencrypted_data
send(original_attribute_name)
else
value
end
end
define_method "#{name}=" do |value|
self.send "#{original_attribute_name}=", value
super(value)
end
end)
end
def load_schema!
super
add_length_validation_for_encrypted_columns if ActiveRecord::Encryption.config.validate_column_size
end
def add_length_validation_for_encrypted_columns
encrypted_attributes&.each do |attribute_name|
validate_column_size attribute_name
end
end
def validate_column_size(attribute_name)
if limit = columns_hash[attribute_name.to_s]&.limit
validates_length_of attribute_name, maximum: limit
end
end
end
# Returns whether a given attribute is encrypted or not.
def encrypted_attribute?(attribute_name)
ActiveRecord::Encryption.encryptor.encrypted? ciphertext_for(attribute_name)
end
# Returns the ciphertext for +attribute_name+.
def ciphertext_for(attribute_name)
read_attribute_before_type_cast(attribute_name)
end
# Encrypts all the encryptable attributes and saves the changes.
def encrypt
encrypt_attributes if has_encrypted_attributes?
end
# Decrypts all the encryptable attributes and saves the changes.
def decrypt
decrypt_attributes if has_encrypted_attributes?
end
private
ORIGINAL_ATTRIBUTE_PREFIX = "original_"
def encrypt_attributes
validate_encryption_allowed
update_columns build_encrypt_attribute_assignments
end
def decrypt_attributes
validate_encryption_allowed
decrypt_attribute_assignments = build_decrypt_attribute_assignments
ActiveRecord::Encryption.without_encryption { update_columns decrypt_attribute_assignments }
end
def validate_encryption_allowed
raise ActiveRecord::Encryption::Errors::Configuration, "can't be modified because it is encrypted" if ActiveRecord::Encryption.context.frozen_encryption?
end
def has_encrypted_attributes?
self.class.encrypted_attributes.present?
end
def build_encrypt_attribute_assignments
Array(self.class.encrypted_attributes).index_with do |attribute_name|
self[attribute_name]
end
end
def build_decrypt_attribute_assignments
Array(self.class.encrypted_attributes).to_h do |attribute_name|
type = type_for_attribute(attribute_name)
encrypted_value = ciphertext_for(attribute_name)
new_value = type.deserialize(encrypted_value)
[attribute_name, new_value]
end
end
def cant_modify_encrypted_attributes_when_frozen
self.class&.encrypted_attributes.each do |attribute|
errors.add(attribute.to_sym, "can't be modified because it is encrypted") if changed_attributes.include?(attribute)
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# An ActiveModel::Type::Value that encrypts/decrypts strings of text.
#
# This is the central piece that connects the encryption system with +encrypts+ declarations in the
# model classes. Whenever you declare an attribute as encrypted, it configures an +EncryptedAttributeType+
# for that attribute.
class EncryptedAttributeType < ::ActiveRecord::Type::Text
include ActiveModel::Type::Helpers::Mutable
attr_reader :scheme, :cast_type
delegate :key_provider, :downcase?, :deterministic?, :previous_schemes, :with_context, :fixed?, to: :scheme
delegate :accessor, to: :cast_type
# === Options
#
# * <tt>:scheme</tt> - A +Scheme+ with the encryption properties for this attribute.
# * <tt>:cast_type</tt> - A type that will be used to serialize (before encrypting) and deserialize
# (after decrypting). ActiveModel::Type::String by default.
def initialize(scheme:, cast_type: ActiveModel::Type::String.new, previous_type: false)
super()
@scheme = scheme
@cast_type = cast_type
@previous_type = previous_type
end
def deserialize(value)
cast_type.deserialize decrypt(value)
end
def serialize(value)
if serialize_with_oldest?
serialize_with_oldest(value)
else
serialize_with_current(value)
end
end
def changed_in_place?(raw_old_value, new_value)
old_value = raw_old_value.nil? ? nil : deserialize(raw_old_value)
old_value != new_value
end
def previous_types # :nodoc:
@previous_types ||= {} # Memoizing on support_unencrypted_data so that we can tweak it during tests
@previous_types[support_unencrypted_data?] ||= build_previous_types_for(previous_schemes_including_clean_text)
end
private
def previous_schemes_including_clean_text
previous_schemes.including((clean_text_scheme if support_unencrypted_data?)).compact
end
def previous_types_without_clean_text
@previous_types_without_clean_text ||= build_previous_types_for(previous_schemes)
end
def build_previous_types_for(schemes)
schemes.collect do |scheme|
EncryptedAttributeType.new(scheme: scheme, previous_type: true)
end
end
def previous_type?
@previous_type
end
def decrypt(value)
with_context do
encryptor.decrypt(value, **decryption_options) unless value.nil?
end
rescue ActiveRecord::Encryption::Errors::Base => error
if previous_types_without_clean_text.blank?
handle_deserialize_error(error, value)
else
try_to_deserialize_with_previous_encrypted_types(value)
end
end
def try_to_deserialize_with_previous_encrypted_types(value)
previous_types.each.with_index do |type, index|
break type.deserialize(value)
rescue ActiveRecord::Encryption::Errors::Base => error
handle_deserialize_error(error, value) if index == previous_types.length - 1
end
end
def handle_deserialize_error(error, value)
if error.is_a?(Errors::Decryption) && support_unencrypted_data?
value
else
raise error
end
end
def serialize_with_oldest?
@serialize_with_oldest ||= fixed? && previous_types_without_clean_text.present?
end
def serialize_with_oldest(value)
previous_types.first.serialize(value)
end
def serialize_with_current(value)
casted_value = cast_type.serialize(value)
casted_value = casted_value&.downcase if downcase?
encrypt(casted_value.to_s) unless casted_value.nil?
end
def encrypt(value)
with_context do
encryptor.encrypt(value, **encryption_options)
end
end
def encryptor
ActiveRecord::Encryption.encryptor
end
def support_unencrypted_data?
ActiveRecord::Encryption.config.support_unencrypted_data && !previous_type?
end
def encryption_options
@encryption_options ||= { key_provider: key_provider, cipher_options: { deterministic: deterministic? } }.compact
end
def decryption_options
@decryption_options ||= { key_provider: key_provider }.compact
end
def clean_text_scheme
@clean_text_scheme ||= ActiveRecord::Encryption::Scheme.new(downcase: downcase?, encryptor: ActiveRecord::Encryption::NullEncryptor.new)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
module EncryptedFixtures
def initialize(fixture, model_class)
@clean_values = {}
encrypt_fixture_data(fixture, model_class)
process_preserved_original_columns(fixture, model_class)
super
end
private
def encrypt_fixture_data(fixture, model_class)
model_class&.encrypted_attributes&.each do |attribute_name|
if clean_value = fixture[attribute_name.to_s]
@clean_values[attribute_name.to_s] = clean_value
type = model_class.type_for_attribute(attribute_name)
encrypted_value = type.serialize(clean_value)
fixture[attribute_name.to_s] = encrypted_value
end
end
end
def process_preserved_original_columns(fixture, model_class)
model_class&.encrypted_attributes&.each do |attribute_name|
if source_attribute_name = model_class.source_attribute_from_preserved_attribute(attribute_name)
clean_value = @clean_values[source_attribute_name.to_s]
type = model_class.type_for_attribute(attribute_name)
encrypted_value = type.serialize(clean_value)
fixture[attribute_name.to_s] = encrypted_value
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# An encryptor that can encrypt data but can't decrypt it.
class EncryptingOnlyEncryptor < Encryptor
def decrypt(encrypted_text, key_provider: nil, cipher_options: {})
encrypted_text
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/module"
require "active_support/core_ext/array"
module ActiveRecord
module Encryption
extend ActiveSupport::Autoload
eager_autoload do
autoload :Cipher
autoload :Config
autoload :Configurable
autoload :Context
autoload :Contexts
autoload :DerivedSecretKeyProvider
autoload :EncryptableRecord
autoload :EncryptedAttributeType
autoload :EncryptedFixtures
autoload :EncryptingOnlyEncryptor
autoload :DeterministicKeyProvider
autoload :Encryptor
autoload :EnvelopeEncryptionKeyProvider
autoload :Errors
autoload :ExtendedDeterministicQueries
autoload :ExtendedDeterministicUniquenessValidator
autoload :Key
autoload :KeyGenerator
autoload :KeyProvider
autoload :Message
autoload :MessageSerializer
autoload :NullEncryptor
autoload :Properties
autoload :ReadOnlyNullEncryptor
autoload :Scheme
end
class Cipher
extend ActiveSupport::Autoload
eager_autoload do
autoload :Aes256Gcm
end
end
include Configurable
include Contexts
def self.eager_load!
super
Cipher.eager_load!
end
end
end
# frozen_string_literal: true
require "openssl"
require "zlib"
require "active_support/core_ext/numeric"
module ActiveRecord
module Encryption
# An encryptor exposes the encryption API that ActiveRecord::Encryption::EncryptedAttributeType
# uses for encrypting and decrypting attribute values.
#
# It interacts with a KeyProvider to get the keys, and delegates the actual
# encryption algorithm to ActiveRecord::Encryption::Cipher.
class Encryptor
# Encrypts +clear_text+ and returns the encrypted result
#
# Internally, it will:
#
# 1. Create a new ActiveRecord::Encryption::Message
# 2. Compress and encrypt +clear_text+ as the message payload
# 3. Serialize it with +ActiveRecord::Encryption.message_serializer+ (+ActiveRecord::Encryption::SafeMarshal+
# by default)
# 4. Encode the result with Base 64
#
# === Options
#
# [:key_provider]
# Key provider to use for the encryption operation. It will default to
# +ActiveRecord::Encryption.key_provider+ when not provided.
#
# [:cipher_options]
# Cipher-specific options that will be passed to the Cipher configured in
# +ActiveRecord::Encryption.cipher+
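#
# A minimal usage sketch, assuming Active Record encryption has been configured
# with a key provider:
#
#   encryptor = ActiveRecord::Encryption::Encryptor.new
#   ciphertext = encryptor.encrypt("secret value")
#   encryptor.decrypt(ciphertext) # => "secret value"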
def encrypt(clear_text, key_provider: default_key_provider, cipher_options: {})
clear_text = force_encoding_if_needed(clear_text) if cipher_options[:deterministic]
validate_payload_type(clear_text)
serialize_message build_encrypted_message(clear_text, key_provider: key_provider, cipher_options: cipher_options)
end
# Decrypts +encrypted_text+ and returns the result as clear text
#
# === Options
#
# [:key_provider]
# Key provider to use for the decryption operation. It will default to
# +ActiveRecord::Encryption.key_provider+ when not provided
#
# [:cipher_options]
# Cipher-specific options that will be passed to the Cipher configured in
# +ActiveRecord::Encryption.cipher+
def decrypt(encrypted_text, key_provider: default_key_provider, cipher_options: {})
message = deserialize_message(encrypted_text)
keys = key_provider.decryption_keys(message)
raise Errors::Decryption unless keys.present?
uncompress_if_needed(cipher.decrypt(message, key: keys.collect(&:secret), **cipher_options), message.headers.compressed)
rescue *(ENCODING_ERRORS + DECRYPT_ERRORS)
raise Errors::Decryption
end
# Returns whether the text is encrypted or not
def encrypted?(text)
deserialize_message(text)
true
rescue Errors::Encoding, *DECRYPT_ERRORS
false
end
private
DECRYPT_ERRORS = [OpenSSL::Cipher::CipherError, Errors::EncryptedContentIntegrity, Errors::Decryption]
ENCODING_ERRORS = [EncodingError, Errors::Encoding]
THRESHOLD_TO_JUSTIFY_COMPRESSION = 140.bytes
def default_key_provider
ActiveRecord::Encryption.key_provider
end
def validate_payload_type(clear_text)
unless clear_text.is_a?(String)
raise ActiveRecord::Encryption::Errors::ForbiddenClass, "The encryptor can only encrypt string values (#{clear_text.class})"
end
end
def cipher
ActiveRecord::Encryption.cipher
end
def build_encrypted_message(clear_text, key_provider:, cipher_options:)
key = key_provider.encryption_key
clear_text, was_compressed = compress_if_worth_it(clear_text)
cipher.encrypt(clear_text, key: key.secret, **cipher_options).tap do |message|
message.headers.add(key.public_tags)
message.headers.compressed = true if was_compressed
end
end
def serialize_message(message)
serializer.dump(message)
end
def deserialize_message(message)
raise Errors::Encoding unless message.is_a?(String)
serializer.load message
rescue ArgumentError, TypeError, Errors::ForbiddenClass
raise Errors::Encoding
end
def serializer
ActiveRecord::Encryption.message_serializer
end
# Below a certain threshold, ZIP compression is actually worse than not compressing
def compress_if_worth_it(string)
if string.bytesize > THRESHOLD_TO_JUSTIFY_COMPRESSION
[compress(string), true]
else
[string, false]
end
end
def compress(data)
Zlib::Deflate.deflate(data).tap do |compressed_data|
compressed_data.force_encoding(data.encoding)
end
end
def uncompress_if_needed(data, compressed)
if compressed
uncompress(data)
else
data
end
end
def uncompress(data)
Zlib::Inflate.inflate(data).tap do |uncompressed_data|
uncompressed_data.force_encoding(data.encoding)
end
end
def force_encoding_if_needed(value)
if forced_encoding_for_deterministic_encryption && value && value.encoding != forced_encoding_for_deterministic_encryption
value.encode(forced_encoding_for_deterministic_encryption, invalid: :replace, undef: :replace)
else
value
end
end
def forced_encoding_for_deterministic_encryption
ActiveRecord::Encryption.config.forced_encoding_for_deterministic_encryption
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/hash/slice"
require "active_support/core_ext/object/deep_dup"
module ActiveRecord
# Declare an enum attribute where the values map to integers in the database,
# but can be queried by name. Example:
#
# class Conversation < ActiveRecord::Base
# enum :status, [ :active, :archived ]
# end
#
# # conversation.update! status: 0
# conversation.active!
# conversation.active? # => true
# conversation.status # => "active"
#
# # conversation.update! status: 1
# conversation.archived!
# conversation.archived? # => true
# conversation.status # => "archived"
#
# # conversation.status = 1
# conversation.status = "archived"
#
# conversation.status = nil
# conversation.status.nil? # => true
# conversation.status # => nil
#
# Scopes based on the allowed values of the enum field will be provided
# as well. With the above example:
#
# Conversation.active
# Conversation.not_active
# Conversation.archived
# Conversation.not_archived
#
# Of course, you can also query them directly if the scopes don't fit your
# needs:
#
# Conversation.where(status: [:active, :archived])
# Conversation.where.not(status: :active)
#
# Defining scopes can be disabled by setting +:scopes+ to +false+.
#
# class Conversation < ActiveRecord::Base
# enum :status, [ :active, :archived ], scopes: false
# end
#
# You can set the default enum value by setting +:default+, like:
#
# class Conversation < ActiveRecord::Base
# enum :status, [ :active, :archived ], default: :active
# end
#
# conversation = Conversation.new
# conversation.status # => "active"
#
# It's possible to explicitly map the relation between attribute and
# database integer with a hash:
#
# class Conversation < ActiveRecord::Base
# enum :status, active: 0, archived: 1
# end
#
# Finally, it's also possible to use a string column to persist the enumerated value.
# Note that this will likely lead to slower database queries:
#
# class Conversation < ActiveRecord::Base
# enum :status, active: "active", archived: "archived"
# end
#
# Note that when an array is used, the implicit mapping from the values to database
# integers is derived from the order the values appear in the array. In the example,
# <tt>:active</tt> is mapped to +0+ as it's the first element, and <tt>:archived</tt>
# is mapped to +1+. In general, the +i+-th element is mapped to <tt>i-1</tt> in the
# database.
#
# Therefore, once a value is added to the enum array, its position in the array must
# be maintained, and new values should only be added to the end of the array. To
# remove unused values, the explicit hash syntax should be used.
#
# In rare circumstances you might need to access the mapping directly.
# The mappings are exposed through a class method with the pluralized attribute
# name, which returns the mapping as an ActiveSupport::HashWithIndifferentAccess:
#
# Conversation.statuses[:active] # => 0
# Conversation.statuses["archived"] # => 1
#
# Use that class method when you need to know the ordinal value of an enum.
# For example, you can use that when manually building SQL strings:
#
# Conversation.where("status <> ?", Conversation.statuses[:archived])
#
# You can use the +:prefix+ or +:suffix+ options when you need to define
# multiple enums with the same values. If the passed value is +true+, the methods
# are prefixed/suffixed with the name of the enum. It is also possible to
# supply a custom value:
#
# class Conversation < ActiveRecord::Base
# enum :status, [ :active, :archived ], suffix: true
# enum :comments_status, [ :active, :inactive ], prefix: :comments
# end
#
# With the above example, the bang and predicate methods along with the
# associated scopes are now prefixed and/or suffixed accordingly:
#
# conversation.active_status!
# conversation.archived_status? # => false
#
# conversation.comments_inactive!
# conversation.comments_active? # => false
module Enum
def self.extended(base) # :nodoc:
base.class_attribute(:defined_enums, instance_writer: false, default: {})
end
def inherited(base) # :nodoc:
base.defined_enums = defined_enums.deep_dup
super
end
class EnumType < Type::Value # :nodoc:
delegate :type, to: :subtype
def initialize(name, mapping, subtype)
@name = name
@mapping = mapping
@subtype = subtype
end
def cast(value)
if mapping.has_key?(value)
value.to_s
elsif mapping.has_value?(value)
mapping.key(value)
else
value.presence
end
end
def deserialize(value)
mapping.key(subtype.deserialize(value))
end
def serialize(value)
subtype.serialize(mapping.fetch(value, value))
end
def serializable?(value, &block)
subtype.serializable?(mapping.fetch(value, value), &block)
end
def assert_valid_value(value)
unless value.blank? || mapping.has_key?(value) || mapping.has_value?(value)
raise ArgumentError, "'#{value}' is not a valid #{name}"
end
end
attr_reader :subtype
private
attr_reader :name, :mapping
end
def enum(name = nil, values = nil, **options)
if name
values, options = options, {} unless values
return _enum(name, values, **options)
end
definitions = options.slice!(:_prefix, :_suffix, :_scopes, :_default)
options.transform_keys! { |key| :"#{key[1..-1]}" }
definitions.each { |name, values| _enum(name, values, **options) }
end
private
def _enum(name, values, prefix: nil, suffix: nil, scopes: true, **options)
assert_valid_enum_definition_values(values)
# statuses = { }
enum_values = ActiveSupport::HashWithIndifferentAccess.new
name = name.to_s
# def self.statuses() statuses end
detect_enum_conflict!(name, name.pluralize, true)
singleton_class.define_method(name.pluralize) { enum_values }
defined_enums[name] = enum_values
detect_enum_conflict!(name, name)
detect_enum_conflict!(name, "#{name}=")
attribute(name, **options) do |subtype|
subtype = subtype.subtype if EnumType === subtype
EnumType.new(name, enum_values, subtype)
end
value_method_names = []
_enum_methods_module.module_eval do
prefix = if prefix
prefix == true ? "#{name}_" : "#{prefix}_"
end
suffix = if suffix
suffix == true ? "_#{name}" : "_#{suffix}"
end
pairs = values.respond_to?(:each_pair) ? values.each_pair : values.each_with_index
pairs.each do |label, value|
enum_values[label] = value
label = label.to_s
value_method_name = "#{prefix}#{label}#{suffix}"
value_method_names << value_method_name
define_enum_methods(name, value_method_name, value, scopes)
method_friendly_label = label.gsub(/[\W&&[:ascii:]]+/, "_")
value_method_alias = "#{prefix}#{method_friendly_label}#{suffix}"
if value_method_alias != value_method_name && !value_method_names.include?(value_method_alias)
value_method_names << value_method_alias
define_enum_methods(name, value_method_alias, value, scopes)
end
end
end
detect_negative_enum_conditions!(value_method_names) if scopes
enum_values.freeze
end
class EnumMethods < Module # :nodoc:
def initialize(klass)
@klass = klass
end
private
attr_reader :klass
def define_enum_methods(name, value_method_name, value, scopes)
# def active?() status_for_database == 0 end
klass.send(:detect_enum_conflict!, name, "#{value_method_name}?")
define_method("#{value_method_name}?") { public_send(:"#{name}_for_database") == value }
# def active!() update!(status: 0) end
klass.send(:detect_enum_conflict!, name, "#{value_method_name}!")
define_method("#{value_method_name}!") { update!(name => value) }
# scope :active, -> { where(status: 0) }
# scope :not_active, -> { where.not(status: 0) }
if scopes
klass.send(:detect_enum_conflict!, name, value_method_name, true)
klass.scope value_method_name, -> { where(name => value) }
klass.send(:detect_enum_conflict!, name, "not_#{value_method_name}", true)
klass.scope "not_#{value_method_name}", -> { where.not(name => value) }
end
end
end
private_constant :EnumMethods
def _enum_methods_module
@_enum_methods_module ||= begin
mod = EnumMethods.new(self)
include mod
mod
end
end
def assert_valid_enum_definition_values(values)
case values
when Hash
if values.empty?
raise ArgumentError, "Enum values #{values} must not be empty."
end
if values.keys.any?(&:blank?)
raise ArgumentError, "Enum values #{values} must not contain a blank name."
end
when Array
if values.empty?
raise ArgumentError, "Enum values #{values} must not be empty."
end
unless values.all?(Symbol) || values.all?(String)
raise ArgumentError, "Enum values #{values} must only contain symbols or strings."
end
if values.any?(&:blank?)
raise ArgumentError, "Enum values #{values} must not contain a blank name."
end
else
raise ArgumentError, "Enum values #{values} must be either a non-empty hash or an array."
end
end
ENUM_CONFLICT_MESSAGE = \
"You tried to define an enum named \"%{enum}\" on the model \"%{klass}\", but " \
"this will generate a %{type} method \"%{method}\", which is already defined " \
"by %{source}."
private_constant :ENUM_CONFLICT_MESSAGE
def detect_enum_conflict!(enum_name, method_name, klass_method = false)
if klass_method && dangerous_class_method?(method_name)
raise_conflict_error(enum_name, method_name, type: "class")
elsif klass_method && method_defined_within?(method_name, Relation)
raise_conflict_error(enum_name, method_name, type: "class", source: Relation.name)
elsif !klass_method && dangerous_attribute_method?(method_name)
raise_conflict_error(enum_name, method_name)
elsif !klass_method && method_defined_within?(method_name, _enum_methods_module, Module)
raise_conflict_error(enum_name, method_name, source: "another enum")
end
end
def raise_conflict_error(enum_name, method_name, type: "instance", source: "Active Record")
raise ArgumentError, ENUM_CONFLICT_MESSAGE % {
enum: enum_name,
klass: name,
type: type,
method: method_name,
source: source
}
end
def detect_negative_enum_conditions!(method_names)
return unless logger
method_names.select { |m| m.start_with?("not_") }.each do |potential_not|
inverted_form = potential_not.sub("not_", "")
if method_names.include?(inverted_form)
logger.warn "Enum element '#{potential_not}' in #{self.name} uses the prefix 'not_'." \
" This has caused a conflict with auto generated negative scopes." \
" Avoid using enum elements starting with 'not' where the positive form is also an element."
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# Implements a simple envelope encryption approach where:
#
# * It generates a random data-encryption key for each encryption operation.
# * It stores the generated key along with the encrypted payload. It encrypts this key
# with the master key provided in the +active_record_encryption.primary_key+ credential.
#
# This provider can work with multiple master keys. It will use the last one for encrypting.
#
# When +config.active_record.encryption.store_key_references+ is true, it will also store a reference to
# the specific master key that was used to encrypt the data-encryption key. When not set,
# it will try all the configured master keys looking for the right one, in order to
# return the right decryption key.
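#
# A minimal configuration sketch (e.g. in an application initializer), assuming
# +active_record_encryption.primary_key+ is present in the credentials:
#
#   config.active_record.encryption.key_provider =
#     ActiveRecord::Encryption::EnvelopeEncryptionKeyProvider.new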
class EnvelopeEncryptionKeyProvider
def encryption_key
random_secret = generate_random_secret
ActiveRecord::Encryption::Key.new(random_secret).tap do |key|
key.public_tags.encrypted_data_key = encrypt_data_key(random_secret)
key.public_tags.encrypted_data_key_id = active_primary_key.id if ActiveRecord::Encryption.config.store_key_references
end
end
def decryption_keys(encrypted_message)
secret = decrypt_data_key(encrypted_message)
secret ? [ActiveRecord::Encryption::Key.new(secret)] : []
end
def active_primary_key
@active_primary_key ||= primary_key_provider.encryption_key
end
private
def encrypt_data_key(random_secret)
ActiveRecord::Encryption.cipher.encrypt(random_secret, key: active_primary_key.secret)
end
def decrypt_data_key(encrypted_message)
encrypted_data_key = encrypted_message.headers.encrypted_data_key
key = primary_key_provider.decryption_keys(encrypted_message)&.collect(&:secret)
ActiveRecord::Encryption.cipher.decrypt encrypted_data_key, key: key if key
end
def primary_key_provider
@primary_key_provider ||= DerivedSecretKeyProvider.new(ActiveRecord::Encryption.config.primary_key)
end
def generate_random_secret
ActiveRecord::Encryption.key_generator.generate_random_key
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Equality < Arel::Nodes::Binary
include FetchAttribute
def equality?; true; end
def invert
Arel::Nodes::NotEqual.new(left, right)
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
class ArelError < StandardError
end
class EmptyJoinError < ArelError
end
end
# frozen_string_literal: true
require "active_record/explain_registry"
module ActiveRecord
module Explain
# Executes the block with the collect flag enabled. Queries are collected
# asynchronously by the subscriber and returned.
def collecting_queries_for_explain # :nodoc:
ExplainRegistry.collect = true
yield
ExplainRegistry.queries
ensure
ExplainRegistry.reset
end
# Makes the adapter execute EXPLAIN for the tuples of queries and bindings.
# Returns a formatted string ready to be logged.
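#
# Relation#explain combines +collecting_queries_for_explain+ and +exec_explain+,
# roughly as in this illustrative sketch (not the exact call site):
#
#   def explain
#     exec_explain(collecting_queries_for_explain { exec_queries })
#   end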
def exec_explain(queries) # :nodoc:
str = queries.map do |sql, binds|
msg = +"EXPLAIN for: #{sql}"
unless binds.empty?
msg << " "
msg << binds.map { |attr| render_bind(attr) }.inspect
end
msg << "\n"
msg << connection.explain(sql, binds)
end.join("\n")
# Overriding inspect to be more human readable, especially in the console.
def str.inspect
self
end
str
end
private
def render_bind(attr)
if ActiveModel::Attribute === attr
value = if attr.type.binary? && attr.value
"<#{attr.value_for_database.to_s.bytesize} bytes of binary data>"
else
connection.type_cast(attr.value_for_database)
end
else
value = connection.type_cast(attr)
attr = nil
end
[attr&.name, value]
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module SQLite3
class ExplainPrettyPrinter # :nodoc:
# Pretty prints the result of an EXPLAIN QUERY PLAN in a way that resembles
# the output of the SQLite shell:
#
# 0|0|0|SEARCH TABLE users USING INTEGER PRIMARY KEY (rowid=?) (~1 rows)
# 0|1|1|SCAN TABLE posts (~100000 rows)
#
def pp(result)
result.rows.map do |row|
row.join("|")
end.join("\n") + "\n"
end
end
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/module/delegation"
module ActiveRecord
# This is a thread-local registry for EXPLAIN. For example
#
# ActiveRecord::ExplainRegistry.queries
#
# returns the collected queries local to the current thread.
class ExplainRegistry # :nodoc:
class << self
delegate :reset, :collect, :collect=, :collect?, :queries, to: :instance
private
def instance
ActiveSupport::IsolatedExecutionState[:active_record_explain_registry] ||= new
end
end
attr_accessor :collect
attr_reader :queries
def initialize
reset
end
def collect?
@collect
end
def reset
@collect = false
@queries = []
end
end
end
# frozen_string_literal: true
require "active_support/notifications"
require "active_record/explain_registry"
module ActiveRecord
class ExplainSubscriber # :nodoc:
def start(name, id, payload)
# unused
end
def finish(name, id, payload)
if ExplainRegistry.collect? && !ignore_payload?(payload)
ExplainRegistry.queries << payload.values_at(:sql, :binds)
end
end
# SCHEMA queries cannot be EXPLAINed; also, we do not want to run EXPLAIN on
# our own EXPLAINs, no matter how loopingly beautiful that would be.
#
# On the other hand, we want to monitor the performance of our real database
# queries, not the performance of the access to the query cache.
IGNORED_PAYLOADS = %w(SCHEMA EXPLAIN)
EXPLAINED_SQLS = /\A\s*(\/\*.*\*\/)?\s*(with|select|update|delete|insert)\b/i
def ignore_payload?(payload)
payload[:exception] ||
payload[:cached] ||
IGNORED_PAYLOADS.include?(payload[:name]) ||
!payload[:sql].match?(EXPLAINED_SQLS)
end
ActiveSupport::Notifications.subscribe("sql.active_record", new)
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Expressions
def count(distinct = false)
Nodes::Count.new [self], distinct
end
def sum
Nodes::Sum.new [self]
end
def maximum
Nodes::Max.new [self]
end
def minimum
Nodes::Min.new [self]
end
def average
Nodes::Avg.new [self]
end
def extract(field)
Nodes::Extract.new [self], field
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# Automatically expand encrypted arguments to support querying both encrypted and unencrypted data
#
# Active Record \Encryption supports querying the db using deterministic attributes. For example:
#
# Contact.find_by(email_address: "jorge@hey.com")
#
# The value "jorge@hey.com" will get encrypted automatically to perform the query. But there is
# a problem while the data is being encrypted. This won't work. During that time, you need these
# queries to be:
#
# Contact.find_by(email_address: [ "jorge@hey.com", "<encrypted jorge@hey.com>" ])
#
# This patches ActiveRecord to support this automatically. It addresses both:
#
# * ActiveRecord::Base - Used in <tt>Contact.find_by_email_address(...)</tt>
# * ActiveRecord::Relation - Used in <tt>Contact.internal.find_by_email_address(...)</tt>
#
# ActiveRecord::Base relies on ActiveRecord::Relation (ActiveRecord::QueryMethods) but it does
# some prepared statements caching. That's why we need to intercept +ActiveRecord::Base+ as soon
# as it's invoked (so that the proper prepared statement is cached).
#
# When modifying this file run performance tests in +test/performance/extended_deterministic_queries_performance_test.rb+ to
# make sure performance overhead is acceptable.
#
# We will extend this to support previous "encryption context" versions in future iterations
#
# @TODO Experimental. Support for every kind of query is pending
# @TODO It should not patch anything if not needed (no previous schemes or no support for previous encryption schemes)
module ExtendedDeterministicQueries
def self.install_support
ActiveRecord::Relation.prepend(RelationQueries)
ActiveRecord::Base.include(CoreQueries)
ActiveRecord::Encryption::EncryptedAttributeType.prepend(ExtendedEncryptableType)
Arel::Nodes::HomogeneousIn.prepend(InWithAdditionalValues)
end
module EncryptedQueryArgumentProcessor
extend ActiveSupport::Concern
private
def process_encrypted_query_arguments(args, check_for_additional_values)
if args.is_a?(Array) && (options = args.first).is_a?(Hash)
self.deterministic_encrypted_attributes&.each do |attribute_name|
type = type_for_attribute(attribute_name)
if !type.previous_types.empty? && value = options[attribute_name]
options[attribute_name] = process_encrypted_query_argument(value, check_for_additional_values, type)
end
end
end
end
def process_encrypted_query_argument(value, check_for_additional_values, type)
return value if check_for_additional_values && value.is_a?(Array) && value.last.is_a?(AdditionalValue)
case value
when String, Array
list = Array(value)
list + list.flat_map do |each_value|
if check_for_additional_values && each_value.is_a?(AdditionalValue)
each_value
else
additional_values_for(each_value, type)
end
end
else
value
end
end
def additional_values_for(value, type)
type.previous_types.collect do |additional_type|
AdditionalValue.new(value, additional_type)
end
end
end
module RelationQueries
include EncryptedQueryArgumentProcessor
def where(*args)
process_encrypted_query_arguments_if_needed(args)
super
end
def exists?(*args)
process_encrypted_query_arguments_if_needed(args)
super
end
def find_or_create_by(attributes, &block)
find_by(attributes.dup) || create(attributes, &block)
end
def find_or_create_by!(attributes, &block)
find_by(attributes.dup) || create!(attributes, &block)
end
private
def process_encrypted_query_arguments_if_needed(args)
process_encrypted_query_arguments(args, true) unless self.deterministic_encrypted_attributes&.empty?
end
end
module CoreQueries
extend ActiveSupport::Concern
class_methods do
include EncryptedQueryArgumentProcessor
def find_by(*args)
process_encrypted_query_arguments(args, false) unless self.deterministic_encrypted_attributes&.empty?
super
end
end
end
class AdditionalValue
attr_reader :value, :type
def initialize(value, type)
@type = type
@value = process(value)
end
private
def process(value)
type.serialize(value)
end
end
module ExtendedEncryptableType
def serialize(data)
if data.is_a?(AdditionalValue)
data.value
else
super
end
end
end
module InWithAdditionalValues
def proc_for_binds
-> value { ActiveModel::Attribute.with_cast_value(attribute.name, value, encryption_aware_type_caster) }
end
def encryption_aware_type_caster
if attribute.type_caster.is_a?(ActiveRecord::Encryption::EncryptedAttributeType)
attribute.type_caster.cast_type
else
attribute.type_caster
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
module ExtendedDeterministicUniquenessValidator
def self.install_support
ActiveRecord::Validations::UniquenessValidator.prepend(EncryptedUniquenessValidator)
end
module EncryptedUniquenessValidator
def validate_each(record, attribute, value)
super(record, attribute, value)
klass = record.class
klass.deterministic_encrypted_attributes&.each do |attribute_name|
encrypted_type = klass.type_for_attribute(attribute_name)
[ encrypted_type, *encrypted_type.previous_types ].each do |type|
encrypted_value = type.serialize(value)
ActiveRecord::Encryption.without_encryption do
super(record, attribute, encrypted_value)
end
end
end
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Extract < Arel::Nodes::Unary
attr_accessor :field
def initialize(expr, field)
super(expr)
@field = field
end
def hash
super ^ @field.hash
end
def eql?(other)
super &&
self.field == other.field
end
alias :== :eql?
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
###
# Methods for creating various nodes
module FactoryMethods
def create_true
Arel::Nodes::True.new
end
def create_false
Arel::Nodes::False.new
end
def create_table_alias(relation, name)
Nodes::TableAlias.new(relation, name)
end
def create_join(to, constraint = nil, klass = Nodes::InnerJoin)
klass.new(to, constraint)
end
def create_string_join(to)
create_join to, nil, Nodes::StringJoin
end
def create_and(clauses)
Nodes::And.new clauses
end
def create_on(expr)
Nodes::On.new expr
end
def grouping(expr)
Nodes::Grouping.new expr
end
###
# Create a LOWER() function
def lower(column)
Nodes::NamedFunction.new "LOWER", [Nodes.build_quoted(column)]
end
def coalesce(*exprs)
Nodes::NamedFunction.new "COALESCE", exprs
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class False < Arel::Nodes::NodeExpression
def hash
self.class.hash
end
def eql?(other)
self.class == other.class
end
alias :== :eql?
end
end
end
# frozen_string_literal: true
require "active_support/configuration_file"
module ActiveRecord
class FixtureSet
class File # :nodoc:
include Enumerable
##
# Open a fixture file named +file+. When called with a block, the block
# is called with the filehandle and the filehandle is automatically closed
# when the block finishes.
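#
# A minimal usage sketch (hypothetical fixture path):
#
#   ActiveRecord::FixtureSet::File.open("test/fixtures/accounts.yml") do |fixture_file|
#     fixture_file.each { |fixture_name, row| puts fixture_name }
#   end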
def self.open(file)
x = new file
block_given? ? yield(x) : x
end
def initialize(file)
@file = file
end
def each(&block)
rows.each(&block)
end
def model_class
config_row["model_class"]
end
def ignored_fixtures
config_row["ignore"]
end
private
def rows
@rows ||= raw_rows.reject { |fixture_name, _| fixture_name == "_fixture" }
end
def config_row
@config_row ||= begin
row = raw_rows.find { |fixture_name, _| fixture_name == "_fixture" }
if row
validate_config_row(row.last)
else
{ 'model_class': nil, 'ignore': nil }
end
end
end
def raw_rows
@raw_rows ||= begin
data = ActiveSupport::ConfigurationFile.parse(@file, context:
ActiveRecord::FixtureSet::RenderContext.create_subclass.new.get_binding)
data ? validate(data).to_a : []
rescue RuntimeError => error
raise Fixture::FormatError, error.message
end
end
def validate_config_row(data)
unless Hash === data
raise Fixture::FormatError, "Invalid `_fixture` section: `_fixture` must be a hash: #{@file}"
end
begin
data.assert_valid_keys("model_class", "ignore")
rescue ArgumentError => error
raise Fixture::FormatError, "Invalid `_fixture` section: #{error.message}: #{@file}"
end
data
end
# Validate our unmarshalled data.
def validate(data)
unless Hash === data || YAML::Omap === data
raise Fixture::FormatError, "fixture is not a hash: #{@file}"
end
invalid = data.reject { |_, row| Hash === row }
if invalid.any?
raise Fixture::FormatError, "fixture key is not a hash: #{@file}, keys: #{invalid.keys.inspect}"
end
data
end
end
end
end
# frozen_string_literal: true
module Arel
module Nodes
class Filter < Binary
include Arel::WindowPredications
include Arel::AliasPredication
end
end
end
# frozen_string_literal: true
module Arel
module FilterPredications
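# A minimal usage sketch (hypothetical +posts+ table assumed): nodes that mix
# this in (for example aggregate functions) gain a +filter+ method that wraps
# them in a SQL FILTER clause:
#
#   posts = Arel::Table.new(:posts)
#   posts[:id].count.filter(posts[:draft].eq(false))
#   # roughly: COUNT("posts"."id") FILTER (WHERE "posts"."draft" = FALSE)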
def filter(expr)
Nodes::Filter.new(self, expr)
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/string/filters"
module ActiveRecord
module FinderMethods
ONE_AS_ONE = "1 AS one"
# Find by id - This can either be a specific id (1), a list of ids (1, 5, 6), or an array of ids ([5, 6, 10]).
# If one or more records cannot be found for the requested ids, then ActiveRecord::RecordNotFound will be raised.
# If the primary key is an integer, find by id coerces its arguments by using +to_i+.
#
# Person.find(1) # returns the object for ID = 1
# Person.find("1") # returns the object for ID = 1
# Person.find("31-sarah") # returns the object for ID = 31
# Person.find(1, 2, 6) # returns an array for objects with IDs in (1, 2, 6)
# Person.find([7, 17]) # returns an array for objects with IDs in (7, 17)
# Person.find([1]) # returns an array for the object with ID = 1
# Person.where("administrator = 1").order("created_on DESC").find(1)
#
# NOTE: The returned records are in the same order as the ids you provide.
# If you want the results to be sorted by the database instead, you can use the
# ActiveRecord::QueryMethods#where method and provide an explicit
# ActiveRecord::QueryMethods#order option. Note, however, that
# ActiveRecord::QueryMethods#where doesn't raise ActiveRecord::RecordNotFound.
#
# ==== Find with lock
#
# Example for find with a lock: Imagine two concurrent transactions:
# each will read <tt>person.visits == 2</tt>, add 1 to it, and save, resulting
# in two saves of <tt>person.visits = 3</tt>. By locking the row, the second
# transaction has to wait until the first is finished; we get the
# expected <tt>person.visits == 4</tt>.
#
# Person.transaction do
# person = Person.lock(true).find(1)
# person.visits += 1
# person.save!
# end
#
# ==== Variations of #find
#
# Person.where(name: 'Spartacus', rating: 4)
# # returns a chainable list (which can be empty).
#
# Person.find_by(name: 'Spartacus', rating: 4)
# # returns the first item or nil.
#
# Person.find_or_initialize_by(name: 'Spartacus', rating: 4)
# # returns the first item or a new instance (requires you to call .save to persist it to the database).
#
# Person.find_or_create_by(name: 'Spartacus', rating: 4)
# # returns the first item or creates it and returns it.
#
# ==== Alternatives for #find
#
# Person.where(name: 'Spartacus', rating: 4).exists?(conditions = :none)
# # returns a boolean indicating whether any record with the given conditions exists.
#
# Person.where(name: 'Spartacus', rating: 4).select("field1, field2, field3")
# # returns a chainable list of instances with only the mentioned fields.
#
# Person.where(name: 'Spartacus', rating: 4).ids
# # returns an Array of ids.
#
# Person.where(name: 'Spartacus', rating: 4).pluck(:field1, :field2)
# # returns an Array of the required fields.
def find(*args)
return super if block_given?
find_with_ids(*args)
end
# Finds the first record matching the specified conditions. There
# is no implied ordering, so if order matters, you should specify it
# yourself.
#
# If no record is found, returns <tt>nil</tt>.
#
# Post.find_by name: 'Spartacus', rating: 4
# Post.find_by "published_at < ?", 2.weeks.ago
def find_by(arg, *args)
where(arg, *args).take
end
# Like #find_by, except that if no record is found, raises
# an ActiveRecord::RecordNotFound error.
def find_by!(arg, *args)
where(arg, *args).take!
end
# Gives a record (or N records if a parameter is supplied) without any implied
# order. The order will depend on the database implementation.
# If an order is supplied it will be respected.
#
# Person.take # returns an object fetched by SELECT * FROM people LIMIT 1
# Person.take(5) # returns 5 objects fetched by SELECT * FROM people LIMIT 5
# Person.where(["name LIKE '%?'", name]).take
def take(limit = nil)
limit ? find_take_with_limit(limit) : find_take
end
# Same as #take but raises ActiveRecord::RecordNotFound if no record
# is found. Note that #take! accepts no arguments.
def take!
take || raise_record_not_found_exception!
end
# Finds the sole matching record. Raises ActiveRecord::RecordNotFound if no
# record is found. Raises ActiveRecord::SoleRecordExceeded if more than one
# record is found.
#
# Product.where(["price = ?", price]).sole
def sole
found, undesired = first(2)
if found.nil?
raise_record_not_found_exception!
elsif undesired.present?
raise ActiveRecord::SoleRecordExceeded.new(self)
else
found
end
end
# Finds the sole matching record. Raises ActiveRecord::RecordNotFound if no
# record is found. Raises ActiveRecord::SoleRecordExceeded if more than one
# record is found.
#
# Product.find_sole_by(["price = ?", price])
def find_sole_by(arg, *args)
where(arg, *args).sole
end
# Find the first record (or first N records if a parameter is supplied).
# If no order is defined it will order by primary key.
#
# Person.first # returns the first object fetched by SELECT * FROM people ORDER BY people.id LIMIT 1
# Person.where(["user_name = ?", user_name]).first
# Person.where(["user_name = :u", { u: user_name }]).first
# Person.order("created_on DESC").offset(5).first
# Person.first(3) # returns the first three objects fetched by SELECT * FROM people ORDER BY people.id LIMIT 3
#
def first(limit = nil)
if limit
find_nth_with_limit(0, limit)
else
find_nth 0
end
end
# Same as #first but raises ActiveRecord::RecordNotFound if no record
# is found. Note that #first! accepts no arguments.
def first!
first || raise_record_not_found_exception!
end
# Find the last record (or last N records if a parameter is supplied).
# If no order is defined it will order by primary key.
#
# Person.last # returns the last object fetched by SELECT * FROM people
# Person.where(["user_name = ?", user_name]).last
# Person.order("created_on DESC").offset(5).last
# Person.last(3) # returns the last three objects fetched by SELECT * FROM people.
#
# Take note that in that last case, the results are sorted in ascending order:
#
# [#<Person id:2>, #<Person id:3>, #<Person id:4>]
#
# and not:
#
# [#<Person id:4>, #<Person id:3>, #<Person id:2>]
def last(limit = nil)
return find_last(limit) if loaded? || has_limit_or_offset?
result = ordered_relation.limit(limit)
result = result.reverse_order!
limit ? result.reverse : result.first
end
# Same as #last but raises ActiveRecord::RecordNotFound if no record
# is found. Note that #last! accepts no arguments.
def last!
last || raise_record_not_found_exception!
end
# Find the second record.
# If no order is defined it will order by primary key.
#
# Person.second # returns the second object fetched by SELECT * FROM people
# Person.offset(3).second # returns the second object from OFFSET 3 (which is OFFSET 4)
# Person.where(["user_name = :u", { u: user_name }]).second
def second
find_nth 1
end
# Same as #second but raises ActiveRecord::RecordNotFound if no record
# is found.
def second!
second || raise_record_not_found_exception!
end
# Find the third record.
# If no order is defined it will order by primary key.
#
# Person.third # returns the third object fetched by SELECT * FROM people
# Person.offset(3).third # returns the third object from OFFSET 3 (which is OFFSET 5)
# Person.where(["user_name = :u", { u: user_name }]).third
def third
find_nth 2
end
# Same as #third but raises ActiveRecord::RecordNotFound if no record
# is found.
def third!
third || raise_record_not_found_exception!
end
# Find the fourth record.
# If no order is defined it will order by primary key.
#
# Person.fourth # returns the fourth object fetched by SELECT * FROM people
# Person.offset(3).fourth # returns the fourth object from OFFSET 3 (which is OFFSET 6)
# Person.where(["user_name = :u", { u: user_name }]).fourth
def fourth
find_nth 3
end
# Same as #fourth but raises ActiveRecord::RecordNotFound if no record
# is found.
def fourth!
fourth || raise_record_not_found_exception!
end
# Find the fifth record.
# If no order is defined it will order by primary key.
#
# Person.fifth # returns the fifth object fetched by SELECT * FROM people
# Person.offset(3).fifth # returns the fifth object from OFFSET 3 (which is OFFSET 7)
# Person.where(["user_name = :u", { u: user_name }]).fifth
def fifth
find_nth 4
end
# Same as #fifth but raises ActiveRecord::RecordNotFound if no record
# is found.
def fifth!
fifth || raise_record_not_found_exception!
end
# Find the forty-second record. Also known as accessing "the reddit".
# If no order is defined it will order by primary key.
#
# Person.forty_two # returns the forty-second object fetched by SELECT * FROM people
# Person.offset(3).forty_two # returns the forty-second object from OFFSET 3 (which is OFFSET 44)
# Person.where(["user_name = :u", { u: user_name }]).forty_two
def forty_two
find_nth 41
end
# Same as #forty_two but raises ActiveRecord::RecordNotFound if no record
# is found.
def forty_two!
forty_two || raise_record_not_found_exception!
end
# Find the third-to-last record.
# If no order is defined it will order by primary key.
#
# Person.third_to_last # returns the third-to-last object fetched by SELECT * FROM people
# Person.offset(3).third_to_last # returns the third-to-last object from OFFSET 3
# Person.where(["user_name = :u", { u: user_name }]).third_to_last
def third_to_last
find_nth_from_last 3
end
# Same as #third_to_last but raises ActiveRecord::RecordNotFound if no record
# is found.
def third_to_last!
third_to_last || raise_record_not_found_exception!
end
# Find the second-to-last record.
# If no order is defined it will order by primary key.
#
# Person.second_to_last # returns the second-to-last object fetched by SELECT * FROM people
# Person.offset(3).second_to_last # returns the second-to-last object from OFFSET 3
# Person.where(["user_name = :u", { u: user_name }]).second_to_last
def second_to_last
find_nth_from_last 2
end
# Same as #second_to_last but raises ActiveRecord::RecordNotFound if no record
# is found.
def second_to_last!
second_to_last || raise_record_not_found_exception!
end
# Returns true if a record exists in the table that matches the +id+ or
# conditions given, or false otherwise. The argument can take six forms:
#
# * Integer - Finds the record with this primary key.
# * String - Finds the record with a primary key corresponding to this
# string (such as <tt>'5'</tt>).
# * Array - Finds the record that matches these +where+-style conditions
# (such as <tt>['name LIKE ?', "%#{query}%"]</tt>).
# * Hash - Finds the record that matches these +where+-style conditions
# (such as <tt>{name: 'David'}</tt>).
# * +false+ - Always returns +false+.
# * No args - Returns +false+ if the relation is empty, +true+ otherwise.
#
# For more information about specifying conditions as a hash or array,
# see the Conditions section in the introduction to ActiveRecord::Base.
#
# Note: You can't pass in a condition as a string (like <tt>name =
# 'Jamie'</tt>), since it would be sanitized and then queried against
# the primary key column, like <tt>id = 'name = \'Jamie\''</tt>.
#
# Person.exists?(5)
# Person.exists?('5')
# Person.exists?(['name LIKE ?', "%#{query}%"])
# Person.exists?(id: [1, 4, 8])
# Person.exists?(name: 'David')
# Person.exists?(false)
# Person.exists?
# Person.where(name: 'Spartacus', rating: 4).exists?
def exists?(conditions = :none)
if Base === conditions
raise ArgumentError, <<-MSG.squish
You are passing an instance of ActiveRecord::Base to `exists?`.
Please pass the id of the object by calling `.id`.
MSG
end
return false if !conditions || limit_value == 0
if eager_loading?
relation = apply_join_dependency(eager_loading: false)
return relation.exists?(conditions)
end
relation = construct_relation_for_exists(conditions)
return false if relation.where_clause.contradiction?
skip_query_cache_if_necessary { connection.select_rows(relation.arel, "#{name} Exists?").size == 1 }
end
# Returns true if the relation contains the given record or false otherwise.
#
# No query is performed if the relation is loaded; the given record is
# compared to the records in memory. If the relation is unloaded, an
# efficient existence query is performed, as in #exists?.
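#
# For example (using the +Person+ model and +administrator+ column from the examples above):
#
#   person = Person.find(1)
#   Person.where(administrator: true).include?(person)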
def include?(record)
if loaded? || offset_value || limit_value || having_clause.any?
records.include?(record)
else
record.is_a?(klass) && exists?(record.id)
end
end
alias :member? :include?
# This method is called whenever no records are found with either a single
# id or multiple ids and raises an ActiveRecord::RecordNotFound exception.
#
# The error message is different depending on whether a single id or
# multiple ids are provided. If multiple ids are provided, then the number
# of results obtained should be provided in the +result_size+ argument and
# the expected number of results should be provided in the +expected_size+
# argument.
def raise_record_not_found_exception!(ids = nil, result_size = nil, expected_size = nil, key = primary_key, not_found_ids = nil) # :nodoc:
conditions = " [#{arel.where_sql(klass)}]" unless where_clause.empty?
name = @klass.name
if ids.nil?
error = +"Couldn't find #{name}"
error << " with#{conditions}" if conditions
raise RecordNotFound.new(error, name, key)
elsif Array.wrap(ids).size == 1
error = "Couldn't find #{name} with '#{key}'=#{ids}#{conditions}"
raise RecordNotFound.new(error, name, key, ids)
else
error = +"Couldn't find all #{name.pluralize} with '#{key}': "
error << "(#{ids.join(", ")})#{conditions} (found #{result_size} results, but was looking for #{expected_size})."
error << " Couldn't find #{name.pluralize(not_found_ids.size)} with #{key.to_s.pluralize(not_found_ids.size)} #{not_found_ids.join(', ')}." if not_found_ids
raise RecordNotFound.new(error, name, key, ids)
end
end
private
def construct_relation_for_exists(conditions)
conditions = sanitize_forbidden_attributes(conditions)
if distinct_value && offset_value
relation = except(:order).limit!(1)
else
relation = except(:select, :distinct, :order)._select!(ONE_AS_ONE).limit!(1)
end
case conditions
when Array, Hash
relation.where!(conditions) unless conditions.empty?
else
relation.where!(primary_key => conditions) unless conditions == :none
end
relation
end
def apply_join_dependency(eager_loading: group_values.empty?)
join_dependency = construct_join_dependency(
eager_load_values | includes_values, Arel::Nodes::OuterJoin
)
relation = except(:includes, :eager_load, :preload).joins!(join_dependency)
if eager_loading && has_limit_or_offset? && !(
using_limitable_reflections?(join_dependency.reflections) &&
using_limitable_reflections?(
construct_join_dependency(
select_association_list(joins_values).concat(
select_association_list(left_outer_joins_values)
), nil
).reflections
)
)
relation = skip_query_cache_if_necessary do
klass.connection.distinct_relation_for_primary_key(relation)
end
end
if block_given?
yield relation, join_dependency
else
relation
end
end
def using_limitable_reflections?(reflections)
reflections.none?(&:collection?)
end
def find_with_ids(*ids)
raise UnknownPrimaryKey.new(@klass) if primary_key.nil?
expects_array = ids.first.kind_of?(Array)
return [] if expects_array && ids.first.empty?
ids = ids.flatten.compact.uniq
model_name = @klass.name
case ids.size
when 0
error_message = "Couldn't find #{model_name} without an ID"
raise RecordNotFound.new(error_message, model_name, primary_key)
when 1
result = find_one(ids.first)
expects_array ? [ result ] : result
else
find_some(ids)
end
end
def find_one(id)
if ActiveRecord::Base === id
raise ArgumentError, <<-MSG.squish
You are passing an instance of ActiveRecord::Base to `find`.
Please pass the id of the object by calling `.id`.
MSG
end
relation = where(primary_key => id)
record = relation.take
raise_record_not_found_exception!(id, 0, 1) unless record
record
end
def find_some(ids)
return find_some_ordered(ids) unless order_values.present?
result = where(primary_key => ids).to_a
expected_size =
if limit_value && ids.size > limit_value
limit_value
else
ids.size
end
# 11 ids with limit 3, offset 9 should give 2 results.
if offset_value && (ids.size - offset_value < expected_size)
expected_size = ids.size - offset_value
end
if result.size == expected_size
result
else
raise_record_not_found_exception!(ids, result.size, expected_size)
end
end
def find_some_ordered(ids)
ids = ids.slice(offset_value || 0, limit_value || ids.size) || []
result = except(:limit, :offset).where(primary_key => ids).records
if result.size == ids.size
result.in_order_of(:id, ids.map { |id| @klass.type_for_attribute(primary_key).cast(id) })
else
raise_record_not_found_exception!(ids, result.size, ids.size)
end
end
def find_take
if loaded?
records.first
else
@take ||= limit(1).records.first
end
end
def find_take_with_limit(limit)
if loaded?
records.take(limit)
else
limit(limit).to_a
end
end
def find_nth(index)
@offsets ||= {}
@offsets[index] ||= find_nth_with_limit(index, 1).first
end
def find_nth_with_limit(index, limit)
if loaded?
records[index, limit] || []
else
relation = ordered_relation
if limit_value
limit = [limit_value - index, limit].min
end
if limit > 0
relation = relation.offset((offset_value || 0) + index) unless index.zero?
relation.limit(limit).to_a
else
[]
end
end
end
def find_nth_from_last(index)
if loaded?
records[-index]
else
relation = ordered_relation
if equal?(relation) || has_limit_or_offset?
relation.records[-index]
else
relation.last(index)[-index]
end
end
end
def find_last(limit)
limit ? records.last(limit) : records.last
end
def ordered_relation
if order_values.empty? && (implicit_order_column || primary_key)
if implicit_order_column && primary_key && implicit_order_column != primary_key
order(table[implicit_order_column].asc, table[primary_key].asc)
else
order(table[implicit_order_column || primary_key].asc)
end
else
self
end
end
end
end
# frozen_string_literal: true
require "erb"
require "yaml"
require "zlib"
require "set"
require "active_support/dependencies"
require "active_support/core_ext/digest/uuid"
require "active_record/fixture_set/file"
require "active_record/fixture_set/render_context"
require "active_record/fixture_set/table_rows"
require "active_record/test_fixtures"
module ActiveRecord
class FixtureClassNotFound < ActiveRecord::ActiveRecordError # :nodoc:
end
# \Fixtures are a way of organizing data that you want to test against; in short, sample data.
#
# They are stored in YAML files, one file per model, which are placed in the directory
# appointed by <tt>ActiveSupport::TestCase.fixture_path=(path)</tt> (this is automatically
# configured for Rails, so you can just put your files in <tt><your-rails-app>/test/fixtures/</tt>).
# The fixture file ends with the +.yml+ file extension, for example:
# <tt><your-rails-app>/test/fixtures/web_sites.yml</tt>.
#
# The format of a fixture file looks like this:
#
# rubyonrails:
# id: 1
# name: Ruby on Rails
# url: http://www.rubyonrails.org
#
# google:
# id: 2
# name: Google
# url: http://www.google.com
#
# This fixture file includes two fixtures. Each YAML fixture (i.e. record) is given a name and
# is followed by an indented list of key/value pairs in the "key: value" format. Records are
# separated by a blank line for your viewing pleasure.
#
# Note: Fixtures are unordered. If you want ordered fixtures, use the omap YAML type.
# See https://yaml.org/type/omap.html
# for the specification. You will need ordered fixtures when you have foreign key constraints
# on keys in the same table. This is commonly needed for tree structures. Example:
#
# --- !omap
# - parent:
# id: 1
# parent_id: NULL
# title: Parent
# - child:
# id: 2
# parent_id: 1
# title: Child
#
# = Using Fixtures in Test Cases
#
# Since fixtures are a testing construct, we use them in our unit and functional tests. There
# are two ways to use the fixtures, but first let's take a look at a sample unit test:
#
# require "test_helper"
#
# class WebSiteTest < ActiveSupport::TestCase
# test "web_site_count" do
# assert_equal 2, WebSite.count
# end
# end
#
# By default, +test_helper.rb+ will load all of your fixtures into your test
# database, so this test will succeed.
#
# The testing environment will automatically load all the fixtures into the database before each
# test. To ensure consistent data, the environment deletes the fixtures before running the load.
#
# In addition to being available in the database, the fixture's data may also be accessed by
# using a special dynamic method, which has the same name as the model.
#
# Passing in a fixture name to this dynamic method returns the fixture matching this name:
#
# test "find one" do
# assert_equal "Ruby on Rails", web_sites(:rubyonrails).name
# end
#
# Passing in multiple fixture names returns all fixtures matching these names:
#
# test "find all by name" do
# assert_equal 2, web_sites(:rubyonrails, :google).length
# end
#
# Passing in no arguments returns all fixtures:
#
# test "find all" do
# assert_equal 2, web_sites.length
# end
#
# Passing in any fixture name that does not exist will raise <tt>StandardError</tt>:
#
# test "find by name that does not exist" do
# assert_raise(StandardError) { web_sites(:reddit) }
# end
#
# Alternatively, you may enable auto-instantiation of the fixture data. For instance, take the
# following tests:
#
# test "find_alt_method_1" do
# assert_equal "Ruby on Rails", @web_sites['rubyonrails']['name']
# end
#
# test "find_alt_method_2" do
# assert_equal "Ruby on Rails", @rubyonrails.name
# end
#
# In order to use these methods to access fixtured data within your test cases, you must specify one of the
# following in your ActiveSupport::TestCase-derived class:
#
# - to fully enable instantiated fixtures (enable alternate methods #1 and #2 above)
# self.use_instantiated_fixtures = true
#
# - create only the hash for the fixtures, do not 'find' each instance (enable alternate method #1 only)
# self.use_instantiated_fixtures = :no_instances
#
# Using either of these alternate methods incurs a performance hit, as the fixtured data must be fully
# traversed in the database to create the fixture hash and/or instance variables. This is expensive for
# large sets of fixtured data.
#
# = Dynamic fixtures with ERB
#
# Sometimes you don't care about the content of the fixtures as much as you care about the volume.
# In these cases, you can mix ERB in with your YAML fixtures to create a bunch of fixtures for load
# testing, like:
#
# <% 1.upto(1000) do |i| %>
# fix_<%= i %>:
# id: <%= i %>
# name: guy_<%= i %>
# <% end %>
#
# This will create 1000 very simple fixtures.
#
# Using ERB, you can also inject dynamic values into your fixtures with inserts like
# <tt><%= Date.today.strftime("%Y-%m-%d") %></tt>.
# This is, however, a feature to be used with some caution. The point of fixtures is that they're
# stable units of predictable sample data. If you feel that you need to inject dynamic values, then
# perhaps you should reexamine whether your application is properly testable. Hence, dynamic values
# in fixtures are to be considered a code smell.
#
# Helper methods defined in a fixture will not be available in other fixtures, to prevent
# unwanted inter-test dependencies. Methods used by multiple fixtures should be defined in a module
# that is included in ActiveRecord::FixtureSet.context_class.
#
# - define a helper method in <tt>test_helper.rb</tt>
# module FixtureFileHelpers
# def file_sha(path)
# OpenSSL::Digest::SHA256.hexdigest(File.read(Rails.root.join('test/fixtures', path)))
# end
# end
# ActiveRecord::FixtureSet.context_class.include FixtureFileHelpers
#
# - use the helper method in a fixture
# photo:
# name: kitten.png
# sha: <%= file_sha 'files/kitten.png' %>
#
# = Transactional Tests
#
# Test cases can use begin+rollback to isolate their changes to the database instead of having to
# delete+insert for every test case.
#
# class FooTest < ActiveSupport::TestCase
# self.use_transactional_tests = true
#
# test "godzilla" do
# assert_not_empty Foo.all
# Foo.destroy_all
# assert_empty Foo.all
# end
#
# test "godzilla aftermath" do
# assert_not_empty Foo.all
# end
# end
#
# If you preload your test database with all fixture data (probably by running <tt>bin/rails db:fixtures:load</tt>)
# and use transactional tests, then you may omit all fixtures declarations in your test cases since
# all the data's already there and every case rolls back its changes.
#
# In order to use instantiated fixtures with preloaded data, set +self.pre_loaded_fixtures+ to
# true. This will provide access to fixture data for every table that has been loaded through
# fixtures (depending on the value of +use_instantiated_fixtures+).
#
# When *not* to use transactional tests:
#
# 1. You're testing whether a transaction works correctly. Nested transactions don't commit until
# all parent transactions commit; in particular, the fixtures transaction, which is begun in setup
# and rolled back in teardown. Thus, you won't be able to verify
# the results of your transaction until Active Record supports nested transactions or savepoints (in progress).
# 2. Your database does not support transactions. Every Active Record database supports transactions except MySQL MyISAM.
# Use InnoDB, MaxDB, or NDB instead.
#
# = Advanced Fixtures
#
# Fixtures that don't specify an ID get some extra features:
#
# * Stable, autogenerated IDs
# * Label references for associations (belongs_to, has_one, has_many)
# * HABTM associations as inline lists
#
# There are some more advanced features available even if the id is specified:
#
# * Autofilled timestamp columns
# * Fixture label interpolation
# * Support for YAML defaults
#
# == Stable, Autogenerated IDs
#
# Here, have a monkey fixture:
#
# george:
# id: 1
# name: George the Monkey
#
# reginald:
# id: 2
# name: Reginald the Pirate
#
# Each of these fixtures has two unique identifiers: one for the database
# and one for the humans. Why don't we generate the primary key instead?
# Hashing each fixture's label yields a consistent ID:
#
# george: # generated id: 503576764
# name: George the Monkey
#
# reginald: # generated id: 324201669
# name: Reginald the Pirate
#
# Active Record looks at the fixture's model class, discovers the correct
# primary key, and generates it right before inserting the fixture
# into the database.
#
# The generated ID for a given label is constant, so we can discover
# any fixture's ID without loading anything, as long as we know the label.
#
# == Label references for associations (+belongs_to+, +has_one+, +has_many+)
#
# Specifying foreign keys in fixtures can be very fragile, not to
# mention difficult to read. Since Active Record can figure out the ID of
# any fixture from its label, you can specify FKs by label instead of by ID.
#
# === +belongs_to+
#
# Let's break out some more monkeys and pirates.
#
# ### in pirates.yml
#
# reginald:
# id: 1
# name: Reginald the Pirate
# monkey_id: 1
#
# ### in monkeys.yml
#
# george:
# id: 1
# name: George the Monkey
# pirate_id: 1
#
# Add a few more monkeys and pirates and break this into multiple files,
# and it gets pretty hard to keep track of what's going on. Let's
# use labels instead of IDs:
#
# ### in pirates.yml
#
# reginald:
# name: Reginald the Pirate
# monkey: george
#
# ### in monkeys.yml
#
# george:
# name: George the Monkey
# pirate: reginald
#
# Pow! All is made clear. Active Record reflects on the fixture's model class,
# finds all the +belongs_to+ associations, and allows you to specify
# a target *label* for the *association* (monkey: george) rather than
# a target *id* for the *FK* (<tt>monkey_id: 1</tt>).
#
# ==== Polymorphic +belongs_to+
#
# Supporting polymorphic relationships is a little bit more complicated, since
# Active Record needs to know what type your association is pointing at. Something
# like this should look familiar:
#
# ### in fruit.rb
#
# belongs_to :eater, polymorphic: true
#
# ### in fruits.yml
#
# apple:
# id: 1
# name: apple
# eater_id: 1
# eater_type: Monkey
#
# Can we do better? You bet!
#
# apple:
# eater: george (Monkey)
#
# Just provide the polymorphic target type and Active Record will take care of the rest.
#
# === +has_and_belongs_to_many+ or <tt>has_many :through</tt>
#
# Time to give our monkey some fruit.
#
# ### in monkeys.yml
#
# george:
# id: 1
# name: George the Monkey
#
# ### in fruits.yml
#
# apple:
# id: 1
# name: apple
#
# orange:
# id: 2
# name: orange
#
# grape:
# id: 3
# name: grape
#
# ### in fruits_monkeys.yml
#
# apple_george:
# fruit_id: 1
# monkey_id: 1
#
# orange_george:
# fruit_id: 2
# monkey_id: 1
#
# grape_george:
# fruit_id: 3
# monkey_id: 1
#
# Let's make the HABTM fixture go away.
#
# ### in monkeys.yml
#
# george:
# id: 1
# name: George the Monkey
# fruits: apple, orange, grape
#
# ### in fruits.yml
#
# apple:
# name: apple
#
# orange:
# name: orange
#
# grape:
# name: grape
#
# Zap! No more fruits_monkeys.yml file. We've specified the list of fruits
# on George's fixture, but we could've just as easily specified a list
# of monkeys on each fruit. As with +belongs_to+, Active Record reflects on
# the fixture's model class and discovers the +has_and_belongs_to_many+
# associations.
#
# == Autofilled Timestamp Columns
#
# If your table/model specifies any of Active Record's
# standard timestamp columns (+created_at+, +created_on+, +updated_at+, +updated_on+),
# they will automatically be set to <tt>Time.now</tt>.
#
# If you've set specific values, they'll be left alone.
#
# == Fixture label interpolation
#
# The label of the current fixture is always available as a column value:
#
# geeksomnia:
# name: Geeksomnia's Account
# subdomain: $LABEL
# email: $LABEL@email.com
#
# Also, sometimes (like when porting older join table fixtures) you'll need
# to be able to get a hold of the identifier for a given label. ERB
# to the rescue:
#
# george_reginald:
# monkey_id: <%= ActiveRecord::FixtureSet.identify(:reginald) %>
# pirate_id: <%= ActiveRecord::FixtureSet.identify(:george) %>
#
# == Support for YAML defaults
#
# You can set and reuse defaults in your fixtures YAML file.
# This is the same technique used in the +database.yml+ file to specify
# defaults:
#
# DEFAULTS: &DEFAULTS
# created_on: <%= 3.weeks.ago.to_fs(:db) %>
#
# first:
# name: Smurf
# <<: *DEFAULTS
#
# second:
# name: Fraggle
# <<: *DEFAULTS
#
# Any fixture labeled "DEFAULTS" is safely ignored.
#
# Besides using "DEFAULTS", you can also specify what fixtures will
# be ignored by setting "ignore" in "_fixture" section.
#
# # users.yml
# _fixture:
# ignore:
# - base
# # or use "ignore: base" when there is only one fixture that needs to be ignored.
#
# base: &base
# admin: false
# introduction: "This is a default description"
#
# admin:
# <<: *base
# admin: true
#
# visitor:
# <<: *base
#
# In the above example, 'base' will be ignored when creating fixtures.
# This can be used for sharing common attributes across fixtures.
#
# == Configure the fixture model class
#
# It's possible to set the fixture's model class directly in the YAML file.
# This is helpful when fixtures are loaded outside tests and
# +set_fixture_class+ is not available (e.g.
# when running <tt>bin/rails db:fixtures:load</tt>).
#
# _fixture:
# model_class: User
# david:
# name: David
#
# Any fixtures labeled "_fixture" are safely ignored.
class FixtureSet
#--
# An instance of FixtureSet is normally stored in a single YAML file and
# possibly in a folder with the same name.
#++
MAX_ID = 2**30 - 1
@@all_cached_fixtures = Hash.new { |h, k| h[k] = {} }
cattr_accessor :all_loaded_fixtures, default: {}
class ClassCache
def initialize(class_names, config)
@class_names = class_names.stringify_keys
@config = config
# Remove string values that aren't constants or subclasses of AR
@class_names.delete_if do |klass_name, klass|
!insert_class(@class_names, klass_name, klass)
end
end
def [](fs_name)
@class_names.fetch(fs_name) do
klass = default_fixture_model(fs_name, @config).safe_constantize
insert_class(@class_names, fs_name, klass)
end
end
private
def insert_class(class_names, name, klass)
# We only want to deal with AR objects.
if klass && klass < ActiveRecord::Base
class_names[name] = klass
else
class_names[name] = nil
end
end
def default_fixture_model(fs_name, config)
ActiveRecord::FixtureSet.default_fixture_model_name(fs_name, config)
end
end
class << self
def default_fixture_model_name(fixture_set_name, config = ActiveRecord::Base) # :nodoc:
config.pluralize_table_names ?
fixture_set_name.singularize.camelize :
fixture_set_name.camelize
end
def default_fixture_table_name(fixture_set_name, config = ActiveRecord::Base) # :nodoc:
"#{ config.table_name_prefix }"\
"#{ fixture_set_name.tr('/', '_') }"\
"#{ config.table_name_suffix }".to_sym
end
def reset_cache
@@all_cached_fixtures.clear
end
def cache_for_connection(connection)
@@all_cached_fixtures[connection]
end
def fixture_is_cached?(connection, table_name)
cache_for_connection(connection)[table_name]
end
def cached_fixtures(connection, keys_to_fetch = nil)
if keys_to_fetch
cache_for_connection(connection).values_at(*keys_to_fetch)
else
cache_for_connection(connection).values
end
end
def cache_fixtures(connection, fixtures_map)
cache_for_connection(connection).update(fixtures_map)
end
def instantiate_fixtures(object, fixture_set, load_instances = true)
return unless load_instances
fixture_set.each do |fixture_name, fixture|
object.instance_variable_set "@#{fixture_name}", fixture.find
rescue FixtureClassNotFound
nil
end
end
def instantiate_all_loaded_fixtures(object, load_instances = true)
all_loaded_fixtures.each_value do |fixture_set|
instantiate_fixtures(object, fixture_set, load_instances)
end
end
def create_fixtures(fixtures_directory, fixture_set_names, class_names = {}, config = ActiveRecord::Base, &block)
fixture_set_names = Array(fixture_set_names).map(&:to_s)
class_names = ClassCache.new class_names, config
# FIXME: Apparently JK uses this.
connection = block_given? ? block : lambda { ActiveRecord::Base.connection }
fixture_files_to_read = fixture_set_names.reject do |fs_name|
fixture_is_cached?(connection.call, fs_name)
end
if fixture_files_to_read.any?
fixtures_map = read_and_insert(
fixtures_directory,
fixture_files_to_read,
class_names,
connection,
)
cache_fixtures(connection.call, fixtures_map)
end
cached_fixtures(connection.call, fixture_set_names)
end
# Returns a consistent, platform-independent identifier for +label+.
# Integer identifiers are values less than 2^30. UUIDs are RFC 4122 version 5 SHA-1 hashes.
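#
# For example, matching the generated IDs shown in the fixture documentation above:
#
#   ActiveRecord::FixtureSet.identify(:george)        # => 503576764
#   ActiveRecord::FixtureSet.identify(:george, :uuid) # => an RFC 4122 version 5 UUID string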
def identify(label, column_type = :integer)
if column_type == :uuid
Digest::UUID.uuid_v5(Digest::UUID::OID_NAMESPACE, label.to_s)
else
Zlib.crc32(label.to_s) % MAX_ID
end
end
# Superclass for the evaluation contexts used by ERB fixtures.
def context_class
@context_class ||= Class.new
end
private
def read_and_insert(fixtures_directory, fixture_files, class_names, connection) # :nodoc:
fixtures_map = {}
fixture_sets = fixture_files.map do |fixture_set_name|
klass = class_names[fixture_set_name]
fixtures_map[fixture_set_name] = new( # ActiveRecord::FixtureSet.new
nil,
fixture_set_name,
klass,
::File.join(fixtures_directory, fixture_set_name)
)
end
update_all_loaded_fixtures(fixtures_map)
insert(fixture_sets, connection)
fixtures_map
end
def insert(fixture_sets, connection) # :nodoc:
fixture_sets_by_connection = fixture_sets.group_by do |fixture_set|
if fixture_set.model_class
fixture_set.model_class.connection
else
connection.call
end
end
fixture_sets_by_connection.each do |conn, set|
table_rows_for_connection = Hash.new { |h, k| h[k] = [] }
set.each do |fixture_set|
fixture_set.table_rows.each do |table, rows|
table_rows_for_connection[table].unshift(*rows)
end
end
conn.insert_fixtures_set(table_rows_for_connection, table_rows_for_connection.keys)
if ActiveRecord.verify_foreign_keys_for_fixtures && !conn.all_foreign_keys_valid?
raise "Foreign key violations found in your fixture data. Ensure you aren't referring to labels that don't exist on associations."
end
# Cap primary key sequences to max(pk).
if conn.respond_to?(:reset_pk_sequence!)
set.each { |fs| conn.reset_pk_sequence!(fs.table_name) }
end
end
end
def update_all_loaded_fixtures(fixtures_map) # :nodoc:
all_loaded_fixtures.update(fixtures_map)
end
end
attr_reader :table_name, :name, :fixtures, :model_class, :ignored_fixtures, :config
def initialize(_, name, class_name, path, config = ActiveRecord::Base)
@name = name
@path = path
@config = config
self.model_class = class_name
@fixtures = read_fixture_files(path)
@table_name = model_class&.table_name || self.class.default_fixture_table_name(name, config)
end
def [](x)
fixtures[x]
end
def []=(k, v)
fixtures[k] = v
end
def each(&block)
fixtures.each(&block)
end
def size
fixtures.size
end
# Returns a hash of rows to be inserted. The key is the table, the value is
# a list of rows to insert to that table.
def table_rows
# allow specifying fixtures to be ignored by setting `ignore` in `_fixture` section
fixtures.except!(*ignored_fixtures)
TableRows.new(
table_name,
model_class: model_class,
fixtures: fixtures,
).to_hash
end
private
def model_class=(class_name)
if class_name.is_a?(Class) # TODO: Should be an AR::Base type class, or any?
@model_class = class_name
else
@model_class = class_name.safe_constantize if class_name
end
end
def ignored_fixtures=(base)
@ignored_fixtures =
case base
when Array
base
when String
[base]
else
[]
end
@ignored_fixtures << "DEFAULTS" unless @ignored_fixtures.include?("DEFAULTS")
@ignored_fixtures.compact
end
# Loads the fixtures from the YAML file at +path+.
# If the file sets the +model_class+ and current instance value is not set,
# it uses the file value.
def read_fixture_files(path)
yaml_files = Dir["#{path}/{**,*}/*.yml"].select { |f|
::File.file?(f)
} + [yaml_file_path(path)]
yaml_files.each_with_object({}) do |file, fixtures|
FixtureSet::File.open(file) do |fh|
self.model_class ||= fh.model_class if fh.model_class
self.ignored_fixtures ||= fh.ignored_fixtures
fh.each do |fixture_name, row|
fixtures[fixture_name] = ActiveRecord::Fixture.new(row, model_class)
end
end
end
end
def yaml_file_path(path)
"#{path}.yml"
end
end
class Fixture # :nodoc:
include Enumerable
class FixtureError < StandardError # :nodoc:
end
class FormatError < FixtureError # :nodoc:
end
attr_reader :model_class, :fixture
def initialize(fixture, model_class)
@fixture = fixture
@model_class = model_class
end
def class_name
model_class.name if model_class
end
def each(&block)
fixture.each(&block)
end
def [](key)
fixture[key]
end
alias :to_hash :fixture
def find
raise FixtureClassNotFound, "No class attached to find." unless model_class
object = model_class.unscoped do
model_class.find(fixture[model_class.primary_key])
end
# Fixtures can't be eagerly loaded
object.instance_variable_set(:@strict_loading, false)
object
end
end
end
ActiveSupport.run_load_hooks :active_record_fixture_set, ActiveRecord::FixtureSet
# frozen_string_literal: true
module ActiveRecord::Associations
module ForeignAssociation # :nodoc:
def foreign_key_present?
if reflection.klass.primary_key
owner.attribute_present?(reflection.active_record_primary_key)
else
false
end
end
def nullified_owner_attributes
Hash.new.tap do |attrs|
attrs[reflection.foreign_key] = nil
attrs[reflection.type] = nil if reflection.type.present?
end
end
private
# Sets the owner attributes on the given record
def set_owner_attributes(record)
return if options[:through]
key = owner._read_attribute(reflection.join_foreign_key)
record._write_attribute(reflection.join_primary_key, key)
if reflection.type
record._write_attribute(reflection.type, owner.class.polymorphic_name)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class Relation
class FromClause # :nodoc:
attr_reader :value, :name
def initialize(value, name)
@value = value
@name = name
end
def merge(other)
self
end
def empty?
value.nil?
end
def ==(other)
self.class == other.class && value == other.value && name == other.name
end
def self.empty
@empty ||= new(nil, nil).freeze
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class FullOuterJoin < Arel::Nodes::Join
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Function < Arel::Nodes::NodeExpression
include Arel::WindowPredications
include Arel::FilterPredications
attr_accessor :expressions, :alias, :distinct
def initialize(expr, aliaz = nil)
super()
@expressions = expr
@alias = aliaz && SqlLiteral.new(aliaz)
@distinct = false
end
def as(aliaz)
self.alias = SqlLiteral.new(aliaz)
self
end
def hash
[@expressions, @alias, @distinct].hash
end
def eql?(other)
self.class == other.class &&
self.expressions == other.expressions &&
self.alias == other.alias &&
self.distinct == other.distinct
end
alias :== :eql?
end
%w{
Sum
Exists
Max
Min
Avg
}.each do |name|
const_set(name, Class.new(Function))
end
end
end
# frozen_string_literal: true
module ActiveRecord
class FutureResult # :nodoc:
class EventBuffer
def initialize(future_result, instrumenter)
@future_result = future_result
@instrumenter = instrumenter
@events = []
end
def instrument(name, payload = {}, &block)
event = @instrumenter.new_event(name, payload)
@events << event
event.record(&block)
end
def flush
events, @events = @events, []
events.each do |event|
event.payload[:lock_wait] = @future_result.lock_wait
ActiveSupport::Notifications.publish_event(event)
end
end
end
Canceled = Class.new(ActiveRecordError)
delegate :empty?, :to_a, to: :result
attr_reader :lock_wait
def initialize(pool, *args, **kwargs)
@mutex = Mutex.new
@session = nil
@pool = pool
@args = args
@kwargs = kwargs
@pending = true
@error = nil
@result = nil
@instrumenter = ActiveSupport::Notifications.instrumenter
@event_buffer = nil
end
def schedule!(session)
@session = session
@pool.schedule_query(self)
end
def execute!(connection)
execute_query(connection)
end
def cancel
@pending = false
@error = Canceled
self
end
def execute_or_skip
return unless pending?
@pool.with_connection do |connection|
return unless @mutex.try_lock
begin
if pending?
@event_buffer = EventBuffer.new(self, @instrumenter)
connection.with_instrumenter(@event_buffer) do
execute_query(connection, async: true)
end
end
ensure
@mutex.unlock
end
end
end
def result
execute_or_wait
@event_buffer&.flush
if canceled?
raise Canceled
elsif @error
raise @error
else
@result
end
end
def pending?
@pending && (!@session || @session.active?)
end
private
def canceled?
@session && !@session.active?
end
def execute_or_wait
if pending?
start = Process.clock_gettime(Process::CLOCK_MONOTONIC, :float_millisecond)
@mutex.synchronize do
if pending?
execute_query(@pool.connection)
else
@lock_wait = (Process.clock_gettime(Process::CLOCK_MONOTONIC, :float_millisecond) - start)
end
end
else
@lock_wait = 0.0
end
end
def execute_query(connection, async: false)
@result = exec_query(connection, *@args, **@kwargs, async: async)
rescue => error
@error = error
ensure
@pending = false
end
def exec_query(connection, *args, **kwargs)
connection.exec_query(*args, **kwargs)
end
class SelectAll < FutureResult # :nodoc:
private
def exec_query(*, **)
super
rescue ::RangeError
ActiveRecord::Result.empty
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
# Returns the version of the currently loaded Active Record as a <tt>Gem::Version</tt>
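# For example, given the VERSION constants defined below:
#
#   ActiveRecord.gem_version # => #<Gem::Version "7.1.0.alpha">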
def self.gem_version
Gem::Version.new VERSION::STRING
end
module VERSION
MAJOR = 7
MINOR = 1
TINY = 0
PRE = "alpha"
STRING = [MAJOR, MINOR, TINY, PRE].compact.join(".")
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Grouping < Unary
def fetch_attribute(&block)
expr.fetch_attribute(&block)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord::Associations::Builder # :nodoc:
class HasAndBelongsToMany # :nodoc:
attr_reader :lhs_model, :association_name, :options
def initialize(association_name, lhs_model, options)
@association_name = association_name
@lhs_model = lhs_model
@options = options
end
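# A minimal sketch of what this builder produces (hypothetical models assumed):
#
#   class Developer < ActiveRecord::Base
#     has_and_belongs_to_many :projects
#   end
#
# defines an anonymous join model named "HABTM_Projects" (see +through_model+ below)
# backed by the developers_projects join table, plus a has_many :through
# reflection that routes the association through it.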
def through_model
join_model = Class.new(ActiveRecord::Base) {
class << self
attr_accessor :left_model
attr_accessor :name
attr_accessor :table_name_resolver
attr_accessor :left_reflection
attr_accessor :right_reflection
end
def self.table_name
# Table name needs to be resolved lazily
# because RHS class might not have been loaded
@table_name ||= table_name_resolver.call
end
def self.compute_type(class_name)
left_model.compute_type class_name
end
def self.add_left_association(name, options)
belongs_to name, required: false, **options
self.left_reflection = _reflect_on_association(name)
end
def self.add_right_association(name, options)
rhs_name = name.to_s.singularize.to_sym
belongs_to rhs_name, required: false, **options
self.right_reflection = _reflect_on_association(rhs_name)
end
def self.retrieve_connection
left_model.retrieve_connection
end
private
def self.suppress_composite_primary_key(pk)
pk unless pk.is_a?(Array)
end
}
join_model.name = "HABTM_#{association_name.to_s.camelize}"
join_model.table_name_resolver = -> { table_name }
join_model.left_model = lhs_model
join_model.add_left_association :left_side, anonymous_class: lhs_model
join_model.add_right_association association_name, belongs_to_options(options)
join_model
end
def middle_reflection(join_model)
middle_name = [lhs_model.name.downcase.pluralize,
association_name.to_s].sort.join("_").gsub("::", "_").to_sym
middle_options = middle_options join_model
HasMany.create_reflection(lhs_model,
middle_name,
nil,
middle_options)
end
private
def middle_options(join_model)
middle_options = {}
middle_options[:class_name] = "#{lhs_model.name}::#{join_model.name}"
if options.key? :foreign_key
middle_options[:foreign_key] = options[:foreign_key]
end
middle_options
end
def table_name
if options[:join_table]
options[:join_table].to_s
else
class_name = options.fetch(:class_name) {
association_name.to_s.camelize.singularize
}
klass = lhs_model.send(:compute_type, class_name.to_s)
[lhs_model.table_name, klass.table_name].sort.join("\0").gsub(/^(.*[._])(.+)\0\1(.+)/, '\1\2_\3').tr("\0", "_")
end
end
def belongs_to_options(options)
rhs_options = {}
if options.key? :class_name
rhs_options[:foreign_key] = options[:class_name].to_s.foreign_key
rhs_options[:class_name] = options[:class_name]
end
if options.key? :association_foreign_key
rhs_options[:foreign_key] = options[:association_foreign_key]
end
rhs_options
end
end
end
# frozen_string_literal: true
module ActiveRecord::Associations::Builder # :nodoc:
class HasMany < CollectionAssociation # :nodoc:
def self.macro
:has_many
end
def self.valid_options(options)
valid = super + [:counter_cache, :join_table, :index_errors]
valid += [:as, :foreign_type] if options[:as]
valid += [:through, :source, :source_type] if options[:through]
valid += [:ensuring_owner_was] if options[:dependent] == :destroy_async
valid += [:disable_joins] if options[:disable_joins] && options[:through]
valid
end
def self.valid_dependent_options
[:destroy, :delete_all, :nullify, :restrict_with_error, :restrict_with_exception, :destroy_async]
end
private_class_method :macro, :valid_options, :valid_dependent_options
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
# = Active Record Has Many Association
# This is the proxy that handles a has many association.
#
# If the association has a <tt>:through</tt> option further specialization
# is provided by its child HasManyThroughAssociation.
class HasManyAssociation < CollectionAssociation # :nodoc:
include ForeignAssociation
def handle_dependency
case options[:dependent]
when :restrict_with_exception
raise ActiveRecord::DeleteRestrictionError.new(reflection.name) unless empty?
when :restrict_with_error
unless empty?
record = owner.class.human_attribute_name(reflection.name).downcase
owner.errors.add(:base, :'restrict_dependent_destroy.has_many', record: record)
throw(:abort)
end
when :destroy
# No point in executing the counter update since we're going to destroy the parent anyway
load_target.each { |t| t.destroyed_by_association = reflection }
destroy_all
when :destroy_async
load_target.each do |t|
t.destroyed_by_association = reflection
end
unless target.empty?
association_class = target.first.class
primary_key_column = association_class.primary_key.to_sym
ids = target.collect do |assoc|
assoc.public_send(primary_key_column)
end
enqueue_destroy_association(
owner_model_name: owner.class.to_s,
owner_id: owner.id,
association_class: reflection.klass.to_s,
association_ids: ids,
association_primary_key_column: primary_key_column,
ensuring_owner_was_method: options.fetch(:ensuring_owner_was, nil)
)
end
else
delete_all
end
end
def insert_record(record, validate = true, raise = false)
set_owner_attributes(record)
super
end
private
# Returns the number of records in this collection.
#
# If the association has a counter cache it gets that value. Otherwise
# it will attempt to do a count via SQL, bounded to <tt>:limit</tt> if
# there's one. Some configuration options like :group make it impossible
# to do an SQL count; in those cases the array count will be used.
#
# That does not depend on whether the collection has already been loaded
# or not. The +size+ method is the one that takes the loaded flag into
# account and delegates to +count_records+ if needed.
#
# If the collection is empty the target is set to an empty array and
# the loaded flag is set to true as well.
def count_records
count = if reflection.has_cached_counter?
owner.read_attribute(reflection.counter_cache_column).to_i
else
scope.count(:all)
end
# If there's nothing in the database and @target has no new records
# we are certain the current target is an empty array. This is a
# documented side-effect of the method that may avoid an extra SELECT.
loaded! if count == 0
[association_scope.limit_value, count].compact.min
end
def update_counter(difference, reflection = reflection())
if reflection.has_cached_counter?
owner.increment!(reflection.counter_cache_column, difference)
end
end
def update_counter_in_memory(difference, reflection = reflection())
if reflection.counter_must_be_updated_by_has_many?
counter = reflection.counter_cache_column
owner.increment(counter, difference)
owner.send(:"clear_#{counter}_change")
end
end
def delete_count(method, scope)
if method == :delete_all
scope.delete_all
else
scope.update_all(nullified_owner_attributes)
end
end
def delete_or_nullify_all_records(method)
count = delete_count(method, scope)
update_counter(-count)
count
end
# Deletes the records according to the <tt>:dependent</tt> option.
def delete_records(records, method)
if method == :destroy
records.each(&:destroy!)
update_counter(-records.length) unless reflection.inverse_updates_counter_cache?
else
scope = self.scope.where(reflection.klass.primary_key => records)
update_counter(-delete_count(method, scope))
end
end
def concat_records(records, *)
update_counter_if_success(super, records.length)
end
def _create_record(attributes, *)
if attributes.is_a?(Array)
super
else
update_counter_if_success(super, 1)
end
end
def update_counter_if_success(saved_successfully, difference)
if saved_successfully
update_counter_in_memory(difference)
end
saved_successfully
end
def difference(a, b)
a - b
end
def intersection(a, b)
a & b
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
# = Active Record Has Many Through Association
class HasManyThroughAssociation < HasManyAssociation # :nodoc:
include ThroughAssociation
def initialize(owner, reflection)
super
@through_records = {}.compare_by_identity
end
def concat(*records)
unless owner.new_record?
records.flatten.each do |record|
raise_on_type_mismatch!(record)
end
end
super
end
def insert_record(record, validate = true, raise = false)
ensure_not_nested
if record.new_record? || record.has_changes_to_save?
return unless super
end
save_through_record(record)
record
end
private
def concat_records(records)
ensure_not_nested
records = super(records, true)
if owner.new_record? && records
records.flatten.each do |record|
build_through_record(record)
end
end
records
end
# The through record (built with build_record) is temporarily cached
# so that it may be reused if insert_record is subsequently called.
#
# However, after insert_record has been called, the cache is cleared in
# order to allow multiple instances of the same record in an association.
def build_through_record(record)
@through_records[record] ||= begin
ensure_mutable
attributes = through_scope_attributes
attributes[source_reflection.name] = record
attributes[source_reflection.foreign_type] = options[:source_type] if options[:source_type]
through_association.build(attributes)
end
end
attr_reader :through_scope
def through_scope_attributes
scope = through_scope || self.scope
scope.where_values_hash(through_association.reflection.name.to_s).
except!(through_association.reflection.foreign_key,
through_association.reflection.klass.inheritance_column)
end
def save_through_record(record)
association = build_through_record(record)
if association.changed?
association.save!
end
ensure
@through_records.delete(record)
end
def build_record(attributes)
ensure_not_nested
@through_scope = scope
record = super
inverse = source_reflection.inverse_of
if inverse
if inverse.collection?
record.send(inverse.name) << build_through_record(record)
elsif inverse.has_one?
record.send("#{inverse.name}=", build_through_record(record))
end
end
record
ensure
@through_scope = nil
end
def remove_records(existing_records, records, method)
super
delete_through_records(records)
end
def target_reflection_has_associated_record?
!(through_reflection.belongs_to? && owner[through_reflection.foreign_key].blank?)
end
def update_through_counter?(method)
case method
when :destroy
!through_reflection.inverse_updates_counter_cache?
when :nullify
false
else
true
end
end
def delete_or_nullify_all_records(method)
delete_records(load_target, method)
end
def delete_records(records, method)
ensure_not_nested
scope = through_association.scope
scope.where! construct_join_attributes(*records)
scope = scope.where(through_scope_attributes)
case method
when :destroy
if scope.klass.primary_key
count = scope.destroy_all.count(&:destroyed?)
else
scope.each(&:_run_destroy_callbacks)
count = scope.delete_all
end
when :nullify
count = scope.update_all(source_reflection.foreign_key => nil)
else
count = scope.delete_all
end
delete_through_records(records)
if source_reflection.options[:counter_cache] && method != :destroy
counter = source_reflection.counter_cache_column
klass.decrement_counter counter, records.map(&:id)
end
if through_reflection.collection? && update_through_counter?(method)
update_counter(-count, through_reflection)
else
update_counter(-count)
end
count
end
def difference(a, b)
distribution = distribution(b)
a.reject { |record| mark_occurrence(distribution, record) }
end
def intersection(a, b)
distribution = distribution(b)
a.select { |record| mark_occurrence(distribution, record) }
end
def mark_occurrence(distribution, record)
distribution[record] > 0 && distribution[record] -= 1
end
def distribution(array)
array.each_with_object(Hash.new(0)) do |record, distribution|
distribution[record] += 1
end
end
def through_records_for(record)
attributes = construct_join_attributes(record)
candidates = Array.wrap(through_association.target)
candidates.find_all do |c|
attributes.all? do |key, value|
c.public_send(key) == value
end
end
end
def delete_through_records(records)
records.each do |record|
through_records = through_records_for(record)
if through_reflection.collection?
through_records.each { |r| through_association.target.delete(r) }
else
if through_records.include?(through_association.target)
through_association.target = nil
end
end
@through_records.delete(record)
end
end
def find_target
return [] unless target_reflection_has_associated_record?
return scope.to_a if disable_joins
super
end
# NOTE - not sure that we can actually cope with inverses here
def invertible_for?(record)
false
end
end
end
end
# frozen_string_literal: true
module ActiveRecord::Associations::Builder # :nodoc:
class HasOne < SingularAssociation # :nodoc:
def self.macro
:has_one
end
def self.valid_options(options)
valid = super
valid += [:as, :foreign_type] if options[:as]
valid += [:ensuring_owner_was] if options[:dependent] == :destroy_async
valid += [:through, :source, :source_type] if options[:through]
valid += [:disable_joins] if options[:disable_joins] && options[:through]
valid
end
def self.valid_dependent_options
[:destroy, :destroy_async, :delete, :nullify, :restrict_with_error, :restrict_with_exception]
end
def self.define_callbacks(model, reflection)
super
add_touch_callbacks(model, reflection) if reflection.options[:touch]
end
def self.add_destroy_callbacks(model, reflection)
super unless reflection.options[:through]
end
def self.define_validations(model, reflection)
super
if reflection.options[:required]
model.validates_presence_of reflection.name, message: :required
end
end
def self.touch_record(record, name, touch)
instance = record.send(name)
if instance&.persisted?
touch != true ?
instance.touch(touch) : instance.touch
end
end
def self.add_touch_callbacks(model, reflection)
name = reflection.name
touch = reflection.options[:touch]
callback = -> (record) { HasOne.touch_record(record, name, touch) }
model.after_create callback, if: :saved_changes?
model.after_create_commit { association(name).reset_negative_cache }
model.after_update callback, if: :saved_changes?
model.after_destroy callback
model.after_touch callback
end
private_class_method :macro, :valid_options, :valid_dependent_options, :add_destroy_callbacks,
:define_callbacks, :define_validations, :add_touch_callbacks
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
# = Active Record Has One Association
class HasOneAssociation < SingularAssociation # :nodoc:
include ForeignAssociation
def handle_dependency
case options[:dependent]
when :restrict_with_exception
raise ActiveRecord::DeleteRestrictionError.new(reflection.name) if load_target
when :restrict_with_error
if load_target
record = owner.class.human_attribute_name(reflection.name).downcase
owner.errors.add(:base, :'restrict_dependent_destroy.has_one', record: record)
throw(:abort)
end
else
delete
end
end
def delete(method = options[:dependent])
if load_target
case method
when :delete
target.delete
when :destroy
target.destroyed_by_association = reflection
target.destroy
throw(:abort) unless target.destroyed?
when :destroy_async
primary_key_column = target.class.primary_key.to_sym
id = target.public_send(primary_key_column)
enqueue_destroy_association(
owner_model_name: owner.class.to_s,
owner_id: owner.id,
association_class: reflection.klass.to_s,
association_ids: [id],
association_primary_key_column: primary_key_column,
ensuring_owner_was_method: options.fetch(:ensuring_owner_was, nil)
)
when :nullify
target.update_columns(nullified_owner_attributes) if target.persisted?
end
end
end
private
def replace(record, save = true)
raise_on_type_mismatch!(record) if record
return target unless load_target || record
assigning_another_record = target != record
if assigning_another_record || record.has_changes_to_save?
save &&= owner.persisted?
transaction_if(save) do
remove_target!(options[:dependent]) if target && !target.destroyed? && assigning_another_record
if record
set_owner_attributes(record)
set_inverse_instance(record)
if save && !record.save
nullify_owner_attributes(record)
set_owner_attributes(target) if target
raise RecordNotSaved.new("Failed to save the new associated #{reflection.name}.", record)
end
end
end
end
self.target = record
end
# The reason that the save param for replace is false, even for create (not just build),
# is because the setting of the foreign keys is actually handled by the scoping when
# the record is instantiated, and so they are set straight away and do not need to be
# updated within replace.
def set_new_record(record)
replace(record, false)
end
def remove_target!(method)
case method
when :delete
target.delete
when :destroy
target.destroyed_by_association = reflection
if target.persisted?
target.destroy
end
else
nullify_owner_attributes(target)
remove_inverse_instance(target)
if target.persisted? && owner.persisted? && !target.save
set_owner_attributes(target)
raise RecordNotSaved.new(
"Failed to remove the existing associated #{reflection.name}. " \
"The record failed to save after its foreign key was set to nil.",
target
)
end
end
end
def nullify_owner_attributes(record)
record[reflection.foreign_key] = nil
end
def transaction_if(value, &block)
if value
reflection.klass.transaction(&block)
else
yield
end
end
def _create_record(attributes, raise_error = false, &block)
unless owner.persisted?
raise ActiveRecord::RecordNotSaved.new("You cannot call create unless the parent is saved", owner)
end
super
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
# = Active Record Has One Through Association
class HasOneThroughAssociation < HasOneAssociation # :nodoc:
include ThroughAssociation
private
def replace(record, save = true)
create_through_record(record, save)
self.target = record
end
def create_through_record(record, save)
ensure_not_nested
through_proxy = through_association
through_record = through_proxy.load_target
if through_record && !record
through_record.destroy
elsif record
attributes = construct_join_attributes(record)
if through_record && through_record.destroyed?
through_record = through_proxy.tap(&:reload).target
end
if through_record
if through_record.new_record?
through_record.assign_attributes(attributes)
else
through_record.update(attributes)
end
elsif owner.new_record? || !save
through_proxy.build(attributes)
else
through_proxy.create(attributes)
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class DatabaseConfigurations
# A HashConfig object is created for each database configuration entry that
# is created from a hash.
#
# A hash config:
#
# { "development" => { "database" => "db_name" } }
#
# Becomes:
#
# #<ActiveRecord::DatabaseConfigurations::HashConfig:0x00007fd1acbded10
# @env_name="development", @name="primary", @config={database: "db_name"}>
#
# ==== Options
#
# * <tt>:env_name</tt> - The Rails environment, i.e. "development".
# * <tt>:name</tt> - The db config name. In a standard two-tier
# database configuration this will default to "primary". In a
# multiple-database, three-tier configuration this corresponds to the name
# used in the second tier, for example "primary_readonly".
# * <tt>:config</tt> - The config hash. This is the hash that contains the
# database adapter, name, and other important information for database
# connections.
class HashConfig < DatabaseConfig
attr_reader :configuration_hash
def initialize(env_name, name, configuration_hash)
super(env_name, name)
@configuration_hash = configuration_hash.symbolize_keys.freeze
end
# Determines whether a database configuration is for a replica / readonly
# connection. If the +replica+ key is present in the config, +replica?+ will
# return +true+.
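#
# A minimal sketch of how this reads from a config entry (the env name and
# config values below are assumptions, not taken from a real app):
#
#   config = HashConfig.new("production", "replica_db", { replica: true })
#   config.replica? # => true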
def replica?
configuration_hash[:replica]
end
# The migrations paths for a database configuration. If the
# +migrations_paths+ key is present in the config, +migrations_paths+
# will return its value.
def migrations_paths
configuration_hash[:migrations_paths]
end
def host
configuration_hash[:host]
end
def socket # :nodoc:
configuration_hash[:socket]
end
def database
configuration_hash[:database]
end
def _database=(database) # :nodoc:
@configuration_hash = configuration_hash.merge(database: database).freeze
end
def pool
(configuration_hash[:pool] || 5).to_i
end
def min_threads
(configuration_hash[:min_threads] || 0).to_i
end
def max_threads
(configuration_hash[:max_threads] || pool).to_i
end
def max_queue
max_threads * 4
end
def checkout_timeout
(configuration_hash[:checkout_timeout] || 5).to_f
end
# +reaping_frequency+ is configurable mostly for historical reasons, but it could
# also be useful if someone wants a very low +idle_timeout+.
def reaping_frequency
configuration_hash.fetch(:reaping_frequency, 60)&.to_f
end
def idle_timeout
timeout = configuration_hash.fetch(:idle_timeout, 300).to_f
timeout if timeout > 0
end
def adapter
configuration_hash[:adapter]
end
# The path to the schema cache dump file for a database.
# If omitted, the filename will be read from ENV or a
# default will be derived.
def schema_cache_path
configuration_hash[:schema_cache_path]
end
def default_schema_cache_path
"db/schema_cache.yml"
end
def lazy_schema_cache_path
schema_cache_path || default_schema_cache_path
end
def primary? # :nodoc:
Base.configurations.primary?(name)
end
# Determines whether to dump the schema/structure files and the
# filename that should be used.
#
# If +configuration_hash[:schema_dump]+ is set to +false+ or +nil+
# the schema will not be dumped.
#
# If the config option is set that will be used. Otherwise Rails
# will generate the filename from the database config name.
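#
# A hedged illustration, assuming the default +:ruby+ schema format and that
# "primary" names the application's primary config (both are assumptions):
#
#   primary.schema_dump # => "schema.rb"
#   animals.schema_dump # => "animals_schema.rb"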
def schema_dump(format = ActiveRecord.schema_format)
if configuration_hash.key?(:schema_dump)
if config = configuration_hash[:schema_dump]
config
end
elsif primary?
schema_file_type(format)
else
"#{name}_#{schema_file_type(format)}"
end
end
def database_tasks? # :nodoc:
!replica? && !!configuration_hash.fetch(:database_tasks, true)
end
private
def schema_file_type(format)
case format
when :ruby
"schema.rb"
when :sql
"structure.sql"
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Type
class HashLookupTypeMap # :nodoc:
def initialize(parent = nil)
@mapping = {}
@cache = Concurrent::Map.new do |h, key|
h.fetch_or_store(key, Concurrent::Map.new)
end
end
def lookup(lookup_key, *args)
fetch(lookup_key, *args) { Type.default_value }
end
def fetch(lookup_key, *args, &block)
@cache[lookup_key].fetch_or_store(args) do
perform_fetch(lookup_key, *args, &block)
end
end
def register_type(key, value = nil, &block)
raise ::ArgumentError unless value || block
if block
@mapping[key] = block
else
@mapping[key] = proc { value }
end
@cache.clear
end
def clear
@mapping.clear
@cache.clear
end
def alias_type(type, alias_type)
register_type(type) { |_, *args| lookup(alias_type, *args) }
end
def key?(key)
@mapping.key?(key)
end
def keys
@mapping.keys
end
private
def perform_fetch(type, *args, &block)
@mapping.fetch(type, block).call(type, *args)
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class HomogeneousIn < Node
attr_reader :attribute, :values, :type
def initialize(values, attribute, type)
@values = values
@attribute = attribute
@type = type
end
def hash
ivars.hash
end
def eql?(other)
super || (self.class == other.class && self.ivars == other.ivars)
end
alias :== :eql?
def equality?
type == :in
end
def invert
Arel::Nodes::HomogeneousIn.new(values, attribute, type == :in ? :notin : :in)
end
def left
attribute
end
def right
attribute.quoted_array(values)
end
def table_name
attribute.relation.table_alias || attribute.relation.name
end
def column_name
attribute.name
end
def casted_values
type = attribute.type_caster
casted_values = values.map do |raw_value|
type.serialize(raw_value) if type.serializable?(raw_value)
end
casted_values.compact!
casted_values
end
def proc_for_binds
-> value { ActiveModel::Attribute.with_cast_value(attribute.name, value, attribute.type_caster) }
end
def fetch_attribute(&block)
if attribute
yield attribute
else
expr.fetch_attribute(&block)
end
end
protected
def ivars
[@attribute, @values, @type]
end
end
end
end
# frozen_string_literal: true
require "strscan"
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Hstore < Type::Value # :nodoc:
ERROR = "Invalid Hstore document: %s"
include ActiveModel::Type::Helpers::Mutable
def type
:hstore
end
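# A minimal sketch of the text format this parser expects; the literal below is
# a hypothetical hstore value, not taken from a real database:
#
#   deserialize('"a"=>"1", "b"=>NULL') # => { "a" => "1", "b" => nil }
#   serialize("a" => "1", "b" => nil)  # => "\"a\"=>\"1\", \"b\"=>NULL"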
def deserialize(value)
return value unless value.is_a?(::String)
scanner = StringScanner.new(value)
hash = {}
until scanner.eos?
unless scanner.skip(/"/)
raise(ArgumentError, ERROR % scanner.string.inspect)
end
unless key = scanner.scan_until(/(?<!\\)(?=")/)
raise(ArgumentError, ERROR % scanner.string.inspect)
end
unless scanner.skip(/"=>?/)
raise(ArgumentError, ERROR % scanner.string.inspect)
end
if scanner.scan(/NULL/)
value = nil
else
unless scanner.skip(/"/)
raise(ArgumentError, ERROR % scanner.string.inspect)
end
unless value = scanner.scan_until(/(?<!\\)(?=")/)
raise(ArgumentError, ERROR % scanner.string.inspect)
end
unless scanner.skip(/"/)
raise(ArgumentError, ERROR % scanner.string.inspect)
end
end
key.gsub!('\"', '"')
key.gsub!("\\\\", "\\")
if value
value.gsub!('\"', '"')
value.gsub!("\\\\", "\\")
end
hash[key] = value
unless scanner.skip(/, /) || scanner.eos?
raise(ArgumentError, ERROR % scanner.string.inspect)
end
end
hash
end
def serialize(value)
if value.is_a?(::Hash)
value.map { |k, v| "#{escape_hstore(k)}=>#{escape_hstore(v)}" }.join(", ")
elsif value.respond_to?(:to_unsafe_h)
serialize(value.to_unsafe_h)
else
value
end
end
def accessor
ActiveRecord::Store::StringKeyedHashAccessor
end
# Will compare the Hash equivalents of +raw_old_value+ and +new_value+.
# By comparing hashes, this avoids an edge case where the order of
# the keys changes between the two hashes, and they would not be marked
# as equal.
def changed_in_place?(raw_old_value, new_value)
deserialize(raw_old_value) != new_value
end
private
def escape_hstore(value)
if value.nil?
"NULL"
else
if value == ""
'""'
else
'"%s"' % value.to_s.gsub(/(["\\])/, '\\\\\1')
end
end
end
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class In < Arel::Nodes::Binary
include FetchAttribute
def equality?; true; end
def invert
Arel::Nodes::NotIn.new(left, right)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Inet < Cidr # :nodoc:
def type
:inet
end
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class InfixOperation < Binary
include Arel::Expressions
include Arel::Predications
include Arel::OrderPredications
include Arel::AliasPredication
include Arel::Math
attr_reader :operator
def initialize(operator, left, right)
super(left, right)
@operator = operator
end
end
class Multiplication < InfixOperation
def initialize(left, right)
super(:*, left, right)
end
end
class Division < InfixOperation
def initialize(left, right)
super(:/, left, right)
end
end
class Addition < InfixOperation
def initialize(left, right)
super(:+, left, right)
end
end
class Subtraction < InfixOperation
def initialize(left, right)
super(:-, left, right)
end
end
class Concat < InfixOperation
def initialize(left, right)
super(:"||", left, right)
end
end
class Contains < InfixOperation
def initialize(left, right)
super(:"@>", left, right)
end
end
class Overlaps < InfixOperation
def initialize(left, right)
super(:"&&", left, right)
end
end
class BitwiseAnd < InfixOperation
def initialize(left, right)
super(:&, left, right)
end
end
class BitwiseOr < InfixOperation
def initialize(left, right)
super(:|, left, right)
end
end
class BitwiseXor < InfixOperation
def initialize(left, right)
super(:^, left, right)
end
end
class BitwiseShiftLeft < InfixOperation
def initialize(left, right)
super(:<<, left, right)
end
end
class BitwiseShiftRight < InfixOperation
def initialize(left, right)
super(:>>, left, right)
end
end
end
end
# frozen_string_literal: true
require "active_support/inflector"
require "active_support/core_ext/hash/indifferent_access"
module ActiveRecord
# == Single table inheritance
#
# Active Record allows inheritance by storing the name of the class in a column that by
# default is named "type" (can be changed by overwriting <tt>Base.inheritance_column</tt>).
# This means that an inheritance looking like this:
#
# class Company < ActiveRecord::Base; end
# class Firm < Company; end
# class Client < Company; end
# class PriorityClient < Client; end
#
# When you do <tt>Firm.create(name: "37signals")</tt>, this record will be saved in
# the companies table with type = "Firm". You can then fetch this row again using
# <tt>Company.where(name: '37signals').first</tt> and it will return a Firm object.
#
# Be aware that, because the type column is an attribute on the record, every new
# subclass will instantly be marked as dirty and the type column will be included
# in the list of changed attributes on the record. This is different from non
# Single Table Inheritance (STI) classes:
#
# Company.new.changed? # => false
# Firm.new.changed? # => true
# Firm.new.changes # => {"type"=>["","Firm"]}
#
# If you don't have a type column defined in your table, single-table inheritance won't
# be triggered. In that case, it'll work just like normal subclasses with no special magic
# for differentiating between them or reloading the right type with find.
#
# Note, all the attributes for all the cases are kept in the same table. Read more:
# https://www.martinfowler.com/eaaCatalog/singleTableInheritance.html
#
module Inheritance
extend ActiveSupport::Concern
included do
class_attribute :store_full_class_name, instance_writer: false, default: true
# Determines whether to store the full constant name including namespace when using STI.
# This is true, by default.
class_attribute :store_full_sti_class, instance_writer: false, default: true
set_base_class
end
module ClassMethods
# Determines if one of the attributes passed in is the inheritance column,
# and if the inheritance column is attr accessible, it initializes an
# instance of the given subclass instead of the base class.
def new(attributes = nil, &block)
if abstract_class? || self == Base
raise NotImplementedError, "#{self} is an abstract class and cannot be instantiated."
end
if _has_attribute?(inheritance_column)
subclass = subclass_from_attributes(attributes)
if subclass.nil? && scope_attributes = current_scope&.scope_for_create
subclass = subclass_from_attributes(scope_attributes)
end
if subclass.nil? && base_class?
subclass = subclass_from_attributes(column_defaults)
end
end
if subclass && subclass != self
subclass.new(attributes, &block)
else
super
end
end
# Returns +true+ if this does not need STI type condition. Returns
# +false+ if STI type condition needs to be applied.
def descends_from_active_record?
if self == Base
false
elsif superclass.abstract_class?
superclass.descends_from_active_record?
else
superclass == Base || !columns_hash.include?(inheritance_column)
end
end
def finder_needs_type_condition? # :nodoc:
# The :true / :false symbols are used so that ||= can also memoize a negative result; benchmarking justifies this strange-looking approach
:true == (@finder_needs_type_condition ||= descends_from_active_record? ? :false : :true)
end
# Returns the class descending directly from ActiveRecord::Base, or
# an abstract class, if any, in the inheritance hierarchy.
#
# If A extends ActiveRecord::Base, A.base_class will return A. If B descends from A
# through some arbitrarily deep hierarchy, B.base_class will return A.
#
# If B < A and C < B and if A is an abstract_class then both B.base_class
# and C.base_class would return B as the answer since A is an abstract_class.
attr_reader :base_class
# Returns whether the class is a base class.
# See #base_class for more information.
def base_class?
base_class == self
end
# Set this to +true+ if this is an abstract class (see
# <tt>abstract_class?</tt>).
# If you are using inheritance with Active Record and don't want a class
# to be considered as part of the STI hierarchy, you must set this to
# true.
# +ApplicationRecord+, for example, is generated as an abstract class.
#
# Consider the following default behaviour:
#
# Shape = Class.new(ActiveRecord::Base)
# Polygon = Class.new(Shape)
# Square = Class.new(Polygon)
#
# Shape.table_name # => "shapes"
# Polygon.table_name # => "shapes"
# Square.table_name # => "shapes"
# Shape.create! # => #<Shape id: 1, type: nil>
# Polygon.create! # => #<Polygon id: 2, type: "Polygon">
# Square.create! # => #<Square id: 3, type: "Square">
#
# However, when using <tt>abstract_class</tt>, +Shape+ is omitted from
# the hierarchy:
#
# class Shape < ActiveRecord::Base
# self.abstract_class = true
# end
# Polygon = Class.new(Shape)
# Square = Class.new(Polygon)
#
# Shape.table_name # => nil
# Polygon.table_name # => "polygons"
# Square.table_name # => "polygons"
# Shape.create! # => NotImplementedError: Shape is an abstract class and cannot be instantiated.
# Polygon.create! # => #<Polygon id: 1, type: nil>
# Square.create! # => #<Square id: 2, type: "Square">
#
# Note that in the above example, to disallow the creation of a plain
# +Polygon+, you should use <tt>validates :type, presence: true</tt>,
# instead of setting it as an abstract class. This way, +Polygon+ will
# stay in the hierarchy, and Active Record will continue to correctly
# derive the table name.
attr_accessor :abstract_class
# Returns whether this class is an abstract class or not.
def abstract_class?
defined?(@abstract_class) && @abstract_class == true
end
# Sets the application record class for Active Record
#
# This is useful if your application uses a different class than
# ApplicationRecord for your primary abstract class. This class
# will share a database connection with Active Record. It is the class
# that connects to your primary database.
def primary_abstract_class
if ActiveRecord.application_record_class && ActiveRecord.application_record_class.name != name
raise ArgumentError, "The `primary_abstract_class` is already set to #{ActiveRecord.application_record_class.inspect}. There can only be one `primary_abstract_class` in an application."
end
self.abstract_class = true
ActiveRecord.application_record_class = self
end
# Returns the value to be stored in the inheritance column for STI.
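#
# A hypothetical illustration (the class names are assumptions):
#
#   class Animals::Dog < Animals::Animal; end
#
#   Animals::Dog.sti_name # => "Animals::Dog" with the default settings
#   Animals::Dog.sti_name # => "Dog" when store_full_sti_class is false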
def sti_name
store_full_sti_class && store_full_class_name ? name : name.demodulize
end
# Returns the class for the provided +type_name+.
#
# It is used to find the class correspondent to the value stored in the inheritance column.
def sti_class_for(type_name)
if store_full_sti_class && store_full_class_name
type_name.constantize
else
compute_type(type_name)
end
rescue NameError
raise SubclassNotFound,
"The single-table inheritance mechanism failed to locate the subclass: '#{type_name}'. " \
"This error is raised because the column '#{inheritance_column}' is reserved for storing the class in case of inheritance. " \
"Please rename this column if you didn't intend it to be used for storing the inheritance class " \
"or overwrite #{name}.inheritance_column to use another column for that information."
end
# Returns the value to be stored in the polymorphic type column for Polymorphic Associations.
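#
# For illustration, reusing the Company/Firm hierarchy from the module
# documentation above (an assumption about the app's models):
#
#   Firm.polymorphic_name # => "Company"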
def polymorphic_name
store_full_class_name ? base_class.name : base_class.name.demodulize
end
# Returns the class for the provided +name+.
#
# It is used to find the class correspondent to the value stored in the polymorphic type column.
def polymorphic_class_for(name)
if store_full_class_name
name.constantize
else
compute_type(name)
end
end
def inherited(subclass)
subclass.set_base_class
subclass.instance_variable_set(:@_type_candidates_cache, Concurrent::Map.new)
super
end
def dup # :nodoc:
# `initialize_dup` / `initialize_copy` don't work when defined
# in the `singleton_class`.
other = super
other.set_base_class
other
end
def initialize_clone(other) # :nodoc:
super
set_base_class
end
protected
# Returns the class type of the record using the current module as a prefix. So descendants of
# MyApp::Business::Account would appear as MyApp::Business::AccountSubclass.
def compute_type(type_name)
if type_name.start_with?("::")
# If the type is prefixed with a scope operator then we assume that
# the type_name is an absolute reference.
type_name.constantize
else
type_candidate = @_type_candidates_cache[type_name]
if type_candidate && type_constant = type_candidate.safe_constantize
return type_constant
end
# Build a list of candidates to search for
candidates = []
name.scan(/::|$/) { candidates.unshift "#{$`}::#{type_name}" }
candidates << type_name
candidates.each do |candidate|
constant = candidate.safe_constantize
if candidate == constant.to_s
@_type_candidates_cache[type_name] = candidate
return constant
end
end
raise NameError.new("uninitialized constant #{candidates.first}", candidates.first)
end
end
def set_base_class # :nodoc:
@base_class = if self == Base
self
else
unless self < Base
raise ActiveRecordError, "#{name} doesn't belong in a hierarchy descending from ActiveRecord"
end
if superclass == Base || superclass.abstract_class?
self
else
superclass.base_class
end
end
end
private
# Called by +instantiate+ to decide which class to use for a new
# record instance. For single-table inheritance, we check the record
# for a +type+ column and return the corresponding class.
def discriminate_class_for_record(record)
if using_single_table_inheritance?(record)
find_sti_class(record[inheritance_column])
else
super
end
end
def using_single_table_inheritance?(record)
record[inheritance_column].present? && _has_attribute?(inheritance_column)
end
def find_sti_class(type_name)
type_name = base_class.type_for_attribute(inheritance_column).cast(type_name)
subclass = sti_class_for(type_name)
unless subclass == self || descendants.include?(subclass)
raise SubclassNotFound, "Invalid single-table inheritance type: #{subclass.name} is not a subclass of #{name}"
end
subclass
end
def type_condition(table = arel_table)
sti_column = table[inheritance_column]
sti_names = ([self] + descendants).map(&:sti_name)
predicate_builder.build(sti_column, sti_names)
end
# Detect the subclass from the inheritance column of attrs. If the inheritance column value
# is not self or a valid subclass, raises ActiveRecord::SubclassNotFound
def subclass_from_attributes(attrs)
attrs = attrs.to_h if attrs.respond_to?(:permitted?)
if attrs.is_a?(Hash)
subclass_name = attrs[inheritance_column] || attrs[inheritance_column.to_sym]
if subclass_name.present?
find_sti_class(subclass_name)
end
end
end
end
def initialize_dup(other)
super
ensure_proper_type
end
private
def initialize_internals_callback
super
ensure_proper_type
end
# Sets the attribute used for single table inheritance to this class name if this is not the
# ActiveRecord::Base descendant.
# Considering the hierarchy Reply < Message < ActiveRecord::Base, this makes it possible to
# do Reply.new without having to set <tt>Reply[Reply.inheritance_column] = "Reply"</tt> yourself.
# No such attribute would be set for objects of the Message class in that example.
def ensure_proper_type
klass = self.class
if klass.finder_needs_type_condition?
_write_attribute(klass.inheritance_column, klass.sti_name)
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class InnerJoin < Arel::Nodes::Join
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/enumerable"
module ActiveRecord
class InsertAll # :nodoc:
attr_reader :model, :connection, :inserts, :keys
attr_reader :on_duplicate, :update_only, :returning, :unique_by, :update_sql
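# A hedged sketch of how this class is typically constructed by the model-level
# API (the Book model and attributes below are assumptions):
#
#   Book.insert_all([{ id: 1, title: "Rework" }])
#   # ~> InsertAll.new(Book, [{ id: 1, title: "Rework" }], on_duplicate: :skip).execute
#
#   Book.upsert_all([{ id: 1, title: "Rework" }])
#   # ~> InsertAll.new(Book, [{ id: 1, title: "Rework" }], on_duplicate: :update).execute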
def initialize(model, inserts, on_duplicate:, update_only: nil, returning: nil, unique_by: nil, record_timestamps: nil)
raise ArgumentError, "Empty list of attributes passed" if inserts.blank?
@model, @connection, @inserts, @keys = model, model.connection, inserts, inserts.first.keys.map(&:to_s)
@on_duplicate, @update_only, @returning, @unique_by = on_duplicate, update_only, returning, unique_by
@record_timestamps = record_timestamps.nil? ? model.record_timestamps : record_timestamps
disallow_raw_sql!(on_duplicate)
disallow_raw_sql!(returning)
configure_on_duplicate_update_logic
if model.scope_attributes?
@scope_attributes = model.scope_attributes
@keys |= @scope_attributes.keys
end
@keys = @keys.to_set
@returning = (connection.supports_insert_returning? ? primary_keys : false) if @returning.nil?
@returning = false if @returning == []
@unique_by = find_unique_index_for(unique_by)
@on_duplicate = :skip if @on_duplicate == :update && updatable_columns.empty?
ensure_valid_options_for_connection!
end
def execute
message = +"#{model} "
message << "Bulk " if inserts.many?
message << (on_duplicate == :update ? "Upsert" : "Insert")
connection.exec_insert_all to_sql, message
end
def updatable_columns
@updatable_columns ||= keys - readonly_columns - unique_by_columns
end
def primary_keys
Array(connection.schema_cache.primary_keys(model.table_name))
end
def skip_duplicates?
on_duplicate == :skip
end
def update_duplicates?
on_duplicate == :update
end
def map_key_with_value
inserts.map do |attributes|
attributes = attributes.stringify_keys
attributes.merge!(scope_attributes) if scope_attributes
attributes.reverse_merge!(timestamps_for_create) if record_timestamps?
verify_attributes(attributes)
keys_including_timestamps.map do |key|
yield key, attributes[key]
end
end
end
def record_timestamps?
@record_timestamps
end
# TODO: Consider renaming this method, as it only conditionally extends keys, not always
def keys_including_timestamps
@keys_including_timestamps ||= if record_timestamps?
keys + model.all_timestamp_attributes_in_model
else
keys
end
end
private
attr_reader :scope_attributes
def configure_on_duplicate_update_logic
if custom_update_sql_provided? && update_only.present?
raise ArgumentError, "You can't set :update_only and provide custom update SQL via :on_duplicate at the same time"
end
if update_only.present?
@updatable_columns = Array(update_only)
@on_duplicate = :update
elsif custom_update_sql_provided?
@update_sql = on_duplicate
@on_duplicate = :update
end
end
def custom_update_sql_provided?
@custom_update_sql_provided ||= Arel.arel_node?(on_duplicate)
end
def find_unique_index_for(unique_by)
if !connection.supports_insert_conflict_target?
return if unique_by.nil?
raise ArgumentError, "#{connection.class} does not support :unique_by"
end
name_or_columns = unique_by || model.primary_key
match = Array(name_or_columns).map(&:to_s)
if index = unique_indexes.find { |i| match.include?(i.name) || i.columns == match }
index
elsif match == primary_keys
unique_by.nil? ? nil : ActiveRecord::ConnectionAdapters::IndexDefinition.new(model.table_name, "#{model.table_name}_primary_key", true, match)
else
raise ArgumentError, "No unique index found for #{name_or_columns}"
end
end
def unique_indexes
connection.schema_cache.indexes(model.table_name).select(&:unique)
end
def ensure_valid_options_for_connection!
if returning && !connection.supports_insert_returning?
raise ArgumentError, "#{connection.class} does not support :returning"
end
if skip_duplicates? && !connection.supports_insert_on_duplicate_skip?
raise ArgumentError, "#{connection.class} does not support skipping duplicates"
end
if update_duplicates? && !connection.supports_insert_on_duplicate_update?
raise ArgumentError, "#{connection.class} does not support upsert"
end
if unique_by && !connection.supports_insert_conflict_target?
raise ArgumentError, "#{connection.class} does not support :unique_by"
end
end
def to_sql
connection.build_insert_sql(ActiveRecord::InsertAll::Builder.new(self))
end
def readonly_columns
primary_keys + model.readonly_attributes.to_a
end
def unique_by_columns
Array(unique_by&.columns)
end
def verify_attributes(attributes)
if keys_including_timestamps != attributes.keys.to_set
raise ArgumentError, "All objects being inserted must have the same keys"
end
end
def disallow_raw_sql!(value)
return if !value.is_a?(String) || Arel.arel_node?(value)
raise ArgumentError, "Dangerous query method (method whose arguments are used as raw " \
"SQL) called: #{value}. " \
"Known-safe values can be passed " \
"by wrapping them in Arel.sql()."
end
def timestamps_for_create
model.all_timestamp_attributes_in_model.index_with(connection.high_precision_current_timestamp)
end
class Builder # :nodoc:
attr_reader :model
delegate :skip_duplicates?, :update_duplicates?, :keys, :keys_including_timestamps, :record_timestamps?, to: :insert_all
def initialize(insert_all)
@insert_all, @model, @connection = insert_all, insert_all.model, insert_all.connection
end
def into
"INTO #{model.quoted_table_name} (#{columns_list})"
end
def values_list
types = extract_types_from_columns_on(model.table_name, keys: keys_including_timestamps)
values_list = insert_all.map_key_with_value do |key, value|
next value if Arel::Nodes::SqlLiteral === value
connection.with_yaml_fallback(types[key].serialize(value))
end
connection.visitor.compile(Arel::Nodes::ValuesList.new(values_list))
end
def returning
return unless insert_all.returning
if insert_all.returning.is_a?(String)
insert_all.returning
else
format_columns(insert_all.returning)
end
end
def conflict_target
if index = insert_all.unique_by
sql = +"(#{format_columns(index.columns)})"
sql << " WHERE #{index.where}" if index.where
sql
elsif update_duplicates?
"(#{format_columns(insert_all.primary_keys)})"
end
end
def updatable_columns
quote_columns(insert_all.updatable_columns)
end
def touch_model_timestamps_unless(&block)
return "" unless update_duplicates? && record_timestamps?
model.timestamp_attributes_for_update_in_model.filter_map do |column_name|
if touch_timestamp_attribute?(column_name)
"#{column_name}=(CASE WHEN (#{updatable_columns.map(&block).join(" AND ")}) THEN #{model.quoted_table_name}.#{column_name} ELSE #{connection.high_precision_current_timestamp} END),"
end
end.join
end
def raw_update_sql
insert_all.update_sql
end
alias raw_update_sql? raw_update_sql
private
attr_reader :connection, :insert_all
def touch_timestamp_attribute?(column_name)
insert_all.updatable_columns.exclude?(column_name)
end
def columns_list
format_columns(insert_all.keys_including_timestamps)
end
def extract_types_from_columns_on(table_name, keys:)
columns = connection.schema_cache.columns_hash(table_name)
unknown_column = (keys - columns.keys).first
raise UnknownAttributeError.new(model.new, unknown_column) if unknown_column
keys.index_with { |key| model.type_for_attribute(key) }
end
def format_columns(columns)
columns.respond_to?(:map) ? quote_columns(columns).join(",") : columns
end
def quote_columns(columns)
columns.map(&connection.method(:quote_column_name))
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
class InsertManager < Arel::TreeManager
def initialize(table = nil)
@ast = Nodes::InsertStatement.new(table)
end
def into(table)
@ast.relation = table
self
end
def columns; @ast.columns end
def values=(val); @ast.values = val; end
def select(select)
@ast.select = select
end
def insert(fields)
return if fields.empty?
if String === fields
@ast.values = Nodes::SqlLiteral.new(fields)
else
@ast.relation ||= fields.first.first.relation
values = []
fields.each do |column, value|
@ast.columns << column
values << value
end
@ast.values = create_values(values)
end
self
end
def create_values(values)
Nodes::ValuesList.new([values])
end
def create_values_list(rows)
Nodes::ValuesList.new(rows)
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class InsertStatement < Arel::Nodes::Node
attr_accessor :relation, :columns, :values, :select
def initialize(relation = nil)
super()
@relation = relation
@columns = []
@values = nil
@select = nil
end
def initialize_copy(other)
super
@columns = @columns.clone
@values = @values.clone if @values
@select = @select.clone if @select
end
def hash
[@relation, @columns, @values, @select].hash
end
def eql?(other)
self.class == other.class &&
self.relation == other.relation &&
self.columns == other.columns &&
self.select == other.select &&
self.values == other.values
end
alias :== :eql?
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/string/filters"
module ActiveRecord
module Integration
extend ActiveSupport::Concern
included do
##
# :singleton-method:
# Indicates the format used to generate the timestamp in the cache key, if
# versioning is off. Accepts any of the symbols in <tt>Time::DATE_FORMATS</tt>.
#
# This is +:usec+, by default.
class_attribute :cache_timestamp_format, instance_writer: false, default: :usec
##
# :singleton-method:
# Indicates whether to use a stable #cache_key method that is accompanied
# by a changing version in the #cache_version method.
#
# This is +true+, by default on Rails 5.2 and above.
class_attribute :cache_versioning, instance_writer: false, default: false
##
# :singleton-method:
# Indicates whether to use a stable #cache_key method that is accompanied
# by a changing version in the #cache_version method on collections.
#
# This is +false+, by default until Rails 6.1.
class_attribute :collection_cache_versioning, instance_writer: false, default: false
end
# Returns a +String+, which Action Pack uses for constructing a URL to this
# object. The default implementation returns this record's id as a +String+,
# or +nil+ if this record is unsaved.
#
# For example, suppose that you have a User model, and that you have a
# <tt>resources :users</tt> route. Normally, +user_path+ will
# construct a path with the user object's 'id' in it:
#
# user = User.find_by(name: 'Phusion')
# user_path(user) # => "/users/1"
#
# You can override +to_param+ in your model to make +user_path+ construct
# a path using the user's name instead of the user's id:
#
# class User < ActiveRecord::Base
# def to_param # overridden
# name
# end
# end
#
# user = User.find_by(name: 'Phusion')
# user_path(user) # => "/users/Phusion"
def to_param
# We can't use alias_method here, because method 'id' optimizes itself on the fly.
id && id.to_s # Be sure to stringify the id for routes
end
# Returns a stable cache key that can be used to identify this record.
#
# Product.new.cache_key # => "products/new"
# Product.find(5).cache_key # => "products/5"
#
# If ActiveRecord::Base.cache_versioning is turned off, as it was in Rails 5.1 and earlier,
# the cache key will also include a version.
#
# Product.cache_versioning = false
# Product.find(5).cache_key # => "products/5-20071224150000" (updated_at available)
def cache_key
if new_record?
"#{model_name.cache_key}/new"
else
if cache_version
"#{model_name.cache_key}/#{id}"
else
timestamp = max_updated_column_timestamp
if timestamp
timestamp = timestamp.utc.to_fs(cache_timestamp_format)
"#{model_name.cache_key}/#{id}-#{timestamp}"
else
"#{model_name.cache_key}/#{id}"
end
end
end
end
# Returns a cache version that can be used together with the cache key to form
# a recyclable caching scheme. By default, the #updated_at column is used for the
# cache_version, but this method can be overwritten to return something else.
#
# Note, this method will return nil if ActiveRecord::Base.cache_versioning is set to
# +false+.
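#
# A hedged example (the model and timestamp are hypothetical):
#
#   Product.cache_versioning = true
#   Product.find(5).cache_version # => "20181015200215266505"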
def cache_version
return unless cache_versioning
if has_attribute?("updated_at")
timestamp = updated_at_before_type_cast
if can_use_fast_cache_version?(timestamp)
raw_timestamp_to_cache_version(timestamp)
elsif timestamp = updated_at
timestamp.utc.to_fs(cache_timestamp_format)
end
elsif self.class.has_attribute?("updated_at")
raise ActiveModel::MissingAttributeError, "missing attribute: updated_at"
end
end
# Returns a cache key along with the version.
def cache_key_with_version
if version = cache_version
"#{cache_key}-#{version}"
else
cache_key
end
end
module ClassMethods
# Defines your model's +to_param+ method to generate "pretty" URLs
# using +method_name+, which can be any attribute or method that
# responds to +to_s+.
#
# class User < ActiveRecord::Base
# to_param :name
# end
#
# user = User.find_by(name: 'Fancy Pants')
# user.id # => 123
# user_path(user) # => "/users/123-fancy-pants"
#
# Values longer than 20 characters will be truncated. The value
# is truncated word by word.
#
# user = User.find_by(name: 'David Heinemeier Hansson')
# user.id # => 125
# user_path(user) # => "/users/125-david-heinemeier"
#
# Because the generated param begins with the record's +id+, it is
# suitable for passing to +find+. In a controller, for example:
#
# params[:id] # => "123-fancy-pants"
# User.find(params[:id]).id # => 123
def to_param(method_name = nil)
if method_name.nil?
super()
else
define_method :to_param do
if (default = super()) &&
(result = send(method_name).to_s).present? &&
(param = result.squish.parameterize.truncate(20, separator: /-/, omission: "")).present?
"#{default}-#{param}"
else
default
end
end
end
end
def collection_cache_key(collection = all, timestamp_column = :updated_at) # :nodoc:
collection.send(:compute_cache_key, timestamp_column)
end
end
private
# Detects if the value before type cast
# can be used to generate a cache_version.
#
# The fast cache version only works with a
# string value directly from the database.
#
# We also must check if the timestamp format has been changed
# or if the timezone is not set to UTC then
# we cannot apply our transformations correctly.
def can_use_fast_cache_version?(timestamp)
timestamp.is_a?(String) &&
cache_timestamp_format == :usec &&
self.class.connection.default_timezone == :utc &&
!updated_at_came_from_user?
end
# Converts a raw database string to `:usec`
# format.
#
# Example:
#
# timestamp = "2018-10-15 20:02:15.266505"
# raw_timestamp_to_cache_version(timestamp)
# # => "20181015200215266505"
#
# PostgreSQL truncates trailing zeros, see
# https://github.com/postgres/postgres/commit/3e1beda2cde3495f41290e1ece5d544525810214
# To account for this we pad the output with zeros.
def raw_timestamp_to_cache_version(timestamp)
key = timestamp.delete("- :.")
if key.length < 20
key.ljust(20, "0")
else
key
end
end
end
end
# frozen_string_literal: true
require "active_record/scoping/default"
require "active_record/scoping/named"
module ActiveRecord
# This class is used to create a table that keeps track of values and keys such
# as which environment migrations were run in.
#
# This is enabled by default. To disable this functionality set
# `use_metadata_table` to false in your database configuration.
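#
# A minimal usage sketch (the key/value pair is an assumption; Rails itself
# stores keys such as "environment" here):
#
#   ActiveRecord::InternalMetadata[:environment] = "development"
#   ActiveRecord::InternalMetadata[:environment] # => "development"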
class InternalMetadata < ActiveRecord::Base # :nodoc:
self.record_timestamps = true
class << self
def enabled?
ActiveRecord::Base.connection.use_metadata_table?
end
def primary_key
"key"
end
def table_name
"#{table_name_prefix}#{internal_metadata_table_name}#{table_name_suffix}"
end
def []=(key, value)
return unless enabled?
find_or_initialize_by(key: key).update!(value: value)
end
def [](key)
return unless enabled?
where(key: key).pick(:value)
end
# Creates an internal metadata table with columns +key+ and +value+
def create_table
return unless enabled?
unless connection.table_exists?(table_name)
connection.create_table(table_name, id: false) do |t|
t.string :key, **connection.internal_string_options_for_primary_key
t.string :value
t.timestamps
end
end
end
def drop_table
return unless enabled?
connection.drop_table table_name, if_exists: true
end
end
end
end
# frozen_string_literal: true
require "active_support/duration"
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Interval < Type::Value # :nodoc:
def type
:interval
end
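# A hedged sketch of the casting behaviour (the values are illustrative):
#
#   cast_value("P1DT2H")           # => an ActiveSupport::Duration of 1 day, 2 hours
#   serialize(1.hour + 30.minutes) # => "PT1H30M"
#   cast_value("not an interval")  # => nil (parse errors are swallowed)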
def cast_value(value)
case value
when ::ActiveSupport::Duration
value
when ::String
begin
::ActiveSupport::Duration.parse(value)
rescue ::ActiveSupport::Duration::ISO8601Parser::ParsingError
nil
end
else
super
end
end
def serialize(value)
case value
when ::ActiveSupport::Duration
value.iso8601(precision: self.precision)
when ::Numeric
# Sometimes operations on Times return just a float number of seconds, so we need to handle that.
# Example: Time.current - (Time.current + 1.hour) # => -3600.000001776 (Float)
value.seconds.iso8601(precision: self.precision)
else
super
end
end
def type_cast_for_schema(value)
serialize(value).inspect
end
end
end
end
end
end
# frozen_string_literal: true
require "active_record/associations/join_dependency/join_part"
require "active_support/core_ext/array/extract"
module ActiveRecord
module Associations
class JoinDependency # :nodoc:
class JoinAssociation < JoinPart # :nodoc:
attr_reader :reflection, :tables
attr_accessor :table
def initialize(reflection, children)
super(reflection.klass, children)
@reflection = reflection
end
def match?(other)
return true if self == other
super && reflection == other.reflection
end
def join_constraints(foreign_table, foreign_klass, join_type, alias_tracker)
joins = []
chain = []
reflection.chain.each do |reflection|
table, terminated = yield reflection
@table ||= table
if terminated
foreign_table, foreign_klass = table, reflection.klass
break
end
chain << [reflection, table]
end
# The chain starts with the target table, but we want to end with it here (makes
# more sense in this context), so we reverse
chain.reverse_each do |reflection, table|
klass = reflection.klass
scope = reflection.join_scope(table, foreign_table, foreign_klass)
unless scope.references_values.empty?
associations = scope.eager_load_values | scope.includes_values
unless associations.empty?
scope.joins! scope.construct_join_dependency(associations, Arel::Nodes::OuterJoin)
end
end
arel = scope.arel(alias_tracker.aliases)
nodes = arel.constraints.first
if nodes.is_a?(Arel::Nodes::And)
others = nodes.children.extract! do |node|
!Arel.fetch_attribute(node) { |attr| attr.relation.name == table.name }
end
end
joins << join_type.new(table, Arel::Nodes::On.new(nodes))
if others && !others.empty?
joins.concat arel.join_sources
append_constraints(joins.last, others)
end
# The current table in this iteration becomes the foreign table in the next
foreign_table, foreign_klass = table, klass
end
joins
end
def readonly?
return @readonly if defined?(@readonly)
@readonly = reflection.scope && reflection.scope_for(base_klass.unscoped).readonly_value
end
def strict_loading?
return @strict_loading if defined?(@strict_loading)
@strict_loading = reflection.scope && reflection.scope_for(base_klass.unscoped).strict_loading_value
end
private
def append_constraints(join, constraints)
if join.is_a?(Arel::Nodes::StringJoin)
join_string = Arel::Nodes::And.new(constraints.unshift join.left)
join.left = Arel.sql(base_klass.connection.visitor.compile(join_string))
else
right = join.right
right.expr = Arel::Nodes::And.new(constraints.unshift right.expr)
end
end
end
end
end
end
# frozen_string_literal: true
require "active_record/associations/join_dependency/join_part"
module ActiveRecord
module Associations
class JoinDependency # :nodoc:
class JoinBase < JoinPart # :nodoc:
attr_reader :table
def initialize(base_klass, table, children)
super(base_klass, children)
@table = table
end
def match?(other)
return true if self == other
super && base_klass == other.base_klass
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
class JoinDependency # :nodoc:
extend ActiveSupport::Autoload
eager_autoload do
autoload :JoinBase
autoload :JoinAssociation
end
class Aliases # :nodoc:
def initialize(tables)
@tables = tables
@alias_cache = tables.each_with_object({}) { |table, h|
h[table.node] = table.columns.each_with_object({}) { |column, i|
i[column.name] = column.alias
}
}
@columns_cache = tables.each_with_object({}) { |table, h|
h[table.node] = table.columns
}
end
def columns
@tables.flat_map(&:column_aliases)
end
def column_aliases(node)
@columns_cache[node]
end
def column_alias(node, column)
@alias_cache[node][column]
end
Table = Struct.new(:node, :columns) do # :nodoc:
def column_aliases
t = node.table
columns.map { |column| t[column.name].as(column.alias) }
end
end
Column = Struct.new(:name, :alias)
end
def self.make_tree(associations)
hash = {}
walk_tree associations, hash
hash
end
def self.walk_tree(associations, hash)
case associations
when Symbol, String
hash[associations.to_sym] ||= {}
when Array
associations.each do |assoc|
walk_tree assoc, hash
end
when Hash
associations.each do |k, v|
cache = hash[k] ||= {}
walk_tree v, cache
end
else
raise ConfigurationError, associations.inspect
end
end
def initialize(base, table, associations, join_type)
tree = self.class.make_tree associations
@join_root = JoinBase.new(base, table, build(tree, base))
@join_type = join_type
end
def base_klass
join_root.base_klass
end
def reflections
join_root.drop(1).map!(&:reflection)
end
def join_constraints(joins_to_add, alias_tracker, references)
@alias_tracker = alias_tracker
@joined_tables = {}
@references = {}
references.each do |table_name|
@references[table_name.to_sym] = table_name if table_name.is_a?(Arel::Nodes::SqlLiteral)
end unless references.empty?
joins = make_join_constraints(join_root, join_type)
joins.concat joins_to_add.flat_map { |oj|
if join_root.match? oj.join_root
walk(join_root, oj.join_root, oj.join_type)
else
make_join_constraints(oj.join_root, oj.join_type)
end
}
end
def instantiate(result_set, strict_loading_value, &block)
primary_key = aliases.column_alias(join_root, join_root.primary_key)
seen = Hash.new { |i, parent|
i[parent] = Hash.new { |j, child_class|
j[child_class] = {}
}
}.compare_by_identity
model_cache = Hash.new { |h, klass| h[klass] = {} }
parents = model_cache[join_root]
column_aliases = aliases.column_aliases(join_root)
column_names = []
result_set.columns.each do |name|
column_names << name unless /\At\d+_r\d+\z/.match?(name)
end
if column_names.empty?
column_types = {}
else
column_types = result_set.column_types
unless column_types.empty?
attribute_types = join_root.attribute_types
column_types = column_types.slice(*column_names).delete_if { |k, _| attribute_types.key?(k) }
end
column_aliases += column_names.map! { |name| Aliases::Column.new(name, name) }
end
message_bus = ActiveSupport::Notifications.instrumenter
payload = {
record_count: result_set.length,
class_name: join_root.base_klass.name
}
message_bus.instrument("instantiation.active_record", payload) do
result_set.each { |row_hash|
parent_key = primary_key ? row_hash[primary_key] : row_hash
parent = parents[parent_key] ||= join_root.instantiate(row_hash, column_aliases, column_types, &block)
construct(parent, join_root, row_hash, seen, model_cache, strict_loading_value)
}
end
parents.values
end
def apply_column_aliases(relation)
@join_root_alias = relation.select_values.empty?
relation._select!(-> { aliases.columns })
end
def each(&block)
join_root.each(&block)
end
protected
attr_reader :join_root, :join_type
private
attr_reader :alias_tracker, :join_root_alias
def aliases
@aliases ||= Aliases.new join_root.each_with_index.map { |join_part, i|
column_names = if join_part == join_root && !join_root_alias
primary_key = join_root.primary_key
primary_key ? [primary_key] : []
else
join_part.column_names
end
columns = column_names.each_with_index.map { |column_name, j|
Aliases::Column.new column_name, "t#{i}_r#{j}"
}
Aliases::Table.new(join_part, columns)
}
end
def make_join_constraints(join_root, join_type)
join_root.children.flat_map do |child|
make_constraints(join_root, child, join_type)
end
end
def make_constraints(parent, child, join_type)
foreign_table = parent.table
foreign_klass = parent.base_klass
child.join_constraints(foreign_table, foreign_klass, join_type, alias_tracker) do |reflection|
table, terminated = @joined_tables[reflection]
root = reflection == child.reflection
if table && (!root || !terminated)
@joined_tables[reflection] = [table, root] if root
next table, true
end
table_name = @references[reflection.name.to_sym]&.to_s
table = alias_tracker.aliased_table_for(reflection.klass.arel_table, table_name) do
name = reflection.alias_candidate(parent.table_name)
root ? name : "#{name}_join"
end
@joined_tables[reflection] ||= [table, root] if join_type == Arel::Nodes::OuterJoin
table
end.concat child.children.flat_map { |c| make_constraints(child, c, join_type) }
end
def walk(left, right, join_type)
intersection, missing = right.children.map { |node1|
[left.children.find { |node2| node1.match? node2 }, node1]
}.partition(&:first)
joins = intersection.flat_map { |l, r| r.table = l.table; walk(l, r, join_type) }
joins.concat missing.flat_map { |_, n| make_constraints(left, n, join_type) }
end
def find_reflection(klass, name)
klass._reflect_on_association(name) ||
raise(ConfigurationError, "Can't join '#{klass.name}' to association named '#{name}'; perhaps you misspelled it?")
end
def build(associations, base_klass)
associations.map do |name, right|
reflection = find_reflection base_klass, name
reflection.check_validity!
reflection.check_eager_loadable!
if reflection.polymorphic?
raise EagerLoadPolymorphicError.new(reflection)
end
JoinAssociation.new(reflection, build(right, reflection.klass))
end
end
def construct(ar_parent, parent, row, seen, model_cache, strict_loading_value)
return if ar_parent.nil?
parent.children.each do |node|
if node.reflection.collection?
other = ar_parent.association(node.reflection.name)
other.loaded!
elsif ar_parent.association_cached?(node.reflection.name)
model = ar_parent.association(node.reflection.name).target
construct(model, node, row, seen, model_cache, strict_loading_value)
next
end
key = aliases.column_alias(node, node.primary_key)
id = row[key]
if id.nil?
nil_association = ar_parent.association(node.reflection.name)
nil_association.loaded!
next
end
model = seen[ar_parent][node][id]
if model
construct(model, node, row, seen, model_cache, strict_loading_value)
else
model = construct_model(ar_parent, node, row, model_cache, id, strict_loading_value)
seen[ar_parent][node][id] = model
construct(model, node, row, seen, model_cache, strict_loading_value)
end
end
end
def construct_model(record, node, row, model_cache, id, strict_loading_value)
other = record.association(node.reflection.name)
model = model_cache[node][id] ||=
node.instantiate(row, aliases.column_aliases(node)) do |m|
m.strict_loading! if strict_loading_value
other.set_inverse_instance(m)
end
if node.reflection.collection?
other.target.push(model)
else
other.target = model
end
model.readonly! if node.readonly?
model.strict_loading! if node.strict_loading?
model
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Associations
class JoinDependency # :nodoc:
# A JoinPart represents a part of a JoinDependency. It is inherited
# by JoinBase and JoinAssociation. A JoinBase represents the Active Record which
# everything else is being joined onto. A JoinAssociation represents an association which
# is joining to the base. A JoinAssociation may result in more than one actual join
# operations (for example a has_and_belongs_to_many JoinAssociation would result in
# two; one for the join table and one for the target table).
class JoinPart # :nodoc:
include Enumerable
# The Active Record class which this join part is associated 'about'; for a JoinBase
# this is the actual base model, for a JoinAssociation this is the target model of the
# association.
attr_reader :base_klass, :children
delegate :table_name, :column_names, :primary_key, :attribute_types, to: :base_klass
def initialize(base_klass, children)
@base_klass = base_klass
@children = children
end
def match?(other)
self.class == other.class
end
def each(&block)
yield self
children.each { |child| child.each(&block) }
end
def each_children(&block)
children.each do |child|
yield self, child
child.each_children(&block)
end
end
# An Arel::Table for the active_record
def table
raise NotImplementedError
end
def extract_record(row, column_names_with_alias)
# This code is performance critical as it is called per row.
# see: https://github.com/rails/rails/pull/12185
hash = {}
index = 0
length = column_names_with_alias.length
while index < length
column = column_names_with_alias[index]
hash[column.name] = row[column.alias]
index += 1
end
hash
end
def instantiate(row, aliases, column_types = {}, &block)
base_klass.instantiate(extract_record(row, aliases), column_types, &block)
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
###
# Class that represents a join source
#
# https://www.sqlite.org/syntaxdiagrams.html#join-source
class JoinSource < Arel::Nodes::Binary
def initialize(single_source, joinop = [])
super
end
def empty?
!left && right.empty?
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class Migration
module JoinTable # :nodoc:
private
def find_join_table_name(table_1, table_2, options = {})
options.delete(:table_name) || join_table_name(table_1, table_2)
end
def join_table_name(table_1, table_2)
ModelSchema.derive_join_table_name(table_1, table_2).to_sym
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Type
class Json < ActiveModel::Type::Value
include ActiveModel::Type::Helpers::Mutable
def type
:json
end
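# A minimal round-trip sketch (the hash below is an arbitrary example):
#
#   type = ActiveRecord::Type::Json.new
#   type.serialize({ "a" => 1 })  # => "{\"a\":1}"
#   type.deserialize('{"a":1}')   # => { "a" => 1 }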
def deserialize(value)
return value unless value.is_a?(::String)
ActiveSupport::JSON.decode(value) rescue nil
end
def serialize(value)
ActiveSupport::JSON.encode(value) unless value.nil?
end
def changed_in_place?(raw_old_value, new_value)
deserialize(raw_old_value) != new_value
end
def accessor
ActiveRecord::Store::StringKeyedHashAccessor
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Jsonb < Type::Json # :nodoc:
def type
:jsonb
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# A key is a container for a given +secret+
#
# Optionally, it can include +public_tags+. These tags are meant to be stored
# in the clear (public) and can be used, for example, to include information that
# references the key for a future retrieval operation.
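#
# A minimal usage sketch (the password below is a made-up placeholder):
#
#   key = ActiveRecord::Encryption::Key.derive_from("some secret password")
#   key.secret        # => the derived secret bytes
#   key.id            # => a short identifier based on a SHA1 digest of the secret
#   key.public_tags   # => an empty Properties container you can add tags to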
class Key
attr_reader :secret, :public_tags
def initialize(secret)
@secret = secret
@public_tags = Properties.new
end
def self.derive_from(password)
secret = ActiveRecord::Encryption.key_generator.derive_key_from(password)
ActiveRecord::Encryption::Key.new(secret)
end
def id
Digest::SHA1.hexdigest(secret).first(4)
end
end
end
end
# frozen_string_literal: true
require "securerandom"
module ActiveRecord
module Encryption
# Utility for generating and deriving random keys.
class KeyGenerator
# Returns a random key. The key will have a size in bytes of +:length+ (configured +Cipher+'s length by default)
def generate_random_key(length: key_length)
SecureRandom.random_bytes(length)
end
# Returns a random key in hexadecimal format. The key will have a size in bytes of +:length+ (configured +Cipher+'s
# length by default)
#
# Hexadecimal format is handy for representing keys as printable text. To maximize the space of characters used, it is
# good practice to include non-printable characters. Hexadecimal format ensures that generated keys are representable as
# plain text.
#
# To convert back to the original string with the desired length:
#
# [ value ].pack("H*")
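#
# A sketch of the full round trip (the 32-byte length is illustrative):
#
#   hex = ActiveRecord::Encryption.key_generator.generate_random_hex_key(length: 32)
#   [ hex ].pack("H*").bytesize # => 32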
def generate_random_hex_key(length: key_length)
generate_random_key(length: length).unpack("H*")[0]
end
# Derives a key from the given password. The key will have a size in bytes of +:length+ (configured +Cipher+'s length
# by default)
#
# The generated key will be salted with the value of +ActiveRecord::Encryption.key_derivation_salt+
def derive_key_from(password, length: key_length)
ActiveSupport::KeyGenerator.new(password).generate_key(key_derivation_salt, length)
end
private
def key_derivation_salt
@key_derivation_salt ||= ActiveRecord::Encryption.config.key_derivation_salt
end
def key_length
@key_length ||= ActiveRecord::Encryption.cipher.key_length
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# A +KeyProvider+ serves keys:
#
# * An encryption key
# * A list of potential decryption keys. Serving multiple decryption keys supports rotation schemes
# where new keys are added but old keys need to continue working
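#
# A hedged usage sketch (key contents and passwords are illustrative):
#
#   old_key  = ActiveRecord::Encryption::Key.derive_from("old password")
#   new_key  = ActiveRecord::Encryption::Key.derive_from("new password")
#   provider = ActiveRecord::Encryption::KeyProvider.new([old_key, new_key])
#
#   provider.encryption_key             # => new_key (the last key is the active one)
#   provider.decryption_keys(message)   # => both keys, unless the message references a key id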
class KeyProvider
def initialize(keys)
@keys = Array(keys)
end
# Returns the last key in the list as the active key to perform encryptions
#
# When +ActiveRecord::Encryption.config.store_key_references+ is true, the key will include
# a public tag referencing the key itself. That key will be stored in the public
# headers of the encrypted message
def encryption_key
@encryption_key ||= @keys.last.tap do |key|
key.public_tags.encrypted_data_key_id = key.id if ActiveRecord::Encryption.config.store_key_references
end
@encryption_key
end
# Returns the list of decryption keys
#
# When the message holds a reference to its encryption key, it will return an array
# with that key. If not, it will return the list of keys.
def decryption_keys(encrypted_message)
if encrypted_message.headers.encrypted_data_key_id
keys_grouped_by_id[encrypted_message.headers.encrypted_data_key_id]
else
@keys
end
end
private
def keys_grouped_by_id
@keys_grouped_by_id ||= @keys.group_by(&:id)
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class LegacyPoint < Type::Value # :nodoc:
include ActiveModel::Type::Helpers::Mutable
def type
:point
end
def cast(value)
case value
when ::String
if value.start_with?("(") && value.end_with?(")")
value = value[1...-1]
end
cast(value.split(","))
when ::Array
value.map { |v| Float(v) }
else
value
end
end
def serialize(value)
if value.is_a?(::Array)
"(#{number_for_point(value[0])},#{number_for_point(value[1])})"
else
super
end
end
private
def number_for_point(number)
number.to_s.delete_suffix(".0")
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
class LegacyPoolManager # :nodoc:
def initialize
@name_to_pool_config = {}
end
def shard_names
@name_to_pool_config.keys
end
def pool_configs(_ = nil)
@name_to_pool_config.values
end
def remove_pool_config(_, shard)
@name_to_pool_config.delete(shard)
end
def get_pool_config(_, shard)
@name_to_pool_config[shard]
end
def set_pool_config(role, shard, pool_config)
if pool_config
@name_to_pool_config[shard] = pool_config
else
raise ArgumentError, "The `pool_config` for the :#{role} role and :#{shard} shard was `nil`. Please check your configuration. If you want your writing role to be something other than `:writing` set `config.active_record.writing_role` in your application configuration. The same setting should be applied for the `reading_role` if applicable."
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module LegacyYamlAdapter # :nodoc:
def self.convert(coder)
return coder unless coder.is_a?(Psych::Coder)
case coder["active_record_yaml_version"]
when 1, 2 then coder
else
raise("Active Record doesn't know how to load YAML with this format.")
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Validations
class LengthValidator < ActiveModel::Validations::LengthValidator # :nodoc:
def validate_each(record, attribute, association_or_value)
if association_or_value.respond_to?(:loaded?) && association_or_value.loaded?
association_or_value = association_or_value.target.reject(&:marked_for_destruction?)
end
super
end
end
module ClassMethods
# Validates that the specified attributes match the length restrictions supplied.
# If the attribute is an association, records that are marked for destruction are not counted.
#
# See ActiveModel::Validations::HelperMethods.validates_length_of for more information.
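#
# For example (model and association names are illustrative):
#
#   class Person < ActiveRecord::Base
#     has_many :pets
#     validates_length_of :pets, maximum: 3
#   end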
def validates_length_of(*attr_names)
validates_with LengthValidator, _merge_attributes(attr_names)
end
alias_method :validates_size_of, :validates_length_of
end
end
end
# frozen_string_literal: true
module ActiveRecord
class LogSubscriber < ActiveSupport::LogSubscriber
IGNORE_PAYLOAD_NAMES = ["SCHEMA", "EXPLAIN"]
class_attribute :backtrace_cleaner, default: ActiveSupport::BacktraceCleaner.new
def self.runtime=(value)
ActiveRecord::RuntimeRegistry.sql_runtime = value
end
def self.runtime
ActiveRecord::RuntimeRegistry.sql_runtime ||= 0
end
def self.reset_runtime
rt, self.runtime = runtime, 0
rt
end
def strict_loading_violation(event)
debug do
owner = event.payload[:owner]
association = event.payload[:reflection].klass
name = event.payload[:reflection].name
color("Strict loading violation: #{owner} is marked for strict loading. The #{association} association named :#{name} cannot be lazily loaded.", RED)
end
end
def sql(event)
self.class.runtime += event.duration
return unless logger.debug?
payload = event.payload
return if IGNORE_PAYLOAD_NAMES.include?(payload[:name])
name = if payload[:async]
"ASYNC #{payload[:name]} (#{payload[:lock_wait].round(1)}ms) (db time #{event.duration.round(1)}ms)"
else
"#{payload[:name]} (#{event.duration.round(1)}ms)"
end
name = "CACHE #{name}" if payload[:cached]
sql = payload[:sql]
binds = nil
if payload[:binds]&.any?
casted_params = type_casted_binds(payload[:type_casted_binds])
binds = []
payload[:binds].each_with_index do |attr, i|
attribute_name = attr.respond_to?(:name) ? attr.name : attr[i].name
filtered_params = filter(attribute_name, casted_params[i])
binds << render_bind(attr, filtered_params)
end
binds = binds.inspect
binds.prepend(" ")
end
name = colorize_payload_name(name, payload[:name])
sql = color(sql, sql_color(sql), true) if colorize_logging
debug " #{name} #{sql}#{binds}"
end
private
def type_casted_binds(casted_binds)
casted_binds.respond_to?(:call) ? casted_binds.call : casted_binds
end
def render_bind(attr, value)
case attr
when ActiveModel::Attribute
if attr.type.binary? && attr.value
value = "<#{attr.value_for_database.to_s.bytesize} bytes of binary data>"
end
when Array
attr = attr.first
else
attr = nil
end
[attr&.name, value]
end
def colorize_payload_name(name, payload_name)
if payload_name.blank? || payload_name == "SQL" # SQL vs Model Load/Exists
color(name, MAGENTA, true)
else
color(name, CYAN, true)
end
end
def sql_color(sql)
case sql
when /\A\s*rollback/mi
RED
when /select .*for update/mi, /\A\s*lock/mi
WHITE
when /\A\s*select/i
BLUE
when /\A\s*insert/i
GREEN
when /\A\s*update/i
YELLOW
when /\A\s*delete/i
RED
when /transaction\s*\Z/i
CYAN
else
MAGENTA
end
end
def logger
ActiveRecord::Base.logger
end
def debug(progname = nil, &block)
return unless super
if ActiveRecord.verbose_query_logs
log_query_source
end
end
def log_query_source
source = extract_query_source_location(caller)
if source
logger.debug(" ↳ #{source}")
end
end
def extract_query_source_location(locations)
backtrace_cleaner.clean(locations.lazy).first
end
def filter(name, value)
ActiveRecord::Base.inspection_filter.filter_param(name, value)
end
end
end
ActiveRecord::LogSubscriber.attach_to :active_record
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Macaddr < Type::String # :nodoc:
def type
:macaddr
end
def changed?(old_value, new_value, _new_value_before_type_cast)
old_value.class != new_value.class ||
new_value && old_value.casecmp(new_value) != 0
end
def changed_in_place?(raw_old_value, new_value)
raw_old_value.class != new_value.class ||
new_value && raw_old_value.casecmp(new_value) != 0
end
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module TypeCaster
class Map # :nodoc:
def initialize(klass)
@klass = klass
end
def type_cast_for_database(attr_name, value)
type = type_for_attribute(attr_name)
type.serialize(value)
end
def type_for_attribute(name)
klass.type_for_attribute(name)
end
private
attr_reader :klass
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class Matches < Binary
attr_reader :escape
attr_accessor :case_sensitive
def initialize(left, right, escape = nil, case_sensitive = false)
super(left, right)
@escape = escape && Nodes.build_quoted(escape)
@case_sensitive = case_sensitive
end
end
class DoesNotMatch < Matches; end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Math
def *(other)
Arel::Nodes::Multiplication.new(self, other)
end
def +(other)
Arel::Nodes::Grouping.new(Arel::Nodes::Addition.new(self, other))
end
def -(other)
Arel::Nodes::Grouping.new(Arel::Nodes::Subtraction.new(self, other))
end
def /(other)
Arel::Nodes::Division.new(self, other)
end
def &(other)
Arel::Nodes::Grouping.new(Arel::Nodes::BitwiseAnd.new(self, other))
end
def |(other)
Arel::Nodes::Grouping.new(Arel::Nodes::BitwiseOr.new(self, other))
end
def ^(other)
Arel::Nodes::Grouping.new(Arel::Nodes::BitwiseXor.new(self, other))
end
def <<(other)
Arel::Nodes::Grouping.new(Arel::Nodes::BitwiseShiftLeft.new(self, other))
end
def >>(other)
Arel::Nodes::Grouping.new(Arel::Nodes::BitwiseShiftRight.new(self, other))
end
def ~@
Arel::Nodes::BitwiseNot.new(self)
end
end
end
# frozen_string_literal: true
require "active_support/core_ext/hash/keys"
module ActiveRecord
class Relation
class HashMerger # :nodoc:
attr_reader :relation, :hash
def initialize(relation, hash, rewhere = nil)
hash.assert_valid_keys(*Relation::VALUE_METHODS)
@relation = relation
@hash = hash
@rewhere = rewhere
end
def merge
Merger.new(relation, other, @rewhere).merge
end
# Applying values to a relation has some side effects. E.g.
# interpolation might take place for where values. So we should
# build a relation to merge in rather than directly merging
# the values.
def other
other = Relation.create(
relation.klass,
table: relation.table,
predicate_builder: relation.predicate_builder
)
hash.each do |k, v|
k = :_select if k == :select
if Array === v
other.public_send("#{k}!", *v)
else
other.public_send("#{k}!", v)
end
end
other
end
end
class Merger # :nodoc:
attr_reader :relation, :values, :other
def initialize(relation, other, rewhere = nil)
@relation = relation
@values = other.values
@other = other
@rewhere = rewhere
end
NORMAL_VALUES = Relation::VALUE_METHODS - Relation::CLAUSE_METHODS -
[
:select, :includes, :preload, :joins, :left_outer_joins,
:order, :reverse_order, :lock, :create_with, :reordering
]
def merge
NORMAL_VALUES.each do |name|
value = values[name]
# The unless clause is here mostly for performance reasons (since the `send` call might be moderately
# expensive). Most of the time the value is going to be `nil` or `.blank?`; the only catch is that
# `false.blank?` returns `true`, so there needs to be an extra check so that explicit `false` values
# don't fall through the cracks.
unless value.nil? || (value.blank? && false != value)
relation.public_send(:"#{name}!", *value)
end
end
merge_select_values
merge_multi_values
merge_single_values
merge_clauses
merge_preloads
merge_joins
merge_outer_joins
relation
end
private
def merge_select_values
return if other.select_values.empty?
if other.klass == relation.klass
relation.select_values |= other.select_values
else
relation.select_values |= other.instance_eval do
arel_columns(select_values)
end
end
end
def merge_preloads
return if other.preload_values.empty? && other.includes_values.empty?
if other.klass == relation.klass
relation.preload_values |= other.preload_values unless other.preload_values.empty?
relation.includes_values |= other.includes_values unless other.includes_values.empty?
else
reflection = relation.klass.reflect_on_all_associations.find do |r|
r.class_name == other.klass.name
end || return
unless other.preload_values.empty?
relation.preload! reflection.name => other.preload_values
end
unless other.includes_values.empty?
relation.includes! reflection.name => other.includes_values
end
end
end
def merge_joins
return if other.joins_values.empty?
if other.klass == relation.klass
relation.joins_values |= other.joins_values
else
associations, others = other.joins_values.partition do |join|
case join
when Hash, Symbol, Array; true
end
end
join_dependency = other.construct_join_dependency(
associations, Arel::Nodes::InnerJoin
)
relation.joins!(join_dependency, *others)
end
end
def merge_outer_joins
return if other.left_outer_joins_values.empty?
if other.klass == relation.klass
relation.left_outer_joins_values |= other.left_outer_joins_values
else
associations, others = other.left_outer_joins_values.partition do |join|
case join
when Hash, Symbol, Array; true
end
end
join_dependency = other.construct_join_dependency(
associations, Arel::Nodes::OuterJoin
)
relation.left_outer_joins!(join_dependency, *others)
end
end
def merge_multi_values
if other.reordering_value
# override any order specified in the original relation
relation.reorder!(*other.order_values)
elsif other.order_values.any?
# merge in order_values from relation
relation.order!(*other.order_values)
end
extensions = other.extensions - relation.extensions
relation.extending!(*extensions) if extensions.any?
end
def merge_single_values
relation.lock_value ||= other.lock_value if other.lock_value
unless other.create_with_value.blank?
relation.create_with_value = (relation.create_with_value || {}).merge(other.create_with_value)
end
end
def merge_clauses
relation.from_clause = other.from_clause if replace_from_clause?
where_clause = relation.where_clause.merge(other.where_clause, @rewhere)
relation.where_clause = where_clause unless where_clause.empty?
having_clause = relation.having_clause.merge(other.having_clause)
relation.having_clause = having_clause unless having_clause.empty?
end
def replace_from_clause?
relation.from_clause.empty? && !other.from_clause.empty? &&
relation.klass.base_class == other.klass.base_class
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Encryption
# A message defines the structure of the data we store in encrypted attributes. It contains:
#
# * An encrypted payload
# * A list of unencrypted headers
#
# See +Encryptor#encrypt+
class Message
attr_accessor :payload, :headers
def initialize(payload: nil, headers: {})
validate_payload_type(payload)
@payload = payload
@headers = Properties.new(headers)
end
def ==(other_message)
payload == other_message.payload && headers == other_message.headers
end
private
def validate_payload_type(payload)
unless payload.is_a?(String) || payload.nil?
raise ActiveRecord::Encryption::Errors::ForbiddenClass, "Only string payloads allowed"
end
end
end
end
end
# frozen_string_literal: true
require "base64"
module ActiveRecord
module Encryption
# A message serializer that serializes +Messages+ with JSON.
#
# The generated structure is pretty simple:
#
# {
# p: <payload>,
# h: {
# header1: value1,
# header2: value2,
# ...
# }
# }
#
# Both the payload and the header values are encoded with Base64
# to prevent JSON parsing errors and encoding issues when
# storing the resulting serialized data.
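#
# For illustration, a round trip with a hypothetical message might look like this
# (the payload string and exact JSON are illustrative):
#
#   serializer = ActiveRecord::Encryption::MessageSerializer.new
#   message    = ActiveRecord::Encryption::Message.new(payload: "ciphertext")
#   json       = serializer.dump(message)  # => '{"p":"Y2lwaGVydGV4dA==","h":{}}'
#   serializer.load(json).payload          # => "ciphertext"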
class MessageSerializer
def load(serialized_content)
data = JSON.parse(serialized_content)
parse_message(data, 1)
rescue JSON::ParserError
raise ActiveRecord::Encryption::Errors::Encoding
end
def dump(message)
raise ActiveRecord::Encryption::Errors::ForbiddenClass unless message.is_a?(ActiveRecord::Encryption::Message)
JSON.dump message_to_json(message)
end
private
def parse_message(data, level)
validate_message_data_format(data, level)
ActiveRecord::Encryption::Message.new(payload: decode_if_needed(data["p"]), headers: parse_properties(data["h"], level))
end
def validate_message_data_format(data, level)
if level > 2
raise ActiveRecord::Encryption::Errors::Decryption, "More than one level of hash nesting in headers is not supported"
end
unless data.is_a?(Hash) && data.has_key?("p")
raise ActiveRecord::Encryption::Errors::Decryption, "Invalid data format: hash without payload"
end
end
def parse_properties(headers, level)
ActiveRecord::Encryption::Properties.new.tap do |properties|
headers&.each do |key, value|
properties[key] = value.is_a?(Hash) ? parse_message(value, level + 1) : decode_if_needed(value)
end
end
end
def message_to_json(message)
{
p: encode_if_needed(message.payload),
h: headers_to_json(message.headers)
}
end
def headers_to_json(headers)
headers.transform_values do |value|
value.is_a?(ActiveRecord::Encryption::Message) ? message_to_json(value) : encode_if_needed(value)
end
end
def encode_if_needed(value)
if value.is_a?(String)
::Base64.strict_encode64 value
else
value
end
end
def decode_if_needed(value)
if value.is_a?(String)
::Base64.strict_decode64(value)
else
value
end
rescue ArgumentError, TypeError
raise Errors::Encoding
end
end
end
end
# frozen_string_literal: true
require "rails/generators/migration"
module ActiveRecord
module Generators # :nodoc:
module Migration
extend ActiveSupport::Concern
include Rails::Generators::Migration
module ClassMethods
# Implement the required interface for Rails::Generators::Migration.
def next_migration_number(dirname)
next_migration_number = current_migration_number(dirname) + 1
ActiveRecord::Migration.next_migration_number(next_migration_number)
end
end
private
def primary_key_type
key_type = options[:primary_key_type]
", id: :#{key_type}" if key_type
end
def foreign_key_type
key_type = options[:primary_key_type]
", type: :#{key_type}" if key_type
end
def db_migrate_path
if defined?(Rails.application) && Rails.application
configured_migrate_path || default_migrate_path
else
"db/migrate"
end
end
def default_migrate_path
Rails.application.config.paths["db/migrate"].to_ary.first
end
def configured_migrate_path
return unless database = options[:database]
config = ActiveRecord::Base.configurations.configs_for(
env_name: Rails.env,
name: database
)
config&.migrations_paths
end
end
end
end
# frozen_string_literal: true
require "rails/generators/active_record"
module ActiveRecord
module Generators # :nodoc:
class MigrationGenerator < Base # :nodoc:
argument :attributes, type: :array, default: [], banner: "field[:type][:index] field[:type][:index]"
class_option :timestamps, type: :boolean
class_option :primary_key_type, type: :string, desc: "The type for primary key"
class_option :database, type: :string, aliases: %i(--db), desc: "The database for your migration. By default, the current environment's primary database is used."
def create_migration_file
set_local_assigns!
validate_file_name!
migration_template @migration_template, File.join(db_migrate_path, "#{file_name}.rb")
end
private
attr_reader :migration_action, :join_tables
# Sets the default migration template used for generating the migration.
# Depending on the command line arguments, the migration template and the table name instance
# variables are set up.
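#
# For instance (illustrative generator invocations):
#
#   rails generate migration AddPartNumberToProducts
#   # => @migration_action = "add", @table_name = "products", template "migration.rb"
#
#   rails generate migration CreateProducts
#   # => @table_name = "products", template "create_table_migration.rb"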
def set_local_assigns!
@migration_template = "migration.rb"
case file_name
when /^(add)_.*_to_(.*)/, /^(remove)_.*?_from_(.*)/
@migration_action = $1
@table_name = normalize_table_name($2)
when /join_table/
if attributes.length == 2
@migration_action = "join"
@join_tables = pluralize_table_names? ? attributes.map(&:plural_name) : attributes.map(&:singular_name)
set_index_names
end
when /^create_(.+)/
@table_name = normalize_table_name($1)
@migration_template = "create_table_migration.rb"
end
end
def set_index_names
attributes.each_with_index do |attr, i|
attr.index_name = [attr, attributes[i - 1]].map { |a| index_name_for(a) }
end
end
def index_name_for(attribute)
if attribute.foreign_key?
attribute.name
else
attribute.name.singularize.foreign_key
end.to_sym
end
def attributes_with_index
attributes.select { |a| !a.reference? && a.has_index? }
end
# A migration file name can only contain underscores (_), lowercase characters,
# and numbers 0-9. Any other file name will raise an IllegalMigrationNameError.
def validate_file_name!
unless /^[_a-z0-9]+$/.match?(file_name)
raise IllegalMigrationNameError.new(file_name)
end
end
def normalize_table_name(_table_name)
pluralize_table_names? ? _table_name.pluralize : _table_name.singularize
end
end
end
end
# frozen_string_literal: true
require "rails/generators/active_record"
module ActiveRecord
module Generators # :nodoc:
class ModelGenerator < Base # :nodoc:
argument :attributes, type: :array, default: [], banner: "field[:type][:index] field[:type][:index]"
check_class_collision
class_option :migration, type: :boolean
class_option :timestamps, type: :boolean
class_option :parent, type: :string, desc: "The parent class for the generated model"
class_option :indexes, type: :boolean, default: true, desc: "Add indexes for references and belongs_to columns"
class_option :primary_key_type, type: :string, desc: "The type for primary key"
class_option :database, type: :string, aliases: %i(--db), desc: "The database for your model's migration. By default, the current environment's primary database is used."
# creates the migration file for the model.
def create_migration_file
return if skip_migration_creation?
attributes.each { |a| a.attr_options.delete(:index) if a.reference? && !a.has_index? } if options[:indexes] == false
migration_template "../../migration/templates/create_table_migration.rb", File.join(db_migrate_path, "create_#{table_name}.rb")
end
def create_model_file
generate_abstract_class if database && !parent
template "model.rb", File.join("app/models", class_path, "#{file_name}.rb")
end
def create_module_file
return if regular_class_path.empty?
template "module.rb", File.join("app/models", "#{class_path.join('/')}.rb") if behavior == :invoke
end
hook_for :test_framework
private
# Skip creating the migration file if:
# - the parent option is present and the database option is not
# - the migration option is nil or false
def skip_migration_creation?
parent && !database || !migration
end
def attributes_with_index
attributes.select { |a| !a.reference? && a.has_index? }
end
# Used by the migration template to determine the parent name of the model
def parent_class_name
if parent
parent
elsif database
abstract_class_name
else
"ApplicationRecord"
end
end
def generate_abstract_class
path = File.join("app/models", "#{database.underscore}_record.rb")
return if File.exist?(path)
template "abstract_base_class.rb", path
end
def abstract_class_name
"#{database.camelize}Record"
end
def database
options[:database]
end
def parent
options[:parent]
end
def migration
options[:migration]
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
class FixtureSet
class ModelMetadata # :nodoc:
def initialize(model_class)
@model_class = model_class
end
def primary_key_name
@primary_key_name ||= @model_class && @model_class.primary_key
end
def primary_key_type
@primary_key_type ||= @model_class && @model_class.type_for_attribute(@model_class.primary_key).type
end
def has_primary_key_column?
@has_primary_key_column ||= primary_key_name &&
@model_class.columns.any? { |col| col.name == primary_key_name }
end
def timestamp_column_names
@model_class.all_timestamp_attributes_in_model
end
def inheritance_column_name
@inheritance_column_name ||= @model_class && @model_class.inheritance_column
end
end
end
end
# frozen_string_literal: true
require "monitor"
module ActiveRecord
module ModelSchema
extend ActiveSupport::Concern
##
# :singleton-method: primary_key_prefix_type
# :call-seq: primary_key_prefix_type
#
# The prefix type that will be prepended to every primary key column name.
# The options are +:table_name+ and +:table_name_with_underscore+. If the first is specified,
# the Product class will look for "productid" instead of "id" as the primary column. If the
# latter is specified, the Product class will look for "product_id" instead of "id". Remember
# that this is a global setting for all Active Records.
##
# :singleton-method: primary_key_prefix_type=
# :call-seq: primary_key_prefix_type=(prefix_type)
#
# Sets the prefix type that will be prepended to every primary key column name.
# The options are +:table_name+ and +:table_name_with_underscore+. If the first is specified,
# the Product class will look for "productid" instead of "id" as the primary column. If the
# latter is specified, the Product class will look for "product_id" instead of "id". Remember
# that this is a global setting for all Active Records.
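#
# For instance (illustrative; the setting is global):
#
#   ActiveRecord::Base.primary_key_prefix_type = :table_name_with_underscore
#   # the Product class now looks for "product_id" instead of "id"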
##
# :singleton-method: table_name_prefix
# :call-seq: table_name_prefix
#
# The prefix string to prepend to every table name.
##
# :singleton-method: table_name_prefix=
# :call-seq: table_name_prefix=(prefix)
#
# Sets the prefix string to prepend to every table name. So if set to "basecamp_", all table
# names will be named like "basecamp_projects", "basecamp_people", etc. This is a convenient
# way of creating a namespace for tables in a shared database. By default, the prefix is the
# empty string.
#
# If you are organising your models within modules, you can add a prefix to the models within
# a namespace by defining a singleton method in the parent module called table_name_prefix which
# returns your chosen prefix.
##
# :singleton-method: table_name_suffix
# :call-seq: table_name_suffix
#
# The suffix string to append to every table name.
##
# :singleton-method: table_name_suffix=
# :call-seq: table_name_suffix=(suffix)
#
# Works like +table_name_prefix=+, but appends instead of prepends (set to "_basecamp" gives "projects_basecamp",
# "people_basecamp"). By default, the suffix is the empty string.
#
# If you are organising your models within modules, you can add a suffix to the models within
# a namespace by defining a singleton method in the parent module called table_name_suffix which
# returns your chosen suffix.
##
# :singleton-method: schema_migrations_table_name
# :call-seq: schema_migrations_table_name
#
# The name of the schema migrations table. By default, the value is <tt>"schema_migrations"</tt>.
##
# :singleton-method: schema_migrations_table_name=
# :call-seq: schema_migrations_table_name=(table_name)
#
# Sets the name of the schema migrations table.
##
# :singleton-method: internal_metadata_table_name
# :call-seq: internal_metadata_table_name
#
# The name of the internal metadata table. By default, the value is <tt>"ar_internal_metadata"</tt>.
##
# :singleton-method: internal_metadata_table_name=
# :call-seq: internal_metadata_table_name=(table_name)
#
# Sets the name of the internal metadata table.
##
# :singleton-method: pluralize_table_names
# :call-seq: pluralize_table_names
#
# Indicates whether table names should be the pluralized versions of the corresponding class names.
# If true, the default table name for a Product class will be "products". If false, it would just be "product".
# See table_name for the full rules on table/class naming. This is true, by default.
##
# :singleton-method: pluralize_table_names=
# :call-seq: pluralize_table_names=(value)
#
# Set whether table names should be the pluralized versions of the corresponding class names.
# If true, the default table name for a Product class will be "products". If false, it would just be "product".
# See table_name for the full rules on table/class naming. This is true, by default.
##
# :singleton-method: implicit_order_column
# :call-seq: implicit_order_column
#
# The name of the column records are ordered by if no explicit order clause
# is used during an ordered finder call. If not set the primary key is used.
##
# :singleton-method: implicit_order_column=
# :call-seq: implicit_order_column=(column_name)
#
# Sets the column to sort records by when no explicit order clause is used
# during an ordered finder call. Useful when the primary key is not an
# auto-incrementing integer, for example when it's a UUID. Records are subsorted
# by the primary key if it exists to ensure deterministic results.
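#
# For example (model name illustrative):
#
#   class Project < ActiveRecord::Base
#     self.implicit_order_column = "created_at"
#   end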
##
# :singleton-method: immutable_strings_by_default=
# :call-seq: immutable_strings_by_default=(bool)
#
# Determines whether columns should infer their type as +:string+ or
# +:immutable_string+. This setting does not affect the behavior of
# <tt>attribute :foo, :string</tt>. Defaults to false.
included do
class_attribute :primary_key_prefix_type, instance_writer: false
class_attribute :table_name_prefix, instance_writer: false, default: ""
class_attribute :table_name_suffix, instance_writer: false, default: ""
class_attribute :schema_migrations_table_name, instance_accessor: false, default: "schema_migrations"
class_attribute :internal_metadata_table_name, instance_accessor: false, default: "ar_internal_metadata"
class_attribute :pluralize_table_names, instance_writer: false, default: true
class_attribute :implicit_order_column, instance_accessor: false
class_attribute :immutable_strings_by_default, instance_accessor: false
# Defines the name of the table column which will store the class name on single-table
# inheritance situations.
#
# The default inheritance column name is +type+, which means it's a
# reserved word inside Active Record. To be able to use single-table
# inheritance with another column name, or to use the column +type+ in
# your own model for something else, you can set +inheritance_column+:
#
# self.inheritance_column = 'zoink'
class_attribute :inheritance_column, instance_accessor: false, default: "type"
singleton_class.class_eval do
alias_method :_inheritance_column=, :inheritance_column=
private :_inheritance_column=
alias_method :inheritance_column=, :real_inheritance_column=
end
self.protected_environments = ["production"]
self.ignored_columns = [].freeze
delegate :type_for_attribute, :column_for_attribute, to: :class
initialize_load_schema_monitor
end
# Derives the join table name for +first_table+ and +second_table+. The
# table names appear in alphabetical order. A common prefix is removed
# (useful for namespaced models like Music::Artist and Music::Record):
#
# artists, records => artists_records
# records, artists => artists_records
# music_artists, music_records => music_artists_records
def self.derive_join_table_name(first_table, second_table) # :nodoc:
[first_table.to_s, second_table.to_s].sort.join("\0").gsub(/^(.*_)(.+)\0\1(.+)/, '\1\2_\3').tr("\0", "_")
end
module ClassMethods
# Guesses the table name (in forced lower-case) based on the name of the class in the
# inheritance hierarchy descending directly from ActiveRecord::Base. So if the hierarchy
# looks like: Reply < Message < ActiveRecord::Base, then Message is used
# to guess the table name even when called on Reply. The rules used to do the guess
# are handled by the Inflector class in Active Support, which knows almost all common
# English inflections. You can add new inflections in config/initializers/inflections.rb.
#
# Nested classes are given table names prefixed by the singular form of
# the parent's table name. Enclosing modules are not considered.
#
# ==== Examples
#
# class Invoice < ActiveRecord::Base
# end
#
# file class table_name
# invoice.rb Invoice invoices
#
# class Invoice < ActiveRecord::Base
# class Lineitem < ActiveRecord::Base
# end
# end
#
# file class table_name
# invoice.rb Invoice::Lineitem invoice_lineitems
#
# module Invoice
# class Lineitem < ActiveRecord::Base
# end
# end
#
# file class table_name
# invoice/lineitem.rb Invoice::Lineitem lineitems
#
# Additionally, the class-level +table_name_prefix+ is prepended and the
# +table_name_suffix+ is appended. So if you have "myapp_" as a prefix,
# the table name guess for an Invoice class becomes "myapp_invoices".
# Invoice::Lineitem becomes "myapp_invoice_lineitems".
#
# Active Model Naming's +model_name+ is the base name used to guess the
# table name. In case a custom Active Model Name is defined, it will be
# used for the table name as well:
#
# class PostRecord < ActiveRecord::Base
# class << self
# def model_name
# ActiveModel::Name.new(self, nil, "Post")
# end
# end
# end
#
# PostRecord.table_name
# # => "posts"
#
# You can also set your own table name explicitly:
#
# class Mouse < ActiveRecord::Base
# self.table_name = "mice"
# end
def table_name
reset_table_name unless defined?(@table_name)
@table_name
end
# Sets the table name explicitly. Example:
#
# class Project < ActiveRecord::Base
# self.table_name = "project"
# end
def table_name=(value)
value = value && value.to_s
if defined?(@table_name)
return if value == @table_name
reset_column_information if connected?
end
@table_name = value
@quoted_table_name = nil
@arel_table = nil
@sequence_name = nil unless defined?(@explicit_sequence_name) && @explicit_sequence_name
@predicate_builder = nil
end
# Returns a quoted version of the table name, used to construct SQL statements.
def quoted_table_name
@quoted_table_name ||= connection.quote_table_name(table_name)
end
# Computes the table name, (re)sets it internally, and returns it.
def reset_table_name # :nodoc:
self.table_name = if abstract_class?
superclass == Base ? nil : superclass.table_name
elsif superclass.abstract_class?
superclass.table_name || compute_table_name
else
compute_table_name
end
end
def full_table_name_prefix # :nodoc:
(module_parents.detect { |p| p.respond_to?(:table_name_prefix) } || self).table_name_prefix
end
def full_table_name_suffix # :nodoc:
(module_parents.detect { |p| p.respond_to?(:table_name_suffix) } || self).table_name_suffix
end
# The array of names of environments where destructive actions should be prohibited. By default,
# the value is <tt>["production"]</tt>.
def protected_environments
if defined?(@protected_environments)
@protected_environments
else
superclass.protected_environments
end
end
# Sets an array of names of environments where destructive actions should be prohibited.
def protected_environments=(environments)
@protected_environments = environments.map(&:to_s)
end
def real_inheritance_column=(value) # :nodoc:
self._inheritance_column = value.to_s
end
# The list of column names the model should ignore. Ignored columns won't have attribute
# accessors defined, and won't be referenced in SQL queries.
def ignored_columns
if defined?(@ignored_columns)
@ignored_columns
else
superclass.ignored_columns
end
end
# Sets the column names the model should ignore. Ignored columns won't have attribute
# accessors defined, and won't be referenced in SQL queries.
#
# A common usage pattern for this method is to ensure all references to an attribute
# have been removed and deployed, before a migration to drop the column from the database
# has been deployed and run. Using this two step approach to dropping columns ensures there
# is no code that raises errors due to having a cached schema in memory at the time the
# schema migration is run.
#
# For example, given a model where you want to drop the "category" attribute, first mark it
# as ignored:
#
# class Project < ActiveRecord::Base
# # schema:
# # id :bigint
# # name :string, limit: 255
# # category :string, limit: 255
#
# self.ignored_columns = [:category]
# end
#
# The schema still contains "category", but now the model omits it, so any meta-driven code or
# schema caching will not attempt to use the column:
#
# Project.columns_hash["category"] => nil
#
# You will get an error if you access that attribute directly, so ensure all usages of the
# column are removed (automated tests can help you find any usages).
#
# user = Project.create!(name: "First Project")
# user.category # => raises NoMethodError
def ignored_columns=(columns)
reload_schema_from_cache
@ignored_columns = columns.map(&:to_s).freeze
end
def sequence_name
if base_class?
@sequence_name ||= reset_sequence_name
else
(@sequence_name ||= nil) || base_class.sequence_name
end
end
def reset_sequence_name # :nodoc:
@explicit_sequence_name = false
@sequence_name = connection.default_sequence_name(table_name, primary_key)
end
# Sets the name of the sequence to use when generating ids to the given
# value, or (if the value is +nil+ or +false+) to the value returned by the
# given block. This is required for Oracle and is useful for any
# database which relies on sequences for primary key generation.
#
# If a sequence name is not explicitly set when using Oracle,
# it will default to the commonly used pattern of: #{table_name}_seq
#
# If a sequence name is not explicitly set when using PostgreSQL, it
# will discover the sequence corresponding to your primary key for you.
#
# class Project < ActiveRecord::Base
# self.sequence_name = "projectseq" # default would have been "project_seq"
# end
def sequence_name=(value)
@sequence_name = value.to_s
@explicit_sequence_name = true
end
# Determines if the primary key values should be selected from their
# corresponding sequence before the insert statement.
def prefetch_primary_key?
connection.prefetch_primary_key?(table_name)
end
# Returns the next value that will be used as the primary key on
# an insert statement.
def next_sequence_value
connection.next_sequence_value(sequence_name)
end
# Indicates whether the table associated with this class exists
def table_exists?
connection.schema_cache.data_source_exists?(table_name)
end
def attributes_builder # :nodoc:
unless defined?(@attributes_builder) && @attributes_builder
defaults = _default_attributes.except(*(column_names - [primary_key]))
@attributes_builder = ActiveModel::AttributeSet::Builder.new(attribute_types, defaults)
end
@attributes_builder
end
def columns_hash # :nodoc:
load_schema
@columns_hash
end
def columns
load_schema
@columns ||= columns_hash.values.freeze
end
def attribute_types # :nodoc:
load_schema
@attribute_types ||= Hash.new(Type.default_value)
end
def yaml_encoder # :nodoc:
@yaml_encoder ||= ActiveModel::AttributeSet::YAMLEncoder.new(attribute_types)
end
# Returns the type of the attribute with the given name, after applying
# all modifiers. This method is the only valid source of information for
# anything related to the types of a model's attributes. This method will
# access the database and load the model's schema if it is required.
#
# The return value of this method will implement the interface described
# by ActiveModel::Type::Value (though the object itself may not subclass
# it).
#
# +attr_name+ The name of the attribute to retrieve the type for. Must be
# a string or a symbol.
def type_for_attribute(attr_name, &block)
attr_name = attr_name.to_s
attr_name = attribute_aliases[attr_name] || attr_name
if block
attribute_types.fetch(attr_name, &block)
else
attribute_types[attr_name]
end
end
# Returns the column object for the named attribute.
# Returns an +ActiveRecord::ConnectionAdapters::NullColumn+ if the
# named attribute does not exist.
#
# class Person < ActiveRecord::Base
# end
#
# person = Person.new
# person.column_for_attribute(:name) # the result depends on the ConnectionAdapter
# # => #<ActiveRecord::ConnectionAdapters::Column:0x007ff4ab083980 @name="name", @sql_type="varchar(255)", @null=true, ...>
#
# person.column_for_attribute(:nothing)
# # => #<ActiveRecord::ConnectionAdapters::NullColumn:0xXXX @name=nil, @sql_type=nil, @cast_type=#<Type::Value>, ...>
def column_for_attribute(name)
name = name.to_s
columns_hash.fetch(name) do
ConnectionAdapters::NullColumn.new(name)
end
end
# Returns a hash where the keys are column names and the values are
# default values when instantiating the Active Record object for this table.
def column_defaults
load_schema
@column_defaults ||= _default_attributes.deep_dup.to_hash.freeze
end
def _default_attributes # :nodoc:
load_schema
@default_attributes ||= ActiveModel::AttributeSet.new({})
end
# Returns an array of column names as strings.
def column_names
@column_names ||= columns.map(&:name).freeze
end
def symbol_column_to_string(name_symbol) # :nodoc:
@symbol_column_to_string_name_hash ||= column_names.index_by(&:to_sym)
@symbol_column_to_string_name_hash[name_symbol]
end
# Returns an array of column objects where the primary id, all columns ending in "_id" or "_count",
# and columns used for single table inheritance have been removed.
def content_columns
@content_columns ||= columns.reject do |c|
c.name == primary_key ||
c.name == inheritance_column ||
c.name.end_with?("_id", "_count")
end.freeze
end
# Resets all the cached information about columns, which will cause them
# to be reloaded on the next request.
#
# The most common usage pattern for this method is probably in a migration,
# when just after creating a table you want to populate it with some default
# values, e.g.:
#
# class CreateJobLevels < ActiveRecord::Migration[7.1]
# def up
# create_table :job_levels do |t|
# t.integer :id
# t.string :name
#
# t.timestamps
# end
#
# JobLevel.reset_column_information
# %w{assistant executive manager director}.each do |type|
# JobLevel.create(name: type)
# end
# end
#
# def down
# drop_table :job_levels
# end
# end
def reset_column_information
connection.clear_cache!
([self] + descendants).each(&:undefine_attribute_methods)
connection.schema_cache.clear_data_source_cache!(table_name)
reload_schema_from_cache
initialize_find_by_cache
end
protected
def initialize_load_schema_monitor
@load_schema_monitor = Monitor.new
end
private
def inherited(child_class)
super
child_class.initialize_load_schema_monitor
end
def schema_loaded?
defined?(@schema_loaded) && @schema_loaded
end
def load_schema
return if schema_loaded?
@load_schema_monitor.synchronize do
return if defined?(@columns_hash) && @columns_hash
load_schema!
@schema_loaded = true
rescue
reload_schema_from_cache # If the schema loading failed half way through, we must reset the state.
raise
end
end
def load_schema!
unless table_name
raise ActiveRecord::TableNotSpecified, "#{self} has no table configured. Set one with #{self}.table_name="
end
columns_hash = connection.schema_cache.columns_hash(table_name)
columns_hash = columns_hash.except(*ignored_columns) unless ignored_columns.empty?
@columns_hash = columns_hash.freeze
@columns_hash.each do |name, column|
type = connection.lookup_cast_type_from_column(column)
type = _convert_type_from_options(type)
define_attribute(
name,
type,
default: column.default,
user_provided_default: false
)
end
end
def reload_schema_from_cache
@arel_table = nil
@column_names = nil
@symbol_column_to_string_name_hash = nil
@attribute_types = nil
@content_columns = nil
@default_attributes = nil
@column_defaults = nil
@attributes_builder = nil
@columns = nil
@columns_hash = nil
@schema_loaded = false
@attribute_names = nil
@yaml_encoder = nil
subclasses.each do |descendant|
descendant.send(:reload_schema_from_cache)
end
end
# Guesses the table name, but does not decorate it with prefix and suffix information.
def undecorated_table_name(model_name)
table_name = model_name.to_s.demodulize.underscore
pluralize_table_names ? table_name.pluralize : table_name
end
# Computes and returns a table name according to default conventions.
def compute_table_name
if base_class?
# Nested classes are prefixed with singular parent table name.
if module_parent < Base && !module_parent.abstract_class?
contained = module_parent.table_name
contained = contained.singularize if module_parent.pluralize_table_names
contained += "_"
end
"#{full_table_name_prefix}#{contained}#{undecorated_table_name(model_name)}#{full_table_name_suffix}"
else
# STI subclasses always use their superclass's table.
base_class.table_name
end
end
def _convert_type_from_options(type)
if immutable_strings_by_default && type.respond_to?(:to_immutable_string)
type.to_immutable_string
else
type
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module ConnectionAdapters
module PostgreSQL
module OID # :nodoc:
class Money < Type::Decimal # :nodoc:
def type
:money
end
def scale
2
end
def cast_value(value)
return value unless ::String === value
# Because money output is formatted according to the locale, there are two
# cases to consider (note the decimal separators):
# (1) $12,345,678.12
# (2) $12.345.678,12
# Negative values are represented as follows:
# (3) -$2.55
# (4) ($2.55)
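# For illustration, both "$12,345,678.12" and "$12.345.678,12" end up cast to
# BigDecimal("12345678.12"), and "($2.55)" casts to BigDecimal("-2.55").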
value = value.sub(/^\((.+)\)$/, '-\1') # (4)
case value
when /^-?\D*+[\d,]+\.\d{2}$/ # (1)
value.gsub!(/[^-\d.]/, "")
when /^-?\D*+[\d.]+,\d{2}$/ # (2)
value.gsub!(/[^-\d,]/, "").sub!(/,/, ".")
end
super(value)
end
end
end
end
end
end
# frozen_string_literal: true
require "rails/generators/active_record"
module ActiveRecord
module Generators # :nodoc:
class MultiDbGenerator < ::Rails::Generators::Base # :nodoc:
source_root File.expand_path("templates", __dir__)
def create_multi_db
filename = "multi_db.rb"
template filename, "config/initializers/#{filename}"
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Visitors
class MySQL < Arel::Visitors::ToSql
private
def visit_Arel_Nodes_Bin(o, collector)
collector << "BINARY "
visit o.expr, collector
end
def visit_Arel_Nodes_UnqualifiedColumn(o, collector)
visit o.expr, collector
end
###
# :'(
# To retrieve all rows from a certain offset up to the end of the result set,
# you can use some large number for the second parameter.
# https://dev.mysql.com/doc/refman/en/select.html
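# (The literal 18446744073709551615 used below is 2**64 - 1, the largest
# unsigned 64-bit value and the biggest number MySQL accepts for LIMIT.)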
def visit_Arel_Nodes_SelectStatement(o, collector)
if o.offset && !o.limit
o.limit = Arel::Nodes::Limit.new(18446744073709551615)
end
super
end
def visit_Arel_Nodes_SelectCore(o, collector)
o.froms ||= Arel.sql("DUAL")
super
end
def visit_Arel_Nodes_Concat(o, collector)
collector << " CONCAT("
visit o.left, collector
collector << ", "
visit o.right, collector
collector << ") "
collector
end
def visit_Arel_Nodes_IsNotDistinctFrom(o, collector)
collector = visit o.left, collector
collector << " <=> "
visit o.right, collector
end
def visit_Arel_Nodes_IsDistinctFrom(o, collector)
collector << "NOT "
visit_Arel_Nodes_IsNotDistinctFrom o, collector
end
def visit_Arel_Nodes_Regexp(o, collector)
infix_value o, collector, " REGEXP "
end
def visit_Arel_Nodes_NotRegexp(o, collector)
infix_value o, collector, " NOT REGEXP "
end
# no-op
def visit_Arel_Nodes_NullsFirst(o, collector)
visit o.expr, collector
end
# In the simple case, MySQL allows us to place JOINs directly into the UPDATE
# query. However, this does not allow for LIMIT, OFFSET and ORDER. To support
# these, we must use a subquery.
def prepare_update_statement(o)
if o.offset || has_group_by_and_having?(o) ||
has_join_sources?(o) && has_limit_or_offset_or_orders?(o)
super
else
o
end
end
alias :prepare_delete_statement :prepare_update_statement
# MySQL doesn't automatically materialize the subquery as a temporary table, so we have
# to give it some prompting in the form of a subsubquery.
def build_subselect(key, o)
subselect = super
# Materialize subquery by adding distinct
# to work with MySQL 5.7.6 which sets optimizer_switch='derived_merge=on'
unless has_limit_or_offset_or_orders?(subselect)
core = subselect.cores.last
core.set_quantifier = Arel::Nodes::Distinct.new
end
Nodes::SelectStatement.new.tap do |stmt|
core = stmt.cores.last
core.froms = Nodes::Grouping.new(subselect).as("__active_record_temp")
core.projections = [Arel.sql(quote_column_name(key.name))]
end
end
end
end
end
# frozen_string_literal: true
require "active_record/connection_adapters/abstract_mysql_adapter"
require "active_record/connection_adapters/mysql/database_statements"
gem "mysql2", "~> 0.5"
require "mysql2"
module ActiveRecord
module ConnectionHandling # :nodoc:
# Establishes a connection to the database that's used by all Active Record objects.
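#
# A hedged configuration sketch (host, credentials, and database name are illustrative);
# establish_connection with adapter "mysql2" ends up calling this method:
#
#   ActiveRecord::Base.establish_connection(
#     adapter:  "mysql2",
#     host:     "localhost",
#     username: "root",
#     password: "",
#     database: "app_development"
#   )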
def mysql2_connection(config)
config = config.symbolize_keys
config[:flags] ||= 0
if config[:flags].kind_of? Array
config[:flags].push "FOUND_ROWS"
else
config[:flags] |= Mysql2::Client::FOUND_ROWS
end
ConnectionAdapters::Mysql2Adapter.new(
ConnectionAdapters::Mysql2Adapter.new_client(config),
logger,
nil,
config,
)
end
end
module ConnectionAdapters
class Mysql2Adapter < AbstractMysqlAdapter
ER_BAD_DB_ERROR = 1049
ER_ACCESS_DENIED_ERROR = 1045
ER_CONN_HOST_ERROR = 2003
ER_UNKNOWN_HOST_ERROR = 2005
ADAPTER_NAME = "Mysql2"
include MySQL::DatabaseStatements
class << self
def new_client(config)
Mysql2::Client.new(config)
rescue Mysql2::Error => error
if error.error_number == ConnectionAdapters::Mysql2Adapter::ER_BAD_DB_ERROR
raise ActiveRecord::NoDatabaseError.db_error(config[:database])
elsif error.error_number == ConnectionAdapters::Mysql2Adapter::ER_ACCESS_DENIED_ERROR
raise ActiveRecord::DatabaseConnectionError.username_error(config[:username])
elsif [ConnectionAdapters::Mysql2Adapter::ER_CONN_HOST_ERROR, ConnectionAdapters::Mysql2Adapter::ER_UNKNOWN_HOST_ERROR].include?(error.error_number)
raise ActiveRecord::DatabaseConnectionError.hostname_error(config[:host])
else
raise ActiveRecord::ConnectionNotEstablished, error.message
end
end
end
def initialize(connection, logger, connection_options, config)
check_prepared_statements_deprecation(config)
superclass_config = config.reverse_merge(prepared_statements: false)
super(connection, logger, connection_options, superclass_config)
end
def self.database_exists?(config)
!!ActiveRecord::Base.mysql2_connection(config)
rescue ActiveRecord::NoDatabaseError
false
end
def supports_json?
!mariadb? && database_version >= "5.7.8"
end
def supports_comments?
true
end
def supports_comments_in_create?
true
end
def supports_savepoints?
true
end
def savepoint_errors_invalidate_transactions?
true
end
def supports_lazy_transactions?
true
end
# HELPER METHODS ===========================================
def each_hash(result, &block) # :nodoc:
if block_given?
result.each(as: :hash, symbolize_keys: true, &block)
else
to_enum(:each_hash, result)
end
end
def error_number(exception)
exception.error_number if exception.respond_to?(:error_number)
end
#--
# QUOTING ==================================================
#++
def quote_string(string)
@raw_connection.escape(string)
rescue Mysql2::Error => error
raise translate_exception(error, message: error.message, sql: "<escape>", binds: [])
end
#--
# CONNECTION MANAGEMENT ====================================
#++
def active?
@raw_connection.ping
end
def reconnect!(restore_transactions: false)
@lock.synchronize do
@raw_connection.close
connect
super
end
end
alias :reset! :reconnect!
# Disconnects from the database if already connected.
# Otherwise, this method does nothing.
def disconnect!
super
@raw_connection.close
end
def discard! # :nodoc:
super
@raw_connection.automatic_close = false
@raw_connection = nil
end
private
def check_prepared_statements_deprecation(config)
if !config.key?(:prepared_statements)
ActiveSupport::Deprecation.warn(<<-MSG.squish)
The default value of `prepared_statements` for the mysql2 adapter will be changed from +false+ to +true+ in Rails 7.2.
MSG
end
end
def connect
@raw_connection = self.class.new_client(@config)
end
def configure_connection
@raw_connection.query_options[:as] = :array
super
end
def full_version
schema_cache.database_version.full_version_string
end
def get_full_version
@raw_connection.server_info[:version]
end
def translate_exception(exception, message:, sql:, binds:)
if exception.is_a?(Mysql2::Error::TimeoutError) && !exception.error_number
ActiveRecord::AdapterTimeout.new(message, sql: sql, binds: binds)
else
super
end
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
module Tasks # :nodoc:
class MySQLDatabaseTasks # :nodoc:
ER_DB_CREATE_EXISTS = 1007
delegate :connection, :establish_connection, to: ActiveRecord::Base
def self.using_database_configurations?
true
end
def initialize(db_config)
@db_config = db_config
@configuration_hash = db_config.configuration_hash
end
def create
establish_connection(configuration_hash_without_database)
connection.create_database(db_config.database, creation_options)
establish_connection(db_config)
end
def drop
establish_connection(db_config)
connection.drop_database(db_config.database)
end
def purge
establish_connection(db_config)
connection.recreate_database(db_config.database, creation_options)
end
def charset
connection.charset
end
def collation
connection.collation
end
def structure_dump(filename, extra_flags)
args = prepare_command_options
args.concat(["--result-file", "#{filename}"])
args.concat(["--no-data"])
args.concat(["--routines"])
args.concat(["--skip-comments"])
ignore_tables = ActiveRecord::SchemaDumper.ignore_tables
if ignore_tables.any?
args += ignore_tables.map { |table| "--ignore-table=#{db_config.database}.#{table}" }
end
args.concat([db_config.database.to_s])
args.unshift(*extra_flags) if extra_flags
run_cmd("mysqldump", args, "dumping")
end
def structure_load(filename, extra_flags)
args = prepare_command_options
args.concat(["--execute", %{SET FOREIGN_KEY_CHECKS = 0; SOURCE #{filename}; SET FOREIGN_KEY_CHECKS = 1}])
args.concat(["--database", db_config.database.to_s])
args.unshift(*extra_flags) if extra_flags
run_cmd("mysql", args, "loading")
end
private
attr_reader :db_config, :configuration_hash
def configuration_hash_without_database
configuration_hash.merge(database: nil)
end
def creation_options
Hash.new.tap do |options|
options[:charset] = configuration_hash[:encoding] if configuration_hash.include?(:encoding)
options[:collation] = configuration_hash[:collation] if configuration_hash.include?(:collation)
end
end
def prepare_command_options
args = {
host: "--host",
port: "--port",
socket: "--socket",
username: "--user",
password: "--password",
encoding: "--default-character-set",
sslca: "--ssl-ca",
sslcert: "--ssl-cert",
sslcapath: "--ssl-capath",
sslcipher: "--ssl-cipher",
sslkey: "--ssl-key"
}.filter_map { |opt, arg| "#{arg}=#{configuration_hash[opt]}" if configuration_hash[opt] }
args
end
def run_cmd(cmd, args, action)
fail run_cmd_error(cmd, args, action) unless Kernel.system(cmd, *args)
end
def run_cmd_error(cmd, args, action)
msg = +"failed to execute: `#{cmd}`\n"
msg << "Please check the output above for any errors and make sure that `#{cmd}` is installed in your PATH and has proper permissions.\n\n"
msg
end
end
end
end
# frozen_string_literal: true
module ActiveRecord
# = Active Record \Named \Scopes
module Scoping
module Named
extend ActiveSupport::Concern
module ClassMethods
# Returns an ActiveRecord::Relation scope object.
#
# posts = Post.all
# posts.size # Fires "select count(*) from posts" and returns the count
# posts.each {|p| puts p.name } # Fires "select * from posts" and loads post objects
#
# fruits = Fruit.all
# fruits = fruits.where(color: 'red') if options[:red_only]
# fruits = fruits.limit(10) if limited?
#
# You can define a scope that applies to all finders using
# {default_scope}[rdoc-ref:Scoping::Default::ClassMethods#default_scope].
def all
scope = current_scope
if scope
if self == scope.klass
scope.clone
else
relation.merge!(scope)
end
else
default_scoped
end
end
def scope_for_association(scope = relation) # :nodoc:
if current_scope&.empty_scope?
scope
else
default_scoped(scope)
end
end
# Returns a scope for the model with default scopes.
def default_scoped(scope = relation, all_queries: nil)
build_default_scope(scope, all_queries: all_queries) || scope
end
def default_extensions # :nodoc:
if scope = scope_for_association || build_default_scope
scope.extensions
else
[]
end
end
# Adds a class method for retrieving and querying objects.
# The method is intended to return an ActiveRecord::Relation
# object, which is composable with other scopes.
# If it returns +nil+ or +false+, an
# {all}[rdoc-ref:Scoping::Named::ClassMethods#all] scope is returned instead.
#
# A \scope represents a narrowing of a database query, such as
# <tt>where(color: :red).select('shirts.*').includes(:washing_instructions)</tt>.
#
# class Shirt < ActiveRecord::Base
# scope :red, -> { where(color: 'red') }
# scope :dry_clean_only, -> { joins(:washing_instructions).where('washing_instructions.dry_clean_only = ?', true) }
# end
#
# The above calls to #scope define class methods <tt>Shirt.red</tt> and
# <tt>Shirt.dry_clean_only</tt>. <tt>Shirt.red</tt>, in effect,
# represents the query <tt>Shirt.where(color: 'red')</tt>.
#
# Note that this is simply 'syntactic sugar' for defining an actual
# class method:
#
# class Shirt < ActiveRecord::Base
# def self.red
# where(color: 'red')
# end
# end
#
# Unlike <tt>Shirt.find(...)</tt>, however, the object returned by
# <tt>Shirt.red</tt> is not an Array but an ActiveRecord::Relation,
# which is composable with other scopes; it resembles the association object
# constructed by a {has_many}[rdoc-ref:Associations::ClassMethods#has_many]
# declaration. For instance, you can invoke <tt>Shirt.red.first</tt>, <tt>Shirt.red.count</tt>,
# <tt>Shirt.red.where(size: 'small')</tt>. Also, just as with the
# association objects, named \scopes act like an Array, implementing
# Enumerable; <tt>Shirt.red.each(&block)</tt>, <tt>Shirt.red.first</tt>,
# and <tt>Shirt.red.inject(memo, &block)</tt> all behave as if
# <tt>Shirt.red</tt> really was an array.
#
# These named \scopes are composable. For instance,
# <tt>Shirt.red.dry_clean_only</tt> will produce all shirts that are
# both red and dry clean only. Nested finds and calculations also work
# with these compositions: <tt>Shirt.red.dry_clean_only.count</tt>
# returns the number of garments for which these criteria obtain.
# Similarly with <tt>Shirt.red.dry_clean_only.average(:thread_count)</tt>.
#
# All scopes are available as class methods on the ActiveRecord::Base
# descendant upon which the \scopes were defined. But they are also
# available to {has_many}[rdoc-ref:Associations::ClassMethods#has_many]
# associations. If,
#
# class Person < ActiveRecord::Base
# has_many :shirts
# end
#
# then <tt>elton.shirts.red.dry_clean_only</tt> will return all of
# Elton's red, dry clean only shirts.
#
# \Named scopes can also have extensions, just as with
# {has_many}[rdoc-ref:Associations::ClassMethods#has_many] declarations:
#
# class Shirt < ActiveRecord::Base
# scope :red, -> { where(color: 'red') } do
# def dom_id
# 'red_shirts'
# end
# end
# end
#
# Scopes can also be used while creating/building a record.
#
# class Article < ActiveRecord::Base
# scope :published, -> { where(published: true) }
# end
#
# Article.published.new.published # => true
# Article.published.create.published # => true
#
# \Class methods on your model are automatically available
# on scopes. Assuming the following setup:
#
# class Article < ActiveRecord::Base
# scope :published, -> { where(published: true) }
# scope :featured, -> { where(featured: true) }
#
# def self.latest_article
# order('published_at desc').first
# end
#
# def self.titles
# pluck(:title)
# end
# end
#
# We are able to call the methods like this:
#
# Article.published.featured.latest_article
# Article.featured.titles
def scope(name, body, &block)
unless body.respond_to?(:call)
raise ArgumentError, "The scope body needs to be callable."
end
if dangerous_class_method?(name)
raise ArgumentError, "You tried to define a scope named \"#{name}\" " \
"on the model \"#{self.name}\", but Active Record already defined " \
"a class method with the same name."
end
if method_defined_within?(name, Relation)
raise ArgumentError, "You tried to define a scope named \"#{name}\" " \
"on the model \"#{self.name}\", but ActiveRecord::Relation already defined " \
"an instance method with the same name."
end
extension = Module.new(&block) if block
if body.respond_to?(:to_proc)
singleton_class.define_method(name) do |*args|
scope = all._exec_scope(*args, &body)
scope = scope.extending(extension) if extension
scope
end
else
singleton_class.define_method(name) do |*args|
scope = body.call(*args) || all
scope = scope.extending(extension) if extension
scope
end
end
singleton_class.send(:ruby2_keywords, name)
generate_relation_method(name)
end
private
def singleton_method_added(name)
generate_relation_method(name) if Kernel.respond_to?(name) && !ActiveRecord::Relation.method_defined?(name)
end
end
end
end
end
# frozen_string_literal: true
module Arel # :nodoc: all
module Nodes
class NamedFunction < Arel::Nodes::Function
attr_accessor :name
def initialize(name, expr, aliaz = nil)
super(expr, aliaz)
@name = name
end
def hash
super ^ @name.hash
end
def eql?(other)
super && self.name == other.name
end
alias :== :eql?
end
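# A brief usage sketch (the users table and its name column are hypothetical):
#
#   length = Arel::Nodes::NamedFunction.new("LENGTH", [Arel::Table.new(:users)[:name]])
#   # Renders as LENGTH("users"."name"); exact quoting depends on the adapter.
#   # Passing a third argument, e.g. NamedFunction.new("LENGTH", [col], "name_length"),
#   # adds an AS alias when the node is visited.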
end
end
# frozen_string_literal: true
require "active_support/core_ext/hash/except"
require "active_support/core_ext/module/redefine_method"
require "active_support/core_ext/hash/indifferent_access"
module ActiveRecord
module NestedAttributes # :nodoc:
class TooManyRecords < ActiveRecordError
end
extend ActiveSupport::Concern
included do
class_attribute :nested_attributes_options, instance_writer: false, default: {}
end
# = Active Record Nested Attributes
#
# Nested attributes allow you to save attributes on associated records
# through the parent. By default nested attribute updating is turned off
# and you can enable it using the accepts_nested_attributes_for class
# method. When you enable nested attributes an attribute writer is
# defined on the model.
#
# The attribute writer is named after the association, which means that
# in the following example, two new methods are added to your model:
#
# <tt>author_attributes=(attributes)</tt> and
# <tt>pages_attributes=(attributes)</tt>.
#
# class Book < ActiveRecord::Base
# has_one :author
# has_many :pages
#
# accepts_nested_attributes_for :author, :pages
# end
#
# Note that the <tt>:autosave</tt> option is automatically enabled on every
# association that accepts_nested_attributes_for is used for.