Fast Sprockets
require 'digest/md5'
# The AssetFingerprinter calculates a digest hash of all the app's assets. It is
# used to avoid precompiling assets if none of them have changed since the last
# precompilation. This is done by generating a list of all assets from the list
# of configured asset paths and throwing all their contents through a digest
# algorithm. The result is a unique asset "fingerprint" that can be compared to
# other fingerprints to determine if any assets have changed.
#
# The fingerprinter also allows adding any additional files that should be used
# in the digest calculation. There are two kinds of additional file: plain
# files and assets. Plain files are digested verbatim. Asset files are looked
# up via Sprockets' find_asset method, which means they get precompiled and the
# precompiled contents digested. It should be exceedingly rare to need to add
# asset files. They are only required when the contents of an asset depend on
# some outside content. For example, the
# i18n-js-assets gem dynamically generates assets whose content is determined
# by Rails' i18n yaml files. The asset files themselves may not change, but
# their precompiled contents may change depending on changes made to the i18n
# yaml files they depend on. In such a case, the precompiled contents must be
# used to compute the fingerprint - the asset content alone isn't enough.
#
# - To specify another file to digest, use the add_file method.
# - To specify an asset whose precompiled contents should be digested, use the
#   add_asset method.
#
# A short usage sketch appears after the class definition below.
#
class AssetFingerprinter
  attr_reader :app, :digest_class, :additional_files, :additional_assets

  def initialize(app, digest_class = Digest::MD5)
    @app = app
    @digest_class = digest_class
    @additional_files = []
    @additional_assets = []
  end

  # file is a string path
  def add_file(file)
    additional_files << file
    @fingerprint = nil
  end

  # asset is a string path
  def add_asset(asset)
    additional_assets << asset
    @fingerprint = nil
  end

  def fingerprint
    @fingerprint ||= begin
      digest = digest_class.new

      # digest all files
      digest_contents(all_possible_files, digest)
      digest_contents(collect_files(additional_files), digest)

      # digest "source" files, which are files whose contents must be
      # determined via Sprockets' find_asset method
      digest_contents(collect_assets(additional_assets), digest)

      # return final digest string
      digest.hexdigest
    end
  end

  private

  def all_possible_files
    # generate a list of all possible assets, filter out directories
    collect_files(
      assets.paths.flat_map do |p|
        Dir.glob(File.join(p, '**', '**')).select { |f| File.file?(f) }
      end
    )
  end

  def collect_files(file_list)
    file_list.each_with_object({}) do |file, ret|
      contents = File.read(file, mode: 'rb')
      ret[digest_class.hexdigest(contents)] = contents
    end
  end

  def collect_assets(asset_list)
    asset_list.each_with_object({}) do |asset, ret|
      contents = assets.find_asset(asset).source
      ret[digest_class.hexdigest(contents)] = contents
    end
  end

  def digest_contents(file_hash, digest)
    file_hash.keys.sort.each do |file_digest|
      digest << file_hash[file_digest]
    end
  end

  def config
    app.config
  end

  def assets
    app.assets
  end
end
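
# Usage sketch (not part of the original gist): a minimal example of wiring the
# fingerprinter up from a rake task or initializer so precompilation only runs
# when something has changed. The cache file path and the i18n-js asset name
# below are hypothetical and only illustrative.
#
#   fingerprinter = AssetFingerprinter.new(Rails.application)
#
#   # extra inputs that influence precompiled output
#   fingerprinter.add_file(Rails.root.join('config', 'locales', 'en.yml').to_s)
#   fingerprinter.add_asset('i18n/translations.js')
#
#   # only precompile when the fingerprint differs from the last recorded one
#   cache_file = Rails.root.join('tmp', 'asset_fingerprint.txt')
#   previous = File.exist?(cache_file) ? File.read(cache_file) : nil
#
#   if previous != fingerprinter.fingerprint
#     Rake::Task['assets:precompile'].invoke
#     File.write(cache_file, fingerprinter.fingerprint)
#   end
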
require 'benchmark'
require 'parallel'
require 'json'
# This monkeypatch is designed to compile assets concurrently. It uses the
# Parallel gem to fork off a number of child processes which call Sprockets'
# built-in Manifest#compile method for each logical asset path. If you start
# seeing errors or other issues caused by this code, you can safely delete this
# file. Note that we tried to use sprockets-derailleur for this task, but it
# wasn't ready for Rails 4.
module Sprockets
  class ParallelCompiler
    attr_reader :manifest

    def initialize(manifest)
      @manifest = manifest
    end

    def compile(*args)
      logger.warn "Precompiling with #{worker_count} workers"

      time = Benchmark.measure do
        results = compile_in_parallel(find_precompile_paths(*args))
        write_manifest(results)
      end

      logger.info "Completed precompiling assets (#{time.real.round(2)}s)"
    end

    private

    def write_manifest(results)
      File.write(manifest.filename, results.to_json)
    end

    def compile_in_parallel(paths)
      flatten_precomp_results(
        Parallel.map(paths, in_processes: worker_count) do |path|
          manifest.compile_without_parallelism([path])

          { 'files' => {}, 'assets' => {} }.tap do |data|
            manifest.find([path]) do |asset|
              logger.info("Writing #{asset.digest_path}")
              data['files'][asset.digest_path] = properties_for(asset)
              data['assets'][asset.logical_path] = asset.digest_path

              if alias_logical_path = manifest.class.compute_alias_logical_path(asset.logical_path)
                data['assets'][alias_logical_path] = asset.digest_path
              end
            end
          end
        end
      )
    end

    def flatten_precomp_results(results)
      results.each_with_object({}) do |result, ret|
        result.each_pair do |key, data|
          (ret[key] ||= {}).merge!(data)
        end
      end
    end

    def find_precompile_paths(*args)
      paths, filters = args.flatten.partition do |pre|
        manifest.class.simple_logical_path?(pre)
      end

      filters = filters.map do |filter|
        manifest.class.compile_match_filter(filter)
      end

      environment.logical_paths.each do |logical_path, filename|
        if filters.any? { |f| f.call(logical_path, filename) }
          paths << filename
        end
      end

      paths
    end

    def properties_for(asset)
      {
        'logical_path' => asset.logical_path,
        'mtime' => asset.mtime.iso8601,
        'size' => asset.bytesize,
        'digest' => asset.hexdigest,
      }
    end

    def worker_count
      @worker_count ||= ENV.fetch('SPROCKETS_WORKER_COUNT', 4).to_i
    end

    def environment
      manifest.environment
    end

    def logger
      manifest.send(:logger)
    end
  end

  # this is the actual monkeypatch
  class Manifest
    def compile_with_parallelism(*args)
      ParallelCompiler.new(self).compile(*args)
    end

    alias_method_chain :compile, :parallelism
  end
end
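
# Usage note (not part of the original gist): with this patch loaded before
# precompilation runs (for example from a file under config/initializers), the
# usual `rake assets:precompile` task picks up the parallel path automatically
# via the alias_method_chain above. The worker count can be tuned through the
# environment, e.g.:
#
#   SPROCKETS_WORKER_COUNT=8 bundle exec rake assets:precompile
#
# Driving the compiler directly is also possible; a minimal sketch, assuming a
# sprockets-rails setup where Rails.application.assets_manifest is available:
#
#   manifest = Rails.application.assets_manifest
#   manifest.compile('application.js', 'application.css')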