Run this script from the root of your Redmine installation to convert from a ChiliProject installation.
# encoding: UTF-8
# Chiliproject to Redmine converter
# =================================
# This script takes an existing Chiliproject database and
# converts it to be compatible with Redmine (>= v2.3). The
# database is converted in such a way that it can be run multiple
# times against a production Chiliproject install without
# interfering with its operation. This is done by duplicating
# the entire Chiliproject database into a new database for
# Redmine. All conversions, transformations and adjustments are
# then performed on the new database without touching the
# chiliproject production database in any way.
# = Requirements
# * Ruby >= 1.9.3 (this was developed and run using Ruby 2.0)
# * database user has permissions to create/drop databases
# * database user has access to both chiliproject and redmine database
# * Redmine has been setup and fully configured for use
# = Notes
# This fork additionally tries to migrate the wiki history.
# It successfully ran a migration of a ChiliProject 2.11.0 ->
# Redmine 2.1 installation with >16000 issues and >200 wiki
# pages, using Ruby 2.0. Upgrading to Redmine 3.1.2 afterwards
# went fine. Before this migration we had already migrated from
# Trac to Redmine to ChiliProject, so the process has proved
# quite stable.
# == History conversion
# If you previously converted from Redmine to Chiliproject you
# likely have existing issue journal history in the
# `journal_details` table. If so, enter the greatest journal_id
# from this table in the 'journal_start_id' config option. This
# will cause the script to only update history created AFTER the
# transition to Chiliproject. If you did not previously convert,
# then you can leave the value of this option at '0' to convert
# everything. (This fork determines the value automatically by
# querying MAX(journal_id) from journal_details; see below.)
# == Serialization
# If you run this under Ruby 1.9 you may experience issues related
# to the deserialization of the "changes" column from Chiliproject.
# Ruby 2.0 switched to Psych from Syck for YAML serialization. In
# Ruby 1.9 you could configure which YAML encoder to use. In 2.0 you
# cannot. Therefore the Syck gem is included to prevent
# deserialization errors.
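#
# For illustration, deserializing one ChiliProject `changes` value looks
# like this (a sketch with made-up data; the engine is Syck, loaded below):
#
#   changes = Syck.load("---\nsubject:\n- old\n- new\n")
#   changes  # => { "subject" => ["old", "new"] }
#
# Psych, the only engine bundled with Ruby >= 2.0, can fail to parse some
# Syck-emitted values, which is why the syck gem is required here.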
# == Disclaimer
# This software is provided as-is with no warranty whatsoever. Use at
# your own risk! The developer is not responsible for any damages/
# corruption which may occur to your system.
require 'rubygems'
require 'mysql2'
require 'syck'
# configuration
redmine_db = 'redmine_production'
journal_start_id = 0
chili_db = 'chiliproject_production'
config = {
  encoding: 'utf8',
  username: '',
  password: '',
  host: ''
}

puts "====== Migrating #{chili_db} to #{redmine_db} ======"

# initialize connection to the server
client = Mysql2::Client.new(config)
# See comment on journal_start_id above.
puts "== Determine chili's journal_start_id"
result = client.query("SELECT MAX(journal_id) FROM #{chili_db}.journal_details")
journal_start_id = result.to_a.first.to_a.first.last || 0
puts "journal_start_id is #{journal_start_id}"
# Creates a new redmine database. Any existing
# redmine DB will be dropped and recreated.
puts "== Setup the Redmine database"
puts " -> Create"
client.query("DROP DATABASE IF EXISTS `#{redmine_db}`")
client.query("CREATE DATABASE `#{redmine_db}`")
client.query("alter database #{redmine_db} DEFAULT CHARACTER SET utf8 collate utf8_general_ci")
client.query("USE `#{chili_db}`")
# Get the full Table list from the chiliproject DB and
# copy each table to the redmine database, making sure
# each table is set to UTF-8 encoding. Follow this by
# copying all the data over
client.query('SHOW TABLES', as: :array).each do |tbl|
  puts " -> Copying table #{tbl.first}"
  tbl_create = client.query("SHOW CREATE TABLE `#{chili_db}`.`#{tbl.first}`", as: :array).first[1]
  tbl_create.gsub!('CREATE TABLE ', "CREATE TABLE `#{redmine_db}`.")
  client.query(tbl_create)
  # Convert the copy (not the production table) to UTF-8
  client.query("ALTER TABLE `#{redmine_db}`.`#{tbl.first}` CONVERT TO CHARACTER SET utf8")
  client.query("INSERT INTO `#{redmine_db}`.`#{tbl.first}` SELECT * FROM `#{chili_db}`.`#{tbl.first}`")
end
# switch to the redmine database for the remainder of
# the script
client.query("USE `#{redmine_db}`")
puts "== Pre-migrations alter queries"
puts " -> Updating journals created_on"
query = <<-SQL
  ALTER TABLE journals
  CHANGE COLUMN created_at created_on DATETIME,
  CHANGE COLUMN journaled_id journalized_id INTEGER(11),
  CHANGE COLUMN activity_type journalized_type VARCHAR(255);
SQL
client.query(query)
# Need to ensure the new database is up to date with the
# Redmine database migrations.
puts "== Redmine migrations"
`RAILS_ENV=production rake db:migrate`
# Modify the imported Chiliproject tables to be Redmine
# compatible
puts "== Post-migrations alter queries"
puts " -> Updating wiki_contents"
query = <<-SQL
  ALTER TABLE wiki_contents
  ADD comments VARCHAR(250) NULL,
  CHANGE COLUMN lock_version version INTEGER(11);
SQL
client.query(query)
puts " -> Updating journals.journalized_type"
query = <<-SQL
  UPDATE journals SET journalized_type='Issue'
  WHERE journalized_type='issues';
SQL
client.query(query)
# Prior to converting history preserve the chili data in a new
# column
puts " -> Updating journal columns"
query = <<-SQL
  ALTER TABLE journals
  CHANGE COLUMN changes changes_chili TEXT NULL;
SQL
client.query(query)
# Chili stores changes as a serialized column, Redmine has a
# row entry for each change in a separate table. This section of
# code converts them. If you previously converted from Redmine
# to ChiliProject you will already have data in the `journal_details`
# table. Use the journal_start_id to only translate data created
# after the conversion.
# How this works
# 1) Read the chiliproject changes column
# 2) Unserialize the column from YAML using Syck
# 3) Iterate through the keys and build update SQL values strings
# 4) Every 5,000 entries do an insert to the `journal_details` table
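# For example, a single serialized `changes` value unpacks and maps to
# journal_details rows like this (illustrative data; `attachments_5` is a
# made-up key):
#
#   Syck.load("---\nstatus_id:\n- 1\n- 2\nattachments_5:\n- \n- screenshot.png\n")
#   # => { "status_id" => [1, 2], "attachments_5" => [nil, "screenshot.png"] }
#
# which becomes two VALUES tuples:
#   (journal_id, 'attr',       'status_id', '1', '2')
#   (journal_id, 'attachment', '5',         '',  'screenshot.png')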
puts "== Converting journal history"
data = []
client.query("USE `#{redmine_db}`")
# We remove old entries from wiki_content_versions (except version 1), because these
# entries will be rebuilt from the "journals" table. They might be left over from
# a former Redmine installation, which was migrated to chili.
client.query("DELETE FROM wiki_content_versions WHERE version > 1")
results = client.query("SELECT id, changes_chili, journalized_id, journalized_type, version, user_id, notes, created_on FROM journals WHERE id > #{journal_start_id} AND version > 1")
results.each do |j|
  next if j['changes_chili'].nil?
  journalized_type = j['journalized_type']
  journal_id = j["id"]
  changes_chili = j['changes_chili']

  # Skip a specific record whose serialized changes are known to be broken
  # on this data set.
  if changes_chili.index("{{{")
    next if journal_id == 82076
  end

  begin
    # If this is a "wiki_edits" entry, copy it into the wiki_content_versions table.
    if journalized_type == "wiki_edits"
      user_id = j["user_id"]
      notes = client.escape j["notes"].to_s[0..240]
      created_at = j["created_on"]
      journalized_id = j["journalized_id"]
      version = j["version"]
      changes_chili = Syck.load(j['changes_chili'])
      wiki_compression = client.escape changes_chili["compression"]
      wiki_data = client.escape changes_chili["data"]
      sql = <<-SQL
        INSERT INTO wiki_content_versions
        (id, wiki_content_id, page_id, version, author_id, comments, compression, data, updated_on)
        VALUES
        (NULL, #{journalized_id}, #{journalized_id}, #{version}, #{user_id}, '#{notes}', '#{wiki_compression}', '#{wiki_data}', '#{created_at}')
      SQL
      client.query(sql)
    else
      unpacked_record = Syck.load(j['changes_chili'])
      unpacked_record.each do |key, v|
        case key
        when /\Aattachments/
          property = 'attachment'
          prop_key = key.gsub(/[^0-9]/,'')
        when /\Acustom_values/
          property = 'cf'
          prop_key = key.gsub(/[^0-9]/,'')
        else
          property = 'attr'
          prop_key = key
        end
        old_value = v[0].is_a?(String) ? client.escape(v[0]) : v[0]
        new_value = v[1].is_a?(String) ? client.escape(v[1]) : v[1]
        data << "(#{j['id']},'#{property}','#{prop_key}','#{old_value}','#{new_value}')".force_encoding('UTF-8')
      end
      if data.size >= 5_000
        puts " -> Inserting journal details batch"
        client.query("INSERT INTO `journal_details` (`journal_id`, `property`, `prop_key`, `old_value`, `value`) VALUES #{data.join(',')}")
        data = []
      end
    end
  rescue => e
    puts " *** Could not parse changes for Journal #{j['id']} (#{e.class}: #{e.message} #{e.backtrace.first})"
  end
end

# Flush any remaining batched journal_details rows.
unless data.empty?
  puts " -> Inserting journal details batch"
  client.query("INSERT INTO `journal_details` (`journal_id`, `property`, `prop_key`, `old_value`, `value`) VALUES #{data.join(',')}")
end
# == Fix wiki content version numbers:
# In our situation we have mismatching version counts on wiki_contents. This
# might be due to the fact that we migrated from very old redmine to old
# chili years ago.
query = <<-SQL
  UPDATE wiki_contents
  JOIN (
    SELECT page_id, max(version) AS version
    FROM wiki_content_versions
    GROUP BY page_id
  ) as wiki_content_versions on wiki_contents.page_id = wiki_content_versions.page_id
  SET wiki_contents.version = wiki_content_versions.version;
SQL
client.query(query)
# After history has been converted, clean up the history and
# drop the now unnecessary columns
puts "== Journal history data cleanup"
puts " -> Clearing empty rows"
query = <<-SQL
  delete from journals where (notes is null or notes = '')
  and changes_chili is not null and not exists (
    select 1 from journal_details x where x.journal_id = journals.id
  );
SQL
client.query(query)
puts " -> Dropping unnecessary columns"
query = <<-SQL
  ALTER TABLE journals
  DROP COLUMN changes_chili;
SQL
client.query(query)
puts "\n== Done"