Gist rbirnie/4261855 (last active October 13, 2015 21:58)
#!/usr/bin/env ruby
# This script runs on remote puppetmasters whose node facts you wish to import into Foreman.
# It uploads every fact file newer than the last run, as tracked by a control file in the /tmp directory.
# The script can run from cron, e.g. once every minute.
# If you run it on many puppetmasters at the same time, consider adding something like:
#   sleep rand(10)
# so that not all puppetmasters hammer the database at once.
# Note: this requires Ruby 1.8.7+.

require 'fileutils'
require 'net/http'
require 'net/https'
require 'uri'
require 'thread'

# Puppet config dir (placeholder value; adjust to your environment)
puppetdir = "/var/lib/puppet"
# URL where Foreman lives (placeholder value; adjust to your environment)
url = "http://foreman"
# Temp file keeping the last run time
stat_file = "/tmp/foreman_fact_import"

# Fall back to one year ago if the control file does not exist yet
last_run = File.exists?(stat_file) ? File.stat(stat_file).mtime.utc : Time.now.utc - 365*24*60*60
facts = Dir["#{puppetdir}/yaml/facts/*.yaml"]

def process(filename, last_run, url)
  last_fact = File.stat(filename).mtime.utc
  if last_fact > last_run
    fact = File.read(filename)
    puts "Importing #{filename}"
    begin
      uri = URI.parse(url)
      http = Net::HTTP.new(uri.host, uri.port)
      if uri.scheme == 'https'
        http.use_ssl = true
        http.verify_mode = OpenSSL::SSL::VERIFY_NONE
      end
      req = Net::HTTP::Post.new("/fact_values/create?format=yml")
      req.set_form_data({'facts' => fact})
      response = http.request(req)
    rescue Exception => e
      raise "Could not send facts to Foreman: #{e}"
    end
  end
end

# Feed all fact files into a queue and drain it with a pool of 16 worker threads
queue = Queue.new
facts.each { |e| queue << e }

threads = []
16.times do
  threads << Thread.new do
    while (e = queue.pop(true) rescue nil)
      process(e, last_run, url)
    end
  end
end
threads.each { |t| t.join }
puts "All Threads Joined. Processing Done"
FileUtils.touch stat_file
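The worker loop relies on `Queue#pop(true)`: with a true argument, `pop` is non-blocking and raises `ThreadError` on an empty queue, which the inline `rescue nil` converts to `nil`, cleanly ending the loop. A minimal standalone sketch of that pattern (the values and thread count here are illustrative, not from the script):

```ruby
require 'thread'

queue = Queue.new
[1, 2, 3].each { |n| queue << n }

results = Queue.new  # collect results via a thread-safe queue as well
workers = 2.times.map do
  Thread.new do
    # pop(true) raises ThreadError when empty; `rescue nil` ends the loop
    while (n = queue.pop(true) rescue nil)
      results << n * 10
    end
  end
end
workers.each(&:join)

collected = []
collected << results.pop until results.empty?
puts collected.sort.inspect  # => [10, 20, 30]
```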
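The files the script uploads are the YAML fact documents Puppet writes under `#{puppetdir}/yaml/facts/`. For local inspection, a sketch using Ruby's YAML library; note that real fact files carry a `!ruby/object:Puppet::Node::Facts` tag and need Puppet loaded to deserialize fully, so the document below is a simplified stand-in with an invented node name and fact values:

```ruby
require 'yaml'

# Simplified stand-in for a Puppet fact file (real files are tagged
# !ruby/object:Puppet::Node::Facts; the names and values here are invented)
sample = <<-YAML
---
name: node1.example.com
values:
  operatingsystem: CentOS
  ipaddress: 192.168.1.10
YAML

parsed = YAML.safe_load(sample)
puts parsed["name"]           # => node1.example.com
puts parsed["values"].length  # => 2
```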