@colin
Forked from newbamboo/getting_started.markdown
Created March 13, 2009 20:27

Getting Started

Panda runs entirely within Amazon Web Services, so before continuing you will need to sign up for an account with access to the EC2, S3 and SimpleDB services.

If this is the first time you've used EC2, you'll need to ensure you've set up your certificates and local environment. Everything is described in Amazon's EC2 Getting Started Guide.

If you would like to run Panda on a different platform, please first follow the Local Installation Guide.

Launch the AMI

We've provided a base image for Panda that has all of the necessary software installed, including FFmpeg, Merb and Nginx.

ec2-run-instances ami-f837d091 -k KEYPAIRNAME

Download and configure Panda

Next, clone Panda to your local machine:

git clone git://github.com/newbamboo/panda.git
cd panda
git branch --track dm origin/dm
git checkout dm

Edit the deploy file, config/deploy.rb, adding your EC2 key location and instance address.
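As a rough sketch, the settings you're filling in look something like the following (the values are placeholders, and the exact variable names may differ slightly in the shipped config/deploy.rb — match whatever is already there):

    # Placeholder values -- substitute your own instance address and key path.
    set :host, "ec2-000-000-000-000.compute-1.amazonaws.com"
    set :user, "root"
    ssh_options[:keys] = ["/path/to/id_rsa-gsg-keypair"]

    # A single-instance setup plays every role.
    role :app, host
    role :web, host
    role :db,  host, :primary => true

{:lang=ruby}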

Copy the example config file and edit the options.

cd config
cp panda_init.rb.example panda_init.rb

Edit the config file, panda_init.rb. When using the Panda AMI, the api_key, upload_redirect_url and state_update_url are the most important config options. If you use S3 you will also need to create a bucket, as explained below.
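As an illustration of the kind of values involved (these are placeholders — check panda_init.rb.example for the exact option names and block syntax your version of Panda uses):

    # Illustrative values only -- verify against panda_init.rb.example.
    Panda::Config.use do |p|
      p[:api_key]             = "your-secret-api-key"
      p[:upload_redirect_url] = "http://your-app.com/videos/$id/done"
      p[:state_update_url]    = "http://your-app.com/videos/$id/status"
    end

{:lang=ruby}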

Now set up the mailer for errors and other notifications from Panda.

cp mailer.rb.example mailer.rb

The default sendmail options should work for most people; however, emails sent from an EC2 IP may end up in your spam folder. If that's the case, you can configure your own SMTP server (e.g. Gmail's).
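An SMTP configuration via Merb's mailer looks roughly like this (the credentials and server are placeholders — substitute your own):

    # Example SMTP settings for merb-mailer (placeholder credentials).
    Merb::Mailer.config = {
      :host => "smtp.gmail.com",
      :port => "587",
      :user => "you@gmail.com",
      :pass => "your-password",
      :auth => :plain
    }

{:lang=ruby}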

Create the deployment directory structure:

cap deploy:setup

Copy the config files to the EC2 instance:

cap copy_config_files_to_server

Finally, deploy Panda to your new instance and migrate the database:

cap deploy
cap deploy:automigrate

If you now visit your instance's address in your web browser you should see the Panda login screen. Before we can log in we must set up the admin user and encoding profiles as described below.

SSH into the instance

Next, SSH into your instance; all commands from now on should be run on the instance:

ssh -i YOURSSHKEYFILE root@HOST1111

Create S3 bucket

Next we'll need to create the S3 bucket referenced in the panda_init.rb file we just edited.

Go to the directory where Panda is deployed:

cd /var/www/panda/current

Log in to the Merb interactive console:

merb -e production -i

In the console, create the S3 bucket entered into your panda_init.rb config (unless it already exists):

Panda::Setup.create_s3_bucket

{:lang=ruby}

Then upload the default Flash player and swfobject script, which allow us to embed videos:

S3VideoObject.store('player.swf', open('public/player.swf'), :access => :public_read)
S3VideoObject.store('swfobject2.js', open('public/javascripts/swfobject2.js'), :access => :public_read)
S3VideoObject.store('expressInstall.swf', open('public/expressInstall.swf'), :access => :public_read)

{:lang=ruby}

Create first admin user

Panda provides an admin panel which allows you to see the encoding queue and details about all the videos that have been processed. A user must be created to allow us to log in to the admin panel.

Still logged into the Merb console from the previous step, run the following to create your admin user:

u = User.new
u.login = 'admin'
u.email = 'email@mydomain'
u.set_password('password')
u.save

{:lang=ruby}

Create an encoding profile

When a video is uploaded to Panda, it will create an encoding record for each profile and add these to the queue to be encoded. If you create several different profiles, every uploaded video will be encoded to all of them.

Panda supports pretty much any format FFmpeg does. There is some special handling of both FLV and h264 with AAC audio for Flash. If you need to add any special options, the actual FFmpeg commands run are defined in app/models/video.rb (the encode method is a good place to start).
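To give a feel for how profile fields turn into FFmpeg options, here is a hypothetical sketch of that mapping. This is not Panda's actual implementation (that lives in the encode method mentioned above); the `ffmpeg_command` helper and its option choices are illustrative assumptions only.

    # Hypothetical sketch: maps profile fields to an FFmpeg command line.
    # The real logic is in app/models/video.rb's encode method.
    def ffmpeg_command(profile, input, output)
      # Only pass -acodec when the profile specifies an audio codec.
      audio = profile[:audio_codec] ? "-acodec #{profile[:audio_codec]} " : ""
      "ffmpeg -i #{input} -b #{profile[:video_bitrate]}k " \
      "-ab #{profile[:audio_bitrate]}k #{audio}" \
      "-s #{profile[:width]}x#{profile[:height]} -r #{profile[:fps]} -y #{output}"
    end

    cmd = ffmpeg_command({:video_bitrate => 300, :audio_bitrate => 48,
                          :width => 320, :height => 240, :fps => 24},
                         "raw.mov", "out.flv")
    puts cmd

{:lang=ruby}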

Before continuing you will need to create at least one profile by running one of the following in the Merb console we're still logged into:

Profile.create!(:title => "Flash video SD",  :container => "flv", :video_bitrate => 300, :audio_bitrate => 48, :width => 320, :height => 240, :fps => 24, :position => 0, :player => "flash")

Profile.create!(:title => "Flash video HI",  :container => "flv", :video_bitrate => 400, :audio_bitrate => 48, :width => 480, :height => 360, :fps => 24, :position => 1, :player => "flash")

Profile.create!(:title => "Flash h264 SD",   :container => "mp4", :video_bitrate => 300, :audio_codec => "aac", :audio_bitrate => 48, :width => 320, :height => 240, :fps => 24, :position => 2, :player => "flash")

Profile.create!(:title => "Flash h264 HI",   :container => "mp4", :video_bitrate => 400, :audio_bitrate => 48, :audio_codec => "aac", :width => 480, :height => 360, :fps => 24, :position => 3, :player => "flash")

Profile.create!(:title => "Flash h264 480p", :container => "mp4", :video_bitrate => 600, :audio_bitrate => 48, :audio_codec => "aac", :width => 852, :height => 480, :fps => 24, :position => 4, :player => "flash")

{:lang=ruby}

Start Panda

We should now be set up to start the Panda application!

TODO: Monit for encoder and notifier

The encoder and notifier must also be launched:

/etc/init.d/panda-encoder start
/etc/init.d/panda-notifier start

Upload some videos!

Visit the admin panel at your instance's address (e.g. http://ec2-000-000-000-000.compute-1.amazonaws.com) and log in with the details of the user you created earlier. Click the "Upload a test video" link on the right hand side and a popup will show the video upload form. While the video uploads, a progress bar will track its progress. Once uploaded, the video will begin encoding and will show up in the videos list once complete.

Setup EBS for persistent MySQL storage

Since Panda stores the video metadata in MySQL, if this is a production environment you will want to attach an Elastic Block Store volume to the instance and move the MySQL data store there.

Steps taken from this guide: http://developer.amazonwebservices.com/connect/entry.jspa?externalID=1663

Find the zone of your Panda instance:

ec2-describe-instances i-IIII1111

Create an EBS volume in the same zone as your instance (in this example the zone is us-east-1a). By default I've used 1GB which should be enough for most setups:

ec2-create-volume -z us-east-1a -s 1
ec2-describe-volumes vol-VVVV1111

Once it's available, attach the EBS volume to the instance as /dev/sdh:

ec2-attach-volume -d /dev/sdh -i i-IIII1111 vol-VVVV1111

SSH into your instance now:

ssh -i YOURSSHKEYFILE root@HOST1111

Create an XFS file system on the EBS volume and mount it as /vol (see this AWS Forum post for how we avoid the kernel panic that xfs log version 2 causes on the Ubuntu Intrepid kernel we're using):

mkfs.xfs -l version=1 /dev/sdh

echo "/dev/sdh /vol xfs noatime 0 0" >> /etc/fstab

mkdir /vol
mount /vol

Move the MySQL storage to the EBS volume:

/etc/init.d/mysql stop

mkdir /vol/lib /vol/log
mv /var/lib/mysql /vol/lib/
mv /var/log/mysql /vol/log/

Tell MySQL to look on the mounted EBS volume for data files and binary logs (and the slow query log if you wish). Save the database configuration to the EBS volume for future use:

cat > /etc/mysql/conf.d/mysql-ec2.cnf <<EOM
[mysqld]
innodb_file_per_table
datadir          = /vol/lib/mysql
log_bin          = /vol/log/mysql/mysql-bin.log
max_binlog_size  = 1000M
#log_slow_queries = /vol/log/mysql/mysql-slow.log
#long_query_time  = 10
EOM

rsync -aR /etc/mysql /vol/

Restart the MySQL server:

/etc/init.d/mysql start

Next steps: Integrating Panda with your application

Now that you have a Panda service up and running, the next step is to integrate it with your web application so users can upload videos directly from there.

See the Integrating with Ruby on Rails guide if you're using Rails (or another similar Ruby framework). If you're using a different environment the API Documentation provides full instructions for integrating with the REST API.
