
- hosts: localhost
  vars:
    qs: [
      {'hardware_q_id': '1', 'logical_queue': ['1', '6-10', '12-15']},
      {'hardware_q_id': '2', 'logical_queue': ['2-5']},
      {'hardware_q_id': '3', 'logical_queue': ['7'], 'default': 'True'}]
    q: {}
  tasks:
    # The original snippet ends at set_fact; the assumed intent is to collect
    # the logical queues into q, keyed by hardware_q_id.
    - set_fact:
        q: "{{ q | combine({item.hardware_q_id: item.logical_queue}) }}"
      with_items: "{{ qs }}"
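
Assuming the playbook above is saved as queue-map.yml (the file name is hypothetical), it can be run locally with:

```
# 'hosts: localhost' uses Ansible's implicit local connection.
ansible-playbook queue-map.yml
```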
- Get the contrail-docker code from https://github.com/hkumarmk/contrail-docker (master).

- Build the images: you may use the build.sh script; it is fairly convenient as of now, though I made a few hardcoded defaults (like the contrail-install-packages URL) which probably only make sense in my environment. You can simply pass the appropriate parameters instead.

  Run the script with the appropriate positional parameters, `./build.sh <component> <package_url> <image_path>`, where `component` is one of the component names (config, control, database, lb, analytics), `package_url` is the web URL from which the contrail-install-packages package is downloaded, and `image_path` is a local directory to which the container image will be copied. An example invocation is sketched after this list.
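
A possible invocation of the build script, with placeholder values for the package URL and output directory:

```
# All values here are placeholders for illustration.
./build.sh config http://<artifact-server>/contrail-install-packages_3.1.0.0-50~liberty_all.deb /tmp/contrail-images
```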

## Current state
Currently we have containerized contrail-config. We have not added any provisioning/orchestration support yet; I just took a standard provisioned system, stopped all the config services, and started the config container on that node. I have done this on a single-node setup so far, but extending it to a multi-node setup is not going to be difficult.
I have a setup on the node 10.204.217.162 – you may log in to it and get a feel for it.
```
# Stopped supervisor-config on the host
service supervisor-config stop
```
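
Starting the config container on that node might look roughly like the following; the image name and run options are assumptions, not the exact command used on 10.204.217.162:

```
# Illustrative only: image name and options are placeholders.
docker run -d --net=host --name contrail-config contrail-config-liberty-3.1-50
```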
# Containerizing Contrail Services

## Changes in provisioning/deployment in the container world

There are two stages in the deployment of containerized contrail services.

- Build stage – in this stage, the packages are installed, reasonable default configurations are added, and a Docker image is produced. This happens during the artifact build on our servers, and the Docker image is produced as the artifact. The result is a ready-to-deploy container image that only requires customer-environment-specific configuration.
- Provisioning stage – in this stage, the artifact produced in the build stage is deployed in the customer environment, custom configuration and other operations are applied, and the container is started on the appropriate nodes (cfgm nodes in the case of contrail-config).

NOTE: Only configuration is done during the provisioning stage; no packages are installed at this point, so the images should be specific to a contrail version and an OpenStack build (e.g. contrail-config-liberty-3.1-50, contrail-config-mitaka-3.1.50).
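
A rough sketch of the two stages in shell terms; the image and file names here are illustrative, not the actual pipeline commands:

```
# Build stage (on our build servers): install packages, bake in reasonable
# defaults, and export the version-specific image as the build artifact.
docker build -t contrail-config-liberty-3.1-50 .
docker save contrail-config-liberty-3.1-50 | gzip > contrail-config-liberty-3.1-50.tgz

# Provisioning stage (on the customer's cfgm nodes): load the artifact and
# add the customer-specific configuration before starting the container
# (see the snippet under "Current state" for starting it).
gunzip -c contrail-config-liberty-3.1-50.tgz | docker load
```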

## Design approach

#!/bin/bash -x
# Environment for a contrail-test sanity job; the first argument selects the
# job to run and defaults to 'basic_job'.
run=${1:-'basic_job'}
export AVAILABLE_TESTBEDS=testbed_sanity_nodei31.py
export BUILDID=25
export BRANCH=R3.0
export DISTRO=ubuntu-14-04
export SKU=kilo
export JENKINS_TRIGGERED=0
export LOCK_TESTBED_ON_FAILURE=0
export TEST_CONTAINER_IMAGE_DIR=/cs-shared/images/contrail-test-docker-images/
#!/bin/bash
# Defaults for building the contrail-test / contrail-test-ci container images;
# each variable can be overridden from the environment.
to_build=${@:-'contrail-test contrail-test-ci'}
TEST_ARTIFACT=${TEST_ARTIFACT:-$(readlink -f sb/build/artifacts_extra/docker-contrail-test-*~kilo.tgz)}
TEST_CI_ARTIFACT=${TEST_CI_ARTIFACT:-$(readlink -f sb/build/artifacts_extra/docker-contrail-test-ci-*~kilo.tgz)}
FABRIC_UTILS_ARTIFACT=${FABRIC_UTILS_ARTIFACT:-$(readlink -f sb/build/artifacts_extra/contrail-fabric-utils-*~kilo.tgz)}
CONTRAIL_PACKAGE_DEB_JUNO=${CONTRAIL_PACKAGE_DEB_JUNO:-$(readlink -f sb/build/artifacts/contrail-install-packages*~juno_all.deb)}
CONTRAIL_PACKAGE_DEB_KILO=${CONTRAIL_PACKAGE_DEB_KILO:-$(readlink -f sb/build/artifacts/contrail-install-packages*~kilo_all.deb)}
DOCKER_IMAGE_EXPORT_PATH=${DOCKER_IMAGE_EXPORT_PATH:-$(readlink -f sb/build/artifacts/)}
# Pick the host's docker0 bridge address.
IPADDRESS=${IPADDRESS:-$(ip a show docker0 | awk '/inet / {split ($2,a,"/"); print a[1]}')}
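
Assuming the snippet above is saved as build-test-images.sh (the file name is a placeholder), the images to build and the artifact locations can be overridden like this:

```
# Build only contrail-test, pointing at a custom test artifact.
TEST_ARTIFACT=/tmp/docker-contrail-test-custom.tgz ./build-test-images.sh contrail-test
```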
analyticstestsanity3.log:2016-04-06 23:43:46,865 - INFO - END TEST : test_db_purge : FAILED[0:12:12]
analyticstestsanity3.log:2016-04-06 23:50:19,259 - INFO - END TEST : test_verify_generator_collector_connections : FAILED[0:06:33]
analyticstestsanity3.log:2016-04-06 23:50:57,161 - INFO - END TEST : test_verify_process_status_agent : FAILED[0:00:17]
analyticstestsanity3.log:2016-04-06 23:51:14,424 - INFO - END TEST : test_verify_process_status_analytics_node : FAILED[0:00:17]
analyticstestsanity3.log:2016-04-06 23:51:31,515 - INFO - END TEST : test_verify_process_status_config : FAILED[0:00:17]
analyticstestsanity3.log:2016-04-06 23:51:48,268 - INFO - END TEST : test_verify_process_status_control_node : FAILED[0:00:17]
analyticstestsanitywithresource.log:2016-04-07 03:10:33,123 - INFO - END TEST : test_verify_flow_tables : FAILED[0:06:07]
-> some analytics problems
testbasicvmvn0.log:2016-04-07 03:58:17,839 - INFO - END TEST : test_max_vm_flows : FAILED[0:00:53]
python -m testtools.run scripts.analytics.test_analytics.AnalyticsTestSanity3.test_verify_process_status_analytics_node
Log file : /contrail-test/logs/keystone_tests.log
Tests running...
Log file : /contrail-test/logs/analyticstestsanity3.log
2016-03-22 12:21:50,409 - INFO - Project ctest-AnalyticsTestSanity3-20973050 not found, creating it
2016-03-22 12:21:50,665 - INFO - Created Project:ctest-AnalyticsTestSanity3-20973050, ID : 38af4b35-ca50-4055-b36d-c268fc9c36c6
[2015-10-09 11:14:29,241: ERROR/MainProcess] Task overcast.django.apps.buildsvc.tasks.build[b76260ee-75db-47e3-85e9-d466c3a25932] raised unexpected: CommandFailed("['reprepro', '-b', u'/home/hkumar/overcastvenv/src/overcast/data/repos/hkumarmk/hkumarmk', '--ignore=wrongdistribution', 'include', u'overcast', '/tmp/tmp6IGqLo/buildsvctest_0.1+72_source.changes'] returned 254. Output: ERROR: Unexpected content of file '/tmp/tmp6IGqLo/buildsvctest_0.1+72.tar.gz'!\nmd5 expected: a14fb06fb54160a7262c6602abd0b44d, got: 88b27617c849209164fa16bb85848005\nsha1 expected: d0d3f6d587c307e07099cd9df8e9d0155cebe40f, got: 1a936c399f4bb0bcb9c2f4a2e02691af3743efca\nsha256 expected: e2af4d982ca81114cd2b7308256650224992aea801c6771e811cbf1343fdf162, got: b0b6b2e3258df2fb4cc8e5cfdad2f6c10f984d93ca769c28970ca91d5dcd390d\nsize expected: 1310, got: 1305\nThere have been errors!\n (stderr: None)",)
Traceback (most recent call last):
File "/home/hkumar/overcastvenv/local/lib/python2.7/site-packages/celery/app/trace.py", line 240, in t
[2015-10-08 12:00:33,936: DEBUG/Worker-4] ['git', 'clone', u'https://github.com/sorenh/buildsvctest', '-b', u'python', 'build'], input=None, cwd='/tmp/tmpTMPS1K', override_env=None, discard_stderr=False
[2015-10-08 12:00:36,660: INFO/Worker-4] ['git', 'clone', u'https://github.com/sorenh/buildsvctest', '-b', u'python', 'build'] returned with returncode 0.
[2015-10-08 12:00:36,661: INFO/Worker-4] ['git', 'clone', u'https://github.com/sorenh/buildsvctest', '-b', u'python', 'build'] gave stdout: Cloning into 'build'...
.
[2015-10-08 12:00:36,661: INFO/Worker-4] ['git', 'clone', u'https://github.com/sorenh/buildsvctest', '-b', u'python', 'build'] gave stderr: None.
[2015-10-08 12:00:36,662: DEBUG/Worker-4] ['git', 'rev-parse', 'HEAD'], input=None, cwd='/tmp/tmpTMPS1K/build', override_env=None, discard_stderr=False
[2015-10-08 12:00:36,674: INFO/Worker-4] ['git', 'rev-parse', 'HEAD'] returned with returncode 0.
[2015-10-08 12:00:36,674: INFO/Worker-4] ['git', 'rev-parse', 'HEAD'] gave stdout: f887be9d39a46bfb9834b