Sai Sindhur Malleni (smalleni)
OpenStack, OpenShift, Automation, Scalability, Performance and More!!!

  • Red Hat, Inc. | University of Illinois, Urbana-Champaign | NC State
smalleni / collectd.conf
Created September 2, 2017 23:00
Collectd Config for JMX
# Interval default is 10s
Interval 10
# Hostname for this machine, if not defined, use gethostname(2) system call
Hostname "overcloud-controller-0"
# Loaded Plugins:
LoadPlugin "logfile"
<Plugin "logfile">
  File "/var/log/collectd.log"
</Plugin>
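The preview above cuts off before the JMX pieces the gist title promises. What follows is only an illustrative sketch of how collectd's GenericJMX plugin is typically wired up, not the gist's actual configuration; the jar paths, service URL, port, and MBean choice are assumptions for a generic JVM.

LoadPlugin java
<Plugin "java">
  # Class path for the collectd Java API and the GenericJMX plugin (paths are assumptions)
  JVMArg "-Djava.class.path=/usr/share/collectd/java/collectd-api.jar:/usr/share/collectd/java/generic-jmx.jar"
  LoadPlugin "org.collectd.java.GenericJMX"
  <Plugin "GenericJMX">
    # Collect JVM heap usage as an example MBean
    <MBean "memory">
      ObjectName "java.lang:type=Memory"
      <Value>
        Type "memory"
        Table true
        Attribute "HeapMemoryUsage"
        InstancePrefix "heap-"
      </Value>
    </MBean>
    # JMX endpoint of the monitored JVM (URL and port are placeholders)
    <Connection>
      ServiceURL "service:jmx:rmi:///jndi/rmi://localhost:9999/jmxrmi"
      Collect "memory"
    </Connection>
  </Plugin>
</Plugin>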
[stack@c02-h10-r620 ~]$ openstack overcloud node introspect c3df42ac-2632-4c48-8683-f727b9307bd2
Started Mistral Workflow. Execution ID: 21a37b4d-6721-4f52-98a3-f4b84351c496
Waiting for introspection to finish...
Successfully introspected all nodes.
Introspection completed.
[stack@c02-h10-r620 ~]$ ironic node-show c3df42ac-2632-4c48-8683-f727b9307bd2
+------------------------+--------------------------------------------------------------------------+
| Property | Value |
Working but set to maintenance: 3f87fe68-b9ae-4c38-bdc1-a11ab9e301ca
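A minimal sketch of clearing that maintenance flag with the ironic CLI already used in these notes (exact client syntax can differ by release; the UUID is the one from the note above):

ironic node-set-maintenance 3f87fe68-b9ae-4c38-bdc1-a11ab9e301ca false
ironic node-show 3f87fe68-b9ae-4c38-bdc1-a11ab9e301ca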
smalleni / 100compute.txt
Created March 15, 2017 04:03
Scale Lab@100computes
[stack@c02-h10-r620 ~]$ nova list
+--------------------------------------+------------------------+--------+------------+-------------+----------------------+
| ID | Name | Status | Task State | Power State | Networks |
+--------------------------------------+------------------------+--------+------------+-------------+----------------------+
| e0f6813a-5003-400e-83e6-80f544226e55 | overcloud-compute-0 | ACTIVE | - | Running | ctlplane=192.0.2.59 |
| 92a7b26c-d1bd-45fe-a338-e13d7be0da37 | overcloud-compute-1 | ACTIVE | - | Running | ctlplane=192.0.2.100 |
| 44e11016-43d5-4ff6-b9ff-a837feefe075 | overcloud-compute-10 | ACTIVE | - | Running | ctlplane=192.0.2.87 |
| 9b3e9e49-ad51-4262-bef6-1bf27ef723d9 | overcloud-compute-11 | ACTIVE | - | Running | ctlplane=192.0.2.76 |
| d70ac87a-24d0-44e6-8d97-8d75a9f031f4 | overcloud-compute-12 | ACTIVE | - | Running | ctlplane=192.0
/var/log/ironic/ironic-conductor.log:700394:2017-03-10 00:44:07.119 2550 ERROR ironic.drivers.modules.agent_base_vendor [req-e8d183c2-11fc-4a73-98c7-3a89dc9220e2 - - - - -] Failed to install a bootloader when deploying node 9bef615b-2033-4628-bcb0-530a62f683f0. Error: {u'message': u'Error finding the disk or partition device to deploy the image onto: No partition with UUID c2bb0683-ebe8-4541-9c5c-a811d0326ae5 found on device /dev/sdak', u'code': 404, u'type': u'DeviceNotFound', u'details': u'No partition with UUID c2bb0683-ebe8-4541-9c5c-a811d0326ae5 found on device /dev/sdak'}
/var/log/ironic/ironic-conductor.log:700687:2017-03-10 00:44:24.315 2550 ERROR ironic.drivers.modules.agent_base_vendor [req-e8d183c2-11fc-4a73-98c7-3a89dc9220e2 - - - - -] Asynchronous exception for node 9bef615b-2033-4628-bcb0-530a62f683f0: Node failed to get image for deploy. Exception: Failed to install a bootloader when deploying node 9bef615b-2033-4628-bcb0-530a62f683f0. Error: {u'message': u'Error finding the disk or partitio
smalleni / time.txt
Last active March 16, 2017 18:08
Time
70 node deploy kicked off at 12:40pm EST 03/07/17
53 node deploy Tue Mar 7 19:26:42 UTC 2017
51 node deploy: 1 r930 foreign disk, one r620 clean failed (tuned nova, keystone and heat) Tue Mar 7 21:42:54 UTC 2017
Completed at
real 115m1.931s
user 0m11.489s
sys 0m1.888s
Tue Mar 7 23:37:56 UTC 2017
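The real/user/sys figures above read like output from the shell's time builtin wrapped around the deploy command; a minimal sketch of capturing deploy wall time that way (template and environment-file paths are placeholders, not the ones used in this lab):

time openstack overcloud deploy --templates -e ~/custom-env.yaml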
51 with node pinning: 1 r930 foreign disk and one r620 clean failed (different from the previous run)
Wed Mar 8 22:41:26 UTC 2017
smalleni / Full-Lab-Deploy.md
Last active February 21, 2024 15:56
Deploying RHOP 10 in Red Hat Scale Lab

Full Scale Lab Deployment

Test Plan

  • Build Undercloud

  • Install Monitoring

  • Import Nodes

  • Introspect (a command-level sketch of these steps follows below)
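A rough command-level sketch of the undercloud, import, and introspection steps above, using the Mistral-backed tripleoclient commands visible elsewhere in these notes; the exact flags and the instackenv.json file name are assumptions for this environment, and the monitoring step is left out because it is environment-specific (collectd/graphite here).

openstack undercloud install
openstack overcloud node import instackenv.json
openstack overcloud node introspect --all-manageable
openstack baremetal node list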

If the deployment uses an external network on a VLAN and a ping to the gateway from the router namespace doesn't work, edit /etc/neutron/plugins/ml2/openvswitch_agent.ini, set the external bridge mapping to an empty string, restart the L3 agents, then delete the router gateway and re-add it.
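A minimal sketch of that workaround as commands, assuming the knob meant here is the L3 agent's external_network_bridge option (the note names the OVS agent file, so adjust the path and option name to whichever your setup actually uses); router and network names are placeholders, and you can edit the file by hand if crudini is not installed:

sudo crudini --set /etc/neutron/l3_agent.ini DEFAULT external_network_bridge ''
sudo systemctl restart neutron-l3-agent
neutron router-gateway-clear router1
neutron router-gateway-set router1 external_net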
INFO] Completed phase post-configure
2016-11-29 08:23:26,412 INFO: os-refresh-config completed successfully
2016-11-29 08:23:26,623 INFO: Generated new ssh key in ~/.ssh/id_rsa
2016-11-29 08:23:27,655 INFO: Created flavor "baremetal" with profile "None"
2016-11-29 08:23:27,754 INFO: Created flavor "control" with profile "control"
2016-11-29 08:23:27,854 INFO: Created flavor "compute" with profile "compute"
2016-11-29 08:23:27,949 INFO: Created flavor "ceph-storage" with profile "ceph-storage"
2016-11-29 08:23:28,042 INFO: Created flavor "block-storage" with profile "block-storage"
2016-11-29 08:23:28,145 INFO: Created flavor "swift-storage" with profile "swift-storage"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 1174, in install
    _post_config(instack_env)
  File "/usr/lib/python2.7/site-packages/instack_undercloud/undercloud.py", line 1156, in _post_confi
[stack@dhcp23-185 browbeat]$ python browbeat.py -w
2016-02-09 15:04:24,187 - browbeat.Shaker - INFO - Starting Shaker workloads
2016-02-09 15:04:24,192 - browbeat.Shaker - INFO - Shaker is installed, continuing
2016-02-09 15:04:25,299 - browbeat.Shaker - INFO - Shaker image is built, continuing
2016-02-09 15:04:25,300 - browbeat.Shaker - INFO - Scenario: l2
2016-02-09 15:15:54,779 - browbeat.Shaker - ERROR - Failed scenario: l2
2016-02-09 15:15:54,779 - browbeat.Shaker - ERROR - saved log to: results/20160209-150424/shaker/l2/20160209-150425-browbeat-shaker-l2.log
2016-02-09 15:15:54,780 - browbeat.Shaker - INFO - Current number of scenarios executed: 1
2016-02-09 15:15:54,780 - browbeat.Shaker - INFO - Current number of scenarios passed: 0
2016-02-09 15:15:54,780 - browbeat.Shaker - INFO - Current number of scenarios failed: 1