@shannonmitchell
Created January 27, 2017 22:32
Looking at the keystone restarts, apache2 was restarted twice on infra-1, once on infra-2, and not at all on infra-3.
root@deploy:~# ansible -i /opt/openstack-ansible/playbooks/inventory/dynamic_inventory.py keystone_all -m shell -a "grep restart /var/log/salt/minion | grep '2017-01-27 18'"
infra-3_keystone_container-8a194cbe | FAILED | rc=1 >>
infra-1_keystone_container-0ea74c78 | SUCCESS | rc=0 >>
2017-01-27 18:23:52,437 [salt.loaded.int.module.cmdmod][INFO ] Executing command 'service apache2 restart' in directory '/root'
2017-01-27 18:34:07,240 [salt.loaded.int.module.cmdmod][INFO ] Executing command 'service apache2 restart' in directory '/root'
infra-2_keystone_container-40fe1a29 | SUCCESS | rc=0 >>
2017-01-27 18:44:24,258 [salt.loaded.int.module.cmdmod][INFO ] Executing command 'service apache2 restart' in directory '/root'
It looks like it takes haproxy a few minutes to start spreading the load back out after each restart. In the keystone.py output below, each row is a minute of the 18:00 hour and the three columns appear to be per-minute request counts for the keystone backends on infra-1, infra-2 and infra-3; note the drops in column 1 after the 18:23 and 18:34 restarts and in column 2 after the 18:44 restart.
root@infra-1:/var/log/haproxy# ./keystone.py
min: 1 2 3
-----------------------
23: 364 397 395
24: 41 338 359
25: 74 334 351
26: 78 342 356
27: 101 345 357
28: 228 370 420
29: 309 395 377
30: 358 369 381
31: 366 366 372
32: 398 424 387
33: 417 397 400
34: 87 316 344
35: 57 362 334
36: 79 346 344
37: 125 364 339
38: 168 403 387
39: 309 382 383
40: 387 391 386
41: 415 398 407
42: 358 355 355
43: 424 403 404
44: 387 186 374
45: 376 57 348
46: 388 62 376
47: 382 100 384
48: 404 143 375
49: 431 223 397
50: 419 244 385
51: 397 327 375
52: 159 145 151
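
The keystone.py script itself isn't included in this gist. As a rough idea of what it could be doing, here is a minimal sketch that counts requests per minute and per keystone backend from a haproxy HTTP log. The log path, timestamp/backend regexes, and the infra-N_keystone_container-* naming are assumptions based on the output above, not the actual script.

#!/usr/bin/env python
# Sketch only: count haproxy requests per minute, split by keystone backend.
# Assumes default haproxy HTTP log timestamps like [27/Jan/2017:18:23:52.437]
# and server fields like keystone_service-back/infra-1_keystone_container-xxxx.
import re
from collections import defaultdict

LOG_FILE = 'haproxy.log'   # hypothetical path
HOUR = '18'                # hour of interest, matching the restarts above

ts_re = re.compile(r'\[\d{2}/\w{3}/\d{4}:(\d{2}):(\d{2}):\d{2}\.\d+\]')
srv_re = re.compile(r'/infra-(\d)_keystone')

# minute -> {backend number: request count}
counts = defaultdict(lambda: defaultdict(int))

with open(LOG_FILE) as fh:
    for line in fh:
        ts = ts_re.search(line)
        srv = srv_re.search(line)
        if not ts or not srv or ts.group(1) != HOUR:
            continue
        minute = int(ts.group(2))
        backend = int(srv.group(1))    # 1, 2 or 3 from the container host name
        counts[minute][backend] += 1

print('min: 1 2 3')
print('-----------------------')
for minute in sorted(counts):
    row = counts[minute]
    print('%d: %d %d %d' % (minute, row[1], row[2], row[3]))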