@rzarzynski
Last active July 2, 2020 07:35
Stack canaries for ceph::bufferptr

It looks like we have a nasty issue that leads to calling the destructor of a statically allocated ceph::buffer::ptr instance twice. This happens even without hypercombining.

Investigation

rzarz@ubulap:/work/ceph-2/build$ gdb --args python3.5 /work/ceph-2/build/bin/ceph -c /work/ceph-2/build/ceph.conf config assimilate-conf -i -
...
(gdb) run
Starting program: /usr/bin/python3.5 /work/ceph-2/build/bin/ceph -c /work/ceph-2/build/ceph.conf config assimilate-conf -i -
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
[Detaching after fork from child process 2055]
*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
process 2051 is executing new program: /usr/bin/python3.5
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
[Detaching after fork from child process 2059]
^C
Program received signal SIGINT, Interrupt.
0x00007ffff7dec8d9 in dl_open_worker (a=a@entry=0x7fffffffc4a0) at dl-open.c:568
568     dl-open.c: No such file or directory.
(gdb) watch *0x7ffff60f9500
Hardware watchpoint 1: *0x7ffff60f9500
(gdb) cont
Continuing.

Hardware watchpoint 1: *0x7ffff60f9500

Old value = 0
New value = -17958194
0x00007fffed209763 in ceph::buffer::v15_2_0::ptr::ptr (this=0x7ffff5d64d60 <ceph::buffer::v15_2_0::list::always_empty_bptr>) at /work/ceph-2/build/boost/include/boost/asio/impl/error.ipp:121
121       return instance;
(gdb) bt
#0  0x00007fffed209763 in ceph::buffer::v15_2_0::ptr::ptr (this=0x7ffff5d64d60 <ceph::buffer::v15_2_0::list::always_empty_bptr>) at /work/ceph-2/build/boost/include/boost/asio/impl/error.ipp:121
#1  __static_initialization_and_destruction_0 (__initialize_p=1, __priority=65535) at /work/ceph-2/src/common/buffer.cc:1332
#2  _GLOBAL__sub_I_buffer.cc(void) () at /work/ceph-2/src/common/buffer.cc:2287
#3  0x00007ffff7de76ca in call_init (l=<optimized out>, argc=argc@entry=8, argv=argv@entry=0x7fffffffdc38, env=env@entry=0x7fffffffdc80) at dl-init.c:72
#4  0x00007ffff7de77db in call_init (env=0x7fffffffdc80, argv=0x7fffffffdc38, argc=8, l=<optimized out>) at dl-init.c:30
#5  _dl_init (main_map=main_map@entry=0xc41b80, argc=8, argv=0x7fffffffdc38, env=0x7fffffffdc80) at dl-init.c:120
#6  0x00007ffff7dec8f2 in dl_open_worker (a=a@entry=0x7fffffffc4a0) at dl-open.c:575
#7  0x00007ffff7de7574 in _dl_catch_error (objname=objname@entry=0x7fffffffc490, errstring=errstring@entry=0x7fffffffc498, mallocedp=mallocedp@entry=0x7fffffffc48f, 
    operate=operate@entry=0x7ffff7dec4e0 <dl_open_worker>, args=args@entry=0x7fffffffc4a0) at dl-error.c:187
#8  0x00007ffff7debdb9 in _dl_open (file=0x7ffff674b578 "/work/ceph-2/build/lib/cython_modules/lib.3/rados.cpython-35m-x86_64-linux-gnu.so", mode=-2147483646, 
    caller_dlopen=0x60765a <_PyImport_FindSharedFuncptr+138>, nsid=-2, argc=<optimized out>, argv=<optimized out>, env=0x7fffffffdc80) at dl-open.c:660
#9  0x00007ffff75ecf09 in dlopen_doit (a=a@entry=0x7fffffffc6d0) at dlopen.c:66
#10 0x00007ffff7de7574 in _dl_catch_error (objname=0xbd6f90, errstring=0xbd6f98, mallocedp=0xbd6f88, operate=0x7ffff75eceb0 <dlopen_doit>, args=0x7fffffffc6d0) at dl-error.c:187
#11 0x00007ffff75ed571 in _dlerror_run (operate=operate@entry=0x7ffff75eceb0 <dlopen_doit>, args=args@entry=0x7fffffffc6d0) at dlerror.c:163
#12 0x00007ffff75ecfa1 in __dlopen (file=<optimized out>, mode=<optimized out>) at dlopen.c:87
#13 0x000000000060765a in _PyImport_FindSharedFuncptr ()
#14 0x000000000061253b in _PyImport_LoadDynamicModuleWithSpec ()
#15 0x0000000000612a68 in ?? ()
#16 0x00000000004ea9b6 in PyCFunction_Call ()
#17 0x0000000000542565 in PyEval_EvalFrameEx ()
#18 0x0000000000544f43 in ?? ()
#19 0x0000000000541002 in PyEval_EvalFrameEx ()
#20 0x00000000005405e4 in PyEval_EvalFrameEx ()
#21 0x00000000005405e4 in PyEval_EvalFrameEx ()
#22 0x00000000005405e4 in PyEval_EvalFrameEx ()
#23 0x00000000005405e4 in PyEval_EvalFrameEx ()
#24 0x0000000000545d8b in PyEval_EvalCodeEx ()
#25 0x00000000004ec9a3 in ?? ()
#26 0x00000000005bfd87 in PyObject_Call ()
#27 0x00000000005c0bb6 in _PyObject_CallMethodIdObjArgs ()
#28 0x0000000000535715 in PyImport_ImportModuleLevelObject ()
#29 0x000000000054ec68 in ?? ()
#30 0x00000000004ea927 in PyCFunction_Call ()
#31 0x00000000005bfd87 in PyObject_Call ()
#32 0x0000000000531200 in PyEval_CallObjectWithKeywords ()
#33 0x000000000053efa3 in PyEval_EvalFrameEx ()
#34 0x0000000000544f43 in ?? ()
#35 0x0000000000545c3f in PyEval_EvalCode ()
#36 0x0000000000622642 in ?? ()
#37 0x0000000000624aea in PyRun_FileExFlags ()
#38 0x00000000006252dc in PyRun_SimpleFileExFlags ()
#39 0x000000000063e616 in Py_Main ()
#40 0x00000000004d1761 in main ()
(gdb) frame 0
#0  0x00007fffed209763 in ceph::buffer::v15_2_0::ptr::ptr (this=0x7ffff5d64d60 <ceph::buffer::v15_2_0::list::always_empty_bptr>) at /work/ceph-2/build/boost/include/boost/asio/impl/error.ipp:121
121       return instance;
(gdb) disassemble 
Dump of assembler code for function _GLOBAL__sub_I_buffer.cc(void):
   0x00007fffed209700 <+0>:     push   %rbp
   0x00007fffed209701 <+1>:     lea    0x8b5b698(%rip),%rdi        # 0x7ffff5d64da0 <_ZStL8__ioinit>
   0x00007fffed209708 <+8>:     mov    %rsp,%rbp
   0x00007fffed20970b <+11>:    push   %rbx
   0x00007fffed20970c <+12>:    sub    $0x8,%rsp
   0x00007fffed209710 <+16>:    callq  0x7fffed1a6310 <_ZNSt8ios_base4InitC1Ev@plt>
   0x00007fffed209715 <+21>:    mov    0x935d94(%rip),%rdi        # 0x7fffedb3f4b0
   0x00007fffed20971c <+28>:    lea    0x93893d(%rip),%rdx        # 0x7fffedb42060
   0x00007fffed209723 <+35>:    lea    0x8b5b676(%rip),%rsi        # 0x7ffff5d64da0 <_ZStL8__ioinit>
   0x00007fffed20972a <+42>:    callq  0x7fffed1a5fa0 <__cxa_atexit@plt>
   0x00007fffed20972f <+47>:    lea    0x5f6545(%rip),%rdi        # 0x7fffed7ffc7b
   0x00007fffed209736 <+54>:    callq  0x7fffed31cc30 <get_env_bool(char const*)>
   0x00007fffed20973b <+59>:    mov    0x9359ee(%rip),%rsi        # 0x7fffedb3f130
   0x00007fffed209742 <+66>:    lea    0x2ebb17(%rip),%rdi        # 0x7fffed4f5260 <ceph::buffer::v15_2_0::ptr::~ptr()>
   0x00007fffed209749 <+73>:    lea    0x938910(%rip),%rdx        # 0x7fffedb42060
   0x00007fffed209750 <+80>:    mov    %al,0x8b5b63a(%rip)        # 0x7ffff5d64d90 <_ZL16buffer_track_crc>
   0x00007fffed209756 <+86>:    movabs $0xdeadbeeffeedface,%rax
   0x00007fffed209760 <+96>:    mov    %rax,(%rsi)
=> 0x00007fffed209763 <+99>:    movq   $0x0,0x10(%rsi)
...
(gdb) cont
Continuing.
[Detaching after fork from child process 2157]
[New Thread 0x7fffea1b1700 (LWP 2165)]
[Thread 0x7fffea1b1700 (LWP 2165) exited]
[New Thread 0x7fffea1b1700 (LWP 2168)]
[Thread 0x7fffea1b1700 (LWP 2168) exited]
[New Thread 0x7fffea1b1700 (LWP 2169)]
[New Thread 0x7fffe8f4f700 (LWP 2182)]
[New Thread 0x7fffe3fff700 (LWP 2183)]
[New Thread 0x7fffe37fe700 (LWP 2184)]
[New Thread 0x7fffe2ffd700 (LWP 2185)]
[New Thread 0x7fffe27fc700 (LWP 2186)]
[New Thread 0x7fffe1ffb700 (LWP 2187)]
[Thread 0x7fffe1ffb700 (LWP 2187) exited]
[Thread 0x7fffe2ffd700 (LWP 2185) exited]
[Thread 0x7fffe27fc700 (LWP 2186) exited]
[New Thread 0x7fffe2ffd700 (LWP 2188)]
2020-06-25T20:43:23.047+0200 7fffea1b1700 -1 WARNING: all dangerous and experimental features are enabled.
[New Thread 0x7fffe27fc700 (LWP 2189)]
2020-06-25T20:43:23.055+0200 7fffea1b1700 -1 WARNING: all dangerous and experimental features are enabled.
[New Thread 0x7fffe1ffb700 (LWP 2190)]
[New Thread 0x7fffe17fa700 (LWP 2191)]
[New Thread 0x7fffe0ff9700 (LWP 2192)]
[New Thread 0x7fffc7fff700 (LWP 2193)]
[New Thread 0x7fffc77fe700 (LWP 2194)]
[New Thread 0x7fffc6ffd700 (LWP 2195)]
[New Thread 0x7fffc67fc700 (LWP 2196)]
[New Thread 0x7fffc5ffb700 (LWP 2197)]
[Thread 0x7fffea1b1700 (LWP 2169) exited]

[New Thread 0x7fffea1b1700 (LWP 2199)]
[Thread 0x7fffea1b1700 (LWP 2199) exited]
[New Thread 0x7fffea1b1700 (LWP 2200)]
[Thread 0x7fffea1b1700 (LWP 2200) exited]
[New Thread 0x7fffea1b1700 (LWP 2201)]
[Thread 0x7fffc5ffb700 (LWP 2197) exited]
[Thread 0x7fffc67fc700 (LWP 2196) exited]
[Thread 0x7fffc6ffd700 (LWP 2195) exited]
[Thread 0x7fffc77fe700 (LWP 2194) exited]
[Thread 0x7fffe0ff9700 (LWP 2192) exited]
[Thread 0x7fffe17fa700 (LWP 2191) exited]
[Thread 0x7fffc7fff700 (LWP 2193) exited]
[Thread 0x7fffe8f4f700 (LWP 2182) exited]
[Thread 0x7fffe3fff700 (LWP 2183) exited]
[Thread 0x7fffe37fe700 (LWP 2184) exited]
[Thread 0x7fffe27fc700 (LWP 2189) exited]
[Thread 0x7fffe1ffb700 (LWP 2190) exited]
[Thread 0x7fffe2ffd700 (LWP 2188) exited]
[Thread 0x7fffea1b1700 (LWP 2201) exited]

Thread 1 "ceph" hit Hardware watchpoint 1: *0x7ffff60f9500

Old value = -17958194
New value = -340934434
ceph::buffer::v15_2_0::ptr::~ptr (this=0x7ffff60f9500 <ceph::buffer::v15_2_0::list::always_empty_bptr>, __in_chrg=<optimized out>) at /work/ceph-2/src/include/buffer.h:304
304           release_raw();
(gdb) bt
#0  ceph::buffer::v15_2_0::ptr::~ptr (this=0x7ffff60f9500 <ceph::buffer::v15_2_0::list::always_empty_bptr>, __in_chrg=<optimized out>) at /work/ceph-2/src/include/buffer.h:304
#1  0x00007ffff7829ff8 in __run_exit_handlers (status=0, listp=0x7ffff7bb45f8 <__exit_funcs>, run_list_atexit=run_list_atexit@entry=true) at exit.c:82
#2  0x00007ffff782a045 in __GI_exit (status=<optimized out>) at exit.c:104
#3  0x0000000000623edf in Py_Exit ()
#4  0x0000000000623fca in ?? ()
#5  0x0000000000624036 in PyErr_PrintEx ()
#6  0x00000000006252f9 in PyRun_SimpleFileExFlags ()
#7  0x000000000063e616 in Py_Main ()
#8  0x00000000004d1761 in main ()
(gdb) disassemble 
Dump of assembler code for function ceph::buffer::v15_2_0::ptr::~ptr():
   0x00007ffff5df8160 <+0>:     push   %rbp
   0x00007ffff5df8161 <+1>:     mov    %rsp,%rbp
   0x00007ffff5df8164 <+4>:     push   %rbx
   0x00007ffff5df8165 <+5>:     sub    $0x38,%rsp
   0x00007ffff5df8169 <+9>:     mov    %fs:0x28,%rax
   0x00007ffff5df8172 <+18>:    mov    %rax,-0x18(%rbp)
   0x00007ffff5df8176 <+22>:    xor    %eax,%eax
   0x00007ffff5df8178 <+24>:    movabs $0xdeadbeeffeedface,%rax
   0x00007ffff5df8182 <+34>:    cmp    %rax,(%rdi)
   0x00007ffff5df8185 <+37>:    jne    0x7ffff5df81af <ceph::buffer::v15_2_0::ptr::~ptr()+79>
   0x00007ffff5df8187 <+39>:    movabs $0xbadc0ffeebadc0de,%rax
   0x00007ffff5df8191 <+49>:    mov    %rax,(%rdi)
=> 0x00007ffff5df8194 <+52>:    callq  0x7ffff5d9ee00 <_ZN4ceph6buffer7v15_2_03ptr11release_rawEv@plt>
...
(gdb) print *this
$1 = {canary = {state = ceph::canary_t::state_t::CAGE_EMPTIED}, _raw = 0x0, _off = 0, _len = 0}
(gdb) cont
Continuing.
/work/ceph-2/src/include/buffer.h: In function 'void ceph::canary_t::is_alive_or_die() const' thread 7ffff7fcb700 time 2020-06-25T20:46:19.711120+0200
/work/ceph-2/src/include/buffer.h: 95: ceph_abort_msg("Canary is dead! Oops...")
 ceph version 16.0.0-2441-gb2fa48d (b2fa48d633742041b1f7270708f78755f7e10dd5) pacific (dev)
 1: (ceph::__ceph_abort(char const*, int, char const*, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)+0xf4) [0x7fffed1b13ac]
 2: (ceph::buffer::v15_2_0::ptr::~ptr()+0x7d) [0x7fffed4f52dd]
 3: (()+0x39ff8) [0x7ffff7829ff8]
 4: (()+0x3a045) [0x7ffff782a045]
 5: /usr/bin/python3.5() [0x623edf]
 6: /usr/bin/python3.5() [0x623fca]
 7: (PyErr_PrintEx()+0x36) [0x624036]
 8: (PyRun_SimpleFileExFlags()+0x1d9) [0x6252f9]
 9: (Py_Main()+0x456) [0x63e616]
 10: (main()+0xe1) [0x4d1761]
 11: (__libc_start_main()+0xf0) [0x7ffff7810830]
 12: (_start()+0x29) [0x5d57c9]

Thread 1 "ceph" received signal SIGABRT, Aborted.
0x00007ffff7825428 in __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:54
54      ../sysdeps/unix/sysv/linux/raise.c: No such file or directory.
(gdb) info proc mappings 
process 2051
Mapped address spaces:

          Start Addr           End Addr       Size     Offset objfile
            0x400000           0x7a8000   0x3a8000        0x0 /usr/bin/python3.5
            0x9a7000           0x9a9000     0x2000   0x3a7000 /usr/bin/python3.5
            0x9a9000           0xa40000    0x97000   0x3a9000 /usr/bin/python3.5
            0xa40000           0xfdc000   0x59c000        0x0 [heap]
...

      0x7ffff77f0000     0x7ffff79b0000   0x1c0000        0x0 /lib/x86_64-linux-gnu/libc-2.23.so
...

After installing the missing libc6-dbgsym

...
Thread 1 "ceph" hit Hardware watchpoint 1: *0x7ffff60f9500

Old value = -17958194
New value = -340934434
ceph::buffer::v15_2_0::ptr::~ptr (this=0x7ffff60f9500 <ceph::buffer::v15_2_0::list::always_empty_bptr>, __in_chrg=<optimized out>) at /work/ceph-2/src/include/buffer.h:304
304           release_raw();
(gdb) bt
#0  ceph::buffer::v15_2_0::ptr::~ptr (this=0x7ffff60f9500 <ceph::buffer::v15_2_0::list::always_empty_bptr>, __in_chrg=<optimized out>) at /work/ceph-2/src/include/buffer.h:304
#1  0x00007ffff7829ff8 in __run_exit_handlers (status=0, listp=0x7ffff7bb45f8 <__exit_funcs>, run_list_atexit=run_list_atexit@entry=true) at exit.c:82
#2  0x00007ffff782a045 in __GI_exit (status=<optimized out>) at exit.c:104
#3  0x0000000000623edf in Py_Exit ()
#4  0x0000000000623fca in ?? ()
#5  0x0000000000624036 in PyErr_PrintEx ()
#6  0x00000000006252f9 in PyRun_SimpleFileExFlags ()
#7  0x000000000063e616 in Py_Main ()
#8  0x00000000004d1761 in main ()
(gdb) cont
Continuing.
/work/ceph-2/src/include/buffer.h: In function 'void ceph::canary_t::is_alive_or_die() const' thread 7ffff7fcb700 time 2020-06-25T21:08:04.569696+0200
/work/ceph-2/src/include/buffer.h: 95: ceph_abort_msg("Canary is dead! Oops...")
 ceph version 16.0.0-2441-gb2fa48d (b2fa48d633742041b1f7270708f78755f7e10dd5) pacific (dev)
 1: (ceph::__ceph_abort(char const*, int, char const*, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)+0xf4) [0x7fffed1b13ac]
 2: (ceph::buffer::v15_2_0::ptr::~ptr()+0x7d) [0x7fffed4f52dd]
 3: (()+0x39ff8) [0x7ffff7829ff8]
 4: (()+0x3a045) [0x7ffff782a045]
 5: /usr/bin/python3.5() [0x623edf]
 6: /usr/bin/python3.5() [0x623fca]
 7: (PyErr_PrintEx()+0x36) [0x624036]
 8: (PyRun_SimpleFileExFlags()+0x1d9) [0x6252f9]
 9: (Py_Main()+0x456) [0x63e616]
 10: (main()+0xe1) [0x4d1761]
 11: (__libc_start_main()+0xf0) [0x7ffff7810830]
 12: (_start()+0x29) [0x5d57c9]

Thread 1 "ceph" received signal SIGABRT, Aborted.
0x00007ffff7825428 in __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:54
54      ../sysdeps/unix/sysv/linux/raise.c: No such file or directory.
(gdb) bt
#0  0x00007ffff7825428 in __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:54
#1  0x00007ffff782702a in __GI_abort () at abort.c:89
#2  0x00007fffed1b148d in ceph::__ceph_abort (file=file@entry=0x7fffed7ae578 "/work/ceph-2/src/include/buffer.h", line=line@entry=95, 
    func=func@entry=0x7fffed8000e0 <_ZZNK4ceph8canary_t15is_alive_or_dieEvE19__PRETTY_FUNCTION__> "void ceph::canary_t::is_alive_or_die() const", msg="Canary is dead! Oops...")
    at /work/ceph-2/src/common/assert.cc:198
#3  0x00007fffed4f52dd in ceph::canary_t::is_alive_or_die (this=<optimized out>) at /work/ceph-2/src/include/buffer.h:95
#4  ceph::buffer::v15_2_0::ptr::~ptr (this=<optimized out>, __in_chrg=<optimized out>) at /work/ceph-2/src/include/buffer.h:300
#5  0x00007ffff7829ff8 in __run_exit_handlers (status=0, listp=0x7ffff7bb45f8 <__exit_funcs>, run_list_atexit=run_list_atexit@entry=true) at exit.c:82
#6  0x00007ffff782a045 in __GI_exit (status=<optimized out>) at exit.c:104
#7  0x0000000000623edf in Py_Exit ()
#8  0x0000000000623fca in ?? ()
#9  0x0000000000624036 in PyErr_PrintEx ()
#10 0x00000000006252f9 in PyRun_SimpleFileExFlags ()
#11 0x000000000063e616 in Py_Main ()
#12 0x00000000004d1761 in main ()

__run_exit_handlers has been called twice and tried to destroy the already-destroyed buffer::list::always_empty_bptr. Oops.

The canary implementation

diff --git a/src/include/buffer.h b/src/include/buffer.h
index 6542e9d..f8b3dc7 100644
--- a/src/include/buffer.h
+++ b/src/include/buffer.h
@@ -82,19 +82,55 @@ namespace ceph {
 #define BUFFER_CORRUPTION_DEBUG
 #ifdef BUFFER_CORRUPTION_DEBUG
 class canary_t {
-  std::uint64_t magic{0xBADC0FFEEBADC0DE};
+  enum class state_t : std::uint64_t {
+    NOT_CAUGHT_YET = 0xC01DCAFFEBADC0DE,
+    ALIVE_IN_CAGE = 0xDEADBEEFFEEDFACE,
+    CAGE_EMPTIED = 0xBADC0FFEEBADC0DE
+  } state;
+
 public:
+  canary_t() : state(state_t::ALIVE_IN_CAGE) {}
   void is_alive_or_die() const {
-    if (magic != 0xBADC0FFEEBADC0DE) {
+    if (state != state_t::ALIVE_IN_CAGE) {
       ceph_abort_msg("Canary is dead! Oops...");
     }
   }
+
+  void make_it_free() {
+    state = state_t::CAGE_EMPTIED;
+  }
+
+  template<class StorageT>
+  static void cage_is_empty_or_die(StorageT& storage) {
+    static_assert(sizeof(StorageT) >= sizeof(canary_t));
+    auto& canary = reinterpret_cast<canary_t&>(storage);
+    switch (canary.state) {
+    case state_t::NOT_CAUGHT_YET:
+    case state_t::CAGE_EMPTIED:
+      return;
+    case state_t::ALIVE_IN_CAGE:
+      ceph_abort_msg("Canary shall not be left in the cage!");
+      return;
+    default:
+      ceph_abort_msg("Huh, corrupted canary.");
+      return;
+    }
+  }
+
+  template<class StorageT>
+  static void prepare_cage(StorageT& storage) {
+    static_assert(sizeof(StorageT) >= sizeof(canary_t));
+    auto& canary = reinterpret_cast<canary_t&>(storage);
+    canary.state = state_t::NOT_CAUGHT_YET;
+  }
 };
 # define INSERT_CANARY(name) canary_t name
 # define VERIFY_CANARY(name) (name).is_alive_or_die()
+# define FREE_CANARY(name)   (name).make_it_free()
 #else
 # define INSERT_CANARY(name) /* nop */
 # define VERIFY_CANARY(name) /* nop */
+# define FREE_CANARY(name)   /* nop */
 #endif
 
 template <class T>
@@ -262,6 +298,7 @@ struct error_code;
     ptr& operator= (ptr&& p) noexcept;
     ~ptr() {
       VERIFY_CANARY(canary);
+      FREE_CANARY(canary);
       // BE CAREFUL: this destructor is called also for hypercombined ptr_node.
       // After freeing underlying raw, `*this` can become inaccessible as well!
       release_raw();
diff --git a/src/include/buffer_raw.h b/src/include/buffer_raw.h
index 6483edd..d4cf46d 100644
--- a/src/include/buffer_raw.h
+++ b/src/include/buffer_raw.h
@@ -44,13 +44,16 @@ inline namespace v15_2_0 {
 
     explicit raw(unsigned l, int mempool=mempool::mempool_buffer_anon)
       : data(nullptr), len(l), nref(0), mempool(mempool) {
+      ceph::canary_t::prepare_cage(bptr_storage);
       mempool::get_pool(mempool::pool_index_t(mempool)).adjust_count(1, len);
     }
     raw(char *c, unsigned l, int mempool=mempool::mempool_buffer_anon)
       : data(c), len(l), nref(0), mempool(mempool) {
+      ceph::canary_t::prepare_cage(bptr_storage);
       mempool::get_pool(mempool::pool_index_t(mempool)).adjust_count(1, len);
     }
     virtual ~raw() {
+      ceph::canary_t::cage_is_empty_or_die(bptr_storage);
       mempool::get_pool(mempool::pool_index_t(mempool)).adjust_count(
        -1, -(int)len);
     }
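
For illustration, here is a stripped-down, standalone sketch of the same canary idea — simplified, not the actual Ceph code (ceph_abort_msg() is replaced with fputs() + abort(), and a trivial guarded struct stands in for buffer::ptr). The constructor arms a magic value, the destructor verifies it and then marks the cage as emptied, so a second destructor call on the same memory trips the check exactly like in the traces above.

#include <cstdint>
#include <cstdio>
#include <cstdlib>

class canary_t {
  enum class state_t : std::uint64_t {
    ALIVE_IN_CAGE = 0xDEADBEEFFEEDFACE,
    CAGE_EMPTIED  = 0xBADC0FFEEBADC0DE
  } state;
public:
  canary_t() : state(state_t::ALIVE_IN_CAGE) {}
  void is_alive_or_die() const {
    if (state != state_t::ALIVE_IN_CAGE) {
      std::fputs("Canary is dead! Oops...\n", stderr);
      std::abort();
    }
  }
  void make_it_free() { state = state_t::CAGE_EMPTIED; }
};

struct guarded {                 // stands in for buffer::ptr
  canary_t canary;               // INSERT_CANARY(canary)
  ~guarded() {
    canary.is_alive_or_die();    // VERIFY_CANARY(canary)
    canary.make_it_free();       // FREE_CANARY(canary)
    // ... release_raw() would go here ...
  }
};

int main() {
  static guarded obj;
  obj.~guarded();                // first destruction: canary is alive, fine
  obj.~guarded();                // second destruction: aborts immediately
}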

Canary-free confirmation

rzarz@ubulap:/work/ceph-2/build$ /work/ceph-2/build/bin/ceph -c /work/ceph-2/build/ceph.conf config assimilate-conf -i -
*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
always_empty_bptr_t on memory: 0x7fc48744a500
always_empty_bptr_t on memory: 0x7fc48744a500
2020-06-30T08:09:34.976+0200 7fc47b502700 -1 WARNING: all dangerous and experimental features are enabled.
2020-06-30T08:09:34.984+0200 7fc47b502700 -1 WARNING: all dangerous and experimental features are enabled.
~always_empty_bptr_t on memory: 0x7fc48744a500
~always_empty_bptr_t on memory: 0x7fc48744a500
/work/ceph-2/src/include/buffer.h: In function 'void ceph::canary_t::is_alive_or_die() const' thread 7fc489321700 time 2020-06-30T08:09:42.021253+0200
/work/ceph-2/src/include/buffer.h: 95: ceph_abort_msg("Canary is dead! Oops...")
 ceph version 16.0.0-2442-g36b32ed (36b32ed810a54b9c42b9981d66d5c70a8064d517) pacific (dev)
 1: (ceph::__ceph_abort(char const*, int, char const*, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)+0xf4) [0x7fc47e50243c]
 2: (ceph::buffer::v15_2_0::list::always_empty_bptr_t::~always_empty_bptr_t()+0x126) [0x7fc47e841866]
 3: (()+0x39ff8) [0x7fc488b7aff8]
 4: (()+0x3a045) [0x7fc488b7b045]
 5: /usr/bin/python3.5() [0x623edf]
 6: /usr/bin/python3.5() [0x623fca]
 7: (PyErr_PrintEx()+0x36) [0x624036]
 8: (PyRun_SimpleFileExFlags()+0x1d9) [0x6252f9]
 9: (Py_Main()+0x456) [0x63e616]
 10: (main()+0xe1) [0x4d1761]
 11: (__libc_start_main()+0xf0) [0x7fc488b61830]
 12: (_start()+0x29) [0x5d57c9]
Aborted
diff --git a/src/common/buffer.cc b/src/common/buffer.cc
index 3e4ecdf..01457d9 100644
--- a/src/common/buffer.cc
+++ b/src/common/buffer.cc
@@ -1329,7 +1329,13 @@ static ceph::spinlock debug_lock;
     _len++;
   }
 
-  buffer::ptr buffer::list::always_empty_bptr;
+  buffer::list::always_empty_bptr_t::always_empty_bptr_t() {
+    std::cout << __func__ << " on memory: " << (void*)this << std::endl;
+  }
+  buffer::list::always_empty_bptr_t::~always_empty_bptr_t() {
+    std::cout << __func__ << " on memory: " << (void*)this << std::endl;
+  }
+  buffer::list::always_empty_bptr_t buffer::list::always_empty_bptr;
 
   buffer::ptr_node& buffer::list::refill_append_space(const unsigned len)
   {
diff --git a/src/fmt b/src/fmt
index 51bf9cf..7ad3015 160000
--- a/src/fmt
+++ b/src/fmt
@@ -1 +1 @@
-Subproject commit 51bf9cfacb644659e5d9c7e6fe66396726f2f4f4
+Subproject commit 7ad3015f5bc77eda28d52f820e6d89955bf0784a
diff --git a/src/include/buffer.h b/src/include/buffer.h
index d2c3156..f254a2a 100644
--- a/src/include/buffer.h
+++ b/src/include/buffer.h
@@ -956,11 +956,15 @@ struct error_code;
       return page_aligned_appender(this, min_pages);
     }
 
+  struct always_empty_bptr_t : public ptr {
+    always_empty_bptr_t();
+    ~always_empty_bptr_t();
+  };
   private:
     // always_empty_bptr has no underlying raw but its _len is always 0.
     // This is useful for e.g. get_append_buffer_unused_tail_length() as
     // it allows to avoid conditionals on hot paths.
-    static ptr always_empty_bptr;
+    static always_empty_bptr_t always_empty_bptr;
     ptr_node& refill_append_space(const unsigned len);
 
   public:

How could a constructor be called twice on the same memory?

(gdb) cont
Continuing.
[Detaching after fork from child process 13432]

Breakpoint 3, ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t (this=0x7ffff60f9500 <ceph::buffer::v15_2_0::list::always_empty_bptr>) at /work/ceph-2/src/common/buffer.cc:1332
1332      buffer::list::always_empty_bptr_t::always_empty_bptr_t() {
(gdb) bt
#0  ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t (this=0x7ffff60f9500 <ceph::buffer::v15_2_0::list::always_empty_bptr>) at /work/ceph-2/src/common/buffer.cc:1332
#1  0x00007fffed2097d0 in __static_initialization_and_destruction_0 (__initialize_p=1, __priority=65535) at /work/ceph-2/build/boost/include/boost/asio/impl/error.ipp:121
#2  _GLOBAL__sub_I_buffer.cc(void) () at /work/ceph-2/src/common/buffer.cc:2289
#3  0x00007ffff7de76ca in call_init (l=<optimized out>, argc=argc@entry=8, argv=argv@entry=0x7fffffffdc38, env=env@entry=0x7fffffffdc80) at dl-init.c:72
#4  0x00007ffff7de77db in call_init (env=0x7fffffffdc80, argv=0x7fffffffdc38, argc=8, l=<optimized out>) at dl-init.c:30
#5  _dl_init (main_map=main_map@entry=0xc41730, argc=8, argv=0x7fffffffdc38, env=0x7fffffffdc80) at dl-init.c:120
#6  0x00007ffff7dec8f2 in dl_open_worker (a=a@entry=0x7fffffffc4a0) at dl-open.c:575
#7  0x00007ffff7de7574 in _dl_catch_error (objname=objname@entry=0x7fffffffc490, errstring=errstring@entry=0x7fffffffc498, mallocedp=mallocedp@entry=0x7fffffffc48f, 
    operate=operate@entry=0x7ffff7dec4e0 <dl_open_worker>, args=args@entry=0x7fffffffc4a0) at dl-error.c:187
#8  0x00007ffff7debdb9 in _dl_open (file=0x7ffff674b578 "/work/ceph-2/build/lib/cython_modules/lib.3/rados.cpython-35m-x86_64-linux-gnu.so", mode=-2147483646, 
    caller_dlopen=0x60765a <_PyImport_FindSharedFuncptr+138>, nsid=-2, argc=<optimized out>, argv=<optimized out>, env=0x7fffffffdc80) at dl-open.c:660
#9  0x00007ffff75ecf09 in dlopen_doit (a=a@entry=0x7fffffffc6d0) at dlopen.c:66
#10 0x00007ffff7de7574 in _dl_catch_error (objname=0xb49940, errstring=0xb49948, mallocedp=0xb49938, operate=0x7ffff75eceb0 <dlopen_doit>, args=0x7fffffffc6d0) at dl-error.c:187
#11 0x00007ffff75ed571 in _dlerror_run (operate=operate@entry=0x7ffff75eceb0 <dlopen_doit>, args=args@entry=0x7fffffffc6d0) at dlerror.c:163
#12 0x00007ffff75ecfa1 in __dlopen (file=<optimized out>, mode=<optimized out>) at dlopen.c:87
#13 0x000000000060765a in _PyImport_FindSharedFuncptr ()
#14 0x000000000061253b in _PyImport_LoadDynamicModuleWithSpec ()
#15 0x0000000000612a68 in ?? ()
#16 0x00000000004ea9b6 in PyCFunction_Call ()
#17 0x0000000000542565 in PyEval_EvalFrameEx ()
#18 0x0000000000544f43 in ?? ()
#19 0x0000000000541002 in PyEval_EvalFrameEx ()
#20 0x00000000005405e4 in PyEval_EvalFrameEx ()
#21 0x00000000005405e4 in PyEval_EvalFrameEx ()
#22 0x00000000005405e4 in PyEval_EvalFrameEx ()
#23 0x00000000005405e4 in PyEval_EvalFrameEx ()
#24 0x0000000000545d8b in PyEval_EvalCodeEx ()
#25 0x00000000004ec9a3 in ?? ()
#26 0x00000000005bfd87 in PyObject_Call ()
#27 0x00000000005c0bb6 in _PyObject_CallMethodIdObjArgs ()
#28 0x0000000000535715 in PyImport_ImportModuleLevelObject ()
#29 0x000000000054ec68 in ?? ()
#30 0x00000000004ea927 in PyCFunction_Call ()
#31 0x00000000005bfd87 in PyObject_Call ()
#32 0x0000000000531200 in PyEval_CallObjectWithKeywords ()
#33 0x000000000053efa3 in PyEval_EvalFrameEx ()
#34 0x0000000000544f43 in ?? ()
#35 0x0000000000545c3f in PyEval_EvalCode ()
#36 0x0000000000622642 in ?? ()
#37 0x0000000000624aea in PyRun_FileExFlags ()
#38 0x00000000006252dc in PyRun_SimpleFileExFlags ()
#39 0x000000000063e616 in Py_Main ()
#40 0x00000000004d1761 in main ()
(gdb) cont
Continuing.
always_empty_bptr_t on memory: 0x7ffff60f9500

Breakpoint 3, 0x00007ffff5d9eae0 in ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t()@plt () from /work/ceph-2/build/lib/librados.so.2
(gdb) bt
#0  0x00007ffff5d9eae0 in ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t()@plt () from /work/ceph-2/build/lib/librados.so.2
#1  0x00007ffff5dab010 in __static_initialization_and_destruction_0 (__initialize_p=1, __priority=65535) at /work/ceph-2/build/boost/include/boost/asio/impl/error.ipp:121
#2  _GLOBAL__sub_I_buffer.cc(void) () at /work/ceph-2/src/common/buffer.cc:2289
#3  0x00007ffff7de76ca in call_init (l=<optimized out>, argc=argc@entry=8, argv=argv@entry=0x7fffffffdc38, env=env@entry=0x7fffffffdc80) at dl-init.c:72
#4  0x00007ffff7de77db in call_init (env=0x7fffffffdc80, argv=0x7fffffffdc38, argc=8, l=<optimized out>) at dl-init.c:30
#5  _dl_init (main_map=main_map@entry=0xc41730, argc=8, argv=0x7fffffffdc38, env=0x7fffffffdc80) at dl-init.c:120
#6  0x00007ffff7dec8f2 in dl_open_worker (a=a@entry=0x7fffffffc4a0) at dl-open.c:575
#7  0x00007ffff7de7574 in _dl_catch_error (objname=objname@entry=0x7fffffffc490, errstring=errstring@entry=0x7fffffffc498, mallocedp=mallocedp@entry=0x7fffffffc48f, 
    operate=operate@entry=0x7ffff7dec4e0 <dl_open_worker>, args=args@entry=0x7fffffffc4a0) at dl-error.c:187
#8  0x00007ffff7debdb9 in _dl_open (file=0x7ffff674b578 "/work/ceph-2/build/lib/cython_modules/lib.3/rados.cpython-35m-x86_64-linux-gnu.so", mode=-2147483646, 
    caller_dlopen=0x60765a <_PyImport_FindSharedFuncptr+138>, nsid=-2, argc=<optimized out>, argv=<optimized out>, env=0x7fffffffdc80) at dl-open.c:660
#9  0x00007ffff75ecf09 in dlopen_doit (a=a@entry=0x7fffffffc6d0) at dlopen.c:66
#10 0x00007ffff7de7574 in _dl_catch_error (objname=0xb49940, errstring=0xb49948, mallocedp=0xb49938, operate=0x7ffff75eceb0 <dlopen_doit>, args=0x7fffffffc6d0) at dl-error.c:187
#11 0x00007ffff75ed571 in _dlerror_run (operate=operate@entry=0x7ffff75eceb0 <dlopen_doit>, args=args@entry=0x7fffffffc6d0) at dlerror.c:163
#12 0x00007ffff75ecfa1 in __dlopen (file=<optimized out>, mode=<optimized out>) at dlopen.c:87
#13 0x000000000060765a in _PyImport_FindSharedFuncptr ()
#14 0x000000000061253b in _PyImport_LoadDynamicModuleWithSpec ()
#15 0x0000000000612a68 in ?? ()
#16 0x00000000004ea9b6 in PyCFunction_Call ()
#17 0x0000000000542565 in PyEval_EvalFrameEx ()
#18 0x0000000000544f43 in ?? ()
#19 0x0000000000541002 in PyEval_EvalFrameEx ()
#20 0x00000000005405e4 in PyEval_EvalFrameEx ()
#21 0x00000000005405e4 in PyEval_EvalFrameEx ()
#22 0x00000000005405e4 in PyEval_EvalFrameEx ()
#23 0x00000000005405e4 in PyEval_EvalFrameEx ()
#24 0x0000000000545d8b in PyEval_EvalCodeEx ()
#25 0x00000000004ec9a3 in ?? ()
#26 0x00000000005bfd87 in PyObject_Call ()
#27 0x00000000005c0bb6 in _PyObject_CallMethodIdObjArgs ()
#28 0x0000000000535715 in PyImport_ImportModuleLevelObject ()
#29 0x000000000054ec68 in ?? ()
#30 0x00000000004ea927 in PyCFunction_Call ()
#31 0x00000000005bfd87 in PyObject_Call ()
#32 0x0000000000531200 in PyEval_CallObjectWithKeywords ()
#33 0x000000000053efa3 in PyEval_EvalFrameEx ()
#34 0x0000000000544f43 in ?? ()
#35 0x0000000000545c3f in PyEval_EvalCode ()
#36 0x0000000000622642 in ?? ()
#37 0x0000000000624aea in PyRun_FileExFlags ()
#38 0x00000000006252dc in PyRun_SimpleFileExFlags ()
#39 0x000000000063e616 in Py_Main ()
#40 0x00000000004d1761 in main ()

More experimenting

rzarz@ubulap:/work/ceph-2/build$ /work/ceph-2/build/bin/ceph -c /work/ceph-2/build/ceph.conf config assimilate-conf -i -
*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
always_empty_bptr_t on memory: 0x7f4169532510
double_dtor_hunter on memory: 0x7f4169532500 label: defined inside bl
double_dtor_hunter on memory: 0x7f416919dd88 label: defined outside bl
always_empty_bptr_t on memory: 0x7f4169532510
double_dtor_hunter on memory: 0x7f4169532500 label: defined inside bl
double_dtor_hunter on memory: 0x7f4169532528 label: defined outside bl
2020-06-30T12:53:03.826+0200 7f415d5ea700 -1 WARNING: all dangerous and experimental features are enabled.
2020-06-30T12:53:03.846+0200 7f415d5ea700 -1 WARNING: all dangerous and experimental features are enabled.
~double_dtor_hunter on memory: 0x7f4169532528
~double_dtor_hunter on memory: 0x7f4169532500
~always_empty_bptr_t on memory: 0x7f4169532510
~double_dtor_hunter on memory: 0x7f416919dd88
~double_dtor_hunter on memory: 0x7f4169532500
~always_empty_bptr_t on memory: 0x7f4169532510
/work/ceph-2/src/include/buffer.h: In function 'void ceph::canary_t::is_alive_or_die() const' thread 7f416b409700 time 2020-06-30T12:53:09.819971+0200
/work/ceph-2/src/include/buffer.h: 100: ceph_abort_msg("Canary is dead! Oops...")
 ceph version 16.0.0-2442-g36b32ed (36b32ed810a54b9c42b9981d66d5c70a8064d517) pacific (dev)
 1: (ceph::__ceph_abort(char const*, int, char const*, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)+0xf4) [0x7f41605ea59c]
 2: (ceph::buffer::v15_2_0::list::always_empty_bptr_t::~always_empty_bptr_t()+0x126) [0x7f4160929be6]
 3: (()+0x39ff8) [0x7f416ac62ff8]
 4: (()+0x3a045) [0x7f416ac63045]
 5: /usr/bin/python3.5() [0x623edf]
 6: /usr/bin/python3.5() [0x623fca]
 7: (PyErr_PrintEx()+0x36) [0x624036]
 8: (PyRun_SimpleFileExFlags()+0x1d9) [0x6252f9]
 9: (Py_Main()+0x456) [0x63e616]
 10: (main()+0xe1) [0x4d1761]
 11: (__libc_start_main()+0xf0) [0x7f416ac49830]
 12: (_start()+0x29) [0x5d57c9]
Aborted
--- a/src/common/buffer.cc
+++ b/src/common/buffer.cc
@@ -48,6 +48,14 @@ using namespace ceph;
 #define CEPH_BUFFER_ALLOC_UNIT  4096u
 #define CEPH_BUFFER_APPEND_SIZE (CEPH_BUFFER_ALLOC_UNIT - sizeof(raw_combined))
 
+ceph::double_dtor_hunter::double_dtor_hunter(char* label) {
+  std::cout << __func__ << " on memory: " << (void*)this
+            << " label: " << label << std::endl;
+}
+ceph::double_dtor_hunter::~double_dtor_hunter() {
+  std::cout << __func__ << " on memory: " << (void*)this << std::endl;
+}
+
 #ifdef BUFFER_DEBUG
 static ceph::spinlock debug_lock;
 # define bdout { std::lock_guard<ceph::spinlock> lg(debug_lock); std::cout
@@ -2281,3 +2296,6 @@ const boost::system::error_category& buffer_category() noexcept {
 }
 }
 }
+
+static ceph::double_dtor_hunter freehunter("defined outside bl");
diff --git a/src/include/buffer.h b/src/include/buffer.h
index d2c3156..6774e29 100644
--- a/src/include/buffer.h
+++ b/src/include/buffer.h
@@ -79,6 +79,11 @@ template<typename T> class DencDumper;
 
 namespace ceph {
 
+struct double_dtor_hunter {
+  double_dtor_hunter(char* label);
+  ~double_dtor_hunter();
+};
+
 #define BUFFER_CORRUPTION_DEBUG
 #ifdef BUFFER_CORRUPTION_DEBUG
 class canary_t {
@@ -956,11 +961,16 @@ struct error_code;
       return page_aligned_appender(this, min_pages);
     }
 
   struct always_empty_bptr_t : public ptr {
     always_empty_bptr_t();
     ~always_empty_bptr_t();
   };
   private:
     // always_empty_bptr has no underlying raw but its _len is always 0.
     // This is useful for e.g. get_append_buffer_unused_tail_length() as
     // it allows to avoid conditionals on hot paths.
     static always_empty_bptr_t always_empty_bptr;
+    static double_dtor_hunter hunter_in_bl;
     ptr_node& refill_append_space(const unsigned len);
 
   public:

What are the linking dependencies of the rados Python binding?

rzarz@ubulap:/work/ceph-2/build$ ldd /work/ceph-2/build/lib/cython_modules/lib.3/rados.cpython-35m-x86_64-linux-gnu.so 
/work/ceph-2/build/lib/cython_modules/lib.3/rados.cpython-35m-x86_64-linux-gnu.so: /usr/lib/x86_64-linux-gnu/librados.so.2: no version information available (required by /work/ceph-2/build/lib/cython_modules/lib.3/rados.cpython-35m-x86_64-linux-gnu.so)
        linux-vdso.so.1 =>  (0x00007ffdc336d000)
        librados.so.2 => /usr/lib/x86_64-linux-gnu/librados.so.2 (0x00007f881c4d2000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f881c108000)
        libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f881bf04000)
        libboost_thread.so.1.58.0 => /usr/lib/x86_64-linux-gnu/libboost_thread.so.1.58.0 (0x00007f881bcde000)
        libboost_random.so.1.58.0 => /usr/lib/x86_64-linux-gnu/libboost_random.so.1.58.0 (0x00007f881bad7000)
        libblkid.so.1 => /lib/x86_64-linux-gnu/libblkid.so.1 (0x00007f881b896000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f881b679000)
        libnss3.so => /usr/lib/x86_64-linux-gnu/libnss3.so (0x00007f881b332000)
        libsmime3.so => /usr/lib/x86_64-linux-gnu/libsmime3.so (0x00007f881b106000)
        libnspr4.so => /usr/lib/x86_64-linux-gnu/libnspr4.so (0x00007f881aec7000)
        librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f881acbf000)
        libboost_iostreams.so.1.58.0 => /usr/lib/x86_64-linux-gnu/libboost_iostreams.so.1.58.0 (0x00007f881aaa6000)
        libboost_system.so.1.58.0 => /usr/lib/x86_64-linux-gnu/libboost_system.so.1.58.0 (0x00007f881a8a2000)
        libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f881a4bf000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f881a1b6000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f8825f25000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f8819f9e000)
        libuuid.so.1 => /lib/x86_64-linux-gnu/libuuid.so.1 (0x00007f8819d99000)
        libnssutil3.so => /usr/lib/x86_64-linux-gnu/libnssutil3.so (0x00007f8819b6c000)
        libplc4.so => /usr/lib/x86_64-linux-gnu/libplc4.so (0x00007f8819967000)
        libplds4.so => /usr/lib/x86_64-linux-gnu/libplds4.so (0x00007f8819763000)
        libbz2.so.1.0 => /lib/x86_64-linux-gnu/libbz2.so.1.0 (0x00007f8819553000)
        libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f8819339000)

libceph-common.so is pulled in as a dependency of librados.so.2:

rzarz@ubulap:/work/ceph-2/build$ ldd /work/ceph-2/build/lib/librados.so.2
        linux-vdso.so.1 =>  (0x00007ffefd977000)
        libceph-common.so.2 => /work/ceph-2/build/lib/libceph-common.so.2 (0x00007f1442531000)
        libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f144232d000)
        librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f1442125000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f1441f08000)
        libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f1441b25000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f144190d000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f1441543000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f144b709000)
        libresolv.so.2 => /lib/x86_64-linux-gnu/libresolv.so.2 (0x00007f1441328000)
        libblkid.so.1 => /lib/x86_64-linux-gnu/libblkid.so.1 (0x00007f14410e7000)
        libcrypto.so.1.0.0 => /lib/x86_64-linux-gnu/libcrypto.so.1.0.0 (0x00007f1440ca2000)
        libudev.so.1 => /lib/x86_64-linux-gnu/libudev.so.1 (0x00007f144b8e3000)
        libibverbs.so.1 => /usr/lib/libibverbs.so.1 (0x00007f1440a93000)
        librdmacm.so.1 => /usr/lib/x86_64-linux-gnu/librdmacm.so.1 (0x00007f144087c000)
        libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f1440662000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f1440359000)
        libuuid.so.1 => /lib/x86_64-linux-gnu/libuuid.so.1 (0x00007f1440154000)

To which DSO does bufferlist really belong?

rzarz@ubulap:/work/ceph-2/build$ nm -DC lib/libceph-common.so | grep always_empty_bptr
0000000008e42d70 B ceph::buffer::v15_2_0::list::always_empty_bptr
00000000005cdf20 T ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t()
00000000005cdf20 T ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t()
00000000005ceac0 T ceph::buffer::v15_2_0::list::always_empty_bptr_t::~always_empty_bptr_t()
00000000005ceac0 T ceph::buffer::v15_2_0::list::always_empty_bptr_t::~always_empty_bptr_t()
rzarz@ubulap:/work/ceph-2/build$ nm -DC lib/librados.so.2 | grep always_empty_bptr
000000000038e510 B ceph::buffer::v15_2_0::list::always_empty_bptr
00000000000879d0 T ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t()
00000000000879d0 T ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t()
0000000000088570 T ceph::buffer::v15_2_0::list::always_empty_bptr_t::~always_empty_bptr_t()
0000000000088570 T ceph::buffer::v15_2_0::list::always_empty_bptr_t::~always_empty_bptr_t()

Oops?

Oops!

set(ceph_common_objs
  # ...
  $<TARGET_OBJECTS:common_buffer_obj>
# ...
add_library(common STATIC ${ceph_common_objs})
target_link_libraries(common ${ceph_common_deps})

add_library(ceph-common SHARED ${ceph_common_objs})
# C/C++ API
add_library(librados ${CEPH_SHARED}
  # ...
  $<TARGET_OBJECTS:common_buffer_obj>)
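
The consequence can be sketched in a single file — a hedged approximation, not the real Ceph/glibc code (fake_bptr stands in for buffer::ptr, std::atexit() stands in for __cxa_atexit(), and the two DSOs are simulated by calling the same init function twice). With common_buffer_obj linked into both libceph-common.so and librados.so.2, each library carries its own _GLOBAL__sub_I_buffer.cc in .init_array; symbol interposition makes both of them construct the very same always_empty_bptr and both queue its destructor for exit time — which is the double __run_exit_handlers destruction seen in the backtraces above.

#include <cstdio>
#include <cstdlib>

struct fake_bptr {
  void construct() { std::puts("ctor on always_empty_bptr"); }
  void destruct()  { std::puts("dtor on always_empty_bptr"); }
};

static fake_bptr always_empty_bptr;          // one interposed symbol, one address

static void queued_dtor() { always_empty_bptr.destruct(); }

static void global_sub_I_buffer_cc() {       // what each DSO's init entry does
  always_empty_bptr.construct();
  std::atexit(queued_dtor);                  // destructor queued once per DSO
}

int main() {
  global_sub_I_buffer_cc();                  // .init_array of libceph-common.so
  global_sub_I_buffer_cc();                  // .init_array of librados.so.2
  return 0;                                  // exit handlers destroy it twice
}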

The fix

diff --git a/src/librados/CMakeLists.txt b/src/librados/CMakeLists.txt
index 6335d0c..6e2c8ce 100644
--- a/src/librados/CMakeLists.txt
+++ b/src/librados/CMakeLists.txt
@@ -8,8 +8,7 @@ add_library(librados_impl STATIC
 # C/C++ API
 add_library(librados ${CEPH_SHARED}
   librados_c.cc
-  librados_cxx.cc
-  $<TARGET_OBJECTS:common_buffer_obj>)
+  librados_cxx.cc)
 if(ENABLE_SHARED)
   set_target_properties(librados PROPERTIES
     OUTPUT_NAME rados
rzarz@ubulap:/work/ceph-2/build$ nm -DC lib/librados.so.2 | grep always_empty_bptr
                 U ceph::buffer::v15_2_0::list::always_empty_bptr
rzarz@ubulap:/work/ceph-2/build$ /work/ceph-2/build/bin/ceph -c /work/ceph-2/build/ceph.conf config assimilate-conf -i -
*** DEVELOPER MODE: setting PATH, PYTHONPATH and LD_LIBRARY_PATH ***
always_empty_bptr_t on memory: 0x7f8cff72ed70
double_dtor_hunter on memory: 0x7f8cff72ed60 label: defined inside bl
double_dtor_hunter on memory: 0x7f8cff72ed88 label: defined outside bl
2020-06-30T13:32:51.445+0200 7f8cf3b7b700 -1 WARNING: all dangerous and experimental features are enabled.
2020-06-30T13:32:51.477+0200 7f8cf3b7b700 -1 WARNING: all dangerous and experimental features are enabled.
~double_dtor_hunter on memory: 0x7f8cff72ed88
~double_dtor_hunter on memory: 0x7f8cff72ed60
~always_empty_bptr_t on memory: 0x7f8cff72ed70
@rzarzynski (Author)

Let's take a look at how the statics are handled without the fix:

rzarz@ubulap:/work/ceph-2/build$ objdump -dC lib/librados.so.2 
...
000000000003fe70 <_GLOBAL__sub_I_buffer.cc>:
   3fe70:       55                      push   %rbp
   3fe71:       48 8d 3d e8 06 35 00    lea    0x3506e8(%rip),%rdi        # 390560 <std::__ioinit>
   3fe78:       53                      push   %rbx
   3fe79:       48 83 ec 08             sub    $0x8,%rsp
   3fe7d:       e8 5e 50 ff ff          callq  34ee0 <std::ios_base::Init::Init()@plt>
   3fe82:       48 8b 3d 5f 81 34 00    mov    0x34815f(%rip),%rdi        # 387fe8 <_DYNAMIC+0x598>
   3fe89:       48 8d 15 60 91 34 00    lea    0x349160(%rip),%rdx        # 388ff0 <__dso_handle>
   3fe90:       48 8d 35 c9 06 35 00    lea    0x3506c9(%rip),%rsi        # 390560 <std::__ioinit>
   3fe97:       e8 a4 44 ff ff          callq  34340 <__cxa_atexit@plt>
   3fe9c:       48 8d 3d 68 31 0f 00    lea    0xf3168(%rip),%rdi        # 13300b <typeinfo name for std::_Sp_counted_ptr<librados::ObjListCtx*, (__gnu_cxx::_Lock_policy)2>+0x1eb>
   3fea3:       e8 d8 4d ff ff          callq  34c80 <get_env_bool(char const*)@plt>
   3fea8:       48 8b 1d 21 7e 34 00    mov    0x347e21(%rip),%rbx        # 387cd0 <_DYNAMIC+0x280>
   3feaf:       88 05 9b 06 35 00       mov    %al,0x35069b(%rip)        # 390550 <buffer_track_crc>
   3feb5:       48 89 df                mov    %rbx,%rdi
   3feb8:       e8 03 3c ff ff          callq  33ac0 <ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t()@plt>
...
rzarz@ubulap:/work/ceph-2/build$ objdump -s -j .init_array lib/librados.so.2 

lib/librados.so.2:     file format elf64-x86-64

Contents of section .init_array:
 3819c0 a0150400 00000000 60f80300 00000000  ........`.......
 3819d0 60fb0300 00000000 70fe0300 00000000  `.......p.......
 3819e0 80010400 00000000 60040400 00000000  ........`.......
 3819f0 10080400 00000000 800b0400 00000000  ................
 381a00 d00b0400 00000000 700c0400 00000000  ........p.......
 381a10 500f0400 00000000 a0100400 00000000  P...............
 381a20 50130400 00000000                    P.......        

Keep in mind the addresses in .init_array are stored little-endian, so they need a byte-order swap when read (70fe0300 is 0x0003fe70).
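
For illustration only (a throwaway helper, not part of the gist's code), decoding one such 8-byte entry shows how the little-endian bytes map back to the _GLOBAL__sub_I_buffer.cc address:

#include <cstdint>
#include <cstdio>

int main() {
  // "70 fe 03 00 00 00 00 00" from the .init_array dump above
  const unsigned char entry[8] = {0x70, 0xfe, 0x03, 0x00, 0x00, 0x00, 0x00, 0x00};
  std::uint64_t addr = 0;
  for (int i = 7; i >= 0; --i)
    addr = (addr << 8) | entry[i];           // least significant byte comes first
  std::printf("0x%jx\n", static_cast<std::uintmax_t>(addr));   // prints 0x3fe70
}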

rzarz@ubulap:/work/ceph-2/build$ objdump -dC lib/libceph-common.so.2
...
00000000002e8ea0 <_GLOBAL__sub_I_buffer.cc>:
  2e8ea0:       55                      push   %rbp
  2e8ea1:       48 8d 3d 18 bf b6 08    lea    0x8b6bf18(%rip),%rdi        # 8e54dc0 <std::__ioinit>
  2e8ea8:       53                      push   %rbx
  2e8ea9:       48 83 ec 08             sub    $0x8,%rsp
  2e8ead:       e8 fe c4 f9 ff          callq  2853b0 <std::ios_base::Init::Init()@plt>
  2e8eb2:       48 8b 3d 47 65 94 00    mov    0x946547(%rip),%rdi        # c2f400 <_DYNAMIC+0x1d28>
  2e8eb9:       48 8d 15 80 91 94 00    lea    0x949180(%rip),%rdx        # c32040 <__dso_handle>
  2e8ec0:       48 8d 35 f9 be b6 08    lea    0x8b6bef9(%rip),%rsi        # 8e54dc0 <std::__ioinit>
  2e8ec7:       e8 64 c1 f9 ff          callq  285030 <__cxa_atexit@plt>
  2e8ecc:       48 8d 3d 21 28 5e 00    lea    0x5e2821(%rip),%rdi        # 8cb6f4 <typeinfo name for std::_Sp_counted_ptr_inplace<RDMADispatcher, std::allocator<RDMADispatcher>, (__gnu_cxx::_Lock_policy)2>+0x1d4>
  2e8ed3:       e8 a8 e2 10 00          callq  3f7180 <get_env_bool(char const*)>
  2e8ed8:       48 8b 1d 91 61 94 00    mov    0x946191(%rip),%rbx        # c2f070 <_DYNAMIC+0x1998>
  2e8edf:       88 05 cb be b6 08       mov    %al,0x8b6becb(%rip)        # 8e54db0 <buffer_track_crc>
  2e8ee5:       48 89 df                mov    %rbx,%rdi
  2e8ee8:       e8 f3 ca 2d 00          callq  5c59e0 <ceph::buffer::v15_2_0::list::always_empty_bptr_t::always_empty_bptr_t()>
...
rzarz@ubulap:/work/ceph-2/build$ objdump -s -j .init_array lib/libceph-common.so 

lib/libceph-common.so:     file format elf64-x86-64

Contents of section .init_array:
...
 c10b28 a08e2e00 00000000 b0912e00 00000000  ................
...

@smithfarm

Does this fix need to be backported? And how far? Just to octopus?

@rzarzynski (Author)

Just for the record:

Yeah, git branch --contains says that the introducing commit 7bf6b5ee1208a359826c74ab033e6bbbfc65969f (Thu Jan 31 16:06:40 2019 -0500) is also in octopus.
