@anthonylouisbsb
Created June 10, 2020 08:04
await: <coroutine object create_client at 0x7f76106c6888>
await: <coroutine object _config_client at 0x7f76106c6a98>
await: <coroutine object _load_cluster_config at 0x7f76106c6ba0>
await: <coroutine object execute_single_node at 0x7f76106c6ca8>
await: <coroutine object _send_shannondb_message at 0x7f76106c6e60>
await: <coroutine object create_connection at 0x7f76106c6f68>
got result from await: <shannondb.protocol.ShannonDBClientProtocol object at 0x7f76106752e8>
Function execution: create_connection - 0.004685878753662109
got result from await: {'cluster_config': {'kafka.producer_config.buffer.memory': '134217728', 'kafka.topic.dictionary.prefix': 'dev-shannondb-dictionary-topic-', 'kafka.allshards.topic': 'dev-shannondb-index-topic-all-shards', 'kafka.consumer_config.fetch.max.wait.ms': '100', 'kafka.consumer_config.database.id': 'shard_0', 'shannondb.nodes': ['shannondb-presto:1234', 'shannondb-presto:1235'], 'kafka.DC01.bootstrap.servers.fallback': 'kafka-dc1:9092', 'kafka.consumer_config.session.timeout.ms': '6000', 'kafka.topic.dictionary.prefix.fallback': 'dev-shannondb-dictionary-topic-fallback-', 'kafka.producer_config.buffer.linger.ms': '100', 'kafka.consumer_config.enable.auto.commit': 'false', 'kafka.database.prefix': 'shard_', 'kafka.producer_config.acks': '1', 'kafka.DC01.bootstrap.servers': 'kafka-dc1:9092', 'kafka.topic.prefix': 'dev-shannondb-index-topic-', 'kafka.consumer_config.value.deserializer': 'org.apache.kafka.common.serialization.ByteArrayDeserializer', 'kafka.topic.prefix.fallback': 'dev-shannondb-index-topic-fallback-', 'kafka.polltimeoutms': '500', 'kafka.sleepTimeWhenEmptyTopic': '100', 'kafka.producer_config.client.id': 'shard_0', 'kafka.producer_config.value.serializer': 'org.apache.kafka.common.serialization.ByteArraySerializer', 'kafka.consumer_config.key.deserializer': 'org.apache.kafka.common.serialization.ByteArrayDeserializer', 'index.shards.1': ['0', '1', '2', '3'], 'index.shards.0': ['0', '1', '2', '3'], 'port': 0, 'shannondb.datacenters.DC01': '0,1', 'kafka.producer_config.compression.type': 'lz4', 'kafka.producer_config.key.serializer': 'org.apache.kafka.common.serialization.ByteArraySerializer', 'kafka.consumer_config.connections.max.idle.ms': '860000000', 'kafka.producer_config.retries': '2', 'disk.0.1': '/opt/shannondb/s1data/node00', 'disk.1.1': '/opt/shannondb/s1data/node01', 'kafka.consumer_config.max.poll.records': '10000', 'kafka.consumer_config.request.timeout.ms': '10000'}, 'status': 'ok'}
Function execution: _send_shannondb_message - 0.0073299407958984375
got result from await: {'cluster_config': {'kafka.producer_config.buffer.memory': '134217728', 'kafka.topic.dictionary.prefix': 'dev-shannondb-dictionary-topic-', 'kafka.allshards.topic': 'dev-shannondb-index-topic-all-shards', 'kafka.consumer_config.fetch.max.wait.ms': '100', 'kafka.consumer_config.database.id': 'shard_0', 'shannondb.nodes': ['shannondb-presto:1234', 'shannondb-presto:1235'], 'kafka.DC01.bootstrap.servers.fallback': 'kafka-dc1:9092', 'kafka.consumer_config.session.timeout.ms': '6000', 'kafka.topic.dictionary.prefix.fallback': 'dev-shannondb-dictionary-topic-fallback-', 'kafka.producer_config.buffer.linger.ms': '100', 'kafka.consumer_config.enable.auto.commit': 'false', 'kafka.database.prefix': 'shard_', 'kafka.producer_config.acks': '1', 'kafka.DC01.bootstrap.servers': 'kafka-dc1:9092', 'kafka.topic.prefix': 'dev-shannondb-index-topic-', 'kafka.consumer_config.value.deserializer': 'org.apache.kafka.common.serialization.ByteArrayDeserializer', 'kafka.topic.prefix.fallback': 'dev-shannondb-index-topic-fallback-', 'kafka.polltimeoutms': '500', 'kafka.sleepTimeWhenEmptyTopic': '100', 'kafka.producer_config.client.id': 'shard_0', 'kafka.producer_config.value.serializer': 'org.apache.kafka.common.serialization.ByteArraySerializer', 'kafka.consumer_config.key.deserializer': 'org.apache.kafka.common.serialization.ByteArrayDeserializer', 'index.shards.1': ['0', '1', '2', '3'], 'index.shards.0': ['0', '1', '2', '3'], 'port': 0, 'shannondb.datacenters.DC01': '0,1', 'kafka.producer_config.compression.type': 'lz4', 'kafka.producer_config.key.serializer': 'org.apache.kafka.common.serialization.ByteArraySerializer', 'kafka.consumer_config.connections.max.idle.ms': '860000000', 'kafka.producer_config.retries': '2', 'disk.0.1': '/opt/shannondb/s1data/node00', 'disk.1.1': '/opt/shannondb/s1data/node01', 'kafka.consumer_config.max.poll.records': '10000', 'kafka.consumer_config.request.timeout.ms': '10000'}, 'status': 'ok'}
Function execution: execute_single_node - 0.007576942443847656
got result from await: {'kafka.producer_config.buffer.memory': '134217728', 'kafka.topic.dictionary.prefix': 'dev-shannondb-dictionary-topic-', 'kafka.allshards.topic': 'dev-shannondb-index-topic-all-shards', 'kafka.consumer_config.fetch.max.wait.ms': '100', 'kafka.consumer_config.database.id': 'shard_0', 'shannondb.nodes': ['shannondb-presto:1234', 'shannondb-presto:1235'], 'kafka.DC01.bootstrap.servers.fallback': 'kafka-dc1:9092', 'kafka.consumer_config.session.timeout.ms': '6000', 'kafka.topic.dictionary.prefix.fallback': 'dev-shannondb-dictionary-topic-fallback-', 'kafka.producer_config.buffer.linger.ms': '100', 'kafka.consumer_config.enable.auto.commit': 'false', 'kafka.database.prefix': 'shard_', 'kafka.producer_config.acks': '1', 'kafka.DC01.bootstrap.servers': 'kafka-dc1:9092', 'kafka.topic.prefix': 'dev-shannondb-index-topic-', 'kafka.consumer_config.value.deserializer': 'org.apache.kafka.common.serialization.ByteArrayDeserializer', 'kafka.topic.prefix.fallback': 'dev-shannondb-index-topic-fallback-', 'kafka.polltimeoutms': '500', 'kafka.sleepTimeWhenEmptyTopic': '100', 'kafka.producer_config.client.id': 'shard_0', 'kafka.producer_config.value.serializer': 'org.apache.kafka.common.serialization.ByteArraySerializer', 'kafka.consumer_config.key.deserializer': 'org.apache.kafka.common.serialization.ByteArrayDeserializer', 'index.shards.1': ['0', '1', '2', '3'], 'index.shards.0': ['0', '1', '2', '3'], 'port': 0, 'shannondb.datacenters.DC01': '0,1', 'kafka.producer_config.compression.type': 'lz4', 'kafka.producer_config.key.serializer': 'org.apache.kafka.common.serialization.ByteArraySerializer', 'kafka.consumer_config.connections.max.idle.ms': '860000000', 'kafka.producer_config.retries': '2', 'disk.0.1': '/opt/shannondb/s1data/node00', 'disk.1.1': '/opt/shannondb/s1data/node01', 'kafka.consumer_config.max.poll.records': '10000', 'kafka.consumer_config.request.timeout.ms': '10000'}
Function execution: _load_cluster_config - 0.0077207088470458984
got result from await: None
Function execution: _config_client - 0.00814509391784668
Function execution: get_client_config - 5.245208740234375e-06
got result from await: <shannondb.ShannonDBClient object at 0x7f76106da748>
Function execution: create_client - 0.08143854141235352
await: <coroutine object create_columns_in_bulk at 0x7f76106a0ba0>
await: <coroutine object create_single_column at 0x7f76106a0d00>
Function execution: get_preferred_datacenter_for_column_creation - 5.7220458984375e-06
await: <coroutine object execute_single_node at 0x7f76106a0e08>
await: <coroutine object _send_shannondb_message at 0x7f76106a0fc0>
await: <coroutine object create_connection at 0x7f760fc3d0f8>
got result from await: <shannondb.protocol.ShannonDBClientProtocol object at 0x7f7610687dd8>
Function execution: create_connection - 0.0013034343719482422
got result from await: {'status': 'ok'}
Function execution: _send_shannondb_message - 0.16424226760864258
got result from await: {'status': 'ok'}
Function execution: execute_single_node - 0.16470718383789062
got result from await: None
Function execution: create_single_column - 0.16493678092956543
got result from await: None
Function execution: create_columns_in_bulk - 0.16501712799072266
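The trace above appears to come from a decorator that wraps each coroutine function, printing the coroutine object before awaiting it, the awaited result, and the wall-clock execution time (the timing-only lines such as `get_client_config` suggest plain functions get the timer without the await logging). A minimal sketch of such a tracing decorator — the decorator name and the demo coroutine `make_connection` are hypothetical, not taken from the shannondb client:

```python
import asyncio
import functools
import time


def trace_async(func):
    """Wrap a coroutine function: log the coroutine object before awaiting,
    the result after awaiting, and the elapsed wall-clock time."""
    @functools.wraps(func)
    async def wrapper(*args, **kwargs):
        coro = func(*args, **kwargs)
        # Prints e.g. "await: <coroutine object make_connection at 0x...>"
        print(f"await: {coro}")
        start = time.time()
        result = await coro
        print(f"got result from await: {result}")
        print(f"Function execution: {func.__name__} - {time.time() - start}")
        return result
    return wrapper


@trace_async
async def make_connection():
    # Stand-in for an I/O-bound call such as create_connection in the log.
    await asyncio.sleep(0.001)
    return {"status": "ok"}


if __name__ == "__main__":
    asyncio.run(make_connection())
```

With nested decorated coroutines, each level prints its own `await:` line on entry and its `got result from await:` / `Function execution:` pair on completion, which reproduces the interleaved ordering seen in the log (all `await:` lines first, then results unwinding from the innermost call outward).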