Get the DB ready with a set of 10K documents, each around 1K in size:

from pymongo import MongoClient

client = MongoClient('localhost', 27017)
db = client['test']
collection = db['test']
collection.delete_many({})  # start from an empty collection
# Seed 10K documents of roughly 1K each
collection.insert_many(
    [{'i': i, 'payload': 'x' * 1024} for i in range(10000)]
)
Operation size is around 50 bytes; write concern is set to acknowledged.

Bulk throughput vs number of update operations per bulk:

| Updates per bulk | 2 | 4 | 8 | 16 | 32 | 64 | 128 | 256 | 512 |
|---|---|---|---|---|---|---|---|---|---|
| Throughput | 4.5K | 9K | 12K | 16K | 19K | 22K | 23K | 24K | 27K |
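The curve flattens because every bulk pays one fixed round trip regardless of how many updates it carries. A toy cost model reproduces the shape; the 200µs/30µs figures below are illustrative assumptions, not measurements from this benchmark:

```python
# Toy cost model: one fixed round trip per bulk plus a per-update cost.
# RTT and PER_UPDATE are made-up illustrative values.
RTT = 200e-6        # fixed cost per bulk request (seconds)
PER_UPDATE = 30e-6  # incremental cost per update inside a bulk

def throughput(bulk_size):
    # updates per second achievable at a given bulk size
    return bulk_size / (RTT + bulk_size * PER_UPDATE)

for size in (2, 4, 8, 16, 32, 64, 128, 256, 512):
    print("%4d updates/bulk -> %6d updates/s" % (size, throughput(size)))
```

The model shows the same diminishing returns as the table: doubling the bulk size nearly doubles throughput while the round trip dominates, then gains shrink as the per-update cost takes over and throughput approaches 1/PER_UPDATE.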
1 publisher > 1 queue
queues = 10K
messages per queue = 100
concurrent consumers = 20
queues bound per consumer = 50
QoS (prefetch) per consumer = 40
asynchronous pattern

Throughput reached 16K messages, why?
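Some back-of-the-envelope arithmetic on the setup above helps frame the question (here the 16K figure is read as messages per second, which is an assumption):

```python
# Numbers taken from the test setup above
queues = 10000
messages_per_queue = 100
consumers = 20
prefetch = 40  # QoS per consumer

total_messages = queues * messages_per_queue  # messages to drain in total
max_in_flight = consumers * prefetch          # upper bound on unacked messages
observed_throughput = 16000                   # assumed to be msg/s

print("total messages:", total_messages)
print("max in-flight (consumers * prefetch):", max_in_flight)
print("full drain takes about %.0f s" % (total_messages / observed_throughput))
```

With at most consumers × prefetch messages in flight, the QoS window is one of the knobs that can cap throughput well below what the broker itself can deliver.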
+-------------------+-------------------------+---------+---------+---------+-------+
|Name               |Parameters               |     Real|     User|      Sys|  Msg/s|
+-------------------+-------------------------+---------+---------+---------+-------+
|Pika_Threads       |{'threads': 2}           |     3.03|     1.24|     0.15|   1650|
|Pika_Threads       |{'threads': 4}           |     1.78|     1.26|     0.19|   2808|
|Pika_Threads       |{'threads': 8}           |     1.48|     1.12|     0.16|   3378|
|Pika_Threads       |{'threads': 16}          |     1.43|     1.10|     0.27|   3496|
|Pika_Threads       |{'threads': 32}          |     1.31|     1.14|     0.30|   3816|
|Pika_Async         |{'connections': 2}       |     2.75|     0.96|     0.07|   1818|
|Pika_Async         |{'connections': 4}       |     1.98|     0.88|     0.09|   2525|
+-------------------+-------------------------+---------+---------+---------+-------+
from django.contrib.auth.hashers import BasePasswordHasher

class TestPasswordHasher(BasePasswordHasher):
    """
    For testing purposes only, to be used in test environments
    to avoid the CPU-bound operations performed by the
    installed password hashers.
    """
    algorithm = "test"

    def encode(self, password, salt=None):
        # No hashing at all: the password is stored as-is.
        return "%s$%s" % (self.algorithm, password)

    def verify(self, password, encoded):
        return self.encode(password) == encoded
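To make Django pick it up, the hasher is listed first in the test settings module; the dotted path below is hypothetical and should point at wherever the class actually lives:

```python
# settings_test.py -- hypothetical test settings fragment
PASSWORD_HASHERS = [
    'myproject.hashers.TestPasswordHasher',  # hypothetical dotted path
]
```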
# More info about ANSI escape sequences
# http://ascii-table.com/ansi-escape-sequences.php
import sys
import random
from time import sleep

ROWS = 10

while True:
    # Redraw ROWS random values in place on each pass
    for _ in range(ROWS):
        sys.stdout.write("%d\n" % random.randint(0, 100))
    sleep(0.1)
    sys.stdout.write("\x1b[%dA" % ROWS)  # ESC[nA: cursor up ROWS lines
Output of this command [1]:

$ python set_memmory_usage.py
1. Difference between list, dict and set containers holding 1M numbers, regarding total size and per-item container overhead
Sizeof with dict type: 48M, overhead per item 50b
Sizeof with set type: 32M, overhead per item 33b
Sizeof with list type: 8M, overhead per item 8b
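Numbers in this ballpark can be reproduced with `sys.getsizeof` from the standard library. Note that `getsizeof` reports only the container itself, not the integers it references, and the exact figures vary across CPython versions:

```python
import sys

N = 1000000
numbers = list(range(N))

for container in (list(numbers), set(numbers), dict.fromkeys(numbers)):
    size = sys.getsizeof(container)  # container overhead only, not the ints
    print("Sizeof with %s type: %dM, overhead per item %db"
          % (type(container).__name__, size // 2**20, size // N))
```

The ordering is the interesting part: a list stores one pointer per item, while sets and dicts carry hash-table slots (and, for dicts, value pointers), hence the much larger per-item overhead.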
----------
class ImmutableRecord(object):
    def __init__(self, name):
        self.__name = name

    @property
    def name(self):
        return self.__name

    def __hash__(self):
        return hash(self.__name)
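A short usage sketch of the class above (reproduced here so the snippet is self-contained): hashing on the private name makes instances usable as dict keys, and the setter-less property rejects reassignment.

```python
class ImmutableRecord(object):
    def __init__(self, name):
        self.__name = name

    @property
    def name(self):
        return self.__name

    def __hash__(self):
        return hash(self.__name)

rec = ImmutableRecord('alice')
print(rec.name)          # read access works through the property
print({rec: 1}[rec])     # hashable, so usable as a dict key
try:
    rec.name = 'bob'     # no setter defined: assignment is rejected
except AttributeError:
    print('name is read-only')
```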
Parsing many lines (1000 lines), repeated 10 times
--------------------------------------------------
Time took Python json (Python version): 0.501879239018308
Time took cythonized Python json: 0.08349561202339828
Time took Python json (C version): 0.05558486797963269
Time took ujson: 0.029640772991115227
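A minimal harness for this kind of measurement using only the standard library; the payload shape below is made up for illustration, and ujson (or a cythonized parser) can be swapped into parse_all the same way when installed:

```python
import json
import timeit

# Hypothetical payload: 1000 short JSON lines
lines = ['{"id": %d, "value": "x"}' % i for i in range(1000)]

def parse_all():
    for line in lines:
        json.loads(line)

# Parse the 1000 lines, repeated 10 times, as in the figures above
elapsed = timeit.timeit(parse_all, number=10)
print("Time took Python json (C version): %s" % elapsed)
```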
var net = require('net');
var sleep = require('sleep');

var client = new net.Socket();
client.connect(6379, '127.0.0.1', function() {
    console.log('Connected');
    sleep.sleep(10); // wait for data and the RST packet
});

client.on('data', function(data) {
    console.log('Received: ' + data);
    client.destroy(); // close the socket once the reply arrives
});