[Following up on question posted: https://stackoverflow.com/questions/52862040/reduced-named-pipe-fifo-throughput-with-nodejs-consumer/52863010#52863010]
I am trying to use a named pipe to pass a stream of images from a Python process to a Node.js process, ideally at 60 frames per second with 1 megabyte images. The throughput I'm getting is only around 25 frames per second. Surprised that it was so slow, I tested transferring the same frames from one Python process to a second Python process and achieved around 500 frames per second. I'm new to Node.js so I could easily be missing something, but I expected comparable speed. Why is my Node.js program so much slower at consuming data from the named pipe?
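For context, here's the back-of-envelope bandwidth math (just my fps numbers above converted to MB/s, assuming 1 MiB frames):

```javascript
// Convert a frame rate to throughput in MB/s for 1 MiB frames.
const frame = 1024 ** 2;                  // bytes per frame
const mbps = (fps) => fps * frame / 1e6;  // MB/s at a given frame rate

console.log(mbps(60));   // target: ~62.9 MB/s
console.log(mbps(500));  // python -> python: ~524 MB/s
console.log(mbps(25));   // python -> node: ~26 MB/s
```

So the pipe itself clearly has plenty of headroom; the bottleneck seems to be on the Node.js side.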
I'm using a Node.js readable stream and reading on the 'readable' event:
const fs = require('fs');

const fifo = fs.createReadStream(process.argv[2], { highWaterMark: 4 * 1024 ** 2 });

let t0 = Date.now();
fifo.on('readable', () => {
  const chunk = fifo.read(1024 ** 2);
  if (chunk !== null) {
    const t1 = Date.now();
    process.stdout.write(`${(t1 - t0) / 1000}s, ${1000 / (t1 - t0)}fps\n`);
    t0 = t1;
  }
});
fifo.on('end', () => {
  process.stdout.write('end');
});
My Python producer just writes bytes to the named pipe as if it were a file:

import sys

import numpy as np

im = np.random.randint(0, 255, size=(1024, 1024)).astype(np.uint8).ravel()
with open(sys.argv[1], 'wb') as f:
    while True:
        f.write(im.tobytes())
        f.flush()
The Python reader just reads from the named pipe as if it were a file:

import sys
import time

l = 1024 ** 2
t0 = time.time()
with open(sys.argv[1], 'rb') as f:
    while True:
        im = f.read(l)
        t1 = time.time()
        print('{}s, {}fps'.format(t1 - t0, 1 / (t1 - t0)))
        t0 = t1
To test the Python-to-JavaScript transfer:
mkfifo /tmp/video; python producer.py /tmp/video & node reader.js /tmp/video
And to test the Python-to-Python transfer:
mkfifo /tmp/video; python producer.py /tmp/video & python reader.py /tmp/video
I'm on a Mac (macOS 10.13.6, 2.7 GHz Intel Core i5), with Python 3.7.0 and Node v8.9.1.
I also tried using the 'data' event for the Node.js reader, and it was just as slow. Could the overhead of Node.js events be enough to slow down the reading this much?
Any ideas would be greatly appreciated!