This example renders thirty frames of the Three.js 'your first scene' example and sends them to a Node.js microservice, which saves them to the filesystem.
test-render-client.html
- Creates the scene, camera, renderer, and kicks off the render loop, which stops after 30 frames have been rendered.
- As an optimization, it never adds the Three.js canvas to the browser DOM; the scene is rendered offscreen while progress is reported on the page.
- It extracts each frame's image data with canvas.toDataURL() and posts the result to a Web Worker for transmission.
- When all frames have been rendered, it sends a 'done' message to the worker. (A sketch of this loop follows the list.)
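Below is a minimal sketch of the client's render loop, assuming the standard Three.js rotating-cube setup; the FRAME_COUNT constant, canvas size, and message shapes are illustrative, not taken from the original file.

```js
const FRAME_COUNT = 30;
const worker = new Worker('test-render-worker.js');

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, 16 / 9, 0.1, 1000);
camera.position.z = 5;

// preserveDrawingBuffer lets the canvas be read back reliably after a render.
const renderer = new THREE.WebGLRenderer({ preserveDrawingBuffer: true });
renderer.setSize(1280, 720);
// Note: renderer.domElement is intentionally never appended to the DOM.

const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

let frame = 0;
const start = performance.now();

function renderLoop() {
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);

  // The expensive step: serialize the canvas and hand the frame to the worker.
  const dataUrl = renderer.domElement.toDataURL('image/png');
  worker.postMessage({ type: 'frame', frame: frame, data: dataUrl });

  if (++frame < FRAME_COUNT) {
    requestAnimationFrame(renderLoop);
  } else {
    worker.postMessage({ type: 'done' });
    const elapsed = performance.now() - start;
    console.log('Total Frames:', FRAME_COUNT, 'Total time:', elapsed + 'ms',
                'ms per frame:', elapsed / FRAME_COUNT);
  }
}

renderLoop();
```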
test-render-worker.js
- Sets up a queue for incoming frames to be sent to the server.
- Services the queue at an interval.
- If a send is already in progress, the service method returns without doing anything.
- If rendering is done and the queue is empty, it closes the socket and terminates the worker.
- Otherwise, if the queue has items, it sets the sending flag and emits a 'frame' event on the socket with an acknowledgement callback.
- The callback clears the sending flag.
- The onmessage handler checks the message type.
- If 'done', it sets the done flag.
- If 'frame', it pushes the frame onto the queue. (A sketch of the worker follows.)
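A sketch of the worker, assuming the Socket.io client can be loaded into the worker with importScripts and that the server listens on localhost:3000 (both assumptions; the original may differ):

```js
// Load the Socket.io client into the worker (URL is an assumption).
importScripts('http://localhost:3000/socket.io/socket.io.js');

const socket = io('http://localhost:3000');
const queue = [];
let sending = false;
let done = false;

// Service the queue at a fixed interval.
const timer = setInterval(service, 10);

function service() {
  if (sending) return; // a frame is already in flight; do nothing

  if (done && queue.length === 0) {
    // Everything has been delivered: shut down cleanly.
    clearInterval(timer);
    socket.close();
    self.close();
    return;
  }

  if (queue.length > 0) {
    sending = true;
    // Emit with an acknowledgement callback; the server invokes it once
    // the frame has been written to disk.
    socket.emit('frame', queue.shift(), function () {
      sending = false;
    });
  }
}

self.onmessage = function (event) {
  const message = event.data;
  if (message.type === 'done') {
    done = true;
  } else if (message.type === 'frame') {
    queue.push(message);
  }
};
```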
test-render-server.js
- Creates a Socket.io server, setting listeners for 'frame' and 'disconnect' events.
- The onFrame handler takes data and a callback.
- It writes the data to disk as '/var/tmp/test-render-server/frame-xxxx.png'.
- It then invokes the acknowledgement callback, confirming to test-render-worker that the data was received.
- The onDisconnect handler removes the event listeners from the connection object. (A sketch of the server follows.)
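A sketch of the server, assuming Socket.io listening on port 3000 and a zero-padding helper for the filename; both details are assumptions:

```js
const fs = require('fs');
const path = require('path');
const io = require('socket.io')(3000);

const OUT_DIR = '/var/tmp/test-render-server';
fs.mkdirSync(OUT_DIR, { recursive: true });

io.on('connection', function (socket) {
  function onFrame(message, ack) {
    // Strip the data-URL prefix and decode the base64 PNG payload.
    const base64 = message.data.replace(/^data:image\/png;base64,/, '');
    const name = 'frame-' + String(message.frame).padStart(4, '0') + '.png';
    fs.writeFile(path.join(OUT_DIR, name), base64, 'base64', function (err) {
      if (err) console.error(err);
      ack(); // confirm receipt so the worker can send the next frame
    });
  }

  function onDisconnect() {
    socket.removeListener('frame', onFrame);
    socket.removeListener('disconnect', onDisconnect);
  }

  socket.on('frame', onFrame);
  socket.on('disconnect', onDisconnect);
});
```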
NOTES: Typical output (without the browser's debugger being open):
Total Frames: 30 Total time: 4165ms ms per frame: 138.83333333333334
The timing shown on the web page reflects only how long it took to render the frames and hand them to the web worker, not the total time to render AND transmit. Transmission time isn't the concern here; only per-frame render time is.
This approach takes about 4 seconds to render 1 second's worth of frames (30 frames at 30fps), and that's with nothing but a rotating cube; a more complex scene would of course take longer.
This setup would be fine if you weren't trying to sync scene elements to an audio track, but in my real application, I am. Audio plays while spectrum analysis runs continuously, so that objects in the scene can have properties such as position and size computed from the audio's low, mid, high, or overall volume. What's likely to happen is that when the server tries to assemble a video from these frames and the audio, the two will be out of sync.
If you comment out line 65 of test-render-client.html (the call to canvas.toDataURL()), the time to render a frame drops to around 20ms or less. The data extraction, then, cannot stay inside the render loop; one possible alternative is sketched below.
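One possible direction (an assumption on my part, not something the original code does) is to skip the PNG encode on the main thread entirely: read the raw pixels with gl.readPixels and transfer the buffer to the worker, deferring encoding to the worker or the server. Note that readPixels still forces a GPU sync, so this would need to be measured, not assumed faster:

```js
// Hypothetical replacement for the toDataURL() call.
const gl = renderer.getContext();
const w = gl.drawingBufferWidth;
const h = gl.drawingBufferHeight;

function captureFrame(frameIndex) {
  // Must run immediately after renderer.render(), before the drawing
  // buffer is cleared.
  const pixels = new Uint8Array(w * h * 4);
  gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

  // Transfer (not copy) the buffer: the main thread gives up ownership,
  // so no large copy happens inside the render loop.
  worker.postMessage(
    { type: 'frame', frame: frameIndex, width: w, height: h, data: pixels.buffer },
    [pixels.buffer]
  );
}
```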