Working with large streamed files can be very memory-hungry: if you process slowly but download fast, you may end up holding close to the entire downloaded file in memory while it waits to be processed. That can turn into a very expensive process.
What if, instead of keeping the whole "waiting list" in memory, we store it on the local SSD?
- You open a streamed download of the feed and write it, chunk by chunk, into a local buffer file.
- When the stream finishes, you append an END_TOKEN value to the file.
- Meanwhile, you open that local file and read from it as soon as there is something to read.
- When you read a piece of content that ends with END_TOKEN, you're done.
- You remove the buffer file.
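The steps above can be sketched in plain PHP. The file path, chunk size, and END_TOKEN value below are assumptions for illustration, not the project's actual constants, and the remote download is simulated with an in-memory string:

```php
<?php
// Sketch of the buffer-file approach described above (assumed names/values).
const BUFFER_FILE = '/tmp/feed.buffer';   // assumed path, not the project's
const END_TOKEN   = "<<<END_TOKEN>>>";    // assumed sentinel value
const CHUNK_SIZE  = 8192;

// Writer side: stream the "download" straight to disk instead of memory.
// A repeated string stands in for the remote feed here.
$source = str_repeat("some feed data\n", 1000);
$out = fopen(BUFFER_FILE, 'wb');
foreach (str_split($source, CHUNK_SIZE) as $chunk) {
    fwrite($out, $chunk);                 // each chunk goes directly to disk
}
fwrite($out, END_TOKEN);                  // mark the end of the stream
fclose($out);

// Reader side: consume the file chunk by chunk. Memory stays bounded by
// the chunk size, not by the total feed size.
$in   = fopen(BUFFER_FILE, 'rb');
$tail = '';
$done = false;
while (!$done) {
    $chunk = fread($in, CHUNK_SIZE);
    if ($chunk === '' || $chunk === false) {
        usleep(10000);                    // nothing to read yet; wait a bit
        continue;
    }
    $tail .= $chunk;
    if (str_ends_with($tail, END_TOKEN)) {
        $done = true;                     // sentinel found: stream is complete
        $tail = substr($tail, 0, -strlen(END_TOKEN));
    }
    // ...process $tail here...
    // keep only enough bytes to detect a token split across two chunks
    $tail = substr($tail, -strlen(END_TOKEN));
}
fclose($in);
unlink(BUFFER_FILE);                      // remove the buffer file
echo "done\n";
```

In a real run the writer and reader would be separate processes (or interleaved in one loop), but the sentinel logic is the same.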
This is for testing purposes only. Not tested in production.
- Add your own feed file (any size; the bigger, the better)
php index.php
In your console, every second, you will see the memory consumed by the PHP process (both the regular and the real usage) in MB. You will also see an `x`
character each time some content is read from the stream, and a 'LAST' token when the whole file has been read.
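The "regular" and "real" figures likely map to PHP's two memory counters; a minimal sketch of such a report line (the exact output format of the project may differ):

```php
<?php
// memory_get_usage(false) = memory used by PHP scripts ("regular")
// memory_get_usage(true)  = memory allocated from the OS ("real")
$regular = memory_get_usage(false) / 1024 / 1024;
$real    = memory_get_usage(true)  / 1024 / 1024;
printf("mem: %.2f MB (regular) / %.2f MB (real)\n", $regular, $real);
```

In the project this report would run inside a loop with a one-second sleep while the stream is being consumed.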