Last active April 4, 2020 15:34
[TIP] Process large amounts of bulk tasks
# Queuing process vs one call
## One call
A good way to kill a server: it can hit an execution-time limit, is not optimised for large amounts of tasks, and makes for a very bad user experience.
## Queuing
The tasks are added to a queue and processed one by one with a cron job: this preserves both user experience and performance.
# For each task, generate one .json/.txt file in a directory
- Loop over the tasks
- For each task, create a .json/.txt file with all the data required for processing
- Name the file with an id, a timestamp, or whatever else is useful
- Store all the files corresponding to the tasks you want to queue in a single directory
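The enqueue step above can be sketched like this — a minimal example, assuming a local `queue/` directory and JSON task files named with a nanosecond timestamp so they sort in arrival order (both the directory name and the naming scheme are illustrative choices, not prescribed by the tip):

```python
import json
import time
from pathlib import Path

QUEUE_DIR = Path("queue")  # hypothetical queue directory

def enqueue(task_data: dict) -> Path:
    """Write one task as a .json file; the queue is just the directory."""
    QUEUE_DIR.mkdir(exist_ok=True)
    # A nanosecond timestamp name keeps files sortable in arrival order
    path = QUEUE_DIR / f"{time.time_ns()}.json"
    path.write_text(json.dumps(task_data))
    return path

# Loop over the tasks and create one file per task
for task in [{"email": "a@example.com"}, {"email": "b@example.com"}]:
    enqueue(task)
```

Because each task is its own file, enqueuing is cheap and atomic enough for this use case: the web request only pays for a small file write, never for the actual processing.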
# Then create a script to read and process X .json/.txt files in the directory in one call
- Get the X files you want to process at once
- For each file, open it and process its content (send a mail, run an update, etc.)
- Close the file
- If the send or update is successful, remove the file
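The worker side could look like the sketch below — assuming the same hypothetical `queue/` directory and a placeholder `handle()` function standing in for the real treatment (mail sending, updates, etc.). A file is only deleted after its task succeeds, so a failed task stays in the queue and is retried on the next run:

```python
import json
from pathlib import Path

QUEUE_DIR = Path("queue")  # hypothetical: the directory the enqueue step fills
BATCH_SIZE = 10

def handle(data: dict) -> None:
    # Placeholder treatment; replace with mail sending, DB update, etc.
    print("processed", data)

def process_batch(batch_size: int = BATCH_SIZE) -> int:
    """Read and process up to batch_size queued files; delete each on success."""
    done = 0
    # Oldest first, thanks to the sortable timestamp file names
    for path in sorted(QUEUE_DIR.glob("*.json"))[:batch_size]:
        data = json.loads(path.read_text())
        try:
            handle(data)
        except Exception:
            continue          # keep the file: it will be retried next run
        path.unlink()         # success: remove the file from the queue
        done += 1
    return done
```

`read_text()` opens and closes the file in one call, which covers the "close the file" step; the batch limit is what keeps each invocation short and safe to run from cron.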
# Then create a cron job to process the queue every minute
- Example: every minute, read and process 10 files, then remove them from the queue
- That's it
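For the scheduling step, a crontab entry along these lines would run the worker every minute — the interpreter and script paths are hypothetical and depend on where you save the processing script:

```
* * * * * /usr/bin/python3 /path/to/process_queue.py
```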