Module that allows an Elm app to execute at most N `Cmd` in parallel, queueing the rest; when one completes, the next `Cmd` in the queue is dispatched
module CmdWorkerPool exposing (Request, State, init, request, update)

import Task exposing (Task)
{-| State

  - `workersLimit` caps the maximum number of concurrent Cmd
  - `workersCount` tracks the current number of Cmd in progress
  - `backlog` stores a List of Cmd that are waiting to be dispatched

Add a field to your model

    type alias Model =
        { counter : Int
        , message : String

        -- add this
        , cmdWorkerPool : CmdWorkerPool.State Msg
        }

And initialize with a maximum number of workers, e.g. `5`

    init : Flags -> ( Model, Cmd Msg )
    init flags =
        let
            doneCmd =
                -- or Cmd.none
                processExit 1

            model =
                { counter = 0
                , message = "Hello"

                -- add this
                , cmdWorkerPool = CmdWorkerPool.init 5 doneCmd
                }

            firstRequests =
                -- prepare a list of CmdWorkerPool.request
                []

            ( newWorkerPool, cmd ) =
                CmdWorkerPool.update model.cmdWorkerPool firstRequests
        in
        ( { model | cmdWorkerPool = newWorkerPool }, cmd )

-}
type State msg
    = State
        { workersLimit : Int
        , workersCount : Int
        , doneCmd : Cmd msg
        , backlog : List (Cmd msg)
        }
{-| Initialize a `State` with

  - `workersLimit` a cap on the number of concurrent workers
  - `doneCmd` a `Cmd msg` to dispatch when everything is done

-}
init : Int -> Cmd msg -> State msg
init workersLimit doneCmd =
    State
        { workersLimit = workersLimit
        , workersCount = 0
        , doneCmd = doneCmd
        , backlog = []
        }
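{- A minimal sketch of app-side initialization (hypothetical: `Msg` stands for
your app's message type; the cap of 3 and the use of `Cmd.none` are
assumptions, not part of this module):

    pool : CmdWorkerPool.State Msg
    pool =
        -- cap at 3 concurrent Cmd; dispatch nothing extra when drained
        CmdWorkerPool.init 3 Cmd.none

-}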
{-| We don't accept raw `Cmd msg` values, because we cannot control whether
they produce a reply. Instead we accept `Request msg`, forcing users to go
through the `request` function.
-}
type Request msg
    = Request (Cmd msg)
{-| Build a `Request` value from a `Task`, for supplying to `update`
-}
request : (Task x a -> Cmd msg) -> Task x a -> Request msg
request attempt task =
    Request (attempt task)
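{- A sketch of building a tracked request (hypothetical: `GotResponse` is an
assumed `Msg` variant taking a `Result`, and `Process.sleep` (from the core
`Process` module) stands in for any real `Task`, e.g. one built with
`Http.task`):

    import Process

    slowRequest : CmdWorkerPool.Request Msg
    slowRequest =
        CmdWorkerPool.request
            (Task.attempt GotResponse)
            (Process.sleep 1000)

-}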
cmdFrom : Request msg -> Cmd msg
cmdFrom (Request cmd) =
    cmd
{-| Given

  - we've just completed a Cmd (so `workersCount - 1`)
  - and a list of new Cmd

Then

1.  execute the first few new Cmd if we have capacity
2.  queue the remaining Cmd
3.  if there are no outstanding Cmd, dispatch a final Cmd

Important notes:

  - `newRequests : List (Request msg)` is used because it isn't correct for
    `update` to receive `List (Cmd msg)`: not every `Cmd msg` replies, and
    with a fire-and-forget `Cmd msg` we cannot know when its work has ended,
    so we could not do the accounting needed to free up workers.
  - `doneCmd : Cmd msg` can be a fire-and-forget Cmd

Usage A:

  - at the end of `init`, call `update model.cmdWorkerPool listOfRequests`
  - this dispatches some of `listOfRequests` and queues the remainder,
    depending on `workersLimit`

Usage B:

  - in your app's `update`, after handling a tracked Cmd, call
    `update model.cmdWorkerPool []`
  - the `[]` means there are no new requests to add
  - if any Cmd are waiting in the queue, they'll be popped off and dispatched
  - if no Cmd are waiting, `doneCmd : Cmd msg` will be dispatched

-}
update : State msg -> List (Request msg) -> ( State msg, Cmd msg )
update (State model) newRequests =
    let
        workersCount =
            -- no negative numbers
            max 0 (model.workersCount - 1)

        availableCapacity =
            model.workersLimit - workersCount

        allPendingCmdList =
            model.backlog ++ List.map cmdFrom newRequests

        ( cmdsNow, cmdsLater ) =
            if availableCapacity <= 0 then
                ( [], allPendingCmdList )

            else
                ( List.take availableCapacity allPendingCmdList
                , List.drop availableCapacity allPendingCmdList
                )

        _ =
            -- remove before building with `elm make --optimize`
            Debug.log "update"
                ( ( "workers active", workersCount )
                , ( "workers new", List.length cmdsNow )
                , ( "cmd backlog", List.length cmdsLater )
                )
    in
    case ( workersCount, List.length cmdsNow, List.length cmdsLater ) of
        ( 0, 0, 0 ) ->
            -- keep the stored count consistent with the decrement above
            ( State { model | workersCount = 0 }, model.doneCmd )

        ( newWorkersCount, cmdsNowCount, _ ) ->
            ( State
                { model
                    | workersCount = newWorkersCount + cmdsNowCount
                    , backlog = cmdsLater
                }
            , Cmd.batch cmdsNow
            )
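{- Usage B as a sketch inside an app's `update` (hypothetical: `GotResponse`
is the message produced by each tracked Cmd):

    update : Msg -> Model -> ( Model, Cmd Msg )
    update msg model =
        case msg of
            GotResponse _ ->
                let
                    ( newPool, cmd ) =
                        -- `[]`: no new requests; just free a worker and
                        -- drain the backlog (or dispatch doneCmd if empty)
                        CmdWorkerPool.update model.cmdWorkerPool []
                in
                ( { model | cmdWorkerPool = newPool }, cmd )

-}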