@jerstlouis
Last active Apr 24, 2020
/*
This is an example request for an OGC API combining and chaining different modular blocks.
It would render a map combining 3 layers: satellite imagery, soil characteristics and elevation contours.
It uses the following OGC API modular blocks:
- Processes (with 'Maps' as a specialized process),
- Coverages (a 'data access' modular block, i.e. delivering a specific type of data layer aka 'collection')
- Features (a 'data access' modular block, i.e. delivering a specific type of data layer aka 'collection', including STAC as a specialized Features API)
- Styles
- Tiles (optionally)
This JSON along with BBOX, WIDTH and HEIGHT parameters can be POSTed to an OGC API implementing the "Map" process at the end of the daisy chain
It can also be used together with a Tiles API (or alternatively potentially the DGGS API) to provide an efficient mechanism to cache the results on either or both client & server.
Tiles (or DGGS, or subsetting, or other data partitioning/delivery modular blocks) can also be used along the daisy chain, between the final Map process and
the intermediate processes & data delivery APIs, as independently negotiated between the different hops of the daisy chain (invisible to the user at the end of the chain).
This could potentially follow the client's tile end-requests, or be driven by the hops themselves determining that to be the most efficient way to perform the requests,
based on declared API conformance classes.
This partitioning or chunking of the data also opens up distributed computing possibilities, allowing the whole process to complete synchronously in a short time.
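As a minimal sketch of that partitioning idea (a hypothetical helper, not part of any OGC API specification), a hop could split a requested bounding box into tile-sized sub-requests that can be fetched or computed in parallel and cached independently:

```python
def partition_bbox(bbox, cols, rows):
    """Split a (minx, miny, maxx, maxy) bounding box into cols x rows sub-bboxes.

    Returned tiles are ordered row by row, bottom-left first.
    """
    minx, miny, maxx, maxy = bbox
    dx = (maxx - minx) / cols
    dy = (maxy - miny) / rows
    return [
        (minx + i * dx, miny + j * dy, minx + (i + 1) * dx, miny + (j + 1) * dy)
        for j in range(rows)
        for i in range(cols)
    ]

# Split a 2x2 degree area into four 1x1 degree sub-requests
tiles = partition_bbox((170.0, -44.0, 172.0, -42.0), 2, 2)
```

Each sub-bbox could then be dispatched as an independent request to the next hop, with the results re-assembled (or cached tile by tile) by the requesting hop.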
The nested structure of this JSON document is based on the following simple types:
class URLOrID : String; // A text string identifying either a resource of a particular type on the local server (if a simple identifier), or a URL to a remote server
class Process : URLOrID; // A process resource
class Collection : URLOrID; // A collection resource
// (e.g. a Feature Collection, a Coverage (including ready-to display imagery or rendered map as a specialized kind of gridded coverage
// whose cells are pixels)
class AbstractCollection
{
String id; // An id used to name the individual inputs, e.g. useful as layer IDs within a style sheet
Map<String, any_object> parameters; // The list of parameters to supply for retrieving this collection (e.g. query parameters after the ?)
// These are parameters for either a collection end-point (e.g. Feature or Coverage) or for a process
// If a data partitioning/delivery API is additionally combined (e.g. tiles) (based on negotiation by the daisy chain)
// these parameters should still apply
Map<String, any_object> inputParameters; // The parameters specific to this one collection, which will be used by the process using this collection as an INPUT.
// If this collection is a process, these parameters will be used with the OUTPUT of that process.
}
class CollectionRetrieval : AbstractCollection
{
Collection collection; // The collection resource
}
class ProcessInvokation : AbstractCollection
{
Process process; // The process resource
Array<AbstractCollection> inputs; // The collections to use as inputs (which could themselves be ProcessInvokation to form a daisy chain)
}
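The pseudocode classes above could be mirrored, for illustration only, as Python dataclasses (this sketch keeps the document's names, including the ProcessInvokation spelling, and serializes to the same JSON shape; it is an assumption, not part of any specification):

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class AbstractCollection:
    # An id naming this input, e.g. usable as a layer ID within a style sheet
    id: Optional[str] = None
    # Parameters supplied when retrieving this collection (query parameters)
    parameters: Dict[str, Any] = field(default_factory=dict)
    # Parameters applied by the consuming process to this input's output
    inputParameters: Dict[str, Any] = field(default_factory=dict)

@dataclass
class CollectionRetrieval(AbstractCollection):
    collection: str = ""  # URLOrID of the collection resource

@dataclass
class ProcessInvokation(AbstractCollection):
    process: str = ""  # URLOrID of the process resource
    # Inputs may themselves be ProcessInvokation objects, forming the daisy chain
    inputs: List[AbstractCollection] = field(default_factory=list)

# One link of the chain from the example below: contours generated from a DEM
chain = ProcessInvokation(
    process="ElevationContours",
    parameters={"distance": "20 * metersPerPixel"},
    inputs=[CollectionRetrieval(collection="SRTM_ViewFinderPanorama")],
)
```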
It also attempts to encode Cascading Map Style Sheets (http://docs.opengeospatial.org/per/18-025.pdf -- Appendix C) as a JSON object to define
portrayal rules directly within this document, as an alternative to referencing an external style sheet
or encoding a style sheet in a single text field.
The processes involved are the following:
- Map (local) -- The map rendering process end-point that this request is being POSTed to
- Map (remote) -- A remote Map process from Landcare Research NZ
- STACToCoverage -- A process taking as input the items returned from a STAC feature collection, accessing the data from where the imagery is hosted
(e.g. an S3 bucket), and providing this data as a Coverage API collection
- Sentinel2BandsToARGB -- A process creating an ARGB image out of a Sentinel2 Coverage (e.g. selecting bands, panchromatic sharpening, atmospheric correction)
The output is a specialized pixel-based gridded coverage, for which a simple WM(T)S-like API might be sufficient,
i.e. it can simply return a PNG image based on a BBOX or tiles for example.
- ElevationContours -- A process generating elevation contours from a Digital Elevation Model, at a specific distance interval
Some ways this request can be used include:
- Include additional BBOX, WIDTH and HEIGHT parameters directly in this document (or as additional query parameters for the GET), and get a PNG map back
- POST this document to a /collections/ end-point to create a new virtual collection, to which you could then also append a /tiles/ to retrieve it as tiles
- POST this document to a /map/tiles/ end-point and get back a templated URL for retrieving this map rendered as tiles
If the user wants to perform client-side rendering, data layers of Coverage/Imagery or Features can be requested in the same manner (skipping the Map rendering process).
More complex or resource-intensive workflows are also possible, and this system should support them as well.
An additional "callback" field on the top-level ProcessInvokation might be useful for asynchronous support.
Similarly, estimate and billing (including chainable calculation support) can be implemented as additional fields to this request (e.g. "estimateOnly" : true).
Even though the processes can (optionally?) all be linked/described at /processes/ , the actual end-point for any process could be anywhere.
(Including on a separate server, where /processes/ would act as a catalog of processes, just like /collections/ could act as a catalog of collections found elsewhere,
and both could provide a search mechanism for finding useful processes & collections from all over the world.)
The processes can be discoverable at /processes .
A single process resource can be by itself a very versatile tool, effectively making it possible to upload code, virtual machines, containers, or algorithms as parameters, e.g.:
- WCPS Runner
- ADES Runner
- R runner
- Python runner
Alternatively one could also POST to /processes to create a new process pre-baking some parameters and/or inputs.
The top level object is a ProcessInvokation:
*/
{
"parameters" : { "background" : "gray" },
"inputs" : [
{
"id" : "imagery",
"process" : "Sentinel2BandsToARGB",
"inputs" :
[
{
"process" : "STACToCoverage",
"inputs" : [
{
"collection" : "https://fn8d3qzbhk.execute-api.us-west-2.amazonaws.com/omega/collections/sentinel-s2-l2a",
// TODO: Make this an actual STAC filter retrieving the latest well-lit summer time scenes with less than 50% cloud cover
"parameters" : { "filter" : "cloud_cover < 50 AND month >= 'May' AND month <= 'September' AND sun_elevation > 60 SORT BY DESCENDING Date UNIQUE(path, row)" }
}
]
}
]
},
// NOTE: As an example, this is presented explicitly as a process referencing a vector features collection and a style from a Styles API,
// but that map output might also be available at a simple collection URL end-point, e.g. as
// { "id" : "soils", "collection" : "https://smap.landcareresearch.co.nz/ogcapi/collections/soils/map/default", "inputParameters" : { "style", ...
{
"id" : "soils",
"process" : "https://smap.landcareresearch.co.nz/ogcapi/map",
// Assume Landcare Research provides a default style sheet in SLD/SE
"parameters" : { "style" : "https://smap.landcareresearch.co.nz/ogcapi/collections/soils/styles/default.sld" },
"inputs" : [ { "collection" : "https://smap.landcareresearch.co.nz/ogcapi/collections/soils" } ],
// Make the whole layer 50% opacity to overlay on our satellite imagery
"inputParameters" : { "style" : { "cmss" : { "rules" : [ { "opacity" : 0.5 } ] } } }
},
{
"id" : "contours",
"process" : "ElevationContours",
// We want contours about 20 pixels apart, so assuming the contour generator understands 'metersPerPixel' based on multi-resolution tiles zoom levels,
// or scale determined from bounding box +resolution
"parameters" : { "distance" : "20 * metersPerPixel" },
"inputs" : [ { "collection" : "SRTM_ViewFinderPanorama" } ],
"inputParameters" : {
"style" :
{
// In this example we want to specify styling directly in this document in JSON, so we use a JSON encoding of CMSS instead of an external style sheet
"cmss" :
{
"rules" : [
{
// Render default contours in a pinkish color, 1 pixel wide
"stroke" : { "color" : "0xAAFFCC", "width" : 1 },
// Show elevation labels in bright green Arial 14 pt, with a 1.5 pixels black outline/halo
"label" : { "elements" : [
{
"type" : "Text",
"text" : "elevation",
"font" : { "color" : "lime", "face" : "Arial", "size" : 14, "outline" : { "size" : 1.5, "color" : "black" } }
}
] }
},
{
// Render contours whose elevation is divisible by 100 in white, 2 pixels wide
"selectors" : "[distance % 100 == 0]",
"stroke" : { "color" : "white", "width" : 2 },
// Show elevation labels for these major contours in black Arial 12 pt, with a 2 pixels white outline/halo
"label" : { "elements" : [
{
"type" : "Text",
"text" : "elevation",
"font" : { "color" : "black", "face" : "Arial", "size" : 12, "outline" : { "size" : 2, "color" : "white" } }
}
] }
}
] }
}
}
}
]
}
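To illustrate the first usage listed above (embedding BBOX, WIDTH and HEIGHT directly in the document and POSTing it for a PNG map), here is a hedged client-side sketch; the helper, the parameter names, and the end-point URL are assumptions for illustration, not defined by any OGC API specification:

```python
import json

def build_map_request(workflow: dict, bbox, width, height) -> dict:
    """Return a copy of the workflow document with rendering parameters
    (bbox, width, height) merged into its top-level 'parameters' object."""
    doc = dict(workflow)  # shallow copy of the top-level ProcessInvokation
    params = dict(doc.get("parameters", {}))
    params.update({"bbox": list(bbox), "width": width, "height": height})
    doc["parameters"] = params
    return doc

# Abbreviated stand-in for the full document above
workflow = {"parameters": {"background": "gray"}, "inputs": []}
payload = build_map_request(workflow, (170.0, -44.0, 172.0, -42.0), 1024, 768)
body = json.dumps(payload).encode()

# The POST itself (hypothetical end-point) would then return the rendered PNG:
# import urllib.request
# req = urllib.request.Request("https://example.com/ogcapi/processes/map",
#                              data=body,
#                              headers={"Content-Type": "application/json"})
# png = urllib.request.urlopen(req).read()
```

The same payload, minus the rendering parameters, could instead be POSTed to a /collections/ or /map/tiles/ end-point as described above.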