@danielballan
Created January 22, 2015 22:14
{
"metadata": {
"name": "",
"signature": "sha256:b5fa6a9652c9f2a20559b011d288c2b0f0162a0a1320d9c8745aa5625590243c"
},
"nbformat": 3,
"nbformat_minor": 0,
"worksheets": [
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Simple Data Broker Demo\n",
"\n",
"Here, we demonstrate that a simple implementation of the data broker can fetch and combine data from all three sources. It has only one interface, a `search` function.\n",
"\n",
"Note the example in the docstring below, illustrating the format of the returned data."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"from databroker import broker"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 8
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"help(broker.search)"
],
"language": "python",
"metadata": {},
"outputs": [
{
"output_type": "stream",
"stream": "stdout",
"text": [
"Help on function search in module databroker.broker.simple_broker:\n",
"\n",
"search(beamline_id, start_time, end_time, ca_host, channels=None)\n",
" Get data from all events from a given beamline between two times.\n",
" \n",
" Parameters\n",
" ----------\n",
" beamline_id : string\n",
" e.g., 'srx'\n",
" start_time : string or datetime object\n",
" e.g., datetime.datetime(2015, 1, 1) or '2015-01-01' (ISO format)\n",
" end_time : string or datetime object\n",
" e.g., datetime.datetime(2015, 1, 1) or '2015-01-01' (ISO format)\n",
" ca_host : URL string\n",
" the URL of your archiver's ArchiveDataServer.cgi. For example,\n",
" 'http://cr01arc01/cgi-bin/ArchiveDataServer.cgi'\n",
" channels : list, optional\n",
" All queries will return applicable data from the N most popular\n",
" channels. If data from additional channels is needed, their full\n",
" identifiers (not human-readable names) must be given here as a list\n",
" of strings.\n",
" \n",
" Returns\n",
" -------\n",
" data : list\n",
" See example below illustrating the format of the returned dataset.\n",
" \n",
" Example\n",
" -------\n",
" >>> search('srx', '2015-01-01', '2015-01-02', ca_host)\n",
" [(<unix epoch time>, {'chan1': <value>, 'chan2': <value>}),\n",
" (<unix epoch time>, {'temp': <value>})]\n",
" \n",
" That is, it returns a list of tuples, where each tuple contains a time and\n",
" a dictionary of name/value pairs. Every value is guaranteed to be either a\n",
" scalar Python primitive (int, float, string) or a numpy ndarray.\n",
"\n"
]
}
],
"prompt_number": 9
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Sidebar: Dummy Sources\n",
"\n",
"The databroker repo also contains a `sources` module that imitates the APIs of the three sources. It will be useful for testing and development, and it serves as a stopgap until the new MDS format is fully specified and implemented.\n",
"\n",
"You can switch between using \"live\" and \"dummy\" versions of any source at runtime."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"from databroker import sources\n",
"\n",
"# Switch to dummy versions of all sources. The API is unchanged.\n",
"sources.switch(metadatastore=False, filestore=False, channelarchiver=False)"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 10
},
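{
"cell_type": "markdown",
"metadata": {},
"source": [
"The same call can switch any source back to its \"live\" version, and any subset can be toggled independently. The cell below is an illustrative sketch, not part of the original demo; it assumes the live services are reachable, so it is not executed here."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"# Illustrative only: restore the live version of each source.\n",
"# Any subset of the keyword arguments may be passed independently.\n",
"sources.switch(metadatastore=True, filestore=True, channelarchiver=True)"
],
"language": "python",
"metadata": {},
"outputs": []
},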
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Currently, the dummy Metadata Store and File Store just return boilerplate data for any query. To put simulated data into the dummy Channel Archiver, we sneak it in where we would normally specify the host URL."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"from databroker.tests.test_databroker import generate_ca_data\n",
"\n",
"start, end = '2015-01-01 00:00:00', '2015-01-01 00:01:00'\n",
"simulated_ca_data = generate_ca_data(['SR11BCM01:LIFETIME_MONITOR', 'SR11BCM01:CURRENT_MONITOR'], start, end)"
],
"language": "python",
"metadata": {},
"outputs": [],
"prompt_number": 16
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## A Sample Query\n",
"\n",
"A basic query returns all the applicable data from the MDS and FS along with data from the most commonly used CA channels."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"broker.search('srx', start, end, ca_host=str(simulated_ca_data))"
],
"language": "python",
"metadata": {},
"outputs": [
{
"metadata": {},
"output_type": "pyout",
"prompt_number": 14,
"text": [
"[(datetime.datetime(2014, 1, 1, 1, 2, 3),\n",
" {'picture': array([[ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.]]), 'temp': 273}),\n",
" (datetime.datetime(2014, 1, 1, 1, 2, 3),\n",
" {'picture': array([[ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.]]), 'temp': 273}),\n",
" (datetime.datetime(2014, 1, 1, 1, 2, 3),\n",
" {'picture': array([[ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.],\n",
" [ 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.]]), 'temp': 273}),\n",
" (datetime.datetime(2015, 1, 1, 0, 0), {'SR11BCM01:LIFETIME_MONITOR': 0}),\n",
" (datetime.datetime(2015, 1, 1, 0, 1), {'SR11BCM01:LIFETIME_MONITOR': 1}),\n",
" (datetime.datetime(2015, 1, 1, 0, 0), {'SR11BCM01:CURRENT_MONITOR': 0}),\n",
" (datetime.datetime(2015, 1, 1, 0, 1), {'SR11BCM01:CURRENT_MONITOR': 1})]"
]
}
],
"prompt_number": 14
},
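{
"cell_type": "markdown",
"metadata": {},
"source": [
"Per the docstring above, the optional `channels` argument accepts a list of full channel identifiers. The sketch below (not part of the original demo) reuses the simulated channel IDs defined earlier and shows how the returned `(time, {name: value})` tuples unpack."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [
"# A sketch: request one channel by its full identifier.\n",
"results = broker.search('srx', start, end, ca_host=str(simulated_ca_data),\n",
"                        channels=['SR11BCM01:LIFETIME_MONITOR'])\n",
"\n",
"# Each entry is a (time, {name: value}) tuple, so it unpacks directly.\n",
"channel_names = set()\n",
"for time, values in results:\n",
"    channel_names.update(values)"
],
"language": "python",
"metadata": {},
"outputs": []
},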
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## What about the muggler?\n",
"\n",
"All communication with the three data sources takes place through the broker. (Fuller implementations of the broker will manage various optimizations in data retrieval. All of that must be isolated from higher layers.) The muggler can make requests for additional data, such as adding a source from the Channel Archiver, but it will do so through the broker."
]
},
{
"cell_type": "code",
"collapsed": false,
"input": [],
"language": "python",
"metadata": {},
"outputs": []
}
],
"metadata": {}
}
]
}