{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Explore ERDDAP timeseries data using Jupyter Widgets\n",
"Inspired by [Jason Grout's excellent ESIP Tech Dive talk on \"Jupyter Widgets\"](https://youtu.be/CVcrTRQkTxo?t=2596), this notebook uses the `ipyleaflet` and `bqplot` widgets\n",
"to interactively explore the last two weeks of time series data from an ERDDAP Server. Select a `standard_name` from the list, then click a station to see the time series. "
]
},
{
"cell_type": "markdown",
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-06T22:00:31.001042Z",
"start_time": "2017-12-06T22:00:30.979029Z"
}
},
"source": [
"NOTE: To access a protected ERDDAP endpoint is protected, you can add a `~/.netrc` file like this:\n",
"```\n",
"machine cgoms.coas.oregonstate.edu\n",
"login <username>\n",
"password <password>\n",
"```"
]
},
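{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, you can confirm that the credentials in `~/.netrc` are readable from Python with the standard library `netrc` module; the hostname below is just the example from the note above."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional sanity check: can Python read credentials for the protected host\n",
"# from ~/.netrc? The hostname is the example from the note above.\n",
"import netrc\n",
"\n",
"try:\n",
"    auth = netrc.netrc().authenticators('cgoms.coas.oregonstate.edu')\n",
"    print('found .netrc entry' if auth else 'no .netrc entry for this host')\n",
"except FileNotFoundError:\n",
"    print('no ~/.netrc file found')"
]
},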
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:37.291663Z",
"start_time": "2017-12-12T22:59:34.813470Z"
}
},
"outputs": [],
"source": [
"import numpy as np\n",
"import pandas as pd"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:37.636385Z",
"start_time": "2017-12-12T22:59:37.299680Z"
}
},
"outputs": [],
"source": [
"import pendulum"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"`ipyleaflet` and `bqplot` are both Jupyter widgets, so can interact with Python like any other widget. Since we want to click on a map in a notebook and get an interactive time series plot, they are perfect tools to use here. "
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:37.755635Z",
"start_time": "2017-12-12T22:59:37.643400Z"
}
},
"outputs": [],
"source": [
"import ipyleaflet as ipyl\n",
"import bqplot as bq\n",
"import ipywidgets as ipyw"
]
},
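{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a minimal sketch of the widget event pattern used throughout this notebook, an `ipywidgets` control can call a Python handler whenever its value changes. The `demo` widget and `demo_handler` names here are just for illustration; nothing below depends on them."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Minimal sketch: a Python handler runs whenever the widget's value changes.\n",
"# The same observe/on_click pattern drives the map and plot updates below.\n",
"demo = ipyw.Dropdown(options=['a', 'b', 'c'], value='a')\n",
"\n",
"def demo_handler(change):\n",
"    # 'change' describes the trait change; react only to value changes\n",
"    if change['name'] == 'value':\n",
"        print('new value:', change['new'])\n",
"\n",
"demo.observe(demo_handler)\n",
"# demo  # uncomment to display the dropdown"
]
},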
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make working with ERDDAP simpler, we use `erddapy`, a high-level python interface to ERDDAP's RESTful API"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:37.883904Z",
"start_time": "2017-12-12T22:59:37.762650Z"
}
},
"outputs": [],
"source": [
"from erddapy import ERDDAP\n",
"from erddapy.utilities import urlopen"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This code should work with minor modifications on any ERDDAP (v1.64+) endpoint that has `cdm_data_type=timeseries` or `cdm_data_type=point` datasets. Change the values for other ERDDAP endpoints or regions of interest"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:37.915971Z",
"start_time": "2017-12-12T22:59:37.890919Z"
}
},
"outputs": [],
"source": [
"endpoint = 'http://erddap.sensors.ioos.us/erddap'\n",
"\n",
"initial_standard_name = 'sea_surface_wave_significant_height'\n",
"\n",
"nchar = 9 # number of characters for short dataset name\n",
"cdm_data_type = 'TimeSeries'\n",
"center = [35, -100]\n",
"zoom = 3\n",
"\n",
"min_time = pendulum.parse('2017-11-01T00:00:00Z')\n",
"max_time = pendulum.parse('2017-11-11T00:00:00Z')"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:37.986118Z",
"start_time": "2017-12-12T22:59:37.959062Z"
}
},
"outputs": [],
"source": [
"endpoint = 'https://gamone.whoi.edu/erddap'\n",
"\n",
"initial_standard_name = 'sea_water_temperature'\n",
"\n",
"nchar = 9 # number of characters for short dataset name\n",
"cdm_data_type = 'TimeSeries'\n",
"center = [35, -100]\n",
"zoom = 3\n",
"\n",
"min_time = pendulum.parse('2011-05-05T00:00:00Z')\n",
"max_time = pendulum.parse('2011-05-15T00:00:00Z')"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.023196Z",
"start_time": "2017-12-12T22:59:37.996139Z"
}
},
"outputs": [],
"source": [
"endpoint = 'https://erddap-uncabled.oceanobservatories.org/uncabled/erddap'\n",
"\n",
"initial_standard_name = 'sea_water_temperature'\n",
"\n",
"nchar = 8 # number of characters for short dataset name\n",
"cdm_data_type = 'Point'\n",
"center = [35, -100]\n",
"zoom = 1\n",
"\n",
"min_time = pendulum.parse('2017-08-01T00:00:00Z')\n",
"max_time = pendulum.parse('2017-08-03T00:00:00Z')"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.059272Z",
"start_time": "2017-12-12T22:59:38.034219Z"
}
},
"outputs": [],
"source": [
"endpoint = 'https://cgoms.coas.oregonstate.edu/erddap'\n",
"\n",
"initial_standard_name = 'air_temperature'\n",
"\n",
"nchar = 8 # number of characters for short dataset name\n",
"cdm_data_type = 'TimeSeries'\n",
"center = [44, -124]\n",
"zoom = 6\n",
"\n",
"now = pendulum.now(tz='utc')\n",
"max_time = now\n",
"min_time = now.subtract(days=3)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.098354Z",
"start_time": "2017-12-12T22:59:38.070295Z"
}
},
"outputs": [],
"source": [
"endpoint = 'http://ooivm1.whoi.net/erddap'\n",
"\n",
"initial_standard_name = 'solar_panel_1_voltage'\n",
"\n",
"nchar = 8 # number of characters for short dataset name\n",
"cdm_data_type = 'TimeSeries'\n",
"center = [41.0, -70.]\n",
"zoom = 7\n",
"\n",
"now = pendulum.now(tz='utc')\n",
"max_time = now\n",
"min_time = now.subtract(days=3)"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:37.951045Z",
"start_time": "2017-12-12T22:59:37.925992Z"
}
},
"outputs": [],
"source": [
"server = 'http://www.neracoos.org/erddap'\n",
"\n",
"standard_name = 'significant_height_of_wind_and_swell_waves'\n",
"#standard_name = 'sea_water_temperature'\n",
"\n",
"nchar = 3 # number of characters for short dataset name\n",
"cdm_data_type = 'TimeSeries'\n",
"center = [42.5, -68]\n",
"zoom = 6\n",
"\n",
"now = pendulum.now(tz='utc')\n",
"search_max_time = now\n",
"search_min_time = now.subtract(weeks=2)"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.187540Z",
"start_time": "2017-12-12T22:59:38.110379Z"
}
},
"outputs": [],
"source": [
"e = ERDDAP(server=server, protocol='tabledap')"
]
},
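{
"cell_type": "markdown",
"metadata": {},
"source": [
"`erddapy` methods like `get_search_url` only build URL strings; no request is made until the URL is opened. As a quick sketch (mirroring the `adv_search` call defined further below), the Advanced Search URL for the current settings can be previewed like this:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Preview the Advanced Search URL for the current settings.\n",
"# This only formats a URL string; nothing is downloaded here.\n",
"print(e.get_search_url(response='csv',\n",
"                       cdm_data_type=cdm_data_type.lower(),\n",
"                       standard_name=standard_name,\n",
"                       min_time=search_min_time,\n",
"                       max_time=search_max_time))"
]
},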
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Find all the `standard_name` attributes that exist on this ERDDAP endpoint, using [ERDDAP's \"categorize\" service](http://www.neracoos.org/erddap/categorize/index.html)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.272719Z",
"start_time": "2017-12-12T22:59:38.195557Z"
}
},
"outputs": [],
"source": [
"url='{}/categorize/standard_name/index.csv'.format(server)\n",
"df = pd.read_csv(urlopen(url), skiprows=[1, 2])\n",
"standard_names = df['Category'].values"
]
},
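{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, take a quick look at how many `standard_name` values the categorize service returned (the exact list varies by server):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Quick look at what the categorize service returned (output varies by server).\n",
"print(len(standard_names), 'standard_name values, e.g.:', standard_names[:5])"
]
},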
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Create a dropdown menu widget with all the `standard_name` values found"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.323826Z",
"start_time": "2017-12-12T22:59:38.279734Z"
}
},
"outputs": [],
"source": [
"widget_std_names = ipyw.Dropdown(options=standard_names, value=standard_name)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Create a text widget to enter the search minimum time"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [],
"source": [
"widget_search_min_time = ipyw.Text(\n",
" value=search_min_time.to_datetime_string(),\n",
" description='Search Min',\n",
" disabled=False\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.371927Z",
"start_time": "2017-12-12T22:59:38.330841Z"
}
},
"outputs": [],
"source": [
"widget_search_max_time = ipyw.Text(\n",
" value=search_max_time.to_datetime_string(),\n",
" description='Search Max',\n",
" disabled=False\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This function puts lon,lat and datasetID into a GeoJSON feature"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.469131Z",
"start_time": "2017-12-12T22:59:38.446082Z"
}
},
"outputs": [],
"source": [
"def point(dataset, lon, lat, nchar):\n",
" geojsonFeature = {\n",
" \"type\": \"Feature\",\n",
" \"properties\": {\n",
" \"datasetID\": dataset,\n",
" \"short_dataset_name\": dataset[:nchar]\n",
" },\n",
" \"geometry\": {\n",
" \"type\": \"Point\",\n",
" \"coordinates\": [lon, lat]\n",
" }\n",
" };\n",
" geojsonFeature['properties']['style'] = {'color': 'Grey'}\n",
" return geojsonFeature"
]
},
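{
"cell_type": "markdown",
"metadata": {},
"source": [
"For example (illustrative coordinates, not the station's actual position), `point` returns a GeoJSON Feature dict like this:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative call: made-up coordinates, real structure.\n",
"point('A01_accelerometer_all', -70.57, 42.52, 3)"
]
},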
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This function finds all the datasets with a given standard_name in the specified time period, and return GeoJSON"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.518233Z",
"start_time": "2017-12-12T22:59:38.476145Z"
}
},
"outputs": [],
"source": [
"def adv_search(e, standard_name, cdm_data_type, min_time, max_time):\n",
" try:\n",
" search_url = e.get_search_url(response='csv', cdm_data_type=cdm_data_type.lower(), items_per_page=100000,\n",
" standard_name=standard_name, min_time=min_time, max_time=max_time)\n",
" df = pd.read_csv(urlopen(search_url))\n",
" except:\n",
" df = []\n",
" if len(var)>14:\n",
" v = '{}...'.format(standard_name[:15])\n",
" else:\n",
" v = standard_name\n",
" figure.title = 'No {} found in this time range. Pick another variable.'.format(v)\n",
" figure.marks[0].y = 0.0 * figure.marks[0].y\n",
" return df"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This function returns the lon,lat vlaues from allDatasets"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.542284Z",
"start_time": "2017-12-12T22:59:38.526250Z"
}
},
"outputs": [],
"source": [
"def alllonlat(e, cdm_data_type, min_time, max_time):\n",
" url='{}/tabledap/allDatasets.csv?datasetID%2CminLongitude%2CminLatitude&cdm_data_type=%22{}%22&minTime%3C={}&maxTime%3E={}'.format(e.server,\n",
" cdm_data_type,max_time.to_datetime_string(),min_time.to_datetime_string())\n",
" df = pd.read_csv(urlopen(url), skiprows=[1])\n",
" return df"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.607420Z",
"start_time": "2017-12-12T22:59:38.548296Z"
}
},
"outputs": [],
"source": [
"def stdname2geojson(e, standard_name, cdm_data_type, search_min_time, search_max_time):\n",
" '''return geojson containing lon, lat and datasetID for all matching stations'''\n",
" # find matching datsets using Advanced Search\n",
" dfa = adv_search(e, standard_name, cdm_data_type, search_min_time, search_max_time)\n",
" if isinstance(dfa, pd.DataFrame):\n",
" datasets = dfa['Dataset ID'].values\n",
"\n",
" # find lon,lat values from allDatasets \n",
" dfll = alllonlat(e, cdm_data_type, search_min_time, search_max_time)\n",
" # extract lon,lat values of matching datasets from allDatasets dataframe\n",
" dfr = dfll[dfll['datasetID'].isin(dfa['Dataset ID'])]\n",
" # contruct the GeoJSON using fast itertuples\n",
" geojson = {'features':[point(row[1],row[2],row[3],3) for row in dfr.itertuples()]}\n",
" else:\n",
" geojson = {'features':[]}\n",
" datasets = []\n",
" return geojson, datasets"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `map_click_handler` function updates the time series plot when a station marker is clicked"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.639488Z",
"start_time": "2017-12-12T22:59:38.615437Z"
}
},
"outputs": [],
"source": [
"def map_click_handler(event=None, id=None, properties=None):\n",
" global dataset_id, standard_name\n",
" print('map clicked')\n",
" dataset_id = properties['datasetID']\n",
" # get standard_name from dropdown widget\n",
" standard_name = widget_std_names.value\n",
" print(dataset_id, standard_name, constraints)\n",
" widget_dsnames.value = dataset_id\n",
" update_timeseries_plot(dataset=dataset_id, standard_name=standard_name, constraints=constraints)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `search_button_handler` function updates the map when the `Search` button is selected "
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [],
"source": [
"def widget_replot_button_handler(change):\n",
" global dataset_id, constraints\n",
" plot_start_time = pendulum.parse(widget_plot_start_time.value)\n",
" plot_stop_time = pendulum.parse(widget_plot_stop_time.value)\n",
"\n",
" constraints = {\n",
" 'time>=': plot_start_time,\n",
" 'time<=': plot_stop_time\n",
" }\n",
" dataset_id = widget_dsnames.value\n",
" update_timeseries_plot(dataset=dataset_id, standard_name=standard_name, constraints=constraints)"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.686586Z",
"start_time": "2017-12-12T22:59:38.646502Z"
}
},
"outputs": [],
"source": [
"def widget_search_button_handler(change):\n",
" global features, datasets, standard_name, dataset_id, constraints\n",
" search_min_time = pendulum.parse(widget_search_min_time.value)\n",
" search_max_time = pendulum.parse(widget_search_max_time.value)\n",
"\n",
" # get standard_name from dropdown widget\n",
" standard_name = widget_std_names.value\n",
"\n",
" # get updated datsets and map features\n",
" features, datasets = stdname2geojson(e, standard_name, cdm_data_type, search_min_time, search_max_time)\n",
" # update map\n",
" feature_layer = ipyl.GeoJSON(data=features)\n",
" feature_layer.on_click(map_click_handler)\n",
" map.layers = [map.layers[0], feature_layer]\n",
" \n",
" # widget_plot_start_time.value = widget_search_min_time.value\n",
" # widget_plot_stop_time.value = widget_search_max_time.value\n",
"\n",
" # populate datasets widget with new info\n",
" dataset_id = datasets[0]\n",
" widget_dsnames.options = datasets\n",
" widget_dsnames.value = dataset_id\n",
" \n",
" constraints = {\n",
" 'time>=': search_min_time,\n",
" 'time<=': search_max_time\n",
" }\n",
" print(dataset_id, standard_name, constraints)\n",
" update_timeseries_plot(dataset=dataset_id, standard_name=standard_name, constraints=constraints)"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {},
"outputs": [],
"source": [
"def update_timeseries_plot(dataset=None, standard_name=None, constraints=None, title_len=18):\n",
" df, var = get_data(dataset=dataset, standard_name=standard_name, constraints=constraints)\n",
" figure.marks[0].x = df.index\n",
" figure.marks[0].y = df[var]\n",
" figure.title = '{} - {}'.format(dataset[:title_len], var)"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.720658Z",
"start_time": "2017-12-12T22:59:38.692599Z"
}
},
"outputs": [],
"source": [
"widget_search_button = ipyw.Button(\n",
" value=False,\n",
" description='Update search',\n",
" disabled=False,\n",
" button_style='')"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [],
"source": [
"widget_replot_button = ipyw.Button(\n",
" value=False,\n",
" description='Update TimeSeries',\n",
" disabled=False,\n",
" button_style='')"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {},
"outputs": [],
"source": [
"widget_replot_button.on_click(widget_replot_button_handler)"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.736691Z",
"start_time": "2017-12-12T22:59:38.728674Z"
}
},
"outputs": [],
"source": [
"widget_search_button.on_click(widget_search_button_handler)"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [],
"source": [
"widget_plot_start_time = ipyw.Text(\n",
" value=search_min_time.to_datetime_string(),\n",
" description='Plot Min',\n",
" disabled=False\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {},
"outputs": [],
"source": [
"widget_plot_stop_time = ipyw.Text(\n",
" value=search_max_time.to_datetime_string(),\n",
" description='Plot Max',\n",
" disabled=False\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This function returns the specified dataset time series values as a Pandas dataframe"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.767756Z",
"start_time": "2017-12-12T22:59:38.743706Z"
}
},
"outputs": [],
"source": [
"def get_data(dataset=None, standard_name=None, constraints=None):\n",
" print(dataset_id, standard_name, constraints)\n",
" var = e.get_var_by_attr(dataset_id=dataset, \n",
" standard_name=lambda v: str(v).lower() == standard_name.lower())[0]\n",
" download_url = e.get_download_url(dataset_id=dataset, constraints=constraints,\n",
" variables=['time',var], response='csv')\n",
" df = pd.read_csv(urlopen(download_url), index_col='time', parse_dates=True, skiprows=[1])\n",
" return df, var"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This defines the initial `ipyleaflet` map "
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.963166Z",
"start_time": "2017-12-12T22:59:38.808842Z"
}
},
"outputs": [],
"source": [
"map = ipyl.Map(center=center, zoom=zoom, layout=dict(width='750px', height='350px'))\n",
"features, datasets = stdname2geojson(e, standard_name, cdm_data_type, search_min_time, search_max_time)\n",
"dataset_id = datasets[0]\n",
"feature_layer = ipyl.GeoJSON(data=features)\n",
"feature_layer.on_click(map_click_handler)\n",
"map.layers = [map.layers[0], feature_layer]"
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:38.800826Z",
"start_time": "2017-12-12T22:59:38.774771Z"
}
},
"outputs": [],
"source": [
"widget_dsnames = ipyw.Dropdown(options=datasets, value=dataset_id)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This defines the intitial `bqplot` time series plot"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:43.043717Z",
"start_time": "2017-12-12T22:59:38.968176Z"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"A01_accelerometer_all significant_height_of_wind_and_swell_waves {'time>=': DateTime(2018, 5, 6, 21, 35, 24, 725992, tzinfo=Timezone('UTC')), 'time<=': DateTime(2018, 5, 20, 21, 35, 24, 725992, tzinfo=Timezone('UTC'))}\n"
]
}
],
"source": [
"dt_x = bq.DateScale()\n",
"sc_y = bq.LinearScale()\n",
"\n",
"constraints = {\n",
" 'time>=': search_min_time,\n",
" 'time<=': search_max_time\n",
"}\n",
"\n",
"df, var = get_data(dataset=dataset_id, standard_name=standard_name, constraints=constraints)\n",
"def_tt = bq.Tooltip(fields=['y'], formats=['.2f'], labels=['value'])\n",
"time_series = bq.Lines(x=df.index, y=df[var], \n",
" scales={'x': dt_x, 'y': sc_y}, tooltip=def_tt)\n",
"ax_x = bq.Axis(scale=dt_x, label='Time')\n",
"ax_y = bq.Axis(scale=sc_y, orientation='vertical')\n",
"figure = bq.Figure(marks=[time_series], axes=[ax_x, ax_y])\n",
"figure.title = '{} - {}'.format(dataset_id[:18], var)\n",
"figure.layout.height = '300px'\n",
"figure.layout.width = '800px'"
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:43.093822Z",
"start_time": "2017-12-12T22:59:43.056744Z"
}
},
"outputs": [],
"source": [
"#Not currently using this (cell below setting \"observe\" to this function is commented out)\n",
"def widget_dsnames_handler(change):\n",
" dataset_id = widget_dsnames.value\n",
" constraints = {\n",
" 'time>=': search_min_time,\n",
" 'time<=': search_max_time\n",
" }\n",
" update_timeseries_plot(dataset=dataset_id, standard_name=standard_name, constraints=constraints)"
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:43.166975Z",
"start_time": "2017-12-12T22:59:43.104845Z"
}
},
"outputs": [],
"source": [
"#widget_dsnames.observe(widget_replot_button_handler)"
]
},
{
"cell_type": "code",
"execution_count": 36,
"metadata": {},
"outputs": [],
"source": [
"#all this widget does it take up 7 cm of vertical space \n",
"ispace = ipyw.HTML(\n",
" value='<style> .space {margin-bottom: 6.5cm;}</style><p class=\"space\"> </p>',\n",
" placeholder='',\n",
" description='',\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This specifies the widget layout"
]
},
{
"cell_type": "code",
"execution_count": 37,
"metadata": {
"ExecuteTime": {
"end_time": "2017-12-12T22:59:43.460591Z",
"start_time": "2017-12-12T22:59:43.181005Z"
},
"scrolled": false
},
"outputs": [
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "df29c89dd1e14b26849df221ea1c76bc",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"Box(children=(Box(children=(Map(basemap={'url': 'https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', 'max_zoo…"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"form_item_layout = ipyw.Layout(display='flex', flex_flow='column', justify_content='space-between')\n",
"\n",
"col1 = ipyw.Box([map, figure], layout=form_item_layout)\n",
"col2 = ipyw.Box([widget_std_names, widget_search_min_time, widget_search_max_time, widget_search_button,\n",
" ispace, widget_dsnames, widget_plot_start_time, widget_plot_stop_time, widget_replot_button], layout=form_item_layout)\n",
"\n",
"form_items = [col1, col2]\n",
"\n",
"form = ipyw.Box(form_items, layout=ipyw.Layout(display='flex', flex_flow='row', border='solid 2px',\n",
" align_items='flex-start', width='100%'))\n",
"\n",
"form"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.5"
}
},
"nbformat": 4,
"nbformat_minor": 2
}