@jhkennedy
Last active November 9, 2022 07:58
{
"cells": [
{
"cell_type": "markdown",
"id": "6153a241",
"metadata": {},
"source": [
"# 2018 Kīlauea volcano analysis with HyP3 and MintPy\n",
"\n",
"This notebook walks through a time-series analysis of the 2018 Kīlauea volcano eruption and collapse using On Demand InSAR products from the Alaska Satellite Facility and MintPy. We'll:\n",
"\n",
"1. Use the [ASF Search Python package](https://docs.asf.alaska.edu/asf_search/basics/) to:\n",
" - Search ASF's catalog for Sentinel-1 SAR products covering the [Kīlauea volcano](https://www.usgs.gov/volcanoes/kilauea)\n",
" - Select a reference scene to generate a baseline stack\n",
" - Select a [short baseline subset (SBAS)](https://docs.asf.alaska.edu/vertex/sbas/) of scene pairs for InSAR processing\n",
"\n",
"\n",
"2. Use the [HyP3 Python SDK](https://hyp3-docs.asf.alaska.edu/using/sdk/) to:\n",
" - Request On Demand InSAR products from ASF HyP3\n",
" - Download the InSAR products when they are done processing\n",
"\n",
"\n",
"3. Use [GDAL](https://gdal.org/api/index.html#python-api) and [MintPy](https://mintpy.readthedocs.io/en/latest/) to:\n",
" - Prepare the InSAR products for MintPy\n",
" - Perform a time-series analysis with MintPy\n",
" \n",
"---\n",
"\n",
"**Note:** This notebook assumes some familiarity with InSAR processing and MintPy, and is a minimal example without much context or explanation. If you're new to InSAR and MintPy, I suggest checking out:\n",
"* our [InSAR on Demand story map](https://storymaps.arcgis.com/stories/68a8a3253900411185ae9eb6bb5283d3)\n",
"\n",
"\n",
"* [OpenSARlab's](https://opensarlab-docs.asf.alaska.edu/) highly detailed walkthrough of using HyP3 + MintPy via these notebooks:\n",
" * [Prepare a HyP3 InSAR Stack for MintPy](https://nbviewer.org/github/ASFOpenSARlab/opensarlab-notebooks/blob/master/SAR_Training/English/Master/Prepare_HyP3_InSAR_Stack_for_MintPy.ipynb)\n",
" * [MintPy Time-series Analysis](https://nbviewer.org/github/ASFOpenSARlab/opensarlab-notebooks/blob/master/SAR_Training/English/Master/MintPy_Time_Series_From_Prepared_Data_Stack.ipynb)\n",
" \n",
" Note: While these notebooks assume you're working in OpenSARlab, you can run them\n",
" outside OpenSARlab by creating [this conda environment](https://github.com/ASFOpenSARlab/opensarlab-envs/blob/main/Environment_Configs/insar_analysis_env.yml)."
]
},
{
"cell_type": "markdown",
"id": "c6db1f5c",
"metadata": {},
"source": [
"## 0. Initial Setup\n",
"\n",
"To run this notebook, you'll need a conda environment with the required dependencies. You can set up a new environment (recommended) and start the Jupyter server like this:\n",
"```shell\n",
"conda create -n hyp3-mintpy python=3.8 asf_search hyp3_sdk \"mintpy>=1.3.2\" pandas jupyter ipympl\n",
"\n",
"conda activate hyp3-mintpy\n",
"jupyter notebook mintpy_timeseries_analysis_agu_2021.ipynb\n",
"```\n",
"Or, install these dependencies into your own environment:\n",
"```shell\n",
"conda install python=3.8 asf_search hyp3_sdk \"mintpy>=1.3.2\" pandas jupyter ipympl\n",
"\n",
"jupyter notebook mintpy_timeseries_analysis_agu_2021.ipynb\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5408efae",
"metadata": {},
"outputs": [],
"source": [
"from pathlib import Path\n",
"\n",
"from dateutil.parser import parse as parse_date"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9713aa55",
"metadata": {},
"outputs": [],
"source": [
"project_name = 'hawaii-sbas-agu'\n",
"\n",
"work_dir = Path.cwd() / project_name\n",
"data_dir = work_dir / 'data'\n",
"\n",
"stack_start = parse_date('2018-01-01')\n",
"stack_end = parse_date('2018-10-01')\n",
"max_temporal_baseline = 13 # days"
]
},
{
"cell_type": "markdown",
"id": "b38a3880",
"metadata": {},
"source": [
"## 1. Select InSAR pairs with ASF Search"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "461ec862",
"metadata": {},
"outputs": [],
"source": [
"import asf_search as asf\n",
"import pandas as pd\n",
"\n",
"search_results = asf.geo_search(\n",
" platform=asf.SENTINEL1,\n",
" intersectsWith='POINT(-155.287 19.421)',\n",
" start='2018-04-18',\n",
" end='2018-05-12',\n",
" processingLevel=asf.SLC,\n",
" beamMode=asf.IW,\n",
" flightDirection=asf.ASCENDING,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "cd8089dd",
"metadata": {},
"outputs": [],
"source": [
"baseline_results = asf.baseline_search.stack_from_product(search_results[-1])\n",
"\n",
"columns = list(baseline_results[0].properties.keys()) + ['geometry', ]\n",
"data = [list(scene.properties.values()) + [scene.geometry, ] for scene in baseline_results]\n",
"\n",
"stack = pd.DataFrame(data, columns=columns)\n",
"stack['startTime'] = stack.startTime.apply(parse_date)\n",
"\n",
"stack = stack.loc[(stack_start <= stack.startTime) & (stack.startTime <= stack_end)]"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9d8771be",
"metadata": {},
"outputs": [],
"source": [
"sbas_pairs = set()\n",
"for reference, rt in stack.loc[::-1, ['sceneName', 'temporalBaseline']].itertuples(index=False):\n",
" secondaries = stack.loc[\n",
" (stack.sceneName != reference)\n",
" & (stack.temporalBaseline - rt <= max_temporal_baseline)\n",
" & (stack.temporalBaseline - rt > 0)\n",
" ]\n",
" for secondary in secondaries.sceneName:\n",
" sbas_pairs.add((reference, secondary))"
]
},
{
"cell_type": "markdown",
"id": "bfc7657c",
"metadata": {},
"source": [
"## 2. Request On Demand InSAR products from ASF HyP3\n",
"\n",
"Use your [NASA Earthdata login](https://urs.earthdata.nasa.gov/) to connect to [ASF HyP3](https://hyp3-docs.asf.alaska.edu/)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "8492a2c8",
"metadata": {},
"outputs": [],
"source": [
"import hyp3_sdk as sdk\n",
"\n",
"hyp3 = sdk.HyP3(prompt=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "4c378369",
"metadata": {},
"outputs": [],
"source": [
"jobs = sdk.Batch()\n",
"for reference, secondary in sbas_pairs:\n",
" jobs += hyp3.submit_insar_job(reference, secondary, name=project_name,\n",
" include_dem=True, include_look_vectors=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "12c1177d",
"metadata": {},
"outputs": [],
"source": [
"jobs = hyp3.find_jobs(name=project_name)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "bd5436d3",
"metadata": {},
"outputs": [],
"source": [
"jobs = hyp3.watch(jobs)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "76e6a3a4",
"metadata": {},
"outputs": [],
"source": [
"insar_products = jobs.download_files(data_dir)\n",
"insar_products = [sdk.util.extract_zipped_product(ii) for ii in insar_products]"
]
},
{
"cell_type": "markdown",
"id": "dc767f0e",
"metadata": {},
"source": [
"## 3. Time-series Analysis with MintPy"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "087e7e0f",
"metadata": {},
"outputs": [],
"source": [
"from osgeo import gdal\n",
"\n",
"corners = [gdal.Info(str(dem), format='json')['cornerCoordinates'] for dem in data_dir.glob('*/*_dem.tif')]\n",
"ulx = max(corner['upperLeft'][0] for corner in corners)\n",
"uly = min(corner['upperLeft'][1] for corner in corners)\n",
"lrx = min(corner['lowerRight'][0] for corner in corners)\n",
"lry = max(corner['lowerRight'][1] for corner in corners)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b370fbbf",
"metadata": {},
"outputs": [],
"source": [
"files_for_mintpy = ['_water_mask.tif', '_corr.tif', '_unw_phase.tif', '_dem.tif', '_lv_theta.tif', '_lv_phi.tif']\n",
"for product_dir in insar_products:\n",
" for file_suffix in files_for_mintpy:\n",
" src_file = product_dir / f'{product_dir.name}{file_suffix}'\n",
" dst_file = product_dir / f'{src_file.stem}_clipped{src_file.suffix}'\n",
" gdal.Translate(destName=str(dst_file), srcDS=str(src_file), projWin=[ulx, uly, lrx, lry])"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "9704de1b",
"metadata": {},
"outputs": [],
"source": [
"mintpy_config = work_dir / 'mintpy_config.txt'\n",
"mintpy_config.write_text(\n",
"f\"\"\"\n",
"mintpy.load.processor = hyp3\n",
"##---------interferogram datasets:\n",
"mintpy.load.unwFile = {data_dir}/*/*_unw_phase_clipped.tif\n",
"mintpy.load.corFile = {data_dir}/*/*_corr_clipped.tif\n",
"##---------geometry datasets:\n",
"mintpy.load.demFile = {data_dir}/*/*_dem_clipped.tif\n",
"mintpy.load.incAngleFile = {data_dir}/*/*_lv_theta_clipped.tif\n",
"mintpy.load.azAngleFile = {data_dir}/*/*_lv_phi_clipped.tif\n",
"mintpy.load.waterMaskFile = {data_dir}/*/*_water_mask_clipped.tif\n",
"\"\"\")\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5562273b",
"metadata": {},
"outputs": [],
"source": [
"!smallbaselineApp.py --dir {work_dir} {mintpy_config}"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "26615db4",
"metadata": {},
"outputs": [],
"source": [
"%matplotlib widget\n",
"from mintpy import view, tsview"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "89517ea3",
"metadata": {},
"outputs": [],
"source": [
"view.main([f'{work_dir}/velocity.h5'])"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "2cde140a",
"metadata": {},
"outputs": [],
"source": [
"tsview.main([f'{work_dir}/timeseries.h5'])"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "57e6962c",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.12"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
@NKakar

NKakar commented Mar 23, 2022

Hello, thank you for the notebook. I get an error at the following step; do you have any idea what's wrong? Thanks!
TypeError: Indexing arrays must have integer dtypes - step - correct_troposphere

grab metadata from ref_file: /Test_Data/hawaii-sbas-agu/timeseries.h5
grab dataset structure from ref_file: /Test_Data/hawaii-sbas-agu/timeseries.h5
create HDF5 file: /Test_Data/hawaii-sbas-agu/timeseries_ERA5.h5 with w mode
create dataset : bperp of float32 in size of (41,) with compression = None
create dataset : date of |S8 in size of (41,) with compression = None
create dataset : timeseries of float32 in size of (41, 775, 3213) with compression = None
close HDF5 file: /Test_Data/hawaii-sbas-agu/timeseries_ERA5.h5

@jhkennedy
Author

Hi @NKakar! That's an error due to a change in the h5py dependency of MintPy as documented here:
insarlab/MintPy#741

It should be fixed in the next release, but you can pin h5py<3 to resolve it for now. Hope that helps!
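For example, the pin can be applied to the conda environment from the setup section (a minimal sketch; the environment name `hyp3-mintpy` is the one suggested above):

```shell
# Work around the h5py 3.x incompatibility tracked in insarlab/MintPy#741
# by pinning h5py below version 3 in the hyp3-mintpy environment.
conda install -n hyp3-mintpy "h5py<3"
```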

@srinty

srinty commented Sep 9, 2022

Hi, thank you so much for this notebook. I am getting an error for stack.loc[(stack_start <= stack.startTime) & (stack.startTime <= stack_end)]
Can you please tell me how to resolve this issue?

Error:
[SpyderKernelApp] ERROR | Error in message handler
Traceback (most recent call last):
File "/opt/anaconda3/lib/python3.8/site-packages/ipykernel/kernelbase.py", line 461, in dispatch_queue
await self.process_one()
File "/opt/anaconda3/lib/python3.8/site-packages/ipykernel/kernelbase.py", line 450, in process_one
await dispatch(*args)
TypeError: object NoneType can't be used in 'await' expression
Traceback (most recent call last):

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/arrays/datetimelike.py", line 536, in _validate_comparison_value
self._check_compatible_with(other)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/arrays/datetimes.py", line 508, in _check_compatible_with
self._assert_tzawareness_compat(other)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/arrays/datetimes.py", line 715, in _assert_tzawareness_compat
raise TypeError(

TypeError: Cannot compare tz-naive and tz-aware datetime-like objects

The above exception was the direct cause of the following exception:

Traceback (most recent call last):

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/arrays/datetimelike.py", line 1008, in _cmp_method
other = self._validate_comparison_value(other)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/arrays/datetimelike.py", line 539, in _validate_comparison_value
raise InvalidComparison(other) from err

InvalidComparison: 2018-01-01 00:00:00

During handling of the above exception, another exception occurred:

Traceback (most recent call last):

File "/var/folders/5q/5tntqnbn0yzc5gh280ncl2jm0000gn/T/ipykernel_975/1019817884.py", line 1, in
stack = stack.loc[(stack_start <= stack.startTime) & (stack.startTime <= stack_end)]

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/ops/common.py", line 70, in new_method
return method(self, other)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/arraylike.py", line 60, in ge
return self._cmp_method(other, operator.ge)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/series.py", line 5620, in _cmp_method
res_values = ops.comparison_op(lvalues, rvalues, op)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/ops/array_ops.py", line 269, in comparison_op
res_values = op(lvalues, rvalues)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/ops/common.py", line 70, in new_method
return method(self, other)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/arraylike.py", line 60, in ge
return self._cmp_method(other, operator.ge)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/arrays/datetimelike.py", line 1010, in _cmp_method
return invalid_comparison(self, other, op)

File "/opt/anaconda3/lib/python3.8/site-packages/pandas/core/ops/invalid.py", line 34, in invalid_comparison
raise TypeError(f"Invalid comparison between dtype={left.dtype} and {typ}")

TypeError: Invalid comparison between dtype=datetime64[ns, tzutc()] and datetime
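The final TypeError shows that the startTime column is tz-aware (UTC) while the stack_start/stack_end bounds from parse_date are tz-naive, so pandas refuses the comparison. A minimal sketch of one way to make the bounds comparable (the sample timestamps here are hypothetical, standing in for values returned by asf_search):

```python
import pandas as pd
from dateutil.parser import parse as parse_date
from dateutil.tz import tzutc

# A tz-aware startTime column, as produced when the catalog returns
# ISO 8601 timestamps with a 'Z' (UTC) suffix.
stack = pd.DataFrame(
    {'startTime': pd.to_datetime(['2018-02-01T00:00:00Z', '2018-11-01T00:00:00Z'])}
)

# Attaching an explicit UTC timezone to the bounds makes them tz-aware,
# so comparing them against the tz-aware column no longer raises TypeError.
stack_start = parse_date('2018-01-01').replace(tzinfo=tzutc())
stack_end = parse_date('2018-10-01').replace(tzinfo=tzutc())

subset = stack.loc[(stack_start <= stack.startTime) & (stack.startTime <= stack_end)]
print(len(subset))  # only the February scene falls inside the window
```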
