
Simeon H.K. Fitch (metasim)

View multiband-read-example.ipynb
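The notebook preview is unavailable, so here is a minimal sketch, not recovered from the notebook, of what a multiband read with pyrasterframes can look like. The file name, the band_indexes argument, and the 0.8-era helpers create_rf_spark_session and spark.read.raster are assumptions.

from pyrasterframes.utils import create_rf_spark_session

spark = create_rf_spark_session()

# Read three bands of a (hypothetical) multiband GeoTIFF; each requested band
# becomes its own tile column in the resulting DataFrame.
df = spark.read.raster('multiband.tif', band_indexes=[0, 1, 2])
df.printSchema()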
View rf-kmeans.ipynb
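Only the title of this notebook survives, so the following is a hedged sketch of k-means clustering over raster cells with Spark ML. The scene URI, the rf_explode_tiles helper, and the column names it yields are assumptions rather than the notebook's actual contents.

from pyrasterframes.utils import create_rf_spark_session
from pyrasterframes.rasterfunctions import rf_explode_tiles
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = create_rf_spark_session()

tiles = spark.read.raster('scene.tif')   # hypothetical single-band scene
# One row per cell, so each pixel value can feed the clusterer; drop NoData cells.
cells = tiles.select(rf_explode_tiles(tiles.proj_raster)).na.drop()
features = VectorAssembler(inputCols=['proj_raster'], outputCol='features').transform(cells)
model = KMeans(k=5, featuresCol='features').fit(features)
clustered = model.transform(features)   # adds a 'prediction' column with the cluster id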
View backtrace.scala
java.lang.AssertionError: assertion failed: unsafe symbol Logger (child of package scalalogging) in runtime reflection universe
  at scala.reflect.internal.Symbols$Symbol.<init>(Symbols.scala:184)
  at scala.reflect.internal.Symbols$TypeSymbol.<init>(Symbols.scala:3009)
  at scala.reflect.internal.Symbols$ClassSymbol.<init>(Symbols.scala:3201)
  at scala.reflect.internal.Symbols$StubClassSymbol.<init>(Symbols.scala:3496)
  at scala.reflect.internal.Symbols$Symbol.newStubSymbol(Symbols.scala:498)
  at scala.reflect.internal.pickling.UnPickler$Scan.readExtSymbol$1(UnPickler.scala:258)
  at scala.reflect.internal.pickling.UnPickler$Scan.readSymbol(UnPickler.scala:284)
  at scala.reflect.internal.pickling.UnPickler$Scan.readSymbolRef(UnPickler.scala:649)
  at scala.reflect.internal.pickling.UnPickler$Scan.readType(UnPickler.scala:417)
View LayerToGeoTIFF.scala
import geotrellis.raster.io.geotiff.SinglebandGeoTiff
import org.apache.spark.SparkFiles
import org.apache.spark.sql.SparkSession
import org.locationtech.rasterframes._
import org.locationtech.rasterframes.datasource.geotiff._

object LayerToGeoTiff {
  def main(args: Array[String]): Unit = {
    // The gist preview is cut off mid-expression; the builder settings and
    // getOrCreate() call below are assumptions added so the fragment compiles.
    implicit val spark = new SparkSession.Builder()
      .appName("LayerToGeoTiff")
      .master("local[*]")
      .getOrCreate()
  }
}
View zonal-stats.ipynb
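As with the other notebooks, only the title is rendered, so here is a small sketch of one way to compute zonal statistics once tiles have been joined to zone polygons. The joined DataFrame, its zone_id column, and the rf_tile_sum / rf_data_cells helpers are assumptions; the spatial join itself is not shown.

from pyspark.sql import functions as F
from pyrasterframes.rasterfunctions import rf_tile_sum, rf_data_cells

# `joined` is a hypothetical DataFrame pairing each raster tile with the id of
# the zone it intersects. Summing cell values and dividing by the count of
# valid (non-NoData) cells gives a per-zone mean weighted by cell, not by tile.
zonal_mean = (joined
              .groupBy('zone_id')
              .agg((F.sum(rf_tile_sum('tile')) / F.sum(rf_data_cells('tile')))
                   .alias('mean_value')))
zonal_mean.show()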
View foo.py
def set_dims(parts):
    # Note: `assert` must not be given a parenthesized (condition, message) tuple;
    # a non-empty tuple is always truthy, so such a check can never fail.
    assert len(parts) == 2, \
        "Expected dimensions specification to have exactly two components"
    assert all(isinstance(p, int) and p > 0 for p in parts), \
        "Expected all components in dimensions to be positive integers"
    options.update({
        "imageWidth": parts[0],
        "imageHeight": parts[1]
    })

if raster_dimensions is not None:
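The script is truncated at the if statement above. For context, a hedged usage sketch of set_dims; the options dict and the dimension values are stand-ins for values defined elsewhere in the original script.

options = {}          # assumed to be defined earlier in the enclosing scope
set_dims([256, 256])  # e.g. width and height parsed from a raster_dimensions option
# options is now {'imageWidth': 256, 'imageHeight': 256}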
metasim / read-model.py
Created Nov 11, 2018
Test files for loading a saved model using RasterFrames
View read-model.py
# Tested with Spark 2.2.1, Python 3.6.4
# Assumes `spark` is the SparkSession created by the pyspark / spark-submit driver.
from pyspark import SparkContext

# Even though we've used `--py-files` on the command line,
# unintuitively this still seems necessary to import pyrasterframes.
SparkContext.addPyFile(spark.sparkContext, 'pyrasterframes.zip')
from pyrasterframes import *
spark.withRasterFrames()
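The preview stops before the model is actually loaded. A minimal sketch of the likely next step, assuming the saved model is a standard Spark ML PipelineModel and using a hypothetical path and input DataFrame:

from pyspark.ml import PipelineModel

model = PipelineModel.load('saved-rasterframes-model')   # hypothetical path
# `rf` stands in for a RasterFrame prepared the same way as during training.
scored = model.transform(rf)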
View guardrail-bug.yaml
swagger: '2.0'
info:
  title: Asset Catalog
  version: 0.5.1
  description: >-
    This is a derivation of the <a href="https://raw.githubusercontent.com/radiantearth/stac-spec/master/api-spec/STAC-standalone-swagger2.yaml">STAC API Specification</a>.
    This is an OpenAPI definition of the core SpatioTemporal Asset Catalog API
    specification. Any service that implements this endpoint to allow search of
    spatiotemporal assets can be considered a STAC API. The endpoint is also
    available as an OpenAPI fragment that can be integrated with other OpenAPI
View gist:13b8dbfb9603ad7b292a10e08efb57a8
18/09/07 16:00:12 DEBUG RasterRef$: Fetching Extent(369450.0, 3353100.0, 371763.3, 3355458.3) from HttpGeoTiffRasterSource(https://s3-us-west-2.amazonaws.com/landsat-pds/c1/L8/149/039/LC08_L1TP_149039_20170411_20170415_01_T1/LC08_L1TP_149039_20170411_20170415_01_T1_B4.TIF)
ReadMonitor(reads=0, total=45876)
+-----------+------+---+
|spatial_key|bounds|src|
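For reference, a sketch of reading the same Landsat band URL shown in the log with a newer pyrasterframes API; create_rf_spark_session, spark.read.raster, and the proj_raster column name are assumptions about the 0.8-era Python API, not part of this gist.

from pyrasterframes.utils import create_rf_spark_session

spark = create_rf_spark_session()
uri = ('https://s3-us-west-2.amazonaws.com/landsat-pds/c1/L8/149/039/'
       'LC08_L1TP_149039_20170411_20170415_01_T1/'
       'LC08_L1TP_149039_20170411_20170415_01_T1_B4.TIF')
# Tiles are read lazily, so extents are only fetched when a tile is realized.
df = spark.read.raster(uri)
df.select('proj_raster').show(3)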
View 4215.csv
PANCHROMATIC_LINES,NADIR_OFFNADIR,sunAzimuth,REFLECTIVE_SAMPLES,upperLeftCornerLongitude,cloudCover,MAP_PROJECTION_L1,cartURL,sunElevation,path,BPF_NAME_TIRS,THERMAL_LINES,GROUND_CONTROL_POINTS_MODEL,row,imageQuality1,REFLECTIVE_LINES,ELLIPSOID,GEOMETRIC_RMSE_MODEL,browseURL,browseAvailable,dayOrNight,CPF_NAME,DATA_TYPE_L1,THERMAL_SAMPLES,upperRightCornerLatitude,lowerLeftCornerLatitude,sceneStartTime,dateUpdated,sensor,PANCHROMATIC_SAMPLES,GROUND_CONTROL_POINTS_VERSION,LANDSAT_PRODUCT_ID,acquisitionDate,upperRightCornerLongitude,PROCESSING_SOFTWARE_VERSION,GRID_CELL_SIZE_REFLECTIVE,lowerRightCornerLongitude,lowerRightCornerLatitude,sceneCenterLongitude,COLLECTION_CATEGORY,GRID_CELL_SIZE_PANCHROMATIC,BPF_NAME_OLI,sceneCenterLatitude,CLOUD_COVER_LAND,lowerLeftCornerLongitude,GEOMETRIC_RMSE_MODEL_X,GEOMETRIC_RMSE_MODEL_Y,sceneStopTime,upperLeftCornerLatitude,UTM_ZONE,DATE_L1_GENERATED,GRID_CELL_SIZE_THERMAL,DATUM,COLLECTION_NUMBER,sceneID,RLUT_FILE_NAME,TIRS_SSM_MODEL,ROLL_ANGLE,receivingStation
16421,NADIR,162
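The CSV is cut off after its header row and the start of the second row, but the header is enough to sketch loading the scene metadata with plain Spark SQL. The local file name is taken from the gist title; the selected columns come from the header above.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('landsat-scene-metadata').getOrCreate()

scenes = spark.read.csv('4215.csv', header=True, inferSchema=True)
scenes.select('sceneID', 'acquisitionDate', 'path', 'row', 'cloudCover', 'browseURL').show(5, truncate=False)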