Simeon H.K. Fitch (metasim)

metasim / read-model.py
Created November 11, 2018 19:59
Test files for loading a saved model using RasterFrames
# Tested with Spark 2.2.1, Python 3.6.4
# Even though we've used `--py-files` on the command line,
# unintuitively this still seems necessary to import pyrasterframes
spark.sparkContext.addPyFile('pyrasterframes.zip')
from pyrasterframes import *
spark.withRasterFrames()
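
With pyrasterframes registered, loading the saved model itself goes through standard Spark ML persistence. A minimal sketch, assuming the saved artifact is a Spark ML PipelineModel and using a hypothetical path:

from pyspark.ml import PipelineModel

# Hypothetical path to the previously saved pipeline model.
model_path = 'saved-rasterframes-model'

# Reload the persisted pipeline; pyrasterframes must already be registered
# (as above) before the model is applied to a RasterFrame.
model = PipelineModel.load(model_path)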
swagger: '2.0'
info:
  title: Asset Catalog
  version: 0.5.1
  description: >-
    This is a derivation of the <a href="https://raw.githubusercontent.com/radiantearth/stac-spec/master/api-spec/STAC-standalone-swagger2.yaml">STAC API Specification</a>.
    This is an OpenAPI definition of the core SpatioTemporal Asset Catalog API
    specification. Any service that implements this endpoint to allow search of
    spatiotemporal assets can be considered a STAC API. The endpoint is also
    available as an OpenAPI fragment that can be integrated with other OpenAPI
18/09/07 16:00:12 DEBUG RasterRef$: Fetching Extent(369450.0, 3353100.0, 371763.3, 3355458.3) from HttpGeoTiffRasterSource(https://s3-us-west-2.amazonaws.com/landsat-pds/c1/L8/149/039/LC08_L1TP_149039_20170411_20170415_01_T1/LC08_L1TP_149039_20170411_20170415_01_T1_B4.TIF)
ReadMonitor(reads=0, total=45876)
+-----------+------------------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|spatial_key|bounds |src
PANCHROMATIC_LINES,NADIR_OFFNADIR,sunAzimuth,REFLECTIVE_SAMPLES,upperLeftCornerLongitude,cloudCover,MAP_PROJECTION_L1,cartURL,sunElevation,path,BPF_NAME_TIRS,THERMAL_LINES,GROUND_CONTROL_POINTS_MODEL,row,imageQuality1,REFLECTIVE_LINES,ELLIPSOID,GEOMETRIC_RMSE_MODEL,browseURL,browseAvailable,dayOrNight,CPF_NAME,DATA_TYPE_L1,THERMAL_SAMPLES,upperRightCornerLatitude,lowerLeftCornerLatitude,sceneStartTime,dateUpdated,sensor,PANCHROMATIC_SAMPLES,GROUND_CONTROL_POINTS_VERSION,LANDSAT_PRODUCT_ID,acquisitionDate,upperRightCornerLongitude,PROCESSING_SOFTWARE_VERSION,GRID_CELL_SIZE_REFLECTIVE,lowerRightCornerLongitude,lowerRightCornerLatitude,sceneCenterLongitude,COLLECTION_CATEGORY,GRID_CELL_SIZE_PANCHROMATIC,BPF_NAME_OLI,sceneCenterLatitude,CLOUD_COVER_LAND,lowerLeftCornerLongitude,GEOMETRIC_RMSE_MODEL_X,GEOMETRIC_RMSE_MODEL_Y,sceneStopTime,upperLeftCornerLatitude,UTM_ZONE,DATE_L1_GENERATED,GRID_CELL_SIZE_THERMAL,DATUM,COLLECTION_NUMBER,sceneID,RLUT_FILE_NAME,TIRS_SSM_MODEL,ROLL_ANGLE,receivingStation
16421,NADIR,162
metasim / jmh-result.json
Created April 15, 2018 20:24
WKB getNumPoints Benchmark Result
[
  {
    "jmhVersion" : "1.20",
    "benchmark" : "org.locationtech.geomesa.spark.jts.WKBNumPointsBench.deserializeNumPoints",
    "mode" : "thrpt",
    "threads" : 1,
    "forks" : 1,
    "jvm" : "/Library/Java/JavaVirtualMachines/jdk1.8.0_161.jdk/Contents/Home/jre/bin/java",
    "jvmArgs" : [
    ],

Creating RasterFrames

Initialization

There are a couple of setup steps necessary any time you want to work with RasterFrames. The first is to import the API symbols into scope:
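
A minimal sketch of that setup in PySpark, assuming an existing SparkSession named `spark` (this mirrors the read-model.py snippet above):

# Bring the RasterFrames API symbols into scope.
from pyrasterframes import *

# Register the RasterFrames extensions with the running SparkSession.
spark.withRasterFrames()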

metasim / overload-on-context-bound.scala
Created September 11, 2017 15:00
Figure out how to do overloading via more specific context bounds.
import geotrellis.spark.{SpatialComponent, TemporalComponent}

object Test {
  class Foo {
    // ...
  }
  object Foo {
    implicit val spatialComponent: SpatialComponent[Foo] = ???
  }
/*
* Copyright 2017 Astraea, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
This file has been truncated.
{
  "paragraphs": [
    {
      "title": "Boilerplate Setup",
      "text": "// Java Imports\nimport java.sql.Timestamp\n\n// Spark Imports\nimport org.apache.spark.sql.gt._\nimport org.apache.spark.sql.gt.functions._\nimport org.apache.spark.ml._\nimport org.apache.spark.ml.classification._\nimport org.apache.spark.ml.evaluation._\nimport org.apache.spark.ml.feature._\nimport org.apache.spark.ml.tuning._\nimport org.apache.hadoop.fs.Path\n\n// GeoTrellis Imports\nimport geotrellis.raster._\nimport geotrellis.spark._\nimport geotrellis.util.Filesystem\nimport geotrellis.vector.io._\nimport geotrellis.vector._\nimport geotrellis.raster.histogram.Histogram\n\n// Astraea Imports\nimport astraea.model._\nimport astraea.spark.util._\nimport astraea.zeppelin._\nimport astraea.spark.ingest.AstraeaSparkContextMethods._\n\ngtRegister(sqlContext)",
      "user": "anonymous",
      "dateUpdated": "May 25, 2017 5:45:31 PM",
      "config": {
        "colWidth": 12.0,
        "enabled": true,