@ScottEvil
Created September 23, 2015 17:26
Assorted command-line examples for GeoWave
# To copy the GeoWave Accumulo jar to the proper location in HDFS
hadoop fs -copyFromLocal geowave-deploy-0.8.9-SNAPSHOT-accumulo-singlejar.jar hdfs://<FQDN>/accumulo/classpath/geowave/0.8.9-hdp2/
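# (Optional sanity check; assumes the 0.8.9-hdp2 directory may not exist yet, so create it first, then confirm the jar landed)
hadoop fs -mkdir -p hdfs://<FQDN>/accumulo/classpath/geowave/0.8.9-hdp2/
hadoop fs -ls hdfs://<FQDN>/accumulo/classpath/geowave/0.8.9-hdp2/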
# To configure Accumulo to grab the GeoWave jar from its location in HDFS (run from the Accumulo shell)
config -s general.vfs.context.classpath.geowave=hdfs://<FQDN>:8020/accumulo/classpath/geowave/0.8.9-hdp2/[^.].*.jar
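# (Optional follow-up, also from the Accumulo shell: 'config -f' just filters properties to confirm the setting took,
#  and 'table.classpath.context' points an individual table at the geowave context; <geowave table name> is a placeholder)
config -f general.vfs.context.classpath.geowave
config -t <geowave table name> -s table.classpath.context=geowave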
## Older/alternative forms of the ingest command (left commented out):
##geowave -localingest -b ./ingest/ne/ -i accumulo -n geowave.50m_admin_0_countries -f geotools-vector -u geowave -p hadoop -z <FQDN>:2181
##java -cp geowave-tools.jar -localingest -b /usr/local/geowave/ingest/ne/ -i accumulo -n geowave.50m_admin_0_countries -f geotools-vector -u geowave -p hadoop -z <FQDN>:2181
# To ingest the Natural Earth data into GeoWave
java -cp /usr/local/geowave/geowave-deploy-0.8.9-SNAPSHOT-tools.jar:/usr/local/geowave/plugins/* mil.nga.giat.geowave.core.cli.GeoWaveMain -localingest -b /usr/local/geowave/ingest/ne/ -i accumulo -n geowave.50m_admin_0_countries -f geotools-vector -u geowave -p hadoop -z <FQDN>:2181
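# (Optional check, assuming the Accumulo shell accepts -u/-p/-e this way: list tables and confirm ones prefixed with geowave.50m_admin_0_countries were created)
accumulo shell -u geowave -p hadoop -e "tables"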
# To ingest the GeoLife data into GeoWave
java -cp /usr/local/geowave/geowave-deploy-0.8.9-SNAPSHOT-tools.jar:/usr/local/geowave/plugins/* mil.nga.giat.geowave.core.cli.GeoWaveMain -localingest -b /usr/local/geowave/ingest/geolife/ -dim spatial-temporal -i accumulo -n geowave.geolife -f geolife -u geowave -p hadoop -z <FQDN>:2181
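# (Optional check from the Accumulo shell; geowave.geolife_GEOWAVE_METADATA is an assumed table name following the usual <namespace>_GEOWAVE_METADATA pattern)
scan -t geowave.geolife_GEOWAVE_METADATA -np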
# To run basic stats on the GeoLife data (note that this dumps the current statistics and generates new ones)
java -cp /usr/local/geowave/geowave-deploy-0.8.9-SNAPSHOT-tools.jar:/usr/local/geowave/plugins/* mil.nga.giat.geowave.core.cli.GeoWaveMain -stats -i accumulo -n geowave.geolife -type geolifepoint -u geowave -p hadoop -z <FQDN>:2181
# To run the DBScan analytic against the GeoLife tables in GeoWave
yarn jar geowave-analytic-mapreduce-0.8.9-SNAPSHOT-analytics-singlejar.jar -dbscan -n geowave.geolife -u geowave -p hadoop -z <FQDN>:2181 -i accumulo -emn 2 -emx 6 -pd 1000 -pc mil.nga.giat.geowave.analytic.partitioner.OrthodromicDistancePartitioner -cms 10 -orc 4 -hdfsbase /user/geolife -b bdb4 -eit geolife
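# (Optional: the DBScan output lands under the -hdfsbase path above, so a recursive listing should show the run's results)
hadoop fs -ls -R /user/geolife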
# To clear the GeoWave tables from Accumulo (haven't been able to get this to work yet)
java -cp /usr/local/geowave/geowave-deploy-0.8.9-SNAPSHOT-tools.jar:/usr/local/geowave/plugins/* mil.nga.giat.geowave.core.cli.GeoWaveMain -clear -c -i accumulo -n geowave.geolife -f geotools-vector -u geowave -p hadoop -z <FQDN>:2181
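# (Possible workaround if -clear keeps failing, run from the Accumulo shell; assumes 'deletetable -p' regex matching is available in this Accumulo version)
deletetable -p geowave\.geolife.*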