
name of display: :0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.4
server glx extensions:
GLX_ARB_create_context, GLX_ARB_create_context_profile,
GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
GLX_ARB_framebuffer_sRGB, GLX_ARB_multisample,
GLX_EXT_create_context_es2_profile, GLX_EXT_create_context_es_profile,
@Foolius
Foolius / gist:10973879
Created April 17, 2014 10:59
done with elasticsearch-hadoop-1.3.0.BUILD-20140416.180837-387.jar
org.elasticsearch.hadoop.rest.EsHadoopInvalidRequest: Found unrecoverable error [MapperParsingException[failed to parse]; nested: ElasticSearchParseException[Failed to derive xcontent from (offset=0, length=14): [83, 116, 114, 101, 97, 109, 32, 99, 108, 111, 115, 101, 100, 46]]; ]; Bailing out..
at org.elasticsearch.hadoop.rest.RestClient.retryFailedEntries(RestClient.java:189)
at org.elasticsearch.hadoop.rest.RestClient.bulk(RestClient.java:155)
at org.elasticsearch.hadoop.rest.RestRepository.sendBatch(RestRepository.java:170)
at org.elasticsearch.hadoop.rest.RestRepository.doWriteToIndex(RestRepository.java:152)
at org.elasticsearch.hadoop.rest.RestRepository.writeToIndex(RestRepository.java:130)
at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.write(EsOutputFormat.java:160)
at org.elasticsearch.hadoop.pig.EsStorage.putNext(EsStorage.java:191)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
at org.apache.pig
org.elasticsearch.hadoop.rest.EsHadoopProtocolException: Found unrecoverable error [MapperParsingException[failed to parse]; nested: ElasticSearchParseException[Failed to derive xcontent from (offset=0, length=14): [83, 116, 114, 101, 97, 109, 32, 99, 108, 111, 115, 101, 100, 46]]; ]; Bailing out..
at org.elasticsearch.hadoop.rest.RestClient.retryFailedEntries(RestClient.java:189)
at org.elasticsearch.hadoop.rest.RestClient.bulk(RestClient.java:155)
at org.elasticsearch.hadoop.rest.RestRepository.sendBatch(RestRepository.java:163)
at org.elasticsearch.hadoop.rest.RestRepository.doWriteToIndex(RestRepository.java:146)
at org.elasticsearch.hadoop.rest.RestRepository.writeToIndex(RestRepository.java:124)
at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.write(EsOutputFormat.java:160)
at org.elasticsearch.hadoop.pig.EsStorage.putNext(EsStorage.java:191)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
at org.apache.
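A hint to what went wrong is embedded in the error itself: the byte array that Elasticsearch could not parse as JSON, decoded as ASCII, is the text "Stream closed." — that is, the input stream died before a valid document reached the cluster. A quick check:

```python
# The bytes from the MapperParsingException, copied verbatim from the trace above.
payload = bytes([83, 116, 114, 101, 97, 109, 32, 99, 108, 111, 115, 101, 100, 46])

# Decode as ASCII to see what es-hadoop actually tried to index.
print(payload.decode("ascii"))  # → Stream closed.
```

So the "document" being rejected is not malformed JSON from the input files but the error message of an already-closed stream being sent as the bulk payload.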
@Foolius
Foolius / 1map
Created April 16, 2014 09:26
Error from one map task
Node [10.10.1.1:9201] failed; selected next node [10.10.1.6:9201]
Node [10.10.1.6:9201] failed; selected next node [10.10.1.2:9201]
Node [10.10.1.2:9201] failed; selected next node [10.10.1.5:9201]
Node [10.10.1.5:9201] failed; selected next node [hadoop.kbs.uni-hannover.de:9200]
Node [hadoop.kbs.uni-hannover.de:9200] failed; selected next node [10.10.1.3:9201]
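The log above shows es-hadoop's node failover: when a request to one configured node fails, it moves on to the next node in the list until one succeeds or every node has been tried. A minimal sketch of that behavior (names and structure are hypothetical, not the actual es-hadoop implementation):

```python
def send_with_failover(nodes, send):
    """Try each node in turn; return the first successful response.

    `send` is a callable that takes a node address and either returns a
    response or raises ConnectionError, standing in for the real HTTP call.
    """
    last_error = None
    for node in nodes:
        try:
            return send(node)
        except ConnectionError as err:
            # Mirrors the log lines: "Node [...] failed; selected next node [...]"
            print(f"Node [{node}] failed; selecting next node")
            last_error = err
    raise RuntimeError("all configured nodes failed") from last_error
```

In the log, every node fails in turn, which is consistent with the payload itself being broken ("Stream closed.") rather than any single node being down.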
@Foolius
Foolius / gist:10552798
Created April 12, 2014 19:36
pig + ES problem
index.pig:
-- register the es-hadoop Pig connector
register /home/foo/elasticsearch-hadoop-1.3.0.M3/dist/elasticsearch-hadoop-pig-1.3.0.M3.jar
-- load each input line as a single raw JSON string
a = load '$files' using PigStorage() as (json:chararray);
-- index the raw JSON documents directly into Elasticsearch
store a into '$index' using org.elasticsearch.hadoop.pig.EsStorage('es.input.json=true','es.nodes=hadoop.bla:9200');
$ pig -param files=`cat filelist | tr "\n" "," | sed -e 's/,$//'` -param index=hadooptest/t -f index.pig
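The backtick expression builds the `files` parameter by joining the lines of `filelist` with commas: `tr` replaces each newline with a comma and `sed` strips the trailing one. A Python sketch of the same transformation (the filenames here are made-up stand-ins for the contents of `filelist`):

```python
# Stand-in for the contents of the filelist file, one path per line.
filelist = "part-00000\npart-00001\npart-00002\n"

# tr "\n" ","  -> replace newlines with commas
# sed 's/,$//' -> drop the trailing comma
files_param = filelist.replace("\n", ",").rstrip(",")

print(files_param)  # → part-00000,part-00001,part-00002
```

The resulting comma-separated list is what Pig substitutes for `$files` in the `load` statement.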