For Kibana to create a Tile Map it needs geo-encoded data, which Logstash does easily with the geoip plugin, BUT the field mapping in Elasticsearch must be geo_point. Logstash sets this up dynamically so long as the index being written to matches the elasticsearch-template.json template, which covers any index prefixed with logstash-. Otherwise, you need to update the index field mapping yourself for this to work.
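As a sketch, assuming the client IP sits in a field called clientip (the field name is illustrative), the Logstash side is just the geoip filter:

```
filter {
  geoip {
    source => "clientip"   # field holding the IP address to geo-encode
  }
}
```

For an index that doesn't match the logstash-* template, the geoip.location field has to be mapped as geo_point by hand, along these lines (index and type names illustrative):

```
PUT /my_index/_mapping/my_type
{
  "properties": {
    "geoip": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```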
Error: 320101]Attempt to target more than one WLDFSystemResource to server
Cause: BI / BI Apps ships with a pre-defined Diagnostic Module (Module-FMWDFW), which means that you cannot create a new one targeting the same servers. If you want to add your own configuration, you need to either add it to this module, or disable it first and then create your own.
(Paste this into a Static Text view in OBIEE to see the current values)
[u][b]Predefined Presentation Variables
[/b][/u][br/]
<p align="left">
[b]dashboard.currentPage:[/b] @{dashboard.currentPage}[br/]
[b]dashboard.author:[/b] @{dashboard.author}[br/]
[b]dashboard.caption:[/b] @{dashboard.caption}[br/]
[b]dashboard.description:[/b] @{dashboard.description}[br/]
[b]dashboard.location:[/b] @{dashboard.location}[br/]
http://ritt.md/go-elk-1
http://ritt.md/go-elk-2
http://ritt.md/go-elk-3
Error from ROracle when running fetch(query):

Error in .valueClassTest(ans, "data.frame", "fetch") :
  invalid value from generic function 'fetch', class "try-error", expected "data.frame"

Cause: a CLOB in the data being returned (SELECT * - lazy!).
Fix: Didn't need the CLOB anyway, so dropped it from the query.
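A sketch of the workaround, with illustrative table and column names — list the columns you actually need instead of SELECT *, leaving the CLOB out:

```sql
-- Instead of pulling every column (including the CLOB) with SELECT *,
-- name only the columns needed:
SELECT order_id, order_date, status
FROM   orders
```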
#!/usr/bin/bash
export FMW_HOME=/u01/BI_FMW/
export OUTBASE=/tmp/nqstmp_usage
# Log a timestamped du reading of the BI Server temp dir every five seconds
while true; do
  echo -n "$(date)," >> $OUTBASE.du
  du $FMW_HOME/instances/instance1/tmp/OracleBIServerComponent/coreapplication_obis1/obis_temp >> $OUTBASE.du
  sleep 5
done
Courtesy of here, if you're using a producer on a node remote from the Kafka server, you need to make sure you've set advertised.host.name in config/server.properties, otherwise the client may not pick up the correct remote host to connect to.
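The relevant setting in config/server.properties looks like this (the hostname here is illustrative):

```properties
# Hostname the broker advertises to producers/consumers; it must be
# resolvable and reachable from the remote client, not just locally.
advertised.host.name=kafka01.example.com
```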
- http://sqlblog.com/blogs/adam_machanic/archive/2013/02/22/capturing-attention-writing-great-session-descriptions.aspx
- http://www.pythian.com/blog/concrete-advice-for-abstract-writers/
- http://dbakevlar.com/2013/10/abstracts-reviews-and-conferences-oh-my/
- http://alistapart.com/article/conference-proposals-that-dont-suck
- https://mwidlake.wordpress.com/2015/04/17/tips-on-submitting-an-abstract-to-conference/
plaintext message here {:exception=>#<NoMethodError: undefined method `[]' for 68.68:Float>, :backtrace=>["/opt/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.4-java/lib/logstash/event.rb:73:in `initialize'", "/opt/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-codec-json-1.0.1/lib/logstash/codecs/json.rb:46:in `decode'", "/opt/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-1.0.0/lib/logstash/inputs/kafka.rb:169:in `queue_event'", "/opt/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-1.0.0/lib/logstash/inputs/kafka.rb:139:in `run'", "/opt/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.4-java/lib/logstash/pipeline.rb:177:in `inputworker'", "/opt/logstash-1.5.4/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.4-java/lib/logstash/pipeline.rb:171:in `start_input'"], :level=>:error}
Flume puts just raw text on Kafka, whereas Logstash by default puts a JSON-encoded message.
68.68.99.199 - - [06/Apr/2014:03:35:25 +0000] "GET /2013/04/smartview-as-the-replacement-for-bi-office-with-obiee-11-1-1-7/ HTTP/1.1" 200 12391 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"
vs
{"message":"68.68.99.199 - - [06/Apr/2014:03:35:25 +0000] \"GET /2013/04/smartview-as-the-replacement-for-bi-office-with-obiee-11-1-1-7/ HTTP/1.1\" 200 12391 \"-\" \"Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)\"","@version":"1","@timestamp":"2015-10-21T21:38:10.165Z","host":"bigdatalite.localdomain","path":"/home/oracle/website_logs/access_log.small"}
This means that Flume -> Kafka -> Logstash with default configs fails at the Logstash stage:
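One way round it (a sketch, assuming the Logstash 1.5 kafka input, with an illustrative Zookeeper address and topic name) is to tell the kafka input to expect plain text rather than JSON:

```
input {
  kafka {
    zk_connect => "localhost:2181"   # illustrative Zookeeper address
    topic_id   => "apache_logs"      # illustrative topic name
    codec      => plain              # Flume wrote raw text, so don't try to parse it as JSON
  }
}
```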