
heinrichvk / logback-tcp-writer.xml
Last active October 30, 2015 08:28
TCP writer configuration example
<writer class="com.axibase.tsd.collector.writer.TcpAtsdWriter">
    <host>atsd_server</host>
    <port>8081</port>
</writer>
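A quick way to verify that the host and port above are reachable is to push a single ATSD network API command over the same TCP channel. A minimal sketch (my_entity and test_metric are placeholder names, not part of the configuration above):

# Send one series command to the endpoint the TcpAtsdWriter targets;
# my_entity and test_metric are placeholders.
echo "series e:my_entity m:test_metric=1" | nc -w 3 atsd_server 8081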

heinrichvk / inotify_sender.sh
Created October 27, 2015 10:22
The sender script checks the specified directory for new csv files and uploads them into ATSD. You can check the script's logs in the /tmp/itm/logs directory.
#!/bin/bash
scriptDir="$(dirname "$(readlink -f "$0")")"
url="$1"
logPostfix="$(date +%s)"
newcsvLog="$scriptDir/../logs/newcsv_${logPostfix}.log"
sendcsvLog="$scriptDir/../logs/send_${logPostfix}.log"
cleanLog="$scriptDir/../logs/cleaner_${logPostfix}.log"
csvDir="$scriptDir/../csv"
tmpDir="$scriptDir/../tmp"
if [ "$url" = "" ]; then

heinrichvk / average_brightness.r
Created October 20, 2015 09:04
Here is a code snippet from the R script that we used to calculate the weighted average of brightness for the given disks:
sum <- 0
total_weight <- 0
for (i in 1:length(pixels$x_shift)) {
  x_shift <- pixels$x_shift[i]
  y_shift <- pixels$y_shift[i]
  coords <- get_lat_lon(x_0 + x_shift, y_0 + y_shift)
  dist <- distGeo(c(lon_0, lat_0), c(coords[2], coords[1]))
  weight <- cloud_base / (cloud_base^2 + dist^2)^(3/2)
  sum <- sum + weight * matr[x_0 + x_shift, y_0 + y_shift]

heinrichvk / store_results.r
Created October 20, 2015 09:02
To store the results in ATSD, the following code is used:
save_series(series, metric_col = 2, metric_name = "cloudiness_Himawari_b13",
            entity = entities[i], verbose = FALSE)

heinrichvk / cloudiness_himawari.r
Created October 20, 2015 08:57
Here is the key line from the R script used to calculate the cloudiness_himawari_b13 metric from the satellite images:
new_row[1, i + 1] <- get_density(lat = latitudes[i], lon = longitudes[i], matr = img_cont[ , , 1])

heinrichvk / core-site.xml
Created October 8, 2015 11:02
Edit /opt/atsd/hadoop/conf/core-site.xml, setting the hadoop.tmp.dir property to /data/hdfs-cache:
<property>
    <name>hadoop.tmp.dir</name>
    <value>/data/hdfs-cache</value>
    <description>A base for other temporary directories.</description>
</property>

heinrichvk / hdfs-site.xml
Created October 8, 2015 11:01
Edit /opt/atsd/hadoop/conf/hdfs-site.xml, setting the dfs.name.dir and dfs.data.dir properties to /data/hdfs-data-name and /data/hdfs-data:
<property>
    <name>dfs.name.dir</name>
    <value>/data/hdfs-data-name</value>
</property>
<property>
    <name>dfs.data.dir</name>
    <value>/data/hdfs-data</value>
</property>
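Before restarting ATSD with the core-site.xml and hdfs-site.xml changes above, the target directories must exist and be writable by the service account. A minimal sketch, assuming ATSD runs under the axibase user:

# Create the relocated HDFS directories referenced in the two snippets above;
# the axibase:axibase owner is an assumption about the ATSD service account.
sudo mkdir -p /data/hdfs-cache /data/hdfs-data-name /data/hdfs-data
sudo chown -R axibase:axibase /data/hdfs-cache /data/hdfs-data-name /data/hdfs-data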

heinrichvk / atsd_with_tags.sh
Created September 28, 2015 07:27
Script for loading data into ATSD, with tags.
#!/bin/bash
# 2015-06-23T21:00:00Z = 1435093200000 ms
# 2015-06-24T21:00:00Z = 1435179600000 ms
for ((q = 1; q < 101; q++)); do
  echo "sensor$q"
  for (( i = 1435093200000; i < 1435179600000; i += 10000 )); do
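    # Hedged sketch of the truncated inner-loop body (not the gist's actual
    # code): one ATSD network API series command per 10-second step, carrying
    # the metrics and tags listed in the splunk_with_tags.sh header below.
    # Random values and the atsd_server host/port are placeholders; a real run
    # would batch commands rather than open a TCP connection per line.
    cmd="series e:sensor$q ms:$i"
    cmd="$cmd m:temperature=$((RANDOM % 40)) m:humidity=$((RANDOM % 100)) m:precipitation=$((RANDOM % 10))"
    cmd="$cmd t:location=loc$((q % 5)) t:type=climate"
    echo "$cmd" | nc -w 1 atsd_server 8081
  done
done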

heinrichvk / atsd_no_tags.sh
Created September 28, 2015 07:23
Script for loading data into ATSD, without tags.
#!/bin/bash
# 2015-06-23T21:00:00Z = 1435093200000 ms
# 2015-06-24T21:00:00Z = 1435179600000 ms
for ((q = 1; q < 101; q++)); do
  echo "sensor$q"
  for (( i = 1435093200000; i < 1435179600000; i += 10000 )); do
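    # Hedged sketch of the truncated inner-loop body: the same series commands
    # as in atsd_with_tags.sh, only without the t: tag fields. Random values
    # and the atsd_server host/port are placeholders.
    echo "series e:sensor$q ms:$i m:temperature=$((RANDOM % 40)) m:humidity=$((RANDOM % 100)) m:precipitation=$((RANDOM % 10))" \
      | nc -w 1 atsd_server 8081
  done
done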

heinrichvk / splunk_with_tags.sh
Last active October 2, 2015 14:25
Script for loading data into Splunk, with tags.
#!/bin/bash
# 2015-06-23T21:00:00Z = 1435093200000 ms
# 2015-06-24T21:00:00Z = 1435179600000 ms
echo "temperature, humidity, precipitation, timestamp, location, type, host" >> splunk_tags
for (( i = 1435093200000; i < 1435179600000; i += 10000 )); do
  for ((q = 1; q < 101; q++)); do
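    # Hedged sketch of the truncated loop body (not the gist's actual code):
    # append one csv row per sensor per timestamp, matching the header written
    # above. Random values and the naming pattern are placeholders.
    echo "$((RANDOM % 40)), $((RANDOM % 100)), $((RANDOM % 10)), $i, loc$((q % 5)), climate, sensor$q" >> splunk_tags
  done
done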