View spark-python-version.py
from pyspark import SparkContext
from pyspark import SparkConf
import os
import sys
if __name__ == "__main__":
    conf = SparkConf()
    conf.setAppName("version-check")
    sc = SparkContext(conf=conf)
    # Report the Python version the driver is running with
    print(sys.version)
View app_index.html
<!DOCTYPE html>
<html lang="en">
<head>
<!-- Basic Page Needs
–––––––––––––––––––––––––––––––––––––––––––––––––– -->
<meta charset="utf-8">
<title>Active Apps</title>
<!-- Mobile Specific Metas
View gitea-logs.txt
[signal SIGSEGV: segmentation violation code=0x1 addr=0xe5 pc=0x7f4e081c0b43]
runtime stack:
runtime.throw(0x1355333, 0x2a)
/usr/local/go/src/runtime/panic.go:596 +0x95
runtime.sigpanic()
/usr/local/go/src/runtime/signal_unix.go:274 +0x2db
goroutine 213 [syscall, locked to thread]:
runtime.cgocall(0x105ad10, 0xc42012d3e8, 0x410001)
View gist:3a53f6bec36f84797cdfb1dadaf4d95c
Non-terminated Pods:
  Namespace  Name                                                             CPU Requests  CPU Limits  Memory Requests     Memory Limits
  ---------  ----                                                             ------------  ----------  ---------------     -------------
  default    anaconda-app-bdd182dcbfe847bfa7e9720278186fb6-2746126448-pnxkg  100m (0%)     4 (11%)     429496729600m (0%)  4Gi (6%)
View dask-yarn-errors
Container: container_1483005539219_0001_01_000002 on e669ac404e81_37693
=========================================================================
LogType:stderr
Log Upload Time:Thu Dec 29 05:24:49 -0500 2016
LogLength:3198
Log Contents:
distributed.nanny - INFO - Start Nanny at: 127.0.0.1:36755
distributed.worker - INFO - Start worker at: 127.0.0.1:35193
distributed.worker - INFO - http at: 127.0.0.1:42609
distributed.worker - INFO - nanny at: 127.0.0.1:36755
View keybase.md

Keybase proof

I hereby claim:

* I am quasiben on github.
* I am quasiben (https://keybase.io/quasiben) on keybase.
* I have a public key whose fingerprint is 1A11 C2A9 2CDF A1AF F775 38CC 2D91 3569 1385 0607

To claim this, I am signing this object:

View dataframe_barplot.py
import pandas as pd
import numpy as np
from bokeh.plotting import figure, output_file, show
from bokeh.charts import Bar
from bokeh.charts.attributes import ColorAttr, CatAttr
from bokeh.charts.builders.bar_builder import BarBuilder
output_file('try_select.html')
df = pd.DataFrame(np.random.randint(0,100,size=(100, 2)), columns=['other_values', 'values'])
df.sort_values('values', inplace=True)
View fast_food_map.py
import pandas as pd
from bokeh.plotting import figure, show, output_file
from bokeh.models import ColumnDataSource, HoverTool
from utils import within_bbox
df = pd.read_csv("POIWorld.csv")
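`within_bbox` comes from a local `utils` module whose source isn't shown in this gist. A minimal sketch of what such a helper might look like — the signature and the bbox format (a dict of lat/lon bounds) are assumptions, not the gist's actual API:

```python
def within_bbox(bbox, lat, lon):
    """Return True if the point (lat, lon) falls inside bbox.

    bbox is assumed to be a dict with 'lat_min', 'lat_max',
    'lon_min', 'lon_max' keys -- a guess at the helper's contract.
    """
    return (bbox["lat_min"] <= lat <= bbox["lat_max"]
            and bbox["lon_min"] <= lon <= bbox["lon_max"])

# Example: a rough bounding box around Austin, TX (illustrative values)
austin = {"lat_min": 30.0, "lat_max": 30.6, "lon_min": -98.0, "lon_max": -97.4}
```

With pandas, this could then filter the POI frame row-wise, e.g. `mask = df.apply(lambda r: within_bbox(austin, r["lat"], r["lon"]), axis=1)`.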
View parcel.json
{
  "name": "Anaconda",
  "packages": [
    {
      "name": "abstract-rendering",
      "version": "0.5.1-np110py27_0"
    },
    {
      "name": "alabaster",
      "version": "0.7.6-py27_0"
Got error to send bulk of actions: blocked by: [SERVICE_UNAVAILABLE/1/state not recovered / initialized];[SERVICE_UNAVAILABLE/2/no master]; {:level=>:error}
Failed to flush outgoing items {:outgoing_count=>1, :exception=>org.elasticsearch.cluster.block.ClusterBlockException: blocked by: [SERVICE_UNAVAILABLE/1/state not recovered / initialized];[SERVICE_UNAVAILABLE/2/no master];, :backtrace=>[
  "org.elasticsearch.cluster.block.ClusterBlocks.globalBlockedException(org/elasticsearch/cluster/block/ClusterBlocks.java:151)",
  "org.elasticsearch.cluster.block.ClusterBlocks.globalBlockedRaiseException(org/elasticsearch/cluster/block/ClusterBlocks.java:141)",
  "org.elasticsearch.action.bulk.TransportBulkAction.executeBulk(org/elasticsearch/action/bulk/TransportBulkAction.java:210)",
  "org.elasticsearch.action.bulk.TransportBulkAction.access$000(org/elasticsearch/action/bulk/TransportBulkAction.java:73)",
  "org.elasticsearch.action.bulk.TransportBulkAction$1.onFailure(org/elasticsearch/action/bulk/TransportBulkAction.ja