Ian Downard iandow

<div class="col-md-12">
<h2 id="the-mapr-data-platform">THE MAPR SANDBOX FOR HADOOP</h2>
<p>The MapR Sandbox for Hadoop is a fully-functional single-node cluster that provides data scientists, developers, and other DataOps stakeholders a safe environment in which to explore MapR’s core data storage for files, tables, and streams, plus ecosystem components for Hadoop, HBase, Hive, Hue, Kafka, Pig, Spark, and more.</p>
<h2 id="the-mapr-data-platform">THE MAPR SANDBOX FOR APACHE DRILL</h2>
<p>The MapR Sandbox with Drill is a fully functional single-node cluster that can be used to get an overview of <a href="/products/apache-drill/">Apache Drill</a> in the MapR data platform. Data scientists, developers, and other DataOps stakeholders can use this sandbox environment to get a feel for the power and capabilities of Drill by performing various types of queries outlined in the <a href="http://drill.apache.org/docs/getting-to-know-the-drill-sandbox/">Drill tutorial.</a></p>
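For example, once the Drill sandbox is running, the tutorial's sample queries can be tried from Drill's sqlline shell. A minimal sketch (the `employee.json` sample ships on Drill's classpath; the sqlline path shown is a common default and may differ in a MapR install):

```shell
# Start Drill's SQL shell in embedded mode (adjust the path for your install)
/opt/drill/bin/sqlline -u jdbc:drill:zk=local

# Then, at the sqlline prompt, query the sample data bundled with Drill:
# SELECT full_name, salary FROM cp.`employee.json` LIMIT 3;
```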
cat /opt/mapr/conf/mapr-clusters.conf
ping localhost
ping localhost.localdomain
jps
hadoop fs -ls /
id mapr
sudo su mapr
maprlogin print
maprlogin password
passwd
@iandow
iandow / here.md
Last active November 30, 2018 05:46

Create a docker network to bridge containers

docker network create mynetwork

Start StreamSets like this:

docker run -it -p 18630:18630 -d --name sdc --network mynetwork streamsets/datacollector
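Because both containers sit on the same user-defined bridge network, Docker's embedded DNS lets them reach each other by container name. One quick way to verify this (the `alpine` container below is just a throwaway for testing, not part of the original setup):

```shell
# Launch a throwaway container on the same network and reach "sdc" by name
docker run --rm --network mynetwork alpine ping -c 3 sdc
```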
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
@iandow
iandow / dsr-unsecured.yaml
Created October 9, 2018 21:17
yaml file for deploying mapr data science refinery in kubernetes
apiVersion: v1
kind: ConfigMap
metadata:
  name: dsr-configmap
  namespace: idownard-cluster
data:
  MAPR_CLUSTER: idownard-cluster
  MAPR_CLDB_HOSTS: 10.24.1.7
  MAPR_HS_HOST: 10.24.1.7
  MAPR_CONTAINER_USER: mapr
---
apiVersion: v1
kind: Pod
metadata:
  name: dsr-kube
  labels:
    app: dsr-svc
spec:
  containers:
  - name: dsr
    imagePullPolicy: Always
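The Pod spec above is truncated before the image and environment wiring; one common way to hand the ConfigMap values to the container is `envFrom` (the image value below is a placeholder, not taken from the gist):

```yaml
spec:
  containers:
  - name: dsr
    image: <dsr-image>            # placeholder; substitute your DSR image
    imagePullPolicy: Always
    envFrom:
    - configMapRef:
        name: dsr-configmap       # injects MAPR_CLUSTER, MAPR_CLDB_HOSTS, etc.
```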
@iandow
iandow / dsr_describe.txt
Created October 8, 2018 23:50
dsr deploy error
enmac:private-kubernetes idownard$ kubectl get pods -n idownard-cluster
NAME                                 READY   STATUS      RESTARTS   AGE
admincli-975b9897d-rvsnv             1/1     Running     0          1h
cldb-0                               1/1     Running     0          1h
dataaccessgateway-5bcdcb4d7c-sgxsd   1/1     Running     5          1h
kafkarest-69984c5dcf-n22pw           1/1     Running     5          1h
ldap-0                               1/1     Running     0          1h
mapr-init-6mxvq                      0/1     Completed   0          1h
maprgateway-0                        1/1     Running     5          1h
mastgateway-6f94c7fd-lr7lm           1/1     Running     5          1h
@iandow
iandow / gist:22c82206dfac90584a192a28aae9d2d4
Created September 6, 2018 18:20
kube cluster create commands
enmac:kubeflow-codelab idownard$ gcloud config set project mapr-demos
Updated property [core/project].
Updates are available for some Cloud SDK components. To install them,
please run:
$ gcloud components update
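The gist's title mentions cluster-create commands; after setting the project, creating a GKE cluster typically looks like the following (cluster name, zone, and node count are illustrative placeholders, not values from the gist):

```shell
# Create a small GKE cluster; all values here are placeholders
gcloud container clusters create demo-cluster \
  --zone us-west1-a \
  --num-nodes 3

# Fetch credentials so kubectl can talk to the new cluster
gcloud container clusters get-credentials demo-cluster --zone us-west1-a
```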
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# FLAGS (server, image) are defined elsewhere in the script via tf.app.flags.
def main(_):
  channel = grpc.insecure_channel(FLAGS.server)
  stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)
  # Send request
  with open(FLAGS.image, 'rb') as f:
    # See prediction_service.proto for gRPC request/response details.
    data = f.read()
    request = predict_pb2.PredictRequest()
    request.model_spec.name = 'model'
    # request.model_spec.signature_name = 'predict_images'
def export_model(sess, keys, architecture, saved_model_dir):
  # Pick the graph's input tensor name based on the model architecture.
  if architecture == 'inception_v3':
    input_tensor = 'DecodeJpeg/contents:0'
  elif architecture.startswith('mobilenet_'):
    input_tensor = 'input:0'
  else:
    raise ValueError('Unknown architecture', architecture)
  in_image = sess.graph.get_tensor_by_name(input_tensor)
  inputs = {'image': tf.saved_model.utils.build_tensor_info(in_image)}