I am running on an AWS free tier Ubuntu box. OS version is Ubuntu 14.04.4 LTS.
Default Python version was 2.7.6.
The NuPIC build failed with: Failed building wheel for pycapnp
---------------------------------------------------------------
sudo apt-get update -y
sudo apt-get install git g++ cmake python-dev -y
git clone https://github.com/numenta/nupic.core.git
git clone https://github.com/numenta/nupic.git
export NUPIC=$HOME/nupic
(gdb) bt
#0 __memset_sse2 () at ../sysdeps/x86_64/multiarch/../memset.S:78
#1 0x00007fffe938980c in std::_Hashtable<unsigned long, std::pair<unsigned long const, capnp::SchemaLoader::Impl::RequiredSize>, std::allocator<std::pair<unsigned long const, capnp::SchemaLoader::Impl::RequiredSize> >, std::__detail::_Select1st, std::equal_to<unsigned long>, std::hash<unsigned long>, std::__detail::_Mod_range_hashing, std::__detail::_Default_ranged_hash, std::__detail::_Prime_rehash_policy, std::__detail::_Hashtable_traits<false, false, true> >::clear() () from /home/kaustavsaha/.local/lib/python2.7/site-packages/nupic/bindings/_math.so
#2 0x00007fffe9385dbe in std::_Hashtable<unsigned long, std::pair<unsigned long const, capnp::SchemaLoader::Impl::RequiredSize>, std::allocator<std::pair<unsigned long const, capnp::SchemaLoader::Impl::RequiredSize> >, std::__detail::_Select1st, std::equal_to<unsigned long>, std::hash<unsigned long>, std::__detail::_Mod_range_hashing, std::__detail::_Default_ranged_hash, std::__detai
csbond007 / Working_Basic_Numenta
Last active June 25, 2016 01:50
Working Basic Numenta
https://github.com/numenta/nupic/wiki/Compiling-NuPIC-on-Ubuntu-14
sudo apt-get update -y
sudo apt-get install git g++ cmake python-dev -y
git clone https://github.com/numenta/nupic.core.git
git clone https://github.com/numenta/nupic.git
export NUPIC=$HOME/nupic
export NUPIC_CORE=$HOME/nupic.core
curl https://bootstrap.pypa.io/get-pip.py | sudo python
cd $NUPIC_CORE
csbond007 / Push_Code_GIT_from_Linux.txt
Created July 25, 2016 15:48
Pushing code to GIT Repository from Linux
git init
git add .
git commit -m "First commit"
git remote add origin "remote repository URL"
git remote -v
git pull origin master
git push
git clone https://github.com/csbond007/LiquidWeb.git
git pull
git status
git branch
// make your changes
// if new files were added: git add "filename"
git status
git commit -a -m "Commit"
git push
Change network settings in the VM:
NAT -> Bridged Adapter
Promiscuous Mode: Deny -> Allow All
Download the Cassandra tarball: http://www.apache.org/dyn/closer.lua/cassandra/3.7/apache-cassandra-3.7-bin.tar.gz
cd Downloads/
ls
sudo cp apache-cassandra-3.7-bin.tar.gz /usr/local/.
cd /usr/local
bin/spark-shell --conf spark.cassandra.connection.host=127.0.0.1 --packages datastax:spark-cassandra-connector:2.0.0-M2-s_2.11
import com.datastax.spark.connector._
///////////////////////////////////////////////////
//// Cassandra Table creation
CREATE KEYSPACE test WITH REPLICATION = {'class': 'SimpleStrategy', 'replication_factor': 1 };
CREATE TABLE test.words (word text PRIMARY KEY, count int);
INSERT INTO test.words (word, count) VALUES ('foo', 20);
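A minimal sketch of reading the table back inside the spark-shell session above (assumes sc is the shell's SparkContext; cassandraTable comes from the com.datastax.spark.connector._ import):
//// Reading test.words back through the connector (sketch)
val words = sc.cassandraTable("test", "words")   // CassandraRDD over test.words
words.collect().foreach(println)                 // should print the row inserted above
println(words.count())                           // expected: 1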
Reference : http://www.nodalpoint.com/development-and-deployment-of-spark-applications-with-scala-eclipse-and-sbt-part-1-installation-configuration/
Make a directory named spark_sbt_eclipse_cassandra
SBT Installation : http://www.scala-sbt.org/0.13/docs/Installing-sbt-on-Linux.html
curl https://bintray.com/sbt/rpm/rpm | sudo tee /etc/yum.repos.d/bintray-sbt-rpm.repo
sudo yum install sbt
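A minimal build.sbt for the SampleApp project is sketched here; the name, version, Scala version, and dependency versions are inferred from the spark-submit command and logs below (sampleapp_2.10-1.0.jar, Spark 1.6.2, connector 1.6.1-s_2.10) and are assumptions, not the original file:
// build.sbt (sketch; versions inferred from the spark-submit line below)
name := "SampleApp"
version := "1.0"
scalaVersion := "2.10.6"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.1"
)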
///////////////////////////////////////////////////////////////////////////////////////
[ksaha@mesos101 SampleApp]$ spark-submit --class "SampleApp" --master mesos://zk://10.10.40.138:2181/mesos --jars lib/spark-cassandra-connector-1.6.1-s_2.10.jar,lib/cassandra-driver-core-3.1.1.jar, target/scala-2.10/sampleapp_2.10-1.0.jar
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/10/24 19:34:25 INFO SparkContext: Running Spark version 1.6.2
16/10/24 19:34:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/10/24 19:34:25 INFO SecurityManager: Changing view acls to: ksaha
16/10/24 19:34:25 INFO SecurityManager: Changing modify acls to: ksaha
16/10/24 19:34:25 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ksaha); users with modify permissions: Set(ksaha)
16/10/24 19:34:25 INFO Utils: Successfully started service 'sparkDriver' on port 34208.
16/10/24 19:34:26 INFO Slf4jLogger: Slf4jLogger started
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import com.datastax.spark.connector._
object SampleApp {
  def main(args: Array[String]) {
    // Body assumed (the original gist is truncated here); Cassandra host matches the spark-shell example above
    val sc = new SparkContext(new SparkConf().setAppName("SampleApp").set("spark.cassandra.connection.host", "127.0.0.1"))
    sc.cassandraTable("test", "words").collect().foreach(println)
    sc.stop()
  }
}