- Download and install Hadoop.
- Set up $HADOOP_HOME.
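Setting up $HADOOP_HOME usually means exporting it from ~/.bashrc. A minimal sketch, assuming Hadoop was unpacked to /usr/local/hadoop and OpenJDK 8 is installed (both paths are assumptions; adjust them to your machine):

```shell
# Assumed install locations -- adjust to where Hadoop and the JDK actually live.
export HADOOP_HOME=/usr/local/hadoop
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
# Put the hadoop/hdfs launchers and the start/stop scripts on PATH.
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"
```

After adding these lines, run source ~/.bashrc so the current shell picks them up.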
- Install openssh-server and run sudo /etc/init.d/ssh restart.
  Check that ssh is listening on port 22: netstat -tupln | grep 22
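Hadoop's start scripts connect to localhost over ssh, so passwordless key-based login is typically set up as well. A sketch of that step (it is an assumption added here, not part of the original notes):

```shell
# Assumed extra step: allow ssh to localhost without a password,
# which Hadoop's start-dfs.sh / start-yarn.sh scripts rely on.
mkdir -p ~/.ssh
# Only generate a key if one does not already exist.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa -q
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

Afterwards, ssh localhost should log in without prompting for a password.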
- Edit core-site.xml and hdfs-site.xml to add the following:

$HADOOP_HOME/etc/hadoop/core-site.xml

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

$HADOOP_HOME/etc/hadoop/hdfs-site.xml

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
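The two edits above can also be scripted. A sketch that writes both files in place (assumes $HADOOP_HOME is set; it falls back to /tmp/hadoop for a dry run):

```shell
# Write the two single-node configs from the step above.
# Falls back to /tmp/hadoop if HADOOP_HOME is unset (dry-run assumption).
CONF_DIR="${HADOOP_HOME:-/tmp/hadoop}/etc/hadoop"
mkdir -p "$CONF_DIR"

# core-site.xml: point the default filesystem at a local HDFS namenode.
cat > "$CONF_DIR/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# hdfs-site.xml: single node, so keep one replica per block.
cat > "$CONF_DIR/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF
```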
Last active: September 22, 2016 02:00
Hadoop-Spark setup on standalone Ubuntu