@viecode09
Created March 18, 2017 17:22
This is how to install Hadoop on macOS.

STEP 1: First install Homebrew, available from http://brew.sh

$ ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
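If you want a quick sanity check before continuing, Homebrew ships with a couple of diagnostics (nothing here is specific to Hadoop):

$ brew --version
$ brew doctor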

STEP 2: Install Hadoop

$ brew search hadoop
$ brew install hadoop

Hadoop will be installed at path /usr/local/Cellar/hadoop
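To confirm the install and find the exact version directory (2.6.0 is just the version used throughout this guide; yours may differ), something like the following should work:

$ brew info hadoop
$ hadoop version
$ ls /usr/local/Cellar/hadoop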

STEP 3: Configure Hadoop:

Edit hadoop-env.sh. The file is located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hadoop-env.sh, where 2.6.0 is the Hadoop version. Change the line

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true"

to

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc="

Edit core-site.xml. The file is located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/core-site.xml. Add the config below inside the <configuration> element:

 <property>
  <name>hadoop.tmp.dir</name>
  <value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
  <description>A base for other temporary directories.</description>
 </property>
 <property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
 </property>
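Hadoop normally creates the hadoop.tmp.dir path on its own, but creating it by hand is a harmless way to make sure the location is writable (adjust the path if you changed the value above):

$ mkdir -p /usr/local/Cellar/hadoop/hdfs/tmp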

Edit mapred-site.xml. The file is located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/mapred-site.xml and is blank by default. Add the config below:

<configuration>
 <property>
  <name>mapred.job.tracker</name>
  <value>localhost:9010</value>
 </property>
</configuration>

Edit hdfs-site.xml. The file is located at /usr/local/Cellar/hadoop/2.6.0/libexec/etc/hadoop/hdfs-site.xml. Add:

<configuration>
 <property>
  <name>dfs.replication</name>
  <value>1</value>
 </property>
</configuration>

To simplify life, edit ~/.profile and add the following aliases. By default ~/.profile might not exist; create it if needed.

alias hstart="/usr/local/Cellar/hadoop/2.6.0/sbin/start-dfs.sh;/usr/local/Cellar/hadoop/2.6.0/sbin/start-yarn.sh"
alias hstop="/usr/local/Cellar/hadoop/2.6.0/sbin/stop-yarn.sh;/usr/local/Cellar/hadoop/2.6.0/sbin/stop-dfs.sh"

and source it

$ source ~/.profile
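Note that ~/.profile is only read by login shells; depending on your setup the aliases may belong in ~/.bashrc or ~/.zshrc instead (see the comments below). To confirm the shell picked them up:

$ type hstart hstop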

Before running Hadoop for the first time, format HDFS:

$ hdfs namenode -format

STEP 4: Verify that SSH to localhost works. Check whether the key files ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub exist. If they don't, generate them with the command below:

$ ssh-keygen -t rsa

Enable Remote Login: open “System Preferences” -> “Sharing” and check “Remote Login”.

Authorize the SSH keys: to allow your system to accept the login, make it aware of the key that will be used:

$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
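If ssh localhost still asks for a password after this, the usual culprit is overly permissive file modes on the .ssh directory; tightening them is a safe, standard fix:

$ chmod 700 ~/.ssh
$ chmod 600 ~/.ssh/authorized_keys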

Test login.

$ ssh localhost
Last login: Fri Mar 6 20:30:53 2015
$ exit

STEP 5: Run Hadoop

$ hstart

and stop using

$ hstop
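To check that the daemons actually came up, jps (part of the JDK) should list NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager. The NameNode web UI is also worth a look; on Hadoop 2.x it listens on port 50070 (9870 on 3.x):

$ jps
$ open http://localhost:50070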

@pospospos2007

Not working. Please see this log I received:
==> Downloading https://www.apache.org/dyn/closer.cgi?path=hadoop/common/hadoop-
Already downloaded: /Users/mac/Library/Caches/Homebrew/hadoop-3.0.0.tar.gz

Then I waited for a long time and it still gives me this message. I'm confused.

@fakeyanss

@pospospos2007 try installing from another mirror source. It usually takes a few minutes for the installation to finish.

@Avid2018

Avid2018 commented Jun 3, 2019

I installed everything but when I type hstart I get
-bash: hstart: command not found

@shmsr

shmsr commented Jul 31, 2019

@Avid2018

Add this to the end of your ~/.bashrc

alias hstart="/usr/local/Cellar/hadoop/{version}/sbin/start-all.sh"
alias hstop="/usr/local/Cellar/hadoop/{version}/sbin/stop-all.sh"

Note: change {version} to the appropriate value.

@MichaelTan9999

I followed @viecode09's steps carefully, but the output from 'jps' didn't include ResourceManager. I'm confused.
