@arundasan91
Created April 2, 2016 22:16
Caffe Installation Tutorial for beginners

Caffe

Freshly brewed!

With the availability of huge amounts of data for research and powerful machines to run your code on, machine learning and neural networks are gaining ground again and impacting us more than ever in our everyday lives. With huge players like Google open-sourcing parts of their machine learning systems, such as the TensorFlow software library for numerical computation, there are many options for someone interested in getting started with machine learning and neural nets. Caffe, a deep learning framework developed by the Berkeley Vision and Learning Center (BVLC) and its contributors, comes into play with a fresh cup of coffee.

Installation Instructions (Ubuntu 14.04 Trusty)

The following section is divided into two parts. Caffe's documentation suggests installing the Anaconda Python distribution to make sure you have all the necessary packages with ease. If you're someone who does not want to install Anaconda on your system for some reason, I've covered that too. So in the first part you'll find information on how to install Caffe with Anaconda, and in the second part you'll find the information for installing Caffe without Anaconda.

Please note that the following instructions were tested on my local machine and on two Chameleon Cloud instances. However, I cannot guarantee success for anyone. Please be ready to see some errors along the way, but I hope you won't stumble into any if you follow the directions as-is.

My local machine and the instances I used are NOT equipped with GPUs. So the installation instructions are strictly for non-GPU, or more clearly, CPU-only systems running Ubuntu 14.04 Trusty. However, to install it on a GPU-based system, you just have to install CUDA and the necessary drivers for your GPU. You can find the instructions on Stack Overflow or from your always-reliable friend, Google.

For systems without GPUs (CPU-only)

1. Caffe + Anaconda

The Anaconda Python distribution includes scientific and analytic Python packages which are extremely useful. The complete list of packages can be found here.

To install Anaconda, you have to first download the Installer to your machine. Go to this website to download the Installer. Scroll to the 'Anaconda for Linux' section and choose the installer to download depending on your system architecture.

Once you have the Installer in your machine, run the following code to install Anaconda.

bash Anaconda2-2.5.0-Linux-x86_64.sh

If you fail to read the few lines printed after installation, you'll waste a good amount of your productive time trying to figure out what went wrong. An important line reads:

For this change to become active, you have to open a new terminal.

So, once the Anaconda installation is over, please open a new terminal. Period.

After opening a new terminal, to verify the installation type:

conda -V

This should give you the current version of conda, thus verifying the installation. Now that's done!

Now we will install OpenBLAS.

sudo apt-get install libopenblas-dev

Next, go ahead and install Boost. More info on Boost here.

I faced a problem while installing Boost on all my machines. I fixed it by adding the multiverse repository to the sources list. Since editing sources.list directly is not recommended, follow the steps below for a better alternative.

echo 'deb http://archive.ubuntu.com/ubuntu trusty main restricted universe multiverse' >>/tmp/multiverse.list

sudo cp /tmp/multiverse.list /etc/apt/sources.list.d/

rm /tmp/multiverse.list

The repository line is saved to a temporary file named 'multiverse.list' in the /tmp folder. It is then copied into the /etc/apt/sources.list.d/ folder, and the file in /tmp is removed. I found this fix on a Stack Exchange forum.
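
If you prefer a one-liner, the add-apt-repository tool (from the software-properties-common package) should achieve the same thing. This is a hedged alternative that I did not test as part of this tutorial:

sudo apt-get install software-properties-common

sudo add-apt-repository "deb http://archive.ubuntu.com/ubuntu trusty main restricted universe multiverse"

sudo apt-get update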

Now to install boost, run:

sudo apt-get install libboost-all-dev

Now, let us install OpenCV. Go ahead and run:

conda install opencv

sudo apt-get install libopencv-dev

Now let us install some dependencies of Caffe. Run the following:

sudo apt-get install libleveldb-dev libsnappy-dev libhdf5-serial-dev

sudo apt-get install libgflags-dev libgoogle-glog-dev liblmdb-dev

sudo apt-get install protobuf-compiler libprotobuf-dev

conda install -c https://conda.anaconda.org/anaconda protobuf

Okay, that's it. Let us now download Caffe. If you don't have git installed on your system yet, run this really quick:

sudo apt-get install git

We will clone the official Caffe repository from GitHub.

git clone https://github.com/BVLC/caffe

Once the repository is cloned, cd into the caffe folder.

cd caffe

We will edit Caffe's configuration file now. We need to do this to specify that we are using a CPU-only system (i.e., tell the compiler to disable GPU and CUDA support). For this, make a copy of Makefile.config.example.

cp Makefile.config.example Makefile.config

Great! Now go ahead and open Makefile.config in your favourite text editor (vi, vim, gedit, ...). Change the following:

1. Uncomment (No space in the beginning): 
CPU_ONLY := 1

2. Change:
BLAS := atlas to BLAS := open

3. Comment out:
PYTHON_INCLUDE := /usr/include/python2.7 \
        /usr/lib/python2.7/dist-packages/numpy/core/include
   
4. Uncomment:
ANACONDA_HOME := $(HOME)/anaconda2

PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
    $(ANACONDA_HOME)/include/python2.7 \
    $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include 

5. Comment:
PYTHON_LIB := /usr/lib

6. Uncomment:
PYTHON_LIB := $(ANACONDA_HOME)/lib

7. Uncomment:
USE_PKG_CONFIG := 1

Your Makefile.config should now look something like the one in the Appendix under 'For Caffe + Anaconda'.
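
If you would rather script the three single-line toggles (and still edit the Python/Anaconda block by hand), the sed commands below are a minimal sketch against the stock Makefile.config.example; double-check the result with the grep at the end, which lists only the settings that ended up uncommented:

sed -i 's/^# CPU_ONLY := 1/CPU_ONLY := 1/' Makefile.config

sed -i 's/^BLAS := atlas/BLAS := open/' Makefile.config

sed -i 's/^# USE_PKG_CONFIG := 1/USE_PKG_CONFIG := 1/' Makefile.config

grep -E '^(CPU_ONLY|BLAS|ANACONDA_HOME|PYTHON_INCLUDE|PYTHON_LIB|USE_PKG_CONFIG)' Makefile.config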

Now that that's done, let me share with you an error I came across. Our Makefile.config is okay, but while 'make'-ing / building the installation files, the hdf5 dependency gave me an error. This might not apply to you; I can't say for sure. The build required the files libhdf5_hl.so.10 and libhdf5.so.10, but the files on my system were libhdf5_hl.so.7 and libhdf5.so.7. I fixed this by doing the following:

cd /usr/lib/x86_64-linux-gnu/

sudo cp libhdf5_hl.so.7 libhdf5_hl.so.10

sudo cp libhdf5.so.7 libhdf5.so.10

We will now install the libraries listed in the requirements.txt file.

cd ~/caffe/python

sudo apt-get install python-pip && sudo pip install -r requirements.txt

Now, we can safely build the files in the caffe directory. We will run the make process with 4 parallel jobs by specifying -j4. More on it here.

cd ~/caffe

sudo make all -j4
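
If your machine has more (or fewer) than four cores, an optional alternative is to let nproc pick the job count; the effect is the same:

sudo make all -j"$(nproc)"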

I hope the make process went well. If not, please see which package failed by checking the logs or from the terminal itself. Feel free to comment; I will help to the best of my knowledge. You can also seek help from your go-to friends Google and Stack Exchange, as mentioned above.

Provided that the make process was successful, continue with the rest of the installation process.

We will now make the Pycaffe files. Pycaffe is the Python interface to Caffe, which allows you to use Caffe inside Python. More on it here. We will also make distribute. This is explained on the Caffe website.

sudo make pycaffe

sudo make distribute

Awesome! We are almost there. We just need to test whether everything went fine. For that, build the test files and run the tests.

sudo make test

sudo make runtest

If you pass all the tests, you've successfully installed Caffe on your system! One good reason to smile!

Finally, we need to add the correct paths to our installed modules. Using your favourite text editor, add the following to the .bashrc file in your /home/<username>/ folder for Caffe to work properly. Please make sure you replace <username> with your system's username.

#Anaconda if not present already
export PATH=/home/<username>/anaconda2/bin:$PATH
#Caffe Root
export CAFFE_ROOT=/home/<username>/caffe/
export PYTHONPATH=/home/<username>/caffe/distribute/python:$PYTHONPATH
export PYTHONPATH=/home/<username>/caffe/python:$PYTHONPATH
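
If you would rather not reboot just to pick up these environment variables, reloading the shell configuration in the current terminal usually suffices (the reboot below is simply the catch-all option):

source ~/.bashrc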

CHEERS! You're done! Now let's test if it really works.

Restart/reboot your system to ensure everything loads properly.

sudo reboot

Open Python and type:

import caffe

You should be able to load caffe successfully. Now let's start coding :)
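
For a slightly more thorough check, a minimal sketch that matches the CPU-only build above is to force CPU mode and print where the module was loaded from:

import caffe
caffe.set_mode_cpu()
print(caffe.__file__)

The printed path should point inside your caffe tree; if it doesn't, revisit the PYTHONPATH lines added to .bashrc.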

2. Caffe without installing Anaconda

If you prefer not to install Anaconda on your system, you can install Caffe by following the steps below. As mentioned earlier, installing all the dependencies can be difficult. If this tutorial does not work for you, please look into the errors and consult our trusted friends, Google and Stack Exchange.

To start, we will update and upgrade the packages on our system. Then we will install the dependencies one by one on the machine. Type the following to get started.

sudo apt-get update && sudo apt-get upgrade && sudo apt-get dist-upgrade && sudo apt-get autoremove

Now, let us install OpenBLAS.

sudo apt-get -y install libopenblas-dev

Next, go ahead and install Boost. More info on Boost here.

I faced a problem while installing Boost on all my machines. I fixed it by adding the multiverse repository to the sources list. Since editing sources.list directly is not recommended, follow the steps below for a better alternative.

echo 'deb http://archive.ubuntu.com/ubuntu trusty main restricted universe multiverse' >>/tmp/multiverse.list

sudo cp /tmp/multiverse.list /etc/apt/sources.list.d/

rm /tmp/multiverse.list

The repository line is saved to a temporary file named 'multiverse.list' in the /tmp folder. It is then copied into the /etc/apt/sources.list.d/ folder, and the file in /tmp is removed. I found this fix on a Stack Exchange forum.

Now to install boost, run:

sudo apt-get update && sudo apt-get install libboost-all-dev

If, later in the installation process, you find that any of the Boost-related files are missing, run the following command. You can skip this for now; it won't hurt to run it either way.

sudo apt-get -y install --fix-missing libboost-all-dev

Go ahead and install the libfaac-dev package.

sudo apt-get install libfaac-dev

Now we need to install ffmpeg. Let us also make sure that the ffmpeg version is one that OpenCV and Caffe approve of. We will remove any previous versions of ffmpeg and install a fresh one.

The following code will remove ffmpeg and related packages:

sudo apt-get -y remove ffmpeg x264 libx264-dev

The mc3man repository hosts ffmpeg packages. I came to know about it from the Stack Exchange forums. To add the repo, type this:

sudo add-apt-repository ppa:mc3man/trusty-media

Update and install ffmpeg.

sudo apt-get update && sudo apt-get install ffmpeg gstreamer0.10-ffmpeg

Now we can install OpenCV. First let us install its dependencies. Building OpenCV can be challenging at first, but if you have all the dependencies in place it will be done in no time.

Go ahead and run the following lines:

sudo apt-get install build-essential

The 'build-essential' package ensures that we have the compilers ready. Now we will install some required packages. Run:

sudo apt-get install cmake git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev

We will install some optional packages as well. Run:

sudo apt-get install python-dev python-numpy libtbb2 libtbb-dev libjpeg-dev libpng-dev libtiff-dev libjasper-dev libdc1394-22-dev

sudo apt-get install libtiff4-dev libopenexr-dev libeigen2-dev yasm libopencore-amrnb-dev libtheora-dev libvorbis-dev libxvidcore-dev

sudo apt-get install python-tk libeigen3-dev libx264-dev libqt4-dev libqt4-opengl-dev sphinx-common texlive-latex-extra libv4l-dev default-jdk ant libvtk5-qt4-dev

Now we can go ahead and download the OpenCV source files. Go to your home folder first.

cd ~

Download the files:

wget http://sourceforge.net/projects/opencvlibrary/files/opencv-unix/2.4.9/opencv-2.4.9.zip

Unzip the file by:

unzip opencv-2.4.9.zip

Go to the opencv folder by running:

cd opencv-2.4.9

Make a build directory inside.

mkdir build

Go inside the build directory.

cd build

Generate the build files using cmake.

cmake -D WITH_TBB=ON -D BUILD_NEW_PYTHON_SUPPORT=ON -D WITH_V4L=ON -D INSTALL_C_EXAMPLES=ON -D INSTALL_PYTHON_EXAMPLES=ON -D BUILD_EXAMPLES=ON -D WITH_QT=ON -D WITH_OPENGL=ON -D WITH_VTK=ON ..

In the cmake configuration summary, make sure that FFMPEG is detected; also check that Python, Numpy, Java and OpenCL are properly detected and recognized.

Now we will run the make process with 4 parallel jobs by specifying -j4. More on it here.

sudo make -j4

Go ahead and continue installation.

sudo make install

Once the installation is complete, do these steps to get OpenCV configured.

sudo sh -c 'echo "/usr/local/lib" > /etc/ld.so.conf.d/opencv.conf'

sudo ldconfig

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/opencv/lib
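
To quickly confirm that the Python bindings for OpenCV are in place, you can print the reported version from the shell; this is just a sanity check and should print 2.4.9 if you built the version downloaded above:

python -c "import cv2; print(cv2.__version__)"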

Come out of the build folder if you haven't already by running:

cd ~

Install python-pip:

sudo apt-get install python-pip

Now, we will install SciPy and other scientific packages which are key Caffe dependencies.

sudo apt-get install python-numpy python-scipy python-matplotlib ipython ipython-notebook python-pandas python-sympy python-nose

We will install Cython now. (I needed it to install scikit-image properly.)

wget http://cython.org/release/Cython-0.23.4.zip

unzip Cython-0.23.4.zip

cd Cython-0.23.4

sudo python setup.py install

cd ~

Now that we have Cython, go ahead and run the command below to install scikit-image and scikit-learn.

sudo pip install scikit-image scikit-learn

We will now install some more crucial dependencies of Caffe:

sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libhdf5-serial-dev

sudo apt-get install libgflags-dev libgoogle-glog-dev liblmdb-dev protobuf-compiler

sudo pip install protobuf

Installing pydot will be beneficial for viewing our net by saving it off as an image file.

sudo apt-get install python-pydot

Now that all the dependencies are installed, we will go ahead and download the Caffe source. Run:

git clone https://github.com/BVLC/caffe

Go into the caffe folder and copy and rename the Makefile.config.example file to Makefile.config.

cd caffe
cp Makefile.config.example Makefile.config

Great! Now go ahead and open Makefile.config in your favourite text editor (vi, vim, gedit, ...). Change the following:

1. Uncomment (No space in the beginning): 
    CPU_ONLY := 1

2. Uncomment:
    USE_PKG_CONFIG := 1
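
As in the first part, these two toggles can also be scripted; the sed commands below are a minimal sketch against the stock Makefile.config.example, and the grep simply shows which of the two settings ended up enabled:

sed -i 's/^# CPU_ONLY := 1/CPU_ONLY := 1/' Makefile.config

sed -i 's/^# USE_PKG_CONFIG := 1/USE_PKG_CONFIG := 1/' Makefile.config

grep -E '^(CPU_ONLY|USE_PKG_CONFIG)' Makefile.config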

We will also install the packages listed in Caffe's requirements.txt file, just in case.

cd ~/caffe/python
sudo pip install -r requirements.txt

Now, we can safely build the files in the caffe directory. We will run the make process with 4 parallel jobs by specifying -j4. More on it here.

cd ~/caffe
sudo make all -j4

I hope the make process went well. If not, please see which package failed by checking the logs or from the terminal itself. Feel free to comment; I will help to the best of my knowledge. You can also seek help from your go-to friends Google and Stack Exchange, as mentioned above.

Provided that the make process was successful, continue with the rest of the installation process.

We will now make the Pycaffe files. Pycaffe is the Python interface to Caffe, which allows you to use Caffe inside Python. More on it here. We will also make distribute. This is explained on the Caffe website.

sudo make pycaffe
sudo make distribute

Awesome! We are almost there. We just need to test whether everything went fine. For that, build the test files and run the tests.

sudo make test
sudo make runtest

If you pass all the tests, you've successfully installed Caffe on your system! One good reason to smile!

Finally, we need to add the correct paths to our installed modules. Using your favourite text editor, add the following to the .bashrc file in your /home/<username>/ folder for Caffe to work properly. Please make sure you replace <username> with your system's username.

# Caffe Root
export CAFFE_ROOT=/home/<username>/caffe/
export PYTHONPATH=/home/<username>/caffe/distribute/python:$PYTHONPATH
export PYTHONPATH=/home/<username>/caffe/python:$PYTHONPATH

CHEERS! You're done! Now let's test if it really works.

Restart/reboot your system to ensure everything loads properly.

sudo reboot

Open Python and type:

import caffe

You should be able to load caffe successfully. Now let's start coding :)
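
Since pydot was installed earlier, you can also try rendering a network definition to an image with the draw_net.py script that ships with Caffe. This is a hedged sketch; it assumes the default ~/caffe tree, that pycaffe built cleanly, and uses one of the bundled model definitions:

python ~/caffe/python/draw_net.py ~/caffe/models/bvlc_reference_caffenet/deploy.prototxt caffenet.png

If it succeeds, caffenet.png will contain a diagram of the CaffeNet architecture.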

Appendix

Makefile.config

For Caffe + Anaconda

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
# USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
#	You should not set this flag if you will be reading LMDBs with any
#	possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
        -gencode arch=compute_20,code=sm_21 \
        -gencode arch=compute_30,code=sm_30 \
        -gencode arch=compute_35,code=sm_35 \
        -gencode arch=compute_50,code=sm_50 \
        -gencode arch=compute_50,code=compute_50

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := open
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \
#        /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := $(HOME)/anaconda2
PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
         $(ANACONDA_HOME)/include/python2.7 \
         $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include 

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#                 /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
#PYTHON_LIB := /usr/lib
PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib

    # If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
USE_PKG_CONFIG := 1

BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @

For Caffe without Anaconda

## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
# USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
#	You should not set this flag if you will be reading LMDBs with any
#	possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
        -gencode arch=compute_20,code=sm_21 \
        -gencode arch=compute_30,code=sm_30 \
        -gencode arch=compute_35,code=sm_35 \
        -gencode arch=compute_50,code=sm_50 \
        -gencode arch=compute_50,code=compute_50

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := open
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
PYTHON_INCLUDE := /usr/include/python2.7 \
        /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
# ANACONDA_HOME := $(HOME)/anaconda
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
        # $(ANACONDA_HOME)/include/python2.7 \
        # $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include \

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#                 /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib

    # If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
@wmmxk

wmmxk commented Mar 5, 2017

Thanks for your documentation.

I get this error and have googled a lot with no luck. Do you have any ideas? Here is the error:

CMakeFiles/compute_image_mean.dir/compute_image_mean.cpp.o: In function `std::string* google::MakeCheckOpString<int, int>(int const&, int const&, char const*)': compute_image_mean.cpp:(.text._ZN6google17MakeCheckOpStringIiiEEPSsRKT_RKT0_PKc[_ZN6google17MakeCheckOpStringIiiEEPSsRKT_RKT0_PKc]+0x50): undefined reference to `google::base::CheckOpMessageBuilder::NewString()'
CMakeFiles/compute_image_mean.dir/compute_image_mean.cpp.o: In function `std::string* google::MakeCheckOpString<unsigned long, int>(unsigned long const&, int const&, char const*)': compute_image_mean.cpp:(.text._ZN6google17MakeCheckOpStringImiEEPSsRKT_RKT0_PKc[_ZN6google17MakeCheckOpStringImiEEPSsRKT_RKT0_PKc]+0x50): undefined reference to `google::base::CheckOpMessageBuilder::NewString()'
CMakeFiles/compute_image_mean.dir/compute_image_mean.cpp.o: In function `main': compute_image_mean.cpp:(.text.startup+0x168): undefined reference to `google::SetUsageMessage(std::string const&)'
../lib/libcaffe.so.1.0.0-rc5: undefined reference to `leveldb::DB::Open(leveldb::Options const&, std::string const&, leveldb::DB**)'
../lib/libcaffe.so.1.0.0-rc5: undefined reference to `leveldb::Status::ToString() const'
collect2: error: ld returned 1 exit status
tools/CMakeFiles/compute_image_mean.dir/build.make:135: recipe for target 'tools/compute_image_mean' failed
make[2]: *** [tools/compute_image_mean] Error 1
CMakeFiles/Makefile2:511: recipe for target 'tools/CMakeFiles/compute_image_mean.dir/all' failed
make[1]: *** [tools/CMakeFiles/compute_image_mean.dir/all] Error 2
Makefile:127: recipe for target 'all' failed
make: *** [all] Error 2

@mayyphyokhaing

Sir, I'm now reading
For systems without GPU's (CPU_only)

  1. Caffe + Anaconda
    and it says to install 'Anaconda for Linux'. Does that mean I need to use a Linux OS?
    I'm a complete beginner with Python; could you explain?

@qaz99

qaz99 commented Jul 11, 2017

In Caffe + Anaconda,

after the lines
cd ~/caffe
sudo make all -j4

I am getting

make: Nothing to be done for `all'.

What should I do?

@vokhidovhusan

really nice tutorial. Thanks

@Saadiabatool

Before executing make all, edit Makefile.config as follows:
+INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial/
+LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu/hdf5/serial/

@quintendewilde

Makefile:594: recipe for target '.build_release/cuda/src/caffe/layers/cudnn_lcn_layer.o' failed
make: *** [.build_release/cuda/src/caffe/layers/cudnn_lcn_layer.o] Error 1
make: *** Waiting for unfinished jobs....

How do I fix this? Running CUDA 9.0.
Any help would be much appreciated! 👍

@YogeshShitole

Hi, when I try to build Caffe with the command sudo make all -j4
I get the error below:
Makefile:616: recipe for target '.build_release/tools/caffe.bin' failed
make: *** [.build_release/tools/caffe.bin] Error 1

@damianb-inwebit

Makefile:581: recipe for target '.build_release/src/caffe/util/db_leveldb.o' failed
make: *** [.build_release/src/caffe/util/db_leveldb.o] Error 1
In file included from src/caffe/util/db.cpp:2:0:
./include/caffe/util/db_leveldb.hpp:7:24: fatal error: leveldb/db.h: No such file or directory
compilation terminated.
Makefile:581: recipe for target '.build_release/src/caffe/util/db.o' failed
make: *** [.build_release/src/caffe/util/db.o] Error 1

@ZubairKhan001

@BLCKPSTV this is because you are building Caffe with cudnn=1 and you didn't copy the cuDNN libraries into CUDA 9.0. It's better to use CUDA 8.0 with cuDNN v6.0.

@Yumin-Sun-00

Are you going to post an updated installation guide for Ubuntu 16.04 + CUDA 9.1 + cuDNN 7.1 + OpenCV 3 + Python 3 + Anaconda 3?

@venkat01

venkat01 commented Jul 1, 2018

Fantastic blog, mate. To make it run, I had to do the following [running on Ubuntu 14.04]:

--> During installation of requirements.txt, my suggestion is to install two items at a time; otherwise, if (say) the 8th item gives an error, after fixing it you have to download all of them again. That is what I did and found it to be successful.

  1. sudo pip install --upgrade pip --> as the ipython setup was breaking

  2. I also had to install the following before the ipython setup:

sudo apt-get install libffi-dev libssl-dev
sudo pip install pyopenssl ndg-httpsclient pyasn1

@sara-eb

sara-eb commented Jul 23, 2018

@Laowai I have installed cuDNN v6 with CUDA 8 as suggested on the Caffe website, but I am still getting the following error with the N-dimensional pooling layer once I switch on the cudnn=1 flag. Does anyone know how to solve this?


[ 36%] Building CXX object src/caffe/CMakeFiles/caffe.dir/layers/cudnn_pooling_layer.cpp.o
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp: In member function ‘virtual void caffe::CuDNNPoolingLayer<Dtype>::LayerSetUp(const std::vector<caffe::Blob<Dtype>*>&, const std::vector<caffe::Blob<Dtype>*>&)’:
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp:13:3: error: ‘createTensorDesc’ is not a member of ‘caffe::cudnn’
   cudnn::createTensorDesc<Dtype>(&bottom_desc_);
   ^
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp:13:32: error: expected primary-expression before ‘>’ token
   cudnn::createTensorDesc<Dtype>(&bottom_desc_);
                                ^
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp:14:3: error: ‘createTensorDesc’ is not a member of ‘caffe::cudnn’
   cudnn::createTensorDesc<Dtype>(&top_desc_);
   ^
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp:14:32: error: expected primary-expression before ‘>’ token
   cudnn::createTensorDesc<Dtype>(&top_desc_);

                                ^
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp:30:3: error: ‘createNdPoolingDesc’ is not a member of ‘caffe::cudnn’
   cudnn::createNdPoolingDesc<Dtype>(&pooling_desc_,
   ^
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp:30:35: error: expected primary-expression before ‘>’ token
   cudnn::createNdPoolingDesc<Dtype>(&pooling_desc_,
                                   ^
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp: In member function ‘virtual void caffe::CuDNNPoolingLayer<Dtype>::Reshape(const std::vector<caffe::Blob<Dtype>*>&, const std::vector<caffe::Blob<Dtype>*>&)’:
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp:40:3: error: ‘setTensorNdDesc’ is not a member of ‘caffe::cudnn’
   cudnn::setTensorNdDesc<Dtype>(&bottom_desc_, bottom[0]->shape());
   ^
/home/user/caffe-models/caffe-unet/caffe/src/caffe/layers/cudnn_pooling_layer.cpp:40:31: error: expected primary-expression before ‘>’ token
   cudnn::setTensorNdDesc<Dtype>(&bottom_desc_, bottom[0]->shape());

@anirudhmnair

Thanks a ton!
Successfully installed Caffe!
The detailed instructions were very informative and useful.

@atifkarim

atifkarim commented Dec 6, 2018

Thanks a ton!
Successfully installed Caffe!
The detailed instructions were very informative and useful.

I am facing a problem during installation. But first I want to give some details about my system.

1/ My OS is Ubuntu 16.04. I am using Anaconda3 and trying to install Caffe in a virtual environment (in my home folder the Anaconda folder name is anaconda3 and the virtual env path is /home/atif/anaconda3/envs).
2/ The installed Python version here is 3.6.

Python 3.6.6 | packaged by conda-forge | (default, Oct 12 2018, 14:08:43) 
[GCC 4.8.2 20140120 (Red Hat 4.8.2-15)] on linux
Type "help", "copyright", "credits" or "license" for more information.

3/ Opencv version 3.4.2

>>> import cv2
>>> cv2.__version__
'3.4.2'

My question is: is it possible to install Caffe in a venv? If yes, which lines do I have to change in the Makefile.config excerpt below?

1. Uncomment (No space in the beginning): 
CPU_ONLY := 1

2. Change:
BLAS := atlas to BLAS := open

3. Comment out:
PYTHON_INCLUDE := /usr/include/python2.7 \
        /usr/lib/python2.7/dist-packages/numpy/core/include
   
4. Uncomment:
ANACONDA_HOME := $(HOME)/anaconda2

PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
    $(ANACONDA_HOME)/include/python2.7 \
    $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include 

5. Comment:
PYTHON_LIB := /usr/lib

6. Uncomment:
PYTHON_LIB := $(ANACONDA_HOME)/lib

7. Uncomment:
USE_PKG_CONFIG := 1

My guess is:
1/ ANACONDA_HOME := $(HOME)/anaconda3/envs/venv
2/ 2.7 will become 3.6

But I am not sure.

I would be very happy if you could help me.

@riyaj8888

I have completed the above process and everything went well, but import caffe gives an error.

@DStickley

Before executing make all, edit Makefile.config as follows:

+INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial/
+LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu/hdf5/serial/

THANK YOU! I was getting an issue during make where the error showed that the hdf5 files did not exist; this fixed it.

@GsHeri

GsHeri commented Jul 22, 2019

Before executing make all, edit Makefile.config as follows:

+INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial/
+LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu/hdf5/serial/

THANK YOU! I was getting an issue during make where the error showed that the hdf5 files did not exist; this fixed it.

Same for me; luckily he said to check the comments. Thanks, man! :)

@israrbacha

israrbacha commented Oct 1, 2019

I created a conda environment for Caffe and installed Caffe successfully, but tensorflow-gpu=1.4 wouldn't install in the same environment due to a package conflict. Can anyone help me?

@arundasan91
Author

arundasan91 commented Oct 2, 2019

@everyone, this tutorial is pretty old now. I will try to update it in the coming weeks as I get some free time. Ubuntu 16.04 and Ubuntu 18.04 install instructions to follow.

@ziggyjosh16

Why are you using sudo make with conda environments?

@Neelam96

I am getting stuck at the "sudo make all -j4" step; it gives me the following kind of error.
Please look into it, I am a complete beginner in Linux.

CXX .build_release/src/caffe/proto/caffe.pb.cc
CXX src/caffe/layer_factory.cpp
CXX src/caffe/solvers/nesterov_solver.cpp
CXX src/caffe/solvers/sgd_solver.cpp
In file included from /usr/include/c++/4.8/cstdint:35:0, from /home/neelam/anaconda2/include/google/protobuf/stubs/port.h:35, from /home/neelam/anaconda2/include/google/protobuf/stubs/common.h:46, from .build_release/src/caffe/proto/caffe.pb.h:9, from .build_release/src/caffe/proto/caffe.pb.cc:5:
/usr/include/c++/4.8/bits/c++0x_warning.h:32:2: error: #error This file requires compiler and library support for the ISO C++ 2011 standard. This support is currently experimental, and must be enabled with the -std=c++11 or -std=gnu++11 compiler options.
 #error This file requires compiler and library support for the \
 ^
In file included from /home/neelam/anaconda2/include/google/protobuf/stubs/common.h:46:0, from .build_release/src/caffe/proto/caffe.pb.h:9, from .build_release/src/caffe/proto/caffe.pb.cc:5:
/home/neelam/anaconda2/include/google/protobuf/stubs/port.h:114:2: error: #error "Protobuf requires at least C++11."
 #error "Protobuf requires at least C++11."
 ^
In file included from .build_release/src/caffe/proto/caffe.pb.cc:5:0:
.build_release/src/caffe/proto/caffe.pb.h:17:2: error: #error This file was generated by an older version of protoc which is
 #error This file was generated by an older version of protoc which is
 ^
.build_release/src/caffe/proto/caffe.pb.h:18:2: error: #error incompatible with your Protocol Buffer headers. Please
 #error incompatible with your Protocol Buffer headers. Please
 ^
.build_release/src/caffe/proto/caffe.pb.h:19:2: error: #error regenerate this file with a newer version of protoc.
 #error regenerate this file with a newer version of protoc.
 ^
In file included from /home/neelam/anaconda2/include/google/protobuf/arena.h:55:0, from /home/neelam/anaconda2/include/google/protobuf/arenastring.h:41, from /home/neelam/anaconda2/include/google/protobuf/any.h:37, from /home/neelam/anaconda2/include/google/protobuf/generated_message_util.h:49, from .build_release/src/caffe/proto/caffe.pb.h:22, from .build_release/src/caffe/proto/caffe.pb.cc:5:
/home/neelam/anaconda2/include/google/protobuf/arena_impl.h:375:3: warning: identifier ‘static_assert’ is a keyword in C++11 [-Wc++0x-compat]
 static_assert(kBlockHeaderSize % 8 == 0,
 ^
In file included from /home/neelam/anaconda2/include/google/protobuf/arenastring.h:41:0, from /home/neelam/anaconda2/include/google/protobuf/any.h:37, from /home/neelam/anaconda2/include/google/protobuf/generated_message_util.h:49, from .build_release/src/caffe/proto/caffe.pb.h:22, from .build_release/src/caffe/proto/caffe.pb.cc:5:
/home/neelam/anaconda2/include/google/protobuf/arena.h:440:19: warning: identifier ‘decltype’ is a keyword in C++11 [-Wc++0x-compat]
 std::is_same<Arena*, decltype(std::declval<const U>()
 ^
In file included from /home/neelam/anaconda2/include/google/protobuf/stubs/common.h:46:0, from .build_release/src/caffe/proto/caffe.pb.h:9, from .build_release/src/caffe/proto/caffe.pb.cc:5:
/home/neelam/anaconda2/include/google/protobuf/stubs/port.h:127:9: error: ‘uint8_t’ does not name a type
 typedef uint8_t uint8;
 ^
/home/neelam/anaconda2/include/google/protobuf/stubs/port.h:128:9: error: ‘uint16_t’ does not name a type
 typedef uint16_t uint16;
 ^
/home/neelam/anaconda2/include/google/protobuf/stubs/port.h:129:9: error: ‘uint32_t’ does not name a type
 typedef uint32_t uint32;
 ^
/home/neelam/anaconda2/include/google/protobuf/stubs/port.h:130:9: error: ‘uint64_t’ does not name a type
 typedef uint64_t uint64;
 ^
/home/neelam/anaconda2/include/google/protobuf/stubs/port.h:136:14: error: ‘uint32’ does not name a type
 static const uint32 kuint32max = 0xFFFFFFFFu;
 ^
/home/neelam/anaconda2/include/google/protobuf/stubs/port.h:137:14: error: ‘uint64’ does not name a type
 static const uint64 kuint64max = PROTOBUF_ULONGLONG(0xFFFFFFFFFFFFFFFF);

@terminalh2t3

terminalh2t3 commented Jan 16, 2020

@Neelam96
I saw you are using anaconda2 with protobuf installed.
Just try conda uninstall protobuf and build again

@gracec10

gracec10 commented May 25, 2020

If you're getting this error:
/usr/bin/ld: cannot find -lhdf5_hl
/usr/bin/ld: cannot find -lhdf5
create symbolic links:
sudo ln -s libhdf5_serial_hl.so.10.0.2 libhdf5_hl.so
sudo ln -s libhdf5_serial.so.10.1.0 libhdf5.so
This is for Ubuntu 16.04. Visit /usr/lib/x86_64-linux-gnu/ and list the contents to find your files.

@Jaluco

Jaluco commented Dec 13, 2021

I get the following error:
Err:21 http://ppa.launchpad.net/mc3man/trusty-media/ubuntu focal Release
404 Not Found [IP: 91.189.95.85 80]
Hit:18 https://packagecloud.io/slacktechnologies/slack/debian jessie InRelease
Reading package lists... Done
E: The repository 'http://ppa.launchpad.net/mc3man/trusty-media/ubuntu focal Release' does not have a Release file.
N: Updating from such a repository can't be done securely, and is therefore disabled by default.
N: See apt-secure(8) manpage for repository creation and user configuration details.
I don't really know what to do.
