Last active September 23, 2022
Building TensorFlow 1.3.0-rc1 for Raspberry Pi/Ubuntu 16.04: a Step-By-Step Guide

Here you'll learn how to build TensorFlow for the Raspberry Pi 3, either with the Python API or as a standalone shared library that can be called from the C++ API and, eventually, wrapped for use in other languages.

For the C++ library, this tutorial will show you how to extract the TensorFlow library and headers for use in any environment you want.

(This tutorial wouldn't have been possible without the help of the people in the References section.)

What You Need

  • Raspberry Pi 2 or 3 Model B
  • An SD card running Raspbian with several GB of free space
    • An 8 GB card with a fresh install of Raspbian does not have enough space. A 16 GB SD card minimum is recommended.
    • These instructions may work on Linux distributions other than Raspbian
  • Internet connection to the Raspberry Pi
  • A USB memory drive that can be installed as swap memory (if it is a flash drive, make sure you don't care about the drive). Anything over 1 GB should be fine
  • A fair amount of time


These instructions were written for a Raspberry Pi 3 Model B running a vanilla copy of Ubuntu MATE (Xenial). They appear to work on the Raspberry Pi 2 as well, though some kinks are still being worked out. If these instructions work on other distributions, let me know!

Here's the basic plan: build a RPi-friendly version of Bazel and use it to build TensorFlow.


  1. Install basic dependencies
  2. Install USB Memory as Swap
  3. Build Bazel
  4. Compiling TensorFlow
  5. Cleaning Up
  6. References

The Build

1. Install basic dependencies

First, update apt-get to make sure it knows where to download everything.

sudo apt-get update

Next, install some base dependencies and tools we'll need later.

For Bazel:

sudo apt-get install pkg-config zip g++ zlib1g-dev unzip default-jdk autoconf automake libtool

For TensorFlow:

# For Python 2.7
sudo apt-get install python-pip python-numpy swig python-dev
sudo pip install wheel

# For Python 3.3+
sudo apt-get install python3-pip python3-numpy swig python3-dev
sudo pip3 install wheel

To be able to take advantage of certain optimization flags:

sudo apt-get install gcc-4.8 g++-4.8
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-4.8 100
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-4.8 100

Finally, for cleanliness, make a directory that will hold the Protobuf, Bazel, and TensorFlow repositories.

mkdir tf
cd tf

2. Install a Memory Drive as Swap for Compiling

In order to successfully build TensorFlow, your Raspberry Pi needs a little more memory to fall back on. Fortunately, this process is pretty straightforward. Grab a USB storage drive with at least 1 GB of capacity. I used a flash drive I could live without that carried no important data. That said, we're only going to use the drive as swap while we compile, so this process shouldn't do too much damage to a relatively new USB drive.

First, insert your USB drive and find the /dev/XXX path for the device.

sudo blkid

As an example, my drive's path was /dev/sda1

Once you've found your device, unmount it by using the umount command.

sudo umount /dev/XXX

Zero out your USB drive with the following command, replacing /dev/XXX with your device. Warning: this destroys everything on the drive. The swap area will be 2 GiB: 1024-byte blocks × 2097152 blocks = 2 GiB.

sudo dd if=/dev/zero of=/dev/XXX bs=1024 count=2097152
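As a quick sanity check on those dd arguments (a tiny sketch, nothing Pi-specific): the block size times the block count gives the swap size in bytes.

```shell
# dd writes bs-byte blocks, count times; verify that 1024 * 2097152 is 2 GiB.
bs=1024
count=2097152
echo "$(( bs * count / 1024 / 1024 )) MiB"   # prints "2048 MiB"
```

Adjust count if you want a different swap size (e.g. count=1048576 for 1 GiB).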

Find the device again with this command:

sudo fdisk -l

Then set up a swap area on the device:

sudo mkswap /dev/XXX

If the previous command output an alphanumeric UUID, copy it now. Otherwise, find the UUID by running blkid again and copy the UUID associated with /dev/XXX:

sudo blkid

Now edit your /etc/fstab file to register your swap device. (I'm a Vim guy, but Nano is installed by default.)

sudo nano /etc/fstab

On a separate line, enter the following information, replacing the X's with the UUID (without quotes):

UUID=XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX none swap sw,pri=5 0 0
Save /etc/fstab, exit your text editor, and run the following command:

sudo swapon -a

If you get an error claiming it can't find your UUID, go back and edit /etc/fstab. Replace the UUID=XXX.. bit with the original /dev/XXX information.

sudo nano /etc/fstab
# Replace the UUID with /dev/XXX
/dev/XXX none swap sw,pri=5 0 0
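If you'd rather not hand-type that entry, here's a minimal sketch (the make_swap_entry helper and the placeholder identifier are mine, not part of the guide) that formats the fstab line from whatever identifier you have, UUID or device path:

```shell
# Format an fstab swap entry from a device identifier (UUID=... or /dev/XXX).
make_swap_entry() {
  printf '%s none swap sw,pri=5 0 0\n' "$1"
}

make_swap_entry "/dev/XXX"   # prints "/dev/XXX none swap sw,pri=5 0 0"
```

Something like `make_swap_entry "UUID=<your-uuid>" | sudo tee -a /etc/fstab` then appends it in one go.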

Alright! You've got swap! Don't throw out the /dev/XXX information yet; you'll need it to remove the device safely later on.

3. Build Bazel

To build Bazel, we need to download a zip file containing a distribution archive from Bazel's GitHub releases page. Let's do that now and extract it into a new directory called bazel (the 0.5.2+ releases end with an error):

unzip -d bazel bazel-<version>-dist.zip

Once it's done downloading and extracting, we can move into the directory to make a few changes:

cd bazel

Now we need to make every file in the bazel project writable:

sudo chmod -R u+w ./*

Before building Bazel, we need to set the javac maximum heap size for this job, or else we'll get an OutOfMemoryError. To do this, we need to make a small addition to bazel/scripts/bootstrap/ (shout-out to @SangManLINUX for pointing this out).

nano scripts/bootstrap/

Move down to line 117, where you'll see the following block of code:

run "${JAVAC}" -classpath "${classpath}" -sourcepath "${sourcepath}" \
      -d "${output}/classes" -source "$JAVA_VERSION" -target "$JAVA_VERSION" \
      -encoding UTF-8 "@${paramfile}"

At the end of this block, add in the -J-Xmx500M flag, which sets the maximum size of the Java heap to 500 MB:

run "${JAVAC}" -classpath "${classpath}" -sourcepath "${sourcepath}" \
      -d "${output}/classes" -source "$JAVA_VERSION" -target "$JAVA_VERSION" \
      -encoding UTF-8 "@${paramfile}" -J-Xmx500M
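If you prefer not to edit the script by hand, a sed substitution can append the flag. This is a sketch demonstrated on a sample line; for the real edit you'd run sed -i on the actual bootstrap script (its path is abbreviated in this guide):

```shell
# Append -J-Xmx500M after the "@${paramfile}" argument of the javac invocation.
# Single quotes keep ${paramfile} literal for both printf and sed.
printf '%s\n' '      -encoding UTF-8 "@${paramfile}"' \
  | sed 's|"@${paramfile}"|& -J-Xmx500M|'
```

The `&` in the replacement re-emits the whole matched text, so the flag lands after the original argument.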

Now we can build Bazel! Warning: This takes a really, really long time. Several hours.


When the build finishes, you end up with a new binary, output/bazel. Copy that to your /usr/local/bin directory.

sudo cp output/bazel /usr/local/bin/bazel

To make sure it's working properly, run bazel on the command line and verify it prints help text. Note: this may take 15-30 seconds to run, so be patient!


Usage: bazel <command> <options> ...

Available commands:
  analyze-profile     Analyzes build profile data.
  build               Builds the specified targets.
  canonicalize-flags  Canonicalizes a list of bazel options.
  clean               Removes output files and optionally stops the server.
  dump                Dumps the internal state of the bazel server process.
  fetch               Fetches external repositories that are prerequisites to the targets.
  help                Prints help for commands, or the index.
  info                Displays runtime info about the bazel server.
  mobile-install      Installs targets to mobile devices.
  query               Executes a dependency graph query.
  run                 Runs the specified target.
  shutdown            Stops the bazel server.
  test                Builds and runs the specified test targets.
  version             Prints version information for bazel.

Getting more help:
  bazel help <command>
                   Prints help and options for <command>.
  bazel help startup_options
                   Options for the JVM hosting bazel.
  bazel help target-syntax
                   Explains the syntax for specifying targets.
  bazel help info-keys
                   Displays a list of keys used by the info command.

Move out of the bazel directory, and we'll move on to the next step.

cd ..

4. Compiling TensorFlow

First things first, clone the TensorFlow repository and move into the newly created directory.

git clone --recurse-submodules
cd tensorflow

Once in the directory, check out the latest stable TensorFlow branch:

git checkout v1.3.0-rc1

Now we have to run a nifty one-liner that is incredibly important. The next command goes through all files and changes references to 64-bit library paths (which we don't have access to) into their 32-bit counterparts. Neat!

grep -Rl 'lib64' | xargs sed -i 's/lib64/lib/g'
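To see what that substitution actually does, here's the same sed expression run on a sample path (illustrative only; grep -Rl just lists every file containing the string so sed can rewrite each one in place):

```shell
# Rewrite a 64-bit library path reference to its 32-bit counterpart.
echo '/usr/lib64/libexample.so' | sed 's/lib64/lib/g'   # prints "/usr/lib/libexample.so"
```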

Next, we need to delete a particular line in tensorflow/core/platform/platform.h. Open up the file in your favorite text editor:

nano tensorflow/core/platform/platform.h

Now, scroll down toward the bottom and delete the line containing #define IS_MOBILE_PLATFORM inside the following branch (around line 48):

#elif defined(__arm__)

This keeps our Raspberry Pi device (which has an ARM CPU) from being recognized as a mobile device.

Finally, we have to use a more recent version of the Eigen dependency, as the default one fails on ARM devices.

nano tensorflow/workspace.bzl

Around line 147 modify this code:

      name = "eigen_archive",
      urls = [
      sha256 = "ca7beac153d4059c02c8fc59816c82d54ea47fe58365e8aded4082ded0b820c4",
      strip_prefix = "eigen-eigen-f3a22f35b044",
      build_file = str(Label("//third_party:eigen.BUILD")),

to (be careful: the indentation is important):

      name = "eigen_archive",
      urls = [
      sha256 = "4286e8f1fabbd74f7ec6ef8ef31c9dbc6102b9248a8f8327b04f5b68da5b05e1",
      strip_prefix = "eigen-eigen-5a0156e40feb",
      build_file = str(Label("//third_party:eigen.BUILD")),

and around line 198:

      name = "gemmlowp",
      urls = [
      sha256 = "75d40ea8e68b0d1644f052fffe8f14a410b2a73d40ccb859a95c0578d194ec26",
      strip_prefix = "gemmlowp-a6f29d8ac48d63293f845f2253eccbf86bc28321",

to:

      name = "gemmlowp",
      urls = [
      sha256 = "88b8ce51917da751219e897edcc145e9a69a2295a1accaa23c2f20f49ce80823",
      strip_prefix = "gemmlowp-7f782be8a053c03ffbeeacef6f76e30002819432",
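If you'd rather script these workspace.bzl edits than make them by hand, sed substitutions over the old values work too. This sketch shows the Eigen swap on a sample line; for the real file you'd run sed -i on tensorflow/workspace.bzl, and repeat for the sha256 and gemmlowp values shown above:

```shell
# Rewrite the old Eigen archive reference to the newer one (values from above).
printf '%s\n' '      strip_prefix = "eigen-eigen-f3a22f35b044",' \
  | sed 's/eigen-eigen-f3a22f35b044/eigen-eigen-5a0156e40feb/'
```

Because each old hash appears only in its own dependency stanza, a plain substitution is safe here.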

Now let's configure the build:

./configure
Please specify the location of python. [Default is /usr/bin/python]: /usr/bin/python
Do you wish to build TensorFlow with MKL support? [y/N] N
No MKL support will be enabled for TensorFlow
Please specify optimization flags to use during compilation when bazel option "--config=opt" is specified [Default is -march=native]:
Do you wish to use jemalloc as the malloc implementation? [Y/n] Y
jemalloc enabled
Do you wish to build TensorFlow with Google Cloud Platform support? [y/N] N
No Google Cloud Platform support will be enabled for TensorFlow
Do you wish to build TensorFlow with Hadoop File System support? [y/N] N
No Hadoop File System support will be enabled for TensorFlow
Do you wish to build TensorFlow with the XLA just-in-time compiler (experimental)? [y/N] N
No XLA support will be enabled for TensorFlow
Do you wish to build TensorFlow with VERBS support? [y/N] N
No VERBS support will be enabled for TensorFlow
Do you wish to build TensorFlow with OpenCL support? [y/N] N
No OpenCL support will be enabled for TensorFlow
Do you wish to build TensorFlow with CUDA support? [y/N] N
No CUDA support will be enabled for TensorFlow
Do you wish to build TensorFlow with MPI support? [y/N] N
MPI support will not be enabled for TensorFlow
Configuration finished

Note: if you want to build for Python 3, specify /usr/bin/python3 for Python's location and /usr/local/lib/python3.5/dist-packages for the Python library path.

Now we can use it to build TensorFlow! Warning: This takes a really, really long time. Several hours.

To build it with the Python API, run:

bazel build -c opt --copt="-mfpu=neon-vfpv4" --copt="-funsafe-math-optimizations" --copt="-ftree-vectorize" --copt="-fomit-frame-pointer" --local_resources 1024,1.0,1.0 --verbose_failures tensorflow/tools/pip_package:build_pip_package

Or, if you only want to build the shared library for use from C++ (for bindings in other languages you need to compile the C interface with // instead):

bazel build -c opt --copt="-mfpu=neon-vfpv4" --copt="-funsafe-math-optimizations" --copt="-ftree-vectorize" --copt="-fomit-frame-pointer" --local_resources 1024,1.0,1.0 --verbose_failures //

When you wake up the next morning and it's finished compiling, you're in the home stretch!

If you chose the python pip compilation

Use the built binary file to create a Python wheel.

bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg

And then install it!

sudo pip install /tmp/tensorflow_pkg/tensorflow-1.3.0-cp27-none-linux_armv7l.whl

If you chose the shared lib compilation for C++

Here we'll move the shared libraries and headers into a test folder instead of copying them system-wide. To do so, let's create our project structure:

mkdir ../tf_test

then open test.cpp:

nano ../tf_test/test.cpp

and copy this code:

#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/tensor.h"

int main() {
  using namespace tensorflow;
  using namespace tensorflow::ops;
  Scope root = Scope::NewRootScope();
  // Matrix A = [3 2; -1 0]
  auto A = Const(root, { {3.f, 2.f}, {-1.f, 0.f}});
  // Vector b = [3 5]
  auto b = Const(root, { {3.f, 5.f}});
  // v = Ab^T
  auto v = MatMul(root.WithOpName("v"), A, b, MatMul::TransposeB(true));
  std::vector<Tensor> outputs;
  ClientSession session(root);
  // Run and fetch v
  TF_CHECK_OK(session.Run({v}, &outputs));
  // Expect outputs[0] == [19; -3]
  LOG(INFO) << outputs[0].matrix<float>();
  return 0;
}

Before moving on we also need to compile the protobuf dependencies with:

mkdir /tmp/proto
cd tensorflow/contrib/makefile/downloads/protobuf/
./configure --prefix=/tmp/proto/
make install

Same thing for eigen:

mkdir /tmp/eigen
cd ../eigen
mkdir build_dir
cd build_dir
cmake -DCMAKE_INSTALL_PREFIX=/tmp/eigen/ ../
make install
cd ../../../../../..

Now copy the libraries to the project folder:

mkdir ../tf_test/lib
cp bazel-bin/tensorflow/libtensorflow_cc.so ../tf_test/lib/
cp /tmp/proto/lib/libprotobuf.a ../tf_test/lib/

Then the include files:

mkdir -p ../tf_test/include/tensorflow
cp -r bazel-genfiles/* ../tf_test/include/
cp -r tensorflow/cc ../tf_test/include/tensorflow
cp -r tensorflow/core ../tf_test/include/tensorflow
cp -r third_party ../tf_test/include
cp -r /tmp/proto/include/* ../tf_test/include
cp -r /tmp/eigen/include/eigen3/* ../tf_test/include

Finally compile the test file with:

cd ../tf_test/
g++ -std=c++11 -Iinclude -Llib test.cpp -ltensorflow_cc -o exec


5. Cleaning Up

There's one last bit of house-cleaning we need to do before we're done: remove the USB drive that we've been using as swap.

First, turn off your drive as swap:

sudo swapoff /dev/XXX

Finally, remove the line you wrote in /etc/fstab referencing the device

sudo nano /etc/fstab

Then reboot your Raspberry Pi.

And you're done! You deserve a break.

6. References

  1. Building TensorFlow for Jetson TK1 (an update to this post is available here)
  2. Turning a USB Drive into swap
  3. Safely removing USB swap space
  4. Original guide
  5. Tensorflow for Go on rpi3
  6. Compiling the C++ interface
  7. Compiling for Tensorflow C++ API
  8. Mfpu neon issue thread

ERROR: Could not build Bazel

please help me

Thank you @EKami for this wonderful guide. I took your base and tried it with the latest bazel/tensorflow these days. I wrote down my differences in this tutorial, where I point back to this one. Once again, thanks for this guide!

EKami commented Jul 17, 2019 via email

alihydri commented Mar 3, 2020

Thank you for the comprehensive guide on building TensorFlow.
My goal was to take advantage of the NEON SIMD on ARM. But unfortunately, I got the following error:
gcc: error: unrecognized command line option '-mfpu=neon-vfpv4'
I deleted the --copt="-mfpu=neon-vfpv4" and it worked fine! I was wondering if you are able to compile with the NEON engine enabled.

thanks!! I ended up having to use env EXTRA_BAZEL_ARGS="--host_javabase=@local_jdk//:jdk" bash ./ for building bazel on a pi 4

AaEll commented Feb 6, 2021

Hey y'all, default-jdk is now Java 11, which is incompatible with bazel. Please update the gist from my fork!

Dogamo commented May 7, 2022

hi, thank you for your guide. I ran into a problem while trying your way: when I run "./" while building bazel, it shows "ERROR: Cannot determine JDK version, please set $JAVA_HOME. $JAVAC_VERSION is 'javac 11.0.14'".
Could you give me the solution please?

EKami commented May 7, 2022

@Dogamo You need to install java jdk and use export JAVA_HOME=<path to your jdk installation>

Dogamo01 commented May 8, 2022

@Dogamo You need to install java jdk and use export JAVA_HOME=<path to your jdk installation>

Thank you for your reply! I solved the problem by reinstalling a 64-bit OS on my Raspberry Pi and following the guide on YouTube, link below

felipezora commented Sep 15, 2022

Thank you for the comprehensive guide on building the Tensorflow. My goal was to take advantage of the NEON SIMD on ARM. But unfortunately, I got the following error: gcc: error: unrecognized command line option '-mfpu=neon-vfpv4' I deleted the --copt="-mfpu=neon-vfpv4" and it worked fine! I was wondering if you are able to compile with the neon engine enabled.

Hi Ali! I am trying to build TFLite in a way so it will not use the neon engine, when you delete this line then your TF does not use the neon accelerator anymore?
