
Autonomous Mobile Robot (Experimental)

Introduction

The aim of this project is to create a mobile robot that can navigate an indoor environment. The robot can scan and save its surroundings as an occupancy grid; based on that map, it can then localize and navigate itself. With these capabilities, the robot can plan a path from A to B on the global (scanned) map as instructed via Rviz, detect unknown obstacles during operation, and plan a local path to robustly avoid them and complete the navigation task.

Link to GitHub: https://github.com/anindex/navigation-processing-base

Building the robot (Hardware part)

Devices and equipment

Here is the list (with links) of the robot parts available in Vietnam (hshop.vn):

  • Round robot chassis link. Price: 132.000VND
  • Arduino Uno or Mega link. Price: 140.000VND
  • Raspberry Pi 3 link. Price: 1.700.000VND
  • DC Motor GA25 V1 (x2) link. Price: 440.000VND
  • Motor GA25 fixed frame (x2) link. Price: 40.000VND
  • Motor Wheel hexa coupling 12mm (x2) link. Price: 260.000VND
  • Arduino motor shield driver link. Price: 220.000VND
  • Hokuyo laser scanner link. Price: ~900USD (can be replaced by this lower-spec RPLIDAR A1 laser scanner: link)
  • Battery for Raspi link. Price: 175.000VND
  • Battery for motors (x3) link. Price: 465.000VND
  • Auxiliary parts: wiring, screws, switch. Price: ~200.000VND

Total price (excluding the laser scanner): ~3.800.000VND

Robot model

This round robot is adapted from the design of the Kobuki robot: http://kobuki.yujinrobot.com/ . The robot uses a differential drive mechanism to move around the environment. Intuitively, turning is achieved through the difference between the angular velocities of the two wheels: to drive straight, both wheels must turn at the same angular velocity in the same direction, and to rotate in place, they must turn at the same angular velocity in opposite directions.
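As a quick sketch, the standard differential drive relations are (v is the robot's forward speed, w its turning rate, r the wheel radius, L the distance between the two wheels, and w_left/w_right the wheel angular velocities; these symbols are ours, not taken from the repository):

v = r * (w_right + w_left) / 2
w = r * (w_right - w_left) / L

Equal wheel velocities give w = 0 (driving straight), while equal and opposite wheel velocities give v = 0 (in-place rotation).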

More details about the kinematic model of the robot will be covered in another topic.

Robot assembly

More details about how to assemble the robot will be covered in another topic.

Installing and setting up the robot (Software part)

The robot at EyeQ already has all the necessary requirements installed and set up. The following guide is for reference when building the robot from scratch.

Raspberry Pi

Developers will need a screen, keyboard, and mouse plugged into the Raspi for these setup steps. Alternatively, we can connect to the Raspi remotely over SSH on the local network. The Raspi needs to be connected to the internet.

Follow the instructions at https://ubuntu-mate.org/raspberry-pi/ to install Ubuntu MATE on the Raspi 3. This OS is essential because we use the Robot Operating System (ROS) as middleware, which lets modules run in parallel, exchange sensor data and other packets with ease, and integrate easily with more intelligent modules (scalability).

After installing Ubuntu MATE on the Raspi, we need to install ROS; please follow these instructions: http://wiki.ros.org/kinetic/Installation/Ubuntu . Remember to create a ROS workspace by following this tutorial: http://wiki.ros.org/ROS/Tutorials/InstallingandConfiguringROSEnvironment .
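In short, creating the workspace boils down to the following (the workspace name catkin_ws is just an example; any name works):

# Create an empty catkin workspace and build it once so devel/setup.bash exists
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin_make
source devel/setup.bash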

After installing ROS, we need to install the dependencies for our navigation package. Open a terminal on the Raspi and run the following commands to install our navigation package:

# Update the system and install the ROS packages the navigation stack depends on
sudo apt-get update && sudo apt-get upgrade
sudo apt-get install ros-kinetic-hector-slam ros-kinetic-navigation ros-kinetic-map-server ros-kinetic-rosserial ros-kinetic-rosserial-arduino git
# Clone the navigation package into the workspace, build it, and source the result
cd ~/<YOUR_WORKSPACE>/src
git clone https://github.com/anindex/navigation-processing-base processing_base
cd .. && catkin_make
source ~/<YOUR_WORKSPACE>/devel/setup.bash

The robot requires a network setup before we can run it: we tell the Raspi the identity of our laptop so it can transfer stats for monitoring. Log in to your router to look up your laptop's IP and hostname, append that info to /etc/hosts, then issue this command in your terminal: echo "export ROS_MASTER_URI=http://localhost:11311" >> ~/.bashrc
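For example, the appended /etc/hosts entry and the environment setup on the Raspi could look like the following (the laptop hostname and IP address are placeholders; use the values found on your router):

# On the Raspi: make the laptop's hostname resolvable (placeholder values)
echo "192.168.1.10   my-laptop" | sudo tee -a /etc/hosts
# The ROS master runs on the Raspi itself, so it points at localhost
echo "export ROS_MASTER_URI=http://localhost:11311" >> ~/.bashrc
source ~/.bashrc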

Laptop

Developers must install Ubuntu 16.04 on their laptop and follow the same instructions as for the Raspi. However, on the laptop side, log in to your router to look up the Raspi's IP and hostname, append that info to /etc/hosts, then issue this command in your terminal: echo "export ROS_MASTER_URI=http://<RASPI_HOSTNAME>:11311" >> ~/.bashrc
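For example, assuming the Raspi's hostname is mobile (as used in the SSH command later in this guide) and its IP is 192.168.1.20 (a placeholder):

# On the laptop: resolve the Raspi's hostname and point ROS at the master running on it
echo "192.168.1.20   mobile" | sudo tee -a /etc/hosts
echo "export ROS_MASTER_URI=http://mobile:11311" >> ~/.bashrc
source ~/.bashrc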

Arduino

On your laptop, follow these instructions to install the Arduino IDE: https://www.arduino.cc/en/Main/Software . After installing the Arduino IDE, we must install the dependencies of our navigation package in order to upload code to the Arduino.

Dependencies:

To install ros_lib, please follow this tutorial: http://wiki.ros.org/rosserial_arduino/Tutorials/Arduino%20IDE%20Setup
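In short, that tutorial regenerates ros_lib inside the Arduino sketchbook's libraries folder; a minimal version looks like this (the ~/Arduino/libraries path is the IDE's default sketchbook location and may differ on your machine):

# Remove any stale ros_lib and regenerate it from the installed rosserial packages
cd ~/Arduino/libraries
rm -rf ros_lib
rosrun rosserial_arduino make_libraries.py .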

After installing all the dependencies, navigate to the arduino folder of our repository https://github.com/anindex/navigation-processing-base . Use the Arduino IDE to open the project MobileClientPID, then compile and upload it to the Arduino (you may need to google how to upload code to an Arduino). NOTE: if you receive a permission denied message when uploading code to the Arduino, issue this command: sudo chmod a+rw /dev/ttyACM0.
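A more permanent alternative to the chmod workaround (not part of the original guide) is to add your user to the dialout group, which owns the serial device on Ubuntu; log out and back in for it to take effect:

# Grant your user permanent access to serial devices such as /dev/ttyACM0
sudo usermod -a -G dialout $USER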

Robot demo

The robot at EyeQ is already installed and set up. However, you may need to set up the network hostnames on both the Raspi and the laptop so they can communicate. The following tutorial assumes that you have everything set up and running. Contact me for troubleshooting via email: eeit2015_an.lt@student.vgu.edu.vn or phone: +84933726401.

Robot hardware

To start, plug the power connector from the Raspi into the Raspi battery and switch on the motor power.

Raspi side

Connect to the Raspi via SSH:

ssh anindex@mobile
password: 18349275

roscore &
roslaunch processing_base start_navigation.launch
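To check that the launch came up correctly, you can list the active topics from another terminal; the /scan topic name is an assumption about this launch file rather than something confirmed from the repository:

# List the topics published by the navigation launch
rostopic list
# Stream the laser scan to confirm the laser scanner driver is publishing
rostopic echo /scan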

Laptop side

rosrun rviz rviz -d "$(rospack find processing_base)/rviz/navigation.rviz"

You will see the sensor data and other module data in the Rviz graphical interface. In the initial state, we must set the pose of the robot on the map (both position and orientation) by clicking the 2D Pose Estimate button on the top panel; we must ensure the laser scan points fit the global map (which means the pose estimate is correct). Then, for the navigation task, we set the goal for the robot on the map by clicking the 2D Nav Goal button on the top panel. Read this article for more detail: http://wiki.ros.org/navigation/Tutorials/Using%20rviz%20with%20the%20Navigation%20Stack
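If you prefer the terminal, a goal can also be published directly to the navigation stack; this assumes the launch file runs the standard move_base node from the ros-kinetic-navigation package, which listens on /move_base_simple/goal:

# Send a navigation goal in the map frame (x = 1.0 m, y = 0.5 m, facing along x)
rostopic pub --once /move_base_simple/goal geometry_msgs/PoseStamped \
  '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}'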

Congratulations! The robot now moves to the goal point.
