Mobile Robots – Blog 3 – TurtleBot 3


2/8/26

Introduction

The TurtleBot series was created to democratize robotics—its core objective was to provide an affordable, standardized, ROS‑native mobile robot that anyone could learn from, modify, and use for research. Its significance lies in becoming the world’s most widely adopted open‑source robotics platform for education and prototyping, shaping how modern ROS developers learn and build.

Each version aligned with major ROS milestones:

  • TurtleBot1 (2010–2011): First ROS‑ready consumer robot.
  • TurtleBot2 (2012): More robust base (Kobuki).
  • TurtleBot3 (2017): Modular, customizable, ROS1→ROS2 bridge.
  • TurtleBot4 (2022+): Fully ROS2‑native, modern sensors, Create 3 base.

With built‑in LiDAR, depth cameras (TurtleBot 4), and ROS2 support, modern TurtleBots allow students and researchers to work on:

  • SLAM
  • Navigation (Nav2)
  • Multi‑robot systems
  • Perception and AI
  • Human‑robot interaction

All without needing expensive industrial hardware.

The TurtleBot line remains significant because it:

  • Provides a reference architecture for ROS2‑native mobile robots.
  • Demonstrates best practices in sensor integration, power design, and software stack organization.
  • Offers a baseline platform for benchmarking navigation, SLAM, and perception algorithms.
  • Continues to evolve with ROS2 Humble, Jazzy, and Gazebo Sim support.

I purchased a TurtleBot 3 a few years ago for the reasons described above. After some initial use it sat on a shelf until recent robotics work made it relevant again. This blog describes the (rather long and tedious) process of updating the software and some testing of the available functions.

Setup Process

The following describes the process of setting up the TurtleBot 3. The document was generated by Sonnet 4.5 (Claude) as a summary of the actions taken.

OpenCR 1.0 Board Specifications:

Component     Specification
MCU           STM32F746ZGT6 (ARM Cortex-M7, 216 MHz)
Flash         1 MB
SRAM          320 KB
IMU           MPU9250 (3-axis gyro, 3-axis accel, 3-axis mag)
Motor Ports   4x Dynamixel TTL (3-pin)
Power         7-14 V input (battery), 5 V/3 A output
USB           Micro-B (programming/communication)
Buttons       2x push buttons (SW1, SW2)
LEDs          4x user LEDs
GPIO          Arduino-compatible headers

TurtleBot3 Setup Documentation

Raspberry Pi 4 | Ubuntu 22.04 | ROS2 Humble

1. Overview

This document describes the complete setup process for TurtleBot3 on a Raspberry Pi 4 running Ubuntu 22.04 with ROS2 Humble. The setup includes system configuration, ROS2 installation, TurtleBot3 packages, and hardware configuration.

Final Working Configuration

Component             Status      Notes
Ubuntu 22.04          ✓ Working   On Raspberry Pi 4
ROS2 Humble Desktop   ✓ Working   Full install with RViz2
TurtleBot3 Packages   ✓ Working   Built from source
OpenCR Firmware       ✓ Updated   Flashed via x86 PC
LiDAR (LDS-01)        ✓ Working   /dev/ttyUSB0
Motors                ✓ Working   Via OpenCR /dev/ttyACM0
Keyboard Teleop       ✓ Working   Full motor control

2. ROS2 Humble Installation

Step 2.1: Configure Locale

sudo apt update && sudo apt install -y locales

sudo locale-gen en_US en_US.UTF-8

sudo update-locale LC_ALL=en_US.UTF-8 LANG=en_US.UTF-8

export LANG=en_US.UTF-8

Step 2.2: Add ROS2 Repository

sudo apt install -y software-properties-common

sudo add-apt-repository -y universe

sudo apt update && sudo apt install -y curl gnupg lsb-release

sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -o /usr/share/keyrings/ros-archive-keyring.gpg

echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(source /etc/os-release && echo $UBUNTU_CODENAME) main" | sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null

Step 2.3: Install ROS2 Humble

sudo apt update && sudo apt upgrade -y

sudo apt install -y ros-humble-desktop

sudo apt install -y ros-dev-tools python3-colcon-common-extensions python3-rosdep

Step 2.4: Initialize rosdep

sudo rosdep init

rosdep update

Step 2.5: Add ROS2 to bashrc

echo "source /opt/ros/humble/setup.bash" >> ~/.bashrc

3. TurtleBot3 Package Installation

Step 3.1: Install Dependencies from apt

sudo apt install -y ros-humble-cartographer ros-humble-cartographer-ros

sudo apt install -y ros-humble-navigation2 ros-humble-nav2-bringup

sudo apt install -y ros-humble-dynamixel-sdk ros-humble-turtlebot3-msgs ros-humble-turtlebot3

Step 3.2: Create Workspace and Clone Packages

Important: Always source ROS2 before building.

source /opt/ros/humble/setup.bash

mkdir -p ~/turtlebot3_ws/src

cd ~/turtlebot3_ws/src

git clone -b humble https://github.com/ROBOTIS-GIT/DynamixelSDK.git

git clone -b humble https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git

git clone -b humble https://github.com/ROBOTIS-GIT/turtlebot3.git

git clone -b ros2-devel https://github.com/ROBOTIS-GIT/hls_lfcd_lds_driver.git

Step 3.3: Build Workspace

cd ~/turtlebot3_ws

rosdep install --from-paths src --ignore-src -r -y

colcon build --symlink-install --parallel-workers 2

Step 3.4: Configure Environment Variables

Add the following lines to ~/.bashrc:

source ~/turtlebot3_ws/install/setup.bash

export TURTLEBOT3_MODEL=burger

export LDS_MODEL=LDS-01

export ROS_DOMAIN_ID=30

export OPENCR_PORT=/dev/ttyACM0

4. OpenCR Firmware Update

Critical: The OpenCR board must be flashed with ROS2 Humble firmware. ARM64 (Raspberry Pi) flash tools are NOT available. You must use an x86 PC.

On an x86 PC (Windows/Mac/Linux):

wget https://github.com/ROBOTIS-GIT/OpenCR-Binaries/raw/master/turtlebot3/ROS2/latest/opencr_update.tar.bz2

tar -xvf opencr_update.tar.bz2

cd opencr_update

./update.sh /dev/ttyACM0 burger.opencr

Note: Replace 'burger' with 'waffle' or 'waffle_pi' if applicable.

5. TurtleBot3 Operation Commands

5.1 Basic Operation

Launch Robot (Terminal 1)

ros2 launch turtlebot3_bringup robot.launch.py

Keyboard Control (Terminal 2)

ros2 run turtlebot3_teleop teleop_keyboard

RViz Visualization (Terminal 3)

ros2 launch turtlebot3_bringup rviz2.launch.py

5.2 SLAM Mapping

Start SLAM with Cartographer

ros2 launch turtlebot3_cartographer cartographer.launch.py

Save Map

ros2 run nav2_map_server map_saver_cli -f ~/maps/my_map

5.3 Autonomous Navigation

ros2 launch turtlebot3_navigation2 navigation2.launch.py map:=$HOME/maps/my_map.yaml

5.4 Debugging Commands

Command                    Purpose
ros2 topic list            List all active topics
ros2 topic echo /scan      View LiDAR data
ros2 topic echo /cmd_vel   View velocity commands
ros2 topic echo /odom      View odometry data
ros2 node list             List active nodes

6. Issues Encountered and Solutions

Issue                                  Cause                           Solution
turtlebot3_simulations clone failed    Wrong branch: humble-devel      Use branch: humble
Build failed (ament_cmake not found)   ROS2 not sourced before build   Run: source /opt/ros/humble/setup.bash
Gazebo package not found               OSRF repository not added       Add OSRF Gazebo repository
Environment variables empty            .bashrc not properly sourced    Manually add to ~/.bashrc
OpenCR flash failed on ARM64           No ARM64 binaries available     Flash from x86 PC
Motors not moving                      OpenCR has old firmware         Flash ROS2 Humble firmware

7. System Information Commands

Information          Command
RAM usage            free -h
SD card storage      df -h
RPi model            cat /proc/device-tree/model
CPU info             cat /proc/cpuinfo | grep Model
Check serial ports   ls /dev/ttyACM* /dev/ttyUSB*

Turtlebot 3 Demos

The configuration process (next section) includes 10 demos that exercise different aspects of the TurtleBot 3 (Burger). The demos and some of their results are described in this section. Note that the output is often continuous and lengthy; it is truncated below for illustration. Command lines are indicated by >.

Demo 1 – Bringup – >ros2 launch turtlebot3_bringup robot.launch.py

[INFO] [launch]: All log files can be found below
/home/dennismiller/.ros/log/2026-02-07-17-25-26-278663-dennismiller-desktop-3651
[INFO] [launch]: Default logging verbosity is set to INFO
urdf_file_name : turtlebot3_burger.urdf ….

Demo 2 – Keyboard teleop

>ros2 launch turtlebot3_bringup robot.launch.py

>ros2 run turtlebot3_teleop teleop_keyboard

Control Your TurtleBot3!
—————————
Moving around:
         w
    a    s    d
         x

w/x : increase/decrease linear velocity (Burger : ~ 0.22, Waffle and
Waffle Pi : ~ 0.26)
a/d : increase/decrease angular velocity (Burger : ~ 2.84, Waffle and
Waffle Pi : ~ 1.82)

space key, s : force stop

CTRL-C to quit

currently:    linear velocity 0.01     angular velocity 0.0
currently:    linear velocity 0.02     angular velocity 0.0
currently:    linear velocity 0.03     angular velocity 0.0
currently:    linear velocity 0.04     angular velocity 0.0

Demo 3 – SLAM mapping

>ros2 launch turtlebot3_bringup robot.launch.py

>ros2 launch turtlebot3_cartographer cartographer.launch.py

[INFO] [launch]: All log files can be found below
/home/dennismiller/.ros/log/2026-02-07-19-56-40-958956-dennismiller-desktop-5068
[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [cartographer_node-1]: process started with pid [5070]
[INFO]

Demo 4 – Save map

>mkdir -p ~/maps

>ros2 run nav2_map_server map_saver_cli -f ~/maps/my_first_map

Saves two files:

  • ~/maps/my_first_map.yaml — map metadata
  • ~/maps/my_first_map.pgm — map image (grayscale)

View the map image – eog ~/maps/my_first_map.pgm
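The two files work together: the YAML stores the resolution (meters per pixel) and the world pose of the image's lower-left corner, which is enough to convert any pixel in the PGM into world coordinates. A minimal sketch of that conversion, using hypothetical metadata values (the real ones live in my_first_map.yaml):

```python
# Sketch: convert a map image pixel to world coordinates using the metadata
# map_saver_cli writes to the .yaml file. The resolution/origin/height values
# below are hypothetical examples, not taken from an actual saved map.

def pixel_to_world(row, col, resolution, origin, height):
    """Map image pixel (row, col) -> world (x, y) in meters.

    Image rows count down from the top, while the map origin is the world
    pose of the image's bottom-left corner, so rows are flipped.
    """
    x = origin[0] + (col + 0.5) * resolution
    y = origin[1] + (height - row - 0.5) * resolution
    return x, y

# Hypothetical metadata: 5 cm/pixel, origin at (-10, -10), 384-pixel-tall image
resolution = 0.05
origin = (-10.0, -10.0)
height = 384

# Center of the bottom-left pixel of the image
print(pixel_to_world(383, 0, resolution, origin, height))
```

This is the same arithmetic the Nav2 map server performs when it loads the saved map back in for localization and planning.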

Demo 5 – Autonomous navigation

>ros2 launch turtlebot3_bringup rviz2.launch.py

>ros2 launch turtlebot3_navigation2 navigation2.launch.py map:=$HOME/maps/my_first_map.yaml

This command starts navigation using the SLAM map from the prior demo. As it opened, the responsiveness of the RPi 4 was very low; it was difficult to tell whether selections registered. In the RViz header, '2D Pose Estimate' was selected and then the current position was clicked. '2D Goal Pose' was then selected and the desired position on the map was clicked. A path was calculated as shown. The TurtleBot's wheels were driven in an attempt to move it to the selected location. While the basic idea worked, the combination of slow response, selection uncertainty, and the physical setup didn't result in the robot moving as shown.

RViz Navigation Explained

What you’re seeing:

Color             What it is
Green particles   Robot's estimated positions (localization uncertainty)
Blue line         Planned global path to goal
Red lines/areas   Costmap: obstacles and inflation zones (keep-out areas)
Black areas       Walls/obstacles from your saved map
Gray areas        Free space
White areas       Unknown space

Demo 6 – ROS Topic list

>ros2 topic list

/battery_state
/cmd_vel
/imu
/joint_states
/magnetic_field
/odom
/parameter_events
/robot_description
/rosout
/scan
/sensor_state
/tf
/tf_static

>ros2 topic info /scan
Type: sensor_msgs/msg/LaserScan
Publisher count: 1
Subscription count: 0


>ros2 topic hz /scan
WARNING: topic [/scan] does not appear to be published yet
average rate: 4.985
    min: 0.200s max: 0.201s std dev: 0.00030s

window: 6 (Note: 'window' indicates the number of messages sampled; it's cumulative)
average rate: 4.984
    min: 0.200s max: 0.201s std dev: 0.00033s

window: 11
average rate: 4.984
    min: 0.199s max: 0.202s std dev: 0.00054s

…..
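For reference, the numbers that `ros2 topic hz` prints can be reproduced from message arrival times: the average rate is the inverse of the mean inter-arrival interval, and min/max/std dev describe those intervals, with the cumulative window simply counting messages. A sketch with made-up timestamps approximating the 5 Hz LDS-01:

```python
import statistics

# Sketch: reproduce the statistics 'ros2 topic hz' prints from a list of
# message arrival times (in seconds). These timestamps are invented for
# illustration, roughly matching a 5 Hz LiDAR.
arrivals = [0.000, 0.200, 0.401, 0.600, 0.801, 1.000]

intervals = [b - a for a, b in zip(arrivals, arrivals[1:])]
rate = 1.0 / (sum(intervals) / len(intervals))  # average rate in Hz

print(f"average rate: {rate:.3f}")
print(f"    min: {min(intervals):.3f}s max: {max(intervals):.3f}s "
      f"std dev: {statistics.pstdev(intervals):.5f}s")
print(f"window: {len(arrivals)}")  # cumulative message count
```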

Demo 7 – LIDAR data

>ros2 launch turtlebot3_bringup rviz2.launch.py

>ros2 topic echo /scan --once

header:
  stamp:
    sec: 1770571607
    nanosec: 524285835
  frame_id: base_scan
angle_min: 0.0
angle_max: 6.2657318115234375
angle_increment: 0.01745329238474369
time_increment: 0.0005592841189354658
scan_time: 0.20134228467941284
range_min: 0.11999999731779099
range_max: 3.5
ranges:
- 0.4399999976158142
- 0.43299999833106995
- 0.4300000071525574

…..

LIDAR is publishing at 5 Hz (normal for LDS-01)

Key fields:

Field                   Meaning
angle_min / angle_max   Scan coverage (0 to 6.28 rad = 360°)
range_min / range_max   Valid distance range (0.12 m to 3.5 m)
ranges                  Array of distances for each angle
intensities             Signal strength at each angle
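The angle of `ranges[i]` is `angle_min + i * angle_increment`, which is how any consumer of /scan turns the flat array into bearings. A sketch using the scan parameters above, with invented range values (readings outside [range_min, range_max] must be discarded):

```python
import math

# Sketch: interpret the /scan fields shown above. The parameters match the
# LDS-01 (~1 degree per beam); the range readings are invented.
angle_min = 0.0
angle_increment = 0.01745329  # radians, ~1 degree
range_min, range_max = 0.12, 3.5

ranges = [3.5, 0.44, 0.433, 0.43, 0.05]  # 0.05 is below range_min -> invalid

# Keep only valid readings, then find the nearest obstacle and its bearing.
valid = [(r, i) for i, r in enumerate(ranges) if range_min <= r <= range_max]
nearest, idx = min(valid)
bearing_deg = math.degrees(angle_min + idx * angle_increment)

print(f"nearest obstacle: {nearest} m at beam {idx} ({bearing_deg:.1f} deg)")
```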

Demo 8 – Odometry data

>ros2 launch turtlebot3_bringup robot.launch.py

>ros2 topic echo /odom --once

header:
  stamp:
    sec: 1770572248
    nanosec: 212764219
  frame_id: odom
child_frame_id: base_footprint
pose:
  pose:
    position:
      x: 15.919378026437194
      y: 0.33155781798783285
      z: 0.0
    orientation:
      x: 0.0
      y: 0.0
      z: 0.013453528466268046
      w: 0.9999094971905244
  covariance: [0.0, 0.0, 0.0, … (36 values, all 0.0; truncated)]

Covariance is a 6×6 matrix (36 values) representing uncertainty in the measurement; it tells other nodes how confident the odometry is. A value of 0.0 means 'unknown/not provided', ~0.001 indicates high confidence, and ~1 or more indicates low confidence.
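In code, the flattened array is indexed row-major over the six dimensions (x, y, z, roll, pitch, yaw), so the variance of dimension d sits at index d*6 + d. A sketch with invented values:

```python
# Sketch: index the flattened 6x6 pose covariance (row-major, 36 values).
# Dimension order: 0=x, 1=y, 2=z, 3=roll, 4=pitch, 5=yaw.
# The confidence values below are invented for illustration.
covariance = [0.0] * 36
covariance[0] = 0.001    # var(x): high confidence
covariance[7] = 0.001    # var(y): index = row*6 + col = 1*6 + 1
covariance[35] = 1.5     # var(yaw): index = 5*6 + 5, low confidence

def variance(cov, dim):
    """Return the diagonal (variance) entry for dimension dim (0=x .. 5=yaw)."""
    return cov[dim * 6 + dim]

print(variance(covariance, 0), variance(covariance, 5))
```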

Running the wheels with teleop resulted in:

Field             Value          Meaning
position.x        15.92 m        Robot traveled ~16 meters forward
position.y        0.33 m         Slight drift sideways
twist.linear.x    0.086 m/s      Currently moving forward slowly
twist.angular.z   -0.003 rad/s   Tiny right turn

Key fields:

Field             Meaning
pose.position.x   Distance traveled forward (meters)
pose.position.y   Distance traveled sideways (meters)
pose.orientation  Robot heading (quaternion)
twist.linear.x    Current forward velocity (m/s)
twist.angular.z   Current rotation rate (rad/s)
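Since a differential-drive robot only rotates about the z axis, the odometry quaternion reduces to its (z, w) components and the heading can be recovered as yaw = 2 * atan2(z, w). Applying that to the orientation values from the output above:

```python
import math

# Sketch: recover heading (yaw) from a planar robot's odometry quaternion.
# With x = y = 0 the general yaw formula reduces to yaw = 2 * atan2(z, w).
# Values taken from the Demo 8 /odom output.
z = 0.013453528466268046
w = 0.9999094971905244

yaw = 2.0 * math.atan2(z, w)
print(f"heading: {yaw:.4f} rad ({math.degrees(yaw):.2f} deg)")
```

A tiny positive yaw (under two degrees) is consistent with the "tiny right/left drift" the twist values show.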

Demo 9 – Rviz visualization

Stop ROS and restart – >ros2 launch turtlebot3_bringup robot.launch.py

>ros2 launch turtlebot3_bringup rviz2.launch.py

Rviz with LIDAR data points; color is intensity

Demo 10 – System Status

# ROS2 version: >echo $ROS_DISTRO (humble)

# Active nodes: >ros2 node list

# Active topics: >ros2 topic list

# Active services: >ros2 service list

# Sensor states: >ros2 topic echo /sensor_state --once

# RPI 4 RAM: >free -h

# RPI 4 SD storage: >df -h

# RPI 4 temperature: >cat /sys/class/thermal/thermal_zone0/temp (divide by 1000 for °C)

# Serial ports: >ls /dev/ttyACM0 /dev/ttyUSB0

ROS2 nodes:

>ros2 launch turtlebot3_bringup robot.launch.py

>ros2 node list

Node                    Purpose
/turtlebot3_node        Motor control, OpenCR communication
/hlds_laser_publisher   LiDAR driver (LDS-01)
/robot_state_publisher  Publishes robot URDF/transforms

When Running Teleop

Node               Purpose
/teleop_keyboard   Keyboard input → /cmd_vel

When Running SLAM (Cartographer)

Node                               Purpose
/cartographer_node                 SLAM processing
/cartographer_occupancy_grid_node  Generates map grid

When Running Navigation

Node                Purpose
/bt_navigator       Behavior tree executor
/controller_server  Path following
/planner_server     Path planning
/map_server         Serves the saved map
/amcl               Localization (particle filter)
/lifecycle_manager  Manages node states

When Running RViz

Node    Purpose
/rviz2  Visualization

Logitech Controller

Add Xbox/Microsoft Controller

Step 1: Install Xbox driver and joystick packages


# Install Xbox controller driver

>sudo apt install xboxdrv joystick jstest-gtk

# Install ROS2 joystick packages

>sudo apt install ros-humble-joy ros-humble-teleop-twist-joy

Step 2: Connect controller and verify


# Plug in Xbox controller via USB, then check

>ls /dev/input/js*

# Should show /dev/input/js0

# Test the controller

>jstest /dev/input/js0

# Or with GUI

>jstest-gtk

Step 3: Set permissions


# Add user to input group

>sudo usermod -aG input $USER

# Create udev rule for controller

>echo 'KERNEL=="js*", MODE="0666"' | sudo tee /etc/udev/rules.d/99-joystick.rules

>sudo udevadm control --reload-rules

>sudo udevadm trigger

Logout and login for group changes.

Step 4: Test with ROS2

Terminal 1 – Launch robot:


>ros2 launch turtlebot3_bringup robot.launch.py

Terminal 2 – Launch joystick node:


>ros2 run joy joy_node

Terminal 3 – Check joystick data:


>ros2 topic echo /joy

Move sticks and press buttons – you should see values change.

Step 5: Launch teleop with joystick

Terminal 2 – Replace joy_node with full teleop:


>ros2 launch teleop_twist_joy teleop-launch.py joy_config:='xbox'

Controller Mapping (Xbox):

Control                  Action
Left stick up/down       Forward/backward
Right stick left/right   Turn left/right
LB (left bumper)         Hold to enable movement (deadman switch)
RB (right bumper)        Hold for turbo speed
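Conceptually, the joy node publishes stick axes and button states on /joy, and teleop_twist_joy scales them into /cmd_vel, zeroing the command unless the deadman button is held. A simplified sketch of that mapping; the axis/button indices and scale values here are assumptions for illustration, not taken from the actual xbox config file:

```python
# Sketch of the joy -> cmd_vel mapping that teleop_twist_joy performs.
# Axis/button indices and scale values are assumed for illustration.

def joy_to_cmd_vel(axes, buttons,
                   enable_btn=4, turbo_btn=5,
                   scale_lin=0.22, scale_ang=1.0, turbo=2.0):
    """Return (linear_x, angular_z) for one /joy message."""
    if not buttons[enable_btn]:
        return 0.0, 0.0                      # deadman released -> stop
    boost = turbo if buttons[turbo_btn] else 1.0
    linear = axes[1] * scale_lin * boost     # left stick up/down
    angular = axes[3] * scale_ang * boost    # right stick left/right
    return linear, angular

# Left stick half forward, LB (enable) held, no turbo:
print(joy_to_cmd_vel(axes=[0, 0.5, 0, 0], buttons=[0, 0, 0, 0, 1, 0]))
```

The deadman gating is why nothing moves when you wiggle the sticks without holding LB, a common first-run surprise.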

Step 6: Create a launch script


>cat > ~/turtlebot3_demos/xbox_teleop.sh << 'EOF'
#!/bin/bash
source /opt/ros/humble/setup.bash
source ~/turtlebot3_ws/install/setup.bash
echo "Xbox Controller Teleop for TurtleBot3"
echo ""
echo "Controls:"
echo "  Left stick    : Forward/Backward"
echo "  Right stick   : Turn Left/Right"
echo "  LB (hold)     : Enable movement"
echo "  RB (hold)     : Turbo mode"
echo ""
echo "Make sure bringup is running first!"
echo ""
ros2 launch teleop_twist_joy teleop-launch.py joy_config:='xbox'
EOF

>chmod +x ~/turtlebot3_demos/xbox_teleop.sh

Duplicate SD Card on a Linux PC

# Insert SD card into PC, find the device name

>lsblk

# Usually shows as /dev/sdb or /dev/mmcblk0

# Make sure to identify the CORRECT device!

# Create image (replace sdX with your device)

>sudo dd if=/dev/sdX of=~/turtlebot3_backup.img bs=4M status=progress

# Compress it (optional, saves space)

>gzip ~/turtlebot3_backup.img

To restore to a new SD card:

# Decompress if needed

>gunzip ~/turtlebot3_backup.img.gz

# Write to new SD card (replace sdX)

>sudo dd if=~/turtlebot3_backup.img of=/dev/sdX bs=4M status=progress

Simulation with Gazebo

While it is technically feasible to run the TurtleBot 3 in a Gazebo simulation, the computational power required is beyond the RPi 4B with 4 GB of RAM, so I didn't try it.

Configuration Using a Setup Script

The entire configuration process was done through a Bash script that ran the required commands. For avid shell scripters this is likely an obvious process. For me as a sometimes coder it was … amazing. The script follows.

#!/bin/bash
#===============================================================================
# TurtleBot3 Setup Script for Raspberry Pi 4 with Ubuntu 22.04 and ROS2 Humble
#
# This script performs:
#   1. RPi 4 System Characteristics Detection
#   2. ROS2 Humble Installation
#   3. TurtleBot3 Packages and Demo Scripts Installation
#
# Usage: chmod +x turtlebot3_rpi4_setup.sh && ./turtlebot3_rpi4_setup.sh
#===============================================================================

set -e  # Exit on error

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

print_header() {
    echo -e "\n${BLUE}============================================================${NC}"
    echo -e "${BLUE}$1${NC}"
    echo -e "${BLUE}============================================================${NC}\n"
}

print_success() {
    echo -e "${GREEN}✓ $1${NC}"
}

print_warning() {
    echo -e "${YELLOW}⚠ $1${NC}"
}

print_error() {
    echo -e "${RED}✗ $1${NC}"
}

#===============================================================================
# PART 1: RASPBERRY PI 4 SYSTEM CHARACTERISTICS
#===============================================================================
detect_rpi4_characteristics() {
    print_header "PART 1: RASPBERRY PI 4 SYSTEM CHARACTERISTICS"

    OUTPUT_FILE="$HOME/rpi4_characteristics.txt"

    echo "Raspberry Pi 4 System Characteristics Report" > "$OUTPUT_FILE"
    echo "Generated: $(date)" >> "$OUTPUT_FILE"
    echo "=============================================" >> "$OUTPUT_FILE"

    # CPU Information
    echo -e "\n${GREEN}--- CPU Information ---${NC}"
    echo -e "\nCPU Information:" >> "$OUTPUT_FILE"

    if [ -f /proc/cpuinfo ]; then
        CPU_MODEL=$(grep "Model" /proc/cpuinfo | head -1 | cut -d: -f2 | xargs)
        CPU_HARDWARE=$(grep "Hardware" /proc/cpuinfo | head -1 | cut -d: -f2 | xargs)
        CPU_REVISION=$(grep "Revision" /proc/cpuinfo | head -1 | cut -d: -f2 | xargs)
        CPU_CORES=$(nproc)

        echo "  Model: $CPU_MODEL"
        echo "  Hardware: $CPU_HARDWARE"
        echo "  Revision: $CPU_REVISION"
        echo "  CPU Cores: $CPU_CORES"

        echo "  Model: $CPU_MODEL" >> "$OUTPUT_FILE"
        echo "  Hardware: $CPU_HARDWARE" >> "$OUTPUT_FILE"
        echo "  Revision: $CPU_REVISION" >> "$OUTPUT_FILE"
        echo "  CPU Cores: $CPU_CORES" >> "$OUTPUT_FILE"
    fi

    # CPU Architecture
    ARCH=$(uname -m)
    echo "  Architecture: $ARCH"
    echo "  Architecture: $ARCH" >> "$OUTPUT_FILE"

    # CPU Frequency
    if [ -f /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq ]; then
        MAX_FREQ=$(cat /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq)
        MAX_FREQ_MHZ=$((MAX_FREQ / 1000))
        echo "  Max CPU Frequency: ${MAX_FREQ_MHZ} MHz"
        echo "  Max CPU Frequency: ${MAX_FREQ_MHZ} MHz" >> "$OUTPUT_FILE"
    fi

    # Memory Information
    echo -e "\n${GREEN}--- Memory Information ---${NC}"
    echo -e "\nMemory Information:" >> "$OUTPUT_FILE"

    TOTAL_MEM=$(free -h | grep Mem | awk '{print $2}')
    USED_MEM=$(free -h | grep Mem | awk '{print $3}')
    FREE_MEM=$(free -h | grep Mem | awk '{print $4}')

    echo "  Total RAM: $TOTAL_MEM"
    echo "  Used RAM: $USED_MEM"
    echo "  Free RAM: $FREE_MEM"

    echo "  Total RAM: $TOTAL_MEM" >> "$OUTPUT_FILE"
    echo "  Used RAM: $USED_MEM" >> "$OUTPUT_FILE"
    echo "  Free RAM: $FREE_MEM" >> "$OUTPUT_FILE"

    # Storage Information
    echo -e "\n${GREEN}--- Storage Information ---${NC}"
    echo -e "\nStorage Information:" >> "$OUTPUT_FILE"

    df -h / | tail -1 | while read filesystem size used avail use_pct mounted; do
        echo "  Root Filesystem: $filesystem"
        echo "  Total Size: $size"
        echo "  Used: $used ($use_pct)"
        echo "  Available: $avail"

        echo "  Root Filesystem: $filesystem" >> "$OUTPUT_FILE"
        echo "  Total Size: $size" >> "$OUTPUT_FILE"
        echo "  Used: $used ($use_pct)" >> "$OUTPUT_FILE"
        echo "  Available: $avail" >> "$OUTPUT_FILE"
    done

    # Temperature
    echo -e "\n${GREEN}--- Temperature ---${NC}"
    echo -e "\nTemperature:" >> "$OUTPUT_FILE"

    if [ -f /sys/class/thermal/thermal_zone0/temp ]; then
        TEMP=$(cat /sys/class/thermal/thermal_zone0/temp)
        TEMP_C=$(echo "scale=1; $TEMP / 1000" | bc)
        echo "  CPU Temperature: ${TEMP_C}°C"
        echo "  CPU Temperature: ${TEMP_C}°C" >> "$OUTPUT_FILE"
    fi

    # GPU Memory (if vcgencmd available)
    if command -v vcgencmd &> /dev/null; then
        echo -e "\n${GREEN}--- GPU Information ---${NC}"
        echo -e "\nGPU Information:" >> "$OUTPUT_FILE"

        GPU_MEM=$(vcgencmd get_mem gpu 2>/dev/null || echo "N/A")
        echo "  $GPU_MEM"
        echo "  $GPU_MEM" >> "$OUTPUT_FILE"
    fi

    # OS Information
    echo -e "\n${GREEN}--- Operating System ---${NC}"
    echo -e "\nOperating System:" >> "$OUTPUT_FILE"

    if [ -f /etc/os-release ]; then
        source /etc/os-release
        echo "  OS: $PRETTY_NAME"
        echo "  Kernel: $(uname -r)"

        echo "  OS: $PRETTY_NAME" >> "$OUTPUT_FILE"
        echo "  Kernel: $(uname -r)" >> "$OUTPUT_FILE"
    fi

    # Network Interfaces
    echo -e "\n${GREEN}--- Network Interfaces ---${NC}"
    echo -e "\nNetwork Interfaces:" >> "$OUTPUT_FILE"

    ip -o link show | while read num iface rest; do
        IFACE_NAME=$(echo $iface | tr -d ':')
        if [[ "$IFACE_NAME" != "lo" ]]; then
            IP_ADDR=$(ip -4 addr show $IFACE_NAME 2>/dev/null | grep inet | awk '{print $2}' | head -1)
            if [ -n "$IP_ADDR" ]; then
                echo "  $IFACE_NAME: $IP_ADDR"
                echo "  $IFACE_NAME: $IP_ADDR" >> "$OUTPUT_FILE"
            fi
        fi
    done

    # USB Devices
    echo -e "\n${GREEN}--- USB Devices ---${NC}"
    echo -e "\nUSB Devices:" >> "$OUTPUT_FILE"

    if command -v lsusb &> /dev/null; then
        lsusb | while read line; do
            echo "  $line"
            echo "  $line" >> "$OUTPUT_FILE"
        done
    fi

    # Serial Ports (important for TurtleBot)
    echo -e "\n${GREEN}--- Serial Ports (for OpenCR) ---${NC}"
    echo -e "\nSerial Ports:" >> "$OUTPUT_FILE"

    ls -la /dev/ttyACM* /dev/ttyUSB* 2>/dev/null | while read line; do
        echo "  $line"
        echo "  $line" >> "$OUTPUT_FILE"
    done || echo "  No serial ports detected"

    print_success "Characteristics saved to: $OUTPUT_FILE"
}

#===============================================================================
# PART 2: ROS2 HUMBLE INSTALLATION
#===============================================================================
install_ros2_humble() {
    print_header "PART 2: ROS2 HUMBLE INSTALLATION"

    # Check if ROS2 is already installed
    if [ -d "/opt/ros/humble" ]; then
        print_warning "ROS2 Humble appears to be already installed at /opt/ros/humble"
        read -p "Do you want to reinstall? (y/N): " REINSTALL
        if [[ ! "$REINSTALL" =~ ^[Yy]$ ]]; then
            print_success "Skipping ROS2 installation"
            return
        fi
    fi

    echo "Setting up locale..."
    sudo apt update && sudo apt install -y locales
    sudo locale-gen en_US en_US.UTF-8
    sudo update-locale LC_ALL=en_US.UTF-8 LANG=en_US.UTF-8
    export LANG=en_US.UTF-8
    print_success "Locale configured"

    echo "Adding ROS2 repository..."
    sudo apt install -y software-properties-common
    sudo add-apt-repository -y universe

    sudo apt update && sudo apt install -y curl gnupg lsb-release
    sudo curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key \
        -o /usr/share/keyrings/ros-archive-keyring.gpg

    echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(source /etc/os-release && echo $UBUNTU_CODENAME) main" | \
        sudo tee /etc/apt/sources.list.d/ros2.list > /dev/null
    print_success "ROS2 repository added"

    echo "Updating package lists..."
    sudo apt update
    sudo apt upgrade -y

    echo "Installing ROS2 Humble..."
    # For RPi without display, use ros-base. With display, use desktop
    read -p "Install with GUI tools (desktop) or minimal (ros-base)? [desktop/base]: " ROS_TYPE

    if [[ "$ROS_TYPE" == "desktop" ]]; then
        sudo apt install -y ros-humble-desktop
        print_success "ROS2 Humble Desktop installed"
    else
        sudo apt install -y ros-humble-ros-base
        print_success "ROS2 Humble Base installed"
    fi

    # Install development tools
    echo "Installing development tools..."
    sudo apt install -y ros-dev-tools python3-colcon-common-extensions python3-rosdep
    print_success "Development tools installed"

    # Initialize rosdep
    if [ ! -f /etc/ros/rosdep/sources.list.d/20-default.list ]; then
        echo "Initializing rosdep..."
        sudo rosdep init
    fi
    rosdep update
    print_success "rosdep initialized"

    # Add ROS2 to bashrc
    if ! grep -q "source /opt/ros/humble/setup.bash" ~/.bashrc; then
        echo "source /opt/ros/humble/setup.bash" >> ~/.bashrc
        print_success "ROS2 added to .bashrc"
    fi

    # Source for current session
    source /opt/ros/humble/setup.bash

    print_success "ROS2 Humble installation complete!"
}

#===============================================================================
# PART 3: TURTLEBOT3 PACKAGES AND DEMOS
#===============================================================================
install_turtlebot3() {
     print_header “PART 3: TURTLEBOT3 PACKAGES AND DEMOS”

     # Source ROS2
     source /opt/ros/humble/setup.bash

     # Determine TurtleBot3 model
     echo “Which TurtleBot3 model do you have?”
     echo ”  1) burger”
     echo ”  2) waffle”
     echo ”  3) waffle_pi”
     read -p “Enter choice [1-3]: ” TB3_CHOICE

     case $TB3_CHOICE in
         1) TB3_MODEL=”burger” ;;
         2) TB3_MODEL=”waffle” ;;
         3) TB3_MODEL=”waffle_pi” ;;
         *) TB3_MODEL=”burger” ;;
     esac

     # Determine LiDAR model
     echo “Which LiDAR model do you have?”
     echo ”  1) LDS-01 (older TurtleBot3)”
     echo ”  2) LDS-02 (newer TurtleBot3)”
     read -p “Enter choice [1-2]: ” LDS_CHOICE

     case $LDS_CHOICE in
         1) LDS_MODEL=”LDS-01″ ;;
         2) LDS_MODEL=”LDS-02″ ;;
         *) LDS_MODEL=”LDS-01″ ;;
     esac

     # Install TurtleBot3 dependencies
     echo “Installing TurtleBot3 dependencies…”
     sudo apt install -y \
         ros-humble-cartographer \
         ros-humble-cartographer-ros \
         ros-humble-navigation2 \
         ros-humble-nav2-bringup \
         ros-humble-dynamixel-sdk \
         ros-humble-turtlebot3-msgs \
         ros-humble-turtlebot3

     print_success “TurtleBot3 base packages installed”

     # Create TurtleBot3 workspace
     echo “Creating TurtleBot3 workspace…”
     mkdir -p ~/turtlebot3_ws/src
     cd ~/turtlebot3_ws/src

     # Clone TurtleBot3 packages from source for latest updates
     echo “Cloning TurtleBot3 source packages…”

     if [ ! -d “DynamixelSDK” ]; then
         git clone -b humble https://github.com/ROBOTIS-GIT/DynamixelSDK.git
     fi

     if [ ! -d “turtlebot3_msgs” ]; then
         git clone -b humble
https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git
     fi

     if [ ! -d “turtlebot3” ]; then
         git clone -b humble https://github.com/ROBOTIS-GIT/turtlebot3.git
     fi

     if [ ! -d “turtlebot3_simulations” ]; then
         git clone -b humble-devel
https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
     fi

     # Install LiDAR driver based on model
     echo “Installing LiDAR driver…”
     if [ ! -d “ld08_driver” ] && [ “$LDS_MODEL” == “LDS-02” ]; then
         git clone -b ros2-devel
https://github.com/ROBOTIS-GIT/ld08_driver.git
     elif [ ! -d “hls_lfcd_lds_driver” ] && [ “$LDS_MODEL” == “LDS-01”
]; then
         git clone -b ros2-devel
https://github.com/ROBOTIS-GIT/hls_lfcd_lds_driver.git
     fi

     print_success “Source packages cloned”

     # Install dependencies with rosdep
     echo “Installing dependencies with rosdep…”
     cd ~/turtlebot3_ws
     rosdep install –from-paths src –ignore-src -r -y || true

     # Build workspace
     echo “Building TurtleBot3 workspace (this may take a while on RPi)…”
     cd ~/turtlebot3_ws
     colcon build –symlink-install –parallel-workers 2

     print_success “TurtleBot3 workspace built”

     # Add TurtleBot3 environment to bashrc
     echo “Configuring environment…”

     # Remove old entries if they exist
     sed -i ‘/TURTLEBOT3_MODEL/d’ ~/.bashrc
     sed -i ‘/LDS_MODEL/d’ ~/.bashrc
     sed -i ‘/ROS_DOMAIN_ID.*TURTLEBOT/d’ ~/.bashrc
     sed -i ‘/turtlebot3_ws/d’ ~/.bashrc

     # Add new entries
     cat >> ~/.bashrc << EOF

# TurtleBot3 Configuration
source ~/turtlebot3_ws/install/setup.bash
export TURTLEBOT3_MODEL=$TB3_MODEL
export LDS_MODEL=$LDS_MODEL
export ROS_DOMAIN_ID=30  # Match this with Remote PC
export OPENCR_PORT=/dev/ttyACM0
export OPENCR_MODEL=$TB3_MODEL
EOF

     print_success “Environment configured in .bashrc”

     # Set up udev rules for OpenCR and LiDAR
     echo “Setting up udev rules for hardware…”

     # OpenCR udev rule
     sudo bash -c ‘cat > /etc/udev/rules.d/99-opencr.rules << EOF
ATTRS{idVendor}==”0483″, ATTRS{idProduct}==”5740″,
ENV{ID_MM_DEVICE_IGNORE}=”1″, MODE=”0666″, GROUP=”dialout”,
SYMLINK+=”opencr”
EOF’

     # LiDAR udev rule
     if [ “$LDS_MODEL” == “LDS-01” ]; then
         sudo bash -c ‘cat > /etc/udev/rules.d/99-lds01.rules << EOF
KERNEL==”ttyUSB*”, ATTRS{idVendor}==”0483″, ATTRS{idProduct}==”5740″,
MODE=”0666″, GROUP=”dialout”, SYMLINK+=”ttyLDS”
EOF’
     else
         sudo bash -c ‘cat > /etc/udev/rules.d/99-lds02.rules << EOF
KERNEL==”ttyUSB*”, ATTRS{idVendor}==”10c4″, ATTRS{idProduct}==”ea60″,
MODE=”0666″, GROUP=”dialout”, SYMLINK+=”ttyLDS”
EOF’
     fi

     # Reload udev rules
     sudo udevadm control –reload-rules
     sudo udevadm trigger

     # Add user to dialout group
     sudo usermod -aG dialout $USER

     print_success “udev rules configured”

     # Create demo launch scripts
     create_demo_scripts

     print_success "TurtleBot3 installation complete!"
}

#===============================================================================
# CREATE DEMO SCRIPTS
#===============================================================================
create_demo_scripts() {
     print_header "Creating TurtleBot3 Demo Scripts"

     DEMO_DIR="$HOME/turtlebot3_demos"
     mkdir -p "$DEMO_DIR"

     # Demo 1: Bringup (run on TurtleBot)
     cat > "$DEMO_DIR/01_bringup.sh" << 'EOF'
#!/bin/bash
# TurtleBot3 Bringup – Run this on the TurtleBot3 (RPi)
# This starts the robot’s core systems: motors, sensors, etc.

source /opt/ros/humble/setup.bash
source ~/turtlebot3_ws/install/setup.bash

echo "Starting TurtleBot3 Bringup..."
echo "Model: $TURTLEBOT3_MODEL"
echo "LiDAR: $LDS_MODEL"
echo ""
echo "Press Ctrl+C to stop"

ros2 launch turtlebot3_bringup robot.launch.py
EOF
     chmod +x "$DEMO_DIR/01_bringup.sh"

     # Demo 2: Teleop Keyboard
     cat > "$DEMO_DIR/02_teleop_keyboard.sh" << 'EOF'
#!/bin/bash
# Keyboard Teleoperation – Control TurtleBot3 with keyboard
# Run this on Remote PC or TurtleBot

source /opt/ros/humble/setup.bash
source ~/turtlebot3_ws/install/setup.bash

echo "Keyboard Teleoperation for TurtleBot3"
echo ""
echo "Controls:"
echo "  w/x : increase/decrease linear velocity"
echo "  a/d : increase/decrease angular velocity"
echo "  space/s : stop"
echo ""

ros2 run turtlebot3_teleop teleop_keyboard
EOF
     chmod +x "$DEMO_DIR/02_teleop_keyboard.sh"

     # Demo 3: SLAM (Cartographer)
     cat > "$DEMO_DIR/03_slam_cartographer.sh" << 'EOF'
#!/bin/bash
# SLAM with Cartographer – Create a map while driving
# Run on Remote PC while bringup is running on TurtleBot

source /opt/ros/humble/setup.bash
source ~/turtlebot3_ws/install/setup.bash

echo "Starting SLAM with Cartographer..."
echo "Make sure bringup is running on TurtleBot3"
echo ""

ros2 launch turtlebot3_cartographer cartographer.launch.py
EOF
     chmod +x "$DEMO_DIR/03_slam_cartographer.sh"

     # Demo 4: Save Map
     cat > "$DEMO_DIR/04_save_map.sh" << 'EOF'
#!/bin/bash
# Save the map created by SLAM
# Run this after exploring the environment

source /opt/ros/humble/setup.bash

MAP_NAME=${1:-"my_map"}
echo "Saving map as: $MAP_NAME"

ros2 run nav2_map_server map_saver_cli -f ~/maps/$MAP_NAME

echo "Map saved to ~/maps/$MAP_NAME.yaml and ~/maps/$MAP_NAME.pgm"
EOF
     chmod +x "$DEMO_DIR/04_save_map.sh"

     # Demo 5: Navigation
     cat > "$DEMO_DIR/05_navigation.sh" << 'EOF'
#!/bin/bash
# Autonomous Navigation – Navigate using a saved map
# Run on Remote PC

source /opt/ros/humble/setup.bash
source ~/turtlebot3_ws/install/setup.bash

MAP_FILE=${1:-"$HOME/maps/my_map.yaml"}

echo "Starting Navigation with map: $MAP_FILE"
echo "Make sure bringup is running on TurtleBot3"
echo ""

ros2 launch turtlebot3_navigation2 navigation2.launch.py map:=$MAP_FILE
EOF
     chmod +x "$DEMO_DIR/05_navigation.sh"

     # Demo 6: Check Topics
     cat > "$DEMO_DIR/06_check_topics.sh" << 'EOF'
#!/bin/bash
# List all active ROS2 topics
# Useful for debugging

source /opt/ros/humble/setup.bash

echo "Active ROS2 Topics:"
echo "==================="
ros2 topic list

echo ""
echo "To see topic data, use: ros2 topic echo /topic_name"
echo "To see topic info, use: ros2 topic info /topic_name"
EOF
     chmod +x "$DEMO_DIR/06_check_topics.sh"

     # Demo 7: View LiDAR Data
     cat > "$DEMO_DIR/07_view_lidar.sh" << 'EOF'
#!/bin/bash
# View LiDAR scan data
# Run while bringup is active

source /opt/ros/humble/setup.bash

echo "Viewing LiDAR data from /scan topic"
echo "Press Ctrl+C to stop"
echo ""

ros2 topic echo /scan
EOF
     chmod +x "$DEMO_DIR/07_view_lidar.sh"

     # Demo 8: View Odometry
     cat > "$DEMO_DIR/08_view_odom.sh" << 'EOF'
#!/bin/bash
# View odometry data
# Run while bringup is active

source /opt/ros/humble/setup.bash

echo "Viewing Odometry data from /odom topic"
echo "Press Ctrl+C to stop"
echo ""

ros2 topic echo /odom
EOF
     chmod +x "$DEMO_DIR/08_view_odom.sh"

     # Demo 9: RViz Visualization (for Remote PC with display)
     cat > "$DEMO_DIR/09_rviz.sh" << 'EOF'
#!/bin/bash
# Launch RViz for visualization
# Run on Remote PC with display

source /opt/ros/humble/setup.bash
source ~/turtlebot3_ws/install/setup.bash

echo "Launching RViz..."

ros2 launch turtlebot3_bringup rviz2.launch.py
EOF
     chmod +x "$DEMO_DIR/09_rviz.sh"

     # Demo 10: System Status
     cat > "$DEMO_DIR/10_system_status.sh" << 'EOF'
#!/bin/bash
# Check ROS2 system status

source /opt/ros/humble/setup.bash

echo "ROS2 System Status"
echo "=================="

echo ""
echo "ROS2 Version:"
ros2 --version

echo ""
echo "Active Nodes:"
ros2 node list

echo ""
echo "Active Topics:"
ros2 topic list

echo ""
echo "Active Services:"
ros2 service list
EOF
     chmod +x "$DEMO_DIR/10_system_status.sh"

     # Create README for demos
     cat > "$DEMO_DIR/README.md" << EOF
# TurtleBot3 Demo Scripts

## Quick Start

### On TurtleBot3 (Raspberry Pi):
1. First, run bringup:
    \`\`\`
    ./01_bringup.sh
    \`\`\`

### On Remote PC (or another terminal):
2. Control with keyboard:
    \`\`\`
    ./02_teleop_keyboard.sh
    \`\`\`

## Available Demos

| Script | Description | Run On |
|--------|-------------|--------|
| 01_bringup.sh | Start robot systems | TurtleBot3 |
| 02_teleop_keyboard.sh | Keyboard control | Remote PC |
| 03_slam_cartographer.sh | Create map while driving | Remote PC |
| 04_save_map.sh | Save SLAM map | Remote PC |
| 05_navigation.sh | Autonomous navigation | Remote PC |
| 06_check_topics.sh | List ROS2 topics | Any |
| 07_view_lidar.sh | View LiDAR data | Any |
| 08_view_odom.sh | View odometry | Any |
| 09_rviz.sh | Visual monitoring | Remote PC (GUI) |
| 10_system_status.sh | Check system status | Any |

## Network Setup

Make sure TurtleBot3 and Remote PC are on the same network and have
matching:
- ROS_DOMAIN_ID (default: 30)
- Same ROS2 version (Humble)

## Troubleshooting

1. **Can’t connect to robot**: Check ROS_DOMAIN_ID matches
2. **No LiDAR data**: Check USB connection and udev rules
3. **Motors not moving**: Check OpenCR connection (/dev/ttyACM0)

## Environment Variables

Current configuration (in ~/.bashrc):
- TURTLEBOT3_MODEL=$TB3_MODEL
- LDS_MODEL=$LDS_MODEL
- ROS_DOMAIN_ID=30
EOF

     # Create maps directory
     mkdir -p ~/maps

     print_success "Demo scripts created in: $DEMO_DIR"
     echo ""
     echo "Available demo scripts:"
     ls -la "$DEMO_DIR"/*.sh
}

#===============================================================================
# MAIN MENU
#===============================================================================
main_menu() {
     clear
     echo -e "${BLUE}"
     echo "╔══════════════════════════════════════════════════════════════╗"
     echo "║     TurtleBot3 Setup for Raspberry Pi 4 + Ubuntu 22.04      ║"
     echo "║                    with ROS2 Humble                          ║"
     echo "╚══════════════════════════════════════════════════════════════╝"
     echo -e "${NC}"
     echo ""
     echo "Select an option:"
     echo ""
     echo "  1) Run ALL steps (recommended for fresh install)"
     echo "  2) Part 1 only: Detect RPi4 characteristics"
     echo "  3) Part 2 only: Install ROS2 Humble"
     echo "  4) Part 3 only: Install TurtleBot3 packages & demos"
     echo "  5) Exit"
     echo ""
     read -p "Enter choice [1-5]: " CHOICE

     case $CHOICE in
         1)
             detect_rpi4_characteristics
             install_ros2_humble
             install_turtlebot3
             ;;
         2)
             detect_rpi4_characteristics
             ;;
         3)
             install_ros2_humble
             ;;
         4)
             install_turtlebot3
             ;;
         5)
             echo "Exiting..."
             exit 0
             ;;
         *)
             echo "Invalid choice"
             exit 1
             ;;
     esac
}

#===============================================================================
# FINAL SUMMARY
#===============================================================================
print_final_summary() {
     print_header "SETUP COMPLETE!"

     echo "Summary of installed components:"
     echo ""
     echo "  1. RPi4 Characteristics: ~/rpi4_characteristics.txt"
     echo "  2. ROS2 Humble: /opt/ros/humble/"
     echo "  3. TurtleBot3 Workspace: ~/turtlebot3_ws/"
     echo "  4. Demo Scripts: ~/turtlebot3_demos/"
     echo ""
     echo "Next steps:"
     echo ""
     echo "  1. Log out and log back in (or reboot) for group changes"
     echo "  2. Connect your TurtleBot3 hardware"
     echo "  3. Run: cd ~/turtlebot3_demos && ./01_bringup.sh"
     echo ""
     echo "For keyboard control, open another terminal and run:"
     echo "  cd ~/turtlebot3_demos && ./02_teleop_keyboard.sh"
     echo ""
     print_success "Happy robotics!"
}

#===============================================================================
# RUN SCRIPT
#===============================================================================
main_menu
print_final_summary

Mobile Robots – Blog 2 – JetBot

Blog 2 – Rejuvenation of a Yahboom Jetbot

February 4, 2026

The Jetbot

The Jetbot AI Robot was introduced around 2019 by Shenzhen Yahboom Technology Co., Ltd. I purchased mine in 2020. It represented one of the early AI-capable hobby robots, an early application of the Jetson Nano, and one of the first machines of this type to use serial bus servos. After the kit was built, the Jetbot worked as advertised (JetBot AI Robot Car).

Figure 1 – Jetbot

After initial evaluation little was done with the Jetbot until mid-2024 when I wanted to use it for a robot demonstration. I found that the software would not start and it was put back on the shelf. Recently I decided to bring the Jetbot back to life as it is a good representation of the early advance of hobby/education robots to using computers such as the Jetson Nano.

The Jetbot web site includes the software image that I assume shipped on the factory-provided USB disk (which no longer worked). A 128 GB micro SD card was imaged with balenaEtcher, but the imaged software still would not start, and troubleshooting the factory image didn't seem very productive. Further, I was primarily interested in getting a functional demonstration working, plus at least one example of a machine learning function, so I did not need all the capabilities of the original Jetbot. Even with the decision to implement only a subset of the factory functionality, developing the necessary scripts was a challenge in reverse engineering.

To begin the rejuvenation process I went to the Anthropic large language model Sonnet 4.5 ('Claude') and began a step-by-step process of developing the Python scripts needed to make the Jetbot functional. The Jetbot shipped with Ubuntu 18.04 and Nvidia JetPack 4.x; it did not use the Robot Operating System (ROS). Rather than trying to replicate the factory software, Claude located an updated image that uses Ubuntu 20.04 (GitHub – Qengineering/Jetson-Nano-Ubuntu-20-image: Jetson Nano with Ubuntu 20.04 image). This image was downloaded and written to a 128 GB microSD card (Jetson-Nano-Ubuntu-20-image-main.zip). With the new card installed, the Jetson Nano booted into Ubuntu 20.04. Over the next several days, working with commands and Python3 scripts provided by Claude, the functionality of Table 1 was developed.

Jetbot Functions

| Component             | Status                  |
|-----------------------|-------------------------|
| Drive motors          | ✓ Working               |
| Camera lift + limits  | ✓ Working               |
| RGB LEDs              | ✓ Working               |
| OLED display          | ✓ Working               |
| USB camera streaming  | ✓ Working               |
| PWM servos (pan/tilt) | ✓ Working               |
| Logitech controller   | ✓ Working               |
| K1 shutdown button    | ✓ Working               |
| Ball tracking         | ✓ Working               |
| Battery monitor       | ✓ Working (Bus 0, 0x41) |
| Auto-start service    | ✓ Working               |

Table 1 – Jetbot Functions Implemented

The development process was not without some challenges as discussed in the next sections.

Jetbot Hardware

 The Jetbot hardware includes: 1) Nvidia Jetson Nano, 2) Yahboom Jetson Expansion Board, 3) bus connected camera, 4) horizontal and vertical serial bus servos for camera control, 5) DC motor to raise and lower the camera mount, 6) two worm gear track motors (without encoders). Over the course of the rejuvenation work several changes to the hardware were made:

  1. While getting the horizontal and vertical serial bus servos to work, the bus driver electronics failed. I suspect that at some point while disconnecting/connecting a servo there was a static discharge; the work took place during an extended cool spell and the indoor humidity was very low. The expansion board for the Jetbot is no longer available, so another camera servo solution was needed. PWM connections are available on the expansion board, but I couldn't find any servos of the right size to replace the originals, so I replaced the entire camera assembly with one I had on hand that used a USB camera and PWM servos.
  2. I added an INA219 current/voltage monitor for the battery and the script for a robot shutdown if the battery voltage gets too low (more later). This was not a part of the factory configuration. I must admit my addition is not in keeping with the construction quality of the original product.
  3. The picture below shows the camera and voltage monitoring changes.

Figure 2 – Jetbot with replacement camera assembly and INA219 assembly (at rear)

In addition to the Jetbot hardware, remote control was done using a Logitech Wireless Gamepad F710 in place of the controller provided with the product.

Software Development

Software development was done using 'Claude' in an interactive relationship. This and other projects have used Claude for commands, coding, and suggested directions.

Yahboom provides documentation and much of the code used in the Jetbot product for download (JetBot AI Robot Car).

Downloaded scripts were provided to Claude as references for creating new Python scripts specific to this Jetbot's operating plan. The text and graphics that follow describe the software configuration produced by the rejuvenation process; the content was generated by Claude in response to my prompts. It is important to add that this was not a straight-line process: there were a number of places where trial and error was required to get the desired results. It is also an example of the power of the LLM-driven project process. While I have done programming and system design myself, what might take me hours, Claude does in seconds. The summary that follows is an example: in about 10 minutes I had a description of the entire project.

Project Summary (generated by Claude)

Overview

jetbot_tracker.py is a comprehensive control script for the Yahboom JetBot that provides manual remote control via a Logitech F710 gamepad, autonomous colored ball tracking and following, HTTP camera streaming, battery monitoring with auto-shutdown protection, and safe shutdown via a hardware button.

This project arose from the need to completely rebuild the JetBot’s software stack after the manufacturer’s supplied SD card image failed to boot. Starting from a clean Qengineering Ubuntu 20.04 image for the Jetson Nano, every hardware interface was reverse-engineered and rebuilt from scratch using Python, OpenCV, and direct I2C communication with the expansion board’s PCA9685 PWM controller.

Hardware Components

The JetBot’s hardware is controlled through a combination of I2C buses, GPIO pins, and USB interfaces. The central PCA9685 PWM controller at address 0x40 drives all motors and servos.

| Component                 | Interface           | Address / Pin              |
|---------------------------|---------------------|----------------------------|
| Drive Motors (Left/Right) | PCA9685 PWM         | 0x40, Channels 8–11        |
| Camera Lift Motor         | PCA9685 PWM         | 0x40, Channels 12–13       |
| Lift Limit Switches       | GPIO                | Pin 11 (up), Pin 12 (down) |
| Pan/Tilt Servos           | PCA9685 PWM (50 Hz) | 0x40, Channels 0–1         |
| RGB LED Strip             | I2C                 | 0x1B                       |
| OLED Display (128×32)     | I2C Bus 0           | 0x3C                       |
| Battery Monitor (INA219)  | I2C Bus 0           | 0x41                       |
| Shutdown Button (K1)      | GPIO                | Pin 24 (active LOW)        |
| USB Camera                | V4L2                | /dev/video1                |

Software Architecture — Class Descriptions

BatteryMonitor

Reads voltage from an INA219 current/voltage sensor connected to I2C Bus 0 at address 0x41. The sensor monitors the 3S Li-ion battery pack and enables automatic shutdown when voltage drops below safe levels.

| Method                   | Description                                                      |
|--------------------------|------------------------------------------------------------------|
| `__init__(address, bus)` | Initialize I2C connection to INA219 sensor                       |
| `read_voltage()`         | Returns battery voltage as float, or None if sensor unavailable  |

Battery Voltage Thresholds

| Voltage | Status      | Action                     |
|---------|-------------|----------------------------|
| 12.6 V  | Full charge |                            |
| 11.7 V  | ≈75%        | Normal operation           |
| 11.1 V  | ≈50%        | Normal operation           |
| 10.0 V  | Low         | Warning — orange LED pulse |
| 9.0 V   | Critical    | Auto-shutdown initiated    |
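The INA219's bus-voltage register packs its reading into the upper 13 bits with a 4 mV LSB, which is where the voltage float comes from. As a minimal sketch (the function names and the exact threshold mapping are illustrative, not the script's actual code):

```python
def ina219_bus_voltage(raw_register: int) -> float:
    """Convert the INA219 bus-voltage register (0x02) to volts.

    Bits 15..3 hold the voltage count; one LSB = 4 mV.
    """
    return (raw_register >> 3) * 0.004


def battery_status(voltage: float) -> str:
    """Map a 3S pack voltage onto the status levels in the table above."""
    if voltage >= 12.6:
        return "FULL"
    if voltage > 10.0:
        return "NORMAL"
    if voltage > 9.0:
        return "LOW"       # orange LED pulse
    return "CRITICAL"      # auto-shutdown initiated


# Example: raw register 0x5D98 -> (23960 >> 3) * 0.004 = 2995 * 0.004 = 11.98 V
```

On the robot the raw register would come from an `smbus` word read (with a byte swap, since the INA219 is big-endian); the conversion itself is pure arithmetic.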

OLED

Controls a 128×32 monochrome OLED display via I2C Bus 0 at address 0x3C. Displays system status, IP address, battery voltage, and tracking information.

| Method                                 | Description                                             |
|----------------------------------------|---------------------------------------------------------|
| `show_status(ip, msg, voltage)`        | Display IP address, status message, and battery voltage |
| `show_tracking(color, status, radius)` | Display active tracking mode, status, and ball radius   |
| `show_shutdown(countdown)`             | Display shutdown countdown timer                        |

Motors

Controls the differential drive system via PCA9685 PWM channels. Each motor uses two channels (forward and reverse) for bidirectional control.

| Method                    | Description                                                      |
|---------------------------|------------------------------------------------------------------|
| `set_motors(left, right)` | Set motor speeds from −1.0 (full reverse) to +1.0 (full forward) |
| `stop()`                  | Stop both motors immediately                                     |

Motor Channel Mapping

LEFT_FWD  = Channel 11    RIGHT_FWD = Channel 8
LEFT_REV  = Channel 10    RIGHT_REV = Channel 9
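Because each motor has a forward and a reverse channel, a signed speed maps to exactly one active channel at a time. A sketch of that mapping, plus a simple drive/turn mixer (assuming 12-bit PCA9685 duty values; the helper names are mine, not the script's):

```python
def mix_differential(drive: float, turn: float) -> tuple:
    """Mix forward/turn commands into (left, right) wheel speeds in [-1, 1]."""
    left = max(-1.0, min(1.0, drive + turn))
    right = max(-1.0, min(1.0, drive - turn))
    return left, right


def motor_channels(speed: float, fwd_ch: int, rev_ch: int) -> dict:
    """Map a signed speed onto 12-bit PWM duty for a forward/reverse channel pair.

    Only one of the two channels is driven at a time; the other is held at 0.
    """
    duty = int(abs(speed) * 4095)
    if speed >= 0:
        return {fwd_ch: duty, rev_ch: 0}
    return {fwd_ch: 0, rev_ch: duty}
```

For example, half-speed forward on the left motor drives channel 11 and holds channel 10 at zero.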

Lift

Controls the camera lift mechanism, which raises and lowers the camera platform. Uses two PCA9685 channels (12 and 13) and monitors GPIO limit switches to prevent over-travel in either direction.

| Method           | Description                                                      |
|------------------|------------------------------------------------------------------|
| `move(speed)`    | Move lift from −1.0 (down) to +1.0 (up); respects limit switches |
| `stop()`         | Stop lift motor                                                  |
| `check_limits()` | Returns tuple (at_top, at_bottom) indicating limit switch state  |

PWMServos

Controls pan and tilt camera servos via PCA9685 at 50 Hz. Servo positions are specified in microseconds of pulse width.

| Method                      | Description                                       |
|-----------------------------|---------------------------------------------------|
| `move_vertical(delta)`      | Adjust tilt by delta microseconds                 |
| `move_horizontal(delta)`    | Adjust pan by delta microseconds                  |
| `center()`                  | Move both servos to center position               |
| `tracking_start_position()` | Camera angled down and centered for ball tracking |
| `cleanup()`                 | Release servo resources                           |

Servo Configuration

Vertical   (Ch 0): 1000μs = up,    1800μs = down,  center = 1400μs
Horizontal (Ch 1): 1000μs = right, 2000μs = left,  center = 1500μs
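At 50 Hz the PWM frame is 20 ms, so a pulse width in microseconds becomes a fraction of a 20,000 μs period. A sketch of the conversion, assuming the 16-bit `duty_cycle` scale used by the adafruit-circuitpython-pca9685 driver (the helper names are illustrative):

```python
SERVO_FREQ_HZ = 50
PERIOD_US = 1_000_000 // SERVO_FREQ_HZ   # 20,000 us per frame at 50 Hz


def us_to_duty(pulse_us: float, resolution: int = 65535) -> int:
    """Convert a servo pulse width in microseconds to a PCA9685 duty value.

    The CircuitPython driver exposes a 16-bit duty_cycle (0-65535), so the
    pulse is scaled against the 20 ms frame.
    """
    return int(pulse_us / PERIOD_US * resolution)


def clamp_pulse(pulse_us: float, lo: float, hi: float) -> float:
    """Keep a commanded pulse inside the servo's mechanical limits."""
    return max(lo, min(hi, pulse_us))
```

So the tilt center of 1400 μs occupies 7% of the frame; the limits in the table above would be passed to `clamp_pulse` before commanding the servo.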

RGB

Controls an RGB LED strip via I2C at address 0x1B. Provides visual feedback for operating mode, tracking status, and battery warnings.

| Method                | Description                                                  |
|-----------------------|--------------------------------------------------------------|
| `set_all(r, g, b)`    | Set all LEDs to specified color (0–255 per channel)          |
| `off()`               | Turn off all LEDs                                            |
| `flash_red(duration)` | Flash red LEDs for specified duration (limit switch warning) |
| `pulse_orange()`      | Pulsing orange pattern (low battery warning)                 |

CameraWithTracking

Wraps OpenCV VideoCapture with MJPEG streaming and color detection. Captures frames from a USB camera at 640×480, applies HSV color filtering to detect colored balls, and serves frames as an HTTP MJPEG stream on port 8080.

| Method               | Description                                     |
|----------------------|-------------------------------------------------|
| `get_frame()`        | Returns current frame as JPEG bytes             |
| `detect_ball(color)` | Returns (x, y, radius) of detected ball or None |
| `stop()`             | Release camera resources                        |

Color Detection Ranges (HSV)

RED:    H: 0–10 or 170–180,  S: 100–255,  V: 100–255
GREEN:  H: 40–80,              S: 50–255,   V: 50–255
BLUE:   H: 100–130,            S: 50–255,   V: 50–255
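Note that RED needs two hue bands because red wraps around the ends of OpenCV's 0–180 hue axis. In the script this is done with `cv2.inRange` masks; the equivalent per-pixel logic, written out as a plain-Python sketch for clarity, is:

```python
# HSV ranges on the OpenCV scale (H: 0-180, S and V: 0-255).
# RED wraps the hue axis, so it is the union of two bands.
COLOR_RANGES = {
    "RED":   [((0, 100, 100), (10, 255, 255)),
              ((170, 100, 100), (180, 255, 255))],
    "GREEN": [((40, 50, 50), (80, 255, 255))],
    "BLUE":  [((100, 50, 50), (130, 255, 255))],
}


def hue_matches(h: int, s: int, v: int, color: str) -> bool:
    """Check one HSV pixel against the ranges above (any band may match)."""
    return any(
        lo[0] <= h <= hi[0] and lo[1] <= s <= hi[1] and lo[2] <= v <= hi[2]
        for lo, hi in COLOR_RANGES[color]
    )
```

The real detector builds a binary mask from these ranges, then takes the largest contour and fits a minimum enclosing circle to get (x, y, radius).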

Joystick

Reads input from a Logitech F710 gamepad connected via USB wireless dongle. Reads raw joystick events from /dev/input/js0 in a background thread.

| Method      | Description                                   |
|-------------|-----------------------------------------------|
| `read()`    | Returns current state of all axes and buttons |
| `cleanup()` | Close device file                             |
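The Linux joystick interface delivers fixed 8-byte events (u32 timestamp, s16 value, u8 type, u8 number), which is what the background thread decodes. A sketch of that decoding (the dictionary layout is mine; the struct format follows the kernel joystick API):

```python
import struct

JS_EVENT_BUTTON = 0x01
JS_EVENT_AXIS = 0x02
JS_EVENT_INIT = 0x80            # synthetic events sent when the device opens
JS_EVENT_FORMAT = "<IhBB"       # time(ms), value, type, number


def parse_js_event(data: bytes) -> dict:
    """Decode one 8-byte event read from /dev/input/js0."""
    time_ms, value, ev_type, number = struct.unpack(JS_EVENT_FORMAT, data)
    return {
        "time_ms": time_ms,
        "value": value,
        "type": ev_type & ~JS_EVENT_INIT,   # strip the init flag
        "number": number,
        "is_axis": bool(ev_type & JS_EVENT_AXIS),
        "is_button": bool(ev_type & JS_EVENT_BUTTON),
    }
```

Axis values run from −32767 to +32767, which the control loop scales down to the −1.0…+1.0 motor range.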

BallTracker

Implements the autonomous ball tracking state machine. Coordinates motors, servos, camera, and LEDs to search for, track, and follow colored balls.

| Method         | Description                                                |
|----------------|------------------------------------------------------------|
| `start(color)` | Begin tracking a specific color (red, green, or blue)      |
| `stop()`       | Exit tracking mode and return to manual control            |
| `update()`     | Called each loop iteration; runs search/track/follow logic |

Operating Modes and Control Flow

Main Loop

The main loop runs at approximately 50 Hz (20 ms sleep). Each iteration checks the hardware shutdown button, reads battery voltage, processes joystick input, and either runs the ball tracker or processes manual control commands.

┌─────────────── STARTUP ────────────────┐
│ Initialize: GPIO, OLED, Motors,        │
│ Lift, Servos, RGB, Battery,            │
│ Joystick, Camera, HTTP Server          │
└───────────────────┬────────────────────┘
                    │
          ┌─────────▼─────────┐
          │     MAIN LOOP     │
          │ 1. Check K1       │
          │ 2. Read battery   │
          │ 3. Read joystick  │
          │ 4. Update mode    │
          │ 5. Sleep 20ms     │
          └─────────┬─────────┘
                    │
     ┌──────────────┼──────────────┐
     ▼              ▼              ▼
 SHUTDOWN        MANUAL         TRACKING
 (K1 held      (Joystick     (Autonomous
  or low        control)      ball follow)
  battery)

Controller Mapping

The Logitech F710 gamepad controls all manual robot functions. The controller operates in DirectInput mode.

| Input              | Action                  |
|--------------------|-------------------------|
| Right Stick Y-Axis | Drive forward / reverse |
| Right Stick X-Axis | Steer left / right      |
| Left Stick X-Axis  | Pan camera horizontally |
| Left Stick Y-Axis  | Tilt camera vertically  |
| RT (Right Trigger) | Raise camera lift       |
| LT (Left Trigger)  | Lower camera lift       |
| A Button           | Green LEDs              |
| B Button           | Red LEDs                |
| X Button           | Blue LEDs               |
| Y Button           | LEDs off                |
| D-pad Left         | Track BLUE ball         |
| D-pad Up           | Track GREEN ball        |
| D-pad Right        | Track RED ball          |
| RB (Right Bumper)  | Exit tracking mode      |

Ball Tracking Algorithm

Search Phase

When ball tracking is activated by pressing a D-pad arrow, the camera moves to its start position (angled down and centered). The RGB LEDs are set to the target color as a visual indicator. The robot then rotates in place at a slow search speed while the camera scans each frame for a matching colored circle using HSV color filtering and contour detection.

Track Phase

Once a ball is detected, the tracker enters the active tracking phase with three prioritized control axes:

1. HORIZONTAL CENTERING (highest priority)
   - Error = ball_x - frame_center_x
   - If |error| > CENTER_TOLERANCE: rotate robot
   - Speed proportional to error
2. VERTICAL CENTERING
   - Error = ball_y - frame_center_y
   - Tilt camera servo to compensate
3. DISTANCE CONTROL (only when horizontally centered)
   - Error = ball_radius - TARGET_BALL_RADIUS
   - Positive error (too close): back up
   - Negative error (too far): move forward
   - If within BALL_RADIUS_TOLERANCE: stop ("LOCKED")
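The prioritized axes above amount to a pair of proportional controllers with deadbands. A condensed sketch of one track-phase iteration, using the tuning constants from the parameter table below (the function, frame-center value, and sign conventions are illustrative, not the script's exact code):

```python
# Tuning constants from the script's parameter table
CENTER_TOLERANCE = 30        # px
TARGET_BALL_RADIUS = 50      # px
BALL_RADIUS_TOLERANCE = 10   # px
TRACK_SPEED_FACTOR = 0.003   # motor speed per px of error
FRAME_CENTER_X = 320         # center of the 640x480 frame


def track_step(ball_x: int, radius: int) -> tuple:
    """One track-phase iteration: returns (left, right) motor speeds.

    Horizontal centering has priority; distance control only runs once
    the ball is inside the centering deadband.
    """
    x_err = ball_x - FRAME_CENTER_X
    if abs(x_err) > CENTER_TOLERANCE:
        turn = x_err * TRACK_SPEED_FACTOR      # rotate toward the ball
        return turn, -turn
    r_err = radius - TARGET_BALL_RADIUS
    if abs(r_err) <= BALL_RADIUS_TOLERANCE:
        return 0.0, 0.0                        # "LOCKED": hold position
    drive = -r_err * TRACK_SPEED_FACTOR        # too close -> back up
    return drive, drive
```

The vertical axis is omitted here because it commands the tilt servo rather than the drive motors.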

Figure 3 – Ball tracking image from camera (radius is 29)

Lost Ball Recovery

If the target ball disappears from the camera frame, a lost counter increments each frame. If the ball reappears within approximately one second (30 frames), the robot holds position and re-acquires. After one second of lost contact, the tracker returns to the search phase and begins rotating to re-locate the ball.

Tuning Parameters

| Parameter             | Default | Purpose                                                 |
|-----------------------|---------|---------------------------------------------------------|
| TARGET_BALL_RADIUS    | 50 px   | Expected ball radius at desired follow distance (~2 ft) |
| BALL_RADIUS_TOLERANCE | 10 px   | Deadband around target radius before driving            |
| CENTER_TOLERANCE      | 30 px   | Horizontal centering deadband                           |
| TRACK_SPEED_FACTOR    | 0.003   | Motor speed per pixel of horizontal error               |
| SEARCH_ROTATE_SPEED   | 0.3     | Motor speed during search rotation                      |

HTTP Camera Streaming

The camera feed is served as a Motion JPEG (MJPEG) stream over HTTP. Any web browser or video player that supports MJPEG can display the feed.

| Property       | Value                   |
|----------------|-------------------------|
| Server Address | http://<IP>:8080        |
| Stream URL     | http://<IP>:8080/stream |
| Format         | Motion JPEG (MJPEG)     |
| Resolution     | 640 × 480               |
| JPEG Quality   | 70%                     |
| Frame Rate     | ~15–20 fps              |

When ball tracking is active, the stream includes a visual overlay: target color label in the top-left corner, a yellow circle drawn around the detected ball, the ball’s radius in pixels, and a white crosshair at the frame center.

File Structure and Service Management

File Layout

/home/jetson/
├── jetbot_tracker.py      # Main control script
├── jetbot_unified.py      # Manual-only version (no tracking)
└── NO_AUTOSTART           # Create this file to bypass auto-start
/etc/systemd/system/
└── jetbot-tracker.service # systemd auto-start unit

Service Commands

# Start / stop manually
sudo systemctl start jetbot-tracker
sudo systemctl stop jetbot-tracker  
# Enable / disable auto-start on boot
sudo systemctl enable jetbot-tracker
sudo systemctl disable jetbot-tracker  
# View live logs
journalctl -u jetbot-tracker -f  
# Temporary bypass (create file, remove to re-enable)
touch ~/NO_AUTOSTART
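The blog does not reproduce the jetbot-tracker.service unit itself; a minimal unit consistent with the commands and file layout above (the paths and options here are assumptions, not the deployed file) might look like:

```ini
[Unit]
Description=JetBot tracker auto-start
After=network.target
# Honor the NO_AUTOSTART bypass file described above
ConditionPathExists=!/home/jetson/NO_AUTOSTART

[Service]
Type=simple
User=jetson
ExecStart=/usr/bin/python3 /home/jetson/jetbot_tracker.py
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After editing a unit file, `sudo systemctl daemon-reload` is needed before the start/enable commands above take effect.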

Component Status Summary

Current status of all JetBot hardware and software components as of the most recent testing session:

| Component                    | Status                  |
|------------------------------|-------------------------|
| Drive motors (left/right)    | ✓ Working               |
| Camera lift + limit switches | ✓ Working               |
| RGB LEDs                     | ✓ Working               |
| OLED display                 | ✓ Working               |
| USB camera streaming         | ✓ Working               |
| PWM servos (pan/tilt)        | ✓ Working               |
| Logitech F710 controller     | ✓ Working               |
| K1 shutdown button           | ✓ Working               |
| Ball tracking (R/G/B)        | ✓ Working               |
| Battery monitor (INA219)     | ✓ Working (Bus 0, 0x41) |
| Auto-start service           | ✓ Working               |

Troubleshooting

| Issue                            | Solution                                                     |
|----------------------------------|--------------------------------------------------------------|
| Port 8080 already in use         | Run: sudo fuser -k 8080/tcp                                  |
| Camera not found                 | Check USB connection; run: ls /dev/video*                    |
| Service keeps restarting         | Stop service: sudo systemctl stop jetbot-tracker             |
| Ball not detected                | Adjust HSV color ranges in script; check lighting conditions |
| Robot oscillates during tracking | Increase CENTER_TOLERANCE or BALL_RADIUS_TOLERANCE           |
| Servos do not move               | Check PCA9685 frequency; may conflict if set to non-50 Hz    |
| Battery voltage not displayed    | Verify INA219 is on Bus 0 at address 0x41                    |

Project Background

The Yahboom JetBot is a tracked robot kit built around the NVIDIA Jetson Nano 4GB single-board computer with a custom expansion board. The manufacturer supplies a pre-built SD card image that includes drivers, demo code, and a Jupyter notebook interface.

Unfortunately, the supplied image—both on the included USB drive and downloaded from Yahboom’s website—failed to boot reliably. The image dates from approximately 2020 and appears to suffer from a combination of flash storage degradation, SD card compatibility issues with the Jetson Nano, and outdated JetPack dependencies.

Rather than continue troubleshooting the manufacturer’s image, the decision was made to start fresh with Qengineering’s Ubuntu 20.04 image for the Jetson Nano and rebuild all robot functionality from scratch. Every hardware interface—motor channels, servo mappings, I2C addresses, GPIO pin assignments, and limit switch logic—was reverse-engineered by examining Yahboom’s original Python notebooks and testing directly against the expansion board.

The result is jetbot_tracker.py: a single self-contained Python script that provides full manual control and autonomous ball tracking without requiring ROS, Jupyter, or any of the manufacturer’s software stack.

Terminal Commands

This section lists most of the terminal commands used.

Introduction

This document catalogues the primary terminal commands used during the development of the JetBot Tracker project. Commands are organized by function: from initial OS installation and system setup, through hardware discovery and dependency installation, to script development, testing, and deployment as a system service.

All commands were executed on an NVIDIA Jetson Nano 4GB running Qengineering’s Ubuntu 20.04 image, accessed either via a directly connected monitor or over SSH from a development PC on the same network.

1. OS Installation and Initial Setup

The project began with flashing a fresh Ubuntu 20.04 image to a microSD card after the manufacturer’s image failed to boot. These commands cover the initial system configuration performed immediately after first boot.

| Command | Description | Context |
|---------|-------------|---------|
| sudo dd if=/dev/zero of=/dev/sdX bs=1M count=100 | Wipe the first 100 MB of the SD card to clear old partition tables before flashing a new image | Run on host PC |
| sudo apt update | Refresh the package index from all configured repositories | First command after boot |
| sudo apt upgrade | Install all available package updates to bring the system current | Immediately after apt update |
| hostname -I | Display the Jetson Nano's IP address for SSH access | On the Nano terminal |
| ip addr show wlan0 | Show detailed network information for the WiFi interface, including the assigned IP address | Alternative to hostname -I |
| ssh jetson@192.168.1.xxx | Connect to the Jetson Nano from a remote PC over SSH | From development PC |
| ifconfig | Display all network interfaces and their IP addresses | Network troubleshooting |
| sudo nmcli device wifi connect "SSID" password "PASSWORD" | Connect the Nano to a WiFi network by name and password | Initial WiFi setup |

2. I2C and Hardware Discovery

The expansion board’s hardware interfaces were not documented in a usable form. These commands were used to discover I2C device addresses, identify the PCA9685 PWM controller, locate motor channels, find limit switches, and map servo configurations.

| Command | Description | Context |
|---------|-------------|---------|
| sudo apt install -y i2c-tools | Install I2C diagnostic utilities including i2cdetect, i2cget, and i2cset | One-time setup |
| sudo apt install -y python3-smbus | Install the Python SMBus library for I2C communication from scripts | One-time setup |
| sudo i2cdetect -y -r 1 | Scan I2C Bus 1 and display a grid of all responding device addresses (found 0x40, 0x1B, 0x70) | Hardware discovery |
| sudo i2cdetect -y 0 | Scan I2C Bus 0; used later to find the OLED (0x3C) and INA219 battery monitor (0x41) | Bus 0 devices |
| ls /dev/input/js* | Check if the Logitech F710 gamepad is recognized as a joystick device | Controller setup |
| ls /dev/video* | List available video devices to identify the USB camera device node (found /dev/video1) | Camera setup |
| python3 pca9685_diag.py | Run a custom diagnostic script that tests all 16 PCA9685 channels at 50% duty cycle, one at a time, to identify motor channel assignments | Channel discovery |
| python3 lift_motor_scan.py | Test non-motor PCA9685 channels (0–7, 12–15) to locate the camera lift motor (found Ch 12–13) | Lift motor discovery |
| sudo python3 limit_switch_test.py | Monitor GPIO pins 11 and 12 in real time to verify limit switch behavior as the camera lift is manually moved | GPIO pin mapping |
| python3 test_dpad.py | Read raw joystick events from /dev/input/js0 to determine D-pad axis numbers and value ranges | Controller mapping |

3. Python Dependencies and Package Installation

The JetBot script depends on several Python libraries for I2C communication, PWM control, GPIO access, image processing, and display output. The Jetson Nano’s ARM architecture required specific package versions in many cases.

| Command | Description | Context |
|---------|-------------|---------|
| sudo apt install python3-pip | Install the Python 3 package manager | Prerequisites |
| sudo apt install python3-dev | Install Python 3 development headers needed for compiling C extensions | Prerequisites |
| sudo apt install build-essential | Install the GCC compiler toolchain required for building native Python packages | Prerequisites |
| sudo pip3 install adafruit-circuitpython-pca9685 | Install the Adafruit PCA9685 driver library for PWM motor and servo control via I2C | Motor/servo control |
| sudo pip3 install adafruit-circuitpython-motor | Install the Adafruit motor helper library for simplified DC motor control | Motor control |
| sudo pip3 install Jetson.GPIO | Install NVIDIA's GPIO library for Jetson Nano pin access (limit switches, shutdown button) | GPIO access |
| sudo apt install python3-opencv | Install OpenCV for Python; used for camera capture, HSV color filtering, and ball detection | Computer vision |
| sudo pip3 install numpy | Install NumPy for array operations used by OpenCV image processing | Vision support |
| sudo pip3 install Pillow | Install the PIL imaging library for generating text and graphics on the OLED display | OLED display |
| sudo pip3 install adafruit-circuitpython-ssd1306 | Install the SSD1306 OLED driver for the 128×32 monochrome display on I2C Bus 0 | OLED display |
| sudo pip3 install pyserial | Install serial port library; resolved a missing dependency for the original jetbot module | Dependency fix |
| sudo pip3 install Cython | Install Cython compiler; required as a build dependency for several packages including h5py | Build dependency |
| sudo apt install libhdf5-dev libhdf5-serial-dev | Install HDF5 system libraries required for building h5py from source on ARM | Build dependency |
| pip3 list \| grep package_name | Verify a specific package is installed and check its version number | Troubleshooting |
| sudo pip3 install --upgrade pip | Update pip itself to the latest version to resolve dependency resolution issues | Maintenance |

4. Script Development and Editing

All script development was done directly on the Jetson Nano using the nano text editor over SSH. Scripts were tested interactively before being deployed as system services.

Command | Description | Context
nano ~/jetbot_tracker.py | Open the main control script in the nano text editor for editing; Ctrl+O to save, Ctrl+X to exit | Primary editing tool
nano ~/jetbot_unified.py | Edit the manual-only version of the control script (no ball tracking) | Alternate script
python3 ~/jetbot_tracker.py | Run the tracker script interactively for testing; Ctrl+C to stop | Development testing
cat > ~/test_script.py << 'EOF' … script content … EOF | Create a small test script using heredoc syntax; used frequently for quick diagnostic scripts | Quick test scripts
chmod +x ~/script_name.py | Make a Python script executable so it can be run with ./script_name.py | Permissions
Ctrl+W (in nano) | Search within the nano editor; used to jump to specific sections like controller mapping or tuning parameters | In-editor navigation
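Most of the heredoc-created quick scripts were small device readers like test_dpad.py. A minimal sketch of that kind of joystick reader, assuming the standard Linux joydev 8-byte event layout (the helper names are mine, not the original script's):

```python
import struct

JS_EVENT_FMT = "<IhBB"   # u32 time (ms), s16 value, u8 type, u8 number
JS_EVENT_SIZE = struct.calcsize(JS_EVENT_FMT)
JS_EVENT_BUTTON = 0x01
JS_EVENT_AXIS = 0x02

def parse_js_event(buf: bytes) -> dict:
    """Decode one 8-byte Linux joystick (joydev) event."""
    time_ms, value, etype, number = struct.unpack(JS_EVENT_FMT, buf)
    if etype & JS_EVENT_AXIS:
        kind = "axis"      # D-pads usually report as axes: -32767 / 0 / 32767
    elif etype & JS_EVENT_BUTTON:
        kind = "button"
    else:
        kind = "other"
    return {"time_ms": time_ms, "value": value, "kind": kind, "number": number}

def watch(device: str = "/dev/input/js0") -> None:
    """Print decoded events forever; run on the robot with the dongle in."""
    with open(device, "rb") as js:
        while True:
            print(parse_js_event(js.read(JS_EVENT_SIZE)))
```

Pressing the D-pad while watching the output reveals which axis numbers it occupies and what value range it reports.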

5. Process and Port Management

During iterative development, the HTTP camera stream on port 8080 frequently needed to be released before restarting the script. These commands became the standard pre-launch sequence.

Command | Description | Context
sudo fuser -k 8080/tcp | Kill any process holding port 8080 (the camera stream); essential before restarting the script | Before each test run
sleep 1 | Pause briefly after killing a port to let the OS fully release the TCP socket | Used with fuser
pkill -9 python3 | Force-kill all running Python 3 processes; used when the script becomes unresponsive | Emergency stop
`ps aux | grep python` | List all running Python processes to check if old instances are still lingering | Diagnostics
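The fuser-then-sleep pre-launch sequence can be wrapped in a small Python helper so it runs automatically at the top of a test script. This is a sketch under my own naming, not the sequence as actually scripted:

```python
import subprocess
import time

def fuser_cmd(port: int) -> list:
    """Build the fuser invocation used throughout this log."""
    return ["fuser", "-k", f"{port}/tcp"]

def free_port(port: int = 8080, settle_s: float = 1.0) -> None:
    """Kill whatever holds the TCP port, then give the OS a moment to
    fully release the socket (mirrors 'sudo fuser -k 8080/tcp; sleep 1')."""
    subprocess.run(fuser_cmd(port), check=False)  # no error if port is free
    time.sleep(settle_s)

# Before each test run on the robot: free_port(8080)
```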

6. systemd Service Management

Once the script was stable, it was deployed as a systemd service so it would start automatically on boot. A bypass mechanism was added so a monitor could be connected for maintenance without the script grabbing the camera and ports.

Command | Description | Context
sudo nano /etc/systemd/system/jetbot-tracker.service | Create or edit the systemd unit file that defines the auto-start service | Service setup
sudo systemctl daemon-reload | Tell systemd to re-read all unit files after creating or editing a service definition | After editing .service
sudo systemctl enable jetbot-tracker | Enable the service to start automatically at boot | Deployment
sudo systemctl disable jetbot-tracker | Prevent the service from starting at boot | Maintenance
sudo systemctl start jetbot-tracker | Start the service immediately without rebooting | Manual start
sudo systemctl stop jetbot-tracker | Stop the running service; essential before editing the script or running interactively | Before editing
sudo systemctl status jetbot-tracker | Check whether the service is running, and show recent log output | Diagnostics
journalctl -u jetbot-tracker -f | Follow the service's log output in real time; shows print() output and errors | Live debugging
journalctl -b | Show all log messages from the current boot session | Boot diagnostics
`systemd-analyze blame | grep jetbot` | Show how long the jetbot service took to start during boot | Performance check
touch ~/NO_AUTOSTART | Create the bypass file that prevents the service from launching the script on next boot | Maintenance mode
rm ~/NO_AUTOSTART | Remove the bypass file to re-enable auto-start | Resume normal mode
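The unit file itself is not reproduced in this log; a plausible sketch of such a unit, using systemd's own condition mechanism for the NO_AUTOSTART bypass, might look like the following (the paths, user name, and bypass mechanism are assumptions, not the unit actually deployed):

```ini
# /etc/systemd/system/jetbot-tracker.service  (illustrative sketch)
[Unit]
Description=JetBot ball tracker
After=network.target
# Skip startup entirely when the maintenance bypass file exists
ConditionPathExists=!/home/jetson/NO_AUTOSTART

[Service]
User=jetson
ExecStart=/usr/bin/python3 /home/jetson/jetbot_tracker.py
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

With `ConditionPathExists=!`, the service is silently skipped (not failed) when the bypass file is present, which is exactly the behavior wanted for connecting a monitor during maintenance.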

7. System Shutdown and Power Management

Proper shutdown is important on the Jetson Nano to prevent SD card corruption. These commands were used for clean shutdowns before switching to battery power.

Command | Description | Context
sudo systemctl stop jetbot-tracker | Stop the control script to release the camera, ports, and GPIO cleanly | Before shutdown
pkill -9 python3 | Ensure no Python processes remain holding hardware resources | Before shutdown
sync | Flush all pending filesystem writes to the SD card | Before shutdown
sudo shutdown -h now | Perform a clean system shutdown; safe to remove power after the green LED stops blinking | Power off
sudo reboot | Restart the system; used after editing the systemd service or changing configuration files | Configuration changes

8. Network and Remote Access

The Jetson Nano was primarily accessed over SSH from a development PC. The camera stream was viewed in a web browser on the same network.

Command | Description | Context
ssh jetson@192.168.1.164 | Connect to the Nano over SSH using its static IP address | Primary access method
hostname -I | Quickly display the Nano's current IP address | Verify IP
nmcli device status | Show the status of all network interfaces (WiFi connected, Ethernet active, etc.) | Network diagnostics
iwconfig | Display WiFi interface details including signal strength and connected network name | WiFi diagnostics
http://192.168.1.164:8080 | URL to view the live camera stream in any web browser on the same network | Opened in browser

9. Common Troubleshooting Commands

These commands were used repeatedly during development to diagnose hardware issues, resolve port conflicts, and verify that components were functioning correctly.

Command | Description | Context
sudo i2cdetect -y -r 1 | Re-scan the I2C bus to verify devices are still responding after a power cycle or wiring change | Hardware check
ls /dev/video* | Verify the USB camera is detected; if /dev/video1 disappears, the camera has disconnected | Camera check
ls /dev/input/js* | Verify the Logitech gamepad wireless dongle is connected and recognized | Controller check
sudo fuser -k 8080/tcp | Release port 8080 when getting "Address already in use" errors | Port conflict
`dmesg | tail -20` | Show the most recent kernel messages; useful for diagnosing USB device connection issues | Hardware errors
`dmesg | grep -i usb` | Filter kernel messages for USB-related events to debug camera or controller detection | USB diagnostics
lsusb | List all connected USB devices to verify the camera and gamepad dongle are present | USB device check
`pip3 list | grep adafruit` | Verify Adafruit libraries are installed and check version numbers | Package check
python3 -c "import cv2; print(cv2.__version__)" | Quick check that OpenCV is installed and importable | Package check
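Several of the device checks above can be combined into one quick preflight script run before each session. A minimal sketch (the function name and the exact patterns are my own, not part of the JetBot software):

```python
import glob

def preflight(patterns=("/dev/video*", "/dev/input/js*")) -> dict:
    """Map each expected device pattern to the device nodes actually present.
    An empty list for a pattern means that device is missing."""
    return {p: sorted(glob.glob(p)) for p in patterns}

if __name__ == "__main__":
    for pattern, found in preflight().items():
        print(f"{pattern}: {', '.join(found) if found else 'MISSING'}")
```

Running it after plugging in the camera and gamepad dongle confirms both device nodes exist before the main script tries to open them.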

10. Repository and File Management

The manufacturer’s original code repository was cloned early in the project to study the Yahboom JetBot’s hardware mappings and original control logic.

Command | Description | Context
git clone https://github.com/YahboomTechnology/JetBot-AI-Robot-Car.git | Clone the Yahboom JetBot repository to study their original motor mappings, servo code, and notebook examples | Reference code
ls ~/JetBot-AI-Robot-Car/ | List the cloned repository contents including manuals, code samples, and hardware documentation | Browsing reference
cat ~/jetbot_tracker.py | Display the entire script in the terminal for review or to copy into another tool | Code review
cp ~/jetbot_unified.py ~/jetbot_unified_backup.py | Create a backup of the script before making major changes | Version safety

Mobile Robots Blog 1 

February 1, 2026

Some Background

Until a few years ago, mobile robots were the realm of research, special applications (moon rovers), robot competitions (FIRST, VEX), industrial labs (Asimo – Honda) and interested hobbyists. There is of course a much longer history of mobile robots. A movie example is Maria from ‘Metropolis’ in 1927 (History of robots – Wikipedia). A functioning example is ‘Eric’, constructed to open the Exhibition of the Society of Model Engineers at London’s Royal Horticultural Hall in 1928 (Eric (robot) – Wikipedia). The idea that mobile robots would carry out work done by humans is also not new, as illustrated in the 1929 Le Petit Inventeur article (1928 – Eric Robot – Capt. Richards & A.H. Reffell (English) – cyberneticzoo.com).

My interest in mobile robots has come from building or buying hobbyist robots for classes for middle and high school students, and from participating for a number of years with FIRST Robotics teams. These ranged from LEGO-based robots to the 120-pound robots constructed for FIRST Robotics competition. The FIRST Robotics robots combine autonomous and teleoperated functions and are uniquely designed each year to meet the requirements of the competition rules. This interest evolved into preparing for and teaching an engineering course on mobile robots for university continuing education.

There are a number of important/essential contributors to the evolution toward today’s mobile robots.

  • The development and application of fixed robots for manufacturing. This development has resulted in continuing advances in motors, servos, hardware configurations, controllers and computer functions which lay the groundwork for mobile robot advances.
  • The rapid advances in machine learning (artificial intelligence) methods including Large Language Models, neural networks and model training functions.
  • The development of small, relatively inexpensive computers that are capable of running control, navigation and AI algorithms including large language models.
  • Very large investments by companies and governments, particularly China, in mobile robot development.
  • Increasing actual and expected application of mobile robots in a wide variety of business processes (farming, warehousing, assembly, security, medical), personal care and the military.

As interest in mobile robots has grown and their capabilities have rapidly increased, vendors of research, educational and hobby robots have also improved the functionality of their products. This includes AI functions for object and face recognition, navigation functions, integration of large language models and motion control. These robots utilize the latest computing technology (Nvidia Jetson Orin NX) and navigation sensors (LIDAR and stereo cameras). Many are in the $500 to $1500 price range. While education and hobby products have rapidly increased their capabilities, some industrial/commercial mobile robot providers have developed products which are also accessible for the education, hobby and research lab market. Two specific examples are the Unitree GO quadruped and the Unitree G1 humanoid robot. Still expensive but possible.

Why This Blog On Mobile Robots

I have spent a lot of time working on mobile robots. This includes building mobile robots the size and performance of the FIRST Robotics robots as well as many smaller ones. Much more time has been spent programming the various computers used and more recently using primarily Anthropic’s Sonnet 4.5 (Claude) for programming and education. Besides the mobile robots built I’ve purchased a number of the more advanced education/hobby robots which include essentially all the capabilities of industrial/commercial robots. My objectives in putting some of what I’ve built, bought and learned into a blog are:

  • To document some of the basic information I’ve accumulated for my own use as future reference
  • To (I hope) save some time for readers who are interested in mobile robots by providing information on things I sometimes spent hours trying to understand and apply
  • To learn from others their experiences

At this point I don’t know what sequence I will be using for documentation. I’ll start by describing some of the robots I have. As caveats for the material provided:

  • I claim no unique knowledge; I’m sure a lot of what I spent much time on others already know
  • Any code/script/commands provided are ‘as is’; I can say that what is provided worked at least once on my computer system and for the robots I have
  • I have spent time working to understand the operating instructions for many robots from a number of vendors; there have been problems with documentation and code, some of which of my own making and some are vendor issues; as these are discussed it is not to be critical of the vendors; many of the robots are complicated systems and getting everything right is no small task

Topics, Mobile Robots, Components and Software to be Discussed

A list of the mobile robots I have is given in Table 1. All of these have run at sometime. They use computers ranging from the Arduino Uno to the Jetson Orin NX. There are many controllers used. Most programming is in Python. Other languages are Arduino and graphical (like Scratch). The Python work is done with one of the Ubuntu versions. Many of the more recent robots use ROS2 for control and Gazebo for simulation. An objective in preparing the most recent course on mobile robots was to have an illustration of many/all the different types of locomotion (Ackerman steering, Mecanum Steering, tank steering … tracks, wheels … different suspension systems). The application of machine learning / artificial intelligence is also a part of the operation of a number of the robots. I have done a good deal of work with quadrotors/drones (mobile robots) and will discuss them. I’ve done nothing with water-based robots. The robots listed represent work over about 15 years. Some early ones are still applicable although in some cases the computers and controls used are no longer supported (most of these have been replaced by newer hardware). A few are more in the RC car category and are included as they represent a specific type of suspension or locomotion.

No. | Robot Name | Description | Computer | Locomotion | Date Built / Purchased
1 | Unitree GO2 | Quadruped with LIDAR, phone app | Proprietary | 4 legs, parasagittal configuration | 7/2025 – purchased
2 | HiWonder JetAuto | Arm, stereo camera, LIDAR, ROS2, phone app | Jetson Orin Nano | 4 wheels, Mecanum | 6/2024 – purchased
3 | NQD Robotic Dog | RC, unique drive system | None | 8 Mecanum wheels in 4 sets of 2, tank steering | 8/2025 – purchased
4 | Redcat Racing Ridgerock 1:10 | RC, 3 all-wheel-steering options, suspension | None | 4 wheel steering | 9/2025 – purchased
5 | SunFounder GalaxyRVR | Rocker suspension, 6 wheel drive | Arduino Uno | 6 wheels, tank drive | 6/2025 – purchased
6 | Hiwonder MentorPi | Ackermann steering, ROS2, stereo camera, LIDAR, phone app | RaspberryPi 5 | 4 wheels, rear wheel drive, front wheel steering | 4/2025 – purchased
7 | Yahboom MicroROS | LIDAR, microROS2, virtual machine control, phone app | RaspberryPi 5, ESP32 | 4 wheels, tank steering | 6/2025 – purchased
8 | Yahboom MicroROS Self-balancing | 2 wheel balancing, LIDAR, microROS, phone app | ESP32, STM32 controller | 2 wheels, tank steering | 4/2025 – purchased
9 | Tesla Model 3 | Multiple cameras, full self driving | Proprietary | 4 wheels, Ackermann steering | 12/2023 – purchased
10 | Tracked robot with gripper | Gripper, controller display | VEX V5 | 2 tracks, tank steering | 2/2020 – built
11 | Wheeled robot with swerve steering | Swerve steering, 4 wheel drive | VEX V5 | 4 wheels, independently steered and driven | 5/2019 – built, disassembled
12 | Hiwonder AINEX Humanoid | Humanoid, 24 DOF, ROS2, arm grippers, phone app | RaspberryPi 5 | 2 legs, balancing | 5/2024 – purchased
13 | Hiwonder ROSPug Quadruped | Quadruped, LIDAR, 3 servos per leg, phone app | Jetson Nano | 4 legs, parasagittal configuration | 4/2024 – purchased
14 | Hiwonder JetHexa | Quadruped, ROS2, LIDAR, stereo camera, phone app | Jetson Nano | 4 legs, sprawl configuration | 5/2024 – purchased
15 | ReachEDU Mekamon Quadruped | Quadruped, AR games (legacy), phone app | PIC32 | 4 legs, sprawl configuration | 12/2018 – purchased
16 | New Bright iRobot 710 Kobra | 4 tracks, front tracks adjustable, arm, gripper, RC | None | 4 tracks, tank steering | 3/2020 – purchased
17 | Yahboom JetBot | 2 tracks, camera lift, phone app (legacy) | Jetson Nano | 2 tracks, tank steering | 3/2021 – purchased
18 | TurtleBot 3 | 2 wheels, idler, LIDAR, ROS2 | RaspberryPi 4B | 2 wheels with idler ball, tank steering | 7/2022 – purchased
19 | Independent suspension | 6 wheels, each powered, with individual suspension | VEX V5 | 6 wheels, tank steering | 3/2023 – built
20 | Omnidrive | 4 omnidrive wheels, each individually powered, all directions | VEX V5 | 4 omniwheels | 2/2017 – built
21 | Large frame 6 wheels | 6 wheels, 2 omni and 4 solid, 2’x2’ | VEX V5 | 6 wheels, tank steering | 10/2020 – built
22 | Hiwonder uHandPi | 4 finger/thumb robot hand, 6 servos; not mobile but part of robot study | RaspberryPi 5 | (none) | 6/2025 – purchased
23 | Yahboom STM32 Smart Robot Kit | 4 wheels, uses STM32 for control | STM32 | 4 wheels, tank steering | 11/2025 – purchased
24 | DJI Robomaster EP Core | 4 wheels, arm, gripper, CAN bus, phone app | Proprietary | 4 wheels, Mecanum steering | 4/2020 – purchased
25 | DJI Mavic 3 | Quadrotor, cameras, photography, obstacle detection | Proprietary | Quadrotor | 10/2023 – purchased
26 | DJI Avata 2 | Quadrotor, one-hand controller, directed flight | Proprietary | Quadrotor | 4/2024 – purchased
27 | DJI FPV | Quadrotor, fast (80 mph) | Proprietary | Quadrotor | 5/2022 – purchased
Table 1 – List of Mobile Robots

I also have a number of small mobile robots that are functional and use either an Arduino or RaspberryPi 4 computer. As noted, several years ago I designed and built two FIRST Robotics scale robots; these used the National Instruments RoboRIO computer. One had four wheels and one had six; both used tank steering. They were too big and too heavy for home use, so they were donated to a FIRST Robotics team.

As computers are a fundamental component of mobile robots, I have evaluated several as shown in Table 2.

No. | Computer | OS/Language | Description
1 | Arduino Uno | Arduino | General application
2 | Sparkfun ESP32 | Arduino | General application
3 | ESP32 | FreeRTOS | Robot control, microROS
4 | RaspberryPi 4B | Ubuntu, Python3 | TurtleBot3, ROS2
5 | RaspberryPi 5 | Ubuntu, Python3 | Mobile robots, ROS2
6 | Jetson Nano | Ubuntu, Python3 | Mobile robots, direct control, ROS2
7 | Jetson Orin Nano | Ubuntu, Python3 | Mobile robots, ROS2
8 | Jetson Orin NX | Ubuntu, Python3 | Mobile robots, ROS2
9 | Geekom NUC | Windows, Ubuntu, Python3 | Evaluation, planned for a mobile robot, ROS2
10 | VEX V5 | Proprietary / Blocks, Python, (C++) | Mobile robots
11 | Robotis CM 530 | Proprietary / Proprietary | Mobile robots
Table 2 – Mobile Robot Computers

Over time a number of mobile-robot-applicable computers and support systems have come and gone. Modern Robotics had a complete computer and sensor system but ended the product line in 2019 after being bought by Boxlight. LEGO Mindstorms was a computer and support system; production ended in 2022. The National Instruments RoboRIO is available only for FIRST Robotics competitions, at least through 2026.

Finally, to complete the ‘lists’ of mobile robot hardware are the controllers. These devices are almost always used to support the interface of the computer to the robot’s functions: motors, servos, communications (I2C, RS485), binary I/O, A/D conversion, and power management/distribution. The table lists the primary ones used in the purchased robots, plus ones I have obtained to see how they might be used.

No. | Name | Description
1 | Robotis OpenCR | General use but specific to the Dynamixel servo series, USB connected
2 | Robotis Dynamixel Shield | Arduino shield, primarily supports the Dynamixel servo series
3 | OpenCM 485 Expansion Board | Requires the OpenCM 9.04 controller board; primarily supports the Dynamixel servo series
4 | Yahboom ROS robot control board (STM32) | Motor and encoder, PWM servo, IMU, SBus, CAN bus, power distribution, USB connected
5 | Yahboom MicroROS controller board for RaspberryPi | Motor and encoder, I2C, SBus, USB connected
6 | JetBot Expansion board | Motor control, PWM, SBus, I2C, power, pin connected (discontinued)
7 | Hiwonder 6 Channel Digital Servo Tester | Used with the robotic hand
8 | Hiwonder Serial Bus Servo Controller | USB connection; tests Hiwonder serial bus servos (note: the protocol for serial bus servos is not standardized)
9 | Hiwonder RaspberryPi 5 Expansion Boards A, B, C | Board series: A – I2C, PWM; B – bus servo; C – PWM; used with specific products
Table 3 – Controller and Expansion Boards

In addition to the three hardware areas in the tables, there is an array of sensors. The primary ones for the more recent robots are LIDARs (light detection and ranging), stereo (depth) cameras, and USB- and ribbon-connected cameras. These will be discussed in other blogs.

Future blogs will cover specifics of some of the mobile robots (i.e. TurtleBot3, JetBot), LIDAR mapping and SLAM, controller applications, ROS2, machine learning/AI applications, communication and others.

Mobile Robot – Arduino Sensors (1) – BME 280

Comment on Getting Started Again

When I started this blog site a couple of years ago, I had a plan, but time and other interests got in the way. I’m ready to try this again with perhaps a little less rigorous view of what I want to write about. As before, my primary mobile robot blog objectives are to relate descriptions and comments on 1) the range of mobile robots I am working on; 2) supporting devices such as sensors, actuators, cameras … and 3) types of applications. My prior plan was to specifically work through a set of example robot types and implementations. At least for the present, my plan for topics is to describe what I’m working on and document descriptions, resources, things that work and don’t work and uses. Currently my interest and work are on the many, many sensors that are available and supported by the Arduino computers. Most will relate to mobile robots in some way; some will be applicable to topics outside of robotics. Sometimes topics will be in an orderly sequence and sometimes simply as I get to them. For a reader this is not ideal … but that’s the way I’m doing it.

This blog is essentially about things learned and implemented as a retired electrical engineer with at least a moderate amount of time and monetary resources. My general view is that if I haven’t at least implemented an example (not necessarily a pretty example) and learned from it, I won’t include it. That being said, only so much time and money is available and at some point, the effort on a topic (robots) becomes good enough (i.e., that’s all I care to know) or I have run up to the edge of my expertise or that which can be reasonably learned.

Nothing I include is intended to represent a basis for commercial or industrial design or production.

Where I might say a device or software doesn’t work, that conclusion applies only within my scope of development and testing. It may be, and quite likely is, that it works in some other context and that I can’t make it work in mine or don’t want to spend any more time trying. I will try to relate, to a reasonable level, the basis of my conclusion.

Beginning on Sensors

This blog is titled ‘… Sensors (1) – BME 280’. One of my activities is volunteering as a teaching naturalist at the Wehr Nature Center, a unit of the Milwaukee County Park system. A possible direction of instruction there is the use of small computers and, primarily, environmental sensors both to study aspects of nature and to teach the basics of computers, sensing and data acquisition. Investigating this idea led to the accumulation of many different sensors which are applicable to the Nature Center’s objective. Many of these same sensors are also applicable to the control, guidance and mission of mobile robots. The primary small computer used is the Arduino in a number of forms. Sensor acquisition is an ongoing process. There are lots of sensors, and there are a number of base suppliers or labels. These products are then sold through suppliers such as Amazon, Robot Shop, Digi-Key and others. For most sensors, the base suppliers implement their version of the ‘chip’ or ‘package’ level sensor, which comes from an electronic device manufacturer such as Texas Instruments, Bosch and others.

BME 280 Sensor Example

The BME 280 is manufactured by Bosch (https://pdf1.alldatasheet.com/datasheet-pdf/view/1132060/BOSCH/BME280.html). It is a MEMS device and measures temperature (0 °C to 65 °C), relative humidity (20 to 80 % RH) and pressure (300 to 1100 hPa; roughly 4 to 16 psi). There are of course lots of qualifiers and secondary specifications in the 55-page datasheet. The supply voltage is 1.7 to 3.6 V. Output communication is via I2C (Inter-Integrated Circuit) or SPI (Serial Peripheral Interface). The default I2C address is 0x76.
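The psi figures follow directly from the hPa range; a quick conversion check (using 1 psi = 68.9476 hPa, a standard conversion factor, not a value from the datasheet):

```python
HPA_PER_PSI = 68.9476  # 1 psi = 68.9476 hPa

def hpa_to_psi(hpa: float) -> float:
    """Convert pressure in hectopascals to pounds per square inch."""
    return hpa / HPA_PER_PSI

# The 300-1100 hPa datasheet range works out to roughly 4.4-16 psi,
# and standard sea-level pressure (1013.25 hPa) to about 14.7 psi.
for hpa in (300.0, 1013.25, 1100.0):
    print(f"{hpa:7.2f} hPa = {hpa_to_psi(hpa):5.2f} psi")
```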

Adafruit (https://www.adafruit.com/product/2652) and others put the Bosch BME280 on a printed circuit board with supporting power and communication electronics and sell it as a sensor that can be mounted on and connected to an Arduino or other computer. I don’t have price data on the raw BME280 sensor; however, the Adafruit board is approximately $15 with the supportive additions. Adafruit also provides (usually) loadable software for using BME280 functions in an Arduino (or other computer) program. The software provides effective user program commands through an interface to the BME280 commands plus the driver (Bosch).

There will be more on the software handling later.

Other similar suppliers are SeeedStudio/Grove (https://www.seeedstudio.com/Grove-BME280-Environmental-Sensor-Temperature-Humidity-Barometer.html), WaveShare (https://www.waveshare.com/wiki/BME280_Environmental_Sensor), SparkFun (https://www.sparkfun.com/products/13676), HiLetgo (http://hiletgo.com/ProductDetail/1953483.html), DFRobot Gravity and Fermion (https://www.dfrobot.com/search-bme280.html), and there are probably others. Each vendor provides application software which is broadly similar and may be interchangeable, but may include unique functions for that vendor. In addition, other software writers may provide their own versions. Generally, these are available on GitHub.

Arduino Test Platform

As a base for testing environmental sensors as well as other aspects of Arduino devices, a board was laid out and built which included the following: Arduino Mega 2560 Rev 3; Gravity IO Sensor Shield; 4×4 digital input keypad; I2C 4×20 LCD display; I2C real-time clock; I2C 3-color LED; cadmium sulphide analog light sensor (upper left corner); mechanical switch (lower left corner); analog temperature sensor (middle right); ADXL345 Adafruit I2C accelerometer board (under the BME 280); and an Adafruit BME280 board.

This board was constructed to provide a prototype student exercise board for learning about basic computer operation, data collections and sensor monitoring. The IO sensor shield board includes a micro-SD socket for data acquisition. There will be more discussion in the future of the components and configuration of this and related computer boards. The BME280 board is connected to the Arduino through the I2C socket hub in the prototyping section of the IO sensor shield.

Arduino Software for the BME280 and Testing

At the present (12/5/22) I have loaded Arduino IDE 2.0.3 (https://docs.arduino.cc/software/ide-v2/tutorials/getting-started/ide-v2-downloading-and-installing). A search of the IDE library results in 18 includes supporting the BME280. There are includes by Adafruit and SparkFun. The rest are names of Arduino software support organizations or individuals. Adafruit and SparkFun are installed.

To make the BME280 test more complete (and interesting), additional devices of the test board are used. The keypad for command; the display for information; the IO sensor shield SD card for data acquisition; and the RTC for timing. The process will be to build up to the final program by adding each of the components to complete the test board.

4×20 LCD I2C Display

‘Tools/Manage Libraries…’ opens the Library Manager. Searching on ‘LiquidCrystal’ gives a list of liquid crystal display includes. The main one is ‘LiquidCrystal by Arduino, Adafruit’, version 1.0.7 (latest as of 12/4/22). This is installed and shows up in ‘Sketch/Include Library’ as ‘LiquidCrystal.h’. However, this include is for a display connected through digital selection pins. For an I2C-connected display, search for an I2C include instead. The result is about 33 includes. Which is best? At https://www.arduino.cc/reference/en/libraries/liquidcrystal-i2c/ the Arduino library include is by Frank de Brabander; searching for ‘Frank’ finds this include. This web site lists the include’s functions. When installed it is listed as ‘LiquidCrystal_I2C.h’.

Finally, a bit of code is needed to test the I2C-controlled liquid crystal display. There are a number of options. One is to go to ‘More info’ in the Library Manager list, which leads to the GitHub site. Download the ‘Code’, unzip it, and within that there are example sketches. One more thing: in this particular case the examples show up as .pde files. This is an earlier Arduino file type; the present file type is .ino. The .pde files can be opened by the Arduino IDE and will be saved as .ino files. For a simple test the ‘Hello World’ sketch is used. It is downloaded to the Arduino (Mega 2560 in this case) and runs successfully. Note that the ‘LiquidCrystal_I2C.h’ default I2C address is 0x27. In this simple program the only functions used are to place the cursor and display a specified line of text.

Through the preceding, the first test board device is tested and some of the process for setup is explained.

4×4 Keypad

The 4×4 keypad has 10 pins; the outermost pin on each end is not used. The right set of 4 pins is for rows. Pin 1 is the low-number row pin and, for the test board, is connected to DIO 33. The right-most pin of the column set of 4 pins is connected to DIO 37. Note that any set of DIO pins can be used; the only issue is getting them correctly connected and listed in the program. Again, there are lots of keypad include options. In the libraries list is a ‘Keypad.h’ include. A simple program is at https://www.instructables.com/Arduino-Keypad-4×4-Tutorial/. In this sketch the Serial Monitor is used to display the key press.

Data Acquisition on SD Card

The next function to work on is saving the BME 280 data on a micro SD card. The IO Sensor Shield has a micro SD slot. More specifically this is the DFRobot Mega IO Expansion Shield V2.3 at https://wiki.dfrobot.com/Mega_IO_Expansion_Shield_V2.3__SKU_DFR0165_. A guide to SD card data storage is at https://docs.arduino.cc/learn/programming/sd-guide. The example at this site is used. The sketch uses the SPI.h and SD.h include files; both are in the Sketch/Include Library list. Communication with the SD card is via the SPI protocol. The test sketch initializes the file, waits while the file is opened, writes some descriptive text, writes 20 numbers to the file, and then closes the file. Once the sketch has run, the SD card can be removed, inserted in a card reader and the content verified. If it is a data file, it can be transferred to an Excel file for analysis or other processing.

Get Time (RTC)

When data is read is of obvious importance. Thus, the next step is to find and test a sketch that reads real-time clock (RTC) data. The RTC used is the Gravity: I2C SD2405 RTC Module at https://www.dfrobot.com/product-1600.html. This RTC counts seconds, minutes, hours, day, date, month, and year, with leap-year compensation valid up to 2100. Its internal battery is specified to last 5 to 8 years. On the test board the RTC is under the LCD. The RTC include and sketch information is at https://wiki.dfrobot.com/Gravity__I2C_SD2405_RTC_Module_SKU__DFR0469. In this case the include file must be loaded from the GitHub reference: download the ‘Code’ as a zip file and ‘Extract All’; open the ‘Gravity-I2C-SD2405-RTC-Module-master’ folder and use Sketch/Include Library/Add .ZIP Library… to add ‘GravityRtc.zip’. In GravityRtc/Examples is a RtcTime_Test.ino sketch. The include is ‘GravityRtc.h’. The RTC can be set manually in the sketch if needed.

Read BME 280 Data

The next step in this exercise is to read data from the BME 280. The BME 280 sensor board used is the Adafruit BME280 as previously shown. The BME 280 include is obtained by installing the Adafruit BME280 Library version 2.2.2 from the Library Manager. Test sketches are available from several sources; one is https://randomnerdtutorials.com/bme280-sensor-arduino-pressure-temperature-humidity/. This sketch is set up for either SPI or I2C communication with the sensor; I2C is the default. As noted before, the BME 280 include is contained in the Adafruit library. Running the test sketch results in readings of temperature, pressure, relative humidity and altitude. Note that the altitude is based on the default ambient (sea-level) pressure. A requirement for use is that in Setup the BME 280 must be initialized (begin); after that, data can be read as needed.
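The altitude reading is computed from the measured pressure and an assumed sea-level reference pressure using the international barometric formula; this is why the value shifts with the weather unless the reference is updated. A quick sketch of the relation (in Python for illustration, though the library itself is Arduino C++):

```python
def pressure_to_altitude(p_hpa: float, p0_hpa: float = 1013.25) -> float:
    """Altitude in meters from measured pressure, via the international
    barometric formula with an assumed sea-level reference pressure."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

# At the default reference pressure the computed altitude is zero;
# lower measured pressure means higher computed altitude.
print(pressure_to_altitude(1013.25))
print(pressure_to_altitude(900.0))
```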

ADXL 345 Accelerometer

The ADXL 345 accelerometer IC is mounted on an Adafruit circuit board. This is a basic 3-axis acceleration sensor. It has 4 sensitivity ranges from +/- 2G to +/- 16G and supports output data rates from 10Hz to 3200Hz. On the prototype board it is mounted under the BME 280. Communication with the Mega 2560 is via I2C. The Adafruit reference is at ‘https://learn.adafruit.com/adxl345-digital-accelerometer/overview’. The include is in the Library Manager under Adafruit. A sketch reference is at ‘https://github.com/adafruit/Adafruit_ADXL345’. The example sketch allows selection of the data rate and range.
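For reference, in the ADXL345's full-resolution mode the scale factor stays at roughly 4 mg per LSB across all four ranges, so converting raw counts to m/s² is a one-liner. This is a host-side sketch of the conversion a driver performs; the constant names are mine.

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

constexpr double MG_PER_LSB = 0.004;    // ~4 mg per count in full-res mode
constexpr double GRAVITY    = 9.80665;  // standard gravity, m/s^2

// Convert a raw signed axis reading to acceleration in m/s^2.
double countsToMs2(int16_t raw) {
    return raw * MG_PER_LSB * GRAVITY;
}
```

A reading of 250 counts on one axis is 1.0 g, i.e. about 9.81 m/s², which is what a stationary board shows on its vertical axis.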

DS18B20 Waterproof Digital Temperature Sensor

The DS18B20 digital temperature sensor is based on the Dallas 1-Wire protocol. A reference is ‘https://www.adafruit.com/product/381’. While the sensor is good up to 125°C, the cable is jacketed in PVC, so Adafruit suggests keeping it under 100°C. Because the interface is digital, there is no signal degradation even over long distances. These 1-Wire digital temperature sensors are fairly precise (±0.5°C over much of the range) and can give up to 12 bits of resolution from the onboard analog-to-digital converter. They work with any microcontroller using a single digital pin, and multiple sensors can share the same pin: each one has a unique 64-bit ID burned in at the factory to differentiate them. They are usable with 3.0-5.0V systems. The only downside is that the Dallas 1-Wire protocol is somewhat complex and requires a fair amount of code to parse the communication.

A library reference is ‘https://github.com/milesburton/Arduino-Temperature-Control-Library’. The Library Manager has an include for the DS18B20 by Miles Burton. Installing the DallasTemperature include also installs a OneWire.h include. The GIKFUM sensor’s data line is connected to DIO 46. The ‘Simple.ino’ example from the GitHub download is used. Note that the sensor cable connections are not the same across DS18B20 products; one product used reversed power/ground connection pins.
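Under the hood the DS18B20 reports temperature as a signed value in sixteenths of a degree at 12-bit resolution. The DallasTemperature library hides this, but the core conversion is just the following host-side sketch:

```cpp
#include <cassert>
#include <cstdint>

// The DS18B20 scratchpad holds temperature as a signed 16-bit value
// in units of 1/16 degree C (at 12-bit resolution).
double rawToCelsius(int16_t raw) {
    return raw / 16.0;
}
```

The datasheet's example values line up: 0x0191 (401) reads as +25.0625°C, and -8 reads as -0.5°C.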

Blinkm RGB LED

Blinkm is an I2C-controlled three-color (RGB) LED device. Each color can be controlled over a 0 – 255 range. Other options are blinking and color ramping. Blinkm is a product of ThingM, which has a number of other LED control products. The Library Manager has no Blinkm include. At the ‘https://thingm.com/products/blinkm’ site I found nothing that would allow Arduino control. A Blinkm article is at ‘https://todbot.com/blog/2011/03/22/blinkm-smart-led-as-the-smallest-arduino/’ but there is no simple .h or .ino to load and run. Going to ‘https://github.com/todbot/BlinkM-Examples’ turned up a number of application programs for the Blinkm. These require loading BlinkM_funcs.h into the IDE library. At ‘https://docs.arduino.cc/software/ide-v1/tutorials/installing-libraries’ is a discussion of Arduino IDE libraries, how to load third-party includes, and a reference to the ‘Arduino Library Reference‘. The ‘todbot’ GitHub didn’t have a .zip to load. Another source, ‘https://github.com/darach/BlinkM’, was found, which resulted in a successful ‘BlinkM_Arduino_main’ load. When applied to a sketch, ‘BlinkM_Arduino_main’ shows up as ‘BlinkM.h’.
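For context on what BlinkM_funcs.h / BlinkM.h actually send: BlinkM commands are short I2C packets consisting of a one-letter opcode plus arguments ('n' sets an RGB color immediately; 'c' fades to one). A host-side sketch of building such a packet follows; the function name is mine.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Build the byte sequence for BlinkM's "set color now" command:
// opcode 'n' followed by the red, green and blue levels (0-255 each).
// On the Arduino these bytes go out over Wire after addressing the device.
std::vector<uint8_t> blinkmSetColorNow(uint8_t r, uint8_t g, uint8_t b) {
    return { 'n', r, g, b };
}
```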

Cadmium Sulphide (CdS) Light Sensor

A photoresistor (cadmium sulphide, CdS) light sensor is located at the left front of the Mega 2560. This is an analog sensor with a binary level-detect output set by the potentiometer. The sensor is connected to A15; the level output is not connected. A photoresistor include is in the Library Manager and was installed. At ‘https://github.com/alextaujenis/RBD_LightSensor’ is a sketch for monitoring. The include is listed as ‘<RBD_LightSensor.h>’. The sketch is very simple: it just reads the analog input and scales it to a percentage.
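The percentage scaling is simply the 10-bit ADC reading mapped onto 0 – 100. A host-side sketch of the arithmetic, assuming the Mega's 0 – 1023 analogRead range:

```cpp
#include <cassert>

// Scale a 10-bit ADC reading (0..1023) to a 0..100 percentage.
// The 100L forces long arithmetic so the product cannot overflow
// a 16-bit int on the Mega.
int lightPercent(int analogReading) {
    return static_cast<int>(analogReading * 100L / 1023);
}
```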

Mechanical Switch

In the interest of simplicity and completeness a mechanical limit switch was added. This is connected to DIO 44.

Simple LED

An LED is controlled by DIO 43.

Composite Sketch for BME 280

The final step for this blog is to describe the software used to display the data from the BME 280, RTC, light sensor, temperature probe, mechanical switch, and accelerometer, and the process of saving the data to the SD card. In addition, code was written to change the color of the BlinkM LED array and turn the on-board LED on and off. Arduino IDE 2.0.3 was used. The process was to test the code for each sensor and function and then add the device code to the overall code for the test board. In many cases some modification of the sensor and function (i.e., keypad) code was required. In the main loop the keypad is monitored; pressing a keypad number selects a sensor or function for display:

- 1: RTC hours, minutes and seconds
- 2: 3-axis acceleration
- 3: temperature probe
- 4: BME temperature, RH and barometric pressure
- 5: altitude from BME
- 6: light level
- 7: mechanical switch status
- 8: set BlinkM LED array color and intensity
- 9: turn on-board LED on/off
- 0: save sensor data and time to SD card at a selected period

For the RTC, accelerometer, and light level the data is shown continuously until the function is de-selected. The data is saved to the SD card at a selectable period of 10 seconds, 1 minute or 1 hour. Five hundred samples are taken, and the SD save function can be ended via the keypad. Twenty entries are written to the SD card file at each time period; one of the entries is a data count, and besides the actual data, four entries are available for other information as needed.
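The 20-entry records written at each period can be pictured as comma-separated lines. The following host-side sketch shows one plausible way to assemble such a record; the field order and function name here are illustrative assumptions, not the board's actual layout.

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Assemble one comma-separated record: a sample count, a timestamp,
// the sensor readings, then empty spare fields up to totalFields entries
// (modeling the four entries reserved for other information).
std::string buildRecord(long count, const std::string& timestamp,
                        const std::vector<double>& readings, int totalFields) {
    std::ostringstream line;
    line << count << ',' << timestamp;
    int fields = 2;
    for (double r : readings) { line << ',' << r; ++fields; }
    while (fields++ < totalFields) line << ',';   // spare/unused entries
    return line.str();
}
```

On the Arduino the equivalent line would be built with String or print calls and written with the SD library's File.println().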

Data Saved and Processing in Excel

The data saved to the SD card is a sequential list. To complete the data processing, the SD card is removed and inserted in a computer (Windows in this case). The data is read in Notepad and then transferred as a column into Excel. A macro in Excel is used to parse the data into a row per period (or count). The data can then be plotted as desired. An example follows, for 10-second sampling. In application, much more could be done to add plots and provide additional data description; the objective at this point is to illustrate the completion of the process.
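The reshaping the Excel macro performs, turning one long column of values into one row per period, can be sketched as follows. This is a host-side illustration with an assumed fixed record length, not the actual macro.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Regroup a single column of values into rows of entriesPerPeriod each,
// mirroring what the Excel macro does with the SD card's sequential list.
std::vector<std::vector<std::string>>
toRows(const std::vector<std::string>& column, size_t entriesPerPeriod) {
    std::vector<std::vector<std::string>> rows;
    for (size_t i = 0; i < column.size(); i += entriesPerPeriod) {
        size_t end = std::min(i + entriesPerPeriod, column.size());
        rows.emplace_back(column.begin() + i, column.begin() + end);
    }
    return rows;
}
```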

Summary of Arduino Test Board

This blog describes an Arduino Mega 2560 based instrumentation and display board which might be applicable to studying the outdoor environment. The primary sensor array of interest was the BME 280, which provides temperature, relative humidity and barometric pressure. A primary objective of the blog was to describe where information and code for the sensors and other functions of the board can be found. The data acquisition capability of the board is demonstrated through the Excel processing and finally through a plot of the light level at 10-second intervals. The test board is in a prototype form with wiring pin-connected to the Arduino. An additional step for another time is to provide portable power so that the board could be taken into the field for environmental monitoring.

Next Blog

The next blog will (most likely) extend the sensors used with the Mega 2560 to include GPS, an IMU and a number of light sensors.

Robot 28 – Spy Gear Trakr Track Tank Drive

Robot 28, the Wild Planet Spy Gear Video Trakr, was purchased and added to the teaching content for robotics classes for middle school students. With multiple machines, it effectively illustrated a number of robotic functions which, for the time period of 2011, were fairly advanced. The product proved to be reliable both mechanically and electrically. Some apps were downloaded while the Wild Planet site was active, but no work was done on creating original apps using the C language compiler.

The Trakr was introduced in 2010 by Wild Planet as an integrated robotics educational system with a wide range of functions on the robot and controller, plus a supporting web site for app creation and distribution. This level of development and support was innovative but unfortunately lasted only a couple of years, as Wild Planet was acquired and support ended.

The four Trakrs acquired for robotics classes are still functional, and at some future time it would be interesting to get a compiler working to create apps, as this robot is still very applicable to studying robotic functions.

Robot 12 – Tank Track with Airsoft Gun

The first robot described is the Tank Track With Airsoft Gun, Robot 12. There is no specific sequence to the robots described from the prior list; I’m starting with Robot 12 because it’s the one I’m currently working on. This robot is functioning as described in the documentation which follows. As with all the built and programmed robots, there is always more that can be done in construction, programming and description. However, I have to stop somewhere, as there are many robots and functions of interest.

Summary of Mobile Robot Menagerie – 3

This is the third set of descriptions of the mobile robots either built or purchased … or a bit of both. At the end of this set are summary pages with some status and software information. The total I have listed is 30 robots of various varieties. A legitimate question is whether I will ever complete and program the constructed ones or investigate the purchased ones. I’ll accept that the numbers are not too favorable, since there are a lot on my list and new materials, computers, sensors and complete robots become available all the time, some worth looking at. My intent is not to go down the list in order but to work on, and add comment to, those that are of greatest interest or, in the case of purchased / vendor machines, are more a matter of investigation and observation. Also, in some cases I will consolidate information on sensors, computers, batteries and other topics.

Robot 21 – Parallax Boe-Bot: The Parallax Boe-Bot is a small two-wheel (plus idler wheel) tank-drive robot built on an aluminum pad. The computer is a Basic Stamp, and Parallax sensors are used. Simple as it is, this robot has served for a number of years as a starting robot. The primary interest at this point is its application for autonomous sumo bot competition (I have 2) and for autonomous line following.

Parallax Boe-Bot

Robot 22 – Nerf Gun Mechanism: The Nerf Gun Mechanism is not a mobile robot but a computer-driven system for target acquisition, target tracking and pneumatic firing of Nerf projectiles. The Nerf gun part is the gun assembly of an Air Hogs Battle Tracker. The firing control uses a VEX V5 computer, sensor and communication / controller system. The Nerf gun mechanism will be mounted on a mobile base. The functions of interest are the use of the V5 camera for target tracking and the overall system for fire control.

Nerf gun and base; computer and other devices not mounted yet.

Robot 23 – Hexapod Robot: The hexapod robot started as a structural kit with a PIC controller. The 18 servos have been upgraded, and a Modern Robotics Fusion computer is used to control two servo control modules. Communication is via Bluetooth using a Logitech controller. A lithium-ion battery is used, along with IR motion sensors and distance sensors. As is apparent, this hexapod is circular; a linear hexapod has also been constructed. The primary function of interest is leg control of a hexapod.

Hexapod Robot; computer and servo controller changes pending.

Robot 24 – High Clearance Wheel Tank Drive: This robot is planned but has not yet been constructed. The objective is to use Intel RealSense devices (e.g., the L515 LiDAR) for navigation and object detection. Computer requirements remain to be determined, as do additional sensors and the communication implementation. The robot is planned to have 8-inch pneumatic wheels. The functional objectives include a high-clearance drive system and RealSense-based navigation and object analysis. There is not yet a picture.

Robot 25 – Yahboom Nvidia Jetson Nano Robot: This robot was purchased as a kit. It is a tracked tank-drive, multi-function AI education robot based on the NVIDIA Jetson Nano A02 or B01 computer. Communication is via Wi-Fi to smart devices. The provided Python-based software supports face recognition, object tracking, obstacle avoidance and other functions. The objective of investigating this robot is to learn about NVIDIA GPU-based processor applications.

Yahboom Nvidia Jetson Nano Robot

Robot 26 – DJI EP Core: The DJI EP Core is a mecanum-drive robot with a gripper and arm. Through the SDK it can be programmed for object and text recognition with machine learning applications. Sensor ports allow third-party sensors to be used. This robot is of interest as it provides the potential to investigate AI-based object recognition combined with first person viewing, integrated arm/gripper coordination and inter-robot communication with other DJI products.

DJI EP Core

Robot 27 – Robotis Swerve Drive Robot: The combination of Robotis servos and software provides a unique approach to motion control. The digital servos include internal sensing, including position and current. The software provides the capability to set positions and then use these in the operational program for the robot. The swerve drive’s versatility makes it an effective implementation for investigating this mechanical and software coordination of control. This robot is yet to be constructed.

Robot 28 – Spygear Tracked Tank Drive: The Spygear robot was released as a product in 2010. It was, and still is, significant for its range of functions as well as the open source programming possible. The robot provided audio in and out, video in (both visible and infrared) for first person viewing, and programmability for autonomous operation. While used in a number of classes, this robot still has functional capability to investigate.

Robot 29 – iRobot Roomba Pet Series: The Roomba series of autonomous cleaning robots, along with similar autonomous products from other vendors and for other functions, is a class of consumer robotics whose implementations range from cost-driven simplicity to increasingly complex operation. The objective for the Roomba is to investigate the implementation of drives, sensors and functions, particularly the recent SLAM function.

Robot 30 – Mercedes 2020 GLB 250: I have the GLB 250, and it is a robot by the previous definition: a highly instrumented machine that uses machine learning functions to either directly control or at least influence a number of the automobile’s operations. This is somewhat of a love/hate relationship. The objective of including the GLB 250 in this robot list is that understanding and evaluating the effectiveness of its robotic functions is certainly of interest for present and future driving.

Mercedes 2020 GLB 250 … mine is red.

The following lists the robots I plan to discuss as part of this blog series. As noted previously, there are a lot, and there will be new ones, so anticipating being ‘done’ is not realistic.

In addition to the specific robots, there are important topics that are part of their implementation; in a future blog I’ll list at least some that will be discussed. I would note again that some of this list represents work that goes back 10+ years. In that time many capabilities have changed, although, as in the case of the Spygear robot and the Robotis humanoid, some things have not changed all that much.

Summary of Mobile Robot Menagerie – 2

This blog continues the summary descriptions of mobile robots which have been built or acquired and their general characteristics. Most were built or acquired because of one or more unique functions I wanted to learn about. Note again that the photographs of these robots are in some cases not completely up to date with their current forms.

Robot 11 – VEX Based Battle Bot: This robot started out as a VEX gripper kit machine. For classes on battle bots the arm was modified to be a flipper. It was then used with a duplicate, or other small machines, for competitions. While basically an RC machine using the VEX Cortex computer and wireless communication, it includes an accelerometer to detect when it is no longer upright and LEDs indicating status. An additional function is that two controllers can be connected for two operators. The primary function was/is as a middle school example of a battle bot with two operators.

VEX battle bot

Robot 12 – Tank Track With Airsoft Gun: Like it or not, mobile robots are armed. This medium-sized tracked robot, using a Lynxmotion track kit, includes an airsoft gun and laser target pointer. The computer is a Modern Robotics Spartan, essentially a packaged Arduino. An objective of this robot was to have visual feedback through first person viewing and to use an RC controller to get an extended communication range. It is also an exercise in multiple batteries. The primary functions are: airsoft gun, RC communication and visual status feedback.

Tracked robot with airsoft gun.

Robot 13 – Two Wheel Balancing: Using a Lego Mindstorms computer and sensors, plus Lego structural components and motors, a two wheel balancing robot is demonstrated. In addition to ‘simply’ balancing, the robot does line following. The primary function is two wheel balance and operation.

Lego two wheel balancing robot.

Robot 14 – Large Six Wheel Tank Drive: This robot is a FIRST Robotics competition-scale machine. It uses a National Instruments roboRIO computer and a structure from AndyMark. It has a two-speed, pneumatically switched gearbox. Communication is via Wi-Fi through the National Instruments Windows support software. At this point it is a base structure; plans are for a mechanism to be added. The primary functions are the two-speed gearbox and six wheel tank drive with center wheel offset.

FIRST Robotics class six wheel tank drive robot.

Robot 15 – Six Wheel Tank Drive With Suspension: The structure is a purchased Mantis 6WD Off-Road Rover Kit. The computer is a VEX Cortex with the associated wireless communication and controller. The sensors include an ultrasonic range sensor and an RC camera. A lithium-ion battery is used. The primary objective of this robot is to investigate how a wheeled tank drive with all-wheel suspension functions.

Robot 16 – Ball With Weighted Drive: This robot is the Sphero Bolt, an app-enabled robotic ball with programmable sensors, an LED matrix, infrared and a compass. The ball is 2.9 inches in diameter and is rotated by an internally driven weight. Communication with a smart device is via Bluetooth. The primary interest with this robot is its unique approach to motion using a sphere and driven weight.

Sphero Bolt

Robot 17 – Biped Humanoid Robot: Using Robotis robotics kits, humanoid robots were constructed. These robots use Robotis digital servos and have 18 degrees of freedom for arms and legs, plus sensors. Communication uses the Robotis wireless RF system. The humanoids run a number of types of provided programs for walking and other actions. These robots are of interest for investigating the requirements for biped functions.

Robotis Biped

Robot 18 – ROS With TurtleBot3: The TurtleBot3 by Robotis is a platform which runs the Robot Operating System (ROS). This is a purchased kit which is based on the Raspberry Pi. It includes a number of sensors, one of which is a 360 degree LIDAR. The reason for this robot is to study ROS. This is an ongoing process, but after many hours it is still a wall of obscurity.

TurtleBot3

Robot 19 – DJI FPV: The DJI FPV is a quadrotor designed for first person viewing. It is capable of speeds up to 80 miles per hour, with a one degree of freedom stabilized camera, obstruction sensors, GPS and status feedback to the head mounted display. This, along with two other DJI quadrotors, was purchased to use their capabilities in a number of applications. As a robot, the objective was to learn about the control and capabilities of drones and the use of FPV for operation.

DJI FPV Quadrotor

Robot 20 – Five Degree of Freedom Arm and Gripper: A five degree of freedom arm and gripper was built to be mounted on a mobile base. The arm’s structure uses Actobotics channels, Progressive linear actuators (150 pound and 50 pound), and VEX V5 computer, communication, motors and sensors. The gripper is also VEX, with 3 degrees of freedom plus a target identification camera. The primary functions of interest were operation of a large arm, operation of a 3 degree of freedom gripper, and control of object acquisition and gripping.

Large 5 degree of freedom arm and gripper

This is the next set of ten robots or robot components. The next blog will finish these description summaries. I’ll then start descriptions of individual robots, mixed with specifics on other aspects such as the computers used, the batteries used, and related issues.

Summary of Mobile Robot Menagerie – 1

As a start to the Build Mobile Robots blog I will summarize the ‘menagerie’ of robots I have. There are a number of words used for groups of robots … swarm, fleet … ; menagerie works for mine, as they fit the definition ‘a strange or diverse collection of things’. As they have been built or acquired to explore many aspects of mobile robots, there is no common structural or functional theme. My ‘menagerie’ includes 25 to 30 robots at various stages of construction, programming or use. For example, the ‘tracked airsoft gun’ robot is complete and is programmed to carry out three demonstration functions. The DJI FPV drone robot is in use with flights every couple of weeks. The ‘large mecanum drive robot’ is constructed but has not been programmed. With a few exceptions, most of the robots were constructed to learn about one or more specific aspects of mobile robot implementation and/or use. This includes different drive systems, computers, sensors, software, construction materials, use/mission … and others. The robots also represent many years of acquisition by construction or purchase.

In this blog and one or two more, the mobile robot menagerie will be described through a picture or two and a brief description. The latter will include what about the robot specifically makes it of interest. Note that the robot pictures may not be up to date with respect to all components mentioned. A summary table will be provided when all current robots have been described. As the blog progresses, specific aspects of these robots will be looked at in some detail.

Robot 1 – Ackermann Steering and Suspension: This is a Traxxas 1/10 scale truck chassis with a Modern Robotics Fusion computer and sensors added. Teleoperation is with a Bluetooth connection to a Logitech controller. Of primary interest for this robot is the Ackermann steering and four wheel drive with suspension. It has front and rear first person viewing (FPV) cameras and uses a lithium-ion battery. It has no interaction mechanism.

Robot 1 – Ackermann steering with suspension

Robot 2 – Omni Drive: This robot was constructed with Tetrix beam components. It uses a Lego Mindstorms computer and sensors. Teleoperation is with a Mindsensors transmitter/receiver and Logitech controller. Of primary interest is the control of the omni drive. Tetrix motors with Sabertooth controllers are used. It has a nickel metal hydride battery. A VEX gripper is used for grasping and moving objects.

Omni Drive

Robot 3 – Medium Mecanum Drive: A Tetrix beam frame and motors are used for a medium sized Mecanum drive robot. The computer is a National Instruments roboRIO. Teleoperation and camera communication are via a Wi-Fi module. A CAN network is used with Talon motor controllers. It uses a nickel metal hydride battery. A VEX gripper is used for grasping and moving objects. The primary characteristics of interest for this robot are the Mecanum drive and CAN network.

Medium Mecanum Drive

Robot 4 – VEX Track Drive: VEX structure materials are used for a basic tank drive robot. The robot system uses the VEX V5 computer, motors, communication and battery. A V5 camera is used in one direction for object detection and tracking. An RC camera is used in the other direction for first person view control. The object detection camera and the FPV camera with a Fatshark headset are the primary aspects of interest for this robot.

VEX Track Drive

Robot 5 – Tetrix Tank Track Drive: This is a tracked robot using tank drive built on a Tetrix frame structure. It uses a Modern Robotics Fusion controller and Modern Robotics sensors, core modules and motors with rotation sensing. Teleoperation is through a Logitech controller and Bluetooth. Ground sensors are used for step detection. An infrared sensor is used for navigation. Power is via a lithium ion battery.

Tetrix Tank Track Drive

Robot 6 – Tetrix Swerve Drive: The swerve drive robot is constructed using: 1) a Tetrix channel frame, 2) Tetrix motors, 3) 3D printed wheel mounting supports, and 4) Sabertooth motor controllers and Hitec servos. The computer, communication and sensors are from the VEX V5 system. A lithium-ion battery is used for the V5 and nickel metal hydride for motor power. The swerve drive capability is the primary feature of interest.

Swerve Drive

Robot 7 – Six Wheel Medium Tank Drive: This robot uses a Tetrix channel frame, 4 standard wheels and two omni wheels. A motor and gearbox on each side use sprockets and chains to drive the wheels. The VEX V5 computer, communication, controller and sensor system are used. A 14 amp hour 12 volt battery is used with Talon motor controllers. The plan is to add a mechanism to the base. This mechanism, as well as the standard-plus-omni-wheel tank drive, are the features of interest.

Six Wheel Medium Tank Drive

Robot 8 – Mekamon Quadruped: The Mekamon quadruped is a purchased robot. It was developed in 2016-18 as part of an augmented reality game system; the product was discontinued in 2019. The robot itself is a very good example of a quadruped walking system. The computer, servos and sensors of the Mekamon are internal. Control is via Bluetooth to an app run on a smart phone or tablet. The capability of interest is the operational characteristics of an effectively programmed quadruped.

Mekamon Quadruped

Robot 9 – Kobra Arm and Gripper With Dual Tracks: This is a purchased ‘qualified’ robot. It’s essentially all RC but does provide a camera for remote sensing. It has a number of features of interest: 1) a six degree of freedom arm and gripper system, 2) dual tracks allowing some degree of climbing, and 3) a Wi-Fi camera working with the gripper. In addition, with two of these machines the complexity of arm coordination can be tested.

Kobra Arm and Gripper with Dual Tracks

Robot 10 – Large Mecanum Drive: This robot is constructed with aluminum angle and plate and is 32″ x 32″ x 10″. Each wheel is driven by dual CIM motors through a two-speed, pneumatically switched gearbox. Each motor is controlled with a Talon motor controller. The computer is a National Instruments roboRIO. Communication is Wi-Fi through a computer connected via USB to the controller. The battery is a 7 amp hour lead acid. The features of interest are: 1) controlling a robot of this size, 2) use of a two-speed transmission, 3) Wi-Fi communication and 4) using pneumatic control.

Large Mecanum Drive

This is the first set of robots. The next blog will include another set.