The Dingo

Team Members: Russell Kosovsky, Jay Nash

This project focuses on rebuilding the dingo quadruped, which is currently in an unfinished state. We plan to fully rebuild the dingo into a platform for gait evolution and possibly other forms of objective-seeking or evolved behavior. The dingo has a simulated environment that will be used to train the gait before the learned behavior is applied to the real world. We are looking into using Punctuated Anytime Learning for this process in order to better translate simulated behavior into real-world behavior. Additionally, we would like to implement several types of genetic algorithms to optimize the gait and behavior, in order to determine which type of genetic algorithm works best for a given problem. For example, we may use a cyclic genetic algorithm with punctuated anytime learning to evolve a gait, and then use a more standard GA to evolve the behavior of the robot. This last objective may not be entirely feasible, but it is something we discussed and thought would be interesting to look into.

Team

Russell and Jay took over the dingo from a previous student who had worked on the project. Jay focused on refining the control systems and power delivery, while Russell focused on the underlying ROS setup, modifying the dingo ROS package to better fit our needs, and calibrating the digital servos.

Source Code & Resources

Original Dingo Project Repository

Source code for the Dingo Quadruped, an inexpensive quadruped robot based on the Stanford Pupper and notspot codebases

DingoQuadruped

Project Source Code

Source code for running the quadruped created for AALL

DingoAALL

rpLiDAR ROS Repository

Source code for using rpLiDAR with ROS and rViz

rplidar_ros

How it Works

Control Circuit

Controller 1

A controller node receives LiDAR updates and determines the proper actions to take. The logic section is collapsed for brevity, but it contains a small expert system that decides which way the dingo should move in order to avoid walls.
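A minimal sketch of what such a wall-avoidance expert system can look like, assuming the controller consumes the `ranges`, `angle_min`, and `angle_increment` fields of a standard LiDAR scan message. The function name, sector boundaries, and the 0.5 m clearance threshold are illustrative assumptions, not the project's actual code.

```python
import math

# Assumed clearance (metres) before the dingo reacts to a wall.
SAFE_DISTANCE = 0.5

def decide_action(ranges, angle_min, angle_increment):
    """Map one LiDAR scan to a movement command.

    ranges: list of beam distances in metres
    angle_min / angle_increment: beam geometry in radians
    Returns "forward", "turn_left", or "turn_right".
    """
    left_min = right_min = front_min = math.inf
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue  # LiDARs report inf/NaN for out-of-range beams
        angle = angle_min + i * angle_increment
        if -0.3 < angle < 0.3:      # roughly straight ahead
            front_min = min(front_min, r)
        elif angle >= 0.3:          # left sector
            left_min = min(left_min, r)
        else:                       # right sector
            right_min = min(right_min, r)

    if front_min > SAFE_DISTANCE:
        return "forward"
    # Wall ahead: turn toward whichever side has more clearance.
    return "turn_left" if left_min > right_min else "turn_right"
```

In the real node these rules would live in a subscriber callback on the LiDAR topic, with the chosen action published for the input controller to pick up.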

Input Controller

keyboard

A previously existing input controller was modified to accept action commands from the control circuit and inject them into the actual control commands sent to the dingo. This code receives each command and updates the control settings accordingly. This is not the most ideal approach, as we effectively hijack the keyboard controller entirely; however, it has proven the most stable and the easiest to modify.
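The injection pattern can be sketched as follows. The class and method names here are hypothetical stand-ins for the modified keyboard controller; the real node operates on ROS messages rather than plain strings.

```python
class InputController:
    """Keyboard controller whose output can be overridden ("hijacked")
    by commands from the wall-avoidance control circuit."""

    def __init__(self):
        self._injected = None  # latest command from the controller node

    def inject_command(self, command):
        # Called from the control circuit's subscriber callback.
        self._injected = command

    def next_command(self, keyboard_command):
        # An injected command takes priority over the keyboard input;
        # once consumed it is cleared so the keyboard regains control.
        if self._injected is not None:
            cmd, self._injected = self._injected, None
            return cmd
        return keyboard_command
```

The key design choice is that the control circuit never talks to the dingo directly: its commands pass through the same code path as keyboard input, so the rest of the dingo stack is unchanged.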

ROS Launch Modifications

Launch

As the control circuit needed to run in parallel with the dingo, it was loaded as a standalone ROS node and used ROS publish/subscribe commands to interface with the LiDAR and the input controller. The picture above includes the nodes for the keyboard controller (hijacked for our control system), the dingo itself, the LiDAR, and the controller node.
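A launch-file fragment for running these nodes in parallel might look like the sketch below. The package and node names (`dingo_aall`, `controller_node.py`) are illustrative assumptions; only `rplidar_ros` comes from the repositories linked above.

```xml
<!-- Hypothetical excerpt of the modified launch file -->
<launch>
  <!-- rpLiDAR driver: publishes scans on /scan -->
  <node pkg="rplidar_ros" type="rplidarNode" name="rplidar" />

  <!-- Standalone wall-avoidance controller: subscribes to the LiDAR
       topic and publishes action commands that the modified keyboard
       controller injects into the dingo's control stream -->
  <node pkg="dingo_aall" type="controller_node.py" name="controller"
        output="screen" />
</launch>
```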

Development Updates

Update #1: Review of Parts and Initial Installation

Determined the needed parts and locations for installing electronics into the body of the dingo. Installed the Arduino Nano, ServoBoard, switches, Raspberry Pi, and custom PCB (Photos 9, 10)

Update #2: Initial Tests of Servos

Ordered and installed a 2200 µF capacitor on the ServoBoard and tested motor actuation directly via the Arduino Nano

Update #3: Continued Servo Tests

Rebuilt the dingo with only the minimum set of parts needed to test the servos and wiring (Arduino Nano, ServoBoard, and a battery pack)

Update #4: Power Delivery Reformat

Decided to abandon the custom PCB and power delivery system and instead power components directly from a LiPo battery and buck converters (Photo 5)

Update #5: LLS Troubleshooting

ROS was unable to link to the Arduino Nano via a logic level shifter (LLS); the dingo's own documentation states that this connection is often unreliable at best. As such, we are investigating whether we can safely remove the Nano entirely

Update #6: Removal of Arduino Nano

Because the Nano only monitors the battery voltage in order to avoid completely discharging the battery, we decided to use a manual battery alarm instead and removed the Nano from the ROS configuration.

Update #7: Raspberry Pi Setup

Installed Ubuntu 23.10 on the Raspberry Pi, then installed and built ROS packages both for Dingo control and for interfacing with the LiDAR (Photo 6)

Update #8: Input Controller

Modified the Dingo input controllers to sit between the keyboard controller and the dingo itself and inject movement commands (Video 1)

Update #9: LiDAR Installation

Installed the LiDAR onto the Dingo and tested its output format so it could be parsed for the dingo controller to use (Photos 3, 4)

Update #10: LiDAR ROS Setup

Created a ROS controller node that listens to the LiDAR publisher and sends instructions to the modified keyboard controller

Update #11: Power Issues

After testing the currently installed buck converters, it was apparent that they cannot supply the current we need and instead drop the voltage below what the servos can operate on

Update #12: Power Issues Continued

Created a 20A buck converter by wiring 2A buck converters in parallel and observed a much stronger gait; however, this jerry-rigged 20A converter quickly failed and was deemed too unstable to continue using (Photo 7)

Update #13: Controller Logic

Finished the control program for the dingo; the controller now actively reads the LiDAR output and makes movement decisions to avoid walls (Video 2)

Update #14: Website

Created showcase website with progress pictures, development logs, and links to sources used

Update #15: Servo Power Switch

Added a switch to control power to the servos; this allows the full program to run without servo motion for debugging purposes (such as testing LiDAR responses, etc.) (Photo 10)

Future Prospects

A replacement buck converter that can carry 20A is needed to drive the servos effectively. Once that is installed, the dingo can be used for various research projects on learning behavior in an autonomous robot and on training machine learning algorithms to interpret LiDAR output.

We hope to eventually repurpose the dingo's gait methods so that EC methods can be used to optimize the gait. The gait is generally good for forward and backward motion, but yaw and sideways motion leave something to be desired. This could be a fairly straightforward way of testing EC on the dingo before introducing more complex tasks that may involve many files or control of ROS functions.

Videos

Video 1: Example of standard walking gait

Video 2: Example of gait response to LiDAR detection

Video 3: Presentation