Team Members: Russell Kosovsky, Jay Nash
This project focuses on rebuilding the Dingo quadruped, which is currently in an unfinished state. We plan to fully rebuild the Dingo into a platform for gait evolution and possibly other forms of objective-seeking or evolved behavior. The Dingo has a simulated environment that will be used to train gaits before the learned behavior is applied to the real world. We are looking into using Punctuated Anytime Learning for this process in order to better translate simulated behavior into real-world behavior. Additionally, we would like to implement a variety of genetic algorithms to optimize the gait and behavior, in order to determine which type of genetic algorithm works best for a given problem. For example, we might use a cyclic genetic algorithm with Punctuated Anytime Learning to evolve a gait, and then use a more standard GA to evolve the robot's higher-level behavior. This last objective may not be entirely feasible, but it is something we discussed and thought would be interesting to explore.
Russell and Jay took over the Dingo from a student who had previously worked on the project. Jay focused on refining the control systems and power delivery, while Russell focused on the underlying ROS setup, modifications to the Dingo ROS package to better fit our needs, and calibration of the digital servos.
Source code for the Dingo Quadruped, an inexpensive quadruped robot based on the Stanford Pupper and notspot codebases.
A controller node receives LiDAR updates and determines the proper actions to take. The logic section is collapsed for brevity, but it contains a small expert system that decides which way the Dingo should move in order to avoid walls.
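The collapsed logic amounts to a handful of distance-threshold rules. Below is a sketch of that kind of expert system; the function name, command strings, and 0.5 m threshold are illustrative stand-ins, not the actual project code.

```python
# Sketch of a wall-avoidance expert system driven by per-sector LiDAR
# minima. Names and the threshold are assumptions for illustration.

SAFE_DISTANCE = 0.5  # metres; assumed clearance threshold


def decide_action(left: float, front: float, right: float) -> str:
    """Pick a movement command from the minimum LiDAR range in each sector."""
    if front > SAFE_DISTANCE:
        return "forward"      # path ahead is clear
    if left > right:
        return "turn_left"    # more room on the left
    if right > left:
        return "turn_right"   # more room on the right
    return "backward"         # boxed in; back away
```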
A previously existing input controller was modified to accept action commands from the control circuit and inject them into the actual control commands sent to the Dingo. This code receives each command and updates the control settings accordingly. This is not the ideal way to accomplish this, as we effectively hijack the keyboard controller entirely; however, this approach has proven the most stable and the easiest to modify.
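The injection pattern described above can be sketched as follows. The class and method names are hypothetical, and the real controller exchanges ROS messages rather than plain strings; this only shows the "override the keyboard's command when the control node has spoken" behavior.

```python
# Hypothetical sketch of the hijacked keyboard controller: an external
# action command, when present, replaces the keyboard's own command
# before it is forwarded to the Dingo.

class InjectingController:
    def __init__(self):
        self.override = None  # latest command from the control node, if any

    def on_control_command(self, command: str) -> None:
        """Called when the controller node publishes an action."""
        self.override = command

    def next_command(self, keyboard_command: str) -> str:
        """Return the command actually sent to the Dingo."""
        if self.override is not None:
            command, self.override = self.override, None
            return command
        return keyboard_command
```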
Because the control circuit needed to run in parallel with the Dingo, it was loaded as a standalone ROS node and uses ROS publish/subscribe to interface with the LiDAR and the input controller. The picture above includes the nodes for the keyboard controller (which has been hijacked for our control system), the Dingo itself, the LiDAR, and the controller node.
Determined the needed parts and their installation locations for the electronics in the body of the Dingo. Installed the Arduino Nano, ServoBoard, switches, Raspberry Pi, and custom PCB (Photos 9, 10)
Ordered and installed a 2200 uF capacitor on the ServoBoard and tested motor actuation directly via the Arduino Nano
Rebuilt the Dingo with only the minimum parts needed to test the servos and wiring (Arduino Nano, ServoBoard, and a battery pack)
Decided to abandon the custom PCB and power delivery system and instead power components directly from a LiPo battery and buck converters (Photo 5)
ROS was unable to link to the Arduino Nano via a logic level shifter; the Dingo's own documentation states that this connection is often unreliable at best. As such, we are looking into whether we can safely remove the Nano entirely
Because the Nano's only role is monitoring the battery voltage to avoid completely discharging it, we decided to use a manual battery alarm instead and removed the Nano from the ROS configuration
Installed Ubuntu 23.10 on the Raspberry Pi, then installed and built the ROS packages for both Dingo control and the LiDAR interface (Photo 6)
Modified the Dingo input controllers to sit between the keyboard controller and the Dingo itself and inject movement commands (Video 1)
Installed the LiDAR on the Dingo and examined its output format so it could be parsed for the Dingo controller to use (Photos 3, 4)
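As a sketch of that parsing step, the following reduces a LaserScan-style range array to the minimum distance in each of three sectors. The sector boundaries and the function name are assumptions; real scans also contain invalid returns (inf/NaN/zero), which are filtered here.

```python
import math

# Sketch: collapse a raw LiDAR range array into per-sector minimum
# distances for the controller. Sector boundaries are assumed.

def sector_minima(ranges, angle_min, angle_increment):
    """Split a scan into right/front/left minima (angles in radians)."""
    sectors = {"right": math.inf, "front": math.inf, "left": math.inf}
    for i, r in enumerate(ranges):
        if not math.isfinite(r) or r <= 0.0:
            continue  # drop invalid returns
        angle = angle_min + i * angle_increment
        if -math.pi / 4 <= angle < math.pi / 4:
            key = "front"
        elif angle < -math.pi / 4:
            key = "right"
        else:
            key = "left"
        sectors[key] = min(sectors[key], r)
    return sectors
```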
Created a ROS controller node that listens to the LiDAR publisher and sends instructions to the modified keyboard controller
After testing the currently installed buck converters, it was apparent that they cannot supply the current we need; under load, their output voltage drops below what the servos can operate on
Created a 20A supply by wiring 2A buck converters in parallel and observed a much stronger gait; however, this jerry-rigged 20A converter quickly failed and was deemed too unstable to continue using (Photo 7)
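A quick back-of-the-envelope check suggests why a parallel bank of small converters is fragile: with a perfect split each converter sits right at its rating, so any mismatch in output voltage pushes one converter past its limit. The figures below (ten 2 A converters, a 20 A load, a 15% worst-case share) are illustrative assumptions, not measurements from the robot.

```python
# Back-of-the-envelope current sharing for paralleled buck converters.
# All figures are illustrative assumptions.

def worst_case_load(total_current_a, worst_share):
    """Current through the most heavily loaded converter in a parallel bank.

    worst_share is the fraction of total current that converter carries;
    with n identical converters a perfect split is 1 / n.
    """
    return total_current_a * worst_share

# Ten 2 A converters sharing 20 A perfectly sit right at their rating...
ideal = worst_case_load(20.0, 1 / 10)     # 2.0 A each
# ...so even a modest mismatch pushes one converter past its rating,
# consistent with how quickly the jerry-rigged bank failed.
unbalanced = worst_case_load(20.0, 0.15)  # ~3.0 A through one converter
```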
Finished the control program for the Dingo; the controller now actively reads the LiDAR output and makes movement decisions to avoid walls (Video 2)
Created showcase website with progress pictures, development logs, and links to sources used
Added a switch to control power to the servos; this allows the full program to run without servo motion for debugging purposes (such as testing LiDAR responses, etc.) (Photo 10)
A replacement buck converter that can carry 20A is needed to drive the servos effectively. Once that is installed, the Dingo can be used for various research projects on learned behavior in an autonomous robot and for training machine learning algorithms to interpret LiDAR output.
We hope to eventually repurpose the Dingo's gait methods so that EC methods can optimize the gait. The current gait is generally good for forward and backward motion, but yaw and sideways motion leave something to be desired. This could be a fairly straightforward way of testing EC on the Dingo before introducing more complex tasks that may involve many files or control of ROS functions.
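A gait-parameter GA along those lines could look roughly like the sketch below. Everything here (the parameter count, population size, and especially the placeholder fitness function) is an illustrative assumption; in practice, fitness would come from the simulator or from Punctuated Anytime Learning trials on the real robot.

```python
import random

# Minimal GA sketch for a vector of gait parameters. The fitness
# function is a stand-in for simulated or real-world gait evaluation.

N_PARAMS = 8        # assumed number of gait parameters
POP_SIZE = 20
GENERATIONS = 30
MUTATION_STD = 0.1

def fitness(genome):
    """Placeholder: reward genomes near an arbitrary target gait."""
    return -sum((g - 0.5) ** 2 for g in genome)

def evolve(seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(N_PARAMS)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP_SIZE // 2]          # truncation selection
        children = []
        for _ in range(POP_SIZE - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_PARAMS)    # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(N_PARAMS)         # single-gene mutation
            child[i] += rng.gauss(0, MUTATION_STD)
            children.append(child)
        pop = parents + children                # elitism via kept parents
    return max(pop, key=fitness)
```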
Photo 1: Birdseye view 1
Photo 2: Birdseye view 2
Photo 3: LiDAR image 1
Photo 4: LiDAR image 2
Photo 5: Direct power from LiPo battery and buck converters
Photo 6: Raspberry Pi with Ubuntu and ROS installed
Photo 7: Parallel buck converter assembly
Photo 8:
Photo 9: Electronics installed in the Dingo body
Photo 10: Servo power switch
Photo 11:
Video 1: Example of standard walking gait
Video 2: Example of gait response to LiDAR detection
Video 3: Presentation