
Exhibition Robots

For the MFRU exhibition, we presented a variety of robots. The following is some documentation on their specifications and setup instructions. We are leaving the robots with konS.

All Robots

Li-Po batteries need to be stored at 3.8V per cell. For the exhibition, they can be charged to 4.15V per cell, and run with a battery level monitor until they display 3.7V, at which point they should be swapped out. Future iterations of robotic projects will use splitter cables to allow hot-swapping batteries, for zero downtime.

We are leaving our ISDT D2 Mark 2 charger for maintaining and charging Li-Po batteries.

At setup time in a new location, the Raspberry Pi SD cards need to be updated to connect to the new Wi-Fi network. The simplest method is to mount the SD card in a laptop and copy onto its boot partition a wpa_supplicant.conf file (below, with the credentials and locale changed to the new network's), plus a blank file called ssh, to allow remote login.

ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=SI

network={
    ssid="Galerija"
    psk="galerija"
    key_mgmt=WPA-PSK
}

Then, after starting up with the updated SD card, the robots' IP addresses need to be determined, typically with a ping scan of the local subnet, e.g. `nmap -sP 192.168.1.0/24` (or a Windows client like Zenmap).

Usernames and passwords used are:

LiDARbot – pi/raspberry

Vacuumbot – pi/raspberry and chicken/chicken

Pinkbot – pi/raspberry

Gripperbot – pi/raspberry

Birdbot – daniel/daniel

Nipplebot – just Arduino

Lightswitchbot – just Arduino and an analog timer

For now, it is advised to shut down robots by connecting to their IP address over SSH, typing `sudo shutdown -H now`, and waiting for the lights to turn off before unplugging. It's not 100% necessary, but it reduces the chances that the apt cache becomes corrupted and you need to reflash the SD card and start from scratch.

Starting from scratch involves reflashing the SD card using Raspberry Pi Imager, cloning the git repository, running pi_boot.sh and `pip3 install -r requirements.txt`, configuring config.py, and running create_service.sh to automate the startup.

LiDARbot

Raspberry Pi Zero W x 1
PCA9685 PWM controller x 1
RPLidar A1M8 x 1
FT5835M servo x 4

Powered by: Standard 5V Power bank [10Ah and 20Ah]

Startup Instructions:
– Plug in USB cables.
– Wait for the service to start up, and go to the URL.
– If the lidar chart is displaying, click ‘Turn on Brain’

LiDARbot has a lidar: a laser head spinning around, collecting distance measurements from the light that bounces back, allowing it to maintain a 2D (top-down) map of its surroundings.
It is able to not bump into things.
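
For reference, reading the lidar from Python can look something like the sketch below. This is a minimal example using the third-party rplidar package; the device path and the 30cm threshold are assumptions, not the exhibition code.

from rplidar import RPLidar

lidar = RPLidar('/dev/ttyUSB0')  # assumed device path
try:
    for scan in lidar.iter_scans():
        # each scan is a list of (quality, angle_deg, distance_mm) tuples
        ahead = [d for _, angle, d in scan
                 if (angle < 30 or angle > 330) and d > 0]
        if ahead and min(ahead) < 300:  # anything closer than 30cm, dead ahead?
            print('Obstacle ahead - turn!')
finally:
    lidar.stop()
    lidar.stop_motor()
    lidar.disconnect()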

Vacuumbot

RPi assembly:

Raspberry Pi 3b x 1
LM2596 stepdown converter x 1
RDS60 servo x 4

Powered by: 7.4V 4Ah Li-Po battery

Jetson assembly:

NVIDIA Jetson NX x 1
Realsense D455 depth camera x 1

Powered by: 11.1V 4Ah Li-Po battery

Instructions:
– Plug the Jetson assembly connector into the 11.1V battery, and the RPi assembly connector into the 7.4V battery
– Connect to Jetson:

cd ~/jetson-server
python3 inference_server.py

– Go to the Jetson URL to view depth and object detection.
– Wait for the RPi service to start up.
– Connect to the RPi URL, and click ‘Turn on Brain’

It can scratch around, like the chickens.
Vacuumbot has a depth camera, so it can maintain a 3D map of its surroundings, and it runs an object detection neural network, so it can interact with its environment. It uses 2 servos per leg: 1 for swivelling its hip in and out, and 1 for the leg rotation.
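
The scratching comes down to coordinating those two servos per leg. A rough sketch of the idea for a single leg, using gpiozero; the pin numbers, angles, and timings here are made up, and the actual robot code will differ.

from gpiozero import AngularServo
from time import sleep

hip = AngularServo(17, min_angle=-90, max_angle=90)   # swivels the hip in/out
knee = AngularServo(18, min_angle=-90, max_angle=90)  # rotates the leg

def scratch_step():
    # one chicken-like scratch: rake back, lift, swing forward, plant
    knee.angle = -40   # rake backwards along the ground
    sleep(0.3)
    hip.angle = 20     # swivel out to lift the foot clear
    knee.angle = 40    # swing the leg forward in the air
    sleep(0.3)
    hip.angle = 0      # plant the foot again

scratch_step()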

Pinkbot

Raspberry Pi Zero W x 1
PCA9685 PWM controller x 1
LM2596 stepdown converter x 1
RDS60 servo x 8
Ultrasonic sensors x 3

Powered by: 7.4V 6.8Ah Li-Po battery

Instructions:
– Plug in to Li-Po battery
– Wait for the RPi service to start up.
– Connect to the RPi URL, and click ‘Turn on Brain’

Pinkbot has 3 ultrasonic distance sensors, giving it a basic “left / forward / right” sense of its surroundings. It uses 8 x 60kg-cm servos (2 per leg), and 2 x 35kg-cm servos for the head. The servos are powerful, so it can walk, and even jump around.
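
That left / forward / right sense boils down to comparing three distance readings. A minimal sketch with gpiozero's DistanceSensor, assuming HC-SR04-style sensors wired to GPIO; the pin numbers are made up.

from gpiozero import DistanceSensor

sensors = {
    'left':    DistanceSensor(echo=5,  trigger=6),
    'forward': DistanceSensor(echo=13, trigger=19),
    'right':   DistanceSensor(echo=20, trigger=21),
}

def clearest_direction():
    # .distance is in metres (capped at 1m by default)
    return max(sensors, key=lambda name: sensors[name].distance)

print(clearest_direction())  # e.g. 'forward'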

Gripperbot

Gripperbot has 4 x 60kg-cm servos, and 1 x 35kg-cm continuous-rotation servo with a worm gear to open and close the gripper. It uses two spring switches which let it know when the hand is closed. A metal version would be cool.

Raspberry Pi Zero W x 1
150W stepdown converter (to 7.4V) x 1
LM2596 stepdown converter (to 5V) x 1
RDS60 servo x 4
MGGR996 servo x 1

Powered by: 12V 60W power supply

Instructions:
– Plug in to wall
– Wait for the RPi service to start up.
– Connect to the RPi URL, and click ‘Fidget to the Waves’

Birdbot

Raspberry Pi Zero W x 1
FT SM-85CL-C001 servo x 4
FE-URT-1 serial controller x 1
12V input step-down converter (to 5V) x 1
Ultrasonic sensor x 1
RPi camera v2.1 x 1

Powered by: 12V 60W power supply

Instructions:
– Plug in to wall
– Wait for the RPi service to start up.
– Connect to the RPi URL, and click ‘Fidget to the Waves’

Birdbot is based on the Max Planck Institute's BirdBot, and uses some nice 12V servos. It has a camera and a distance sensor, and can take pictures when chickens pass by. We didn't implement the force-sensor central pattern generator of the original paper, however. Each leg uses 5 strings held in tension, making it possible, with one servo moving the leg and the other servo moving the string, to lift and place the leg with a more sophisticated, natural, birdlike motion.

Lightswitchbot

Turns on the light, in the morning

Pantograph Legs

“They observed that many quadrupedal, mammalian animals feature a distinguished functional three-segment front leg and hind leg design, and proposed a “pantograph” leg abstraction for robotic research.”

1 DOF (degree of freedom). 1 motor. Miranda wants jointed legs, and I don’t want to work out inverse kinematics, so this looks ideal. Maybe a bit complicated still.

Biorobotics Laboratory, EPFL

The simpler force diagram:

Cheetah-cub leg mechanism, and leg compliance. A single leg is shown abstracted, detailed leg segment ratios are omitted for clarity, robot heading direction is to the left. (1) shows the three leg angles αprox, αmid, and αdist. Hip and knee RC servo motors are mounted proximally, the leg length actuation is transmitted by a cable mechanism. The pantograph structure was inspired by the work of Witte et al. (2003) and Fischer and Blickhan (2006). (2) The foot segment describes a simplified foot-locus, showing the leg in mid-swing. For ground clearance, the knee motor shortens the leg by pulling on the cable mechanism (green, Fcable). Fdiag is the major, diagonal leg spring. Its force extends the pantograph leg, against gravitational and dynamic forces. (3) The leg during mid-stance. (4) In case of an external translational perturbation, the leg will be compressed passively. (5) If an external perturbation torque applies e.g., through body pitching, the leg linkage will transmit it into a deflection of the parallel spring, not of the diagonal spring.

From “Kinematic primitives for walking and trotting gaits of a quadruped robot with compliant legs” (Alexander Badri-Spröwitz et al., 2014)

Compliance is a feature, typically made possible by springs.

Biologically Inspired Robots - nitishpuri.github.io
https://nitishpuri.github.io/posts/robotics/biologically-inspired-robots/

A homemade attempt: the Mojo robot of the Totally Not Evil Robot Army. Their robot only uses 9g servos, and can't quite pick itself up.

I did an initial design with what I had around, and it turns out compliance is a delicate balance. Too much spring, and it just mangles itself up. Too little spring and it can’t lift off the ground.

Further iterations removed the springs, which were too tight by far, and used cable ties to straighten the legs, but the weight of the robot is a little bit too much for the knee joints.

I will likely leave it until I have a 3D printer and some better springs, and will give it another try with more tools and materials available. Maybe even hydraulics, some day.

Some more research required, too.

https://www.mdpi.com/1424-8220/20/17/4911/htm


Kinematic Motion Primitives

This post follows the ‘Finding where we left off’ post, focused on locomotion sim2real. In that post I tried to generalise and smooth the leg angle servo movements in their -PI/2 to PI/2 range.

I will likely try extracting kMPs before this is all over. From a skim read and a look at the pictures, they're like just taking a single slice of the wave data and repeating it, or taking consecutive periodic waves and extracting the average / normalized movement from them.

https://becominghuman.ai/introduction-to-timeseries-analysis-using-python-numpy-only-3a7c980231af
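
In numpy terms, those two options might look something like this; a sketch, where the signal, start index, and period are placeholders:

import numpy as np

def repeat_slice(signal, start, period, n=4):
    # take one period starting at `start` and tile it n times
    return np.tile(signal[start:start + period], n)

def average_periods(signal, start, period, n_periods):
    # average n consecutive periods into one 'normalised' step
    chunks = [signal[start + i * period : start + (i + 1) * period]
              for i in range(n_periods)]
    return np.mean(chunks, axis=0)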

Kinematic primitives for walking and trotting gaits of a quadruped robot with compliant legs (Alexander Badri-Spröwitz et al., 2014)

It’s now December 6th 2021, as I continue here…

This paper is very relevant, “Realizing Learned Quadruped Locomotion Behaviors through Kinematic Motion Primitives”

Some Indian PhDs have summed up the process. Unfortunately I’m not quite on the exact same page. I understand the pictures, haha.

Here’s where this picture comes from, which is useful for explaining what I need to do: (Short paper)

Also, from 2014, the same thing: “Kinematic primitives for walking and trotting gaits of a quadruped robot with compliant legs”.

They just used PCA (Principal Component Analysis). That's like a common ML toolkit thing.

Kinematic primitives for walking and trotting gaits of a quadruped robot with compliant legs (2014)

See now this is where they lose me: “The covariance matrix of the normalized dataset”. Come on guys. Throw us a bone.
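
As far as I can tell, the bone is this: normalise each joint-angle channel, take the covariance matrix, and the top eigenvectors are the principal components. A sketch with stand-in data:

import numpy as np

data = np.random.randn(1000, 4)  # stand-in for the servo logs, (timesteps, joints)

normalized = (data - data.mean(axis=0)) / data.std(axis=0)
cov = np.cov(normalized, rowvar=False)  # the (4, 4) covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns eigenvalues ascending

# project onto the top component (largest eigenvalue is last):
kmp1 = normalized @ eigvecs[:, -1]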

I found this picture, which is worth 1000 words, in the discussion on stackexchange about PCA and SVD:

Rotating PCA animation

So, I’m not quite ready for PCA. That is two dimensions, anyway. Oh right, so I need to add a ‘time’ dimension. numpy’s expand_dims?

I played around with Codex, to assist with finding the peaks, and to find the period length.

And I separated them out into different plots… and got the peaks matching once I passed in (…, distance=80).
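
That was scipy's find_peaks, roughly like this; the filename is a placeholder:

import numpy as np
from scipy.signal import find_peaks

leg_angles = np.load('front_right.npy')  # placeholder filename

peaks, _ = find_peaks(leg_angles, distance=80)  # at least 80 steps between peaks
period = int(np.mean(np.diff(peaks)))           # rough period estimate
print(peaks, period)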

I had to install these, and restart the Jupyter kernel (and, I think, close and reopen the Chrome tab), in order to get some matplotlib widgets.

Error message:

Jupyter Lab: Error displaying widget: model not found

!pip3 install --upgrade jupyterlab ipympl
%matplotlib widget

The matplotlib slider example (image thereof)

I started on a slider widget to draw a vertical line on top of the leg data, but I need to fix the refresh issue (see the sketch below). Anyhow, it's not quite what I want. What do I want?
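
For what it's worth, the refresh issue is usually a missing draw_idle() call. A sketch of the slider-plus-vertical-line idea, with dummy data standing in for the leg angles:

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.widgets import Slider

leg_angles = np.sin(np.linspace(0, 20, 1000)) * 45 + 90  # dummy data

fig, ax = plt.subplots()
ax.plot(leg_angles)
line = ax.axvline(0, color='red')

slider = Slider(plt.axes([0.15, 0.01, 0.7, 0.03]), 'index',
                0, len(leg_angles) - 1, valinit=0)

def update(val):
    line.set_xdata([val, val])
    fig.canvas.draw_idle()  # the bit that fixes the refresh

slider.on_changed(update)
plt.show()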

So, I want the kMPs. The kMPs are like, a gif of a basic action, e.g. robot taking a full step forward, on all legs, which we can run once, twice, etc.

We can ‘average’ or ‘normalise’ or ‘phase’ the waves, and assume that gives us a decent average step forward.

I think there’s enough variation in this silly simulation walk that we should start with just the simplest, best single wave.

But since they ran PCA, let’s run it to see what it does for the data. We have a single integer value, which is 1D. To make it 2D, so we can run PCA on it… we add a time dimension?
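
(np.expand_dims does add an axis, but I suspect what PCA actually wants here is the four leg channels stacked as columns; a sketch:)

import numpy as np

trace = np.zeros(1000)               # one leg's angles, 1-D
col = np.expand_dims(trace, axis=1)  # shape (1000, 1) - just adds an axis

# more likely useful: all four legs as columns, shape (1000, 4)
legs = np.column_stack([trace, trace, trace, trace])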

I also measured the period, a few programs up, to be:

67 steps (front right),

40 steps (front left),

59 steps (back right),

42 steps (back left).

So, as a starting point, it would be nice to be as close to servos at 90 degrees as possible. If I iterate the values, and track the lowest sum diff, yeah… is that it? I’m looking at this link at SO.

Ideally I could visualise the options…

Repeating a slice. Averaging the slices.

Ok, so I need a start index, end index, to index a range.

After some investigation, the index where the legs are closest to 90 degrees is 1739.
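
The 'lowest sum diff' search is a one-liner in numpy; something like this, assuming a (timesteps, 4) array of angles in degrees (the filename is a placeholder):

import numpy as np

legs = np.load('legs.npy')  # placeholder: shape (timesteps, 4)

# at each timestep, sum how far all four legs are from 90 degrees
diff = np.abs(legs - 90).sum(axis=1)
start = int(np.argmin(diff))  # this is how an index like 1739 falls out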

Computer Enhance

So that’s kinda close to our ideal kMPs, from about 1739 to about 1870 maybe, but clearly the data is messy. Could be tweaked. Wavetable editor, basically.

Alright, let’s make an app. We can try running a Flask server on the Pi, with a Javascript front end using chart.js.

pip3 install flask

Save the test web app as kmpapp.py:

from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return 'Hello world'

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0') 

python3 kmpapp.py

Ok, good start. We need to get the x and y data into JSON so Javascript can plot it in chart.js.
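
The endpoint side could look like the sketch below (the route name and data file are made up); the front end then fetches /data and hands the arrays to chart.js:

from flask import Flask, jsonify
import numpy as np

app = Flask(__name__)
legs = np.load('legs.npy')  # placeholder data file, shape (timesteps, 4)

@app.route('/data')
def data():
    return jsonify({
        'x': list(range(len(legs))),
        'y': legs[:, 0].tolist(),  # one channel, e.g. front right
    })

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0')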

That’s looking good. Maybe too many points. Ok, so I want to edit, save, and run the KMPs on the robot.

Well it took a day but it’s working, and is pretty cool. Used smooth.js to allow smoother transitions. Took another day to add save and load features.

I’ll upload this to the project repo.

Many improvements added. Will update repo again closer to MFRU.


SCARA

From a website called robots.com, which is a great URL but which otherwise has broken links:


The SCARA, or Selective Compliance Assembly (or Articulated) Robot Arm, provides a circular work envelope.

Here’s one from allaboutcircuits.com:

Industrial Robotics Roundup: Articulated vs. SCARA vs. Cartesian Robots

Hmm, interesting. I think we're doing an articulated build for the gripper; it will need Z-axis movement. But SCARA is cool if you don't need the Z axis (up/down).


what about making materials from chicken industry waste?

The chicken bone robot prototype I made turned out pretty well, mostly because bones have gone through millions of years of evolution to be as strong and light as possible. Sounds like an ideal limb material. It's also really nice to work with. Would be nice to do some composites too, for moulding and 3D printing etc., from egg shells and feathers (2.3 million tonnes of EU feather waste from slaughterhouses per year) and whatever else.

egg shells + calcium carbonate a la Little Pink Maker

Kreative påske til børn (Children's Easter activities in Copenhagen city center) — Little Pink Maker

Giuseppe Abate’s material experiments with chicken waste

Ylem Lab’s chicken feather material “Pluminaire”


probably the prettiest robots I’ve ever seen

“Ecce” Robot pics, taken at the “making robots human” exhibition in Stockholm that Dan and I went to. The exhibition was kinda out-of-date British nationalist techno-utopian propaganda, but whatever, still got some cool hardware inspiration.