Categories
AI/ML chicken_research dev ears

Chicken Song Detection

We might get into some goal modification or behaviour triggering when the robot detects specific chicken sounds.

It seems to be a thing people have been doing for a while: https://www.scientificamerican.com/article/fowl-language-ai-decodes-the-nuances-of-chicken-ldquo-speech-rdquo/

“Over the past five years, engineers and poultry scientists at The University of Georgia and Georgia Institute of Technology have been collaborating to help farmers like Mitchell make better use of the information latent in chicken chatter. In a series of studies published between 2014 and 2016, Georgia Tech research engineer Wayne Daley and his colleagues exposed groups of six to 12 broiler chickens to moderately stressful situations—such as high temperatures, increased ammonia levels in the air and mild viral infections—and recorded their vocalizations with standard USB microphones. They then fed the audio into a machine-learning program, training it to recognize the difference between the sounds of contented and distressed birds.”

For now, just pasting links to libraries and stuff.

librosa: https://librosa.github.io/librosa/
(python package for music and audio analysis)

Pandas: https://pandas.pydata.org/
(python data analysis in general)

If I’m going to do data science, I’ll probably need to install Anaconda: https://www.anaconda.com/distribution/#download-section

Let’s see how far we can get without installing Anaconda, though. It’s like 500 MB, and we might not use any of it. Miniconda seems to be the lightweight version.

Chickens make these distinct calls for aerial and ground predators:

https://static.scientificamerican.com/sciam/assets/File/Aerial%20alarm%20calls.wav

https://static.scientificamerican.com/sciam/assets/File/Ground%20alarm.wav

Will get back to this another time…

Categories
control

Central Pattern Generator

https://en.wikipedia.org/wiki/Central_pattern_generator

Categories
bio Locomotion

Magnetic BioRobots

https://newatlas.com/robotics/tiny-magnetized-robots-warehouse-biotech/

Categories
AI/ML Locomotion robots sim2real simulation

Pre-training your dragon

The links here died, so I’ve updated them to more generic ones:

https://neurorobotics.net

https://hal.inria.fr/

Also

“Sim-to-Real Transfer with Neural-Augmented Robot Simulation” https://proceedings.mlr.press/v87/golemo18a.html

Categories
AI/ML envs neuro robots simulation Uncategorized

NeuroRobotics

https://neurorobotics.net/

Looks like a sweet project. Should investigate.

There’s a whole OpenSim project for human skeletons and muscles:

https://simtk-confluence.stanford.edu:8443/display/OpenSim/Musculoskeletal+Models

https://simtk.org/projects/nmblmodels

Categories
doin a heckin inspire Locomotion

Dinosaur Robot ‘Trudy’

Categories
Math

Dot and cross products

Just a reminder of what these mean.

Dot Product

The dot product is a scalar. You take the X components of both vectors and multiply them together, do the same with the Y components (and Z, in 3D), and then add the results.

(Image: dot product visualised)

Algebraically, the dot product is the sum of the products of the corresponding entries of the two sequences of numbers. Geometrically, it is the product of the Euclidean magnitudes of the two vectors and the cosine of the angle between them. These definitions are equivalent when using Cartesian coordinates.
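
As a sanity check, here are both definitions side by side (standard notation, nothing specific to this post):

\[ \mathbf{a} \cdot \mathbf{b} = a_x b_x + a_y b_y + a_z b_z = \lVert \mathbf{a} \rVert \, \lVert \mathbf{b} \rVert \cos\theta \]

For example, with a = (1, 0, 0) and b = (1, 1, 0): componentwise the dot product is 1, and geometrically it’s 1 · √2 · cos 45° = 1. Same answer.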

The Cross Product

The cross product a × b is defined as a vector c that is perpendicular (orthogonal) to both a and b, with a direction given by the right-hand rule and a magnitude equal to the area of the parallelogram that the vectors span.
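
In components, and for the magnitude (again just the standard definitions):

\[ \mathbf{a} \times \mathbf{b} = \left( a_y b_z - a_z b_y,\; a_z b_x - a_x b_z,\; a_x b_y - a_y b_x \right), \qquad \lVert \mathbf{a} \times \mathbf{b} \rVert = \lVert \mathbf{a} \rVert \, \lVert \mathbf{b} \rVert \sin\theta \]

So for the unit axes, \( \hat{x} \times \hat{y} = \hat{z} \). That’s exactly the trick the MotorDemo code below uses: crossing two unit vectors gives a third, perpendicular axis to rotate around.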

Categories
control dev simulation

Understanding MotorDemo.cpp (Part 2)

Since this is the meaty bit where the robot is configured, we need to understand links and joints first.

The docs are pretty confusingly worded on this. But basically, because the body isn’t counted as a link, you have the same number of links as joints, since you add a joint and a link each time you add a new body part.
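
To make that concrete with this demo’s numbers: NUM_LEGS is 6, so that’s 1 + 2 × 6 = 13 body parts (the root plus a thigh and a shin per leg) and 13 − 1 = 12 joints, which matches the BODYPART_COUNT and JOINT_COUNT defines in Part 1.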

Maybe good to recall the 6 degrees of freedom:

(Image: pitch, yaw, roll)

So the first function is localCreateRigidBody. You can follow along in the code (we’re looking at TestRig) here:

https://github.com/bulletphysics/bullet3/blob/master/examples/DynamicControlDemo/MotorDemo.cpp

It takes a mass, a startTransform, and a shape.

If the mass is not 0, the body is ‘dynamic’. If it’s dynamic, the shape’s calculateLocalInertia is called with the mass to fill in the local inertia vector.

A default motion state is created from the startTransform.

The rigid body construction info takes the mass, motion state, shape and local inertia. body then points to the rigid body created using that construction info.

The rigid body is added to the world, and returned.
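
Stitched together, the function looks roughly like this (reconstructed from the walkthrough above; check the linked MotorDemo.cpp for the exact version):

btRigidBody* localCreateRigidBody(btScalar mass, const btTransform& startTransform, btCollisionShape* shape)
{
    bool isDynamic = (mass != 0.f);

    // static objects (mass == 0) get zero inertia; dynamic ones ask the shape for it
    btVector3 localInertia(0, 0, 0);
    if (isDynamic)
        shape->calculateLocalInertia(mass, localInertia);

    btDefaultMotionState* myMotionState = new btDefaultMotionState(startTransform);
    btRigidBody::btRigidBodyConstructionInfo rbInfo(mass, myMotionState, shape, localInertia);
    btRigidBody* body = new btRigidBody(rbInfo);

    m_ownerWorld->addRigidBody(body);
    return body;
}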

So the TestRig constructor takes the world, a vector position offset, and a boolean for whether it is fixed in place or not.

The comment says // Setup geometry

The body size is 0.25, the leg length is 0.45, and the foreleg length is 0.75.

shapes[0] is a capsule shape of ‘body size’ radius and 0.1 scalar height. Then it iterates through the number of legs, adding shapes[1 + 2*i] as thigh capsules with 0.1 radius and leg length, and shapes[2 + 2*i] as foreleg capsules with 0.08 radius and foreleg length.
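
As a sketch, reconstructed from the description (the fBodySize/fLegLength/fForeLegLength names follow the demo’s conventions):

float fBodySize = 0.25f;
float fLegLength = 0.45f;
float fForeLegLength = 0.75f;

// one capsule for the body...
m_shapes[0] = new btCapsuleShape(btScalar(fBodySize), btScalar(0.10));
// ...then a thigh and a shin capsule per leg
for (int i = 0; i < NUM_LEGS; i++)
{
    m_shapes[1 + 2 * i] = new btCapsuleShape(btScalar(0.10), btScalar(fLegLength));
    m_shapes[2 + 2 * i] = new btCapsuleShape(btScalar(0.08), btScalar(fForeLegLength));
}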

Next is // Setup rigid bodies

It just sets up the offset origin.

Next is // root

Root’s vector is set at 0.5 height, and a transform is set up with its origin at root. If the ‘fixed’ variable is set, the root body is created with (mass, startTransform, shape) of 0, offset * transform, and the capsule shape we just set up. If ‘fixed’ is not set, the mass is 1: same place, same shape.
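
Something like this, reconstructed from the above (bFixed is my guess at the name of the ‘fixed’ flag):

float fHeight = 0.5;
btTransform offset;
offset.setIdentity();
offset.setOrigin(positionOffset);  // the spawn offset passed into TestRig

// root
btVector3 vRoot = btVector3(btScalar(0.), btScalar(fHeight), btScalar(0.));
btTransform transform;
transform.setIdentity();
transform.setOrigin(vRoot);

// mass 0 = static, fixed in place; mass 1 = dynamic
m_bodies[0] = localCreateRigidBody(bFixed ? btScalar(0.) : btScalar(1.), offset * transform, m_shapes[0]);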

Next is // legs

This bit is more involved. The angle is 2*PI*i/NUM_LEGS, to evenly distribute the legs around the root body. fSin is the sine of that angle, and fCos the cosine.

So, set the transform to identity, which means no rotation and no translation. Multiply a vector by the identity and you get the same vector back.

vBoneOrigin is a vector:
X: fCos * (bodySize + 0.5 * legLength),
Y: 0.5,
Z: fSin * (bodySize + 0.5 * legLength)

Then set the transform’s origin to this vector. So this finds where the joint between the body and the thigh goes: the same starting height for all legs, while the X and Z coordinates go around in a circle.
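
So the start of the leg loop is roughly:

// legs
for (int i = 0; i < NUM_LEGS; i++)
{
    float fAngle = 2 * M_PI * i / NUM_LEGS;  // evenly spaced around the body
    float fSin = sin(fAngle);
    float fCos = cos(fAngle);

    transform.setIdentity();
    btVector3 vBoneOrigin = btVector3(btScalar(fCos * (fBodySize + 0.5 * fLegLength)),
                                      btScalar(fHeight),
                                      btScalar(fSin * (fBodySize + 0.5 * fLegLength)));
    transform.setOrigin(vBoneOrigin);  // the thigh's centre, out to the side of the body

    // thigh rotation and shin placement follow, as described below
}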

// thigh

vToBone = (vBoneOrigin - vRoot).normalize()

Normalizing makes a unit vector (changes the length to 1). So vBoneOrigin minus vRoot, normalized, is a unit vector pointing from the root out towards the new thigh joint.

vAxis = vToBone.cross(vUp)
where vUp is a unit vector on the y axis.

transform.setRotation(btQuaternion(vAxis, M_PI_2));

Ok. So… the cross product gives a vector perpendicular to both vToBone and vUp, by the right-hand rule. Both are unit vectors, so vAxis is a unit sideways vector, tangent to the circle the legs are arranged on. AH, ok: so vAxis is the axis to rotate around. And it rotates about that axis by M_PI_2, i.e. by 90 degrees.

The quaternion is just the axis-angle way of writing that 90-degree rotation. Since Bullet capsules are built along the Y axis, this tips the thigh capsule from vertical over to horizontal, along the leg direction.
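
Sanity check with a concrete leg, assuming I’m reading the rotation direction right: for i = 0 we have fAngle = 0, so vBoneOrigin sits on the +X axis and vToBone = (1, 0, 0). Then vAxis = vToBone × vUp = (0, 0, 1), and rotating the Y-aligned capsule by 90 degrees about Z lays it flat along the X axis. Horizontal thigh, pointing along the leg direction, which matches the gif.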

m_bodies[1 + 2 * i] = localCreateRigidBody(btScalar(1.), offset * transform, m_shapes[1 + 2 * i]);

Ok so that makes the thigh leg bits I guess.

// shin

transform.setIdentity();
transform.setOrigin(btVector3(btScalar(fCos * (fBodySize + fLegLength)), btScalar(fHeight - 0.5 * fForeLegLength), btScalar(fSin * (fBodySize + fLegLength))));
m_bodies[2 + 2 * i] = localCreateRigidBody(btScalar(1.), offset * transform, m_shapes[2 + 2 * i]);

So transform is moved to

X: fCos * (bodySize + legLength),
Y: fHeight - 0.5 * foreLegLength,
Z: fSin * (bodySize + legLength)

Note that setIdentity() wiped the thigh’s 90-degree rotation, so the shin capsule stays vertical; we just move its origin to where the foreleg hangs. The Y coordinate is the same for all of them, and the cos/sin pair makes a circle around the Y axis at distance (body + leg). That makes sense.

// Setup some damping on the m_bodies

setDamping, setDeactivationTime and setSleepingThresholds are presumably there so the robot chills for a bit and acts a bit more earth-like.
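
From memory, the demo does something like the following per body; treat the exact numbers as placeholders:

for (int k = 0; k < BODYPART_COUNT; ++k)
{
    m_bodies[k]->setDamping(0.05, 0.85);           // linear and angular damping
    m_bodies[k]->setDeactivationTime(0.8);         // seconds of rest before it may sleep
    m_bodies[k]->setSleepingThresholds(1.6, 2.5);  // linear/angular speeds that count as 'resting'
}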

Ok and finally

// Setup the constraints

Iterates through NUM_LEGS, same angle iteration.

//Hip joints first. (i.e. Between Body and Thigh)

localA.setIdentity();
localB.setIdentity();
localA.getBasis().setEulerZYX(0, -fAngle, 0);
localA.setOrigin(btVector3(btScalar(fCos * fBodySize), btScalar(0.), btScalar(fSin * fBodySize)));
localB = m_bodies[1 + 2 * i]->getWorldTransform().inverse() * m_bodies[0]->getWorldTransform() * localA;

hingeC = new btHingeConstraint(*m_bodies[0], *m_bodies[1 + 2 * i], localA, localB);

hingeC->setLimit(btScalar(-0.75 * M_PI_4), btScalar(M_PI_8));
m_joints[2 * i] = hingeC;
m_ownerWorld->addConstraint(m_joints[2 * i], true);

Ok, so localA is rotated around the Y axis by the current leg angle, and moved out to bodySize from the centre.

OK so for B: it’s the inverse of the thigh’s world transform, times the body’s world transform, times localA. Reading right to left: localA (the pivot in the body’s local frame) gets converted to world coordinates by the body’s world transform, and then the inverse of the thigh’s world transform re-expresses that world point in the thigh’s local frame. So localA and localB describe the same pivot, once from each body’s point of view.
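
Put as an equation: both bodies must agree on the pivot’s world-space pose, so

\[ T^{world}_{body} \cdot localA = T^{world}_{thigh} \cdot localB \quad \Rightarrow \quad localB = \left( T^{world}_{thigh} \right)^{-1} T^{world}_{body} \cdot localA \]

which is exactly the localB line above.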

And then the hinge constraint is set up between body and thigh, with localA and localB. Limits are set to -0.75 * M_PI_4 to M_PI_8, i.e. -33.75 degrees to 22.5 degrees.

//knee joints

localA.setIdentity();
localB.setIdentity();
localC.setIdentity();
localA.getBasis().setEulerZYX(0, -fAngle, 0);
localA.setOrigin(btVector3(btScalar(fCos * (fBodySize + fLegLength)), btScalar(0.), btScalar(fSin * (fBodySize + fLegLength))));
localB = m_bodies[1 + 2 * i]->getWorldTransform().inverse() * m_bodies[0]->getWorldTransform() * localA;
localC = m_bodies[2 + 2 * i]->getWorldTransform().inverse() * m_bodies[0]->getWorldTransform() * localA;
hingeC = new btHingeConstraint(*m_bodies[1 + 2 * i], *m_bodies[2 + 2 * i], localB, localC);
hingeC->setLimit(btScalar(-M_PI_8), btScalar(0.2));
m_joints[1 + 2 * i] = hingeC;
m_ownerWorld->addConstraint(m_joints[1 + 2 * i], true);

And now, the same story but between thigh and foreleg, with limits from -M_PI_8 to 0.2 radians, i.e. -22.5 degrees to about 11.5 degrees.

Anyway, I get the idea.

Categories
3D AI/ML dev simulation

Understanding MotorDemo.cpp (Part 1)

Going through the Bullet3 physics engine example code for ‘MotorDemo’. It’s a hexapod-looking robot flexing its joints. The folder is bullet3/examples/DynamicControlDemo/

https://github.com/bulletphysics/bullet3/tree/master/examples/DynamicControlDemo

(GIF of MotorDemo in action)

OK so there’s the .h file containing

#ifndef MOTORDEMO_H
#define MOTORDEMO_H

class CommonExampleInterface* MotorControlCreateFunc(struct CommonExampleOptions& options);

#endif

C++ declares the function in .h files so that when compiling, all the .cpp files know about each other, without necessarily caring about the implementation details.

So this function is called ‘MotorControlCreateFunc’, which returns a pointer to a CommonExampleInterface, and takes a reference to CommonExampleOptions. Nice OOPy programming.

Let’s get into the CPP file.

Ok variables are:

time, cycle period, muscle strength, and an array of TestRig objects called rig

Constructor takes a pointer to a GUIHelperInterface. Destructor does no cleaning up.

They define an ‘initPhysics‘ and ‘exitPhysics‘. exitPhysics seems to have all the destructor code freeing up memory.

And three functions, spawnTestRig, setMotorTargets, and resetCamera.

#define NUM_LEGS 6
#define BODYPART_COUNT 2 * NUM_LEGS + 1
#define JOINT_COUNT BODYPART_COUNT - 1

Looks like we can change some settings, maybe. Will play with that later. So a leg has 2 body parts, plus there’s the central ‘body’ of the critter.

Now they define the TestRig class

btDynamicsWorld* m_ownerWorld;
btCollisionShape* m_shapes[BODYPART_COUNT];
btRigidBody* m_bodies[BODYPART_COUNT];
btTypedConstraint* m_joints[JOINT_COUNT];

TestRig consists of a world, collision shapes, bodies and joints.

Let’s look at the rest of the code; most of it takes place in TestRig.

So as mentioned before, initPhysics and exitPhysics set up and shut down the physics engine gui.

So this code will be heavily copy-pasted. In this case, it sets up the 3D world, makes a ground plane, and spawns two test rigs (spider things), a small ‘start offset’ vector apart. They are pushed onto the rigs variable.

setMotorTargets takes the elapsed time as an argument. The comment is

// set per-frame sinusoidal position targets using angular motor (hacky?)

So it iterates over the rigs, and then over each rig’s 2 * NUM_LEGS hinges, setting per-hinge targets for the animation.

Per hinge, it gets the current angle; sets the target percentage as (time / 1000) mod the cycle period, all divided by the cycle period; and then sets the target angle as 0.5 * (1 + sin(2 * PI * targetPercent)). So the target sweeps smoothly from 0 to 1 and back once per cycle period.

The target limit angle is the hinge’s lower limit, plus the target angle multiplied by (upper limit - lower limit). That scales the 0-to-1 sine value into each hinge’s allowed range.

Angle Error is the target limit angle minus the current angle.

The desired angular velocity is the angle error divided by the time step, and that gets passed to each hinge’s enableAngularMotor function.
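
Putting the per-hinge logic together, it’s roughly this (a sketch reconstructed from the description above; member names like m_Time, m_fCyclePeriod and m_fMuscleStrength are my guesses):

btHingeConstraint* hingeC = static_cast<btHingeConstraint*>(m_rigs[r]->GetJoints()[i]);
btScalar fCurAngle = hingeC->getHingeAngle();

// fraction of the way through the current gait cycle
btScalar fTargetPercent = (int(m_Time / 1000) % int(m_fCyclePeriod)) / m_fCyclePeriod;
// sine wave mapped into [0, 1]
btScalar fTargetAngle = 0.5 * (1 + sin(2 * M_PI * fTargetPercent));
// scale into this hinge's allowed range
btScalar fTargetLimitAngle = hingeC->getLowerLimit() + fTargetAngle * (hingeC->getUpperLimit() - hingeC->getLowerLimit());
// drive the motor at the velocity that would close the error in one step
btScalar fAngleError = fTargetLimitAngle - fCurAngle;
hingeC->enableAngularMotor(true, fAngleError / timeStep, m_fMuscleStrength);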

I’d be lying if I said I understood entirely. But this is pretty much the basis of the program.

void motorPreTickCallback(btDynamicsWorld* world, btScalar timeStep)
{
    MotorDemo* motorDemo = (MotorDemo*)world->getWorldUserInfo();
    motorDemo->setMotorTargets(timeStep);
}

It basically sets all the motor targets based on the time that has passed, using a sine wave and the flexing limits of the legs.

Ok. More later.

Categories
dev Linux

Setting up Java/C++ dev environment

The instructions in the pyBullet wiki got me thinking that I could use a proper IDE on the Linux machine. I’m too old to learn emacs, and I’m not doing this in vi.

PyCharm is pretty good for Python.

But the Bullet Engine examples are all in C++, so I went with Eclipse CDT https://www.eclipse.org/cdt/ for a nicer experience.

Eclipse needs Java, so I installed it with

sudo apt install openjdk-8-jdk

Unpacked the gzipped tar file and deleted it:

tar xvf eclipse-cpp-2020-03-R-incubation-linux-gtk-x86_64.tar.gz
rm eclipse-cpp-2020-03-R-incubation-linux-gtk-x86_64.tar.gz

Then ran it:

./opt/eclipse/eclipse

Shit yeah! colours