Looks like a sweet project. Should investigate
There’s a whole OpenSim project for human skeletons and muscles
https://simtk-confluence.stanford.edu:8443/display/OpenSim/Musculoskeletal+Models
Let’s get back to the main thing we wanted to look at.
pip3 install pybullet --upgrade --user
cd /opt
git clone https://github.com/bulletphysics/bullet3.git
cd bullet3/
cd build3/
cmake -G"Eclipse CDT4 - Unix Makefiles" -DCMAKE_BUILD_TYPE=Release -DBUILD_BULLET2_DEMOS=ON -DBUILD_CPU_DEMOS=ON -DBUILD_BULLET3=ON -DBUILD_OPENGL3_DEMOS=ON -DBUILD_EXTRAS=ON -DBUILD_SHARED_LIBS=ON -DINSTALL_EXTRA_LIBS=ON -DUSE_DOUBLE_PRECISION=ON ..
make
make install
ldconfig
ok that installs it and builds the examples but seems to be for Eclipse CDT. Not sure I need a C++ IDE.
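Either way, the pip-installed pybullet is already usable on its own. Here's a minimal sanity-check sketch (my own, not from these notes; it uses the plane.urdf and r2d2.urdf that ship with pybullet_data):
import pybullet as p
import pybullet_data

client = p.connect(p.DIRECT)  # headless; use p.GUI to get a window
p.setAdditionalSearchPath(pybullet_data.getDataPath())  # bundled example URDFs
p.setGravity(0, 0, -9.81)
plane = p.loadURDF("plane.urdf")
robot = p.loadURDF("r2d2.urdf", basePosition=[0, 0, 0.5])
for _ in range(240):  # one second of simulated time at the default 240 Hz
    p.stepSimulation()
print(p.getBasePositionAndOrientation(robot))
p.disconnect()
If that prints a pose sitting near the ground plane, the Python side is fine regardless of what the C++ build is doing.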
ok so I was supposed to do this for cmake:
./build_cmake_pybullet_double.sh
But anyway,
cd ./examples/RobotSimulator
./App_RobotSimulator
COOL.
Ok, let's run build_cmake_pybullet_double.sh
ok there we go
cd ./build_cmake/examples/ExampleBrowser
./App_ExampleBrowser
OK this is great.
ThiruVenthan, Dec 7, 2018 · 5 min read
Robot Operating System (ROS) is an open-source robotics middleware licensed under the BSD license. ROS provides services like communication between processes, low-level device control, hardware abstraction, package management, and visualization tools for debugging. A ROS-based program can be represented as a graph where processing happens in nodes, and the nodes communicate with each other to carry out the overall task.
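To make the node/topic graph concrete, here's a minimal rospy node sketch (my own illustration, not from the article; the "chatter" topic name is made up):
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

# One node in the graph: publishes a greeting on the (made-up) "chatter" topic at 1 Hz.
rospy.init_node("talker")
pub = rospy.Publisher("chatter", String, queue_size=10)
rate = rospy.Rate(1)
while not rospy.is_shutdown():
    pub.publish(String(data="hello from talker"))
    rate.sleep()
Any other node that subscribes to chatter (even just rostopic echo /chatter in another terminal) becomes a second vertex in the graph, with the topic as the edge between them.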
What is Gazebo?
Gazebo is a robotics simulator that allows us to simulate and test our algorithms in indoor and outdoor environments. Some of its great features are advanced 3D visualization, support for various physics engines (ODE, Bullet, Simbody, and DART), and the ability to simulate sensors with noise, which ultimately gives more realistic simulation results.
Launching Gazebo with the robot model
Go to the cloned directory, open a terminal (Ctrl+Alt+T), and run the following commands.
cd rover_ws
catkin_make
source devel/setup.bash
roslaunch rover_gazebo rover_world.world
You should now see the robot in a simulation world in Gazebo.
Open another terminal and run the following command to see the available topics.
rostopic list
You will get the following as the output.
/clock
/cmd_vel
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/set_link_state
/gazebo/set_model_state
/gazebo_gui/parameter_descriptions
/gazebo_gui/parameter_updates
/joint_states
/odom
/rosout
/rosout_agg
/tf
/tf_static
From this output you can see that there are no topics related to sensors yet, but the /cmd_vel topic is available, so we can drive the robot by publishing commands to it (see the command below, and the rospy sketch after it). Since the robot uses a differential-drive mechanism, changing the linear x and angular z values moves it around.
rostopic pub /cmd_vel geometry_msgs/Twist "linear:
x: 1.0
y: 0.0
z: 0.0
angular:
x: 0.0
y: 0.0
z: 0.0"
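The same thing can be done from code. A minimal rospy sketch, assuming the /cmd_vel topic above; the node name and the 10-second duration are arbitrary:
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

# Differential drive: linear.x drives forward/backward, angular.z turns in place.
rospy.init_node("drive_forward")
pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
cmd = Twist()
cmd.linear.x = 1.0   # m/s forward
cmd.angular.z = 0.0  # rad/s yaw rate
rate = rospy.Rate(10)
end_time = rospy.Time.now() + rospy.Duration(10)  # publish for roughly 10 seconds
while not rospy.is_shutdown() and rospy.Time.now() < end_time:
    pub.publish(cmd)
    rate.sleep()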
Adding sonar and IR sensor models to the robot model
Open the rover.xacro file in the rover_ws/src/rover_description directory using your favorite text editor, and add the following code above the </robot> tag.
<joint name="ir_front_joint" type="fixed">
  <axis xyz="0 1 0" />
  <origin rpy="0 0 0" xyz="0.5 0 0" />
  <parent link="base_footprint"/>
  <child link="base_ir_front"/>
</joint>

<link name="base_ir_front">
  <collision>
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <geometry>
      <box size="0.01 0.01 0.01"/>
    </geometry>
  </collision>
  <visual>
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <geometry>
      <box size="0.01 0.01 0.01"/>
    </geometry>
  </visual>
  <inertial>
    <mass value="1e-5" />
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <inertia ixx="1e-6" ixy="0" ixz="0" iyy="1e-6" iyz="0" izz="1e-6" />
  </inertial>
</link>

<joint name="sonar_front_joint" type="fixed">
  <axis xyz="0 1 0" />
  <origin rpy="0 0 0" xyz="0.5 0 0.25" />
  <parent link="base_footprint"/>
  <child link="base_sonar_front"/>
</joint>

<link name="base_sonar_front">
  <collision>
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <geometry>
      <box size="0.01 0.01 0.01"/>
    </geometry>
  </collision>
  <visual>
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <geometry>
      <box size="0.01 0.01 0.01"/>
    </geometry>
  </visual>
  <inertial>
    <mass value="1e-5" />
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <inertia ixx="1e-6" ixy="0" ixz="0" iyy="1e-6" iyz="0" izz="1e-6" />
  </inertial>
</link>
The above lines of code integrate the sensor models (simple small boxes) into the robot model. For a basic understanding of a robot's URDF file, refer to the URDF documentation. Now launch the world again with the following command.
roslaunch rover_gazebo rover_world.world
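If you want to confirm the new links actually made it into the model without eyeballing Gazebo, here's a quick sketch; I'm assuming the xacro file lives at rover_ws/src/rover_description/urdf/rover.xacro and that the ROS xacro and urdf_parser_py Python packages are installed:
import xacro
from urdf_parser_py.urdf import URDF

# Expand the xacro macros, parse the resulting URDF, and list the link names.
doc = xacro.process_file("rover_ws/src/rover_description/urdf/rover.xacro")  # assumed path
robot = URDF.from_xml_string(doc.toxml())
print([link.name for link in robot.links])  # should now include base_ir_front and base_sonar_front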
By changing the origin rpy and xyz values within the joint tag, the sensor position can be changed:
<origin rpy="0 0 0" xyz="0.5 0 0.25" />
You will notice that the sensor model is now visible on top of the robot model.
Adding the sensor plugin for Sonar and IR
The gazebo_ros_range plugin can be used to model both the sonar and the IR sensor. This plugin publishes messages in the sensor_msgs/Range message format, so integration with ROS is straightforward. To add the plugin, open the rover_ws/src/rover_description/urdf/rover.gazebo file in your favorite text editor and add the following lines above the </robot> tag.
<gazebo reference="base_ir_front">
  <sensor type="ray" name="TeraRanger">
    <pose>0 0 0 0 0 0</pose>
    <visualize>true</visualize>
    <update_rate>50</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>10</samples>
          <resolution>1</resolution>
          <min_angle>-0.14835</min_angle>
          <max_angle>0.14835</max_angle>
        </horizontal>
        <vertical>
          <samples>10</samples>
          <resolution>1</resolution>
          <min_angle>-0.14835</min_angle>
          <max_angle>0.14835</max_angle>
        </vertical>
      </scan>
      <range>
        <min>0.01</min>
        <max>2</max>
        <resolution>0.02</resolution>
      </range>
    </ray>
    <plugin filename="libgazebo_ros_range.so" name="gazebo_ros_range">
      <gaussianNoise>0.005</gaussianNoise>
      <alwaysOn>true</alwaysOn>
      <updateRate>50</updateRate>
      <topicName>sensor/ir_front</topicName>
      <frameName>base_ir_front</frameName>
      <radiation>INFRARED</radiation>
      <fov>0.2967</fov>
    </plugin>
  </sensor>
</gazebo>
The block above is for the IR sensor. You can simply copy and paste it again for the sonar, set the gazebo reference to base_sonar_front, and change topicName and frameName to the appropriate values. Now launch Gazebo again.
roslaunch rover_gazebo rover_world.world
The sonar and IR sensor rays can now be seen in the simulation world. To see the sensor readings, subscribe to the appropriate topic; see the commands below.
rostopic list
Now the output will be like this:
/clock
/cmd_vel
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/set_link_state
/gazebo/set_model_state
/gazebo_gui/parameter_descriptions
/gazebo_gui/parameter_updates
/joint_states
/odom
/rosout
/rosout_agg
/sensor/ir_front
/sensor/sonar_front
/tf
/tf_static
You will notice that the sonar and IR sensors are publishing to two new topics, namely /sensor/ir_front and /sensor/sonar_front.
rostopic echo /sensor/ir_front
Type the above command in the terminal to observe the output from the IR sensor:
header:
seq: 1
stamp:
secs: 1888
nsecs: 840000000
frame_id: "base_ir_front"
radiation_type: 1
field_of_view: 0.296700000763
min_range: 0.00999999977648
max_range: 2.0
range: 0.0671204701066
---
header:
seq: 2
stamp:
secs: 1888
nsecs: 860000000
frame_id: "base_ir_front"
radiation_type: 1
field_of_view: 0.296700000763
min_range: 0.00999999977648
max_range: 2.0
range: 0.0722889602184
---
header:
seq: 3
stamp:
secs: 1888
nsecs: 880000000
frame_id: "base_ir_front"
radiation_type: 1
field_of_view: 0.296700000763
min_range: 0.00999999977648
max_range: 2.0
range: 0.0635933056474
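To consume those readings from code instead of rostopic echo, here's a minimal subscriber sketch (using the /sensor/ir_front topic name set up above; the node name is arbitrary):
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Range

# Log the distance reported by the simulated IR sensor.
def on_range(msg):
    rospy.loginfo("IR range: %.3f m (field of view %.3f rad)", msg.range, msg.field_of_view)

rospy.init_node("ir_listener")
rospy.Subscriber("/sensor/ir_front", Range, on_range)
rospy.spin()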
In the same way, all the other sensors (lidar, camera, IMU) can be integrated into the robot model. This helps a lot in validating algorithms and finding the optimal sensor positions without fully building the actual hardware.
As an interlude from checking out Blender/phobos, I saw that the big free robot technology effort of the 2010s was probably ROS, the robot operating system, and as it has been around for a long time, it has had its own established ways, in the pre-phobos days. Before these kids could just click their doodads and evolve silicon lifeforms when they felt like it.
No, the Phobos documentation is actually just a bit terse, so let's watch some Gazebo videos. This dude just jumps in, expecting you to have things set up. Then I hit https://stackoverflow.com/questions/41234957/catkin-command-not-found, and this guy and me are in the same boat: "Probably forgot to set up the environment after installing ROS."
So yeah, what the hell is catkin? It's like how Ruby on Rails wants you to ask it to make boilerplate for you. So we had to run:
mkdir chicken_project
cd chicken_project/
ls (nothing here, boss)
mkdir src
cd src
catkin_init_workspace
cd ..
catkin_make
source /opt/ros/chicken_project/devel/setup.bash
So, starting 53 seconds in:
Ok so his time guesstimate is a shitload of time. We’re going to go for the shoot first approach. Ok he copy-pastes some code, and apparently you have to copy it from the youtube video. No thanks.
So this was his directory structure, anyway.
cd src
catkin_create_pkg my_simulations
cd my_simulations/
mkdir launch
cd launch/
touch my_world.launch
cd ..
mkdir world
cd world
touch empty_world.world
That is funny. Touch my world. But he’s leading us on a wild goose chase. We can’t copy paste from youtube, dumbass.
Robot Operating System installation http://wiki.ros.org/melodic/Installation/Ubuntu
The geometric model describes the geometric relationships that specify the spatial extent of a given component.
From this website: https:
Kinematics is the study of motion of bodies without regard to the forces that cause the motion. Dynamics, on the other hand, is the study of the motion of bodies due to applied forces (think F=ma). For example, consider orbital mechanics: Kepler's Laws are kinematic, in that they describe characteristics of a satellite's orbit such as its elliptical shape without considering the forces that cause that motion, whereas Newton's Law of Gravity is dynamic, as it incorporates the force of gravity to describe why the orbit is elliptical.
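As a toy illustration of the distinction (my own example, not from the quoted site): integrating Newton's second law for a projectile (dynamics) reproduces the trajectory that a kinematic formula describes directly, without any mention of forces:
# Dynamics: step F = -m*g for a projectile and integrate the motion.
# Kinematics: the closed-form y(t) = v0*t - 0.5*g*t**2 describes the same motion
# without referring to the force at all.
m, g = 1.0, 9.81      # kg, m/s^2
v0, dt = 20.0, 0.001  # initial upward speed (m/s), time step (s)

y, v, t = 0.0, v0, 0.0
while t < 1.0:
    a = -m * g / m    # acceleration from the force F = -m*g
    v += a * dt
    y += v * dt
    t += dt

y_kinematic = v0 * t - 0.5 * g * t ** 2
print(y, y_kinematic)  # both come out near 15.1 m at t = 1 s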