Categories
control envs Gripper Research Locomotion simulation

Meta-world

It’s a set of training tasks for 6-DOF (degree-of-freedom) robot arms.

https://github.com/rlworkgroup/metaworld

Here’s a 38MB gif explaining it: https://meta-world.github.io/figures/ml45-1080p.gif

“Meta-World is an open-source simulated benchmark for meta-reinforcement learning and multi-task learning consisting of 50 distinct robotic manipulation tasks. We aim to provide task distributions that are sufficiently broad to evaluate meta-RL algorithms’ generalization ability to new behaviors.”
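For a feel of the API, here’s a minimal sketch of running a single task, adapted from the repo’s README (task names and benchmark classes may differ between versions):

    import random
    import metaworld

    # ML1 is the single-task benchmark; 'pick-place-v1' is one of the 50 tasks.
    ml1 = metaworld.ML1('pick-place-v1')

    env = ml1.train_classes['pick-place-v1']()    # build the environment
    env.set_task(random.choice(ml1.train_tasks))  # sample a task variation

    obs = env.reset()
    action = env.action_space.sample()            # random arm action
    obs, reward, done, info = env.step(action)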

Categories
Behaviour envs meta simulation

Animal-AI 2.0

Like Meta-World, but with 900 tasks, and built with Unity: http://animalaiolympics.com/AAI/

github: https://github.com/beyretb/AnimalAI-Olympics

“The Animal-AI Olympics was built using Unity’s ML-Agents Toolkit.

The Python library located in animalai extends ml-agents v0.15.0. Mainly, we add the possibility to change the configuration of arenas between episodes.”
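Roughly, per-episode arena reconfiguration should look something like this. A sketch only: the class names, import paths, and the reset keyword are assumptions from memory of the v2 API, so check the repo before trusting any of it:

    # Assumed API: AnimalAIEnvironment, ArenaConfig and the reset keyword
    # are from memory and may not match the real animalai package.
    from animalai.envs.arena_config import ArenaConfig
    from animalai.envs.environment import AnimalAIEnvironment

    env = AnimalAIEnvironment(file_name='env/AnimalAI')  # path to the Unity build

    # The fork's main addition: swap the arena definition between episodes.
    food_arena = ArenaConfig('configs/food_preference.yaml')     # hypothetical files
    maze_arena = ArenaConfig('configs/object_permanence.yaml')

    env.reset(arenas_configurations=food_arena)
    # ... run an episode ...
    env.reset(arenas_configurations=maze_arena)
    env.close()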

To get an idea of the experiments: http://animalaiolympics.com/AAI/testbed

They had a competition of ‘animal AIs’ in 2019, using EvalAI:

EvalAI

“The competition was kindly hosted on EvalAI, an open source web application for AI competitions. Special thanks to Rishabh Jain for his help in setting this up. We will aim to reopen submissions with new hidden files in order to keep some form of competition going.”

Deshraj Yadav, Rishabh Jain, Harsh Agrawal, Prithvijit Chattopadhyay, Taranjeet Singh, Akash Jain, Shiv Baran Singh, Stefan Lee and Dhruv Batra (2019) EvalAI: Towards Better Evaluation Systems for AI Agents

arxiv: https://arxiv.org/pdf/1902.03570.pdf

Categories
CNNs dev

CNN Training

Which image resolution should I use for training a deep neural network?

The CIFAR dataset is 32px × 32px,

MIT’s is 128px × 128px,

Stanford’s is 96px × 96px.

Following the advice here https://towardsdatascience.com/boost-your-cnn-image-classifier-performance-with-progressive-resizing-in-keras-a7d96da06e20

“small-image models are much faster to train.”

“Here is a smoothed kernel-density plot of image sizes in our “Open Fruits” dataset:”


“We see here that the images peak at around 128x128 in size. So for our initial input size we will choose 1/3 of that: 48x48.

Now it’s time to experiment! What kind of model you end up building in this phase of the project is entirely up to you.” (https://towardsdatascience.com/boost-your-cnn-image-classifier-performance-with-progressive-resizing-in-keras-a7d96da06e20)
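The mechanics of progressive resizing are simple if the network body is fully convolutional, since conv weights don’t care about input size. A minimal Keras sketch of the idea, with a made-up model and class count rather than the article’s actual one:

    import tensorflow as tf
    from tensorflow.keras import layers, models

    def build_model(size, n_classes=10):          # n_classes is a placeholder
        # All-conv body + global average pooling, so the same weights
        # load at 48x48, 96x96, 128x128, ...
        return models.Sequential([
            layers.Input(shape=(size, size, 3)),
            layers.Conv2D(32, 3, activation='relu', padding='same'),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation='relu', padding='same'),
            layers.MaxPooling2D(),
            layers.GlobalAveragePooling2D(),
            layers.Dense(n_classes, activation='softmax'),
        ])

    small = build_model(48)
    small.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
    # small.fit(images_48, labels, ...)           # fast first pass at 48x48

    big = build_model(128)
    big.set_weights(small.get_weights())          # warm-start at full resolution
    big.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
    # big.fit(images_128, labels, ...)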

I’ll have a look at the chicken images, and see how to scale them down. Maybe pre-processing with ffmpeg or ImageMagick’s convert is better. But we’ll get there soon enough.
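ffmpeg or ImageMagick would do it, but since the training code is Python anyway, Pillow works too. A quick sketch with placeholder paths:

    from pathlib import Path
    from PIL import Image

    SRC = Path('chickens/raw')      # placeholder paths
    DST = Path('chickens/small')
    DST.mkdir(parents=True, exist_ok=True)

    for path in SRC.glob('*.jpg'):
        img = Image.open(path)
        img.thumbnail((128, 128))   # downscale in place, keeping aspect ratio
        img.save(DST / path.name, quality=90)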

Categories
deep hardware

TPUs and Graphics cards for AI

So first of all, there are TPUs, Tensor Processing Units, like Google’s Coral line (https://coral.ai/ / https://coral.ai/products/), which is more specialised. They’re ASICs.

“A tensor processing unit (TPU) is an AI accelerator application-specific integrated circuit (ASIC) developed by Google specifically for neural network machine learning, particularly using Google’s own TensorFlow software. Google began using TPUs internally in 2015, and in 2018 made them available for third party use, both as part of its cloud infrastructure and by offering a smaller version of the chip for sale.”

Second, like, you don’t even need physical cards. You can rent a server at Hetzner, or just buy “Compute” on AWS or Google, etc.

So, how to train your dragon?

Gaming cards are not as fast as TPUs, but they’re pretty good for gaming. That’s something to consider too.

“Which graphics card for deep learning?”

That 2019 advice is a bit outdated now. It predates the latest AMD RX 57*/58* series (e.g. the “RX 580X”).

Latest advice, 2020 August:

“AMD Ryzen Threadripper 2950x with 2 x Nvidia RTX 2080 Ti.”

NVIDIA has better software support, usually. It’s almost like vi vs. emacs – an eternal battle of the hardware Gods, to increase FLOPS. AMD vs. NVIDIA, newt vs snake, red vs. blue.

AMD has its “Vega” cards on a 7nm manufacturing process. It’s ahead, for now.

Well, ok here we go, for AMD: holy moly $1899 https://www.amd.com/en/graphics/servers-radeon-instinct-mi

Recent tech radar says:

Best graphics cards at a glance

  1. AMD Radeon RX 5700
  2. Nvidia GeForce RTX 2080 Ti
  3. AMD Radeon RX 5600 XT
  4. Nvidia GeForce RTX 2070 Super
  5. Nvidia GeForce GTX 1660 Super
  6. AMD Radeon VII
  7. Nvidia GeForce RTX 2080 Super
  8. Zotac GeForce GTX 1080 Ti Mini
  9. Gigabyte GeForce GTX 1660 OC 6G
  10. PNY GeForce GTX 1660 Ti XLR8 Gaming OC

NVIDIA has its own edge device, an answer to Google’s Edge TPU: the Jetson Xavier NX, https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-xavier-nx/?nvid=nv-int-csfg-78188#cid=gtcev_nv-int-csfg_en-us

JETSON XAVIER NX: $399. Its performance is rated in Tera Operations Per Second (TOPS).

Up to 21 TOPS (at 15 W) or 14 TOPS (at 10 W).

Tera is a lot of OPS.
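For perspective, a quick back-of-envelope on what 21 TOPS buys per camera frame (the frame rate is just an example):

    tops = 21e12            # 21 tera-operations per second, at 15 W
    fps = 30                # say we want vision at 30 frames per second
    ops_per_frame = tops / fps
    print(f'{ops_per_frame:.1e} ops per frame')   # 7.0e+11, i.e. 700 G-ops/frame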

Anyway, what to think of all this? Graphics cards are pretty expensive. And there’s a whole new world of IoT edge computing devices, which are more what we’re interested in, anyway.

For graphics cards, about a year ago the GTX 1060 (6GB) was the best deal. AMD was out of the race. But then they got the 7nm process and whipped up some cool-sounding CPUs, in 16 and 32 core versions. So however shitty their software is, they make very efficient, parallelised products, across CPU and GPU, and have historically been the ones following open standards. NVIDIA is proprietary. But CUDA used to be practically the only game in town.

Anyway, we can just see how long it takes to train the detectron2 chicken and egg image segmentation code.

I can probably just leave my 4 CPU cores training overnight for the things we want to do, or set up the Raspberry Pi to work on something.
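Before committing to an overnight run it’s worth timing a few training steps on whatever hardware is at hand. A rough PyTorch sketch; the model and batch shape are arbitrary stand-ins, not the detectron2 setup:

    import time
    import torch
    import torchvision

    device = 'cuda' if torch.cuda.is_available() else 'cpu'
    model = torchvision.models.resnet18(num_classes=2).to(device)  # stand-in model
    batch = torch.randn(8, 3, 128, 128, device=device)
    labels = torch.randint(0, 2, (8,), device=device)

    criterion = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    start = time.time()
    for _ in range(10):                 # average over a few steps
        optimizer.zero_grad()
        loss = criterion(model(batch), labels)
        loss.backward()
        optimizer.step()
    print(f'{device}: {(time.time() - start) / 10:.3f}s per step')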

Categories
AI/ML chicken research chicken_research The Chicken Experience

Broiler stunned

“When applied to a reinforcing dataset containing 27,828 images of chickens in a stunned state, the identification accuracy of the model was 98.06%. This was significantly higher than both the established back propagation neural network model (90.11%) and another Faster-RCNN model (96.86%). The proposed algorithm can complete the inspection of the stunned state of more than 40,000 broilers per hour. The approach can be used for online inspection applications to increase efficiency, reduce labor and cost, and yield significant benefits for poultry processing plants.” https://www.sciencedirect.com/science/article/pii/S0032579119579093

Their abstract frames the benefit in terms of slaughtering efficiency. Interesting ‘local optima’, ethics-wise. But yes, since we kill 178 million broiler chickens a day, we should at least have an AI checking that the stunning worked. Perhaps implement some “ethics policy” to re-stun the chicken if it wasn’t properly stunned.

(Stunning means the conveyor belt dips the chickens’ heads into electrified water to stun them, so their heads dangle and can be ripped off mechanically.)

Categories
AI/ML neuro

OpenCog

Eventually we’ll need AGI. So Ben Goertzel started OpenCog, which is about “cognitive synergy”.

OpenCog is a more complicated, richer hybrid approach, which incorporates deep learning along with a lot of other stuff.

It looks like a well-developed framework: https://wiki.opencog.org/w/Hands_On_With_OpenCog

Categories
AI/ML Behaviour chicken research CNNs Vision

Egg ID

This is a notably relevant paper from 2019 that appears to keep track of eggs:

“Our custom SSD object detection and classification model classified when chickens and eggs were detected by the video camera. Our models can label video frames with classifications for 8 breeds of chickens and 4 colors of eggs, with 98% accuracy on chickens or eggs alone and 82.5% accuracy while detecting both types of objects.”


“Tuned accuracy is needed for proper thresholding of object detection”

https://scholar.smu.edu/cgi/viewcontent.cgi?article=1073&context=datasciencereview (https://scholar.smu.edu/datasciencereview/vol2/iss1/20/)
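That last point, tuning the detection threshold, is easy to illustrate. A generic sketch with a made-up data layout, not their SSD’s actual output format:

    def filter_detections(boxes, scores, labels, thresholds, default=0.5):
        """Keep detections whose score clears the per-class threshold."""
        return [
            (box, score, label)
            for box, score, label in zip(boxes, scores, labels)
            if score >= thresholds.get(label, default)
        ]

    # Eggs are small and easily missed, so we might accept a looser
    # threshold for them, trading precision for recall.
    kept = filter_detections(
        boxes=[(10, 10, 50, 50), (60, 60, 80, 80)],
        scores=[0.91, 0.42],
        labels=['chicken', 'egg'],
        thresholds={'chicken': 0.7, 'egg': 0.4},
    )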

Also interesting,

Factors Affecting Egg Production in Backyard Chicken Flocks

https://edis.ifas.ufl.edu/pdffiles/ps/ps02900.PDF

Categories
Behaviour bio chicken_research

Chicken behaviour

Lots of interesting stuff.

Fighting: “This fighting often continues until they reach maturity and the pecking order is well established.”

Foraging: “In the wild, jungle fowl spend 61% of their time foraging. Foraging behaviors include pecking and scratching at potential food sources, as well as looking for and sampling possible food sources. Providing chickens with a complete feed eliminates the need for foraging in order to obtain nutrients, but the hens will continue performing this behavior”

Nesting: “Birds are mimics”

Categories
AI/ML envs Vision

COCO, ShapeNet, Pix3d

These are some examples of datasets, each collected for different purposes:

https://cocodataset.org/

https://www.shapenet.org/about

http://pix3d.csail.mit.edu/

Torchvision:

torchvision.datasets
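Most of the standard sets are one import away; for example CIFAR-10, the 32x32 set mentioned in the CNN training post above:

    from torchvision import datasets, transforms

    cifar = datasets.CIFAR10(
        root='./data', train=True, download=True,
        transform=transforms.ToTensor(),
    )
    image, label = cifar[0]   # a 3x32x32 float tensor and an int class index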

Categories
chicken_research

Lol Cute Correspondence

Sustainable Chicken Farming Research
5 messages
From: miranda moss <miranda.and.a.moss@gmail.com>, 14 August 2020 at 11:40
To: info@gardenbergsnas.se
Dear Ulrika and Tomas,

Firstly, an apology that I can’t speak Swedish! I hope that’s ok.

My name is Miranda and I am a Master’s student in Sustainable Design at Linnaeus University in Växjö. I am working on a practical research project which aims to decrease cruelty in the mass egg farming industry by using technology to provide humane, cost-effective alternatives for large-scale commercial battery farms.

I found out about your farm from the Reko Ring, and I was wondering if you may be open to me coming to visit to do some research? Your small scale, free-range practices are exactly what we would like to convince big agri-business should be possible in the future.

The research would entail me taking some video footage of the chickens and where they live, seeing how they respond to a small robot prototype, and hopefully, if you have time, asking you some questions about your sustainable chicken farming practices.

I hope that you will be interested in contributing to research in pursuit of a more sustainable future! Please let me know if you have any questions. 

All the very best,
Miranda Moss.  
From: Dagar Groblad <dagar.groblad@gmail.com>, 17 August 2020 at 06:09
To: miranda moss <miranda.and.a.moss@gmail.com>
Hello Miranda,

I think you have sent the e-mail to the wrong person. We are not Ulrika and Tomas, and we do not have chickens. Good luck with your research!


Best Regards

Dagar Groblad
From: miranda moss <miranda.and.a.moss@gmail.com>, 17 August 2020 at 14:51
To: Dagar Groblad <dagar.groblad@gmail.com>
Hi Dagar,
Sorry for the mistake! It must have been a strange email to receive! Thanks for letting me know though.
All the best,
Miranda.
From: Dagar Groblad <dagar.groblad@gmail.com>, 17 August 2020 at 14:54
To: miranda moss <miranda.and.a.moss@gmail.com>
No problem, and not strange. Happy that people like you exist. Together we build something new to try and make a better world.
All the best,
Dagar
From: miranda moss <miranda.and.a.moss@gmail.com>, 18 August 2020 at 09:05
To: Dagar Groblad <dagar.groblad@gmail.com>
Thanks for your kind words and encouragement 🙂
Best of luck to you too, on your journey to making a better world!