Categories
doin a heckin inspire Hardware hardware_ Locomotion robots

Hardware Inspiration

MIT’s Mini Cheetah breakdown. ODrive, nice motors, etc…

James Bruton’s Mini Dog, with servos and arduino

James Bruton’s OpenDog with badass DC motors

Categories
deep hardware_

TPUs and Graphics cards for AI

So first of all, there are TPUs, Tensor Processing Units, like Google’s Coral line https://coral.ai/ / https://coral.ai/products/ which are more specialised. They’re ASICs.

A tensor processing unit (TPU) is an AI accelerator application-specific integrated circuit (ASIC) developed by Google specifically for neural network machine learning, particularly using Google’s own TensorFlow software.[1] Google began using TPUs internally in 2015, and in 2018 made them available for third party use, both as part of its cloud infrastructure and by offering a smaller version of the chip for sale.

Second, you don’t even need physical cards. You can rent a server at Hetzner, or just buy compute on AWS or Google Cloud, etc.

So, how to train your dragon?

Gaming cards are not as fast as TPUs at neural network work, but they’re general-purpose, and you can still game on them. That’s something to consider too.

“Which graphics card for deep learning?”

2019 is a bit outdated: it was written before the latest AMD RX 57*/58* (e.g. “RX 580X”) series.

Latest advice, 2020 August:

“AMD Ryzen Threadripper 2950x with 2 x Nvidia RTX 2080 Ti.”

NVIDIA usually has better software support. It’s almost like vi vs. emacs: an eternal battle of the hardware gods to increase FLOPS. AMD vs. NVIDIA, newt vs. snake, red vs. blue.

AMD’s “Vega” cards are on a 7nm manufacturing process. It’s ahead, for now.

Well, ok here we go, for AMD: holy moly $1899 https://www.amd.com/en/graphics/servers-radeon-instinct-mi

A recent TechRadar roundup says:

Best graphics cards at a glance

  1. AMD Radeon RX 5700
  2. Nvidia GeForce RTX 2080 Ti
  3. AMD Radeon RX 5600 XT
  4. Nvidia GeForce RTX 2070 Super
  5. Nvidia GeForce GTX 1660 Super
  6. AMD Radeon VII
  7. Nvidia GeForce RTX 2080 Super
  8. Zotac GeForce GTX 1080 Ti Mini
  9. Gigabyte GeForce GTX 1660 OC 6G
  10. PNY GeForce GTX 1660 Ti XLR8 Gaming OC

NVIDIA has an edge device in this space too, the Jetson Xavier NX: https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-xavier-nx/?nvid=nv-int-csfg-78188#cid=gtcev_nv-int-csfg_en-us

JETSON XAVIER NX – $399.

Its performance is quoted in Tera Operations per Second (TOPS): 21 TOPS at 15 W, or 14 TOPS at 10 W.

Tera is a lot of OPS.
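Just as arithmetic on those quoted numbers, both power modes work out to the same efficiency:

```python
# Efficiency of the Jetson Xavier NX's two quoted power modes.
def tops_per_watt(tops, watts):
    return tops / watts

print(tops_per_watt(21, 15))   # 15 W mode
print(tops_per_watt(14, 10))   # 10 W mode -- same 1.4 TOPS/W
```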

Anyway, what to think of all this? Graphics cards are pretty expensive. And there’s a whole new world of IoT edge computing devices, which are more what we’re interested in, anyway.

For graphics cards, about a year ago the GTX 1060 (6GB) was the best deal, and AMD was out of the race. But then they got the 7nm process and whipped up some cool-sounding CPUs, in 16- and 32-core versions. So however shitty their software is, they make very efficient, parallelised products, across both CPU and GPU, and they have historically been the ones following open standards. NVIDIA is proprietary, but CUDA used to be practically the only game in town.

Anyway, we can just see how long it takes to train the detectron2 chicken and egg image segmentation code.

I can probably just leave my 4 CPU cores training over night for the things we want to do, or set up the raspberry pi to work on something.

Categories
deep hardware_ institutes

SLIDE

CPU beating GPU. Lol NVIDIA. SELL! SELL!

https://news.rice.edu/2020/03/02/deep-learning-rethink-overcomes-major-obstacle-in-ai-industry/

arxiv: https://arxiv.org/pdf/1903.03129.pdf

Conclusion:


We provide the first evidence that a smart algorithm with modest CPU OpenMP parallelism can outperform the best available hardware NVIDIA-V100, for training large deep learning architectures. Our system SLIDE is a combination of carefully tailored randomized hashing algorithms with the right data structures that allow asynchronous parallelism. We show up to 3.5x gain against TF-GPU and 10x gain against TF-CPU in training time with similar precision on popular extreme classification datasets. Our next steps are to extend SLIDE to include convolutional layers. SLIDE has unique benefits when it comes to random memory accesses and parallelism. We anticipate that a distributed implementation of SLIDE would be very appealing because the communication costs are minimal due to sparse gradients.

Categories
dev Hardware hardware_ Linux

RPi without keyboard and mouse

https://sendgrid.com/blog/complete-guide-set-raspberry-pi-without-keyboard-mouse/

https://github.com/motdotla/ansible-pi

First thing: you need an empty file called ‘ssh’ in the boot partition of the Raspbian SD card to enable SSH:
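From a Linux machine with the SD card plugged in, that's a one-liner. The mount point is an assumption (many desktops auto-mount under /media); the sketch falls back to a temp directory just so it runs without a card present:

```shell
# Enable headless SSH: drop an empty file named 'ssh' into the boot
# partition. Pass the partition's mount point as $1; we fall back to a
# temp directory here only so the sketch runs without an SD card.
BOOT="${1:-$(mktemp -d)}"
touch "$BOOT/ssh"
echo "created $BOOT/ssh"
```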

https://www.raspberrypi.org/forums/viewtopic.php?t=144839

ok so I found the IP address of the PI

root@chrx:~# nmap -sP 192.168.101.0/24

Starting Nmap 7.60 ( https://nmap.org ) at 2020-04-05 17:06 UTC
Nmap scan report for _gateway (192.168.101.1)
Host is up (0.0026s latency).
MAC Address: B8:69:F4:1B:D5:0F (Unknown)
Nmap scan report for 192.168.101.43
Host is up (0.042s latency).
MAC Address: 28:0D:FC:76:BB:3E (Sony Interactive Entertainment)
Nmap scan report for 192.168.101.100
Host is up (0.049s latency).
MAC Address: 18:F0:E4:E9:AF:E3 (Unknown)
Nmap scan report for 192.168.101.101
Host is up (0.015s latency).
MAC Address: DC:85:DE:22:AC:5D (AzureWave Technology)
Nmap scan report for 192.168.101.103
Host is up (-0.057s latency).
MAC Address: 74:C1:4F:31:47:61 (Unknown)
Nmap scan report for 192.168.101.105
Host is up (-0.097s latency).
MAC Address: B8:27:EB:03:24:B0 (Raspberry Pi Foundation)

Nmap scan report for 192.168.101.111
Host is up (-0.087s latency).
MAC Address: 00:24:D7:87:78:EC (Intel Corporate)
Nmap scan report for 192.168.101.121
Host is up (-0.068s latency).
MAC Address: AC:E0:10:C0:84:26 (Liteon Technology)
Nmap scan report for 192.168.101.130
Host is up (-0.097s latency).
MAC Address: 80:5E:C0:52:7A:27 (Yealink(xiamen) Network Technology)
Nmap scan report for 192.168.101.247
Host is up (0.15s latency).
MAC Address: DC:4F:22:FB:0B:27 (Unknown)
Nmap scan report for chrx (192.168.101.127)
Host is up.
Nmap done: 256 IP addresses (11 hosts up) scanned in 2.45 seconds

If nmap is not installed:

apt-get install nmap

Connect to whatever IP it is

ssh -vvvv pi@192.168.101.105

Are you sure you want to continue connecting (yes/no)? yes

Cool, and to set up wifi, let’s check out this ansible script https://github.com/motdotla/ansible-pi

$ sudo apt update
$ sudo apt install software-properties-common
$ sudo apt-add-repository --yes --update ppa:ansible/ansible
$ sudo apt install ansible

ok 58MB install…

# ansible-playbook playbook.yml -i hosts --ask-pass --become -c paramiko

PLAY [Ansible Playbook for configuring brand new Raspberry Pi]

TASK [Gathering Facts]

TASK [pi : set_fact]
ok: [192.168.101.105]

TASK [pi : Configure WIFI] **
changed: [192.168.101.105]

TASK [pi : Update APT package cache]
[WARNING]: Updating cache and auto-installing missing dependency: python-apt
ok: [192.168.101.105]

TASK [pi : Upgrade APT to the lastest packages] *
changed: [192.168.101.105]

TASK [pi : Reboot] **
changed: [192.168.101.105]

TASK [pi : Wait for Raspberry PI to come back] **
ok: [192.168.101.105 -> localhost]

PLAY RECAP ****
192.168.101.105 : ok=7 changed=3 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
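For the record, the hosts inventory file that command points at is just the Pi’s IP in a group (the group name here is my own invention; check the repo’s README for the wifi SSID/password variables it expects):

```
[pis]
192.168.101.105
```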

And I’ll unplug the ethernet and try to connect by ssh again

Ah, but it’s moved up to 192.168.101.106 now

I ran nmap -sP 192.168.101.0/24 again; this time the Pi showed up as ‘Unknown’, but ssh pi@192.168.101.106 worked

(If you can connect to your router, eg. 192.168.0.1 for most D-Link routers, you can go to something like Status -> Wireless, to see connected devices too, and skip the nmap stuff.)

I log in, then to configure some stuff:

sudo raspi-config

Under the Interfacing Options section, enable the camera and I2C

sudo apt-get install python-smbus
sudo apt-get install i2c-tools

ok tested with

raspistill -o out.jpg

Then copied it across to my computer with

scp pi@192.168.101.106:/home/pi/out.jpg out.jpg

and then made it smaller (because trying to upload the 4MB version is a no)

convert out.jpg -resize 800x600 new.jpg

Cool, and it looks like we also need to expand the partition:

sudo raspi-config again (Advanced Options, first option)


Upon configuring the latest pi, I needed to first use the ethernet cable,

and then once logged in, use

sudo rfkill unblock 0

to turn on the wifi. The SSID and wifi password could be configured in raspi-config.


At Bitwäsherei, the ethernet cable to the router trick didn’t work.

Instead, as per the resident Gandalf’s advice, the instructions here

https://raspberrypi.stackexchange.com/questions/10251/prepare-sd-card-for-wifi-on-headless-pi

worked for setting up wireless access on the sd card.

“Since May 2016, Raspbian has been able to copy wifi details from /boot/wpa_supplicant.conf into /etc/wpa_supplicant/wpa_supplicant.conf to automatically configure wireless network access”

The file contains

ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=«your_ISO-3166-1_two-letter_country_code»

network={
    ssid="«your_SSID»"
    psk="«your_PSK»"
    key_mgmt=WPA-PSK
}

Save, and put sd card in RPi. Wireless working and can ssh in again!

2022 News flash:

Incredibly, some more issues.

New issue, user guide not updated yet

https://stackoverflow.com/questions/71804429/raspberry-pi-ssh-access-denied

In essence, the default pi user no longer exists, so you have to create it and set its password using either the official Imager tool or by creating a userconf file in the boot partition of your microSD card, which should contain a single line of text: username:hashed-password

The default pi / raspberry pair, pre-hashed:

pi:$6$/4.VdYgDm7RJ0qM1$FwXCeQgDKkqrOU3RIRuDSKpauAbBvP11msq9X58c8Que2l1Dwq3vdJMgiZlQSbEXGaY5esVHGBNbCxKLVNqZW1
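To generate that hash for your own password, openssl can produce the SHA-512 crypt format ($6$…) that the userconf file expects:

```shell
# Produce the hashed-password half of userconf (username:hashed-password).
# -6 selects SHA-512 crypt, the $6$... format shown above.
echo 'raspberry' | openssl passwd -6 -stdin
```

The salt is random, so your output won’t match the hash above byte-for-byte, but both verify the same password.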

Categories
Hardware hardware_ robots

Robot prep: PWM control

I started up a Raspberry Pi with Raspbian installed on it (I used balena’s Etcher to flash an SD card): the Lite version, basically just Debian for the RPi: https://www.raspberrypi.org/downloads/raspbian/

https://www.balena.io/etcher/

The RPi needs a proper keyboard, at least until you set up ssh and can access it remotely.

We’re interested in making a robot, and we’re using a Raspberry Pi. So we need to control servos. The RPi only exposes a couple of hardware PWM pins, so we need an I2C module https://learn.adafruit.com/16-channel-pwm-servo-driver (the PCA9685 16-channel servo driver) to control however many servos our robot needs, plus the software library to run it: https://github.com/adafruit/Adafruit_Python_PCA9685
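The PCA9685 divides each PWM period into 4096 ticks, so driving a servo through it means converting a pulse width in milliseconds into a tick count. A rough sketch of that conversion (the 60 Hz frequency and the 1–2 ms hobby-servo pulse range are typical assumptions, not anything specific to our board):

```python
# Convert a servo pulse width in ms into a 12-bit PCA9685 tick count.
# The chip counts 0..4095 over each PWM period.
def pulse_ms_to_ticks(pulse_ms, freq_hz=60):
    period_ms = 1000.0 / freq_hz          # ~16.67 ms at 60 Hz
    return int(round(pulse_ms / period_ms * 4096))

# Typical hobby-servo endpoints: ~1.0 ms and ~2.0 ms, centre ~1.5 ms.
print(pulse_ms_to_ticks(1.0))   # one end of travel
print(pulse_ms_to_ticks(1.5))   # centre
print(pulse_ms_to_ticks(2.0))   # other end
```

With the Adafruit library above, you’d then set the frequency once with pwm.set_pwm_freq(60) and pass the tick count as the “off” time in pwm.set_pwm(channel, 0, ticks).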

adafruit_products_ID815servo_LRG.jpg

and we need an external 5V PSU, to power the servos.

Configuring Your Pi for I2C:

sudo apt-get install python-smbus
sudo apt-get install i2c-tools

Need to connect the RPi to the servo driver. This was a picture taken when testing it on the RPi Zero W in 2019. The instructions for pinout connections: https://learn.adafruit.com/16-channel-pwm-servo-driver/pinouts

Or from Adafruit, for RPi:

adafruit_products_raspi_pca9685_i2c_bb.jpg

(or Arduino)

adafruit_products_AllServos_bb-1024.jpg

Here’s an rpi zero layout.

Ok, but how do we power the servos? We can’t run them off the RPi’s 5V 2A supply. Oh duh, there’s that big DC socket connected to the PCA9685.

There is something to be said for running the robot on an Arduino. You get something robotic. You upload someone’s hexapod spider code, and it does a little dance. You can control it remotely. It can interact with sensors.

Arduinos are dirt cheap. So I bet we could have tiny neural networks running in Arduinos… shit, do we already have them? Let’s see… ok wow, there is like a whole thing: https://blog.arduino.cc/2019/10/15/get-started-with-machine-learning-on-arduino/ But 2K of RAM is not much. You would need to be a real demoscene junkie to use a 2KB NN, but yeah, you could do something crudely intelligent with it.
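As a sanity check on that 2K figure, here’s a rough count of how many parameters a tiny fully-connected net carries (the layer sizes are made up for illustration):

```python
# Parameters (weights + biases) of a tiny dense network, to see what
# could fit in the ~2 KB of RAM on an Uno-class AVR Arduino.
def param_count(layer_sizes):
    # each adjacent layer pair contributes (in * out) weights + out biases
    return sum(a * b + b for a, b in zip(layer_sizes, layer_sizes[1:]))

tiny = [8, 16, 4]                    # made-up layer sizes
print(param_count(tiny), "params")   # 212 params -> 212 bytes as int8
```

So even quantised to int8, a few hundred parameters is roughly the budget before you also count activations and the rest of the sketch.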

Robots on ESP32s are definitely a thing. But ok, no: Raspberry Pi for a real robot. We need Linux for this.

Ok so I need to wire this up. But I also need a chassis for the robot.