
ONA controlling Transbot


This is a work in progress.

Required branch: v0.9.0

Hardware:

https://hitechchain.se/iot/yahboom-ros-transbot-robot-tank-with-7-inch-screen-and-lidar-depth-camera-for-nvidia-jetson-nano-4gb-b01

Software installation:

First, flash the Transbot image onto the SD card. One can use the prebuilt image at http://91.203.212.130/TransbotImagePH.zip

or follow the instructions in https://docs.google.com/document/d/1gSrAIsHIJB1bMKT3ciKrT2Tn2-HQbN5ieOt0QMg1Ctc/edit?usp=sharing

The last step in that document, fixing the ext4 partition, is also necessary so that the partition is expanded to its full size; the additional space is needed for YOLOv4.
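For writing the image to the SD card, dd can be used on Linux. A minimal sketch, assuming the zip unpacks to a single .img file (the filename below is hypothetical) and that /dev/sdX is the SD card device as found via lsblk; writing to the wrong device destroys data:

unzip TransbotImagePH.zip

sudo dd if=TransbotImagePH.img of=/dev/sdX bs=4M status=progress conv=fsync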

Then:

git clone https://github.com/opennars/OpenNARS-for-Applications

cd OpenNARS-for-Applications

git checkout v0.9.0 (this won't be necessary once v0.9.0 is released)

remove "-mfpmath=sse -msse2" from BaseFlags in build.sh

./build.sh

cd ./misc/Transbot

sh setup.sh
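The BaseFlags edit can also be done non-interactively before running ./build.sh, assuming the flags appear verbatim in build.sh:

sed -i 's/-mfpmath=sse -msse2//' build.sh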

Software execution:

First, run

sh basenodes.sh

to start the necessary ROS nodes. Alternatively, add start_transbot.sh to Ubuntu's startup-applications utility (removing the existing startup entry) and reboot.
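To verify that the nodes came up, one can list the active nodes using the stock ROS 1 installation on the Transbot image:

rosnode list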

Then:

Run transbot.py in /home/jetson/OpenNARS-for-Applications/misc/Transbot/
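For example (assuming python3 matches the script's shebang; on the stock image it may be python instead):

cd /home/jetson/OpenNARS-for-Applications/misc/Transbot

python3 transbot.py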

Once "//Welcome to ONA-Transbot shell!" appears the ONA Transbot shell is active. It accepts Narsese, and has some pre-defined commands as well which can be seen in transbot.py. Example:

*testmission

*loop

makes it collect a bottle. Alternatively,

<chair --> [left]>! :|:

<chair --> [left]>! :|:

will make it recall the location of the chair; if it has seen and remembered one, it will go back there.

The following asks the system what it sees on the left:

<?what --> [left]>? :|:
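In these statements, ! marks a goal, ? a question, and :|: stamps the event as occurring now; ?what is a query variable that the system answers by substitution. A belief event uses . as punctuation, so (as an illustrative example) entering

<bottle --> [left]>. :|:

would tell the system that a bottle is currently on its left.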

Further examples will be added in the future.

Software simulation:

Run OpenNARS-for-Applications/misc/Transbot/transbot_simulation.py. It allows using the ONA-Transbot shell with the hardware (sensing and action) abstracted away. This interface also allows for easy integration with simulators and visualizers.
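For example, Narsese can be piped into the simulation shell non-interactively. A minimal sketch, assuming the script reads statements line by line from standard input and that python3 matches its shebang:

cd /home/jetson/OpenNARS-for-Applications/misc/Transbot
python3 transbot_simulation.py << 'EOF'
<chair --> [left]>! :|:
<?what --> [left]>? :|:
EOF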

TODO:

  • Add color detection

  • Use angle feedback (not just visual feedback) for the arm motor as well, not only for the gripper, so that the arm can never be driven into the ground.