Robot Operating System modules
Reasoner module (ona_ros)
A ROS module that accepts String messages containing Narsese, in the same way the NAR shell does. Its outputs are String messages corresponding to the execution of operators, in the format "opname arg1 … argn" (a space-separated string).
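Since the module's operator executions arrive as plain space-separated strings, a subscriber on the receiving side only needs to split them back into operator name and arguments. A minimal sketch (the function name parse_execution is an assumption, not part of the module):

```python
def parse_execution(msg: str):
    """Split an operator execution message of the form
    "opname arg1 ... argn" into (opname, [args])."""
    parts = msg.split(" ")
    # The first token is the operator name; the rest are its arguments.
    return parts[0], parts[1:]

# Example: an execution of a hypothetical ^go operator with two arguments.
op, args = parse_execution("^go 1 2")
```

In a ROS node this function would be called from the String-message callback before dispatching to the robot's actuators.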
Vision module (darknet_ros via ona_darknet)
This module utilizes YOLO via darknet to detect objects, together with their location and size, in images. It includes an encoder (ona_darknet.py) which encodes this output into Narsese so that NARS can reason about the position, class and size of detected objects, and combinations thereof.
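One way such an encoder can turn a numeric bounding box into symbolic Narsese events is to discretize position and size into coarse categories. The following is a minimal sketch under assumed conventions (grid thresholds, predicate names, and the function name encode_detection are illustrative, not the actual ona_darknet.py code):

```python
def encode_detection(label, cx, cy, w, h, img_w=640, img_h=480):
    """Encode one YOLO detection (center, width, height in pixels)
    as Narsese event strings about position and size."""
    # Discretize the center into a coarse 3x3 grid of symbolic cells.
    col = "left" if cx < img_w / 3 else ("right" if cx > 2 * img_w / 3 else "center")
    row = "top" if cy < img_h / 3 else ("bottom" if cy > 2 * img_h / 3 else "middle")
    # Classify size by the fraction of the image the box covers.
    size = "large" if w * h > (img_w * img_h) / 4 else "small"
    return (f"<({label} * {col}_{row}) --> at>. :|:",
            f"<{label} --> [{size}]>. :|:")
```

NARS then receives events like `<(person * left_top) --> at>. :|:` instead of raw pixel coordinates, which it can correlate with other events.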
QR-Code module (ona_visp_auto_tracker)
For some robotic experiments it is feasible to put QR codes on relevant objects. ONA contains an encoder (ona_visp_auto_tracker.py) which encodes the tracker's output into Narsese and can take the tag's angle into account, for instance to detect whether a door is open or closed.
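The door example can be sketched as a simple angle threshold: a tag mounted on the door rotates with it, so a deviation of the observed yaw from a calibrated "closed" orientation indicates an open door. The function name, threshold, and Narsese terms below are illustrative assumptions, not the actual encoder code:

```python
def door_state_event(yaw_radians, closed_yaw=0.0, tolerance=0.15):
    """Emit a Narsese event about the door state from the observed
    yaw of a QR tag mounted on the door."""
    # If the tag's yaw deviates from the calibrated closed position
    # by more than the tolerance, report the door as open.
    is_open = abs(yaw_radians - closed_yaw) > tolerance
    state = "open" if is_open else "closed"
    return f"<door --> [{state}]>. :|:"
```

The tolerance would be tuned to the tracker's angular noise in practice.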
Depth estimation module
Will include a module for radar, LIDAR, or ultrasonic sensors. (To be included in later versions, with an option to fuse it with the vision module so that the size and depth of objects can be distinguished.)
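The reason such fusion helps: a bounding box alone conflates a small nearby object with a large distant one, while under a pinhole camera model the metric size follows from pixel size and depth. A minimal sketch of that relation (the focal length value is an assumed calibration constant):

```python
def apparent_to_metric_size(pixel_width, depth_m, focal_px=600.0):
    """Estimate real-world width (meters) from apparent pixel width
    and measured depth, using the pinhole camera model:
    real_width ~= pixel_width * depth / focal_length."""
    return pixel_width * depth_m / focal_px
```

With a depth reading fused in, two detections of equal pixel size at different depths yield different metric sizes, which is exactly the distinction the planned module would make available to NARS.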
Sound detection module
Will be based on DFT + SOM; see https://www.youtube.com/watch?v=3l1HABvDKRk. This approach allows easy online training on just a few examples in a matter of seconds. It is fast and reliable as long as only a few decision-relevant sound signals need to be distinguished, and labels are completely optional.
Localization module
Will utilize a SLAM approach for robotic applications. Its output will be coordinates, which the system will then automatically correlate with other events, such as object detections.
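For the reasoner to correlate places with events, continuous SLAM coordinates would need to be quantized into symbolic locations, in the same spirit as the other encoders. A minimal sketch under assumed conventions (cell size and Narsese terms are illustrative):

```python
def pose_event(x, y, cell_size=1.0):
    """Quantize a continuous (x, y) pose into a named grid cell and
    emit it as a Narsese event, so revisited places map to the same term."""
    cell = f"cell_{int(x // cell_size)}_{int(y // cell_size)}"
    return f"<({{SELF}} * {cell}) --> at>. :|:"
```

Because nearby poses fall into the same cell, an object detection and a location event can co-occur repeatedly at the same term, which is what lets NARS learn the correlation.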
Module with generic numeric and boolean event encoders
This module will allow mapping other sensor values, and potentially decision-relevant value changes (such as from a touch, light, or temperature sensor), into Narsese events.
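Generic encoders of this kind reduce to two small functions: a boolean sensor maps directly to a property event, and a numeric sensor is binned into a few symbolic levels. The names, thresholds, and Narsese terms below are illustrative assumptions for a sketch, not the planned module's API:

```python
def encode_boolean(name, value):
    """Map a boolean sensor reading to a Narsese property event."""
    return f"<{name} --> [on]>. :|:" if value else f"<{name} --> [off]>. :|:"

def encode_numeric(name, value, low, high):
    """Bin a numeric reading from the range [low, high] into three
    symbolic levels via fixed thirds of the range."""
    span = high - low
    if value < low + span / 3:
        level = "low"
    elif value > high - span / 3:
        level = "high"
    else:
        level = "medium"
    return f"<{name} --> [{level}]>. :|:"
```

An event would typically be emitted only when the level changes, so that NARS sees decision-relevant transitions rather than a constant stream of identical readings.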