PyAER Communication Module
PyAER is a simple interface to the DAVIS camera. It is friendlier than pure libcaer, yet less complicated than frameworks such as DV and ROS. What has been missing since day one is a communication module that can manage the activities between devices. This feature has been considered for a long time, and now it's here!
Supported since version 0.2.0.
It's a ROS-like communication module that implements features similar to roscore, publisher, subscriber, roslaunch, and rostopic.
It's implemented using zeromq's XPUB and XSUB sockets. Specifically, I implemented Figure 13 in this chapter.
In one sentence, this module allows tossing messages between processes. Imagine you want to simultaneously record and display outputs from a DAVIS camera. You can first run a publisher to broadcast all the polarity events, frame events, and IMU events. Then one subscriber can record the data while another subscriber displays the outputs.
Thanks to the powerful zeromq sockets, you can construct a much more complex network than the one just described.
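To make the pattern concrete, here is a minimal XPUB/XSUB proxy sketch in pyzmq (assuming pyzmq is installed). The inproc addresses, topic names, and threading setup are illustrative only and are not the actual pyaer.comm API:

```python
import threading
import time

import zmq

ctx = zmq.Context.instance()

# Hub: an XSUB socket faces the publishers, an XPUB socket faces the
# subscribers; zmq.proxy shuttles messages (and subscriptions) between them.
xsub = ctx.socket(zmq.XSUB)
xsub.bind("inproc://pub_side")
xpub = ctx.socket(zmq.XPUB)
xpub.bind("inproc://sub_side")
threading.Thread(target=zmq.proxy, args=(xsub, xpub), daemon=True).start()

# A subscriber interested in one topic prefix.
sub = ctx.socket(zmq.SUB)
sub.connect("inproc://sub_side")
sub.setsockopt(zmq.SUBSCRIBE, b"davis-1")
sub.setsockopt(zmq.RCVTIMEO, 2000)  # fail fast instead of hanging forever

# A publisher broadcasting through the hub.
pub = ctx.socket(zmq.PUB)
pub.connect("inproc://pub_side")

time.sleep(0.2)  # let the subscription propagate through the proxy
pub.send_multipart([b"davis-1/polarity_events", b"payload"])
topic, payload = sub.recv_multipart()
```

Swapping the inproc addresses for tcp:// endpoints turns this single-process demo into the multi-process network described above.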
Python sucks at threading and multi-processing. This is a known fact. The learning curve is steep, the features fall short, there are locks everywhere, and I could go on. Most of all, I have never successfully written a multi-processing project in Python without making my code unreadable.
Another fact is that there are many attempts and projects trying to improve multi-processing in Python, driven by the need for readability and scalability. I didn't go for those solutions because most of them target super-computing scenarios.
Finally, I decided to implement a ROS-like communication module that purposely creates a network among agents. But you may wonder, "why wouldn't you use ROS instead?". And the answer is "it's too heavy for prototyping scenarios". If you don't know ROS, you may spend a week or so understanding its rationale, and even then you will probably still be quite confused about... well, everything.
This is why I coded pyaer.comm, a small module that you can deploy almost anywhere (Raspberry Pi, Edge TPU, desktop, laptop, anything with a decent *nix-based OS).
More importantly, this module is not limited to use with pyaer; you can basically use it like a mini-ROS and your life becomes brighter.
The AER Hub implements a proxy that connects every publisher and subscriber. The Hub broadcasts all topics from all publishers.
An AER Publisher publishes messages on specific topics. Each message is tagged with the topic name and a timestamp at nanosecond resolution. Normally the Publisher does no pre-processing, or only very lightweight pre-processing that does not affect the publishing rate.
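A message with this shape could be built as in the following sketch. The JSON encoding and the helper name are illustrative assumptions, not necessarily how pyaer.comm packs messages on the wire:

```python
import json
import time

def make_message(topic: str, data: dict) -> bytes:
    """Tag a payload with its topic name and a nanosecond timestamp.

    The [topic, timestamp, data] layout mirrors the message shape
    described in the text; the JSON wire encoding is illustrative only.
    """
    return json.dumps([topic, time.time_ns(), data]).encode("utf-8")

msg = make_message("davis-1/polarity_events", {"events": [1, 2, 3]})
topic, ts, data = json.loads(msg)
```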
An AER Subscriber can subscribe to one or more topics. It unpacks the incoming messages and performs further processing.
This is a special subscriber that keeps track of all the topics in the network. As long as a topic is publishing, lstopic should be able to catch it.
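The bookkeeping behind such a tool can be sketched as follows. This is a hypothetical tracker, not the real lstopic implementation; the real one additionally subscribes to the empty topic prefix so it receives everything:

```python
import time

class TopicTracker:
    """Record every topic seen on the wire, like lstopic does.

    Hypothetical sketch: each incoming message updates the last time
    its topic was observed, so active topics can be listed.
    """

    def __init__(self):
        self.topics = {}

    def on_message(self, topic: str) -> None:
        # Remember when this topic last published.
        self.topics[topic] = time.time_ns()

    def list_topics(self):
        return sorted(self.topics)

tracker = TopicTracker()
tracker.on_message("davis-1/polarity_events")
tracker.on_message("davis-1/frame_events")
```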
I will define a markup language that can describe a running sequence in order. This is analogous to roslaunch, where an XML markup language is used. The markup language here is JSON with OrderedDict decoding.
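The OrderedDict decoding can be done with the standard library alone; the launch entries below are illustrative placeholders:

```python
import json
from collections import OrderedDict

launch_text = """
{
  "Hub": {"use_default": true},
  "Publisher-davis": {"use_default": true, "master_topic": "davis-1"},
  "Subscriber-viewer": {"use_default": true, "topic": "davis-1"}
}
"""

# object_pairs_hook preserves the order in which entries appear in the
# file, so programs can be started in exactly the sequence listed.
launch = json.loads(launch_text, object_pairs_hook=OrderedDict)
start_order = list(launch)
```

Since Python 3.7 a plain dict also preserves insertion order, but OrderedDict makes the ordering guarantee explicit.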
These words are identifiers of a program type, and they are case-sensitive:
- Hub
- Publisher
- Subscriber
Each launch file will start a Hub, either specified at the top of the description or running a default Hub configuration.
- URL: tcp://127.0.0.1 (DO NOT WRITE localhost)
- Publisher port: 5100
- Subscriber port: 5099
```json
"Hub": {
    "use_default": true,
    "url": "tcp://127.0.0.1",
    "publisher_port": 5100,
    "subscriber_port": 5099,
    "aer_hub_name": "PyAER Message Hub"
}
```
If use_default is set to true, the default settings are used and you don't need to specify the other fields. Otherwise, the program will use the user-specified settings.
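This resolution step can be sketched with a small helper. The function name is hypothetical, and falling back to defaults for fields the user leaves out is an assumption rather than documented behavior:

```python
# Default Hub settings taken from the description above.
DEFAULT_HUB = {
    "url": "tcp://127.0.0.1",
    "publisher_port": 5100,
    "subscriber_port": 5099,
    "aer_hub_name": "PyAER Message Hub",
}

def resolve_hub_config(cfg):
    """Pick the effective Hub settings (hypothetical helper).

    With use_default set, the defaults win and all other fields may be
    omitted; otherwise the user-specified fields are taken, falling
    back to defaults for anything left out (an assumption).
    """
    if cfg.get("use_default", False):
        return dict(DEFAULT_HUB)
    user = {k: v for k, v in cfg.items() if k != "use_default"}
    return {**DEFAULT_HUB, **user}

effective = resolve_hub_config({"use_default": True})
custom = resolve_hub_config({"use_default": False, "publisher_port": 6100})
```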
```json
"Subscriber-xxx": {
    "use_default": true,
    "program": "aer_subscriber",
    "url": "tcp://127.0.0.1",
    "port": 5099,
    "topic": "",
    "use_default_sub": true,
    "custom_sub": "path/to/custom/sub/file.py",
    "custom_class": "CustomSubscriberName",
    "custom_args": {
        "args1": "value1",
        "args2": 1,
        "args3": [1, 2, 3]
    },
    "extra_configs": {
    }
}
```
- "use_default": if true, use the default subscriber program aer_subscriber; otherwise, use a custom subscriber program.
- "use_default_sub": if true, use the default subscriber class; otherwise, supply a custom implementation of the subscriber.
- "extra_configs": allows the user to define further configurations.
```json
"Publisher-xxx": {
    "use_default": true,
    "program": "aer_publisher",
    "url": "tcp://127.0.0.1",
    "port": 5100,
    "master_topic": "device",
    "use_default_pub": true,
    "custom_pub": "path/to/custom/pub/file.py",
    "custom_class": "CustomPublisherName",
    "custom_args": {
        "args1": "value1",
        "args2": 1,
        "args3": [1, 2, 3]
    },
    "device": "DAVIS",
    "noise_filter": true,
    "bias_file": "path/to/bias/file",
    "extra_configs": {
    }
}
```
- "use_default": if true, use the default publisher program aer_publisher; otherwise, specify the path of the custom publisher program.
- "use_default_pub": if true, use the default publisher class available in the system; otherwise, supply a custom publisher class implementation.
- "custom_args": arguments to parse.
- "device": "DAVIS", "DVS", or "None".
- "noise_filter": for DAVIS or DVS devices.
- "bias_file": make sure to supply a valid bias file.
- "extra_configs": allows the user to define further configurations.
As shown above, a user can write their own Publisher and Subscriber classes. For a Publisher, there are four mandatory arguments:
- device
- url
- port
- master_topic

For a Subscriber, there are three mandatory arguments:
- url
- port
- topic
Additional configurations are passed as a dictionary, which is created by loading a JSON file. The launch file creates a temporary file that packs the configs into JSON, which is then read by the publisher or subscriber. The same rule applies to extra_configs. This should give the user full control. We denote these two arguments as cfg and extra_cfg.
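The hand-off through a temporary JSON file can be sketched with the standard library; the field names below are illustrative, not the exact keys pyaer.comm writes:

```python
import json
import os
import tempfile

cfg = {"url": "tcp://127.0.0.1", "port": 5100, "master_topic": "davis-1"}
extra_cfg = {"my_option": 42}

# The launcher packs both configs into a temporary JSON file...
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"cfg": cfg, "extra_cfg": extra_cfg}, f)
    tmp_path = f.name

# ...and the spawned publisher or subscriber reads it back on startup.
with open(tmp_path) as f:
    loaded = json.load(f)
os.remove(tmp_path)
```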
I plan to support two types of hierarchical data formats: HDF5 and Zarr.
I will record every message into a group in an ordered fashion.
Let's take HDF5 as an example.
Suppose each message is constructed as [topic_name, timestamp, data] and the topic_name is represented as master_topic/sub_topic (e.g., davis-1/polarity_events). We can then write the message at hdf5_file["master_topic/timestamp/sub_topic"]. Because the group order is enforced, the messages can be replayed in time order.
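The key construction and the ordered replay can be sketched in plain Python (the helper name and sample timestamps are illustrative; a real recorder would write these keys into an HDF5 file with h5py):

```python
def message_key(master_topic: str, timestamp: int, sub_topic: str) -> str:
    """Build the group path used to store one message,
    e.g. "davis-1/100/polarity_events"."""
    return f"{master_topic}/{timestamp}/{sub_topic}"

messages = [
    ("davis-1", 200, "frame_events"),
    ("davis-1", 100, "polarity_events"),
    ("davis-1", 300, "imu_events"),
]

keys = [message_key(*m) for m in messages]

# Replaying in time order means iterating the timestamp groups sorted
# numerically (a plain string sort would break once digit counts differ).
replay = sorted(keys, key=lambda k: int(k.split("/")[1]))
```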
HDF5 uses a "single-writer-multiple-readers" strategy, so we will need to implement a single recording subscriber for writing data. This roughly aligns with rosbag. Zarr supports "multiple-writers-multiple-readers"; although I haven't investigated this feature, it would allow multiple recorders to write into a single file.
The AERSaver is a special subscriber. The data saver may be described by:
```json
"Saver-xxx": {
    "use_default": true,
    "url": "tcp://127.0.0.1",
    "port": 5099,
    "topic": "",
    "filename": "record.ext",
    "mode": "w-",
    "hdf5": true,
    "zarr": false,
    "libver": "latest"
}
```
- I haven't properly benchmarked the performance or dealt with lost packets (hopefully there are none).
- I haven't addressed the late-arriving subscriber issue; therefore, please start subscribers before publishers.
- Make some proper applications, for example a stereo setup, or proper reactive robotics with a DNN.