Brain-Computer Interface with Companion Robot #66
SentryCoderDev started this conversation in Ideas
-
Hi @SentryCoderDev, this looks great, but the link at the end of this doesn't seem to work anymore.
-
Connection Overview: Two Raspberry Pi devices were connected wirelessly using Bluetooth communication. This was achieved by pairing the devices through the Bluetooth interface available on each Raspberry Pi. The Bluetooth module on both Raspberry Pi units was initialized, and a serial connection (RFCOMM protocol) was established between the two. One Raspberry Pi acted as the central (master) device, and the other acted as the peripheral (slave) device. Data transmission and reception were facilitated via Bluetooth's Serial Port Profile (SPP), enabling real-time control and communication.
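As a rough illustration of the link described above, here is a minimal sketch of an RFCOMM connection between two paired Raspberry Pis using Python's built-in Bluetooth socket support on Linux. The channel number, the placeholder MAC address, and the example message are assumptions, not values from the project.

```python
import socket

# Minimal RFCOMM server for the peripheral (slave) Raspberry Pi.
# Requires Linux/BlueZ and Python built with Bluetooth support;
# the two devices must already be paired.
def run_server(channel=1):
    server = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                           socket.BTPROTO_RFCOMM)
    server.bind((socket.BDADDR_ANY, channel))   # listen on any local adapter
    server.listen(1)
    client, addr = server.accept()
    print(f"Connected to {addr}")
    try:
        while True:
            data = client.recv(1024)            # incoming control messages
            if not data:
                break
            print("Received:", data.decode())
    finally:
        client.close()
        server.close()

# Minimal RFCOMM client for the central (master) Raspberry Pi.
# peer_mac below is a placeholder for the other Pi's Bluetooth address.
def send_command(peer_mac="XX:XX:XX:XX:XX:XX", channel=1, message="FORWARD"):
    client = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                           socket.BTPROTO_RFCOMM)
    client.connect((peer_mac, channel))
    client.send(message.encode())
    client.close()
```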
-
Hello everyone,
My name is Emir, and I am excited to share the latest developments in my project, SentryBOT BCI-CBI. I have been working on controlling SentryBOT, a two-legged modular robot framework, using Brain-Computer Interface (BCI) technology.
The foundation of this project is the integration of the OpenBCI Cyton biosensor board, which lets me control SentryBOT by thought. The robot responds to EEG signals with movements, voice features, and light interactions, expanding the boundaries of human-robot interaction and pushing the limits of what is possible between humans and machines.
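For readers who want to experiment, a minimal sketch of reading the Cyton's signals with the open-source BrainFlow library is shown below. The serial port, the beta-power threshold, and the `send_robot_command()` hook into SentryBOT are illustrative assumptions, not the project's actual implementation.

```python
import time
from brainflow.board_shim import BoardShim, BrainFlowInputParams, BoardIds
from brainflow.data_filter import DataFilter

def send_robot_command(command):
    # Placeholder hook; the real project would forward this to SentryBOT.
    print("Robot command:", command)

def run_bci_loop(serial_port="/dev/ttyUSB0", beta_threshold=0.35):
    board_id = BoardIds.CYTON_BOARD.value
    params = BrainFlowInputParams()
    params.serial_port = serial_port          # Cyton dongle port (assumption)
    board = BoardShim(board_id, params)
    eeg_channels = BoardShim.get_eeg_channels(board_id)
    sampling_rate = BoardShim.get_sampling_rate(board_id)

    board.prepare_session()
    board.start_stream()
    try:
        while True:
            time.sleep(1)
            # Grab the most recent ~4 seconds of data without clearing the buffer.
            data = board.get_current_board_data(4 * sampling_rate)
            # Average band powers across the EEG channels:
            # [delta, theta, alpha, beta, gamma].
            bands, _ = DataFilter.get_avg_band_powers(
                data, eeg_channels, sampling_rate, True)
            if bands[3] > beta_threshold:     # elevated beta as a crude "focus" cue
                send_robot_command("FORWARD")
    finally:
        board.stop_stream()
        board.release_session()
```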
Another significant feature of my project is a Computer-Brain Interface (CBI) for the robot. The concept involves a vibration motor integrated into a helmet, which conveys the robot's outputs using a specialized alphabet (similar to Morse code). A fuller CBI in the future, possibly using bone conduction, could further expand human-machine communication.
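A minimal sketch of that haptic output channel could look like the following, driving a vibration motor from a Raspberry Pi GPIO pin with the RPi.GPIO library. Standard Morse code is used here as a stand-in for the project's specialized alphabet, and the pin number and timing unit are assumptions.

```python
import time
import RPi.GPIO as GPIO

MOTOR_PIN = 18     # GPIO pin driving the vibration motor (assumption)
UNIT = 0.15        # base timing unit in seconds (assumption)

# Standard Morse code as a stand-in for the project's custom alphabet.
MORSE = {
    "A": ".-", "E": ".", "H": "....", "K": "-.-", "L": ".-..",
    "O": "---", "S": "...", "T": "-",
}

def pulse(duration):
    GPIO.output(MOTOR_PIN, GPIO.HIGH)   # motor on
    time.sleep(duration)
    GPIO.output(MOTOR_PIN, GPIO.LOW)    # motor off
    time.sleep(UNIT)                    # gap between symbols

def vibrate_message(text):
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(MOTOR_PIN, GPIO.OUT)
    try:
        for char in text.upper():
            if char == " ":
                time.sleep(6 * UNIT)    # word gap (7 units incl. symbol gap)
                continue
            for symbol in MORSE.get(char, ""):
                pulse(UNIT if symbol == "." else 3 * UNIT)
            time.sleep(2 * UNIT)        # letter gap (3 units incl. symbol gap)
    finally:
        GPIO.cleanup()

# Example: haptically signal an acknowledgement from the robot.
# vibrate_message("OK")
```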
To showcase the flexibility of BCI technology and to demonstrate my vision for the future, I am developing a smart home system. This system will utilize SentryBOT as a central control unit to manage functions such as heating, lighting, security, and entertainment by harnessing brain waves. The OpenBCI Cyton biosensor board will play a critical role in connecting my brain to SentryBOT, highlighting the adaptability of OpenBCI technology in the fields of robotics and artificial intelligence.
For this comprehensive smart home system, I plan to use Raspberry Pi/Jetson Nano, Arduino, Python/C++, and the OpenBCI Cyton biosensor board. The goal is to make human-robot and human-environment interactions seamless and intuitive, bringing the future into the present.
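The post does not specify how commands reach the home devices; as one plausible sketch, a decoded BCI command could be fanned out over MQTT with the paho-mqtt package. The broker address, topic names, and payloads below are placeholders.

```python
import paho.mqtt.publish as publish

# Illustrative mapping from decoded BCI commands to smart-home MQTT topics.
COMMAND_MAP = {
    "LIGHTS_ON":  ("home/lighting/livingroom", "ON"),
    "LIGHTS_OFF": ("home/lighting/livingroom", "OFF"),
    "HEAT_UP":    ("home/heating/setpoint", "22.0"),
    "ARM_ALARM":  ("home/security/alarm", "ARM"),
    "PLAY_MUSIC": ("home/entertainment/player", "PLAY"),
}

def dispatch(command, broker="192.168.1.10", port=1883):
    """Publish one decoded BCI command to the matching home subsystem."""
    if command not in COMMAND_MAP:
        return
    topic, payload = COMMAND_MAP[command]
    publish.single(topic, payload, hostname=broker, port=port)

# Example: a classified "focus" event mapped to LIGHTS_ON turns on the light.
# dispatch("LIGHTS_ON")
```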
This project not only showcases the potential of OpenBCI technology in robotics but also explores new frontiers in human-robot interaction. Controlling a modular companion robot with brain waves advances the technology itself and contributes to the fields of neuroscience and robotics.
Keep following along on this exciting adventure with the SentryBOT BCI-CBI project.
Link to the project
Best regards,
Emir