This project recognizes hand signs for the game of Rock-Paper-Scissors using deep learning. A convolutional neural network (CNN) classifies hand signs captured through a webcam, so users can play against the computer simply by gesturing in front of the camera. The application also includes features that make it accessible to visually impaired individuals, offering a fun and inclusive way to enjoy the game.
- Webcam-Based Game: Users can play Rock-Paper-Scissors using hand signs captured by a webcam.
- High Accuracy: The CNN model achieves a classification accuracy of over 98% on the test dataset.
- Accessibility: The application is designed to be accessible to visually impaired individuals, providing an inclusive gaming experience.
- Easy to Use: Simple controls allow users to interact with the game using keyboard shortcuts.
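The repository does not document the exact network, but a small CNN of the following shape is typical for a three-class hand-sign task like this one. This is a hypothetical sketch, not the project's actual architecture: the input size, layer counts, and filter widths are assumptions. TensorFlow/Keras is imported lazily inside the function so the sketch can be read without the dependency installed.

```python
def build_model(input_shape=(150, 150, 3), num_classes=3):
    """Hypothetical CNN sketch for Rock/Paper/Scissors classification.

    All hyperparameters here are illustrative assumptions; the
    repository's trained model may differ.
    """
    from tensorflow import keras
    from tensorflow.keras import layers

    model = keras.Sequential([
        keras.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```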
To run the project locally, follow these steps:
- Clone the repository to your local machine:

  ```shell
  git clone https://github.com/GhufranBarcha/Gesture-Recognition-Rock-Paper-Scissors
  ```

- Install the required dependencies using pip.
- Download the pretrained model weights from the GitHub repository and place them in the project directory.
- Run the `app.py` script to start the webcam-based game.
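The steps above can be summarized as the following shell session. Note that the dependency-file name (`requirements.txt`) is an assumption; check the repository for the actual file it ships:

```shell
git clone https://github.com/GhufranBarcha/Gesture-Recognition-Rock-Paper-Scissors
cd Gesture-Recognition-Rock-Paper-Scissors
pip install -r requirements.txt   # assumes the repo provides a requirements.txt
python app.py                     # starts the webcam-based game
```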
Once the application is running, follow these steps to play the game:
- Position your hand in front of the webcam, making a Rock, Paper, or Scissors gesture.
- Press the spacebar to capture an image of your hand sign.
- The system will classify the hand sign and display the result on the screen.
- Repeat the process to play additional rounds of the game.
- Press the ESC key to exit the game when finished.
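The capture-and-classify loop described above could be sketched as follows. This is a minimal illustration, not the repository's actual `app.py`: the weight-file name `model.h5`, the 150x150 input size, and the class ordering in `GESTURES` are all assumptions. The heavy dependencies (OpenCV, TensorFlow) are imported lazily inside `play()` so the game-rule logic can be used on its own.

```python
GESTURES = ["rock", "paper", "scissors"]  # assumed class order of the CNN output


def decide_winner(player: str, computer: str) -> str:
    """Standard Rock-Paper-Scissors rules."""
    if player == computer:
        return "draw"
    beats = {"rock": "scissors", "paper": "rock", "scissors": "paper"}
    return "player" if beats[player] == computer else "computer"


def play():
    """Webcam loop: spacebar captures a frame, ESC exits."""
    import random
    import cv2
    import numpy as np
    from tensorflow import keras

    model = keras.models.load_model("model.h5")  # hypothetical weight file
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("Rock-Paper-Scissors", frame)
        key = cv2.waitKey(1) & 0xFF
        if key == 27:      # ESC key exits the game
            break
        if key == 32:      # spacebar captures the current hand sign
            img = cv2.resize(frame, (150, 150)).astype("float32") / 255.0
            probs = model.predict(img[None, ...])[0]
            player = GESTURES[int(np.argmax(probs))]
            computer = random.choice(GESTURES)
            print(f"you: {player}  computer: {computer}  ->",
                  decide_winner(player, computer))
    cap.release()
    cv2.destroyAllWindows()
```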