realtime-voice-assistant

Real-time voice assistant built with Llama 3 (language model), Deepgram (speech-to-text), and AWS Polly (text-to-speech).

Requirements

  • Python 3
  • Redis

Installation

  1. Install llama-cpp-python by following the instructions in the llama-cpp-python documentation.
  2. Install the remaining Python requirements:
$ pip install -r requirements.txt
  3. Set the required environment variables (a quick check for these is sketched right after this list):
AWS_ACCESS_KEY=<Your AWS Access Key>
AWS_SECRET_KEY=<Your AWS Secret Key>
DEEPGRAM_API_KEY=<Your Deepgram API Key>
  4. (Optional) Customize your settings in settings.py. In particular, you may want to change the language; by default it is set to es-ES (Spanish).
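Assuming the keys are read from the process environment (typical for AWS and Deepgram client libraries), the short Python snippet below, which is not part of the repository, can be used to confirm they are visible before starting the worker and the app:

import os

# Hypothetical helper: verify the required variables are set in the current environment.
required = ["AWS_ACCESS_KEY", "AWS_SECRET_KEY", "DEEPGRAM_API_KEY"]
missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All required environment variables are set.")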

Usage

  1. Run the Celery worker:
$ celery -A tasks worker --loglevel=info
  2. Run the app:
$ python app.py

Customization

By default, the assistant is configured to speak Spanish; you can change the language in settings.py, as sketched below.
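As an illustrative sketch only (the actual variable names in settings.py are specific to this repository and may differ), the language option might look like this:

# Hypothetical excerpt of settings.py; check the real file for the exact names.
LANGUAGE = "es-ES"   # e.g. change to "en-US" for English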

macOS Notes

On macOS, it is recommended to run the worker with the solo pool, since the prefork pool is not supported there. To do so, use the following command:

$ celery -A tasks worker --loglevel=info --pool=solo