RabbitMQ import/export scripts

These scripts export messages from a remote queue and push them onto a local RabbitMQ. The export step is just a Docker container wrapped around the rabbitmq-dump-queue tool, which is not mine :)

Prerequisites

  • docker
  • python 2.7 & pip
  • pika

Python dependencies

$ sudo apt-get install python-pip
$ pip install --user pika

How to

Clone this repo and build the image locally (I don't want to push it anywhere for now):

$ docker build -t jecnua/rabbitmq-export .

Run it

$ docker run -it \
-v $PWD/data:/data \
jecnua/rabbitmq-export

Now pull down the data you need

$ time rabbitmq-dump-queue \
-uri="amqp://<user>:<pass>@<url>:<port>/" \
-queue=<queue_name> \
-max-messages=2000 \
-output-dir=/data

Check the results

$ du -sh data/
344M data/

Now that you have the data, run RabbitMQ locally:

$ ./rabbit_00_run_rabbit_locally.sh
$ docker ps
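
Before pushing anything, it can be worth checking that the local broker is actually reachable with pika. This is just a minimal sketch, assuming the local container exposes RabbitMQ on localhost:5672 with the default guest/guest credentials:

#!/usr/bin/env python
# Quick smoke test: open an AMQP connection to the local broker.
# Assumption: RabbitMQ listens on localhost:5672 with default guest/guest credentials.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
print('Local RabbitMQ is reachable')
connection.close()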

As a last step, push all the messages onto the queue:

$ time ./01_built_queue.py
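
The script itself lives in the repo; the gist of a push script like this is to read every dumped message file from data/ and publish its body to a queue on the local broker. A minimal pika sketch of that loop, where the queue name ('imported') and the flat file layout are assumptions, looks like this:

#!/usr/bin/env python
# Sketch of the push step: publish every dumped file in data/ to a local queue.
# Assumptions: each file in data/ is a raw message body, the target queue is
# called 'imported', and the local broker accepts the default guest/guest login.
import os
import pika

DATA_DIR = 'data'
QUEUE = 'imported'

connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()
channel.queue_declare(queue=QUEUE, durable=True)

for name in sorted(os.listdir(DATA_DIR)):
    with open(os.path.join(DATA_DIR, name), 'rb') as handle:
        channel.basic_publish(exchange='', routing_key=QUEUE, body=handle.read())

connection.close()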

On Ubuntu 14.04

You need to run it as sudo, or try to solve... a lot of problems:

https://docs.docker.com/engine/installation/linux/ubuntulinux/

For example:

  • Old packages
  • Error: Are you trying to connect to a TLS-enabled daemon without TLS?

So I just run it as sudo -,-

Notes

All the downloaded messages are saved in the data directory. The files are covered by .gitignore, so there is no risk of accidentally committing and exposing them.

TODO

  • Call the rabbitmq-dump-queue script automatically and use ENV variables to set its parameters
  • Add a Docker Compose file to bring everything up automatically
  • Avoid the hard system dependency on Python by using a container
