This is a simple Node.js application that lets PostgreSQL send messages to a RabbitMQ server using NOTIFY.
For example, in SQL you can run the following query:
```sql
SELECT pgnotify_rabbitmq.send('rabbit', 'hola!');
```
This application will then immediately publish that message to a RabbitMQ topic exchange with a configured routing key. You can then have an application listen for messages with that key and process those events immediately.
Some structure and elements are expected in the database which you need to create by running the following script:
Every time the application starts, it executes the function "pgnotify_rabbitmq"."handle_ack"() to process messages that have not yet been acknowledged.
For more info see:
```shell
docker compose -f docker-compose-dev.yml up --build
```
You must wait at least 5 seconds for RabbitMQ to start.
You need to provide a config.yaml file containing details about your database - a template is provided in the repository.
It consists of three sections:
This section contains the connection details for your databases:
```yaml
databases:
  testDB:
    enabled: true
    host: localhost
    port: 5432
    database: postgres
    user: postgres
    password: postgres
    ssl: false
```
You need to run the script scripts/pgnotify-rabbitmq.sql in each database. Here we have just one database configured, called testDB, which will be referred to later.
This section defines details of the rabbitmq instances you want to connect to. It simply consists of a name for the instance and the connection URI to connect to it.
```yaml
rabbit:
  testRabbit: amqp://guest:password@localhost
```
Note: You can use an IP address here instead of the hostname. If it's an IPv6 address, wrap it in a pair of [ ].
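The bracketing rule for IPv6 hosts can be sketched as follows. `amqpUri` is a hypothetical helper written for illustration, not part of this project:

```javascript
// Build an AMQP connection URI, wrapping IPv6 hosts in brackets.
// amqpUri is a hypothetical helper, not part of this project's code.
function amqpUri(user, password, host, port = 5672) {
  // A bare IPv6 address contains colons, which would be misread as a
  // port separator, so it must be wrapped in [ ].
  const h = host.includes(':') && !host.startsWith('[') ? `[${host}]` : host;
  return `amqp://${user}:${password}@${h}:${port}`;
}

console.log(amqpUri('guest', 'password', 'localhost'));
// amqp://guest:password@localhost:5672
console.log(amqpUri('guest', 'password', '::1'));
// amqp://guest:password@[::1]:5672
```

Here 5672 is RabbitMQ's default AMQP port, which is why the URI in the config above can omit it.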
This section defines which databases you want to listen to for notifications. You usually have one entry per database (but you are not limited to this).
```yaml
notify:
  -
    enabled: true
    database: testdb
    name: rabbit
    handlers:
      rabbit:
        instance: testRabbit
        key: job.status
```
Here we are telling the application to listen for notifications sent on the 'rabbit' channel of the testdb database. All messages received will be sent as-is to the testRabbit RabbitMQ instance with the routing key 'job.status'.
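Since messages are published with a routing key like 'job.status', a consumer on the RabbitMQ side can bind with topic patterns such as 'job.*' or 'job.#'. The matching itself is done server-side by RabbitMQ; the sketch below just illustrates the topic semantics ('*' matches exactly one word, '#' matches zero or more) and is not part of this project:

```javascript
// Illustration of AMQP topic-binding semantics: '*' matches one
// dot-separated word, '#' matches zero or more words.
// Not project code; RabbitMQ performs this matching itself.
function topicMatches(pattern, key) {
  const p = pattern.split('.');
  const k = key.split('.');
  function match(i, j) {
    if (i === p.length) return j === k.length;
    if (p[i] === '#') {
      // '#' may absorb zero or more remaining words.
      for (let skip = j; skip <= k.length; skip++) {
        if (match(i + 1, skip)) return true;
      }
      return false;
    }
    if (j === k.length) return false;
    if (p[i] === '*' || p[i] === k[j]) return match(i + 1, j + 1);
    return false;
  }
  return match(0, 0);
}

console.log(topicMatches('job.*', 'job.status'));      // true
console.log(topicMatches('job.*', 'job.status.done')); // false
console.log(topicMatches('job.#', 'job.status.done')); // true
```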
Then from PostgreSQL you can use:
```shell
echo -e "SELECT pgnotify_rabbitmq.send('rabbit','hola');" | docker exec -i pgnotify-rabbitmq-postgresql-1 psql -U postgres
```
to send the message.
This is a simple case; you can also allow PostgreSQL to define the routing key:
```yaml
notify:
  -
    enabled: true
    database: testdb
    name: rabbit
    json: true
    handlers:
      rabbit:
        instance: testRabbit
        routingKey: key
        payload: body
```
Here we are telling the application to expect a JSON object from PostgreSQL with two properties:
- "key" will contain the routing key to use
- "body" will contain the message to send.
```shell
echo "SELECT pgnotify_rabbitmq.send('rabbit','{\"key\":\"key.test\",\"body\": \"My message\"}');" | docker exec -i pgnotify-rabbitmq-postgresql-1 psql -U postgres
```
Note: "payload" is optional here. If absent, the original message will be sent as-is, including the routing key, etc.
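The behavior the json mode describes can be sketched like this. `resolveMessage` is a hypothetical helper, not the application's actual code; the property names 'key' and 'body' come from the handler config above:

```javascript
// Sketch of json:true handler resolution: look up the routing key and
// message body in the notification's JSON object, using the property
// names declared in the handler config. Hypothetical helper, written
// for illustration only.
function resolveMessage(raw, handler) {
  const obj = JSON.parse(raw);
  const routingKey = obj[handler.routingKey]; // e.g. obj.key -> 'key.test'
  // 'payload' is optional: when absent, forward the original message
  // unchanged (routing key included).
  const message = handler.payload ? obj[handler.payload] : raw;
  return { routingKey, message };
}

const handler = { instance: 'testRabbit', routingKey: 'key', payload: 'body' };
console.log(resolveMessage('{"key":"key.test","body":"My message"}', handler));
// { routingKey: 'key.test', message: 'My message' }
```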
This Node.js project implements an automated workflow to send push notifications via Firebase Cloud Messaging (FCM) from events generated in a PostgreSQL database.
Upon receiving a NOTIFY event, the application extracts the message payload, which contains the information necessary to construct the FCM notification (e.g., title, body, device token, etc.).
```sql
SELECT pgnotify_rabbitmq.send('fcm',
  json_build_object(
    'topic', 'all-notifications',
    'notification', json_build_object(
      'title', 'Title',
      'body', 'message'
    )
  )::TEXT
);
```
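On the Node.js side, extracting an FCM message from a payload like the one above could look roughly like this. The object shape (topic/token plus notification) follows firebase-admin's Message type, but the helper itself is an illustrative sketch, not the project's actual code:

```javascript
// Sketch: turn the NOTIFY payload into an object suitable for
// firebase-admin's admin.messaging().send(...). Illustrative only,
// not this project's actual implementation.
function buildFcmMessage(raw) {
  const data = JSON.parse(raw);
  // FCM needs a delivery target: a topic or a device token.
  if (!data.topic && !data.token) {
    throw new Error('FCM message needs a topic or a device token');
  }
  const target = data.topic ? { topic: data.topic } : { token: data.token };
  return { ...target, notification: data.notification };
}

const payload = JSON.stringify({
  topic: 'all-notifications',
  notification: { title: 'Title', body: 'message' },
});
console.log(buildFcmMessage(payload));
// { topic: 'all-notifications', notification: { title: 'Title', body: 'message' } }
```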
You can use environment variables to configure the service; for more info see:
To avoid conflicts with configuration files, if you want to use environment variables you must set USE_TEMPLATE to true.
Optionally, you can set the HANDLE_ACK environment variable to run SELECT pgnotify_rabbitmq.handle_ack(); at app startup.
To run, first create a config.yaml file with your configuration, then run:
```shell
docker run -d -v $(pwd)/config.yaml:/opt/config.yaml robertbruno/pgnotify-rabbitmq:latest
```
Prometheus logs real-time metrics to a time-series database built using an HTTP pull model, with flexible queries and real-time alerts. Grafana allows the display and formatting of metric data; it lets you create dashboards and graphs from multiple sources, including time-series databases.
You can visualize these metrics in Grafana with the following dashboard:
For more info about metrics visit: