@juandspy
Created August 23, 2022 12:38
Populate notification DB N times

How to

❯ git clone git@github.com:RedHatInsights/ccx-notification-writer.git
❯ cd ccx-notification-writer
❯ make build
❯ ./ccx-notification-writer db-init
❯ cd ..
❯ export PGPASSWORD=postgres
❯ psql --username=postgres --host=localhost --port=5432 --dbname notification -f init.sql
❯ for i in {1..10}; do psql --username=postgres --host=localhost --port=5432 --dbname notification -f duplicate.sql; done
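Each pass of duplicate.sql re-inserts every existing row under a fresh cluster UUID, so both tables double on every iteration: the 10-iteration loop above leaves 2^10 = 1024 times the seeded row count. A quick sanity check of that growth:

```python
def rows_after(initial_rows: int, runs: int) -> int:
    """Row count after `runs` passes of duplicate.sql.

    Each pass INSERT ... SELECTs every existing row, doubling the table.
    """
    return initial_rows * 2 ** runs

print(rows_after(1, 10))  # 1 seed row, 10 loop iterations -> 1024
```

Bump the loop bound with care: the growth is exponential, so 20 iterations already mean about a million copies of each seed row.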

It would be more realistic to copy a real report instead of the 'test' placeholder used in init.sql (see duplicate.sql, which re-inserts existing reports).
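Following that note, you could seed the database from a real report payload instead of 'test'. A minimal sketch, assuming you have a report JSON file on disk; the helper name and file path are hypothetical, but the column list matches init.sql:

```python
import json
import uuid

def seed_from_report(path, org_id=1, account_number=1):
    """Build an init.sql-style INSERT from a real report JSON file.

    `path` is hypothetical: point it at any report payload you have on disk.
    The UUID is generated client-side, like gen_random_uuid() in init.sql.
    """
    with open(path) as f:
        report = json.dumps(json.load(f))
    report = report.replace("'", "''")  # escape single quotes for the SQL literal
    return (
        "INSERT INTO new_reports "
        "(org_id, account_number, cluster, report, updated_at, kafka_offset) "
        f"VALUES ({org_id}, {account_number}, '{uuid.uuid4()}', "
        f"'{report}', DATE '2015-12-17', 1);"
    )
```

The emitted statement can be fed to psql the same way as init.sql above.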

# docker-compose.yml
version: "3"
services:
  # Auxiliary services the external data pipeline needs
  db:
    image: registry.redhat.io/rhscl/postgresql-13-rhel7
    ports:
      - 5432:5432
    environment:
      - POSTGRESQL_USER=user
      - POSTGRESQL_PASSWORD=password
      - POSTGRESQL_ADMIN_PASSWORD=postgres
      - POSTGRESQL_DATABASE=notification
  kafka:
    image: quay.io/ccxdev/kafka-no-zk:latest
    ports:
      - 9092:9092
    environment:
      - KAFKA_ADVERTISED_HOST_NAME=kafka
      - KAFKA_CREATE_TOPICS="ccx.ocp.results:1:1"
-- duplicate.sql
-- These queries duplicate the data in the notification database, so you no
-- longer need the ccx-notification-writer to generate more and more data :)
-- new_reports
INSERT INTO
new_reports (org_id, account_number, cluster, report, updated_at, kafka_offset)
SELECT org_id, account_number, gen_random_uuid() AS cluster, report, updated_at, kafka_offset
FROM new_reports;
-- reported
INSERT INTO
reported (org_id, account_number, cluster, notification_type, state, report, updated_at, notified_at, error_log)
SELECT org_id, account_number, gen_random_uuid() AS cluster, notification_type, state, report, updated_at, notified_at, error_log
FROM reported;
-- init.sql
-- new_reports
INSERT INTO
new_reports (org_id, account_number, cluster, report, updated_at, kafka_offset)
VALUES (1, 1, gen_random_uuid(), 'test', DATE '2015-12-17' , 1);
-- reported
INSERT INTO
reported (org_id, account_number, cluster, notification_type, state, report, updated_at, notified_at, error_log)
VALUES (1, 1, gen_random_uuid(), 1, 1, 'test', DATE '2015-12-17', DATE '2015-12-17', 'test');
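If you want more than one seed row per table before the doubling loop, a single multi-row INSERT is cheaper than running init.sql repeatedly. A sketch that mirrors init.sql's new_reports row; the helper name is hypothetical, and the UUIDs are generated client-side instead of via gen_random_uuid():

```python
import uuid

def seed_statement(n, table="new_reports"):
    """One INSERT with n VALUES tuples, mirroring init.sql's new_reports row.

    Seeding many rows up front means fewer duplicate.sql doublings are needed
    to reach a target table size.
    """
    rows = ", ".join(
        f"(1, 1, '{uuid.uuid4()}', 'test', DATE '2015-12-17', 1)"
        for _ in range(n)
    )
    return (
        f"INSERT INTO {table} "
        "(org_id, account_number, cluster, report, updated_at, kafka_offset) "
        f"VALUES {rows};"
    )

print(seed_statement(100))  # 100 seed rows in one statement
```

Save the output to a file and run it through psql -f, exactly like init.sql above.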