
Step 3: Configure the Collector

Configuring the collector with Docker

Create your configuration file (e.g. named pganalyze_collector.env) with one environment variable per line, in plain KEY=value form (Docker's --env-file parser does not support quoting or export statements):

PGA_API_KEY=your-organization-api-key
DB_HOST=1.2.3.4
DB_NAME=your_database_name
DB_USERNAME=your_monitoring_user
DB_PASSWORD=your_monitoring_user_password
GCP_PROJECT_ID=your_gcp_project_id
GCP_CLOUDSQL_INSTANCE_ID=your_gcp_cloudsql_instance_id

Fill in the values step-by-step:

  1. The PGA_API_KEY can be found in the pganalyze Settings page for your organization, under the API keys tab
  2. The DB_HOST is the IP address of your Google Cloud SQL / AlloyDB instance
  3. The DB_NAME is the database on the Google Cloud SQL / AlloyDB instance you want to monitor
  4. The DB_USERNAME and DB_PASSWORD should be the credentials of the monitoring user we created in Step 1
  5. The GCP_PROJECT_ID should match the name of the GCP project that contains your Cloud SQL / AlloyDB instance
  6. The GCP_CLOUDSQL_INSTANCE_ID should match the name of the Cloud SQL instance (if you are using AlloyDB, see the instructions below)
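The steps above can be done in one go from a shell. This sketch writes the configuration file with the same placeholder values as the template (substitute your own credentials and identifiers) and then counts the KEY=value lines as a quick sanity check:

```shell
# Write the collector configuration file (placeholder values shown;
# replace each with your own settings before running the collector)
cat > pganalyze_collector.env <<'EOF'
PGA_API_KEY=your-organization-api-key
DB_HOST=1.2.3.4
DB_NAME=your_database_name
DB_USERNAME=your_monitoring_user
DB_PASSWORD=your_monitoring_user_password
GCP_PROJECT_ID=your_gcp_project_id
GCP_CLOUDSQL_INSTANCE_ID=your_gcp_cloudsql_instance_id
EOF

# Each of the 7 lines should contain exactly one KEY=value pair
grep -c '=' pganalyze_collector.env   # → 7
```

Keep this file out of version control, since it contains the monitoring user's password and your pganalyze API key.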

Instructions for Google AlloyDB

If you are using Google AlloyDB, do not specify GCP_CLOUDSQL_INSTANCE_ID, but instead specify GCP_ALLOYDB_CLUSTER_ID (set to the name of the cluster) and GCP_ALLOYDB_INSTANCE_ID (set to the instance name within the cluster).
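For example, an AlloyDB configuration file would look like this (all values are placeholders to replace with your own):

```
PGA_API_KEY=your-organization-api-key
DB_HOST=1.2.3.4
DB_NAME=your_database_name
DB_USERNAME=your_monitoring_user
DB_PASSWORD=your_monitoring_user_password
GCP_PROJECT_ID=your_gcp_project_id
GCP_ALLOYDB_CLUSTER_ID=your_alloydb_cluster_name
GCP_ALLOYDB_INSTANCE_ID=your_alloydb_instance_name
```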

Test snapshot

Then run the following command to send a test snapshot and verify your configuration:

docker run --env-file pganalyze_collector.env quay.io/pganalyze/collector:stable test
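Once the test succeeds, the collector needs to run continuously to keep sending snapshots. A minimal sketch using the same image and env file is a detached Docker container (the container name and restart policy here are illustrative choices, not prescribed by pganalyze; check the pganalyze documentation for their recommended production setup):

```shell
docker run -d \
  --name pganalyze-collector \
  --restart unless-stopped \
  --env-file pganalyze_collector.env \
  quay.io/pganalyze/collector:stable
```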

Once you've confirmed the installation was successful and you're receiving query data in pganalyze, we recommend setting up Log Insights as a follow-up step to automatically track log events in your database.

