getIssues - Get check-up issues and alerts
Use case
You can use this endpoint to retrieve the currently active issues for a server or database, as determined by the pganalyze Check-Up system, in order to export them to a third-party system not directly supported by pganalyze's alerting integrations.
API
Arguments for getIssues:

- serverId (string): Server ID to retrieve issues for
- databaseId (string): Database ID to retrieve issues for (either this or serverId needs to be specified)
- severity (string[] of "info", "warning", or "critical") (optional): Only return issues with matching severities (by default, issues of any severity are returned)
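A request using these arguments can also be built with GraphQL variables instead of inlined values. The sketch below only constructs the request body; the variable type declarations are assumptions about the schema, and the server ID is the example value used throughout this page:

```python
import json

# GraphQL query using variables; the $serverId / $severity type declarations
# are assumptions about the schema, not confirmed by this documentation.
query = """
query IssuesBySeverity($serverId: String!, $severity: [String!]) {
  getIssues(serverId: $serverId, severity: $severity) {
    id
    description
    severity
    references { name url }
  }
}
"""

# Request body as it could be POSTed to https://app.pganalyze.com/graphql,
# together with an "Authorization: Token ..." header (see the curl example below).
body = json.dumps({
    "query": query,
    "variables": {
        "serverId": "bdp4m73ysjgefke7i2dgkenzi4",  # example server ID
        "severity": ["warning", "critical"],       # skip "info" issues
    },
})
print(body)
```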
Fields returned:

- issues (Array of IssueType)
  - id (String): Unique ID of this issue
  - databaseId (Number): Database ID this issue is related to
  - description (String): Issue description
  - severity (String): Issue severity (info, warning, critical)
  - references (Array of IssueReferenceType)
    - kind (String): Type of object this issue references
    - name (String): Name of the object this issue references
    - url (String): URL to the object this issue references
    - secondaryUrl (String) (optional): Secondary URL for an alternate object this issue references (e.g. active queries reference both the connection as well as the query object)
    - queryText (String) (optional): For query references, statement text of the query
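The returned structure can be modeled as simple typed records when consuming the API. This is a sketch mirroring the field list above; the `kind` value and sample data are illustrative:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class IssueReference:
    kind: str                            # type of object this issue references
    name: str                            # name of the referenced object
    url: str                             # URL to the referenced object
    secondaryUrl: Optional[str] = None   # alternate object (e.g. query vs. connection)
    queryText: Optional[str] = None      # statement text, for query references

@dataclass
class Issue:
    id: str
    databaseId: int
    description: str
    severity: str                        # "info", "warning", or "critical"
    references: List[IssueReference] = field(default_factory=list)

# Illustrative instance; the "config" kind is an assumption, not documented above.
issue = Issue(
    id="42",
    databaseId=1,
    description="Example issue",
    severity="warning",
    references=[IssueReference(
        kind="config",
        name="shared_buffers",
        url="/servers/bdp4m73ysjgefke7i2dgkenzi4/config/shared_buffers",
    )],
)
print(issue.severity)
```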
Example
GraphQL query:
query {
getIssues(serverId: "bdp4m73ysjgefke7i2dgkenzi4") {
id
description
severity
references {
name
url
}
}
}
Using curl:
curl -XPOST -H 'Authorization: Token XXXXXXX' \
-F 'query=query { getIssues(serverId: "bdp4m73ysjgefke7i2dgkenzi4") { id, description, severity, references { name, url } } }' \
https://app.pganalyze.com/graphql
Example response:
{
"data": {
"getIssues": [
{
"id": "42",
      "description": "Setting shared_buffers is too small. For MB RAM try using shared_buffers = 256 MB (current: 199 MB)",
"severity": "warning",
"references": [
{
"name": "shared_buffers",
"url": "/servers/bdp4m73ysjgefke7i2dgkenzi4/config/shared_buffers"
}
]
}
]
}
}
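To forward these issues to a third-party system, parse the JSON response and map each issue into your target's format. A minimal sketch using the sample response above (the output format is illustrative; any specific alerting target is left out):

```python
import json

# Sample response from the getIssues query above (abbreviated).
raw = """
{
  "data": {
    "getIssues": [
      {
        "id": "42",
        "description": "Setting shared_buffers is too small",
        "severity": "warning",
        "references": [
          {
            "name": "shared_buffers",
            "url": "/servers/bdp4m73ysjgefke7i2dgkenzi4/config/shared_buffers"
          }
        ]
      }
    ]
  }
}
"""

base_url = "https://app.pganalyze.com"  # reference URLs are relative paths
lines = []
for issue in json.loads(raw)["data"]["getIssues"]:
    refs = ", ".join(f"{r['name']} ({base_url}{r['url']})" for r in issue["references"])
    lines.append(f"[{issue['severity'].upper()}] {issue['description']} -> {refs}")

for line in lines:
    print(line)
```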