
Watcher Alerts

The following appendix describes the procedure for creating Watcher alerts for machine learning jobs and delivering them via email and remote syslog servers.

Watcher Alert

DMF 8.1 uses Elasticsearch 7.2.0, where the inter-container functional calls are HTTP-based. However, DMF 8.3 uses Elasticsearch version 7.13.0, which requires HTTPS-based calls. Supporting this requires an extensive change to the system calls used by the Analytics Node (AN), and engineering is working on this effort. Arista recommends the following workaround until a fix is released.
Workaround Summary:
  • Create a Watcher manually using the provided template.
  • Configure the Watcher to select the job ID for the ML job that needs to send alerts.
  • Select webhook as the alerting mechanism within the Watcher to send alerts to third-party tools such as Slack.
  1. Access the AN's ML job page and click Manage Jobs to list the ML jobs.
  2. If the data feed column shows stopped, skip to Step 3. If it shows started, click the three dots for the ML job and stop its data feed.
    Figure 1. Stop Data Feed
  3. After the data feed has stopped, click the three dots and start the data feed again.
    Figure 2. Start Data Feed
  4. Select the options as shown in the following diagram.
    Figure 3. Job Time Options
  5. Confirm that the data feed has started. Note down the job ID of this ML job.
    Figure 4. ML Job Characteristics
  6. Access the Watchers page.
    Figure 5. Access Watchers
  7. Create an advanced Watcher.
    Figure 6. Create Advanced Watcher
  8. Configure the name of the Watcher (can include whitespace characters), e.g., Latency ML.
  9. Configure the ID of the Watcher (can be alphanumeric, but without whitespace characters), e.g., ml_latency.
  10. Delete the code from the Watch JSON section.
  11. Copy and paste the following code into the Watcher. Replace the highlighted text according to your environment and your ML job parameters.
    {
    "trigger": {
    "schedule": {
    "interval": "107s"
    }
    },
    "input": {
    "search": {
    "request": {
    "search_type": "query_then_fetch",
    "indices": [
    ".ml-anomalies-*"
    ],
    "rest_total_hits_as_int": true,
    "body": {
    "size": 0,
    "query": {
    "bool": {
    "filter": [
    {
    "term": {
    "job_id": "<use the id of the ML job retrieved in step 6.>"
    }
    },
    {
    "range": {
    "timestamp": {
    "gte": "now-30m"
    }
    }
    },
    {
    "terms": {
    "result_type": [
    "bucket",
    "record",
    "influencer"
    ]
    }
    }
    ]
    }
    },
    "aggs": {
    "bucket_results": {
    "filter": {
    "range": {
    "anomaly_score": {
    "gte": 75
    }
    }
    },
    "aggs": {
    "top_bucket_hits": {
    "top_hits": {
    "sort": [
    {
    "anomaly_score": {
    "order": "desc"
    }
    }
    ],
    "_source": {
    "includes": [
    "job_id",
    "result_type",
    "timestamp",
    "anomaly_score",
    "is_interim"
    ]
    },
    "size": 1,
    "script_fields": {
    "start": {
    "script": {
    "lang": "painless",
    "source": "LocalDateTime.ofEpochSecond((doc[\"timestamp\"].value.getMillis()-((doc[\"bucket_span\"].value * 1000)\n * params.padding)) / 1000, 0,ZoneOffset.UTC).toString()+\":00.000Z\"",
    "params": {
    "padding": 10
    }
    }
    },
    "end": {
    "script": {
    "lang": "painless",
    "source": "LocalDateTime.ofEpochSecond((doc[\"timestamp\"].value.getMillis()+((doc[\"bucket_span\"].value * 1000)\n * params.padding)) / 1000, 0,ZoneOffset.UTC).toString()+\":00.000Z\"",
    "params": {
    "padding": 10
    }
    }
    },
    "timestamp_epoch": {
    "script": {
    "lang": "painless",
    "source": """doc["timestamp"].value.getMillis()/1000"""
    }
    },
    "timestamp_iso8601": {
    "script": {
    "lang": "painless",
    "source": """doc["timestamp"].value"""
    }
    },
    "score": {
    "script": {
    "lang": "painless",
    "source": """Math.round(doc["anomaly_score"].value)"""
    }
    }
    }
    }
    }
    }
    },
    "influencer_results": {
    "filter": {
    "range": {
    "influencer_score": {
    "gte": 3
    }
    }
    },
    "aggs": {
    "top_influencer_hits": {
    "top_hits": {
    "sort": [
    {
    "influencer_score": {
    "order": "desc"
    }
    }
    ],
    "_source": {
    "includes": [
    "result_type",
    "timestamp",
    "influencer_field_name",
    "influencer_field_value",
    "influencer_score",
    "isInterim"
    ]
    },
    "size": 3,
    "script_fields": {
    "score": {
    "script": {
    "lang": "painless",
    "source": """Math.round(doc["influencer_score"].value)"""
    }
    }
    }
    }
    }
    }
    },
    "record_results": {
    "filter": {
    "range": {
    "record_score": {
    "gte": 75
    }
    }
    },
    "aggs": {
    "top_record_hits": {
    "top_hits": {
    "sort": [
    {
    "record_score": {
    "order": "desc"
    }
    }
    ],
    "_source": {
    "includes": [
    "result_type",
    "timestamp",
    "record_score",
    "is_interim",
    "function",
    "field_name",
    "by_field_value",
    "over_field_value",
    "partition_field_value"
    ]
    },
    "size": 3,
    "script_fields": {
    "score": {
    "script": {
    "lang": "painless",
    "source": """Math.round(doc["record_score"].value)"""
    }
    }
    }
    }
    }
    }
    }
    }
    }
    }
    }
    },
    "condition": {
    "compare": {
    "ctx.payload.aggregations.bucket_results.doc_count": {
    "gt": 0
    }
    }
    },
    "actions": {
    "log": {
    "logging": {
    "level": "info",
    "text": "Alert for job [{{ctx.payload.aggregations.bucket_results.top_bucket_hits.hits.hits.0._source.job_id}}] at [{{ctx.payload.aggregations.bucket_results.top_bucket_hits.hits.hits.0.fields.timestamp_iso8601.0}}] score [{ctx.payload.aggregations.bucket_results.top_bucket_hits.hits.hits.0.fields.score.0}}]"
    }
    },
    "my_webhook": {
    "webhook": {
    "scheme": "https",
    "host": "hooks.slack.com",
    "port": 443,
    "method": "post",
    "path": "<path for slack>",
    "params": {},
    "headers": {
    "Content-Type": "application/json"
    },
    "body": """{"channel": "#<slack channel name>", "username": "webhookbot", "text":"Alert for job [{{ctx.payload.aggregations.bucket_results.top_bucket_hits.hits.hits.0._source.job_id}}] at [{{ctx.payload.aggregations.bucket_results.top_bucket_hits.hits.hits.0.fields.timestamp_iso8601.0}}] score [{{ctx.payload.aggregations.bucket_results.top_bucket_hits.hits.hits.0.fields.score.0}}]", "icon_emoji": ":exclamation:"}"""
    }
    }
    }
    }
  12. Click Create Watch to create the Watcher.
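For reference, the start and end script fields in the watch above pad the anomaly timestamp by plus or minus (bucket_span * padding) to define a time window around the anomaly. A minimal Python sketch of that arithmetic (the function name and sample values are illustrative, not part of the watch):

```python
from datetime import datetime, timezone

def anomaly_window(timestamp_ms, bucket_span_s, padding=10):
    """Mirror the watch's 'start'/'end' script fields: pad the anomaly
    timestamp by +/- (bucket_span * padding) seconds, formatted as UTC
    ISO-8601 with milliseconds."""
    delta_s = bucket_span_s * padding
    fmt = "%Y-%m-%dT%H:%M:%S.000Z"
    start = datetime.fromtimestamp(timestamp_ms / 1000 - delta_s, tz=timezone.utc)
    end = datetime.fromtimestamp(timestamp_ms / 1000 + delta_s, tz=timezone.utc)
    return start.strftime(fmt), end.strftime(fmt)

# A 15-minute bucket span padded by 10 gives a +/- 150-minute window.
start, end = anomaly_window(1_700_000_000_000, bucket_span_s=900)
```

The padding of 10 bucket spans matches the "padding" parameter in the watch's painless scripts.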

Kibana Watcher for Webhook Connector

This section describes how to configure a Watcher for Webhook-type connectors.

Kibana connectors, found under Stack Management > Alerts and Insights, provide seamless integration between the Elasticsearch alerting engine and external systems.

They enable automated notifications and actions to be triggered based on defined conditions, enhancing monitoring and incident response capabilities. Webhook connectors allow alerts to be forwarded to platforms like Slack and Google Chat, delivering customizable payloads to notify relevant teams when critical events occur.

Configuring a Kibana Email Connector

The DMF 8.7.0 release supports the following connector integrations:
  • Gmail via the Email Connector
  • Google Chat and Slack via the Webhook Connector.

Select an existing Kibana email connector to send email alerts or create a connector by navigating to Stack Management > Alerts and Insights > Connectors > Create Connectors. Complete the following steps:

Figure 7. Rules and Connectors
Note: Only the Email and Webhook connector types are available in the DMF 8.7 release.

Google Chat Webhook Connector

The HTTP header Content-Type is mandatory, with the value application/json; charset=UTF-8.
Figure 8. Webhook Connector
Under Test > Create an action > Body, enter the following test payload:
{"text": "Message from Kibana Connector"}
Note: The text field is mandatory in the Body section.
Ensure the result is successful and confirm that the respective chat window receives the message.
Figure 9. Editing Google Connector to create action
The screen shows the following message.
Figure 10. Google Chat Message

For any additional details, refer to https://developers.google.com/workspace/chat/quickstart/webhooks#create-webhook.
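The connector test above can also be reproduced outside Kibana. A minimal sketch using Python's standard library (the webhook URL below is a placeholder; a real URL comes from the Google Chat space's webhook settings):

```python
import json
import urllib.request

def build_chat_request(webhook_url, text):
    """Build a Google Chat webhook POST: a JSON body with the mandatory
    'text' field and the required Content-Type header."""
    body = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        webhook_url,
        data=body,
        headers={"Content-Type": "application/json; charset=UTF-8"},
        method="POST",
    )

req = build_chat_request(
    "https://chat.googleapis.com/v1/spaces/EXAMPLE/messages?key=KEY&token=TOKEN",
    "Message from Kibana Connector",
)
# urllib.request.urlopen(req)  # sends the message; requires a valid webhook URL
```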

Slack Webhook Connector

Under Test > Create an action > Body, enter the following test payload:
{"text": "Message from Kibana Connector"}
Note: The text field is mandatory in the Body section.
Figure 11. Editing Slack Connector to create action
The screen shows the following message.
Figure 12. Slack Chat Message

For any additional details, refer to https://api.slack.com/messaging/webhooks#getting_started.
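The same check works for Slack; the payload differs only in the optional channel and username fields. A small sketch of building it (the example values are illustrative):

```python
import json

def slack_payload(text, channel=None, username=None):
    """Build a Slack incoming-webhook body: 'text' is mandatory,
    'channel' and 'username' are optional."""
    body = {"text": text}
    if channel:
        body["channel"] = channel
    if username:
        body["username"] = username
    return json.dumps(body)

payload = slack_payload("Message from Kibana Connector",
                        channel="#test-channel", username="webhookbot")
```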

Configuring a Watch

Configure a Watch using the Watcher UI's Create > Create advanced watch option.

A sample watcher action that targets Google Chat or Slack through a Webhook connector is as follows.
"webhook_googlechat": {
 "webhook": {
 "scheme": "http",
 "host": "169.254.16.1",
 "port": 8000,
 "method": "post",
 "params": {},
 "headers": {},
 "body": """{"request_body": "{\"text\": \"The Elasticsearch cluster status is {{ctx.payload.status}}.\"}","kibana_webhook_connector": "<google-chat-webhook-connector-name>"}"""
 }
}
  • Add a webhook action with the following fields to select Google Chat and Slack Webhook Connectors.
    • Method: POST
    • Scheme: HTTP
    • Host: 169.254.16.1
    • Port: 8000
    • Specify the Body field as follows:
      • kibana_webhook_connector: The name of the Kibana connector (of type webhook) in string format. It is case-sensitive.
      • request_body: The HTTP request body to pass to the connector, as a JSON-escaped string.
        Note: The request_body value is specific to the connector’s specification.

Click the Save button.
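The Body field above is JSON whose request_body value is itself a JSON string, so it must be double-encoded. A sketch of building it programmatically rather than hand-escaping (the connector name is the placeholder used in this section):

```python
import json

def watcher_webhook_body(connector_name, message):
    """Build the watcher action body: 'request_body' carries the
    connector-specific payload as an escaped JSON string."""
    return json.dumps({
        "request_body": json.dumps(message),
        "kibana_webhook_connector": connector_name,
    })

body = watcher_webhook_body(
    "<google-chat-webhook-connector-name>",
    {"text": "The Elasticsearch cluster status is {{ctx.payload.status}}."},
)
```

Decoding the outer JSON and then the inner request_body string recovers the original message, which is exactly what the webhook listener must do on receipt.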

Google Chat Watcher configuration

The Google Chat webhook accepts a JSON-formatted body with a mandatory text field, so the request_body value must be a JSON-formatted string containing a text field. For example:
"{\"text\": \"The Elasticsearch cluster status is {{ctx.payload.status}}.\"}"
A sample watcher script with action type Google Chat Webhook is:
{
 "trigger": {
 "schedule": {
 "interval": "1h"
 }
 },
 "input": {
 "http": {
 "request": {
 "scheme": "https",
 "host": "10.10.10.10",
 "port": 443,
 "method": "get",
 "path": "/es/api/_cluster/health",
 "params": {},
 "headers": {
 "Content-Type": "application/json"
 },
 "auth": {
 "basic": {
 "username": "admin",
 "password": "::es_redacted::"
 }
 }
 }
 }
 },
 "condition": {
 "script": {
 "source": "ctx.payload.status != 'green'",
 "lang": "painless"
 }
 },
 "actions": {
 "webhook_googlechat": {
 "webhook": {
 "scheme": "http",
 "host": "169.254.16.1",
 "port": 8000,
 "method": "post",
 "params": {},
 "headers": {},
 "body": """{"request_body": "{\"text\": \"The Elasticsearch cluster status is {{ctx.payload.status}}.\"}","kibana_webhook_connector": "BSN-Analytics-AppTest-GH-connector"}"""
 }
 }
 }
}
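The watch above fires only when the cluster health payload's status is not green. A small Python sketch of that decision and message rendering (hypothetical helpers, mirroring the painless condition and the action's template):

```python
def should_alert(payload):
    """Mirror the watch condition: ctx.payload.status != 'green'."""
    return payload.get("status") != "green"

def alert_text(payload):
    """Render the action's message template for a firing watch."""
    return f"The Elasticsearch cluster status is {payload['status']}."

# A yellow or red cluster fires the action; a green one does not.
```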

Slack Watcher configuration

The Slack webhook accepts a JSON-formatted body with the following fields:
  • text: (mandatory)
  • channel: (optional) The channel name must match the Slack channel for which the webhook is enabled.
  • username: (optional) Any username can be used.
For example:
"{\"channel\": \"test-channel\", \"username\": \"webhookbot\", \"text\": \"The Elasticsearch cluster status is {{ctx.payload.status}}.\"}"
A sample watcher script with an action of type Slack Webhook is:
{
 "trigger": {
 "schedule": {
 "interval": "1m"
 }
 },
 "input": {
 "http": {
 "request": {
 "scheme": "https",
 "host": "10.10.10.1",
 "port": 443,
 "method": "get",
 "path": "/es/api/_cluster/health",
 "params": {},
 "headers": {
 "Content-Type": "application/json"
 },
 "auth": {
 "basic": {
 "username": "admin",
 "password": "::es_redacted::"
 }
 }
 }
 }
 },
 "condition": {
 "script": {
 "source": "ctx.payload.status != 'green'",
 "lang": "painless"
 }
 },
 "actions": {
 "webhook_slack": {
 "webhook": {
 "scheme": "http",
 "host": "169.254.16.1",
 "port": 8000,
 "method": "post",
 "params": {},
 "headers": {},
 "body": """{"request_body": "{\"channel\": \"test-webhook\", \"username\": \"webhookbot\", \"text\": \"The Elasticsearch cluster status is {{ctx.payload.status}}.\"}","kibana_webhook_connector": "test-slack-webhook-connector"}"""
 }
 }
 }
}

Troubleshooting

If the Watcher does not send an alert to the configured Webhook connector even though the trigger condition is met:
  • Verify that the Kibana connector is of type Webhook.
  • Verify that the Kibana connector is properly configured by running tests from the UI, and review any connector-specific configuration.
  • Verify that the Watcher’s trigger conditions are properly configured.
  • Make sure all required parameters are present in the Watcher’s webhook action configuration.
  • To debug execution from the CLI:
    • SSH to the AN node.
    • Enter CLI debug mode with the command: debug bash
    • Review the logs in /var/log/analytics/webhook_listener.log for clues with the command: tail -f /var/log/analytics/webhook_listener.log
    • To run the service in debug mode:
      • Log in as root with the command: sudo su
      • Stop the service with the command: service webhook-listener stop
      • Edit the service in your preferred editor (vi /usr/lib/python3.9/site-packages/webhook_listener/run_webhook_listener.py) and change the line LOGGER.setLevel(logging.INFO) to LOGGER.setLevel(logging.DEBUG).
      • Start the service with the command: service webhook-listener start

You will see debug messages in the log file.

Note: After debugging, revert the debug level to info. If the problem persists, then raise a support ticket.

Limitations

None.

Enabling Secure Email Alerts through SMTP Setting

Refresh the page to view the updated SMTP Settings fields.

The following is an example of the UI SMTP Settings in previous releases:
Figure 13. SMTP Setting
After upgrading the Analytics Node from an earlier version to the DMF 8.6.* version, the following changes apply:
  • Server Name, User, Password, Sender, and Timezone no longer appear in the SMTP Settings.
  • A new field, Kibana Email Connector Name, has been added to SMTP Settings.
  • The system retains Recipients and Dedupe Interval and their respective values in SMTP Settings.
  • If previously configured SMTP settings exist:
    • The system automatically creates a Kibana email connector named SMTPForAlerts using the values previously specified in the fields Server Name, User (optional), Password (optional), and Sender.
    • The Kibana Email Connector Name field automatically becomes SMTPForAlerts.
The following settings appear in the UI after the upgrade to the DMF 8.6.* version:
Figure 14. Upgraded SMTP Setting

Troubleshooting

If Apply & Test does not send an email to the designated recipients, verify that the recipient email addresses are comma-separated and spelled correctly. If it still does not work, verify that the designated Kibana email connector matches the name of an existing Kibana email connector. Test that connector by navigating to Stack Management > Rules and Connectors > Connectors, selecting the connector's name, and sending a test email from the Test tab.
