ddev create Awesome
Write an Agent check

At the core of each Agent-based integration is an Agent Check that periodically collects information and sends it to Datadog.

Checks inherit their logic from the AgentCheck base class and have the following requirements:

Integrations running on Datadog Agent v7 or later must be compatible with Python 3. Integrations running on Agent v5 and v6 still use Python 2.7.
Checks must derive from AgentCheck.
Checks must provide a method with this signature: check(self, instance).
Checks are organized in regular Python packages under the datadog_checks namespace. For example, the code for Awesome lives in the awesome/datadog_checks/awesome/ directory (see the layout sketch after this list).
The name of the package must be the same as the check name.
There are no restrictions on the names of the Python modules within that package, nor on the name of the class implementing the check.
Implement check logic

For Awesome, the Agent Check is composed of a service check named awesome.search that searches for a string on a web page. It results in OK if the string is present, WARNING if the page is accessible but the string was not found, and CRITICAL if the page is inaccessible.

To learn how to submit metrics with your Agent Check, see Custom Agent Check.
The code contained within awesome/datadog_checks/awesome/check.py looks something like this:

check.py
import requests

from datadog_checks.base import AgentCheck, ConfigurationError


class AwesomeCheck(AgentCheck):
    """AwesomeCheck derives from AgentCheck, and provides the required check method."""

    def check(self, instance):
        url = instance.get('url')
        search_string = instance.get('search_string')

        # It's a very good idea to do some basic sanity checking.
        # Try to be as specific as possible with the exceptions.
        if not url or not search_string:
            raise ConfigurationError('Configuration error, please fix awesome.yaml')

        try:
            response = requests.get(url)
            response.raise_for_status()
        # Something went horribly wrong
        except Exception as e:
            # Ideally we'd use a more specific message...
            self.service_check('awesome.search', self.CRITICAL, message=str(e))
        # Page is accessible
        else:
            # search_string is present
            if search_string in response.text:
                self.service_check('awesome.search', self.OK)
            # search_string was not found
            else:
                self.service_check('awesome.search', self.WARNING)
| - | |||
| - | To learn more about the base Python class, see Anatomy of a Python Check. | ||
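The url and search_string values read by check come from the integration's configuration file (deployed as conf.yaml under the Agent's conf.d/awesome.d/ directory). As an illustrative sketch (the values are invented; this block is not generated output):

init_config:

instances:
  - url: https://www.datadoghq.com
    search_string: "Cloud Monitoring"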
Write validation tests

There are two types of tests:

Unit tests for specific functionality
Integration tests that execute the check method and verify proper metrics collection

pytest and hatch are used to run the tests. Tests are required in order to publish your integration.
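The hatch environments that ddev test uses are defined in the integration's hatch.toml. As a rough sketch of what that file can contain (the collector name and matrix fields below follow integrations-core conventions but are an assumption, not generated output):

[env.collectors.datadog-checks]

[[envs.default.matrix]]
python = ["3.11"]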
Write a unit test

The first part of the check method for Awesome retrieves and verifies two elements from the configuration file. This is a good candidate for a unit test.

Open the file at awesome/tests/test_awesome.py and replace the contents with something like this:

test_awesome.py
import pytest

# Don't forget to import your integration

from datadog_checks.awesome import AwesomeCheck
from datadog_checks.base import ConfigurationError


@pytest.mark.unit
def test_config():
    instance = {}
    c = AwesomeCheck('awesome', {}, [instance])

    # empty instance
    with pytest.raises(ConfigurationError):
        c.check(instance)

    # only the url
    with pytest.raises(ConfigurationError):
        c.check({'url': 'http://foobar'})

    # only the search string
    with pytest.raises(ConfigurationError):
        c.check({'search_string': 'foo'})

    # this should not fail
    c.check({'url': 'http://foobar', 'search_string': 'foo'})
| - | |||
| - | pytest has the concept of markers that can be used to group tests into categories. Notice that test_config is marked as a unit test. | ||
| - | |||
| - | The scaffolding is set up to run all the tests located in awesome/ | ||
| - | |||
| - | ddev test awesome | ||
| - | |||
| - | Write an integration test | ||
| - | |||
| - | The unit test above doesn’t check the collection logic. To test the logic, you need to create an environment for an integration test and write an integration test. | ||
| - | Create an environment for the integration test | ||
| - | |||
| - | The toolkit uses docker to spin up an NGINX container and lets the check retrieve the welcome page. | ||
| - | |||
| - | To create an environment for the integration test, create a docker-compose file at awesome/ | ||
| - | |||
| - | docker-compose.yml | ||
| - | |||
| - | version: " | ||
| - | |||
| - | services: | ||
| - | nginx: | ||
| - | image: nginx: | ||
| - | ports: | ||
| - | - " | ||
| - | |||
| - | Next, open the file at awesome/ | ||
| - | |||
| - | conftest.py | ||
| - | |||
import os

import pytest

from datadog_checks.dev import docker_run, get_docker_hostname, get_here

URL = 'http://{}:8000'.format(get_docker_hostname())
SEARCH_STRING = 'Thank you for using nginx.'
INSTANCE = {'url': URL, 'search_string': SEARCH_STRING}


@pytest.fixture(scope='session')
def dd_environment():
    compose_file = os.path.join(get_here(), 'docker', 'docker-compose.yml')

    # This does 3 things:
    #
    # 1. Spins up the services defined in the compose file
    # 2. Waits for the url to be available before running the tests
    # 3. Tears down the services when the tests are finished
    with docker_run(compose_file, endpoints=[URL]):
        yield INSTANCE


@pytest.fixture
def instance():
    return INSTANCE.copy()
| - | |||
| - | Add an integration test | ||
| - | |||
| - | After you’ve setup an environment for the integration test, add an integration test to the awesome/ | ||
| - | |||
| - | test_awesome.py | ||
| - | |||
@pytest.mark.integration
@pytest.mark.usefixtures('dd_environment')
def test_service_check(aggregator, instance):
    c = AwesomeCheck('awesome', {}, [instance])

    # the check should send OK
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.OK)

    # the check should send WARNING
    instance['search_string'] = 'Apache'
    c.check(instance)
    aggregator.assert_service_check('awesome.search', AwesomeCheck.WARNING)
| - | |||
| - | To speed up development, | ||
| - | |||
| - | ddev test -m integration awesome | ||
| - | |||
| - | Your integration is almost complete. Next, add the necessary check assets. | ||
Populate integration assets

The following set of assets created by the ddev scaffolding must be populated with information relevant to your integration:

README.md
This contains the documentation for your Agent Check, how to set it up, which data it collects, and support information.
spec.yaml
This is used to generate conf.yaml.example using the ddev tooling. For more information, see Configuration specification.
conf.yaml.example
This contains default (or example) configuration options for your Agent Check. Do not edit this file by hand; it is generated from the contents of spec.yaml. For more information, see Configuration specification.
manifest.json
This contains the metadata for your Agent Check, such as the title and categories. For more information, see Integrations Assets Reference.
metadata.csv
This contains the list of all metrics collected by your Agent Check. For more information, see Integrations Assets Reference.
service_check.json
This contains the list of all service checks collected by your Agent Check. For more information, see Integrations Assets Reference.

For more information about the README.md and manifest.json files, see Create a Tile and Integrations Assets Reference.
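For orientation, a minimal spec.yaml for Awesome could look roughly like the sketch below. This is an assumption about the configuration-spec format based on Datadog's config-spec conventions, not generated output:

name: Awesome
files:
  - name: awesome.yaml
    options:
      - template: init_config
        options: []
      - template: instances
        options:
          - name: url
            description: The URL to check.
            required: true
            value:
              type: string
          - name: search_string
            description: The string to search for on the page.
            required: true
            value:
              type: string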
Build the wheel

The pyproject.toml file provides the metadata that is used to package and build the wheel. The wheel contains the files necessary for the functioning of the integration itself, which includes the Agent Check, the configuration example file, and artifacts generated during the wheel build.

All additional elements, including the metadata files, are not meant to be contained within the wheel; they are used elsewhere by the Datadog platform and ecosystem.

To learn more about Python packaging, see Packaging Python Projects.
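As a hedged illustration of the kind of metadata involved (a sketch only; real integrations-core projects declare more fields and typically read the version from __about__.py rather than hard-coding it):

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "datadog-awesome"
version = "1.0.0"
dependencies = ["requests", "datadog-checks-base"]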
| - | |||
| - | Once your pyproject.toml is ready, create a wheel using one of the following options: | ||
| - | |||
| - | (Recommended) With the ddev tooling: ddev release build < | ||
| - | Without the ddev tooling: cd < | ||
| - | |||
Install the wheel

The wheel is installed using the Agent integration command, available in Agent v6.10.0 or later. Depending on your environment, you may need to execute this command as a specific user or with specific privileges:

Linux (as dd-agent):

sudo -u dd-agent datadog-agent integration install -w /path/to/<INTEGRATION_NAME>-<VERSION>-py3-none-any.whl

OSX (as admin):

sudo datadog-agent integration install -w /path/to/<INTEGRATION_NAME>-<VERSION>-py3-none-any.whl

Windows PowerShell (ensure that your shell session has administrator privileges):

Agent v6.11 or earlier:

& "C:\Program Files\Datadog\Datadog Agent\embedded\agent.exe" integration install -w /path/to/<INTEGRATION_NAME>-<VERSION>-py3-none-any.whl

Agent v6.12 or later:

& "C:\Program Files\Datadog\Datadog Agent\bin\agent.exe" integration install -w /path/to/<INTEGRATION_NAME>-<VERSION>-py3-none-any.whl
| - | |||
| - | For installing your wheel to test in Kubernetes environments: | ||
| - | |||
| - | Mount the .whl file into an initContainer. | ||
| - | Run the wheel install in the initContainer. | ||
| - | Mount the initContainer in the Agent container while it’s running. | ||
| - | |||

For customer install commands for both host and container environments, see Use Community Integrations.
Populate your tile and publish your integration

Once you have created your Agent-based integration, see Create a Tile for instructions on populating your tile's assets and publishing your integration.
Update your integration

To update your integration, keep the following in mind:

If you are editing or adding new integration code, a version bump is required.
If you are editing or adding new README content, manifest information, or assets such as dashboards and monitor templates, a version bump is not required.

After making updates to assets such as dashboards and monitor templates, or to non-code files such as README.md and manifest.json, submit the changes in a new pull request; no version bump is needed.
Bumping an integration version

In addition to any code changes, the following is required when bumping an integration version:

Update __about__.py to reflect the new version number. This file can be found in your integration's directory under datadog_checks/<INTEGRATION_NAME>/__about__.py.
Add an entry to the CHANGELOG.md file that adheres to the following format:

## Version Number / Date

***Added***:

* New feature
* New feature

***Fixed***:

* Bug fix
* Bug fix
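For illustration, a hypothetical 1.1.0 release (the version number and date are invented for this example) would set __version__ = '1.1.0' in __about__.py and add an entry like:

## 1.1.0 / 2025-02-14

***Added***:

* Add a configurable request timeout to the Awesome check

***Fixed***:

* Handle empty responses without raising an unhandled exception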
| - | |||
| - | Update all references to the version number mentioned in README.md and elsewhere. Installation instructions in README.md often include the version number, which needs to be updated. | ||