
Repository for the DataSet OpenTelemetry Collector Exporter plugin and related build infrastructure (CI/CD for automated end-to-end tests, continuous benchmarks, Docker images).

scalyr/opentelemetry-exporter-dataset

DataSet OpenTelemetry Collector Exporter Build Infrastructure

The main purpose of this repository is to allow testing of changes in the dataset-go library or the open-telemetry-contrib exporter before we publish them. This testing is usually done by either running the existing "benchmarks" from the scripts folder or using them as inspiration for ad hoc testing.

Dependencies

If you are running the targets below on macOS, you also need the GNU versions of common commands (the coreutils package) installed and in your $PATH. See https://formulae.brew.sh/formula/coreutils for more information.

Prebuilt Docker Images

This repository contains two sets of pre-built Docker images for the amd64 and arm64 architectures:

  1. Pre-built image with the latest version of the dataset exporter plugin and some other basic otel components - GitHub.
  2. Pre-built image with the latest development version of the dataset exporter plugin (datasetexporter-latest branch from our fork) with all other upstream otel contrib components - Docker Hub.

The second image can act as a direct drop-in replacement for the upstream otel/opentelemetry-collector-contrib image (e.g. in an upstream otel collector helm chart or similar).
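For example, a minimal values override for the upstream opentelemetry-collector helm chart might look like the sketch below; the repository and tag shown are placeholders, not the actual published image coordinates:

```yaml
# Hypothetical values.yaml override for the upstream
# opentelemetry-collector helm chart; repository and tag are placeholders.
image:
  repository: scalyr/opentelemetry-collector-contrib
  tag: latest
```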

Official Docker Image

You can also use the latest official Docker image:

  1. Prepare config for collector:
  2. Run image:
    • Run: DATASET_URL=https://app.scalyr.com/ DATASET_API_KEY=FOO make exporter-docker-official-run
  3. Verify it:
    • Run: make push-logs MSG="nejneobhospodarovavatelnejsi"
    • Search for the word "nejneobhospodarovavatelnejsi" in DataSet
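A minimal collector config for step 1 might look like the sketch below. The OTLP receiver endpoint and the dataset exporter keys (dataset_url, api_key) are assumptions based on the standard component names, not the repository's actual config file:

```yaml
# Hypothetical minimal config.yaml; values are placeholders.
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
exporters:
  dataset:
    dataset_url: ${DATASET_URL}
    api_key: ${DATASET_API_KEY}
service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [dataset]
```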

Build Collector

If you do not want to use a prebuilt image, you can build your own custom image.

The official documentation provides instructions for building your own custom image:

Without Docker

If you want to build your own collector, you can use the following instructions:

  1. Install builder:
    • Run go install go.opentelemetry.io/collector/cmd/builder@latest
  2. Prepare builder config:
  3. Generate source code:
    • Run make exporter-normal-build
      • which runs builder --config=otelcol-builder.yaml
    • If you receive a "command not found" or similar error, it likely means $GOPATH/bin is not in your search $PATH. You can fix that by adding the lines below to your ~/.bash_profile or ~/.zshrc config:
      • export GOPATH=/Users/$USER/go
        export PATH=$GOPATH/bin:$PATH
  4. Prepare config for collector:
  5. Execute collector:
    • Run: DATASET_URL=https://app.scalyr.com/ DATASET_API_KEY=FOO otelcol-dataset/otelcol-dataset --config config.yaml
  6. Verify it:
    • Run: make push-logs MSG="nejneobhospodarovavatelnejsi"
    • Search for the word "nejneobhospodarovavatelnejsi" in DataSet

Alternatively, instead of steps 5 and 6, you may execute make exporter-normal-run.
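As a sketch of what the builder config in step 2 might contain: the module versions below are placeholders and the component list is an assumption for illustration, not the repository's actual otelcol-builder.yaml:

```yaml
# Hypothetical otelcol-builder.yaml; versions must match the collector
# release you are building against.
dist:
  name: otelcol-dataset
  output_path: ./otelcol-dataset
receivers:
  - gomod: go.opentelemetry.io/collector/receiver/otlpreceiver v0.104.0
exporters:
  - gomod: github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datasetexporter v0.104.0
```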

With Docker

You can use prepared make targets:

  1. Build image using Dockerfile:
    • Run: make exporter-docker-build
  2. Prepare config for collector:
  3. Run image:
    • Run: DATASET_URL=https://app.scalyr.com/ DATASET_API_KEY=FOO make exporter-docker-run
  4. Verify it:
    • Run: make push-logs MSG="nejneobhospodarovavatelnejsi"
    • Search for the word "nejneobhospodarovavatelnejsi" in DataSet

Sample Data

To push some sample data to the collector via the OTLP protocol, you may use the following make targets:

  • make push-logs - pushes logs with the specified message
  • make push-linked-trace-with-logs - pushes traces that are linked with logs

Configure

For the available configuration options, check the documentation.

Development

  1. Update all the repository subtrees to the latest version:
    • make subtrees-pull
  2. Build all images:
    • make docker-build
  3. Run e2e test:
    • make test-e2e

How To Upgrade

  1. Update https://github.com/scalyr/opentelemetry-collector-contrib

     ```shell
     cd ../opentelemetry-collector-contrib
     # check out the main branch
     git checkout main
     # sync the fork with upstream
     gh repo sync scalyr/opentelemetry-collector-contrib -b main
     # pull the changes
     git pull
     # update the dataset-latest branch
     git checkout dataset-latest
     git pull
     # merge main and push the changes
     git merge main
     git push
     ```

  2. Create a new branch for the new version:

     ```shell
     git checkout -b DPDV-6415-update-packages
     ```

  3. Pull all subtrees:

     ```shell
     make subtrees-pull
     ```

  4. Push the changes:

     ```shell
     git push
     ```

  5. Update the builder config:

     ```shell
     ./scripts/update-builder-config.sh
     ```

  6. Update the Docker files and configurations to use the new version of the collector:

     ```shell
     ./scripts/update-otel-version.sh -f v0.101.0 -t v0.104.0
     ```

  7. Update the collector configuration to use the new version of dataset-go:

     ```shell
     ./scripts/update-dataset-go-version.sh -f v0.18.0 -t v0.20.0
     ```

  8. Build all Docker images:

     ```shell
     make docker-build
     ```

  9. Run the e2e tests:

     ```shell
     # set environment variables for the tests
     export TEST_RUN_SERVERHOST=`date +"%s"`
     export DATASET_URL=https://app.scalyr.com/
     export DATASET_API_KEY=FOO
     # run the tests
     make test-e2e
     ```

Testing Changes Locally

Once you are familiar with building the collector binary, building the Docker image, and executing e2e tests from the scripts folder, you can use this repository to test changes from your development branches in the dataset-go library or the open-telemetry-contrib exporter.

The workflow is:

  1. Checkout your development branch in a subtree (e.g. ./dataset-go or ./scalyr-opentelemetry-collector-contrib directory)
  2. Build collector
  3. Run collector with your changes

Continuous Integration and Delivery (CI/CD)

Benchmarks

We run benchmarks as part of every pull request and every main branch run. Results are available as nicely formatted Markdown in the job summary - example run.

The meaning of the individual columns is described in the documentation.

In addition to that, we also run benchmarks on every main branch push / merge and store those results in a dedicated benchmark-results-dont-push-manually branch. This branch is an orphan and is only meant to store benchmark results (only CI/CD should push to this branch).

Results are stored in a newline-delimited JSON format (one serialized JSON object per line).
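As an illustration of newline-delimited JSON (the field names below are hypothetical, not taken from the actual benchmark results):

```json
{"benchmark": "e2e-logs", "commit": "abc123", "throughput_eps": 12000}
{"benchmark": "e2e-traces", "commit": "abc123", "throughput_eps": 9500}
```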

Security

For information on how to report security vulnerabilities, please see SECURITY.md.

Copyright, License, and Contributors Agreement

Copyright 2023 SentinelOne, Inc.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this work except in compliance with the License. You may obtain a copy of the License in the LICENSE file, or at:

http://www.apache.org/licenses/LICENSE-2.0

By contributing you agree that these contributions are your own (or approved by your employer) and you grant a full, complete, irrevocable copyright license to all users and developers of the project, present and future, pursuant to the license of the project.
