The main purpose of this repository is to allow testing of changes in the dataset-go library or the open-telemetry-contrib exporter before we publish them. This testing is usually done either by running the existing "benchmarks" from the scripts folder or by using them as inspiration for ad hoc testing.
If you are running the targets below on OS X, you also need the GNU commands (the coreutils package) installed and available in your $PATH. See https://formulae.brew.sh/formula/coreutils for more info.
This repository contains two sets of pre-built Docker images for the amd64 and arm64 architectures:
- A pre-built image with the latest version of the dataset exporter plugin and some other basic otel components - GitHub.
- A pre-built image with the latest development version of the dataset exporter plugin (the `datasetexporter-latest` branch from our fork) with all other upstream otel contrib components - Docker Hub.
The second image can act as a direct drop-in replacement for the upstream `otel/opentelemetry-collector-contrib` image (e.g. in an upstream otel collector helm chart or similar).
You can use the latest official docker image as well:
- Prepare the config for the collector:
  - Update `config.yaml` (a minimal example sketch is shown after this list)
  - See the documentation for the configuration options
- Run the image:
  - Run: `DATASET_URL=https://app.scalyr.com/ DATASET_API_KEY=FOO make exporter-docker-official-run`
- Verify it:
  - Run: `make push-logs MSG="nejneobhospodarovavatelnejsi"`
  - Search for the word "nejneobhospodarovavatelnejsi" in DataSet
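For reference, a minimal `config.yaml` wired for the dataset exporter might look like the sketch below. This is an illustrative, hedged example only: the exporter option names (`dataset_url`, `api_key`) and the use of environment variable expansion are assumptions based on the datasetexporter and collector documentation, so always check the linked documentation for the authoritative set of options.

```yaml
# Illustrative sketch only - verify the option names against the datasetexporter documentation.
receivers:
  otlp:
    protocols:
      grpc:
      http:

exporters:
  dataset:
    # Assumes the DATASET_URL and DATASET_API_KEY environment variables
    # (the same ones passed to the make targets) are visible to the collector.
    dataset_url: ${env:DATASET_URL}
    api_key: ${env:DATASET_API_KEY}

service:
  pipelines:
    logs:
      receivers: [otlp]
      exporters: [dataset]
```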
If you do not want to use a prebuilt image, you can build your own custom image.
The official documentation provides instructions on how to build a custom collector:
- https://opentelemetry.io/docs/collector/custom-collector/ - official documentation
- https://github.com/open-telemetry/opentelemetry-collector/tree/main/cmd/builder - builder
If you want to build your own collector, you can use the following instructions:
1. Install the builder:
   - Run: `go install go.opentelemetry.io/collector/cmd/builder@latest`
2. Prepare the builder config:
   - Example otelcol-builder.yaml (a hedged sketch is shown after these steps)
3. Generate the source code and build the collector:
   - Run: `make exporter-normal-build`, which runs `builder --config=otelcol-builder.yaml`
   - If you receive a "command not found" or similar error, this likely indicates that `$GOPATH/bin` is not in your `$PATH`. You can fix that by adding the contents below to your `~/.bash_profile` or `~/.zshrc` config:
     ```shell
     export GOPATH=/Users/$USER/go
     export PATH=$GOPATH/bin:$PATH
     ```
4. Prepare the config for the collector:
   - Example config.yaml (see the minimal sketch shown earlier)
   - See the documentation for the configuration options
5. Execute the collector:
   - Run: `DATASET_URL=https://app.scalyr.com/ DATASET_API_KEY=FOO otelcol-dataset/otelcol-dataset --config config.yaml`
6. Verify it:
   - Run: `make push-logs MSG="nejneobhospodarovavatelnejsi"`
   - Search for the word "nejneobhospodarovavatelnejsi" in DataSet
Alternatively, instead of steps 5 and 6, you may execute `make exporter-normal-run`.
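For reference, an `otelcol-builder.yaml` for the OpenTelemetry Collector Builder typically follows the structure sketched below. The `dist` values and module versions here are illustrative assumptions; the `otelcol-builder.yaml` checked into this repository is the source of truth.

```yaml
# Illustrative sketch only - the real otelcol-builder.yaml in this repository is authoritative.
dist:
  name: otelcol-dataset
  output_path: ./otelcol-dataset

receivers:
  - gomod: go.opentelemetry.io/collector/receiver/otlpreceiver v0.104.0

processors:
  - gomod: go.opentelemetry.io/collector/processor/batchprocessor v0.104.0

exporters:
  # The dataset exporter module from opentelemetry-collector-contrib;
  # the version shown is only an example.
  - gomod: github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datasetexporter v0.104.0
```

Running `builder --config=otelcol-builder.yaml` (or `make exporter-normal-build`) then generates the collector sources and produces the `otelcol-dataset/otelcol-dataset` binary used in the steps above.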
You can use the prepared make targets:
- Build the image using the Dockerfile:
  - Run: `make exporter-docker-build`
- Prepare the config for the collector:
  - Update `config.yaml`
  - See the documentation for the configuration options
- Run the image:
  - Run: `DATASET_URL=https://app.scalyr.com/ DATASET_API_KEY=FOO make exporter-docker-run`
- Verify it:
  - Run: `make push-logs MSG="nejneobhospodarovavatelnejsi"`
  - Search for the word "nejneobhospodarovavatelnejsi" in DataSet
To push some sample data to the collector via the OTLP protocol, you may use the following tasks:
- `make push-logs` - to push logs with the specified message
- `make push-linked-trace-with-logs` - to push traces that are linked with logs

For the configuration options, you should check the documentation.
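If you want to see what these tasks do at the protocol level, you can also push a log record to the collector's OTLP/HTTP endpoint directly. The sketch below assumes the OTLP receiver is enabled with the HTTP protocol on the default port 4318 and that the collector runs locally; the endpoint and the attribute values are illustrative, so adjust them to match your config.yaml.

```shell
# Send a single OTLP/HTTP JSON log record to a locally running collector
# (assumes the otlp receiver with the http protocol on the default port 4318).
curl -X POST http://localhost:4318/v1/logs \
  -H "Content-Type: application/json" \
  -d '{
    "resourceLogs": [{
      "resource": {
        "attributes": [{"key": "service.name", "value": {"stringValue": "manual-test"}}]
      },
      "scopeLogs": [{
        "logRecords": [{
          "severityText": "INFO",
          "body": {"stringValue": "nejneobhospodarovavatelnejsi"}
        }]
      }]
    }]
  }'
```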
- Update all the repository subtrees to the latest version: `make subtrees-pull`
- Build all images: `make docker-build`
- Run the e2e tests: `make test-e2e`
- Update https://github.com/scalyr/opentelemetry-collector-contrib:
  ```shell
  cd ../opentelemetry-collector-contrib
  # checkout main branch
  git checkout main
  # sync with upstream
  gh repo sync scalyr/opentelemetry-collector-contrib -b main
  # pull changes
  git pull
  # update dataset-latest branch
  git checkout dataset-latest
  # merge main
  git pull
  git merge main
  # push changes
  git push
  ```
  - Check in the UI that the branches are not behind
  - If they are behind, use the Sync fork button in the UI to sync them
- Create a new branch for the new version: `git checkout -b DPDV-6415-update-packages`
- Pull all subtrees: `make subtrees-pull`
- Push the changes: `git push`
- Update the builder config: `./scripts/update-builder-config.sh`
- Update the Docker files and configurations to use the new version of the collector:
  - the old version is taken from otelcol-builder.yaml
  - the new version is the one from opentelemetry-collector-contrib
  - Run: `./scripts/update-otel-version.sh -f v0.101.0 -t v0.104.0`
- Update the collector configuration to use the new version of dataset-go:
  - the old version is taken from otelcol-builder.yaml
  - the new version is the one from dataset-go
  - Run: `./scripts/update-dataset-go-version.sh -f v0.18.0 -t v0.20.0`
- Build all Docker images: `make docker-build`
- Run the e2e tests:
  - Set the environment variables for the tests:
    ```shell
    export TEST_RUN_SERVERHOST=`date +"%s"`
    export DATASET_URL=https://app.scalyr.com/
    export DATASET_API_KEY=FOO
    ```
  - Run the tests: `make test-e2e`
Once you are familiar with building the collector binary, building the Docker image, and executing the e2e tests from the scripts folder, you can use this repository to test changes from your development branches in the dataset-go library or the open-telemetry-contrib exporter.
The workflow is:
- Checkout your development branch in a subtree (e.g. the `./dataset-go` or `./scalyr-opentelemetry-collector-contrib` directory)
- Build the collector
- Run the collector with your changes
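As a concrete sketch of that loop (the branch name and message below are placeholders; the make targets are the ones described earlier in this document):

```shell
# Check out your development branch in the dataset-go subtree
# (the branch name is a placeholder).
cd dataset-go
git checkout my-feature-branch
cd ..

# Build the collector binary with your changes
# (or use `make docker-build` to build the Docker images instead).
make exporter-normal-build

# Run the collector with your changes.
DATASET_URL=https://app.scalyr.com/ DATASET_API_KEY=FOO make exporter-normal-run

# In a second terminal, push a test message and verify it arrives in DataSet.
make push-logs MSG="my-test-message"
```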
We run benchmarks as part of every pull request and main branch run. Results are available in a nicely formatted Markdown format as part of the job summary - example run.
The meaning of the columns is described in the documentation.
In addition to that, we also run benchmarks on every main branch push / merge and store those results in a dedicated benchmark-results-dont-push-manually branch. This branch is an orphan and is only meant to store benchmark results (only CI/CD should push to this branch).
Results are stored in a newline-delimited JSON format (one serialized JSON document per line).
For information on how to report security vulnerabilities, please see SECURITY.md.
Copyright 2023 SentinelOne, Inc.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this work except in compliance with the License. You may obtain a copy of the License in the LICENSE file, or at:
http://www.apache.org/licenses/LICENSE-2.0
By contributing you agree that these contributions are your own (or approved by your employer) and you grant a full, complete, irrevocable copyright license to all users and developers of the project, present and future, pursuant to the license of the project.