If you want to package the project into a single jar, you can do so by running the following command:
```
sbt assembly
```
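Once built, the assembly jar can be put on Spark's classpath. As a rough sketch (the exact jar name and output path depend on your Scala version and build settings, so treat them as placeholders):
```
spark-shell --jars target/scala-2.12/<your-assembly-jar>.jar
```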
To execute the unit tests, run the following command:
```
sbt test
```
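To scope the unit tests to a single sub-project (for example the `pplSparkIntegration` project referenced below; substitute whichever module you are working on):
```
sbt "project pplSparkIntegration" test
```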
To run a specific unit test in SBT, use the `testOnly` command with the full path of the test class:
```
sbt "; project pplSparkIntegration; test:testOnly org.opensearch.flint.spark.ppl.PPLLogicalPlanTrendlineCommandTranslatorTestSuite"
```
The integration tests are defined in the `integration` directory of the project. They automatically trigger the unit tests and will only run if all unit tests pass. If you want to run the integration tests for the project, you can do so by running the following command:
```
sbt integtest/integration
```
If you get integration test failures with the error message "Previous attempts to find a Docker environment failed" on macOS, fix the issue by following this checklist:
1. Check you've installed Docker in your dev host. If not, install Docker first.
2. Check if the file `/var/run/docker.sock` exists. If not, go to step 3.
3. Run `sudo ln -s $HOME/.docker/desktop/docker.sock /var/run/docker.sock` or `sudo ln -s $HOME/.docker/run/docker.sock /var/run/docker.sock`.
4. If you use Docker Desktop, as an alternative to step 3, check the "Allow the default Docker socket to be used (requires password)" option in the advanced settings of Docker Desktop.
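Once the socket is in place, a quick sanity check with standard Docker CLI commands should succeed before re-running the integration tests:
```
ls -l /var/run/docker.sock   # should now resolve to the Docker Desktop socket
docker info                  # should print server details without errors
```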
Running only a selected set of integration test suites is possible with the following command:
sbt "; project integtest; it:testOnly org.opensearch.flint.spark.ppl.FlintSparkPPLTrendlineITSuite"
This command runs only the specified test suite within the integtest submodule.
The `aws-integration` folder contains tests for cloud service providers. For instance, to test against an AWS OpenSearch domain, configure the following settings. The client will use the default credential provider to access the AWS OpenSearch domain.
```
export AWS_OPENSEARCH_HOST=search-xxx.us-west-2.on.aws
export AWS_OPENSEARCH_SERVERLESS_HOST=xxx.us-west-2.aoss.amazonaws.com
export AWS_REGION=us-west-2
export AWS_EMRS_APPID=xxx
export AWS_EMRS_EXECUTION_ROLE=xxx
export AWS_S3_CODE_BUCKET=xxx
export AWS_S3_CODE_PREFIX=xxx
export AWS_OPENSEARCH_RESULT_INDEX=query_execution_result_glue
export AWS_OPENSEARCH_REQUEST_INDEX=.query_execution_request_glue
```
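Since the tests rely on the default AWS credential provider chain, it can help to confirm that your local credentials resolve correctly before running them. With the AWS CLI installed (an assumption, not a requirement of the build), this is a quick check:
```
aws sts get-caller-identity
```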
Then run the following command:
```
sbt integtest/awsIntegration
```
The output should look similar to the following:
```
[info] AWSOpenSearchAccessTestSuite:
[info] - should Create Pit on AWS OpenSearch
[info] Run completed in 3 seconds, 116 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
```
The easiest way to ensure the code passes linting is with pre-commit.
Using any recent Python version:
```
$ pip install pre-commit
$ pre-commit install
```
This will automatically fix formatting and check for other Scala issues on every push.
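You can also invoke the same checks on demand across the whole repository rather than waiting for a hook to fire:
```
$ pre-commit run --all-files
```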
For Scala code, Flint uses the Spark scalastyle configuration, with scalafmt for formatting. To automatically format all Scala files:
```
$ sbt scalafmtAll
```
Alternatively, to only run the style checks without reformatting anything:
```
$ sbt scalastyle
```
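If you only want to verify formatting (for example in CI) without touching the working tree, sbt-scalafmt also provides a check task:
```
$ sbt scalafmtCheckAll
```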
For IntelliJ users, read more in the scalafmt IntelliJ documentation to integrate scalafmt with IntelliJ.