diff --git a/.github/ISSUE_TEMPLATE/adoption_request.md b/.github/ISSUE_TEMPLATE/adoption_request.md
deleted file mode 100644
index 4716a10b3f2..00000000000
--- a/.github/ISSUE_TEMPLATE/adoption_request.md
+++ /dev/null
@@ -1,58 +0,0 @@
----
-name: Adoption Request
-about: Submit your feature to the project
-title: 'Adoption Request'
-labels: 'adoption'
-assignees: ''
-
----
-
-# Adoption Request
-
-_Thank you for wanting to contribute to the project! We are very happy to see the functionalities of the EDC being extended. Providing your feature as open source is a great opportunity for others with similar requirements and helps avoid duplicated work._
-
-_For any details about the guidelines for submitting features, please take a look at the [contribution categories](https://github.com/eclipse-edc/Connector/blob/main/contribution_categories.md)._
-
-
-## General information
-
-Please provide some information about your project or code contribution.
-
-_If you choose to be referenced as a "friend", this information will be added to the [known friends list](https://github.com/eclipse-edc/Connector/blob/main/known_friends.md)._
-_If you choose to add your feature as a core EDC component, links to your current code and correlated issues, discussions, and pull requests are of great importance._
-
-| Title | Description | Contact | Links
-| :--- | :--- | :--- | :---
-| _My awesome project_ | _This is an example._ | _e-mail address_ | _link to repository, homepage, discussion, etc._
-
-
-## Adoption level
-
-Next, please tell us what level of adoption you intend to provide. _(pick only one)_
-
-- [ ] Reference a feature as "friend"
-- [ ] Incorporate a feature as a core EDC component
-
-
-
-## Adoption in EDC core
-
-_If you chose to add your feature as a core EDC component, please answer the following questions._
-
-### Why should this contribution be adopted?
-_Please argue why this feature must be hosted upstream and be maintained by the EDC core team._
-
-### Could it be achieved with existing functionality? If not, why?
-_If there is any existing code that can achieve the same thing with little modification, that is usually the preferable way for the EDC core team. We aim to keep the code succinct and want to avoid similar/duplicate code. Make sure you understand the EDC code base well!_
-
-### Are there multiple use cases or applications that will benefit from the contribution?
-_Basically, we want you to motivate who will use that feature and why, thereby arguing that it is well suited for adoption in the core code base. One-off features are better suited to be maintained externally._
-
-### Can it be achieved without introducing third-party dependencies? If not, which ones?
-_EDC is a platform rather than an application; therefore, we are extremely careful when it comes to introducing third-party libraries. The reasons are diverse: security, license issues and overall JAR weight, just to mention a few important ones._
-
-### Would this feature limit platform independence in any way? If so, how and why?
-_Features that do not work well in clustered environments are difficult to adopt, since EDC is designed from the ground up to be stateless and clusterable. Similarly, features that depend on certain operating systems are difficult to justify._
-
-### Is it going to be a self-contained feature, or would it cut across the entire code base?
-_Features that have a large impact on the code base are very complex to thoroughly test; they have a high chance of destabilizing the code and require careful inspection. Self-contained features, on the other hand, are easier to isolate and test._
diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md
deleted file mode 100644
index 2645366a206..00000000000
--- a/.github/ISSUE_TEMPLATE/bug_report.md
+++ /dev/null
@@ -1,47 +0,0 @@
----
-name: Bug Report
-about: Create a report to help us improve
-title: ''
-labels: [ "bug_report", "triage" ]
-assignees: ''
-
----
-
-# Bug Report
-
-## Describe the Bug
-
-_A clear and concise description of the bug._
-
-### Expected Behavior
-
-_A clear and concise description of what you expected to happen._
-
-### Observed Behavior
-
-_A clear and concise description of what happened instead._
-
-## Steps to Reproduce
-
-Steps to reproduce the behavior:
-
-1. Go to '...'
-2. Click on '....'
-3. Scroll down to '....'
-4. See error
-
-## Context Information
-
-_Add any other context about the problem here._
-
-- Used version [e.g. EDC v1.0.0]
-- OS: [e.g. iOS, Windows]
-- ...
-
-## Detailed Description
-
-_If applicable, add screenshots and logs to help explain your problem._
-
-## Possible Implementation
-
-_If you already know the root cause of the erroneous state and how to fix it, feel free to share your thoughts._
diff --git a/.github/ISSUE_TEMPLATE/config.yml b/.github/ISSUE_TEMPLATE/config.yml
deleted file mode 100644
index d92a074e9c4..00000000000
--- a/.github/ISSUE_TEMPLATE/config.yml
+++ /dev/null
@@ -1,9 +0,0 @@
----
-blank_issues_enabled: false
-contact_links:
-  - name: Ask a question or get support
-    url: https://github.com/eclipse-edc/Connector/discussions/categories/q-a
-    about: Ask a question or request support for using the Eclipse Dataspace Connector
-  - name: Take a look at the documentation
-    url: https://github.com/eclipse-edc/Connector/tree/main/docs
-    about: Browse the documentation for more information
diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md
deleted file mode 100644
index 24f2fb43bde..00000000000
--- a/.github/ISSUE_TEMPLATE/feature_request.md
+++ /dev/null
@@ -1,25 +0,0 @@
----
-name: Feature Request
-about: Help us with new ideas
-title: ''
-labels: [ "feature_request", "triage" ]
-assignees: ''
-
----
-
-# Feature Request
-
-_If you are missing a feature or have an idea for improving this project that should first be discussed, please feel
-free to open up a [discussion](https://github.com/eclipse-edc/Connector/discussions/categories/ideas)._
-
-## Which Areas Would Be Affected?
-
-_e.g., DPF, CI, build, transfer, etc._
-
-## Why Is the Feature Desired?
-
-_Are there any requirements?_
-
-## Solution Proposal
-
-_If possible, provide a (brief!) solution proposal._
diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md
deleted file mode 100644
index 2b33d8890c1..00000000000
--- a/.github/PULL_REQUEST_TEMPLATE.md
+++ /dev/null
@@ -1,18 +0,0 @@
-## What this PR changes/adds
-
-_Briefly describe WHAT your PR changes, which features it adds/modifies._
-
-## Why it does that
-
-_Briefly state why the change was necessary._
-
-## Further notes
-
-_List other areas of code that have changed but are not necessarily linked to the main feature.
This could be method -signature changes, package declarations, bugs that were encountered and were fixed inline, etc._ - -## Linked Issue(s) - -Closes # <-- _insert Issue number if one exists_ - -_Please be sure to take a look at the [contributing guidelines](https://github.com/eclipse-edc/Connector/blob/main/CONTRIBUTING.md#submit-a-pull-request) and our [etiquette for pull requests](https://github.com/eclipse-edc/Connector/blob/main/pr_etiquette.md)._ diff --git a/.github/workflows/close-inactive-issues.yml b/.github/workflows/close-inactive-issues.yml deleted file mode 100644 index 51b51af323a..00000000000 --- a/.github/workflows/close-inactive-issues.yml +++ /dev/null @@ -1,88 +0,0 @@ -name: Close Inactive Issues -on: - schedule: - - cron: "30 1 * * *" # once a day (1:30 UTC) - workflow_dispatch: # allow manual trigger - -jobs: - close-issues-in-triage: - runs-on: ubuntu-latest - permissions: - issues: write - steps: - - uses: actions/stale@v8 - with: - operations-per-run: 1000 - days-before-issue-stale: 28 - days-before-issue-close: 14 - stale-issue-label: "stale" - stale-issue-message: "This issue is stale because it has been open for 28 days with no activity." - close-issue-message: "This issue was closed because it has been inactive for 14 days since being marked as stale." - close-issue-reason: 'not_planned' - days-before-pr-stale: -1 # ignore PRs (overwrite default days-before-stale) - days-before-pr-close: -1 # ignore PRs (overwrite default days-before-close) - remove-issue-stale-when-updated: true - exempt-all-issue-milestones: false # issues with assigned milestones will be ignored - only-labels: 'triage' - repo-token: ${{ secrets.GITHUB_TOKEN }} - - close-issues-with-assignee: - runs-on: ubuntu-latest - permissions: - issues: write - steps: - - uses: actions/stale@v8 - with: - operations-per-run: 1000 - days-before-issue-stale: 28 - days-before-issue-close: 7 - stale-issue-label: "stale" - stale-issue-message: "This issue is stale because it has been open for 28 days with no activity." - close-issue-message: "This issue was closed because it has been inactive for 7 days since being marked as stale." - close-issue-reason: 'not_planned' - days-before-pr-stale: -1 # ignore PRs (overwrite default days-before-stale) - days-before-pr-close: -1 # ignore PRs (overwrite default days-before-close) - remove-issue-stale-when-updated: true - exempt-all-issue-milestones: true # issues with assigned milestones will be ignored - exempt-issue-labels: bug # ignore issues labelled as bug - repo-token: ${{ secrets.GITHUB_TOKEN }} - - close-issues-without-assignee: - runs-on: ubuntu-latest - permissions: - issues: write - steps: - - uses: actions/stale@v8 - with: - operations-per-run: 1000 - days-before-issue-stale: 14 - days-before-issue-close: 7 - stale-issue-label: "stale" - stale-issue-message: "This issue is stale because it has been open for 14 days with no activity." - close-issue-message: "This issue was closed because it has been inactive for 7 days since being marked as stale." 
- close-issue-reason: 'not_planned' - days-before-pr-stale: -1 # ignore PRs (overwrite default days-before-stale) - days-before-pr-close: -1 # ignore PRs (overwrite default days-before-close) - remove-issue-stale-when-updated: true - exempt-all-issue-milestones: true # issues with assigned milestones will be ignored - exempt-all-issue-assignees: true # issues with assignees will be ignored - exempt-issue-labels: bug # ignore issues labelled as bug - repo-token: ${{ secrets.GITHUB_TOKEN }} - - close-inactive-pull-requests: - runs-on: ubuntu-latest - permissions: - pull-requests: write - steps: - - uses: actions/stale@v8 - with: - operations-per-run: 1000 - days-before-issue-stale: -1 # ignore issues (overwrite default days-before-stale) - days-before-issue-close: -1 # ignore issues (overwrite default days-before-close) - stale-pr-label: "stale" - stale-pr-message: "This pull request is stale because it has been open for 7 days with no activity." - close-pr-message: "This pull request was closed because it has been inactive for 7 days since being marked as stale." - days-before-pr-stale: 7 - days-before-pr-close: 7 - remove-pr-stale-when-updated: true - repo-token: ${{ secrets.GITHUB_TOKEN }} diff --git a/.github/workflows/discord-webhook.yml b/.github/workflows/discord-webhook.yml index a9e9dde5772..465f2b3cb9c 100644 --- a/.github/workflows/discord-webhook.yml +++ b/.github/workflows/discord-webhook.yml @@ -8,44 +8,19 @@ on: types: [ created ] jobs: - message: - runs-on: ubuntu-latest - steps: - - name: New Discussion - uses: tsickert/discord-webhook@v5.3.0 - if: ${{ (github.event_name == 'discussion') }} - with: - webhook-url: ${{ secrets.DISCORD_WEBHOOK_GITHUB }} - avatar-url: https://avatars.githubusercontent.com/u/9919?s=200&v=4 - embed-author-name: ${{ github.event.sender.login }} - embed-author-url: ${{ github.event.sender.html_url }} - embed-author-icon-url: ${{ github.event.sender.avatar_url }} - embed-title: ${{ github.event.discussion.title }} - embed-url: ${{ github.event.discussion.html_url }} - embed-description: A **discussion** has been created in ${{ github.repository }}. - - - name: New Issue - uses: tsickert/discord-webhook@v5.3.0 - if: ${{ (github.event_name == 'issues') }} - with: - webhook-url: ${{ secrets.DISCORD_WEBHOOK_GITHUB }} - avatar-url: https://avatars.githubusercontent.com/u/9919?s=200&v=4 - embed-author-name: ${{ github.event.sender.login }} - embed-author-url: ${{ github.event.sender.html_url }} - embed-author-icon-url: ${{ github.event.sender.avatar_url }} - embed-title: ${{ github.event.issue.title }} - embed-url: ${{ github.event.issue.html_url }} - embed-description: An **issue** has been opened in ${{ github.repository }}. - - - name: New Pull Request - uses: tsickert/discord-webhook@v5.3.0 - if: ${{ (github.event_name == 'pull_request_target') }} - with: - webhook-url: ${{ secrets.DISCORD_WEBHOOK_GITHUB }} - avatar-url: https://avatars.githubusercontent.com/u/9919?s=200&v=4 - embed-author-name: ${{ github.event.sender.login }} - embed-author-url: ${{ github.event.sender.html_url }} - embed-author-icon-url: ${{ github.event.sender.avatar_url }} - embed-title: ${{ github.event.pull_request.title }} - embed-url: ${{ github.event.pull_request.html_url }} - embed-description: A **pull request** has been opened in ${{ github.repository }}. 
+ trigger-workflow: + uses: eclipse-edc/.github/.github/workflows/discord-webhook.yml@main + with: + event_discussion_html_url: ${{ github.event.discussion.html_url }} + event_discussion_title: ${{ github.event.discussion.title }} + event_issue_html_url: ${{ github.event.issue.html_url }} + event_issue_title: ${{ github.event.issue.title }} + event_name: ${{ github.event_name }} + event_pull_request_html_url: ${{ github.event.pull_request.html_url }} + event_pull_request_title: ${{ github.event.pull_request.title }} + event_sender_avatar_url: ${{ github.event.sender.avatar_url }} + event_sender_html_url: ${{ github.event.sender.html_url }} + event_sender_login: ${{ github.event.sender.login }} + repository_name: ${{ github.repository }} + secrets: + env_discord: ${{ secrets.DISCORD_WEBHOOK_GITHUB }} \ No newline at end of file diff --git a/.github/workflows/first-interaction.yml b/.github/workflows/first-interaction.yml index ef1a25be42a..b1daff14f56 100644 --- a/.github/workflows/first-interaction.yml +++ b/.github/workflows/first-interaction.yml @@ -7,15 +7,7 @@ on: types: [ opened ] jobs: - add-comment: - runs-on: ubuntu-latest - steps: - - uses: actions/first-interaction@v1 - with: - repo-token: ${{ secrets.GITHUB_TOKEN }} - issue-message: 'Thanks for your contribution :fire: We will take a look asap :rocket:' - pr-message: >- - We are always happy to welcome new contributors :heart: To make things easier for everyone, please - make sure to follow our [contribution guidelines](https://github.com/eclipse-edc/Connector/blob/main/CONTRIBUTING.md), - check if you have already signed the [ECA](http://www.eclipse.org/legal/ecafaq.php), and - relate this pull request to an existing issue or discussion. + trigger-workflow: + uses: eclipse-edc/.github/.github/workflows/first-interaction.yml@main + secrets: + envGH: ${{ secrets.GITHUB_TOKEN }} \ No newline at end of file diff --git a/.github/workflows/scan-pull-request.yaml b/.github/workflows/scan-pull-request.yaml index 2346035243d..5ad1602dfbe 100644 --- a/.github/workflows/scan-pull-request.yaml +++ b/.github/workflows/scan-pull-request.yaml @@ -10,33 +10,7 @@ concurrency: cancel-in-progress: true jobs: - check-pull-request-title: - runs-on: ubuntu-latest - continue-on-error: false - steps: - - uses: deepakputhraya/action-pr-title@master - with: - # Match pull request titles conventional commit syntax (https://www.conventionalcommits.org/en/v1.0.0/) - # (online tool for regex quick check: https://regex101.com/r/V5J8kh/1) - # - # Valid examples would be - # - fix: resolve minor issue - # - docs(Sample5): update docs for configuration - # - feat(management-api)!: change path to access contract agreements - # - # Invalid examples would be - # - Add cool feature - # - Feature/some cool improvement - # - fix: resolve minor issue. 
-        regex: '^(build|chore|ci|docs|feat|fix|perf|refactor|revert|style|test)(\(\w+((,|\/|\\)?\s?\w+)+\))?!?: [\S ]{1,80}[^\.]$'
-        allowed_prefixes: 'build,chore,ci,docs,feat,fix,perf,refactor,revert,style,test'
-        prefix_case_sensitive: true
-
-  check-for-assigned-labels:
-    runs-on: ubuntu-latest
-    continue-on-error: false
-    steps:
-      - uses: agilepathway/label-checker@v1.4.30
-        with:
-          any_of: api,bug,build,dependencies,documentation,enhancement,no-changelog,refactoring
-          repo_token: ${{ secrets.GITHUB_TOKEN }}
+  trigger-workflow:
+    uses: eclipse-edc/.github/.github/workflows/scan-pull-request.yml@main
+    secrets:
+      envGH: ${{ secrets.GITHUB_TOKEN }}
\ No newline at end of file
diff --git a/.github/workflows/stale-bot.yml b/.github/workflows/stale-bot.yml
new file mode 100644
index 00000000000..28f72936647
--- /dev/null
+++ b/.github/workflows/stale-bot.yml
@@ -0,0 +1,12 @@
+name: Close Inactive Issues
+
+on:
+  schedule:
+    - cron: "30 1 * * *" # once a day (1:30 UTC)
+  workflow_dispatch: # allow manual trigger
+
+jobs:
+  trigger-workflow:
+    uses: eclipse-edc/.github/.github/workflows/stale-bot.yml@main
+    secrets:
+      envGH: ${{ secrets.GITHUB_TOKEN }}
\ No newline at end of file
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
deleted file mode 100644
index 1d7b293047c..00000000000
--- a/CONTRIBUTING.md
+++ /dev/null
@@ -1,231 +0,0 @@
-# Contributing to the Project
-
-Thank you for your interest in contributing to
-the [Eclipse Dataspace Connector](https://projects.eclipse.org/projects/technology.edc)!
-
-## Table of Contents
-
-* [Code Of Conduct](#code-of-conduct)
-* [Eclipse Contributor Agreement](#eclipse-contributor-agreement)
-* [How to Contribute](#how-to-contribute)
-  * [Discuss](#discuss)
-  * [Create an Issue](#create-an-issue)
-  * [Submit a Pull Request](#submit-a-pull-request)
-  * [Report on Flaky Tests](#report-on-flaky-tests)
-* [Project and Milestone Planning](#project-and-milestone-planning)
-  * [Milestones](#milestones)
-  * [Projects](#projects)
-  * [Releases](#releases)
-* [Contact Us](#contact-us)
-
-## Code Of Conduct
-
-See the [Eclipse Code Of Conduct](https://www.eclipse.org/org/documents/Community_Code_of_Conduct.php).
-
-## Eclipse Contributor Agreement
-
-Before your contribution can be accepted by the project, you need to create and electronically sign
-an [Eclipse Contributor Agreement (ECA)](http://www.eclipse.org/legal/ecafaq.php):
-
-1. Log in to the [Eclipse Foundation website](https://accounts.eclipse.org/user/login/). You will
-   need to create an account within the Eclipse Foundation if you have not already done so.
-2. Click on "Eclipse ECA", and complete the form.
-
-Be sure to use the same email address in your Eclipse Account that you intend to use when you commit
-to GitHub.
-
-## How to Contribute
-
-### Discuss
-
-If you want to share an idea to further enhance
-the project or discuss potential use cases, please feel free to create a discussion at the
-[GitHub Discussions page](https://github.com/eclipse-edc/Connector/discussions).
-If you feel there is a bug or an issue, contribute to the discussions in
-[existing issues](https://github.com/eclipse-edc/Connector/issues?q=is%3Aissue+is%3Aopen),
-otherwise [create a new issue](#create-an-issue).
-
-### Create an Issue
-
-If you have identified a bug or want to formulate a working item that you want to concentrate on,
-feel free to create a new issue at our project's corresponding
-[GitHub Issues page](https://github.com/eclipse-edc/Connector/issues/new).
-
-Before doing so, please consider searching for potentially suitable
-[existing issues](https://github.com/eclipse-edc/Connector/issues?q=is%3Aissue+is%3Aopen).
-
-We also use [GitHub's default label set](https://docs.github.com/en/issues/using-labels-and-milestones-to-track-work/managing-labels)
-extended by custom ones to classify issues and improve findability.
-
-If an issue appears to cover changes that will have a (huge) impact on the code base and needs to
-first be discussed, or if you just have a question regarding the usage of the software, please
-create a [discussion](https://github.com/eclipse-edc/Connector/discussions)
-before raising an issue.
-
-Please note that if an issue covers a topic or the response to a question that may be interesting
-for other developers or contributors, or for further discussions, it should be converted to a
-discussion and not be closed.
-
-### Adhere to Coding Style Guide
-
-We aim for a coherent and consistent code base; thus, the coding style detailed in the
-[styleguide](styleguide.md) should be followed.
-
-### Submit a Pull Request
-
-In addition to the contribution guideline made available in the
-[Eclipse project handbook](https://www.eclipse.org/projects/handbook/#contributing),
-we would appreciate it if your pull request complied with the following points:
-
-* Conform to the [Pull-Request Etiquette](pr_etiquette.md).
-
-* Always apply the following copyright header to specific files in your work, replacing the fields
-  enclosed by curly brackets "{}" with your own identifying information. (Don't include the curly
-  brackets!) Enclose the text in the appropriate comment syntax for the file format.
-
-  ```text
-  Copyright (c) {year} {owner}[ and others]
-
-  This program and the accompanying materials are made available under the
-  terms of the Apache License, Version 2.0 which is available at
-  https://www.apache.org/licenses/LICENSE-2.0
-
-  SPDX-License-Identifier: Apache-2.0
-
-  Contributors:
-    {name} - {description}
-  ```
-
-* The git commit messages should comply with the following format:
-  ```
-  <type>(<scope>): <description>
-  ```
-
-  Use the [imperative mood](https://github.com/git/git/blob/master/Documentation/SubmittingPatches)
-  as in "Fix bug" or "Add feature" rather than "Fixed bug" or "Added feature" and
-  [mention the GitHub issue](https://docs.github.com/en/issues/tracking-your-work-with-issues/linking-a-pull-request-to-an-issue),
-  e.g. `chore(transfer process): improve logging`.
-
-  All committers, and all commits, are bound to
-  the [Developer Certificate of Origin](https://www.eclipse.org/legal/DCO.php).
-  As such, all parties involved in a contribution must have valid ECAs. Additionally, commits can
-  include a ["Signed-off-by" entry](https://wiki.eclipse.org/Development_Resources/Contributing_via_Git).
-
-* Add meaningful tests to verify your submission acts as expected.
-
-* Where code is not self-explanatory, add documentation providing extra clarification.
-
-* Add documentation files to new modules. See [here](#add-documentation) for more details.
-
-* If a new module has been added or a significant part of the code has been changed, and you should
-  or want to be seen as the contact person for any further changes, please add appropriate
-  information to the [CODEOWNERS](https://github.com/eclipse-edc/Connector/blob/main/CODEOWNERS)
-  file. You can find instructions on how to do this in GitHub's documentation on code owners.
-  Please note that this file does not represent all contributions to the code.
Which persons and organizations
-  actually contributed to each file can be seen on GitHub and is documented in the license headers.
-
-* PR descriptions should use the current [PR template](.github/PULL_REQUEST_TEMPLATE.md).
-
-* Submit a draft pull request at an early stage and add people previously working on the same code
-  as reviewers. Make sure automatic checks pass before marking it as "ready for review":
-
-  * _Intellectual Property Validation_ verifying that the [Eclipse CLA](#eclipse-contributor-agreement)
-    has been signed and commits have been signed off, and
-  * _Continuous Integration_ performing various test conventions.
-
-### Stale issues and PRs
-
-In order to keep our backlog clean, we are using a bot that helps us label and eventually close old issues and PRs. The
-following table shows the particular timings.
-
-| | `stale` after | closed after days `stale` |
-|------------------------|---------------|---------------------------|
-| Issue without assignee | 14 | 7 |
-| Issue with assignee | 28 | 7 |
-| PR | 7 | 7 |
-
-Note that updating an issue, e.g. by commenting, will remove the `stale` label again and reset the counters. However,
-we ask the community **not to abuse** this feature (e.g. commenting "what's the status?" every X days would certainly
-qualify as abuse). If an issue receives no attention, there usually
-are reasons for it. It is therefore advisable to clarify in advance whether a particular feature fits into EDC's
-planning schedule and roadmap. For that, we recommend opening a discussion. Discussions serve us as a system of record; that
-means we monitor them more closely and do not close them automatically.
-
-### Add Documentation
-
-Every decision record, launcher, extension, or any other type of module has to provide documentation that comprises at least
-one markdown file with the necessary information. Please find appropriate templates that should
-be used in [the templates directory](docs/templates).
-
-### Report on Flaky Tests
-
-If you discover a randomly failing ("flaky") test, please take the time to check whether an issue for it already
-exists and, if not, create an issue yourself, providing a meaningful description and a link to the failing run. Please also
-label it with `Bug` and `FlakyTest`. Then assign it to whoever was the original author of the relevant piece of code or
-whoever worked on it last. If assigning the issue is not possible due to missing rights, please just comment and
-@mention the author/last editor.
-
-Please do not just restart the run, as this would overwrite the results. If you need to re-trigger a run, a better way
-is to push an empty commit:
-
-```bash
-git commit --allow-empty -m "trigger CI" && git push
-```
-
-If an issue labeled with `Bug` and `FlakyTest` is assigned to you, please prioritize addressing it, as other people will be affected.
-We take the quality of our code very seriously, and reporting flaky tests is an important step toward improvement
-in that area.
-
-## Project and Milestone Planning
-
-We use milestones to set a common focus for a period of 6 to 8 weeks.
-The group of committers chooses issues based on customer needs and contributions we expect.
-
-### Milestones
-
-Milestones are organized at the [GitHub Milestones page](https://github.com/eclipse-edc/Connector/milestones).
-They are numbered in ascending order. There, contributors, users, and adopters can track the progress.
-
-Please note that the due date of a milestone does not imply any guarantee that all linked issues will
-be resolved by then.
-
-When closing the current milestone, issues that were not resolved within a milestone phase will be
-reviewed to evaluate their relevance and priority, before being assigned to the next milestone.
-
-#### Issues
-
-Every issue that should be addressed during a milestone phase is assigned to it by using the
-`Milestone` feature for linking both items. This way, the issues can easily be filtered by
-milestones.
-
-#### Pull Requests
-
-Pull requests are not assigned to milestones, as their linking to issues is sufficient to track
-the relations and progress.
-
-### Projects
-
-The [GitHub Projects page](https://github.com/eclipse-edc/Connector/projects)
-provides a general overview of the project's working items. Every new issue is automatically assigned
-to the ["Dataspace Connector" project](https://github.com/orgs/eclipse-edc/projects/3).
-It can be unassigned or moved to any other project that is provided.
-
-In every project, an issue passes four stages: `Backlog`, `In progress`, `Review in progress`, and `Done`,
-independent of its association with a specific milestone.
-
-### Releases
-
-Please find more information about our release approach [here](docs/developer/releases.md).
-
-## Contact Us
-
-If you have questions or suggestions, do not hesitate to contact the project developers via
-the [project's "dev" list](https://dev.eclipse.org/mailman/listinfo/edc-dev).
-
-You may also want to join our [Discord server](https://discord.gg/n4sD9qtjMQ).
-
-There, we hold a biweekly meeting on Fridays, 2-3 p.m. (CET), to give any interested person the
-opportunity to get in touch with the committer team. We meet in the "general" voice channel.
-Find more details about the schedule [on GitHub](https://github.com/eclipse-edc/Connector/discussions/1303).
-
-_If you have "contributor" or "committer" status, you will also have access to private channels._
diff --git a/NOTICE.md b/NOTICE.md
index f13fdfd7a0e..07551d7a514 100644
--- a/NOTICE.md
+++ b/NOTICE.md
@@ -1,4 +1,4 @@
-# Notices for Eclipse Dataspace Connector
+# Notices for EDC Connector
 
 This content is produced and maintained by the Eclipse Dash, Tools for Committers project.
 
diff --git a/README.md b/README.md
index c8440392739..849edc153f2 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,4 @@

-<img alt="Logo" ...>
 EDC Connector
@@ -39,25 +37,6 @@
-Contribute • Docs • Issues • License • Q&A
- -The Eclipse Dataspace Connector provides a framework for sovereign, inter-organizational data exchange. It will -implement the International Data Spaces Dataspace Protocol (DSP) as well as relevant protocols associated with GAIA-X. -The connector is designed in an extensible way in order to support alternative protocols and integrate in various -ecosystems. - -Please also refer to: - -- The [Eclipse Project Homepage](https://projects.eclipse.org/projects/technology.edc) -- [International Data Spaces](https://www.internationaldataspaces.org) -- [Dataspace Protocol specifications](https://docs.internationaldataspaces.org/dataspace-protocol/overview/readme) -- The [GAIA-X](https://gaia-x.eu) project ### Built with @@ -69,10 +48,10 @@ embedded into any form of application deployment. ### Documentation -Developer documentation can be found under [docs/developer](docs/developer/), -where the main concepts and decisions are captured as [decision records](docs/developer/decision-records/). +Developer documentation can be found under [docs/developer](docs/developer/README.md), +where the main concepts and decisions are captured as [decision records](docs/developer/decision-records/README.md). -Some more documentation can be found at [extensions](extensions/), [launchers](launchers/) and +Some more documentation can be found at [extensions](extensions/README.md), [launchers](launchers/README.md) and [the samples repository](https://github.com/eclipse-edc/Samples). For detailed information about the whole project, please take a look at @@ -138,7 +117,7 @@ Then you can add snapshot dependencies by simply using the `-SNAPSHOT` version s ```kotlin dependencies { - implementation("org.eclipse.edc:spi:core-spi:0.1.4-SNAPSHOT-SNAPSHOT") + implementation("org.eclipse.edc:spi:core-spi:0.1.4-SNAPSHOT") // any other dependencies } ``` @@ -156,7 +135,7 @@ Please be aware of the following pitfalls: _We plan to have actual release versions starting some time mid 2022. Please check back soon._ -> For more information about versioning please refer to the [release documentation](docs/developer/releases.md) +> For more information about versioning please refer to the [release documentation](https://github.com/eclipse-edc/.github/blob/main/docs/developer/releases.md) ### Checkout and build from source @@ -175,7 +154,7 @@ That will build the connector and run unit tests. ### [Optional] Setup your IDE If you wish to configure your IDE/editor to automatically apply the EDC code style, please -follow [this guide](styleguide.md). +follow [this guide](https://github.com/eclipse-edc/.github/blob/main/contributing/styleguide.md). _Note: the style guide will be checked/enforced in GitHub Actions._ @@ -223,11 +202,11 @@ Contains implementations for communication protocols a connector might use, such ## Releases GitHub releases are listed [here](https://github.com/eclipse-edc/Connector/releases). -Please find more information about releases in our [release approach](docs/developer/releases.md). +Please find more information about releases in our [release approach](https://github.com/eclipse-edc/.github/blob/main/docs/developer/releases.md). ### Roadmap -See [here](CONTRIBUTING.md#project-and-milestone-planning) for more information about project and +See [here](https://github.com/eclipse-edc/.github/blob/main/CONTRIBUTING.md#project-and-milestone-planning) for more information about project and milestone planning. Scheduled and ongoing milestones are listed [here](https://github.com/eclipse-edc/Connector/milestones). 
@@ -237,4 +216,4 @@
 
 Available tags can be found [here](https://github.com/eclipse-edc/Connector/tags).
 
 ## Contributing
 
-See [how to contribute](CONTRIBUTING.md).
+See [how to contribute](https://github.com/eclipse-edc/.github/blob/main/CONTRIBUTING.md).
diff --git a/contribution_categories.md b/contribution_categories.md
deleted file mode 100644
index c3579a7bcbc..00000000000
--- a/contribution_categories.md
+++ /dev/null
@@ -1,58 +0,0 @@
-# Guideline for submitting features
-
-This document is intended as a guideline for contributors who have either already implemented a feature, e.g. an extension, or intend to do so, and are looking for ways to upstream that feature into the EDC.
-
-There are currently two possible levels of adoption for the EDC project:
-1. incorporate a feature as a core EDC component
-2. reference a feature as "friend"
-
-## Get referenced as "friend"
-
-This means we will add a link to our [known friends](known_friends.md) list, where we reference projects and features that we are aware of. These are repositories that have no direct affiliation with EDC and are hosted outside of the `eclipse-edc` GitHub organization. We call this a "friend" of EDC (derived from the C++ [`friend class` concept](https://en.cppreference.com/w/cpp/language/friend)).
-In order to become a "friend" of EDC, we do a quick scan of the code base to make sure it does not contain anything offensive, or anything that contradicts our code of conduct, ethics or other core OSS values.
-
-The EDC core team does not maintain or endorse "friend" projects in any way, nor is it responsible for them, but we do provide a URL list to make it easier for other developers to find related projects and get an overview of the EDC market spread.
-
-This is the easiest way to "get in" and will be the suitable form of adoption for _most_ features and projects.
-
-## Get adopted in EDC core
-
-This means the contribution gets added to the EDC code base and is henceforth maintained by the EDC core team. The barrier of entry for this is much higher than for "friends", and a more in-depth review of the code will be performed.
-
-Note that this covers both what we call the [EDC Core repository](https://github.com/eclipse-edc/Connector) as well as any current or future repositories in the `eclipse-edc` GitHub organization.
-It is up to the committers to decide where the code will eventually be hosted in case of adoption.
-
-However, in order to do a preliminary check, please go through the following bullet points:
-
-#### Why should this contribution be adopted?
-Please argue why this feature must be hosted upstream and be maintained by the EDC core team.
-
-#### Could it be achieved with existing functionality? If not, why?
-If there is any existing code that can achieve the same thing with little modification, that is usually the preferable way for the EDC core team. We aim to keep the code succinct and want to avoid similar/duplicate code. Make sure you understand the EDC code base well!
-
-#### Are there multiple use cases or applications that will benefit from the contribution?
-Basically, we want you to motivate who will use that feature and why, thereby arguing that it is well suited for adoption in the core code base. One-off features are better suited to be maintained externally.
-
-#### Can it be achieved without introducing third-party dependencies? If not, which ones?
-EDC is a platform rather than an application; therefore, we are extremely careful when it comes to introducing third-party libraries.
The reasons are diverse: security, license issues and overall JAR weight, just to mention a few important ones.
-
-#### Would this feature limit platform independence in any way? If so, how and why?
-Features that do not work well in clustered environments are difficult to adopt, since EDC is designed from the ground up to be stateless and clusterable. Similarly, features that depend on certain operating systems are difficult to justify.
-
-#### Is it going to be a self-contained feature, or would it cut across the entire code base?
-Features that have a large impact on the code base are very complex to thoroughly test; they have a high chance of destabilizing the code and require careful inspection. Self-contained features, on the other hand, are easier to isolate and test.
-
-And on a more general level:
-- does your contribution comply with our [licensing](LICENSE)?
-- does the code adhere to our [styleguide](styleguide.md) and
-  our [architectural principles](docs/developer/architecture/coding-principles.md)?
-- are you willing to accept our [contributing guidelines](CONTRIBUTING.md)?
-- are you prepared to make frequent contributions and help out with maintaining this feature?
-
-When you submit an application for adopting a feature, _be prepared to answer all of them in an exhaustive and coherent way_!
-
-Note that even if all of the aforementioned points are answered satisfactorily, **the EDC core team reserves the right to ultimately decide whether a feature will get adopted or not.**
-
-## Submitting an application
-
-Please open an issue using the [Adoption request](.github/ISSUE_TEMPLATE/adoption_request.md) template, fill out all the sections to the best of your knowledge, and wait to hear back from the EDC core team. We will comment in the issue or reach out to you directly. Be aware that omitting sections from the application will greatly diminish the chance of approval.
diff --git a/docs/developer/README.md b/docs/developer/README.md
index 6c3570c2c88..f7b6bd5b169 100644
--- a/docs/developer/README.md
+++ b/docs/developer/README.md
@@ -8,24 +8,21 @@
 - [Performance Tuning](performance-tuning.md)
 
 ## Build and testing
 
-- [Releases](releases.md)
 - [OpenApi Spec Generation](openapi.md)
-- [Testing](testing.md)
 - [Cloud Testing](cloud_testing.md)
 
 ## Deep Dives
 
 - [Command Queue](command-queue.md)
-- [Dependency Resolution](dependency_resolution.md)
 - [DPF Selector Concept](dpf_selector.md)
 - [Dynamic SQL Queries](sql_queries.md)
 - [Events](events.md)
-- [Logging](logging.md)
 - [Metrics](metrics.md)
 - [State Machine](state-machine.md)
 
 > All implementations have to follow existing design principles and architectural patterns that are provided as
-> [Decision Records](decision-records/). Therefore, during implementation, please refer to the dedicated
-> [style guide](../../styleguide.md) and [contribution guidelines](../../CONTRIBUTING.md), and the patterns we
-> documented in [architecture principles](architecture/coding-principles.md). _Make sure to continuously
-> check and extend the list._
+> [Decision Records](decision-records/README.md). Therefore, during implementation, please refer to the dedicated
+> [style guide](https://github.com/eclipse-edc/.github/blob/main/contributing/styleguide.md) and
+> [contribution guidelines](https://github.com/eclipse-edc/.github/blob/main/CONTRIBUTING.md), and the patterns we
+> documented in [architecture principles](https://github.com/eclipse-edc/.github/blob/main/contributing/coding-principles.md).
+> _Make sure to continuously check and extend the list._ diff --git a/docs/developer/architecture/README.md b/docs/developer/architecture/README.md index 67bc2b1b391..a4846911a58 100644 --- a/docs/developer/architecture/README.md +++ b/docs/developer/architecture/README.md @@ -1,6 +1,5 @@ # Architecture -- [Key Principles](coding-principles.md) - [Terminology](terminology.md) - [Catalog](catalog/) - [Usage Control](usage-control/) diff --git a/docs/developer/architecture/coding-principles.md b/docs/developer/architecture/coding-principles.md deleted file mode 100644 index c8744680a71..00000000000 --- a/docs/developer/architecture/coding-principles.md +++ /dev/null @@ -1,140 +0,0 @@ -# Coding Principles and Style Guide - -## I. Fail-fast and Explicit Configuration - -1. Configuration should be loaded and validated at extension initialization so that issues are reported immediately. Do - not lazy-load configuration unless it is required to do so. -2. Settings can be pulled from the extension context and placed into configuration objects, which are passed to services - via their constructor. -3. Service configuration requirements should always be explicit; as a general rule, do not pass a single configuration - object with many values to multiple services. - For example, see `HttpFunctionConfiguration.java`. -4. Annotate configuration keys with `@Setting` so that they may be tracked. - -## II. Errors - -1. Do not throw checked exceptions; use unchecked exceptions. If an unchecked exception type needs to be defined, - inherit from EdcException. -2. Do not throw exceptions to signal a validation error; report the error (preferably collated) and return an error - response. -3. Throw an unchecked exception if something unexpected happens (e.g. a backing store connection is down after a number - of retries). Note that validation errors are expected. - For example, see `Result.java`. -4. Only throw an exception when there is no remediation possible, i.e. the exception is fatal. Do not throw an exception - if an operation can be retried. - -## III. Simplicity - -1. Avoid layers of indirection when they are not needed (e.g. "pass-through methods"). -2. Avoid needlessly wrapping objects, especially primitive datatypes. - -## IV. General Coding Style - -1. Use `var` instead of explicit types (helps with clarity) -2. Avoid `final` in method args and local variables -3. Use `final` in field declarations -4. Avoid `static` fields except in constants or when absolutely necessary. (you should be able to provide a reason). -5. Use interfaces to define shared constants -6. Use "minimally required types" (or "smallest possible API"), e.g. use `ObjectMapper` instead of `TypeManager` - , or use a `String` instead of a more complex object containing the String, etc. -7. Use either `public` members, which are documented and tested, or `private` members. -8. Avoid package-private members, especially if only needed for testing -9. Avoid `protected` members unless they're intended to be overridden. -10. Use package-private classes if they're not needed outside the package, e.g. implementation classes -11. Avoid using `enum`s for anything other than named integer enumerations. -12. Avoid using static classes as much as possible. Exceptions to this are helper functions and test utils, etc. as well - as static inner classes. -13. Use only camel case and no prefixes for naming. -14. Avoid unnecessary `this.` except when it is necessary e.g. when there is a name overlap -15. 
Use static imports, as long as code readability and comprehension are not impacted. For example,
-    - use `assertThat(...)` instead of `Assertions.assertThat(...)`
-    - use `format("...",arg1)` instead of `String.format(...)`, but
-    - avoid `of(item1, item2).map(it -> it.someOperation)...` in favor of the explicit `Stream.of(item1, item2)...`.
-    Also, avoid static imports if two static methods with the same name would be imported from different classes.
-16. Avoid `Optional` as a method return type or method argument, except when designing a fluent API. Use `null` in
-    signatures.
-17. Avoid cryptic variable names, especially in long methods. Instead, try to write them out, at least to a reasonable
-    extent.
-
-## V. Testing
-
-1. All handlers and services should have dedicated unit tests with mocks used for dependencies.
-2. Prefer unit tests over all other test types: unit > integration/component > e2e.
-3. When appropriate, prefer composing services via the constructor so that dependencies can be mocked, as opposed to
-   instantiating dependencies directly.
-4. Use classes with static test functions to provide common helper methods, e.g. to instantiate an object.
-5. Use `[METHOD]_when[CONDITION]_should[EXPECTATION]` as the naming template for test methods,
-   e.g. `verifyInput_whenNull_shouldThrowNpe()` as opposed to `testInputNull()`.
-
-## VI. Data Objects
-
-1. Use the `Builder` pattern when:
-   - there are any number of optional constructor args
-   - there are more than 3 constructor args
-   - inheriting from an object that fulfills any of the above. In this case use derived builders as well.
-2. Although serializability is not the reason we use the builder pattern, it is a strong indication that a builder
-   should be used.
-3. Builders should be named just `Builder` and be static nested classes.
-4. Create a `public static Builder newInstance(){...}` method to instantiate the builder.
-5. Builders have non-public constructors.
-6. Use single-field builders: a `Builder` instantiates the object it builds in its constructor, and sets the properties
-   in its builder methods. The `build()` method then only performs verification (optional) and returns the instance.
-7. Use `private` constructors for the objects that the builder builds.
-8. If there is a builder for an object, use it to deserialize the object, i.e. put Jackson annotations such
-   as `@JsonCreator` and `@JsonPOJOBuilder` on builders.
-9. Note that the motivation behind the use of builders is not immutability (although that may be good in certain
-   circumstances). Rather, it is to make code less error-prone and
-   simpler, given the lack of named arguments and optional parameters in Java.
-
-## VII. Secrets
-
-1. Only store secrets in the `Vault` and do not hold them in objects that may be persisted to other stores.
-2. Do not log secrets or sensitive information.
-
-## VIII. Extensions and Libraries
-
-1. Extension modules contribute a feature to the runtime, such as a service.
-2. SPI modules define extensibility points in the runtime. There is a core SPI module that defines extensibility for
-   essential runtime features. There are other SPI modules that
-   define extensibility points for optional features such as IDS.
-3. Libraries are utility modules that provide classes which may be used by other modules. They do not directly
-   contribute features to the runtime.
-4. An SPI module may only reference other SPI modules and library modules.
-5. An Extension module may only reference other SPI modules and library modules.
-6. A library module may only reference other library modules.
-
-## IX. Build
-
-1. There should only be a root `gradle.properties` that contains build variables. Do not create separate
-   `gradle.properties` files in a module.
-2. For external dependencies, do not reference the version directly. Instead, use
-   the [version catalog](../version-catalogs.md) feature, as sketched below.
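To illustrate the version catalog item, a minimal sketch of how a module's `build.gradle.kts` might consume a catalog entry (the alias `jackson-core` and its coordinates are assumed examples, not necessarily entries that exist in this build's `gradle/libs.versions.toml`):

```kotlin
// Hypothetical catalog entry in gradle/libs.versions.toml:
//   [libraries]
//   jackson-core = { module = "com.fasterxml.jackson.core:jackson-core", version = "2.14.2" }

// In a module's build.gradle.kts, the generated type-safe accessor
// replaces a hard-coded version string:
dependencies {
    implementation(libs.jackson.core)
}
```

This way, bumping the version once in the catalog updates every module that references the alias.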
-
-## X. Handling Null Return Values
-
-1. In certain situations, `null` may need to be returned from a method, passed as a parameter, or set on a field. Only
-   use `Optional` if a method is part of a fluent API.
-   Since the runtime will rarely require this, the project standard is to use the `org.jetbrains.annotations.Nullable`
-   and `org.jetbrains.annotations.NotNull` annotations.
-
-## XI. Objects Serialization/Deserialization
-
-1. `TypeManager` is the component responsible for JSON serialization/deserialization; you can also use the `ObjectMapper` inside it, but there
-   should be no other `ObjectMapper` instance.
-
-## XII. Class Naming
-
-1. A single implementor of an interface should be named `<interface-name>Impl`.
-2. An implementor that is meant to be the default implementation for an interface, of which others are or can be
-   defined, should be named `Default<interface-name>`.
-
-## XIII. Observability
-
-1. Services are [instrumented for collecting essential metrics](../metrics.md), in particular instances
-   of `ExecutorService`.
-
-## XIV. Streams
-
-1. Always explicitly close `Stream` objects that are returned by a service/store, since they could carry a connection
-   that will otherwise leak.
diff --git a/docs/developer/autodoc.md b/docs/developer/autodoc.md
deleted file mode 100644
index 14f71b3bd9e..00000000000
--- a/docs/developer/autodoc.md
+++ /dev/null
@@ -1,33 +0,0 @@
-# The `autodoc` Gradle plugin
-
-Please find the comprehensive documentation about the `autodoc` plugin in
-the [GitHub repo](https://github.com/eclipse-edc/GradlePlugins/blob/main/docs/developer/autodoc.md) of
-the plugin.
-
-In EDC, the plugin is intended to be used to generate metamodel manifests for every Gradle module, which are then
-transformed into Markdown files and subsequently rendered for publication in static web content.
-
-## Publishing the manifest files
-
-For every subproject that generates an `edc.json` file, a Maven publication is created in the root build file, so that
-the manifest file gets published alongside the binary jar files, sources jar and javadoc jar.
-
-## Downloading the manifest files
-
-For publishing we use `type=json` and `classifier=manifest`, which means a dependency in a client project would look
-like this (Kotlin DSL):
-
-```kotlin
-implementation("org.eclipse.edc:<module-name>:<version>:manifest@json")
-```
-
-For example, for the `:core:control-plane:control-plane-core` module in version `0.1.4-SNAPSHOT`, this would be:
-
-```kotlin
-implementation("org.eclipse.edc:control-plane-core:0.1.4-SNAPSHOT:manifest@json")
-```
-
-When the dependency gets resolved, the manifest file will get downloaded to the local Gradle cache, typically located
-at `.gradle/caches/modules-2/files-2.1`. So in the example, the manifest would get downloaded
-at `~/.gradle/caches/modules-2/files-2.1/org.eclipse.edc/control-plane-core/0.1.4-SNAPSHOT//control-plane-core-0.1.4-SNAPSHOT-manifest.json`
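As a sketch of how such a manifest artifact might be consumed beyond plain dependency resolution, the following illustrates copying the resolved JSON files into a build directory (the configuration name `manifestFiles` and the task name `collectManifests` are hypothetical, not part of the plugin):

```kotlin
// Hypothetical consumer snippet for a build.gradle.kts:
// a dedicated configuration resolves only the manifest artifacts.
val manifestFiles by configurations.creating

dependencies {
    manifestFiles("org.eclipse.edc:control-plane-core:0.1.4-SNAPSHOT:manifest@json")
}

// Copies the resolved manifests out of the Gradle cache into build/manifests.
tasks.register<Copy>("collectManifests") {
    from(manifestFiles)
    into(layout.buildDirectory.dir("manifests"))
}
```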
diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/README.md b/docs/developer/decision-records/2022-02-10-code-coverage/README.md
deleted file mode 100644
index 8d45f868d12..00000000000
--- a/docs/developer/decision-records/2022-02-10-code-coverage/README.md
+++ /dev/null
@@ -1,31 +0,0 @@
-# Code coverage
-
-## Decision
-
-JaCoCo is used for measuring test code coverage in the build, in order to obtain metrics on the current state of EDC testing as well as its evolution over time.
-
-The Codecov platform is used for visualizing code coverage statistics on PRs. This will raise developer awareness of coverage increases/decreases introduced by PRs. Codecov provides a detailed report including a dashboard with additional metrics like code complexity.
-
-## Rationale
-
-Test code coverage is a measure of the source code that is executed when a test suite is run. A program with high test coverage has a lower chance of containing bugs.
-
-At the time of writing, code coverage in the solution is under 50%. Increasing code coverage can best be achieved over time by providing feedback on coverage impact on each PR. This requires a more advanced tool than JaCoCo on its own can provide, and is well achieved by Codecov.
-
-## Spikes
-
-We evaluated the following options:
-
-- [JaCoCo without or with aggregation](jacoco.md)
-- [JaCoCo with Codecov](codecov.md)
-- [JaCoCo with Codacy](codacy.md)
-- [JaCoCo with SonarQube](sonarqube.md)
-- [JaCoCo with GitHub Action](jacoco_github_action.md)
-
-## Comparison of selected options
-
-| Tool | Project coverage report | Coverage on PR in GitHub | Additional comments |
-| -------------------------- | -------------------------------------- | ------------------------------------------------------------ |-------------------------------------------------------------------------------------------------------------------|
-| JaCoCo with Codecov | ✅ Detailed report on Codecov dashboard | ✅ GitHub bot messages on every PR (coverage after the PR is merged, total project coverage, code complexity) | ✅ Reports on code complexity; ✅ Easy configuration |
-| JaCoCo with GitHub Actions | ✅ Basic report (percentage) | ✅ GitHub bot messages on every PR (coverage on changed files and total project coverage) | ⚠️ [Minor issue] Manual configuration (the JaCoCo Report GitHub Action requires a property with the path to the JaCoCo reports) |
-| JaCoCo with Codacy | ✅ Report available on Codacy dashboard | ⚠️ Not supported | ⚠️ Delays in reports showing up in the dashboard |
diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/codacy.md b/docs/developer/decision-records/2022-02-10-code-coverage/codacy.md
deleted file mode 100644
index 71586d34ee1..00000000000
--- a/docs/developer/decision-records/2022-02-10-code-coverage/codacy.md
+++ /dev/null
@@ -1,31 +0,0 @@
-# JaCoCo with Codacy
-
-[Codacy](https://www.codacy.com/) is an online service for both static code analysis and test code coverage analysis. It is free for open source projects.
-
-We [enrolled our repository fork](https://docs.codacy.com/getting-started/codacy-quickstart/) into Codacy using its Web UI, and obtained a [Project API token](https://docs.codacy.com/codacy-api/api-tokens/) which we set up as a GitHub secret.
-
-We used the modified `build.gradle.kts` file as above to create JaCoCo XML reports. We then used the [Codacy GitHub action](https://github.com/codacy/codacy-coverage-reporter-action) to upload our reports. The `find` command is set up to exclude one XML report with empty content that is 240 bytes long and causes the following action to fail.
-
-```yaml
-      - name: Set Coverage Report Paths
-        id: coverage-paths
-        run: |
-          echo -n "::set-output name=COVERAGE_REPORT_PATHS::"
-          find . -name jacocoTestReport.xml -size +300c -printf '%p,'
-
-      - name: Publish Code Coverage Results
-        uses: codacy/codacy-coverage-reporter-action@v1
-        with:
-          project-token: ${{ secrets.CODACY_PROJECT_TOKEN }}
-          coverage-reports: ${{ steps.coverage-paths.outputs.COVERAGE_REPORT_PATHS }}
-```
-At first the reports weren't visible in the [Codacy UI](https://app.codacy.com/gh/Agera-CatenaX/EclipseDataSpaceConnector/settings/coverage), but they
-started to appear after ~16 hours.
-
-In the meantime we also reached out to Codacy support to investigate the issue.
-
-The screenshot below shows the code coverage diagram of the main-branch analysis.
-![Code Coverage with Codacy](code-coverage-codacy.png)
-
-We didn't manage to set up Codacy to show the code coverage reports for PRs, either in the Codacy dashboard or in GitHub Actions.
-Codacy seems to offer more features for code quality analysis than for code coverage scans.
\ No newline at end of file diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codacy.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codacy.png deleted file mode 100644 index d4f23d55da7..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codacy.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-dashboard.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-dashboard.png deleted file mode 100644 index ab01c70cfd8..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-dashboard.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr-detail.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr-detail.png deleted file mode 100644 index f4a50032870..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr-detail.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr-github.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr-github.png deleted file mode 100644 index 04a4907524c..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr-github.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr.png deleted file mode 100644 index 6152f615fe4..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-pr.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-summary.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-summary.png deleted file mode 100644 index 380016d8d3f..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-codecov-summary.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-code.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-code.png deleted file mode 100644 index d221274d7b3..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-code.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-gitbhub-actions.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-gitbhub-actions.png deleted file mode 100644 index 033530b91ab..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-gitbhub-actions.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-summary.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-summary.png deleted file mode 100644 index 1f45db6a180..00000000000 Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-jacoco-summary.png and /dev/null differ diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-sonar.png 
diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-sonar.png b/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-sonar.png
deleted file mode 100644
index a3b663d388a..00000000000
Binary files a/docs/developer/decision-records/2022-02-10-code-coverage/code-coverage-sonar.png and /dev/null differ
diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/codecov.md b/docs/developer/decision-records/2022-02-10-code-coverage/codecov.md
deleted file mode 100644
index 34eabb6e706..00000000000
--- a/docs/developer/decision-records/2022-02-10-code-coverage/codecov.md
+++ /dev/null
@@ -1,92 +0,0 @@
-# JaCoCo with Codecov
-
-## Evaluation
-
-Codecov is an online service for code coverage analysis that promises to "always be free for open source projects". We have been widely using it in various (open-source and proprietary) projects for years with good results.
-
-We modified the root `build.gradle.kts` file to apply the JaCoCo plugin to all projects, and produce an XML format report that can be used by Codecov:
-
-```kotlin
-// build.gradle.kts
-
-allprojects {
-    //...
-    apply(plugin = "jacoco")
-
-    //...
-    tasks.jacocoTestReport {
-        reports {
-            xml.required.set(true)
-        }
-    }
-}
-```
-
-We modified the `.github/workflows/verify.yaml` workflow as follows:
-
-```yaml
-  - name: Gradle Test Core
-    run: ./gradlew clean check jacocoTestReport
-
-  - name: CodeCov
-    uses: codecov/codecov-action@v2
-    with:
-      token: ${{ secrets.CODECOV_TOKEN }}
-```
-
-The token is supposedly not required for open-source projects, but we got an error running the action without providing a token.
-
-By logging in at https://about.codecov.io with our GitHub account, we were able to browse straight away to our EDC (fork) repository and obtain a token for the repository. We added the token as a GitHub secret.
-
-We merged a PR with the action configuration above into the `main` (default) branch of our fork repository, for Codecov to report code coverage differences in PRs.
-
-Finally, we installed the Codecov GitHub app into the repository, to enable the Codecov bot to post comments directly into PRs.
-
-The Codecov online site provides detailed coverage reports. These reports also measure cyclomatic complexity.
-
-![Code Coverage with Codecov](code-coverage-codecov-summary.png)
-
-In PRs, the Codecov bot automatically posts a report indicating coverage changes.
-
-![Code Coverage with Codecov](code-coverage-codecov-pr-github.png)
-
-These reports can also be accessed from the Codecov online service.
-
-![Code Coverage with Codecov](code-coverage-codecov-pr.png)
-
-The report can be drilled into to highlight the code affected by coverage changes.
-
-![Code Coverage with Codecov](code-coverage-codecov-pr-detail.png)
-
-The configuration of Codecov can be adjusted in a [`codecov.yaml` configuration file](https://docs.codecov.com/docs/codecov-yaml). That allows, for example, a configuration that ensures each new PR [does not decrease coverage](https://docs.codecov.com/docs/common-recipe-list#increase-overall-coverage-on-each-pull-request); a minimal sketch is shown below.
-
-## Using Codecov with forks
-
-Further tests showed that if Codecov is installed in the base repository, then providing the Codecov token is indeed not required for open source projects:
-
-```yaml
-  - name: CodeCov
-    uses: codecov/codecov-action@v2
-```
-
-The Codecov PR reports are also available, with no additional changes, for PRs between forks and the base repository.
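-For illustration, the `codecov.yaml` mentioned above could implement the linked "do not decrease coverage" recipe roughly like this (a minimal, hedged sketch based on the Codecov docs; we did not verify it in this spike):
-
-```yaml
-coverage:
-  status:
-    project:
-      default:
-        target: auto   # compare coverage against the PR's base commit
-        threshold: 0%  # any drop in overall coverage fails the check
-```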
-
-If the owners of a fork repository want to use Codecov also for internal PRs (before merging to upstream), then the Codecov App needs to be installed in the fork as well, but in this case we also got the reports without providing the token.
-
-## Codecov reports
-
-From the Codecov dashboard, we can download a report containing line and complexity coverage data from (at most) the last 6 months.
-The coverage on the chart can be aggregated per day, hour, month, or commit.
-
-![Code Coverage with Codecov](code-coverage-codecov-dashboard.png)
-
-Codecov only shows commits that have uploaded coverage reports and are six months or less old ([Codecov doc](https://docs.codecov.com/docs/frequently-asked-questions#where-are-my-older-commits-my-project-dashboard-doesnt-show-any-commit-data-in-the-line-graph)).
-
-## Useful links
-
-- [How to interpret Codecov graphs](https://docs.codecov.com/docs/graphs)
-- [How to interpret delta in Codecov reports](https://docs.codecov.com/docs/codecov-delta)
-- [More information about Codecov Pull Request comments](https://docs.codecov.com/docs/pull-request-comments)
-
diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/jacoco.md b/docs/developer/decision-records/2022-02-10-code-coverage/jacoco.md
deleted file mode 100644
index 3ad74f18c60..00000000000
--- a/docs/developer/decision-records/2022-02-10-code-coverage/jacoco.md
+++ /dev/null
@@ -1,32 +0,0 @@
-# Option 1: JaCoCo
-
-JaCoCo (Java Code Coverage) is a popular and mature open-source tool. It runs as a Java agent during test execution, to capture which lines are exercised during which test.
-
-Capturing coverage for a particular project in JaCoCo is straightforward, using the [Gradle JaCoCo Plugin](https://docs.gradle.org/current/userguide/jacoco_plugin.html).
-
-```kotlin
-// build.gradle.kts
-plugins {
-    jacoco
-}
-```
-
-This yields an HTML report.
-
-![Code Coverage with JaCoCo](code-coverage-jacoco-summary.png)
-
-The report can be drilled into to highlight covered lines (green), not covered lines (red), and lines where some execution branches are not covered (orange).
-
-![Code Coverage with JaCoCo](code-coverage-jacoco-code.png)
-
-This configuration has limited value since each project produces its own report. Furthermore, there is no indication of whether a given commit is increasing or decreasing coverage, and in which areas of the code.
-
-# Option 2: JaCoCo with aggregation
-
-The Gradle documentation includes a sample for [Reporting code coverage across multiple sub-projects with JaCoCo](https://docs.gradle.org/current/samples/sample_jvm_multi_project_with_code_coverage.html). The sample explains how to generate a single aggregated report.
-
-We were not able to get the sample working in the EDC repository.
-
-In any case, extensive and complex Kotlin code needs to be added to the build. This is concerning for maintainability.
-
-As it would anyway not solve the problem that code coverage is best analyzed relative to a previous commit, we did not attempt further to get the sample working.
diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/jacoco_github_action.md b/docs/developer/decision-records/2022-02-10-code-coverage/jacoco_github_action.md
deleted file mode 100644
index f6632512a48..00000000000
--- a/docs/developer/decision-records/2022-02-10-code-coverage/jacoco_github_action.md
+++ /dev/null
@@ -1,22 +0,0 @@
-# JaCoCo with Github Action
-
-Code coverage coming from JaCoCo reports can be added to a PR using the [Github Action JaCoCo Report](https://github.com/marketplace/actions/jacoco-report).
-
-```yaml
-- name: Set Coverage Report Paths
-  id: coverage-paths
-  run: |
-    echo -n "::set-output name=COVERAGE_REPORT_PATHS::$(find ~+ -name jacocoTestReport.xml -size +300c -printf '%p,' | sed 's/.$//')"
-
-- name: Add coverage to PR
-  id: jacoco
-  uses: madrapps/jacoco-report@v1.2
-  with:
-    paths: ${{ steps.coverage-paths.outputs.COVERAGE_REPORT_PATHS }} # Comma-separated absolute paths of the generated JaCoCo XML files
-    token: ${{ secrets.GITHUB_TOKEN }} # Github personal token to add commits to the pull request
-    min-coverage-overall: 40 # The minimum code coverage that is required to pass for the overall project
-    min-coverage-changed-files: 60 # The minimum code coverage that is required to pass for changed files
-```
-The above workflow will send a comment to the PR showing the code coverage of the files modified in the PR and the overall project code coverage.
-
-![Code Coverage with JaCoCo and Github Action](code-coverage-jacoco-gitbhub-actions.png)
diff --git a/docs/developer/decision-records/2022-02-10-code-coverage/sonarqube.md b/docs/developer/decision-records/2022-02-10-code-coverage/sonarqube.md
deleted file mode 100644
index 01e75b0d4a8..00000000000
--- a/docs/developer/decision-records/2022-02-10-code-coverage/sonarqube.md
+++ /dev/null
@@ -1,57 +0,0 @@
-# JaCoCo with SonarQube
-
-[SonarQube](https://docs.sonarqube.org/latest/setup/get-started-2-minutes/) is a platform for both static code analysis and test code coverage analysis. It offers an open source Community Edition version, which is free but has some limitations.
-
-SonarQube can be run locally by adding the SonarQube plugin to Gradle and, e.g., running a SonarQube instance from Docker.
-
-Add the Gradle plugin:
-
-```gradle
-plugins {
-    id("org.sonarqube") version "3.3"
-}
-```
-
-To enable code coverage reports, test coverage reports must first be generated (explained in the sections Option 1 and Option 2).
-
-Docker-compose file with minimal configuration:
-
-```yml
-version: "3"
-services:
-  sonarqube:
-    image: sonarqube:lts
-    ports:
-      - 9000:9000
-    environment:
-      - SONAR_FORCEAUTHENTICATION=false
-```
-
-Then, once SonarQube is up, the current project can be added to the analysis by running:
-
-```bash
-./gradlew sonarqube
-```
-
-The above configuration works when SonarQube is running on the default URL http://localhost:9000 and the JaCoCo reports are placed in the default location. Otherwise, the properties _sonar.host.url_ and _sonar.jacoco.reportPaths_ should be set. More information is available in the [SonarQube Gradle plugin](https://docs.sonarqube.org/latest/analysis/scan/sonarscanner-for-gradle/) documentation.
-
-Code coverage analysis with SonarQube:
-
-![Code Coverage with Sonar](code-coverage-sonar.png)
-
-## Integration with Github Actions
-
-Integration with Github Actions wasn't part of this spike, because it requires a SonarQube instance deployed for the whole project instead of a localhost version.
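-Purely as an illustration, such an integration could boil down to running the Gradle task against a centrally deployed instance. A hedged, untested sketch (the secret names and the hosted instance are assumptions):
-
-```yaml
-  - name: SonarQube analysis
-    # hypothetical secrets holding the central instance URL and an auth token
-    run: ./gradlew sonarqube -Dsonar.host.url=${{ secrets.SONAR_HOST_URL }} -Dsonar.login=${{ secrets.SONAR_TOKEN }}
-```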
-
-More information about [Github Integration](https://docs.sonarqube.org/latest/analysis/github-integration/).
-
-[Github Action that helps to run the code analysis.](https://github.com/marketplace/actions/official-sonarqube-scan)
-
-## Limitations of the Community Edition version
-
-- Analysis of multiple branches is not supported
-- Reporting measures to branches and pull requests in Github is not supported
-- Automatic detection of branches/pull requests in Github Actions is not supported
\ No newline at end of file
diff --git a/docs/developer/decision-records/2022-02-11-codeql/README.md b/docs/developer/decision-records/2022-02-11-codeql/README.md
deleted file mode 100644
index e307875dc90..00000000000
--- a/docs/developer/decision-records/2022-02-11-codeql/README.md
+++ /dev/null
@@ -1,69 +0,0 @@
-# CodeQL
-CodeQL is a semantic code analysis engine developed by GitHub to automate security checks. A database is extracted from the source code that can be analysed with a powerful query language. Each query can be thought of as a "check" or "rule" representing a distinct security vulnerability that is being searched for. There is an available set of standard CodeQL queries, written by GitHub researchers and community contributors, and custom ones can be written too. See [Writing queries](https://codeql.github.com/docs/writing-codeql-queries/codeql-queries/) in the CodeQL docs for more information.
-
-## Extending the scope of the CodeQL query scan
-CodeQL is integrated in the EDC CI build in a dedicated [Github workflow](../.github/workflows/codeql-analysis.yml).
-Currently, the workflow runs on PRs and commits to the main branch, using the default set of queries provided by CodeQL.
-
-To get a more detailed scan, we decided to extend the CodeQL queries by using the built-in CodeQL query suite [security-and-quality](https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs).
-
-```yaml
-  # Initializes the CodeQL tools for scanning.
-  - name: Initialize CodeQL
-    uses: github/codeql-action/init@v2
-    with:
-      languages: ${{ matrix.language }}
-      queries: +security-and-quality
-```
-
-To reduce the number of false-positive alerts, we excluded the test code from the scan by replacing CodeQL Autobuild with a task that compiles only the Java production sources:
-
-```yaml
-  # Compiles production Java source (without tests)
-  - name: Build
-    run: ./gradlew compileJava
-```
-The results are visible in the Github workflow check view under the PR and in the Security tab.
-
-![CodeQL](codeql_github_alerts.png)
-
-After clicking on an alert, we can see a view with more detailed explanations, references, and examples.
-
-## Suppressing the alerts
-
-The alerts can be suppressed or removed by users with Security permissions, which are assigned by default to the user roles Write, Maintain and Admin.
-
-![CodeQL](security_permissions.png)
-
-Users with Read permissions (repository Members by default) can see the alerts in PRs, but they don't have access to suppress the alerts or to see the details.
-
-Users with the proper permissions can analyse the alerts and dismiss/remove them, if they are not applicable, from both views: under the PR and in the Security tab.
-
-![CodeQL](codeql_dismiss_alerts.png)
-
-Dismissing the alerts will dismiss them on all branches. Dismissed alerts can later be reopened. Deleting the alerts doesn't prevent them from appearing in the next scans.
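-A possible workaround for a rule that persistently produces noise (not evaluated in this spike) is to exclude it entirely via a CodeQL configuration file referenced from the init step's `config-file` input. A hedged sketch, where the file path and the excluded rule id are purely illustrative:
-
-```yaml
-# .github/codeql/codeql-config.yml (hypothetical path)
-name: "EDC CodeQL config"
-queries:
-  - uses: security-and-quality
-query-filters:
-  - exclude:
-      id: java/useless-null-check  # hypothetical example of a noisy rule
-```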
-[Here](https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/managing-code-scanning-alerts-for-your-repository#dismissing-or-deleting-alerts) you can find more information about dismissing/deleting CodeQL alerts.
-
-In the Settings tab, we can also define the alert severities causing a [pull request check failure](https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#defining-the-severities-causing-pull-request-check-failure) (again available only for users with at least the Write role).
-
-![CodeQL](codeql_severity_settings.png)
-
-[GitHub code scanning](https://github.com/github/codeql/issues/7294#issuecomment-985496463) does not support alert suppression comments and annotations at the moment.
-
-### LGTM
-[LGTM](https://lgtm.com/) is an online analysis platform that automatically checks your code for real CVEs and vulnerabilities using CodeQL.
-In contrast to running CodeQL as a Github Action, LGTM supports [alert suppression](https://help.semmle.com/lgtm-enterprise/user/help/alert-suppression.html) through comments and annotations in the code.
-It could be considered a useful addition to the project in the future, as it seems a more convenient and mature alternative.
-
-## Customization of Queries
-After reviewing the current capabilities of CodeQL for the customization of queries, with the intention of providing additional insight for the repo, the following findings are presented:
-
-- The documentation for CodeQL is lacking in detail and provides little insight into the capabilities of the query language
-- Customization of CodeQL at this time brings little benefit and would require additional review of the source code in order to fully expose robust features that enable customizations
-- CodeQL has valuable functionality in existing `packs`, which can and should be used when it benefits the needs of the project
-- Development efforts for CodeQL remain strong, and progress is expected to bring clarity and new features that will enable one to develop customizations in the future
diff --git a/docs/developer/decision-records/2022-02-11-codeql/codeql_dismiss_alerts.png b/docs/developer/decision-records/2022-02-11-codeql/codeql_dismiss_alerts.png
deleted file mode 100644
index d6805cddaec..00000000000
Binary files a/docs/developer/decision-records/2022-02-11-codeql/codeql_dismiss_alerts.png and /dev/null differ
diff --git a/docs/developer/decision-records/2022-02-11-codeql/codeql_github_alerts.png b/docs/developer/decision-records/2022-02-11-codeql/codeql_github_alerts.png
deleted file mode 100644
index 0eb4ae0b116..00000000000
Binary files a/docs/developer/decision-records/2022-02-11-codeql/codeql_github_alerts.png and /dev/null differ
diff --git a/docs/developer/decision-records/2022-02-11-codeql/codeql_severity_settings.png b/docs/developer/decision-records/2022-02-11-codeql/codeql_severity_settings.png
deleted file mode 100644
index 409c3866bc5..00000000000
Binary files a/docs/developer/decision-records/2022-02-11-codeql/codeql_severity_settings.png and /dev/null differ
diff --git a/docs/developer/decision-records/2022-02-11-codeql/security_permissions.png b/docs/developer/decision-records/2022-02-11-codeql/security_permissions.png
deleted file mode 100644
index 8a19a405073..00000000000
Binary files a/docs/developer/decision-records/2022-02-11-codeql/security_permissions.png and /dev/null differ
diff --git a/docs/developer/decision-records/2022-03-11-story-issues/README.md b/docs/developer/decision-records/2022-03-11-story-issues/README.md
deleted file mode 100644
index 1ffcec1f6af..00000000000
--- a/docs/developer/decision-records/2022-03-11-story-issues/README.md
+++ /dev/null
@@ -1,44 +0,0 @@
-# Creating "story" issues
-
-## Decision
-
-For larger features, for which it seems reasonable to create more than one issue, we will create overarching issues which we'll name "story issues". Every story issue has several sub-issues, a.k.a. sub-tasks.
-
-## Rationale
-
-There is no clear limit or rule that specifies the size at which issues need to be broken into sub-issues; rather, this should be done by intuition. Developers are expected to have an intrinsic understanding of the natural boundaries of features. In case of doubt, it is better to create several small issues and list them in an overarching story issue.
-
-Sub-issues may initially just be placeholders. That shows readers how the story is intended to be implemented and which aspects have been taken into account, even if certain details might still be missing.
-
-Finally, having small, focused issues and PRs makes them more digestible, which in turn will benefit the review process.
-
-## Approach
-
-Issues should be broken up if
-- there is more than one area of code involved (e.g. API, store,...)
-- work streams can be parallelized
-- there are orthogonal concerns
-
-The overarching story issues should contain a bulleted list with references to all sub-issues, for example:
-```markdown
-- [ ] #123
-- [ ] #456
-...
-```
-
-Please add the `"story"` label to the overarching story issue.
-
-This will cause GitHub to automatically track the sub-issues in the overarching story issue, and it will display a progress indicator on overview pages. More information can be found in the [documentation](https://docs.github.com/en/issues/tracking-your-work-with-issues/about-task-lists).
-
-For example, a story that requires some lib analysis or evaluation before the lib is adopted into the code base could be split into the following sub-tasks:
-- evaluate lib X, lib Y and lib Z, create ADR
-- implement adapter for lib
-- add overall system-test
-
-Generally speaking, there should be one PR per sub-issue.
diff --git a/docs/developer/decision-records/2022-07-06-release-automation/README.md b/docs/developer/decision-records/2022-07-06-release-automation/README.md
deleted file mode 100644
index ab82fcc96b2..00000000000
--- a/docs/developer/decision-records/2022-07-06-release-automation/README.md
+++ /dev/null
@@ -1,40 +0,0 @@
-# Automating the EDC release process
-
-## Decision
-
-We will use GitHub actions to automate the release of both `SNAPSHOT` and release versions to decrease the error surface.
-
-A dedicated `releases` branch will be created to host those.
-
-## Rationale
-
-This allows committers to simply trigger the creation of a version through GitHub actions and does not require logging in to our JIPP (EF Jenkins) instance and performing manual actions there.
-
-Having a dedicated branch for all releases will make it easier in the future to provide hotfixes for older versions.
-
-## Approach
-
-There will be a new branch called `releases`, which is only used to record the history of our releases, i.e. receive merge commits for every release. In addition, a new GitHub workflow will be created that:
-
-1. prompts the user to input a version string. This must be a SemVer string.
-2. creates a tag on `main` with that version string, e.g. `v0.0.1-SNAPSHOT`. This could be done automatically in step 5.
-3. creates a merge-commit `main`->`releases`, where the version string is used in the commit message
-4. triggers the release job on JIPP, supplying the version string as input parameter
-5. creates a GitHub Release
-
-The JIPP then builds and publishes the version to MavenCentral, or OSSRH Snapshots if the version string ends with `-SNAPSHOT`. For that, a new job will be created on Jenkins that does _not_ have a cron-based build trigger.
-
-## Future improvements
-
-- update `gradle.properties`: the GitHub action could commit the user input (version string) back into `gradle.properties`. That would result in an additional commit and was therefore left out for now.
-- bump version automatically: instead of manually entering a version, we could have an "auto-bump" feature that automatically increases the version in `gradle.properties`. This makes snapshots with metadata more difficult (e.g. `0.0.1-foobar-SNAPSHOT`), and was therefore skipped for now.
-- use Jenkins' GitHub hook trigger for GITScm polling: GitHub calls a WebHook in Jenkins, which then in turn one-time-polls the Git repo and triggers a build when changes are detected. This would get rid of the busy waiting of the GitHub Jenkins Action.
diff --git a/docs/developer/decision-records/2022-08-11-versioning_and_artifacts/README.md b/docs/developer/decision-records/2022-08-11-versioning_and_artifacts/README.md
deleted file mode 100644
index f1c8ce66b25..00000000000
--- a/docs/developer/decision-records/2022-08-11-versioning_and_artifacts/README.md
+++ /dev/null
@@ -1,51 +0,0 @@
-# Versioning concept for MVD
-
-## Decision
-
-We want to get rid of checking out other repositories during CI, building them locally and publishing them into the local Maven cache (`publishToMavenLocal`).
-
-## Rationale
-
-When building and/or running projects that use EDC (such as MVD), it is quite cumbersome and error-prone to check out a particular git ref in different projects and to build and publish them locally.
-
-We will therefore move toward a system where we use distributed Maven artifacts rather than local ones. This is less flexible than git refs, but at the same time improves coherence and setup speed.
-
-## General rules
-
-> for the sake of brevity, the term "our" refers to all original implementation projects inside the `eclipse-edc` org in [Github](https://github.com/eclipse-edc/). At the time of this writing that includes `DataSpaceConnector` (a.k.a. "EDC"), `RegistrationService` (a.k.a. "RS", [see Github](https://github.com/eclipse-edc/RegistrationService)), `IdentityHub` (a.k.a. "IH", [see Github](https://github.com/eclipse-edc/IdentityHub)), `MinimumViableDataspace` (a.k.a. "MVD", [see Github](https://github.com/eclipse-edc/MinimumViableDataspace)) and `FederatedCatalog` (a.k.a. "FC", [see Github](https://github.com/eclipse-edc/FederatedCatalog)).
-
-All "our" projects must
-
-- use Maven artifacts from MavenCentral or OSSRH Snapshots, both for local and CI builds
-- produce a new rolling `-SNAPSHOT` version based on their respective `main` branch every 30 minutes if there are changes. _This is already in place._
-- produce a nightly snapshot build containing the date in the metadata, in the format `X.Y.Z-YYYYMMDD-SNAPSHOT`. _This is already in place._
-
-## Specific rules for "our" dependent projects
-
-- publishing a new release in a dependency should also trigger a release in dependent projects with the same version string. E.g. building EDC -> triggers RS and IH.
-- all "our" dependent projects **must** maintain version consistency: for example, when RS and IH both reference EDC `0.0.1-some-fix-SNAPSHOT`, then MVD **must** reference that same version
-- version bumps must happen across all "our" repos: when RS upgrades to EDC `0.0.1-milestone-69`, then all other projects **must** follow suit.
-
-## During development of "our" dependent projects
-
-- the `main` branches of "our" dependent projects must always reference releases or "named" snapshots of EDC
-- in case a dependent project requires a change in EDC, they can temporarily use the rolling snapshot or nightly version of EDC including that fix, but EDC should release a "named" snapshot, e.g. `0.0.1-something-SNAPSHOT`, in a timely manner. From that time forward, that project will use `0.0.1-something-SNAPSHOT` on its `main` branch.
-- before merging the PR in the dependent project, there **must** be a named snapshot or release of EDC, which the dependent project references henceforth.
diff --git a/docs/developer/decision-records/2022-10-10-naming-conventions/README.md b/docs/developer/decision-records/2022-10-10-naming-conventions/README.md
deleted file mode 100644
index 5b5c5dc6025..00000000000
--- a/docs/developer/decision-records/2022-10-10-naming-conventions/README.md
+++ /dev/null
@@ -1,122 +0,0 @@
-# EDC Naming Conventions
-
-## Decision
-
-The naming of existing and future Java packages, Gradle modules, Maven artifacts, and configuration properties should follow defined naming conventions that align with the project's structure.
-
-Related decision records:
-- [2022-08-09 Project structure review](../2022-08-09-project-structure-review/)
-- [2022-08-11 Versioning and Artifacts](../2022-08-11-versioning_and_artifacts/)
-
-## Rationale
-
-A software project's structure should be designed to be as developer-friendly as possible, by following precisely defined rules based on established standards.
-
-Our goals in introducing naming conventions are:
-- Easy navigation of either Gradle modules, Maven artifacts or the Java package structure.
-- Unique naming of config properties (especially with regard to custom properties introduced by extensions).
-- Accordance with the Eclipse Foundation's guidelines and release processes.
-- Elimination of split packages.
-
-## Approach
-
-### Gradle module name
-
-Gradle modules must have unique names across the entire build classpath, even if they are located in different module paths. This is because of a bug in Gradle itself, where Gradle will erroneously report a "cyclic dependency" if this rule is violated. The following hypothetical example would constitute such a violation:
-
-```kotlin
-// settings.gradle.kts
-include(":core:common:transfer-process")
-include(":extensions:sql:transfer-process")
-```
-
-The EDC project has checks in place to make sure module IDs are unique.
-
-> Rule 1: Modules must have unique names.
-
-In addition, the _module name_ should give a hint about what is in the module, without being too verbose. The earlier example would be a bad one, because "transfer-process" does not indicate what the contents could be. This is especially important because we require Maven's _artifactId_ to be equal to the module name.
-
-Here are some bad examples:
-- `:core:common:transfer-process:validation`: bad because "validation" is likely not to be unique, and in isolation it only indicates that it has to do with validation, but not in _what context_.
-- `:core:dataplane:framework`: again, "framework" is liable to cause conflicts, and in addition, it's a very generic, unspecific term
-
-Refactoring these bad examples, we could arrive at these:
-- `:core:common:transfer-process:transfer-process-validation`: could contain validation functions
-- `:core:dataplane:dataplane-framework`: would contain default impls and platform code for the dataplane
-
-> Rule 2: Module names should indicate what the contents are.
-
-> Rule 3: Module names must be identical to the Maven artifactId (if one is published).
-
-### Maven artifactId
-
-The EDC project uses the same `groupId = "org.eclipse.edc"` across all sub-projects, which means all _artifactIds_ must be unique across multiple software components to avoid conflicts.
-
-> Rule 4: A Maven artifactId must be unique within the groupId.
-
-> Rule 5: A Maven artifactId must be identical to the module name (cf. [Rule 3](#gradle-module-name)).
-
-### Java package name
-
-Following Oracle's [Java package naming conventions](https://docs.oracle.com/javase/tutorial/java/package/namingpkgs.html), EDC's base package name is `org.eclipse.edc`, followed by domain and then function. For example, the Policy SPI should be located at `org.eclipse.edc.policy.spi` rather than `org.eclipse.edc.spi.policy`. Here, `policy` is the "domain", i.e. the thematic context, and `spi` would be the "function", i.e. what kind of package it is or what purpose it serves.
-
-The module name could be a helpful reference for the package name, replacing dashes with dots.
-
-Other positive examples would be:
-- `org.eclipse.edc.transferprocess.validation`: domain is `transferprocess`, and it's a package that deals with validation
-- `org.eclipse.edc.dataplane.framework.manager`: here the domain is `dataplane.framework`, so all code should be beneath that directory.
-
-This helps to avoid split packages.
-
-> Rule 6: Package names should first contain the domain, and then the function.
-
-### Configuration properties
-
-Configuration properties should have a unique prefix `edc` to avoid clashes when EDC gets embedded in other Java frameworks such as Spring. Further, the config property should contain the component for which it is valid, and a section in hierarchical structure (dot notation) indicating what the value is about.
-
-Bad:
-- `edc.retry.backoff.min`: does not contain the component, i.e. _which_ retry backoff min value is configured
-- `edc.retry.backoff`: does not contain the component, nor does it indicate _which_ value is configured, i.e. what data type is expected
-- `edc.core.retryBackoffMin`: is not hierarchically structured
-- `edc.core.system.threadpool-size`: missing part of the component name and therefore misleading, because it does not indicate _what_ threadpool we're configuring
-- `edc.dataplane.wait`: does not indicate which value is configured
-- `web.http.port`: not prefixed with `edc`, can lead to conflicts
-
-Better:
-- `edc.core.retry.backoff.min`
-- `edc.core.system.health.check.threadpool-size`
-- `edc.dataplane.queue.poll-timeout`
-- `edc.web.http.port`
-
-> Rule 7: Configuration properties are prefixed with `edc.`.
-
-> Rule 8: Configuration properties must contain the component in dotted notation to which they belong.
-
-> Rule 9: Configuration properties must indicate the value and datatype that they configure.
-
-
-## Implementation
-
-Renaming according to the defined [rules](#approach).
-- Check if _module name_ and _artifactId_ are unique and represent a concatenation in the correct order.
-- Modify the release process to use `org.eclipse.edc` as _groupId_.
-- Check every Java package and move classes if necessary.
-- Detect and resolve split packages.
-- Align existing configuration properties in the EDC project. Add a clear warning to extensions that fail to load a (new) value.
-
-Changes in the connector repository will affect downstream repositories; in addition, the conventions should also be implemented there.
\ No newline at end of file
diff --git a/docs/developer/decision-records/2022-10-20-trust-framework-adoption-repository/README.md b/docs/developer/decision-records/2022-10-20-trust-framework-adoption-repository/README.md
deleted file mode 100644
index cde6273b0c9..00000000000
--- a/docs/developer/decision-records/2022-10-20-trust-framework-adoption-repository/README.md
+++ /dev/null
@@ -1,32 +0,0 @@
-# Trust Framework Adoption Repository
-
-## Decision
-
-A new repository called Adoption Framework Repository will be created in the [EDC project](https://github.com/eclipse-edc). This repository will provide some generic extensions and documentation enabling enforcement of compliance with a trust framework. As compliance with Gaia-X was stated as a goal in the proposal on the part of the EDC project, the repository will cover the configuration of these generic extensions in order to comply with the [Gaia-X Trust Framework](https://gaia-x.eu/wp-content/uploads/2022/05/Gaia-X-Trust-Framework-22.04.pdf).
-
-## Rationale
-
-Enabling sovereign and secured data exchanges between companies implies compliance with a set of rules. This set of rules is known as the _trust framework_. Each trust framework defines its own rules, but they generally all fall under certain categories, such as identity enforcement or access control. Thus, it was decided to create a dedicated repository called _Trust Framework Adoption_, which will provide generic extensions and documentation for simplifying trust framework enforcement in the EDC components. These extensions will be packaged and published in the same way as the other EDC components.
-
-The Gaia-X Trust Framework is today the industry standard in terms of sovereign and trusted data exchanges, and defines a set of rules known as the Compliance Process. Every Gaia-X certified dataspace must be compliant with this Compliance Process in order to ensure trust and interoperability.
-
-In order to ease adoption of the EDC into Gaia-X dataspaces, it was also decided to provide the Gaia-X implementation/configuration of the above-mentioned generic extensions in the same repository. This approach makes it possible to make the EDC components compliant with the Gaia-X Trust Framework without writing any code.
-
-> ⚠️ The EDC project does not aim to support all future trust frameworks in a similar form and make them available as comparable artifact bundles. Compliance with Gaia-X was stated as a goal in the proposal on the part of the EDC project and takes a separate role in the consideration. Thus, only the Gaia-X flavour of the above-mentioned extensions will be created and maintained under the EDC project. Any other future framework would have to develop, maintain and publish its own extensions.
-
-## Approach
-
-A request will be submitted to the Eclipse Foundation asking for the creation of a new repository under the [EDC project](https://github.com/eclipse-edc). Then, all extensions and documentation related to compliance with trust frameworks will be placed into this repository.
diff --git a/docs/developer/decision-records/2022-10-21-gradle-versioncatalogs/README.md b/docs/developer/decision-records/2022-10-21-gradle-versioncatalogs/README.md
deleted file mode 100644
index 4a96de380af..00000000000
--- a/docs/developer/decision-records/2022-10-21-gradle-versioncatalogs/README.md
+++ /dev/null
@@ -1,81 +0,0 @@
-# Usage of Gradle Version Catalogs
-
-## Decision
-
-The EDC Build Plugin (currently under development) will provide a [Gradle Version Catalog](https://docs.gradle.org/7.4/userguide/platforms.html) that will contain versions of all the third-party libraries that are presently used in the EDC codebase.
-
-## Rationale
-
-The usage of version catalogs is expected to remove the possibility of version clashes between EDC and client projects, because it centralizes common definitions. It will also make transparent which versions of which libraries EDC is using internally, without having to look at the source code or perform a dependency inspection on the build level.
-
-There are in fact multiple scenarios where a benefit can be gained from version catalogs.
-
-1. EDC itself will be able to get rid of declaring all dependencies and accessing them through build properties (`val something : String by project`), because the version catalog is typed and accessible at configuration time. It is also hierarchical, so it is easy to navigate and access.
-
-2. Other EDC projects/components: here we'll mostly use the version catalog to enforce version consistency across multiple repos/projects. The version catalog is created centrally, and distributed through Gradle plugins. This also helps in keeping third-party libraries updated across multiple projects.
-
-3. Third-party client projects: various industry initiatives such as Catena-X would benefit from the version catalog, in that they do not have to look at EDC's source code in order to learn which version of which lib it uses, but can consult the version catalog distributed by the plugin. That will avoid version clashes with transitive dependencies, and reduce the amount of introspection and intricate knowledge necessary to use EDC.
-
-## Approach
-
-The EDC Build Plugin (under development) will declare and distribute the version catalog as part of its public API. It will contain all third-party libraries currently in use by EDC at the time of publication. For example, a structure similar to the following could emerge:
-
-```kotlin
-versionCatalogs {
-    create("edcext") { //extensions is a reserved keyword
-        library("azure-storage", "com.azure:azure-storage-blob:X.Y.Z")
-        library("azure-cosmos", "com.azure:azure-cosmos:X.Y.Z")
-        library("azure-resourcemanager", "com.azure:azure-resource-manager:X.Y.Z")
-        library("azure-resourcemanager-auth", "com.azure:azure-resource-manager-authorization:X.Y.Z")
-        // ...
-    }
-}
-```
-
-Version catalogs are lightweight, much more so than platforms (which actually influence the dependency graph), and they should be understood as the EDC project team's recommendation. Clients can then use the platform feature to apply them and restrict the dependency graph, or simply choose to override them at their own risk.
-
-### A word on naming
-
-Version catalogs automatically convert separated aliases into hierarchical structures, so `azure-resourcemanager-auth` would be converted into `azure.resourcemanager.auth`.
-As a general rule of thumb, those aliases should include the project name and the module that is being imported, for example Jackson: `com.fasterxml.jackson.core:jackson-annotations:X.Y.Z`:
-
-- **bad**: `jackson.core.annotations`: the `core` is not needed, as it's part of the group id and does not offer additional insights
-- **bad**: `fasterxml-jackson.annotations`: should avoid long project names, people would likely expect it to be `jackson` rather than `fasterxml-jackson`
-- **better**: `jackson.annotations`, `jackson.core`, `jackson.databind`, etc.
-
-### Guidelines on when to create new entries in the catalog
-
-As a general rule of thumb, a library should be included in the dependency version catalog when:
-
-- it is used in multiple modules in EDC
-- it is a technology dependency, such as Azure Blob Storage or Google Cloud Storage
-- it is an essential dependency, such as AssertJ, Mockito, etc.
-- there are known conflicts, vulnerabilities or inconsistencies, even between minor versions. Crypto-libraries sometimes are affected by this.
-
-## Nota Bene
-
-- Version catalogs will be implemented in the EDC Build Plugin first, and will be adopted in EDC at a later point in time
-- Version catalogs are still an incubating feature
\ No newline at end of file
diff --git a/docs/developer/decision-records/2022-11-28-release-management/README.md b/docs/developer/decision-records/2022-11-28-release-management/README.md
deleted file mode 100644
index 655415e9314..00000000000
--- a/docs/developer/decision-records/2022-11-28-release-management/README.md
+++ /dev/null
@@ -1,82 +0,0 @@
-# Release Management of Dataspace Components
-
-## Decision
-
-EDC's release management will undergo some refactoring/restructuring to be able to accommodate various requirements that arise from other projects as well as the distributed nature of the components themselves. The term "release management" solely refers to the [Jenkins build server](https://ci.eclipse.org/edc) we use, i.e. to the "delivery" part in "CI/CD".
-
-## Rationale
-
-The complexity of the EDC project has grown quite a bit over the past months, so that we now have these separate components, all of which are hosted in separate Git repositories.
-
-- Build: contains runtime metamodel and Gradle plugins
-- Connector (sometimes referred to as the "Core")
-- Federated Catalog
-- Identity Hub
-- Registration Service
-- (Minimum Viable Dataspace: does not publish any artifacts)
-
-These components have dependencies on one another, yet we will use one and the same version for all of them (cf. [decision record 2022-08-11](../2022-08-11-versioning_and_artifacts)). However, it has become apparent that having a common release strategy for all of them is necessary to avoid version clashes and maintain our development velocity.
-
-When we publish to MavenCentral, all components must be released with the same version. To avoid feature gaps between releases, we need to verify compatibility amongst the components on a daily basis.
-
-### Homogeneous releases
-
-By that we mean that all components should always have the same version number. That implies that every component depends on other components with the _same version number_. For example, version `0.0.3` of the Registration Service would depend on version `0.0.3` of the Runtime Metamodel, the Connector and the IdentityHub. In turn, that implies that before we can build and release Registration Service `0.0.3`, we **must** release that exact version of all the other components.
-Only then can we update the dependencies and start the release process.
-
-## Approach
-
-We will separate our releases into two major categories: _automatic_ and _manual_ releases. While the former is triggered either by an external system, such as GitHub, or by a cron job, the latter is done only on demand upon human interaction.
-
-### Automatic releases
-
-1. **`SNAPSHOT` builds** are created for every component. Every commit on the `main` branch of every component triggers a snapshot build.
-2. **Nightly components build**: a build job triggers `SNAPSHOT` releases of all components in sequence. The purpose of this is to verify that all components are still compatible with each other and to identify broken APIs/SPIs. This works because every component uses a `SNAPSHOT` version of the other components, and the components it depends on are built/released first.
-   As it stands, that sequence is `runtime-metamodel -> connector -> [federated-catalog, identity-hub] -> registration-service`.
-3. **Nightly tagged build**: after the "nightly components build" has successfully completed, every component releases a "tagged" version, i.e. a snapshot version with metadata that doesn't get overwritten, e.g. `0.0.1-20221128-SNAPSHOT`. In order to make it truly repeatable, every component must update its dependencies on the other components. The purpose of this is to allow for repeatable builds in client applications, while keeping feature gaps to a minimum.
-
-### Manual releases
-
-1. **Release-by-tag**: the build job lets the user select a Git tag and a `VERSION` string as input and builds the specified version based on the given tag.
-2. **Release-by-branch**: the build job lets the user select a Git branch and a `VERSION` string as input and builds the specified version based on the given branch.
-3. **Release components**: all components are built and released. For actual releases, i.e. artifacts that get published to MavenCentral, we need a build job that accepts the version as input parameter, and then triggers all downstream projects with that same version. This is similar to the "Nightly components build".
-
-## Implementation notes
-
-As many of the aforementioned build jobs are quite similar, we should try to create reusable pipelines in Jenkins. For example, a pipeline to build and release all the components is used in two contexts.
-
-One way of doing this is creating a parameterized job that accepts the Git repo, a version string and an optional git ref as input. So we have _one_ job that is invoked with different parameters for every component or release scenario.
-
-Then, once we have that modular pipeline in place, we can create _trigger pipelines_, i.e. pipelines with the sole purpose of triggering other jobs. They can also have post-build hooks, such as notifying Discord or sending emails.
diff --git a/docs/developer/decision-records/2023-01-26-release-process/README.md b/docs/developer/decision-records/2023-01-26-release-process/README.md
deleted file mode 100644
index 8148f15a9af..00000000000
--- a/docs/developer/decision-records/2023-01-26-release-process/README.md
+++ /dev/null
@@ -1,29 +0,0 @@
-# Release process
-
-## Decision
-
-The EDC release will happen using an ad-hoc root Gradle project that has all the components as subprojects. This needs to be done because the `gradle-nexus.publish-plugin` doesn't support publication of artifacts coming from separate repositories with the same `groupId`.
-
-## Rationale
-
-As the `publishToSonatype` Gradle task needs to run together with `closeAndReleaseSonatypeStagingRepository` (the task needed to close the staging repository and publish the artifacts to the release repository, Maven Central), we'd need to gather all the components in a single Gradle project and run the publish task from there. This approach can be used for "snapshot" and "nightly" releases as well.
-
-## Approach
-
-We'd need two new repositories under the `eclipse-edc` organization, which could be named, according to the convention:
-- `JenkinsPipelines`: it will contain all the Jenkins pipeline files that are consumed by [our Jenkins instance](https://ci.eclipse.org/edc); currently they are stored in a committer's personal GitHub account.
-- `Release`: it will contain the "root release project" and the script needed to prepare it for a version release; currently these are stored in a committer's personal GitHub account.
-
-On Jenkins there will be a single job that builds and releases the current main branches given a version. At the end, if the version doesn't end with the `-SNAPSHOT` postfix, a GitHub release will be created on every component's repository.
-
-That single job would then be used for official releases, snapshots and nightly releases.
diff --git a/docs/developer/decision-records/2023-02-10-nightly-builds/README.md b/docs/developer/decision-records/2023-02-10-nightly-builds/README.md
deleted file mode 100644
index d9d5d046a7d..00000000000
--- a/docs/developer/decision-records/2023-02-10-nightly-builds/README.md
+++ /dev/null
@@ -1,29 +0,0 @@
-# Nightly builds become release versions
-
-## Decision
-
-Nightly builds will henceforth be published as release versions (as opposed to snapshots) and will be published to OSSRH Releases. Only major releases, such as milestones, will get published to MavenCentral.
-
-## Rationale
-
-Downstream projects may want to track the EDC features as closely as possible while still maintaining a consistent and repeatable build pipeline. EDC's release train is roughly six to eight weeks, which may be too sparse for downstream projects' development schedules.
-
-To that end, the only option currently is to use nightlies, but they are snapshots, and snapshot builds are ill-suited because they are not permanent and can be overwritten or deleted at any time.
-
-## Approach
-
-- Publish all snapshots to [OSSRH Snapshots](https://oss.sonatype.org/content/repositories/snapshots/)
-- Make nightly versions releases, e.g. `0.0.1-20230210`
-- Publish all release versions to [OSSRH Releases](https://oss.sonatype.org/content/repositories/releases/)
-- Only publish major releases (e.g. milestones) to MavenCentral
-
-The reason for using OSSRH Releases instead of MavenCentral for nightly releases is simply that we do not want to spam MavenCentral with ~300 artifacts on a daily basis, which would offer little value to the larger community.
-
-The build plugin needs to be adapted to publish to OSSRH Releases by default. Further, we need to implement a separate, additional task that allows publishing to MavenCentral, which is invoked from a (new) Jenkins job.
diff --git a/docs/developer/decision-records/2023-03-15_repository_split/README.md b/docs/developer/decision-records/2023-03-15_repository_split/README.md
deleted file mode 100644
index ab6266d5ce5..00000000000
--- a/docs/developer/decision-records/2023-03-15_repository_split/README.md
+++ /dev/null
@@ -1,79 +0,0 @@
-# Splitting the Connector repository
-
-## Decision
-
-The Connector repository will be split up into a "core" repository and technology repositories. Extensions that are specific to a particular cloud environment will get moved out into new repositories, to which we will refer as "technology repos". All technology-specific extensions of all the other repos (FederatedCatalog, IdentityHub, RegistrationService) will be moved out as well.
-
-The Connector repository will _not_ get renamed, but it can henceforth be referred to as "core" or "connector-core".
-
-## Rationale
-
-Many third-party technology extensions are very specific to a particular cloud provider, and cannot reasonably be used across multiple clouds; thus, they depend on specific libraries/SDKs, and testing them may even require an account with that cloud provider.
-
-The reasoning for moving these kinds of extensions out to separate repositories is as follows:
-
-- keeping the core small and agile: building and running it should be fast and efficient, and no third-party account should be required.
-- simplifying the core's CI builds: the build should be self-contained, using GitHub services, which are docker containers. Checking for the presence of account keys or other service credentials in CI builds is thus obsolete.
-- reducing build time: not having to build and test many extensions every time will significantly improve the developer experience, because build times will be shorter.
-- decoupling responsibilities: contributing in one repository requires only specific knowledge about that particular tech, as opposed to an understanding of the _entire_ code base. Maintainer teams with specific knowledge can be established.
-- different contribution criteria: while the criteria for adoption in the core should be kept very strict, the same might not necessarily be true for technology repos. There, features could get adopted simply for the sake of having them, as long as they are properly maintained. While in the core there would have to be a specific reason to _adopt_ a feature, in the technology repos there would have to be a specific reason to _deny_ it.
-- decoupling of lifecycles: archiving/abandoning a particular technology repo would not influence the core at all, should we ever need to do that.
-
-## Approach
-
-At the time of this writing, three technology repositories can be identified:
-
-- `Technology-Azure`
-- `Technology-Aws`
-- `Technology-Gcp`
-
-Once these repositories are created, all extensions that contain code for cloud-specific services will be moved out accordingly. Initially, this will just be a code dump in the technology repos, but all further development will then happen there. To maintain consistency, all specific cloud-provider extensions will go in the respective technology repo, e.g. for Azure: KeyVault, BlobStore, CosmosDB etc.
-
-### Continuous Integration
-
-Technology repos should maintain, as repo secrets, all access keys, service credentials etc. that are needed to run integration tests against a live testing environment. For example, running i-tests against an actual CosmosDB instance requires a secret key. These secrets are kept per repository, as opposed to per GitHub org.
-
-CI workflows must take that into account, and also consider a situation where they are run from a fork.
-
-### Version Catalogs
-
-Technology repositories should publish their own version catalogs. All technology-specific entries will be removed from the core's version catalog.
-
-## Further Consequences
-
-While the contribution standards for the Connector core will remain high (if anything, they will get raised even more), technology repositories may have different contribution guidelines. In technology repos, "feature completeness" could be a sufficient justification to adopt a feature. Cloud providers may want to create a way to run a connector exclusively using their technology, and for that, they may want a wide variety of services.
-
-Every technology repository should define a team of contributors, who primarily take care of maintaining the code in the repo.
-
-If a technology repo is not properly maintained, and is not ready for release at a predetermined time, the project committee may elect to omit it from the release.
-
-When a repository loses its maintainers, or development becomes otherwise stale, the project committee can elect to archive the repository after an appropriate notice period.
\ No newline at end of file
diff --git a/docs/developer/decision-records/2023-03-31-version-catalog-per-component/README.md b/docs/developer/decision-records/2023-03-31-version-catalog-per-component/README.md
deleted file mode 100644
index 7f72ea7cf1c..00000000000
--- a/docs/developer/decision-records/2023-03-31-version-catalog-per-component/README.md
+++ /dev/null
@@ -1,33 +0,0 @@
-# Version Catalog per Component
-
-## Decision
-
-Every component will have its own version catalog.
-
-## Rationale
-
-At the moment we have a single catalog located in the `GradlePlugins` repository and used by every component. This is not sustainable anymore, because a version update in that repository could cause breaking changes in the other ones that will pop up only in the nightly build. Plus, the `Release` repository has to define these catalogs manually, because they aren't published.
-
-## Approach
-
-We should keep a "base" `edc-versions` catalog in `GradlePlugins` containing only the general-purpose dependencies like `runtime-metamodel`, `jackson`, `junit` ...; in other words, the ones that are automatically injected into every module by the `DefaultDependencyConvention` in the `edc-build` plugin.
-
-This `edc-versions` catalog could also be injected automatically by the plugin.
-
-Then every component will have its own version catalog, containing the version of the main `edc-versions` catalog plus only the versions that it's actually using, defined in the `gradle/lib.versions.toml`. This catalog will be published to Maven with the artifactId in the format `<component>-versions`, like:
-- `connector-versions`
-- `identity-hub-versions`
-- `registration-service-versions`
-- `federated-catalog-versions`
-
-These three will be defined once we migrate the cloud-service-dependent dependencies to the respective repositories:
-- `technology-aws-versions`
-- `technology-azure-versions`
-- `technology-gcp-versions`
-
diff --git a/docs/developer/decision-records/2023-05-17-release-process/README.md b/docs/developer/decision-records/2023-05-17-release-process/README.md
deleted file mode 100644
index 3998421a44e..00000000000
--- a/docs/developer/decision-records/2023-05-17-release-process/README.md
+++ /dev/null
@@ -1,24 +0,0 @@
-# Release process
-
-## Decision
-
-We will start to release more often, changing the versioning convention.
-
-## Rationale
-
-6-week (if not longer) release cycles proved not really suitable for downstream projects, because they forced them to either use nightly builds (which are SNAPSHOTs) or wait for the next release. A shorter release cycle would solve this issue.
-
-## Approach
-
-We will drop the `-milestone-#` suffix from the release version, moving to a standard SemVer version number like `X.Y.Z`.
- -The SemVer specification won't be fully followed up for the moment, the approach followed will be: -- `X` will remain 0 until the end of the "incubation phase" -- `Y` will change on every release -- `Z` will remain 0 (could be used for bugfixes, please read below) -- releases will be created from `main` branch -- bugfix versions may be released for a specific X.Y version at the discretion of the committers. -- a new version release will need an agreement in the committer group before being triggered -- we won't follow backward compatibility: any release could bring in breaking changes diff --git a/docs/developer/decision-records/2023-05-23-java-17-baseline/README.md b/docs/developer/decision-records/2023-05-23-java-17-baseline/README.md deleted file mode 100644 index 7dfaa47fd87..00000000000 --- a/docs/developer/decision-records/2023-05-23-java-17-baseline/README.md +++ /dev/null @@ -1,16 +0,0 @@ -# Java 17 baseline - -## Decision - -We will use Java 17 as baseline version. - -## Rationale - -Java 11 active support [will end in September 2023](https://endoflife.date/java), and, following Java "new" release cycle, we should update the baseline -version to the current LTS from time to time. -Other OSS frameworks as [Spring already did that](https://spring.io/blog/2021/09/02/a-java-17-and-jakarta-ee-9-baseline-for-spring-framework-6) -already did that. - -## Approach - -Just update the `javaLanguageVersion` property of the gradle `BuildExtension`. diff --git a/docs/developer/decision-records/2023-05-25-template-repository/README.md b/docs/developer/decision-records/2023-05-25-template-repository/README.md deleted file mode 100644 index eab9920c2b8..00000000000 --- a/docs/developer/decision-records/2023-05-25-template-repository/README.md +++ /dev/null @@ -1,93 +0,0 @@ -# Template and `.github` Repositories - -## Decision - -Two new repositories will be created within the EDC GitHub organization: -- `.github` repository for common files that are the _same_ for all components, -- dedicated [template repositories](https://docs.github.com/en/repositories/creating-and-managing-repositories/creating-a-template-repository) - for files that must be _adapted/modified_ by each component. - -## Rationale - -Currently, general documents like the `pr_etiquette.md` or `CONTRIBUTING.md` -are located in the Connector repository, although they apply to all repositories in the scope of the -EDC project. In addition, resources like style files or issue/pr templates are duplicated across -repositories and, with this, are not up-to-date, as most of the changes to those documents are made -in the Connector repo and are not replicated across the other component repos. - -The reasoning for moving these documents out to separate repositories is as follows: -- simplify maintenance of files -- facilitate the onboarding for new community members -- improve automation of GitHub processes -- harmonize creation process of new EDC repositories - -## Approach - -At the time of this writing, two repositories can be identified: -- `.github` -- `template-basic` - -We need to identify "common" and "repository-specific" documents. In addition, we need to test how the -`.github` repo and the repositories' `.github` folders relate to each other. 
- -### Common documents - -Suggested structure for `.github` repository: -- `.github/`: - - [ISSUE_TEMPLATE](https://github.com/eclipse-edc/Connector/tree/main/.github/ISSUE_TEMPLATE) and [PULL_REQUEST_TEMPLATE.md](https://github.com/eclipse-edc/Connector/blob/main/.github/PULL_REQUEST_TEMPLATE.md) - - generic workflows, e.g. - - [scan-pull-request.yaml](https://github.com/eclipse-edc/Connector/blob/main/.github/workflows/scan-pull-request.yaml), - - [first-interaction.yml](https://github.com/eclipse-edc/Connector/blob/main/.github/workflows/first-interaction.yml), - - [close-inactive-issues.yml](https://github.com/eclipse-edc/Connector/blob/main/.github/workflows/close-inactive-issues.yml) - - `release-.yml` - - ... -- `contributing/`: - - [CONTRIBUTING.md](https://github.com/eclipse-edc/Connector/blob/main/CONTRIBUTING.md) - - [contribution_categories.md](https://github.com/eclipse-edc/Connector/blob/main/contribution_categories.md) - - [known_friends.md](https://github.com/eclipse-edc/Connector/blob/main/known_friends.md) - - [pr_etiquette.md](https://github.com/eclipse-edc/Connector/blob/main/pr_etiquette.md) - - [styleguide.md](https://github.com/eclipse-edc/Connector/blob/main/styleguide.md) - - ... -- `docs/`: same as for every repo - - `developer/`: generic documentation files from [docs/developer](https://github.com/eclipse-edc/Connector/tree/main/docs/developer) - - `decision-records/`: those that cover the EDC project from [docs/developer/decision-records](https://github.com/eclipse-edc/Connector/tree/main/docs/developer/decision-records) - - `legal/`: files from [docs/legal](https://github.com/eclipse-edc/Connector/tree/main/docs/legal) - - `templates/`: files from [docs/templates](https://github.com/eclipse-edc/Connector/tree/main/docs/templates) - - ... -- `profile/README.md`: provide a [welcome page](https://github.blog/changelog/2021-09-14-readmes-for-organization-profiles/) -- `resources/`: move files like [edc-checkstyle-config.xml](https://github.com/eclipse-edc/Connector/blob/main/resources/edc-checkstyle-config.xml) -- `README.md` -- ... 
-
-Impacts on component repositories:
-- remove generic workflow files
-- remove listed contributing files
-- remove decision records that affect the general EDC project, e.g., the org-wide release process
-- remove generic resources
-- keep component-specific decision records in component repositories
-
-### Repository-specific documents
-
-Suggested basic structure for new repositories:
-- `.github/`: e.g., [dependabot.yml](https://github.com/eclipse-edc/Connector/blob/main/.github/dependabot.yml) _(to be identified, as mentioned above)_
-- `docs/developer/decision-records/README.md` _(empty list)_
-- [.gitattributes](https://github.com/eclipse-edc/Connector/blob/main/.gitattributes)
-- [.gitignore](https://github.com/eclipse-edc/Connector/blob/main/.gitignore) _(empty)_
-- [CODEOWNERS](https://github.com/eclipse-edc/Connector/blob/main/CODEOWNERS) _(empty list)_
-- [LICENSE](https://github.com/eclipse-edc/Connector/blob/main/LICENSE)
-- [NOTICE.md](https://github.com/eclipse-edc/Connector/blob/main/NOTICE.md) _(empty list)_
-
-
-## Further Considerations
-
-Repositories that could be identified in the future:
-- `template-gradle`
-
-With additional files (matching Java/Gradle):
-- `gradle/`
-- [build.gradle.kts](https://github.com/eclipse-edc/Connector/blob/main/build.gradle.kts)
-- [gradle.properties](https://github.com/eclipse-edc/Connector/blob/main/gradle.properties)
-- [gradlew](https://github.com/eclipse-edc/Connector/blob/main/gradlew)
-- [gradlew.bat](https://github.com/eclipse-edc/Connector/blob/main/gradlew.bat)
-- [settings.gradle.kts](https://github.com/eclipse-edc/Connector/blob/main/settings.gradle.kts)
-
diff --git a/docs/developer/decision-records/2023-06-19-new-issue-triage-process/README.md b/docs/developer/decision-records/2023-06-19-new-issue-triage-process/README.md
deleted file mode 100644
index 7be6cc09acd..00000000000
--- a/docs/developer/decision-records/2023-06-19-new-issue-triage-process/README.md
+++ /dev/null
@@ -1,33 +0,0 @@
-# Implementation of a new triage process
-
-## Decision
-
-All issues of the EDC must go through a triage process before being accepted into release planning. To that end, all new
-issues will be auto-labelled with a new `triage` label. Bug reports and feature requests will also be
-labelled `bug_report` and `feature_request` accordingly.
-
-## Rationale
-
-Newly created issues and bug reports must go through vetting by the technical committee before they can get accepted
-into release planning.
-
-At times, bugs have been reported that turn out not to be bugs, or have nothing to do with the EDC code base, e.g. local
-configuration issues. These new labels make a clear distinction between bug/feature _requests_, and things that are actually
-planned by the EDC committers.
-
-## Approach
-
-- The committers go through all issues labelled `triage` every once in a while.
-  > _The EDC technical committee explicitly **does not** commit to
-  a particular time frame for this!_
-- The auto-close bot will be adapted in the following aspects:
-  - it only considers issues labelled with `triage`
-  - issues are marked `stale` after 28 days
-  - stale issues are closed after another 14 days
-- Issue templates must be adapted to auto-assign the `triage` and `bug_report`/`feature_request` labels
-- After triage, the `bug_report`/`feature_request` label is replaced with the existing labels, and the `triage` label is
-  removed. Typically, triaged issues should have an assignee and a target milestone, or should be closed with an
-  appropriate label, e.g. `Won't fix`.
In case more information is needed from the reporter, the `question` label is
-  applied, which resets the stale counter.
-- Committers reserve the right to bypass the triage process to expedite urgent features or bug fixes.
-
diff --git a/docs/developer/decision-records/2023-06-19-onboarding-contributors/README.md b/docs/developer/decision-records/2023-06-19-onboarding-contributors/README.md
deleted file mode 100644
index 0960c3be9bb..00000000000
--- a/docs/developer/decision-records/2023-06-19-onboarding-contributors/README.md
+++ /dev/null
@@ -1,45 +0,0 @@
-# Changes to onboarding contributors
-
-## Decision
-
-Becoming a _"contributor"_ in the project will now require a proven track record of continuous contributions. The list
-of _"contributors"_ will be checked regularly for active participation, and inactive members will get removed.
-
-NB: the term _"contributor"_ refers to a specific role in the GitHub org defined by Eclipse, not in the loosely defined
-GitHub sense of "person who has contributed once".
-
-## Rationale
-
-Contributors have certain rights, like assigning issue labels, that can have a significant impact on the lifecycle of an
-issue. For example, a contributor could remove the `triage` label (
-see [this decision-record](../2023-06-19-new-issue-triage-process)) and thus schedule an issue for planning. At the very
-least that causes additional work for the technical committee.
-Combined with the fact that up until now there was no real process to become a contributor, that could cause confusion and
-additional effort.
-
-So in order to become a contributor, there now has to be a sustained stream of valuable contributions in the form of
-pull-requests, and beyond that, the contributor-to-be has to show that they'll be a valuable addition to the group, e.g.
-by participating in discussions, answering questions etc. It is generally assumed that the project guidelines
-regarding [pull requests](../../../../pr_etiquette.md), [code style](../../../../styleguide.md), [coding
-principles](../../architecture/coding-principles.md) etc. are adhered to.
-
-The intention behind this is not to make it unduly hard for people to participate in the project ("gate-keeping"), but
-rather to groom an active community that gains insight and knowledge about EDC, and where members can - and are willing
-to - contribute back into the community.
-
-While pull requests are certainly the most important and impactful way of contributing, there are others, like
-participating in discussions, creating meaningful and well-formulated issues, etc. that will be taken into
-consideration.
-
-If active participation of a contributor stagnates or ceases completely, that contributor may get removed again without
-prior notice. Both adding and removing happens at the discretion of the technical committee of EDC.
-
-> Being a contributor first is a precondition to becoming a committer!
-
-## Approach
-
-- Adding and removing contributors is done by the technical committee in one of its regular meetings
-- There is no hard minimum number of contributions; approval happens at the discretion of the committee. There also is
-  no guaranteed time frame for a decision.
-- After this decision-record gets accepted, the technical committee will evaluate the current list of contributors, and - perform an initial clean-up diff --git a/docs/developer/decision-records/README.md b/docs/developer/decision-records/README.md index 029975ea467..7f5319250de 100644 --- a/docs/developer/decision-records/README.md +++ b/docs/developer/decision-records/README.md @@ -3,17 +3,9 @@ - [2022-02-03 Integration Testing](2022-02-03-integration-testing/) - [2022-02-07 Micrometer Metrics](2022-02-07-micrometer-metrics/) - [2022-02-07 Tracing](2022-02-07-tracing/) -- [2022-02-10 Code Coverage](2022-02-10-code-coverage/) - - [Jacoco](2022-02-10-code-coverage/jacoco.md) - - [Jacoco with Codacy](2022-02-10-code-coverage/codacy.md) - - [Jacoco with Codecov](2022-02-10-code-coverage/codecov.md) - - [Jacoco with GitHub Actions](2022-02-10-code-coverage/jacoco_github_action.md) - - [Jacoco with Sonarqube](2022-02-10-code-coverage/sonarqube.md) -- [2022-02-11 CodeQL](2022-02-11-codeql/) - [2022-02-14 Helm Chart](2022-02-14-helm-chart/) - [2022-03-01 Serverless Transfers](2022-03-01-serverless-transfers/) - [2022-03-02 Performance Tests](2022-03-02-performance-tests/) -- [2022-03-11 Story Issues](2022-03-11-story-issues/) - [2022-03-14 Cloud Testing](2022-03-14-cloud-testing/) - [2022-03-14 Dependency Analysis](2022-03-14-dependency-analysis/) - [2022-03-15 Policy Scopes](2022-03-15-policy-scopes/) @@ -26,7 +18,6 @@ - [2022-06-19 Json Web Token](2022-06-19-json-web-token/) - [2022-07-04 Type Manager](2022-07-04-type-manager/) - [2022-07-05 IDS requests pagination](2022-07-05-ids-requests-pagination/) -- [2022-07-06 Release automation](2022-07-06-release-automation/) - [2022-07-22 Simplify FCC](2022-07-22-simplify-fcc/) - [2022-07-27 Custom DTO validation](2022-07-27-custom-dto-validation/) - [2022-07-28 TransferProcess "Provisioning Requested"](2022-07-28-transfer-process-new-state/) @@ -34,35 +25,21 @@ - [2022-08-04 Async Code Testing Practices](2022-08-04-async-code-testing-practices/) - [2022-08-04 Automated Documentation](2022-08-04-documentation-automation/) - [2022-08-09 Project structure review](2022-08-09-project-structure-review/) -- [2022-08-11 Versioning and Artifacts](2022-08-11-versioning_and_artifacts/) - [2022-08-17 Remove H2 Database Tests](2022-08-17-remove_h2_database_tests/) - [2022-09-18 IDS catalog request filtering](2022-09-18-ids-catalog-request-filtering/) - [2022-09-23 Extract metamodel and autodoc](2022-09-23-extract-metamodel-and-autodoc/) - [2022-09-29 Sql Query Streaming](2022-09-29-sql-query-streaming/) -- [2022-10-10 EDC Naming Conventions](2022-10-10-naming-conventions/) -- [2022-10-20 Trust Framework Adoption repository](2022-10-20-trust-framework-adoption-repository/) -- [2022-10-21 Gradle Version Catalogs](2022-10-21-gradle-versioncatalogs/) - [2022-10-31 Service Layer](2022-10-31-aggregate-service-layer/) - [2022-11-09 API Restructuring](2022-11-09-api-refactoring/) -- [2022-11-28 Release Management](2022-11-28-release-management/) - [2022-12-05 EDC Http Client](2022-12-05-edc-http-client/) - [2022-12-07 Transaction Synchronization](2022-12-07-transaction-synchronization/) -- [2023-01-26 Release Process](2023-01-26-release-process/) -- [2023-02-10 Nightly Builds](2023-02-10-nightly-builds/) - [2023-02-22 Contract Definition Validation](2023-02-22-contract-definition-validation/) - [2023-02-22 Update Entities](2023-02-22-update-entities/) - [2023-02-27 Dataspace Protocol TransferProcess state 
transitions](2023-02-27-dataspace-protocol-transferprocess-state-transitions)
- [2023-03-02 Entity Store Refactoring](2023-03-02_entity_store_refactoring)
- [2023-03-09-Event Framework Refactoring](2023-03-09-event-framework-refactoring)
-- [2023-03-15-Repository_Split](2023-03-15_repository_split/)
-- [2023-03-31-Version Catalog per Component](2023-03-31-version-catalog-per-component)
- [2023-04-18-API-controllers-testing](2023-04-18-api-controllers-testing)
-- [2023-05-17-Release-process](2023-05-17-release-process)
- [2023-05-17-Helm-charts](2023-05-17-delete-helm-charts)
-- [2023-05-23-Java-17-baseline](2023-05-23-java-17-baseline)
-- [2023-05-25-template-repository](2023-05-25-template-repository)
- [2023-06-02-separating_plugins_and_metamodel](2023-06-02-separating_plugins_and_metamodel)
- [2023-06-05-validation-engine](2023-06-05-validation-engine)
-- [2023-06-19-change-of-github-labels](2023-06-19-new-issue-triage-process/)
-- [2023-06-19-onboarding-contributors](2023-06-19-onboarding-contributors/)
- [2023-06-30-allow-use-of-testcontainers](2023-06-30-allow-use-of-testcontainers/)
diff --git a/docs/developer/dependency_resolution.md b/docs/developer/dependency_resolution.md
deleted file mode 100644
index 76b3e0d6f3d..00000000000
--- a/docs/developer/dependency_resolution.md
+++ /dev/null
@@ -1,274 +0,0 @@
-# Dependency resolution in the EDC
-
-The code base of the Eclipse Dataspace Connector is architected in a way that allows for easily extending and swapping
-the core functionality using certain plug-points called _extensions_. One example would be to swap out an in-memory
-implementation of a datastore for one backed by an actual database. In order to achieve that, there are several key
-components:
-
-- a service interface, typically located in an SPI module
-- a module providing the implementation, typically located in the `extensions` directory
-- the service registry, i.e. the `ServiceExtensionContext`. Since it is not quite an IoC container, we'll henceforth
-  refer to it as the "context".
-- a hook point into the loading sequence: an extension that instantiates and registers the implementation class with the
-  context
-
-## Registering a service implementation
-
-As a general rule, the module that provides the implementation should also register it with the `ServiceExtensionContext`.
-This is done in an accompanying service extension. For example, providing a CosmosDB-based implementation for
-a `FooStore` (stores `Foo` objects) would require the following classes:
-
-1. A `FooStore.java` interface, located in SPI:
-   ```java
-   public interface FooStore {
-       void store(Foo foo);
-   }
-   ```
-2. A `CosmosFooStore.java` class implementing the interface, located in `:extensions:azure:cosmos:foo-store-cosmos`:
-   ```java
-   public class CosmosFooStore implements FooStore {
-       @Override
-       public void store(Foo foo) {
-           // ...
-       }
-   }
-   ```
-3. A `CosmosFooStoreExtension.java`, also located in `:extensions:azure:cosmos:foo-store-cosmos`. Must be accompanied by
-   a _"provider-configuration file"_ as required by
-   the [`ServiceLoader` documentation](https://docs.oracle.com/javase/8/docs/api/java/util/ServiceLoader.html). Code
-   examples will follow below.
-
-### Option 1: use `@Provider` methods (recommended)
-
-Every `ServiceExtension` may declare methods that are annotated with `@Provider`, which tells the dependency resolution
-mechanism that this method contributes a dependency into the context. This is very similar to other DI containers, e.g.
-Spring's `@Bean` annotation.
It looks like this:
-
-```java
-public class CosmosFooStoreExtension implements ServiceExtension {
-
-    @Override
-    public void initialize(ServiceExtensionContext context) {
-        // ...
-    }
-
-    //Example 1: no args
-    @Provider
-    public SomeService provideSomeService() {
-        return new SomeServiceImpl();
-    }
-
-    //Example 2: using context
-    @Provider
-    public FooStore provideFooStore(ServiceExtensionContext context) {
-        var setting = context.getSetting("...", null);
-        return new CosmosFooStore(setting);
-    }
-}
-```
-
-As the previous code snippet shows, provider methods may have no args, or a single argument, which is
-the `ServiceExtensionContext`. There are a few other restrictions too; violating them will raise an exception. Provider
-methods must:
-
-- be public
-- return a value (`void` is not allowed)
-- either have no arguments, or a single `ServiceExtensionContext`.
-
-Having a provider method is equivalent to invoking `context.registerService(SomeService.class, new SomeServiceImpl())`. Thus,
-the return type of the method defines the service `type`, and whatever is returned by the provider method determines the
-implementation of the service.
-
-**Caution**: there is a slight difference between declaring `@Provider` methods and
-calling `context.registerService(...)` with respect to sequence: the DI loader mechanism _first_
-invokes `ServiceExtension#initialize()`, and
-_then_ invokes all provider methods. In most situations this difference is negligible, but there could be situations
-where this matters.
-
-#### Provide "defaults"
-
-Where `@Provider` methods really come into their own is when providing default implementations, i.e. fallbacks. For
-example, going back to our `FooStore` example, there could be an extension that provides a
-default (= in-memory)
-implementation:
-
-```java
-public class DefaultsExtension implements ServiceExtension {
-
-    @Override
-    public void initialize(ServiceExtensionContext context) {
-        // ...
-    }
-
-    @Provider(isDefault = true)
-    public FooStore provideDefaultFooStore() {
-        return new InMemoryFooStore();
-    }
-}
-```
-
-Provider methods configured with `isDefault = true` are only invoked if the respective service (here: `FooStore`) is not
-provided by any other extension.
-
-> Default provider methods are a tricky topic, please be sure to thoroughly read the additional documentation about
-> them [here](default_provider_methods.md)!
-
-### Option 2: register manually
-
-Of course, it is also possible to manually register services by invoking the respective method on
-the `ServiceExtensionContext`:
-
-```java
-
-@Provides(FooStore.class/*, possibly others*/)
-public class CosmosFooStoreExtension implements ServiceExtension {
-
-    @Override
-    public void initialize(ServiceExtensionContext context) {
-        var setting = context.getSetting("...", null);
-        var store = new CosmosFooStore(setting);
-        context.registerService(FooStore.class, store);
-    }
-}
-```
-
-There are three important things to mention:
-
-1. the call to `context#registerService` makes the object available in the context. From this point on, other extensions
-   can inject a `FooStore` (and in doing so will receive a `CosmosFooStore`).
-2. declaring the exposed interface in the `@Provides()` annotation is required, as it helps the extension loader define
-   the order in which it needs to initialize extensions
-3. service registrations **must** be done in the `initialize()` method.
-
-## Injecting a service
-
-Services should only be referenced by the interface they implement.
This will keep dependencies clean and maintain
-extensibility and modularity. Say we have a `FooMaintenanceService` that receives `Foo` objects over an arbitrary
-network channel and stores them.
-
-### Option 1: use `@Inject` to declare dependencies (recommended)
-
-```java
-public class FooMaintenanceService {
-    private final FooStore fooStore;
-
-    public FooMaintenanceService(FooStore fooStore) {
-        this.fooStore = fooStore;
-    }
-}
-```
-
-Note that the example uses what we call _constructor injection_ (even though nothing is actually _injected_), because
-that is needed for object construction, and it increases testability. Also, those types of class fields should be
-declared `final` to avoid programming errors.
-
-In contrast to conventional DI frameworks, the `fooStore` dependency won't get auto-injected - rather, there has to be
-another `ServiceExtension` that has a reference to the `FooStore` and that constructs the `FooMaintenanceService`:
-
-```java
-public class FooMaintenanceExtension implements ServiceExtension {
-    @Inject
-    private FooStore fooStore;
-
-    @Override
-    public void initialize(ServiceExtensionContext context) {
-        var service = new FooMaintenanceService(fooStore); //use the injected field
-    }
-}
-```
-
-The `@Inject` annotation on the `fooStore` field tells the extension loading mechanism that `FooMaintenanceExtension`
-depends on a `FooStore`, and because of that, any provider of a `FooStore` must be initialized _before_
-the `FooMaintenanceExtension`. The fact that `CosmosFooStoreExtension` provides a `FooStore` is declared using
-the `@Provides` annotation.
-
-### Option 2: use `@Requires` to declare dependencies
-
-In cases where defining a field seems unwieldy or is simply not desirable, we provide another way to dynamically resolve
-services from the context:
-
-```java
-
-@Requires({ FooStore.class, /*maybe others*/ })
-public class FooMaintenanceExtension implements ServiceExtension {
-
-    @Override
-    public void initialize(ServiceExtensionContext context) {
-        var fooStore = context.getService(FooStore.class);
-        var service = new FooMaintenanceService(fooStore); //use the resolved object
-    }
-}
-```
-
-The important thing to mention is that `@Requires` is absolutely necessary to inform the service loader about the
-dependency. Failing to add it may result in exceptions, and in further consequence, in
-an `EdcInjectionException`.
-
-Options 1 and 2 are almost semantically equivalent, with the small exception of optional dependencies:
-while `@Inject(required = false)` allows for nullable dependencies, `@Requires` has no such option and the service
-dependency must be resolved explicitly using a boolean parameter: `context.getService(FooStore.class, true)`.
-
-## Extension initialization sequence
-
-The extension loading mechanism uses a two-pass procedure to resolve dependencies. First, all implementors
-of `ServiceExtension` are instantiated using their public default constructor, put in a list and sorted using a
-topological sort algorithm based on their dependency graph. Cyclic dependencies would be reported in this stage.
-
-Second, each extension is initialized by setting all fields annotated with `@Inject` and by calling its `initialize()`
-method. This implies that every extension can assume that by the time its `initialize()` method executes, all its
-dependencies are already instantiated and registered, because the extensions providing them were ordered at previous
-positions in the list, and thus have already been initialized.
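-
-To make the ordering guarantee concrete, here is a minimal, hypothetical sketch: an extension that both consumes the
-`FooStore` from above and contributes a new service. The `BarService`/`BarServiceImpl` names are purely illustrative
-and not part of the code base.
-
-```java
-public class BarExtension implements ServiceExtension {
-
-    @Inject
-    private FooStore fooStore; // set during the second pass, before initialize() runs
-
-    @Override
-    public void initialize(ServiceExtensionContext context) {
-        // safe to use: any extension providing a FooStore was initialized earlier
-        fooStore.store(new Foo());
-    }
-
-    @Provider
-    public BarService provideBarService() {
-        // invoked after initialize(); extensions that @Inject a BarService are
-        // sorted after this one and can rely on it in their own initialize()
-        return new BarServiceImpl(fooStore);
-    }
-}
-```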
-
-## Tests for classes using injection
-
-To test classes using the `@Inject` annotation, use the appropriate JUnit extension:
-
-- If only basic dependency injection is needed (unit testing), use the `DependencyInjectionExtension`.
-- If the full EDC runtime should be run (integration testing), use the `EdcExtension`.
-
-## Limitations and differences to fully-fledged IoC containers
-
-#### Only available in `ServiceExtensions`
-
-Services can only be injected into `ServiceExtension` objects at this time, as they are the main hook points for plugins,
-and they have a clearly defined interface. All subsequent object creation must be done manually using conventional
-mechanisms like constructors or builders.
-
-#### No multiple registrations
-
-Registering two implementations for an interface will result in the first registration being overwritten by the second
-registration. If both providers have the same topological ordering, it is undefined which one comes first. A warning is
-posted to the `Monitor`.
-
-_It was a deliberate architectural decision to forego multiple service registrations for the sake of simplicity and
-clean design. Patterns like composites or delegators exist for those rare cases where having multiple implementors of
-the same interface is indeed needed. Those should be used sparingly and not without a strong reason._
-
-#### No collection-based injection
-
-Because there can only ever be one implementation for a service, it is not possible to inject a collection of
-implementors as it would be in other DI frameworks.
-
-#### Only field injection
-
-At the moment, the `@Inject` annotation can only target fields, meaning that we cannot perform constructor or setter
-injection with it; for example, `public SomeExtension(@Inject SomeService someService){ ... }` would not be possible.
-
-#### No named dependencies
-
-Dependencies cannot be decorated with an identifier, which would technically allow for multiple service registrations
-(using different names). Technically this is linked to the limitation of single service registrations.
-
-#### Direct inheritors/implementors only
-
-This is not due to a limitation of the dependency injection mechanism, but rather due to the way the context
-maintains service registrations: it simply maintains a `Map` containing interface class and implementation type.
-
-#### Cyclic dependencies
-
-Cyclic dependencies are detected by the `TopologicalSort` algorithm, but the error reporting is a bit limited.
-
-#### No generic dependencies
-
-It's not possible to have dependencies with a type parameter.
diff --git a/docs/developer/logging.md b/docs/developer/logging.md
deleted file mode 100644
index 8ff127e62fa..00000000000
--- a/docs/developer/logging.md
+++ /dev/null
@@ -1,43 +0,0 @@
-# Logging
-
-A comprehensive and consistent way of logging is a crucial pillar of operability. Therefore, the following rules should be followed:
-
-## Logging component
-
-Logs must only be produced using the [`Monitor`](../../spi/common/core-spi/src/main/java/org/eclipse/edc/spi/monitor/Monitor.java) service,
-which offers 4 different log levels:
-
-### `severe`
-> Error events that might lead the application to abort, or might still allow it to continue running.
-
-Used in case of an unexpected interruption of the flow or when something is broken, i.e. an operator has to take action,
-e.g. service crashes, database in illegal state, ... even if there is a chance of self-recovery.
-
-### `warning`
-> Potentially harmful situations.
-
-Used in case of an expected event that does not interrupt the flow, but that should be taken into consideration.
-
-### `info`
-> Informational messages that highlight the progress of the application at a coarse-grained level.
-
-Used to describe the normal flow of the application.
-
-### `debug`
-> Fine-grained informational events that are most useful to debug an application.
-
-Used to describe details of the normal flow that are not interesting for a production environment.
-
-## What should be logged
-- every exception with `severe` or `warning`
-- every `Result` object evaluated as `failed`:
-  - with `severe` if this is something that interrupts the flow and someone should take care of immediately
-  - with `warning` if this is something that doesn't interrupt the flow but someone should take care of, because it could give worse results in the future
-- every important message that's not an error with `info`
-- other informative events like incoming calls at the API layer or state changes with `debug`
-
-## What should not be logged
-
-- secrets and any other potentially sensitive data, like the payload that is passed through the `data-plane`
-- an exception that will be thrown in the same block
-- information that is not strictly necessary, like "entering method X", "leaving block Y", "returning HTTP 200"
diff --git a/docs/developer/releases.md b/docs/developer/releases.md
deleted file mode 100644
index b766fc77f38..00000000000
--- a/docs/developer/releases.md
+++ /dev/null
@@ -1,182 +0,0 @@
-Release Approach
-================
-
-## Table of Contents
-
-* [Versioning](#versioning)
-  * [API Compatibility](#api-compatibility)
-  * [Modules Providing API](#modules-providing-api)
-  * [Towards a First Release](#towards-a-first-release)
-* [Legal Documentation Requirements](#legal-documentation-requirements)
-  * [License and Notice Files](#license-and-notice-files)
-  * [Creating the DEPENDENCIES File](#creating-the-dependencies-file)
-  * [Background Information](#background-information)
-* [Publishing Maven Artifacts](#publishing-maven-artifacts)
-  * [Naming Convention](#naming-convention)
-
-### Versioning
-
-The Eclipse Dataspace Connector will employ [SemVer](https://semver.org) for versioning and distinguish between the
-following releases as defined by the [Eclipse Handbook](https://www.eclipse.org/projects/handbook/#release):
-
-- major releases, which introduce API changes,
-- minor releases, which add new functionality but are API-compatible with previous versions, and
-- service releases, which include bug fixes only and add no significant new functionality.
-
-Between releases, snapshot versions reflecting the current state of modules can be packaged to distribution artifacts on
-a regular basis. Snapshots, however, do not actually represent released versions.
-
-#### API Compatibility
-
-The concept of API compatibility is defined in terms of binary compatibility according to
-the [Java SE 17 Language Specification](https://docs.oracle.com/javase/specs/jls/se17/html/jls-13.html).
-
-#### Modules Providing API
-
-The following modules define official extension points of the EDC based on the Java Service Provider Interface (SPI),
-contributing public classes, interfaces and public members which are considered public APIs and are, therefore, covered
-by the proposed versioning approach:
-
-- spi
-- data-protocols/ids-spi
-
-Apart from these SPI-based extension points, individual modules can also contribute additional public-facing APIs, such
-as communication endpoints (e.g., based on HTTP).
To support a fast-paced development of such endpoints without
-impacting the connector's core release cycle, modules contributing this type of public-facing API can be managed within
-a separate repository.
-
-The following modules are also distributed as individual artifacts to support a convenient customisation of connectors,
-but are not considered public APIs:
-
-- core/*
-- extensions/*
-
-Extensions can in turn specify their own SPI-based extension points. These are, however, regarded as **internal SPI**
-and not as a public API. Therefore, changing internal SPI doesn't necessarily imply a version increment for the module.
-
-#### Towards a First Release
-
-Until its first major release, the Eclipse Dataspace Connector will be developed under the version 0.0.1 without
-complying with semantic versioning (i.e., API changes don't imply a major release). Snapshot versions may break binary
-compatibility with previous versions and should not be regarded as stable. There are no guarantees regarding functional
-and non-functional aspects of the implementation. Tooling for a later migration of current implementations to the
-envisioned first release will not be provided.
-
-### Legal Documentation Requirements
-
-License and notice files must be included in every unit-level distribution artifact. In the case of Java archive (JAR)
-files, the legal files should be placed in the META-INF directory. However, depending on the distribution format, the
-exact location of the files might vary.
-
-#### License and Notice Files
-
-An appropriate license file is supplied at the root of the source code repository and must be included as is in each
-distribution artifact. The supplied top-level notice file represents a snapshot of the dependencies included in all
-modules present in the project repository at a given point in time. Before each new release or distribution, the notice
-file must be updated regarding the listed third-party dependencies.
-When distributing individual modules, a notice file containing only the relevant subset of dependencies must be
-created (as described below).
-
-#### Creating the DEPENDENCIES File
-
-Notice files consist of some prescribed statements addressing trademarks, copyright, and licensing. Additionally, the
-section on third-party content lists all dependencies of the current scope (project or module) and must be maintained
-before each release. This list is populated by deriving dependencies using the build tool (i.e., Gradle), analysing them
-using an IP tool (i.e., the Eclipse Dash tool), and decorating the resulting report with additional information using a
-custom script. The shell script located under docs/legal supports parsing the results of the Eclipse Dash Licenses tool
-and creating a formatted markdown report listing third-party content with extended information.
-
-Execute the gradle task *allDependencies* for creating an integrated dependency report over all sub-modules of the
-project (including isolated modules). To process the dependencies of a specific module (e.g., an individual launcher),
-execute the standard *dependencies* task:
-
-- First, the dependencies of this module are calculated with gradle and passed to the Dash tool:
-
-```
-gradle dependencies | grep -Poh "(?<=\s)[\w.-]+:[\w.-]+:[^:\s]+" | sort | uniq | java -jar /path/org.eclipse.dash.licenses-.jar - -summary DEPENDENCIES
-```
-
-> Caution macOS users: by default, macOS has BSD Grep installed, rather than GNU Grep.
If you experience any issues,
-> please try to [install GNU Grep](https://apple.stackexchange.com/questions/193288/how-to-install-and-use-gnu-grep-in-macos). Furthermore,
-> sorting depends on locale and collation, and may
-> [differ between OSes](https://unix.stackexchange.com/questions/362728/why-does-gnu-sort-sort-differently-on-my-osx-machine-and-linux-machine).
-> Our CI job, which verifies the `DEPENDENCIES` file, is running on `ubuntu-latest`.
-
-- For each dependency that is reported as `restricted`, an IPLab issue must be opened. For details, please refer to
-  the [documentation](https://github.com/eclipse/dash-licenses) of the Dash tool.
-
-#### Background Information
-
-The [Eclipse Dash Licenses tool](https://github.com/eclipse/dash-licenses) first looks
-into [IPZilla](https://dev.eclipse.org/ipzilla) and second into [ClearlyDefined](https://clearlydefined.io). IPZilla
-tracks the results (i.e. approved/restricted) of IP due diligence conducted by the Eclipse Foundation. For each artifact
-found within IPZilla, the Dash tool also reports its corresponding contribution questionnaire number (CQ#). In some
-cases, an approved artifact doesn't reference a license type, which then has to be searched for manually. ClearlyDefined
-is maintained by a third party and assigns scores to artifact licenses. If a minimum threshold is reached, the item is
-considered approved. The Dash tool tags artifacts found within this source accordingly. In some cases, the Dash tool
-reports an inappropriate license, although a more suitable one exists. In this case, the tool requests a manual
-review. In rare cases, neither an Eclipse approval nor a ClearlyDefined entry is found. Currently, these licenses have
-to be found manually (e.g., on Maven Central).
-
-### Publishing Maven Artifacts
-
-As far as technically sensible, project modules are packaged and distributed as Maven artifacts via third-party
-services (i.e., Maven Central).
-
-#### Workflow
-
-Execute the gradle task *publish* on the level of an individual module to publish it as a Maven artifact.
-
-#### Naming Convention
-
-Artifact names must adhere to the following naming convention:
-
-- Group name: org.eclipse.edc
-- Artifact id describing the module name (disregarding the directory structure), separating terms by a dash
-
-Examples:
-
-```
-org.eclipse.edc:spi
-org.eclipse.edc:util
-```
-
-A comprehensive list can be found [here](https://search.maven.org/search?q=org.eclipse.edc).
-
-#### Release guide
-
-_Note: the intended audience for this section are individuals who are eligible to author the release process. At the
-time of this writing, these are the committers of the project._
-
-To trigger a new release, please follow these simple steps:
-
-- update `gradle.properties`: set the `version` entry to the new version.
-- trigger the actual release in GitHub:
-  - on the `Actions` tab, pick the `Create EDC Release` workflow
-  - select the `main` branch
-  - clicking on `Run workflow` should bring up a prompt for the version string. Please enter the version string in
-    SemVer format without any prefixes: `0.0.4-something-SNAPSHOT` would be OK, whereas `v0.0.4-rc1` would not.
-  - start the workflow
-
-The GitHub workflow then performs these steps:
-
-1. creates a tag on the current branch, e.g. `v0.0.4-something-SNAPSHOT` (note the `v` prefix). This is done using the
-   GitHub API.
-2. creates a merge commit from the source branch to `releases`. The version information is encoded into the commit message.
-3.
triggers the Eclipse Foundation Jenkins instance ("JIPP"). This is where the actual publishing to MavenCentral
-   happens. Note that this process may take quite a bit of time, as every module is signed and uploaded. **Important: if
-   the version string contains the `-SNAPSHOT` suffix, the version is uploaded to OSSRH Snapshots instead of
-   MavenCentral!**
-4. creates a GitHub release including an automatically generated changelog, if the release is not a `-SNAPSHOT`. This is
-   only for informational purposes; no additional artifacts are uploaded. The GitHub release has the following
-   properties:
-   - only created on non-snapshots
-   - always created off of the `main` branch
-   - the release notes are auto-generated based on the last available tag and the `.github/releases.yaml` file
-   - no pre-releases are supported
-   - no discussions are created
-
-**Important: The first commit after a release has to change the `defaultVersion` in `gradle.properties` to `-SNAPSHOT`
-again. Otherwise, the upload of the automated nightly builds to OSSRH Snapshots will fail.**
diff --git a/docs/developer/testing.md b/docs/developer/testing.md
deleted file mode 100644
index f516ebdb7d3..00000000000
--- a/docs/developer/testing.md
+++ /dev/null
@@ -1,238 +0,0 @@
-# Writing Tests
-
-## Adding EDC test fixtures
-
-To add EDC test utilities and test fixtures to downstream projects, simply add the following Gradle dependency:
-
-```kotlin
-    testImplementation("org.eclipse.edc:common-junit:<version>")
-```
-
-## Controlling test verbosity
-
-To run tests verbosely (displaying test events and output and error streams to the console), use the following system
-property:
-
-```shell
-./gradlew test -PverboseTest
-```
-
-## Definition and distinction
-
-* _unit tests_ test one single class by stubbing or mocking dependencies.
-* [_integration tests_](#integration-tests) test one particular aspect of a software, which may involve external
-  systems.
-* [_system tests_](#system-tests) are end-to-end tests that rely on the _entire_ system to be present.
-
-## Integration Tests
-
-### TL;DR
-
-Use integration tests only when necessary, keep them concise, implement them in a defensive manner using timeouts and
-randomized names, and set up external systems during the workflow.
-
-### When to use them
-
-Generally we should aim at writing unit tests rather than integration tests, because they are simpler, more stable and
-typically run faster. Sometimes that's not (easily) possible, especially when an implementation relies on an external
-system that is not easily mocked or stubbed, such as cloud-based databases.
-
-In many cases, writing a unit test is then more involved than writing an integration test. For example, say we wanted
-to test our implementation of a CosmosDB-backed queue. We would have to mock the behaviour of the CosmosDB API, which -
-while certainly possible - can get complicated pretty quickly. Now we still might do that for simpler scenarios, but
-eventually we might want to write an integration test that uses a CosmosDB test instance.
-
-### Coding Guidelines
-
-The EDC code base provides a few annotations, which focus on two important aspects:
-
-- Exclude integration tests by default from the JUnit test runner, as these tests rely on external systems which might not
-  be available during a local execution.
-- Categorize integration tests with the help of
-  [JUnit Tags](https://junit.org/junit5/docs/current/user-guide/#writing-tests-tagging-and-filtering) (see the sketch
-  right after this list).
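-
-All of these tagging annotations follow JUnit 5's composed-annotation pattern, sketched below. This is a hedged
-illustration, not the verbatim EDC source; it assumes that `@IntegrationTest` itself is applicable to annotation types:
-
-```java
-import org.junit.jupiter.api.Tag;
-
-import java.lang.annotation.ElementType;
-import java.lang.annotation.Retention;
-import java.lang.annotation.RetentionPolicy;
-import java.lang.annotation.Target;
-
-// Composite annotation: carries the base @IntegrationTest marker plus a dedicated
-// JUnit tag, so the CI pipeline can target these tests via -DincludeTags
-@Target({ ElementType.TYPE, ElementType.METHOD })
-@Retention(RetentionPolicy.RUNTIME)
-@Tag("PostgresIntegrationTest")
-@IntegrationTest
-public @interface PostgresIntegrationTest {
-}
-```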
-
-Following are some of the available annotations:
-
-- `@IntegrationTest`: Marks an integration test with the `IntegrationTest` JUnit tag. This is the default tag and can be
-  used if you do not want to specify any other tags on your test to do further categorization.
-
-The annotations below are used to categorize integration tests based on the runtime components that must be available for
-the test to run. All of them are composite annotations and contain the `@IntegrationTest` annotation as well.
-
-- `@AzureStorageIntegrationTest`: Marks an integration test with the `AzureStorageIntegrationTest` JUnit tag. This should be
-  used when the integration test requires the Azure Storage emulator to run.
-- `@AzureCosmosDbIntegrationTest`: Marks an integration test with the `AzureCosmosDbIntegrationTest` JUnit tag. This should
-  be used when the integration test requires the Azure CosmosDB emulator to run.
-- `@AwsS3IntegrationTest`: Marks an integration test with the `AwsS3IntegrationTest` JUnit tag. This should be used when the
-  integration test requires the AWS S3 storage emulator to run.
-- `@DapsTest`: Marks an integration test with the `DapsIntegrationTest` JUnit tag. This should be used when the integration
-  test requires the DAPS IAM endpoint to run.
-- `@EndToEndTest`: Marks an integration test with the `EndToEndTest` JUnit tag. This should be used when the entire system
-  is involved in a test.
-- `@ComponentTest`: Marks an integration test with the `ComponentTest` JUnit tag. This should be used when the test does not
-  use an external system, but uses actual collaborator objects instead of mocks.
-
-We encourage you to use these available annotations, but if your integration test does not fit one of them and you want
-to categorize tests based on their technology, feel free to create a new annotation - just make sure it is a composite
-annotation that contains `@IntegrationTest`. If you do not wish to categorize based on technology, you can use the
-already available `@IntegrationTest` annotation.
-
-- By default, the JUnit test runner ignores all integration tests, because the root `build.gradle.kts` file excludes
-  all tests marked with the `IntegrationTest` JUnit tag.
-- If your integration test does not rely on an external system, you may not want to use the above-mentioned annotations.
-
-All integration tests should specify an annotation to categorize them and use the `"...IntegrationTest"` postfix to
-distinguish them clearly from unit tests. They should reside in the same package as unit tests, because all tests should
-maintain package consistency to their test subject.
-
-Any credentials, secrets, passwords, etc. that are required by the integration tests should be passed in using
-environment variables. A good way to access them is `ConfigurationFunctions.propOrEnv()`, because then the credentials
-can also be supplied via system properties.
-
-There is no one-size-fits-all guideline whether to perform setup tasks in `@BeforeAll` or `@BeforeEach`; it will
-depend on the concrete system you're using. As a general rule of thumb, long-running one-time setup should be done in
-`@BeforeAll` so as not to extend the run-time of the test unnecessarily. In contrast, in most cases it is **not**
-advisable to deploy/provision the external system itself in either of those methods.
In other words, manually
-provisioning a CosmosDB or spinning up a Postgres docker container directly from test code should generally be avoided,
-because it introduces code that has nothing to do with the test and may cause security problems (privilege
-escalation through the Docker API), etc.
-
-Specifically, if possible, all external systems should be deployed using [Testcontainers](https://testcontainers.com/).
-Alternatively, in special situations there might be a dedicated test instance running continuously, e.g. a CosmosDB test
-instance in Azure. In the latter case, please be careful to avoid conflicts (e.g. database names) when multiple test
-runners access that system simultaneously, and properly clean up any residue before and after the test.
-
-### Running them locally
-
-As mentioned above, the JUnit runner won't pick up integration tests unless a tag is provided. For example, to run the
-`Azure CosmosDB` integration tests, pass the `includeTags` parameter with the tag value to the `gradlew` command:
-
-```bash
-./gradlew test -p path/to/module -DincludeTags="AzureCosmosDbIntegrationTest"
-```
-
-If you need to run all types of tests (e.g. unit & integration), pass the `runAllTests=true`
-parameter to the `gradlew` command:
-
-```bash
-./gradlew test -DrunAllTests="true"
-```
-
-For example, to run all integration tests from the Azure CosmosDB module and its sub-modules:
-
-```bash
-./gradlew -p extensions/azure/cosmos test -DincludeTags="AzureCosmosDbIntegrationTest"
-```
-
-_A command like `./gradlew :extensions:azure:cosmos test -DincludeTags="AzureCosmosDbIntegrationTest"` does not execute
-tests from all sub-modules, so we need to use `-p` to specify the module project path._
-
-Cosmos DB integration tests are run by default against a locally
-running [Cosmos DB Emulator](https://docs.microsoft.com/azure/cosmos-db/local-emulator). You can also use an instance of
-Cosmos DB running in Azure, in which case you should set the `COSMOS_KEY` and `COSMOS_URL` environment variables.
-
-### Running them in the CI pipeline
-
-All integration tests should go into the [verify workflow](/.github/workflows/verify.yaml), every
-"technology" should have its own job, and technology-specific tests can be targeted using JUnit tags with the
-`-DincludeTags` property as described above.
-
-A GitHub [composite action](https://docs.github.com/actions/creating-actions/creating-a-composite-action) was created to
-encapsulate the tasks of running tests and uploading test reports as artifacts for publication.
-
-A final job named `Upload-Test-Report` should depend on all test jobs. It assembles all individual test reports.
-
-For example, let's assume we've implemented a Postgres-based Asset Index. The integration tests for that should go
-into a "Postgres" `job`, and every module that adds a test (here: `extensions:postgres:assetindex`) should apply a
-composite annotation (here: `@PostgresIntegrationTest`, adding the tag `PostgresIntegrationTest`) on its integration tests.
-This tagging will be used by the CI pipeline step to target and execute the integration tests related to Postgres.
-
-Let's also make sure that the code is checked out beforehand and that integration tests only run on the upstream repo:
-
-```yaml
-jobs:
-  Postgres-Integration-Tests:
-    # run only on upstream repo
-    if: github.repository_owner == 'eclipse-edc'
-    runs-on: ubuntu-latest
-
-    # taken from https://docs.github.com/en/actions/using-containerized-services/creating-postgresql-service-containers
-    services:
-      # Label used to access the service container
-      postgres:
-        # Docker Hub image
-        image: postgres
-        # Provide the password for postgres
-        env:
-          POSTGRES_PASSWORD: ${{ secrets.POSTGRES_PASSWORD }}
-        # Set health checks to wait until postgres has started
-        options: >-
-          --health-cmd pg_isready
-          --health-interval 10s
-          --health-timeout 5s
-          --health-retries 5
-
-    env:
-      POSTGRES_USER: ${{ secrets.POSTGRES_USERNAME }}
-      POSTGRES_PWD: ${{ secrets.POSTGRES_PASSWORD }}
-
-    steps:
-      # check out the code first, so the local composite actions are available
-      - uses: actions/checkout@v3
-      - uses: ./.github/actions/setup-build
-
-      - name: Postgres Tests # just an example!
-        uses: ./.github/actions/run-tests
-        with:
-          command: ./gradlew -p extensions/postgres test -DincludeTags="PostgresIntegrationTest"
-
-  [ ... ]
-
-Upload-Test-Report:
-  needs:
-    [ ... ]
-    - Postgres-Integration-Tests
-    [ ... ]
-```
-
-It is important to note that the secrets (here: `POSTGRES_USERNAME` and `POSTGRES_PASSWORD`) must be defined within the
-repository's settings, and that can only be done by a committer with temporary admin access, so be sure to contact them
-before submitting your PR.
-
-### Do's and Don'ts
-
-DO:
-
-- use integration tests sparingly and only when unit tests are not practical
-- deploy the external system as a `service` directly in the workflow, or
-- use a dedicated always-on test instance
-- take into account that external systems might experience transient failures or have degraded performance, so test
-  methods should have a timeout so as not to block the runner indefinitely.
-- use randomized strings for things like database/table/bucket/container names, etc., especially when the external
-  system does not get destroyed after the test.
-
-DO NOT:
-
-- try to cover everything with integration tests. It's typically a code smell if there are no corresponding unit tests
-  for an integration test.
-- slip into a habit of testing the external system rather than your usage of it
-- store secrets directly in the code. GitHub will warn about that.
-- perform complex external system setup in `@BeforeEach` or `@BeforeAll`
-
-## System tests
-
-System tests are needed when an entire feature should be tested, end to end.
-
-To write a system test, two parts are needed:
-
-- _runner_: a module that contains the test logic
-- _runtimes_: one or more modules that each define a standalone runtime (e.g. a complete EDC definition)
-
-The runner can load an EDC runtime by using the `@RegisterExtension` annotation (example
-in [`FileTransferIntegrationTest`](../../system-tests/tests/src/test/java/org/eclipse/edc/test/system/local/FileTransferIntegrationTest.java)).
-
-To make sure that the runtime extensions are correctly built and available, they need to be set as a dependency of the
-runner module with `testCompileOnly` (example in [`build.gradle.kts`](/system-tests/tests/build.gradle.kts)).
-
-This permits dependency isolation between runtimes (very important when a test needs to run two different
-components, like a control plane and a data plane).
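-
-As an illustration, a runner test could look roughly like the following sketch. The runtime module path, the
-configuration keys and the `EdcRuntimeExtension` constructor arguments are assumptions modelled on the referenced
-`FileTransferIntegrationTest`, not a verified API:
-
-```java
-import org.eclipse.edc.junit.annotations.EndToEndTest; // assumed package
-import org.eclipse.edc.junit.extensions.EdcRuntimeExtension; // assumed package
-import org.junit.jupiter.api.Test;
-import org.junit.jupiter.api.extension.RegisterExtension;
-
-import java.util.Map;
-
-@EndToEndTest
-class MyFeatureSystemTest {
-
-    // boots the runtime defined in the (hypothetical) ":system-tests:runtimes:my-runtime" module
-    @RegisterExtension
-    static EdcRuntimeExtension runtime = new EdcRuntimeExtension(
-            ":system-tests:runtimes:my-runtime",
-            "my-runtime",
-            Map.of("web.http.port", "8181"));
-
-    @Test
-    void featureWorks_endToEnd() {
-        // drive the running EDC instance, e.g. via HTTP calls against its APIs
-    }
-}
-```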
-
-
diff --git a/docs/developer/version-catalogs.md b/docs/developer/version-catalogs.md
deleted file mode 100644
index e9e1e5c62b5..00000000000
--- a/docs/developer/version-catalogs.md
+++ /dev/null
@@ -1,123 +0,0 @@
-# The EDC version catalog
-
-EDC provides a [Version Catalog](https://docs.gradle.org/7.4/userguide/platforms.html) which contains all the
-third-party dependencies that are currently in use by EDC.
-
-This version catalog should be regarded as the recommended and tested dependency matrix, but it is not mandatory, nor
-does it enforce the use of a particular dependency. We only use "required dependencies", i.e. no minimum or maximum
-versions, no ranges, no rejected versions etc.
-
-## Using the version catalog
-
-The version catalog gets distributed as a regular Maven artifact using the following coordinates:
-
-```
-org.eclipse.edc:edc-versions:<version>
-```
-
-As per the [documentation](https://docs.gradle.org/7.4/userguide/platforms.html#sec:importing-published-catalog), the
-version catalog and the repositories in which to look for it must be declared in the `settings.gradle.kts`:
-
-```kotlin
-// in settings.gradle.kts
-dependencyResolutionManagement {
-    repositories {
-        maven {
-            url = uri("https://oss.sonatype.org/content/repositories/snapshots/")
-        }
-        mavenCentral()
-        mavenLocal()
-    }
-    versionCatalogs {
-        create("libs") {
-            from("org.eclipse.edc:edc-versions:<version>")
-        }
-    }
-}
-```
-
-Then, the version catalog named `"libs"` is available in the project; its exact contents can be
-inspected [in this `*.toml` file](https://github.com/eclipse-edc/GradlePlugins/blob/main/gradle/libs.versions.toml).
-Be aware that the library aliases are _normalized_, which means all dashes, underscores and dots are interpreted as
-separators.
-
-Utilizing a dependency is easy: simply refer to it in the dependency configuration:
-
-```kotlin
-implementation(libs.jackson.annotation) // resolves to "com.fasterxml.jackson.core:jackson-annotations"
-```
-
-### Using bundles
-
-In the context of version catalogs, a `bundle` is a set of libraries grouped together, similar to a Maven BOM. Using them
-in build files is just as easy as with "normal" dependencies:
-
-```kotlin
-implementation(libs.bundles.jersey.core)
-
-testFixturesImplementation(libs.bundles.jupiter)
-```
-
-## Modifying the version catalog
-
-There are two main scenarios where you might want to change the provided version catalog:
-
-- adding new libraries
-- upgrading a specific version
-
-_It must be made clear that changing the provided version catalog should only be done deliberately and with a very
-specific purpose in mind. Expect to be challenged during code reviews if you choose to do it!_
-
-### Extending the version catalog
-
-Adopting a new version into the EDC Version Catalog will take some time: a feature request issue must be opened, that
-issue needs to be processed, a PR must be opened, a review must be performed, a new version of the Version Catalog
-artifact must be built, etc. While that is the recommended process, we understand that it is not always possible
-to wait for it.
-
-For example, you might have a time-sensitive PR open in EDC for which you need a third-party library that
-will be used in multiple packages (NimbusDS would be a good example). Then it might be a good idea to extend the
-version catalog in your build temporarily, until that lib can be adopted into the EDC Version Catalog.
-
-Another situation would be a third-party library that will only be used in a very limited scope, for example an SPI
-package and the corresponding implementation package. We would not necessarily need to adopt such a lib into the EDC
-Version Catalog, but it is still a good idea to harmonize version management inside the EDC project.
-
-```kotlin
-dependencyResolutionManagement {
-    // not shown: repositories
-    versionCatalogs {
-        create("libs") {
-            from("org.eclipse.edc:edc-versions:0.1.4-SNAPSHOT")
-            // this is not part of the published EDC Version Catalog, so we'll just "amend" it.
-            // the versionRef "okhttp" is already defined in the Version Catalog
-            library("dnsOverHttps", "com.squareup.okhttp3", "okhttp-dnsoverhttps").versionRef("okhttp")
-        }
-    }
-}
-```
-
-### Overriding the version catalog
-
-For the reasons mentioned before, it is sometimes quicker to override a specific version directly in the client
-project. This can come in handy when there are breaking changes in the lib's API, or there are known critical
-vulnerabilities, or you simply need a new and shiny feature. Then (temporarily) overriding a library's version could be
-an option:
-
-```kotlin
-dependencyResolutionManagement {
-    // not shown: repositories
-    versionCatalogs {
-        create("libs") {
-            from("org.eclipse.edc:edc-versions:0.1.4-SNAPSHOT")
-            // override the version for Jackson. Must use an existing alias
-            version("jackson", "2.69.0")
-        }
-    }
-}
-```
-
-Note that the version that actually gets used at runtime may still be different due to conflict resolution etc.
-See also
-the [official documentation](https://docs.gradle.org/7.4/userguide/platforms.html#sec:overwriting-catalog-versions).
diff --git a/docs/templates/README.md b/docs/templates/README.md
deleted file mode 100644
index 01e8fb344ff..00000000000
--- a/docs/templates/README.md
+++ /dev/null
@@ -1,34 +0,0 @@
-# Templates
-
-Find all provided documentation templates in this folder. Please note that the _italic text
-and sentences_ should be removed. Feel free to add additional sections and subsections; however, make sure
-that at least the sections of the templates marked as "mandatory" are filled.
-
-## Decision Records
-
-[Link](decision-record.md) to template.
-
-Each decision record should be put in an appropriate folder that follows the naming pattern
-`YYYY-MM-DD-title-of-decision-record`. This folder should be located in the [decision record folder](../developer/decision-records/)
-and contain all relevant files, at least a filled-out template named `README.md` and any additional images.
-
-As of now, every merged decision record is in state `accepted`. When a decision record replaces a previous one,
-please make sure to add a hint to the superseded record: `superseded by [...]`.
-
-## Extensions
-
-[Link](extension.md) to template.
-
-Every module located [in the extensions folder](../../extensions/) has to provide documentation regarding its
-functionality, implementation, or architectural details.
-The filled-out template has to be added as `README.md` to each module. Any additional files can be placed
-in a dedicated `docs` directory. As defined by the template, this markdown file can point to submodules
-that provide the same documentation scope themselves.
-
-## Launchers
-
-[Link](launcher.md) to template.
-
-Every module located [in the launchers folder](../../launchers/) has to provide documentation regarding its purpose and usage.
-The filled-out template has to be added as `README.md` to each module.
Any additional files can be placed in a dedicated `docs` directory.

diff --git a/docs/templates/decision-record.md b/docs/templates/decision-record.md
deleted file mode 100644
index 68d30c57a92..00000000000
--- a/docs/templates/decision-record.md
+++ /dev/null
@@ -1,15 +0,0 @@

# Title of the Decision Record

## Decision

_Briefly and clearly describe the topic this record covers, what decision was made, and why._

## Rationale

_Briefly describe the relevance of this topic and why it is important to have a decision. If applicable, add an
evaluation of several options (e.g., different libs)._

## Approach

_Clearly describe the approach. Feel free to add subsections, graphics, or example code, and please make sure every
relevant detail is explained._

diff --git a/docs/templates/extension.md b/docs/templates/extension.md
deleted file mode 100644
index b61a50833bf..00000000000
--- a/docs/templates/extension.md
+++ /dev/null
@@ -1,52 +0,0 @@

# Title of the Extension

_Briefly describe the functionality this extension introduces._

## Background

_Briefly describe the relevance and need of this extension._

### Scope

_Describe what this extension should be used for, e.g., for what kind of data transfers or data sharing scenarios._

### Use Cases

_Explain one or multiple use cases that could be realized using this extension._

## Technical Details

### Interfaces

_Provide a detailed description of provided and consumed interfaces, e.g., name HTTP endpoints and values, or
classes/methods and parameters, and add details on the purpose. You may use the table below or add an OpenAPI
description or similar._

| Interface | Parameters | Description |
| :-------- | :--------- | :---------- |
|           |            |             |

### Dependencies

_Provide some information about dependencies, e.g., used extensions._

| Name | Description |
| :--- | :---------- |
| extensions:api:management-api | _this is an example_ |

### Configurations

_If available, describe what configuration properties this extension comes with or expects as a prerequisite._

## Terminology

_If necessary, introduce important terms and concepts._

## Design Principles

_If available, clearly describe what design principles your implementation follows._

## Decisions

_Clearly describe any decisions you made regarding architectural design or similar, and try to justify them._

diff --git a/docs/templates/launcher.md b/docs/templates/launcher.md
deleted file mode 100644
index aabe175d064..00000000000
--- a/docs/templates/launcher.md
+++ /dev/null
@@ -1,25 +0,0 @@

# Name of the Launcher

_Briefly describe the purpose and scope of this launcher._

## Prerequisites

_Clearly explain what prerequisites are necessary for being able to execute this launcher, e.g., SSL certificates._

## Modules

The following modules are used for this launcher.

| Name | Description |
| :--- | :---------- |
| extensions:api:management-api | _this is an example_ |

## Configuration

_Clearly describe what settings are used and how they are set up, especially if they're launcher-specific. You may
use code snippets or provide example files._

## Running the launcher

_Clearly describe how to set up and use this launcher, e.g., what a local deployment or a Docker or Kubernetes setup
should look like. You may use code or command line snippets for guidance._
diff --git a/docs/templates/puml-colors.txt b/docs/templates/puml-colors.txt
deleted file mode 100644
index d3bd9c6cdd5..00000000000
--- a/docs/templates/puml-colors.txt
+++ /dev/null
@@ -1,17 +0,0 @@

Insert this at the beginning of each PlantUML file to keep the style consistent.

---

skinParam NoteBackgroundColor WhiteSmoke
skinParam NoteFontColor Black
skinParam ParticipantBackgroundColor WhiteSmoke
skinParam ActorBackgroundColor WhiteSmoke
skinParam AgentBackgroundColor White
skinParam AgentBorderColor SkyBlue
skinParam shadowing false

!define ConsumerColor f8f2ff
!define ProviderColor d9edff
!define WarningColor Business
!define LeadColor Technology

diff --git a/known_friends.md b/known_friends.md
deleted file mode 100644
index 61bececf0ff..00000000000
--- a/known_friends.md
+++ /dev/null
@@ -1,18 +0,0 @@

# Known Friends of EDC

To get to know how we define "adoptions" and to see how to submit a feature, please take a look at our
[guidelines for submitting features](../main/contribution_categories.md).

| Title | Description | Links |
| :---- | :---------- | :---- |
| EDC Extension for Asset Administration Shell (AAS) | Asset Administration Shell (AAS) data can be manually shared over the EDC by linking an EDC Asset to the HTTP endpoint of the specific AAS element. Additionally, contracts and policies have to be defined for each element. In order to minimize configuration effort and prevent errors, this extension is able to link existing AAS services and their elements to the EDC automatically. Furthermore, this extension can also start an AAS service by reading a static AAS model file. A default contract and policy can be chosen to be applied to all elements. For critical elements, additional contracts and policies can be put in place. External changes to the structure of an AAS are automatically synchronized by the extension. | [Link to repository](https://github.com/FraunhoferIOSB/EDC-Extension-for-AAS), [more information about AAS](https://www.plattform-i40.de/SiteGlobals/IP/Forms/Listen/Downloads/EN/Downloads_Formular.html?cl2Categories_TechnologieAnwendungsbereich_name=Verwaltungsschale) |
| Data tracking by auditing data | Proof of concept showing how to track data usage by aggregating the audit logs of different components in a single instance, using the EDC Connector event framework and audit logging. In this PoC, the traceability of data is limited to the AWS data plane. The EDC Connector logs which exact assets are stored with which key in the AWS bucket. With this information, data usage can be traced from the shared logs of the EDC Connector and the AWS S3 bucket. Elasticsearch was chosen as the instance to merge both logs in this project. A simple Python script takes over the analysis of the log data. | [Link to repository](https://github.com/FraunhoferISST/edc-data-tracebility-app) |
| IDS Broker Extension | EDC connector extension that can register offered assets in the form of IDS Resources with a central IDS Metadata Broker. | [Link to repository](https://github.com/sovity/edc-extensions/tree/main/extensions/broker) |
| EDC GUI | Extended EDC Data Dashboard that integrates the open-source EDC Connector interfaces while adding asset properties and form validation, providing design and UX changes, and introducing configuration profiles. | [Link to repository](https://github.com/sovity/edc-ui) |
| IDS Clearing House Extension | EDC connector extension that can log events to a central IDS Clearing House. | [Link to repository](https://github.com/sovity/edc-extensions/tree/main/extensions/clearinghouse) |
| EDC Connector HTTP client | An HTTP client to communicate with the EDC Connector for Node.js and the browser. | [Link to repository](https://github.com/Think-iT-Labs/edc-connector-client), [npm](https://www.npmjs.com/package/@think-it-labs/edc-connector-client) |
| Integration for Microsoft Dynamics 365 and Power Platform | The prototype demonstrates how to publish product information from a [Microsoft Power App](https://learn.microsoft.com/en-us/power-apps/powerapps-overview) to a participant in an existing dataspace. The [Microsoft Power Automate custom connector](https://learn.microsoft.com/en-us/connectors/custom-connectors/define-blank) calls the EDC endpoints from the no-code/low-code platform to publish an asset and create a contract. This example shows the integration into the [Microsoft Dataverse](https://learn.microsoft.com/en-us/power-apps/maker/data-platform/data-platform-intro). | [Link to repository](https://github.com/edc-oneweek/MinimumViableDataspace/blob/2c20b19b2a70b0631818a25112d04e9cc9fad414/dataverse/README.md) |
| Silicon Economy EDC | The Silicon Economy EDC is a configured version of the Connector of the Eclipse Dataspace Components (EDC). It is used and specialized to easily integrate Silicon Economy components with the IDS. | [Link to repository](https://git.openlogisticsfoundation.org/silicon-economy/base/ids/silicon-economy-edc) |
| EDC Extension for IONOS S3 storage | An EDC extension that allows the connector to save and access files kept in IONOS S3 storage. | [Link to repository](https://github.com/ionos-cloud/edc-ionos-s3) |
| EDC metadata extractor extension | This extension is a PoC to automatically extract metadata of a file that can be used for further processing (e.g., calculating the FAIRness score). | [Link to repository](https://gitlab.fit.fraunhofer.de/ameerali.khan/edc-metadata-extractor-extension) |
| ... | ... | ... |

diff --git a/pr_etiquette.md b/pr_etiquette.md
deleted file mode 100644
index ac5573e19a8..00000000000
--- a/pr_etiquette.md
+++ /dev/null
@@ -1,68 +0,0 @@

# Etiquette for pull requests

## As an author

Submitting pull requests in EDC should be done while adhering to a couple of simple rules.

- Familiarize yourself with the [coding style](styleguide.md), [architectural patterns](docs/developer/architecture/coding-principles.md)
  and other contribution guidelines.
- No surprise PRs, please. Before you submit a PR, open a discussion or an issue outlining your planned work and give
  people time to comment. It may even be advisable to contact committers using the `@mention` feature. Unsolicited PRs
  may get ignored or rejected.
- Create focused PRs: your work should be focused on one particular feature or bug. Do not create broad-scoped PRs
  that solve multiple issues, as reviewers may reject those PR bombs outright.
- Provide a clear description and motivation in the PR description in GitHub. This makes the reviewer's life much
  easier. It is also helpful to outline the broad changes that were made, e.g. "Changes the schema of XYZ-Entity:
  the `age` field changed from `long` to `String`".
- If you introduce new third-party dependencies, be sure to note them in the PR description and explain why they are
  necessary.
- Stick to the established code style; please refer to the [styleguide document](styleguide.md).
- All tests should be green, especially when your PR is in `"Ready for review"`.
- Mark PRs as `"Ready for review"` only when you're prepared to defend your work. By that time you have completed your
  work and shouldn't need to push any more commits other than to incorporate review comments.
- Merge conflicts should be resolved by squashing all commits on the PR branch, rebasing onto `main` and
  force-pushing (see the snippet after this list). Do this when your PR is ready to review.
- If you require a reviewer's input while the PR is still in draft, please contact the designated reviewer using
  the `@mention` feature and let them know what you'd like them to look at.
- Request a review from one of the [technical committers](pr_etiquette.md#the-technical-committers-as-of-may-18-2022).
  Requesting a review from anyone else is still possible, and sometimes may be advisable, but only committers can
  merge PRs, so be sure to include them early on.
- Re-request reviews after all remarks have been adopted. This helps reviewers track their work in GitHub.
- If you disagree with a committer's remarks, feel free to object and argue, but if no agreement is reached, you'll
  have to either accept the decision or withdraw your PR.
- Be civil and objective. No foul language, insulting or otherwise abusive language will be tolerated.
- The PR titles must follow [Conventional Commits](https://www.conventionalcommits.org/en/v1.0.0/).
  - The title must follow the format `<type>(<optional scope>): <description>`, e.g. `docs(readme): fix broken link`.
    `build`, `chore`, `ci`, `docs`, `feat`, `fix`, `perf`, `refactor`, `revert`, `style`, `test` are allowed for
    the `<type>`.
  - The length must be kept under 80 characters.
  - See the [check-pull-request-title job of the GitHub workflow](https://github.com/eclipse-edc/Connector/blob/main/.github/workflows/scan-pull-request.yaml)
    for details on the check.
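As referenced in the list above, a typical squash-and-rebase to resolve merge conflicts might look like this. It is
only a sketch; it assumes your PR branch is checked out and that `origin` points to the upstream repository:

```
git fetch origin
# squash your commits into one during the interactive rebase
git rebase -i origin/main
git push --force-with-lease
```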
## As a reviewer

- Please complete reviews within two business days or delegate to another committer, removing yourself as a reviewer.
- If you have been requested as a reviewer but cannot do the review for any reason (time, lack of knowledge in a
  particular area, etc.), please comment that in the PR and remove yourself as a reviewer, suggesting a stand-in.
  The [code owners document](CODEOWNERS) should help with that.
- Don't be overly pedantic.
- Don't argue basic principles (code style, architectural decisions, etc.).
- Use the `suggestion` feature of GitHub for small/simple changes.
- The following could serve as a review checklist:
  - no unnecessary dependencies in `build.gradle.kts`
  - sensible unit tests; prefer unit tests over integration tests wherever possible (test runtime). Also check the
    usage of test tags.
  - code style
  - simplicity and "uncluttered-ness" of the code
  - overall focus of the PR
- Don't just wave through any PR. Please take the time to look at them carefully.
- Be civil and objective. No foul language, insulting or otherwise abusive language will be tolerated. The goal is to
  _encourage_ contributions.

## The technical committers (as of May 18, 2022)

- @MoritzKeppler
- @jimmarino
- @bscholtes1A
- @ndr_brt
- @ronjaquensel
- @juliapampus
- @paullatzelsperger

diff --git a/resources/media/logo.png b/resources/media/logo.png
deleted file mode 100644
index 3ff30153a49..00000000000
Binary files a/resources/media/logo.png and /dev/null differ

diff --git a/resources/save_actions_screenshot.png b/resources/save_actions_screenshot.png
deleted file mode 100644
index 10ae06cb006..00000000000
Binary files a/resources/save_actions_screenshot.png and /dev/null differ

diff --git a/styleguide.md b/styleguide.md
deleted file mode 100644
index 2ef22c71a7a..00000000000
--- a/styleguide.md
+++ /dev/null
@@ -1,63 +0,0 @@

# Eclipse Dataspace Connector Code Style Guide

In order to maintain a coherent code style throughout the project, we ask every contributor to adhere to a few simple
style guidelines. We assume most developers will use at least something like `vim` and therefore have support for
automatic code formatting, so we are not going to list the guidelines here. If you absolutely want to take a look,
check out the [config written in XML](resources/edc-checkstyle-config.xml).

## Checkstyle configuration

Checkstyle is a [tool](https://checkstyle.sourceforge.io/) that can statically analyze your source code to check it
against a set of given rules. Those rules are formulated in an [XML document](resources/edc-checkstyle-config.xml).
Many modern IDEs have a plugin available for download that runs in the background and does code analysis.

Our Checkstyle config is based on the [Google Style](https://checkstyle.sourceforge.io/google_style.html) with a few
additional rules, such as the naming of constants and types.

_Note: currently we do **not** enforce the generation of Javadoc comments, even though documenting code is **highly**
recommended. We might enable this in the future, such that at least interfaces and public methods are commented._
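To give an idea of how such a Checkstyle configuration is typically wired into a Gradle build, here is a minimal
sketch; the tool version shown is an assumption, and the actual EDC build applies this configuration through its own
build setup:

```kotlin
// build.gradle.kts; a minimal sketch, not the actual EDC build configuration
plugins {
    checkstyle
}

checkstyle {
    toolVersion = "10.0" // assumed version; pick whatever your project mandates
    configFile = rootProject.file("resources/edc-checkstyle-config.xml")
    // by default, rule violations at "error" severity will fail the build
}
```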
## Running Checkstyle

Checkstyle can be run in different ways: implicitly, we run it through the `checkstyle` Gradle plugin during
`gradle build`, which will cause the build to fail if any violations are found. But in order to get better usability
and on-the-fly reporting, Checkstyle is also available as a plugin for many modern IDEs, where it can run either
on demand or continuously in the background:

- [IntelliJ IDEA plugin [recommended]](https://plugins.jetbrains.com/plugin/1065-checkstyle-idea)
- [Eclipse IDE [recommended]](https://checkstyle.org/eclipse-cs/#!/)

### Checkstyle as PR validation

Apart from running Checkstyle locally as an IDE plugin, we also run it in our
[GitHub Actions pipeline](.github/workflows/verify.yaml). At this time, Checkstyle will only spew out warnings, but
we may tighten the rules at a future time and without notice, which will result in failing GitHub Actions pipelines.
Also, committers might reject PRs due to Checkstyle warnings.

It is therefore **highly** recommended to run Checkstyle locally as well.

If you **do not wish** to run Checkstyle on your local machine, that's fine, but be prepared to get your PRs rejected
simply because of a stupid naming or formatting error.

> _Note: we do not use the Checkstyle Gradle plugin on GitHub Actions because violations would cause builds to fail.
> For now, we only want to log warnings._

## [Recommended] IntelliJ Code Style Configuration

If you are using JetBrains IntelliJ IDEA, we have created a specific code style configuration that will automatically
format your source code according to this style guide. This should eliminate most of the potential Checkstyle
violations right from the get-go. You will need to reformat your code manually or in a pre-commit hook, though.

## [Optional] IntelliJ SaveActions Plugin

If you absolutely want to make sure that no piece of ever-so-slightly misformatted code even hits your hard disk, we
advise you to use the [SaveActions plugin](https://plugins.jetbrains.com/plugin/7642-save-actions) for IntelliJ IDEA.
It takes care that your code is always correctly formatted. Unfortunately, SaveActions has no export feature, so
please just copy this configuration:

![](resources/save_actions_screenshot.png)

## [Optional] Generic `.editorConfig`

For most other editors and IDEs we've supplied an [.editorConfig](resources/edc-codestyle.editorconfig) file that can
be placed at the appropriate location. The specific location will largely depend on your editor and your OS; please
refer to the [official documentation](https://editorconfig.org) for details.