
Commit

feat: add GitHub Actions, modify yoinked code
JJGadgets committed Dec 1, 2023
1 parent 804e29e commit 348783b
Showing 22 changed files with 1,354 additions and 2 deletions.
18 changes: 18 additions & 0 deletions .editorconfig
@@ -0,0 +1,18 @@
# editorconfig.org
root = true

[*]
indent_style = space
indent_size = 2
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true

[Makefile]
indent_style = space
indent_size = 4

[*.{bash,sh}]
indent_style = space
indent_size = 4
3 changes: 3 additions & 0 deletions .github/CODEOWNERS
@@ -0,0 +1,3 @@
# https://docs.github.com/en/github/creating-cloning-and-archiving-repositories/about-code-owners
* @onedr0p @bjw-s # original pipeline code written by them
* @JJGadgets # professional yoinker, makes OCD edits to perfectly working code
81 changes: 81 additions & 0 deletions .github/ISSUE_TEMPLATE/container-request.yaml
@@ -0,0 +1,81 @@
---
name: Container Request
description: Request a new application to be containerized
labels: ["container-request"]

body:
  - type: markdown
    attributes:
      value: |
        Doing your due diligence and filling out this form thoroughly
        will help gauge how serious your request is.
  - type: input
    id: application-name
    attributes:
      label: Application Name
      description: Name of the application you would like containerized
      placeholder: e.g. Sonarr
    validations:
      required: true

  - type: input
    id: application-source-code
    attributes:
      label: Application Source Code URL
      description: URL to the source code of the application
      placeholder: e.g. https://github.com/superseriousbusiness/gotosocial
    validations:
      required: true

  - type: dropdown
    id: application-language
    attributes:
      label: Application Language
      description: Language this application is written in
      options:
        - Go
        - .NET
        - Java
        - PHP
        - Python
        - Ruby
        - TypeScript
        - Other
    validations:
      required: true

  - type: dropdown
    id: application-platforms
    attributes:
      label: Application Architectures
      description: Architectures this application supports
      multiple: true
      options:
        - linux/arm64
        - linux/amd64
    validations:
      required: true

  - type: textarea
    id: additional-information
    attributes:
      label: Additional Information
      description: Mention anything that gives further context to this container request

  - type: checkboxes
    id: self-assign
    attributes:
      label: Assign to self
      options:
        - label: I will create a PR to containerize this application myself
          required: false

  - type: checkboxes
    id: terms
    attributes:
      label: Code of Conduct
      description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/JJGadgets/containers/blob/main/CODE_OF_CONDUCT.md)
      options:
        - label: I agree to follow this project's Code of Conduct
          required: true
13 changes: 13 additions & 0 deletions .github/README.md
@@ -0,0 +1,13 @@
# Credits

Hey there, @JJGadgets here.

A big portion (most if not all, minus _maybe_ some changes for my own taste) of the pipeline code found here was originally made by @onedr0p and @bjw-s from the Kubernetes@Home community.

While it would be possible to rewrite a simplified version of this, I decided it was more efficient to use the battle-tested pipeline code that was already made available and licensed as open source, cherry-pick changes from upstream (onedr0p's [containers](https://github.com/onedr0p/containers) repo) back into my repo, and, where it seems fit, contribute fixes and features back. Effectively, a hivemind.

This makes it easier to focus on actually building containers instead of worrying about the pipeline code to build and maintain said containers all by myself.

As of 2023-12-01, the CODEOWNERS file in this folder (.github) contains @onedr0p, @bjw-s and myself, to reflect everyone who has contributed to the pipeline code found in this repo.
Only if a technical reason arises for the CODEOWNERS file not to include these amazing people will it be changed to reflect only myself, in which case this README will be the main source of attribution from me.

20 changes: 20 additions & 0 deletions .github/checks/metadata.rules.cue
@@ -0,0 +1,20 @@
#Spec: {
  app: #AcceptableAppName
  base: bool
  semantic_versioning?: bool
  channels: [...#Channels]
}

#Channels: {
  name: #AcceptableChannelName
  platforms: [...#AcceptedPlatforms]
  stable: bool
  tests: {
    enabled: bool
    type?: =~"^(cli|web)$"
  }
}

#AcceptableAppName: string & !="" & =~"^[a-zA-Z0-9_-]+$"
#AcceptableChannelName: string & !="" & =~"^[a-zA-Z0-9._-]+$"
#AcceptedPlatforms: "linux/amd64" | "linux/arm64"
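
For illustration, a minimal apps/<app>/metadata.yaml that would satisfy the #Spec schema above might look like the sketch below; the app name and channel values are hypothetical placeholders, not files added by this commit:

    app: gotosocial
    base: false
    semantic_versioning: true
    channels:
      - name: stable
        platforms:
          - linux/amd64
          - linux/arm64
        stable: true
        tests:
          enabled: true
          type: web

This is the same metadata shape that .github/scripts/prepare-matrices.py, further below, reads to decide which images and platforms to build.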
35 changes: 35 additions & 0 deletions .github/renovate.json5
@@ -0,0 +1,35 @@
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": [
    "config:recommended",
    "docker:enableMajor",
    ":disableRateLimiting",
    ":dependencyDashboard",
    ":semanticCommits",
    ":automergeDigest",
    ":automergeBranch",
    "helpers:pinGitHubActionDigests"
  ],
  "platform": "github",
  "platformCommit": true,
  "onboarding": false,
  "requireConfig": "optional",
  "dependencyDashboardTitle": "Renovate Dashboard 🤖",
  "suppressNotifications": ["prIgnoreNotification"],
  "packageRules": [
    {
      "description": "Auto-merge GitHub Actions",
      "matchDatasources": ["github-tags"],
      "automerge": true,
      "automergeType": "branch",
      "ignoreTests": true,
      "matchUpdateTypes": ["minor", "patch"],
      "matchPackagePatterns": ["renovatebot/github-action"]
    },
    {
      "matchDatasources": ["docker"],
      "matchUpdateTypes": ["digest"],
      "commitMessagePrefix": "📣 "
    }
  ]
}
27 changes: 27 additions & 0 deletions .github/scripts/json-to-yaml.py
@@ -0,0 +1,27 @@

# Convert every apps/**/metadata.json file to metadata.yaml in place,
# then remove the original JSON file.
import os
import json
import yaml

def json_to_yaml(subdir, file):
    obj = None

    json_file = os.path.join(subdir, file)
    with open(json_file) as f:
        obj = json.load(f)

    yaml_file = os.path.join(subdir, "metadata.yaml")
    with open(yaml_file, "w") as f:
        yaml.dump(obj, f)

    os.remove(json_file)


if __name__ == "__main__":

    for subdir, dirs, files in os.walk("./apps"):
        for f in files:
            if f != "metadata.json":
                continue
            json_to_yaml(subdir, f)

193 changes: 193 additions & 0 deletions .github/scripts/prepare-matrices.py
@@ -0,0 +1,193 @@
#!/usr/bin/env python3
import importlib.util
import sys
import os

import json
import yaml
import requests

from subprocess import check_output

from os.path import isfile

# read repository name and repository owner's username from custom env vars, else read from GitHub Actions default env vars
repo_user = os.environ.get('REPO_USER', os.environ.get('GITHUB_REPOSITORY_OWNER'))
repo_name = os.environ.get('REPO_NAME', os.environ.get('GITHUB_REPOSITORY', '').replace(f"{repo_user}/", '', 1))  # strip the "owner/" prefix so only the repository name remains

TESTABLE_PLATFORMS = ["linux/amd64"]

def load_metadata_file_yaml(file_path):
    with open(file_path, "r") as f:
        return yaml.safe_load(f)

def load_metadata_file_json(file_path):
    with open(file_path, "r") as f:
        return json.load(f)

def get_latest_version_py(latest_py_path, channel_name):
    spec = importlib.util.spec_from_file_location("latest", latest_py_path)
    latest = importlib.util.module_from_spec(spec)
    sys.modules["latest"] = latest
    spec.loader.exec_module(latest)
    return latest.get_latest(channel_name)

def get_latest_version_sh(latest_sh_path, channel_name):
    out = check_output([latest_sh_path, channel_name])
    return out.decode("utf-8").strip()

def get_latest_version(subdir, channel_name):
    ci_dir = os.path.join(subdir, "ci")
    if os.path.isfile(os.path.join(ci_dir, "latest.py")):
        return get_latest_version_py(os.path.join(ci_dir, "latest.py"), channel_name)
    elif os.path.isfile(os.path.join(ci_dir, "latest.sh")):
        return get_latest_version_sh(os.path.join(ci_dir, "latest.sh"), channel_name)
    elif os.path.isfile(os.path.join(subdir, channel_name, "latest.py")):
        return get_latest_version_py(os.path.join(subdir, channel_name, "latest.py"), channel_name)
    elif os.path.isfile(os.path.join(subdir, channel_name, "latest.sh")):
        return get_latest_version_sh(os.path.join(subdir, channel_name, "latest.sh"), channel_name)
    return None

def get_published_version(image_name):
    r = requests.get(
        f"https://api.github.com/users/{repo_user}/packages/container/{image_name}/versions",
        headers={
            "Accept": "application/vnd.github.v3+json",
            "Authorization": "token " + os.environ["TOKEN"]
        },
    )

    if r.status_code != 200:
        return None

    data = json.loads(r.text)
    for image in data:
        tags = image["metadata"]["container"]["tags"]
        if "rolling" in tags:
            tags.remove("rolling")
            # Assume the longest string is the complete version number
            return max(tags, key=len)

def get_image_metadata(subdir, meta, forRelease=False, force=False, channels=None):
    imagesToBuild = {
        "images": [],
        "imagePlatforms": []
    }

    if channels is None:
        channels = meta["channels"]
    else:
        channels = [channel for channel in meta["channels"] if channel["name"] in channels]

    for channel in channels:
        version = get_latest_version(subdir, channel["name"])
        if version is None:
            continue

        # Image Name
        toBuild = {}
        if channel.get("stable", False):
            toBuild["name"] = meta["app"]
        else:
            toBuild["name"] = "-".join([meta["app"], channel["name"]])

        # Skip if latest version already published
        if not force:
            published = get_published_version(toBuild["name"])
            if published is not None and published == version:
                continue
            toBuild["published_version"] = published

        toBuild["version"] = version

        # Image Tags
        toBuild["tags"] = ["rolling", version]
        if meta.get("semantic_versioning", False):
            parts = version.split(".")[:-1]
            while len(parts) > 0:
                toBuild["tags"].append(".".join(parts))
                parts = parts[:-1]

        # Platform Metadata
        for platform in channel["platforms"]:

            if platform not in TESTABLE_PLATFORMS and not forRelease:
                continue

            toBuild.setdefault("platforms", []).append(platform)

            platformToBuild = {}
            platformToBuild["name"] = toBuild["name"]
            platformToBuild["platform"] = platform
            platformToBuild["version"] = version
            platformToBuild["channel"] = channel["name"]

            if meta.get("base", False):
                platformToBuild["label_type"] = "org.opencontainers.image.base"
            else:
                platformToBuild["label_type"] = "org.opencontainers.image"

            if isfile(os.path.join(subdir, channel["name"], "Dockerfile")):
                platformToBuild["dockerfile"] = os.path.join(subdir, channel["name"], "Dockerfile")
                platformToBuild["context"] = os.path.join(subdir, channel["name"])
                platformToBuild["goss_config"] = os.path.join(subdir, channel["name"], "goss.yaml")
            else:
                platformToBuild["dockerfile"] = os.path.join(subdir, "Dockerfile")
                platformToBuild["context"] = subdir
                platformToBuild["goss_config"] = os.path.join(subdir, "ci", "goss.yaml")

            platformToBuild["goss_args"] = "tail -f /dev/null" if channel["tests"].get("type", "web") == "cli" else ""

            platformToBuild["tests_enabled"] = channel["tests"]["enabled"] and platform in TESTABLE_PLATFORMS

            imagesToBuild["imagePlatforms"].append(platformToBuild)
        imagesToBuild["images"].append(toBuild)
    return imagesToBuild

if __name__ == "__main__":
    apps = sys.argv[1]
    forRelease = sys.argv[2] == "true"
    force = sys.argv[3] == "true"
    imagesToBuild = {
        "images": [],
        "imagePlatforms": []
    }

    if apps != "all":
        channels = None
        apps = apps.split(",")
        if len(sys.argv) == 5:
            channels = sys.argv[4].split(",")

        for app in apps:
            if not os.path.exists(os.path.join("./apps", app)):
                print(f"App \"{app}\" not found")
                exit(1)

            meta = None
            if os.path.isfile(os.path.join("./apps", app, "metadata.yaml")):
                meta = load_metadata_file_yaml(os.path.join("./apps", app, "metadata.yaml"))
            elif os.path.isfile(os.path.join("./apps", app, "metadata.json")):
                meta = load_metadata_file_json(os.path.join("./apps", app, "metadata.json"))

            imageToBuild = get_image_metadata(os.path.join("./apps", app), meta, forRelease, force=force, channels=channels)
            if imageToBuild is not None:
                imagesToBuild["images"].extend(imageToBuild["images"])
                imagesToBuild["imagePlatforms"].extend(imageToBuild["imagePlatforms"])
    else:
        for subdir, dirs, files in os.walk("./apps"):
            for file in files:
                meta = None
                if file == "metadata.yaml":
                    meta = load_metadata_file_yaml(os.path.join(subdir, file))
                elif file == "metadata.json":
                    meta = load_metadata_file_json(os.path.join(subdir, file))
                else:
                    continue
                if meta is not None:
                    imageToBuild = get_image_metadata(subdir, meta, forRelease, force=force)
                    if imageToBuild is not None:
                        imagesToBuild["images"].extend(imageToBuild["images"])
                        imagesToBuild["imagePlatforms"].extend(imageToBuild["imagePlatforms"])
    print(json.dumps(imagesToBuild))
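
The workflow files that consume this output are among the 22 changed files but are not shown in this excerpt. As a rough sketch only (job names, runner, and secret wiring below are assumptions, not the actual workflow from this commit), a GitHub Actions job could capture the script's JSON from stdout and fan it out into a per-platform build matrix:

    jobs:
      prepare:
        runs-on: ubuntu-latest
        outputs:
          matrices: ${{ steps.prepare.outputs.matrices }}
        steps:
          - uses: actions/checkout@v4
          - run: pip install requests pyyaml
          - id: prepare
            env:
              TOKEN: ${{ secrets.GITHUB_TOKEN }}
            # script args: apps ("all" or a comma-separated list), forRelease, force
            run: echo "matrices=$(python3 ./.github/scripts/prepare-matrices.py all true false)" >> "$GITHUB_OUTPUT"
      build:
        needs: prepare
        runs-on: ubuntu-latest
        strategy:
          matrix:
            image: ${{ fromJSON(needs.prepare.outputs.matrices).imagePlatforms }}
        steps:
          - run: echo "would build ${{ matrix.image.name }} ${{ matrix.image.version }} for ${{ matrix.image.platform }}"

The key detail is simply that prepare-matrices.py prints a single JSON object with images and imagePlatforms arrays, so any downstream job can parse it with fromJSON to drive the per-platform builds and goss tests.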