
USE_TMPFS=all and the like: after a builder finishes, it continues to hold its tmpfs space even while idle for a notable time #1149

markmi opened this issue Apr 29, 2024 · 0 comments
markmi commented Apr 29, 2024

Prerequisites

  • Have you checked for an existing issue describing your problem?
  • Are you running the latest version?
  • Is your ports tree recent?
  • Is your FreeBSD Host on a supported release? main (but not main specific)

Describe the bug

[This might be viewed as an enhancement request. I had to pick a category. The issue is not limited to FreeBSD's main.]

After a builder finishes, it continues to hold its tmpfs space even while idle for a notable time. With USE_TMPFS=all or the like this can greatly increase the RAM+SWAP requirement, especially when the next available builder to use is not chosen as the one with the largest tmpfs use (whose tmpfs would then be resized by the setup of the next build).

Example from a 32-hardware-thread system with 32 builders active (only builders 02 through 04 are shown here):

# df -m -t tmpfs | sort -k6
Filesystem 1M-blocks Used  Avail Capacity  Mounted on
. . .
tmpfs         576705 1359 575346     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/02
tmpfs           2048    0   2047     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/02/.p
tmpfs         577331 1985 575346     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/02/usr/local
tmpfs         576700 1353 575346     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/03
tmpfs           2048    0   2047     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/03/.p
tmpfs         577331 1985 575346     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/03/usr/local
tmpfs         576703 1356 575346     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/04
tmpfs           2048    0   2047     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/04/.p
tmpfs         577331 1985 575346     0%    /usr/local/poudriere/data/.m/main-amd64-bulk_a-default/04/usr/local
. . .
(The header line is shown first here; `sort -k6` actually moves it to the end of the output.)

Imagine a system with 1024 hardware threads and a desire to use a large fraction of them. FreeBSD has had work on making such large hardware-thread counts more reasonable to use.

This happens even when the idle builder's tmpfs space is large (say, after rust or an electron* port or the like was built).
(My example context above did not happen to include such a case at the time.)

This interferes with using MUTUALLY_EXCLUSIVE_BUILD_PACKAGES to cut down on RAM+SWAP use.
TMPFS_BLACKLIST/TMPFS_BLACKLIST_TMPDIR can help with the largest cases (though it is not always obvious which ports will be large in a given poudriere usage context), but they are not a reasonable way to cover a large number of medium-to-small cases. ("Large" and "medium" are probably best viewed as relative to the RAM+SWAP available.)
(Of course tmpfs use can be avoided entirely, but that has other tradeoffs.)
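For reference, the knobs mentioned above look roughly like this in poudriere.conf. The glob patterns and the TMPDIR path are illustrative values, not a recommendation, and should be checked against poudriere.conf.sample for your version:

```shell
# poudriere.conf sketch (illustrative values):
USE_TMPFS=all
# Build the known-huge ports on disk instead of tmpfs:
TMPFS_BLACKLIST="rust* llvm* electron*"
# Where their work directories go instead of tmpfs:
TMPFS_BLACKLIST_TMPDIR=/usr/local/poudriere/data/cache/tmp
```

As the paragraph above notes, this only helps for cases you can name in advance; it does not address many medium-to-small builds collectively holding tmpfs space while idle.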

How to reproduce

There are many ways to have builders be used and then sit idle for a notable time afterwards.

One way to see the tmpfs usage is via: # df -m -t tmpfs | sort -k6
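To put a single number on how much tmpfs space the builders are holding, the df output can be summed. This is a small sketch, not part of poudriere; the poudriere data path in the usage line is the one from this report, so adjust it for your setup:

```shell
#!/bin/sh
# Sketch: sum the "Used" megabytes held by tmpfs mounts under a given
# path prefix, reading `df -m -t tmpfs` output on stdin.
tmpfs_used_mb() {
    prefix="$1"
    # Column 3 is Used (MB), column 6 is "Mounted on"; the header line
    # never matches the prefix, so it is skipped naturally.
    awk -v p="$prefix" 'index($6, p) == 1 { sum += $3 } END { print sum + 0 }'
}

# Usage:
#   df -m -t tmpfs | tmpfs_used_mb /usr/local/poudriere/data/.m/
```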

Expected behavior

Instead of waiting for a builder to be reused before resizing its tmpfs, do something at the end of the builder's activity to shrink (or empty) it.
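A minimal sketch of that idea, assuming an idle builder's tmpfs can simply be emptied (tmpfs releases its backing pages as files are removed, so the Used column drops immediately). The function name and the notion of calling it from an end-of-build path are assumptions for illustration, not existing poudriere code:

```shell
#!/bin/sh
# Hypothetical hook: empty an idle builder's tmpfs so the RAM backing
# its files is released right away, instead of waiting for the next
# build's setup to recreate the mount.
shrink_idle_builder_tmpfs() {
    mnt="$1"
    # -xdev stays on this one filesystem, so nested mounts (e.g. the
    # .p or usr/local tmpfs mounts) are left alone; errors from busy
    # entries are ignored in this sketch.
    find "$mnt" -mindepth 1 -xdev -delete 2>/dev/null || true
}
```

On a real system the argument would be a builder mountpoint such as those in the df output above; a plain directory behaves the same way for the deletion logic itself.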

Screenshots

N/A

Environment

  • Host OS [e.g. 12.2 amd64]: main-n269589-9dcf39575efb amd64 amd64 1500018 1500018 based
  • Jail OS [e.g. 12.0 powerpc]: main-n269589-9dcf39575efb amd64 amd64 1500018 1500018 based
  • Browser: [e.g. chrome, safari]: N/A
  • Poudriere Version [e.g. 3.3.1 or git hash or port version]: poudriere-git-3.4.99.20240419
  • Ports branch and revision [e.g. 2020Q3 r550754]: 62a76b7dc95a (HEAD -> main, freebsd/main, freebsd/HEAD)

Note: not host/jail specific. Not platform specific. Not a new issue but a long term one.

Additional context

Does dealing with this trace back into needing any FreeBSD changes?

markmi added the bug label Apr 29, 2024