
feat: LRU cache for dependencies #98

Merged
merged 8 commits into from
Oct 31, 2023

Conversation

OliLay (Collaborator) commented Oct 25, 2023

Implements LRU eviction for the dependency cache. The cache size is configurable and defaults to 10G.

In a follow-up PR we can also abstract the cache and reuse it, e.g. for object file caching.
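The eviction strategy described above can be sketched with an `OrderedDict`, which is also what the reviewed code below uses via `popitem(last=False)`. This is a minimal illustrative sketch, not the actual homcc implementation; the class name and in-memory values are assumptions for demonstration:

```python
from collections import OrderedDict


class LRUCache:
    """Minimal sketch of an LRU cache with a byte-size limit (illustrative only)."""

    def __init__(self, max_size_bytes: int = 10 * 1024**3):  # default: 10G
        self.cache: "OrderedDict[str, bytes]" = OrderedDict()
        self.max_size_bytes = max_size_bytes
        self.current_size = 0

    def get(self, key: str):
        if key not in self.cache:
            return None
        self.cache.move_to_end(key)  # mark as most recently used
        return self.cache[key]

    def put(self, key: str, value: bytes):
        if key in self.cache:
            self.current_size -= len(self.cache[key])
        self.cache[key] = value
        self.cache.move_to_end(key)
        self.current_size += len(value)
        # Evict least recently used entries until we are under the limit.
        while self.current_size > self.max_size_bytes:
            _, evicted = self.cache.popitem(last=False)
            self.current_size -= len(evicted)
```

The real cache stores file paths and sizes on disk rather than byte strings in memory, but the ordering and eviction logic follow the same pattern.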

@OliLay OliLay requested a review from spirsch October 25, 2023 13:43
spirsch (Collaborator) left a comment

Currently the config takes the max size in bytes. I would prefer if the user could supply a "size string", e.g. "100M" or "10G", as some Unix tools allow. That would make it easier to configure; what do you think?

As discussed in person, having a quick, little parsing step for M&G suffixed parameters sounds reasonable.
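Such a parsing step could look roughly like the following. This is a hedged sketch only; the function name `parse_size` and the supported suffixes are assumptions, not the actual homcc config API:

```python
def parse_size(size: str) -> int:
    """Parse a human-readable size string like "100M" or "10G" into bytes.

    Illustrative sketch: accepts K/M/G/T suffixes (powers of 1024) or a
    plain integer byte count.
    """
    suffixes = {"K": 1024, "M": 1024**2, "G": 1024**3, "T": 1024**4}
    size = size.strip().upper()
    if size and size[-1] in suffixes:
        return int(size[:-1]) * suffixes[size[-1]]
    return int(size)  # no suffix: interpret as bytes
```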

In a follow-up PR we can also abstract the cache and re-use it e.g. for object file caching.

This would be awesome, yeah! 🚀

(Resolved review threads on homcc/server/cache.py and tests/server/cache_test.py)
@OliLay OliLay marked this pull request as ready for review October 27, 2023 09:25
spirsch (Collaborator) left a comment

Nice! 👍

"""
oldest_hash, oldest_path_str = self.cache.popitem(last=False)
oldest_path = Path(oldest_path_str)
oldest_size = oldest_path.stat().st_size
Collaborator:

If the file does not exist, isn't this going to raise an exception?

OliLay (Collaborator, Author):

Very good point! This also affects the cache size calculation. I moved the stat call into the try block and adjusted the log message to note that the cache size calculation may be wrong if files are deleted by other processes.
A different solution would be to store the size of each cache entry in memory, but I don't really like that. In the end, cache entries being deleted by other processes is probably a rare edge case, so I think handling it like this is fine.

@OliLay OliLay merged commit 9877a29 into main Oct 31, 2023
7 checks passed
@OliLay OliLay deleted the layer/lru branch October 31, 2023 09:44