
Hammerspoon Intermediate Gamma Issues #1428

Open
jjf4 opened this issue May 16, 2017 · 3 comments

Comments

@jjf4

jjf4 commented May 16, 2017

Hi,
I've been playing around with a method to increase and decrease the "brightness" of my external monitor with keyboard shortcuts via using hs.screen.getGamma and hs.screen.setGamma to increase or decrease the whitepoint rgb values of my monitor, giving the same effect as changing the brightness. Unfortunately, there seems to be some issues with api, when it comes to reloading the config, or disconnecting and reconnecting the monitor. For example, if I lower the gamma values on my monitor to .5, disconnect the monitor, and then reconnect the monitor it restores the white point rgb values to 1. That part is fine, but the problem is that if I then try to adjust the gamma values, the screen instantly dims, and the max value I can set it at is now .5. To fix the issue, I have to call restoreGamma, and then reload my script. setGamma seems to be capping the values for some reason, based on the current gamma value of the monitor. This issue seems to be related to #917 and I'm wondering if there's anyway to fix this, or any workarounds that could work without requiring the script to be reloaded everytime.

@jjf4
Author

jjf4 commented May 17, 2017

Having looked into the source code, the issue seems to come from here.
It appears that whenever a new screen is added, or the init file is reloaded, hs.screen grabs the current whitepoint gamma data for each screen. Normally this is 1, but if the gamma was adjusted before the screen was disconnected and reconnected, it stores that adjusted value as the original instead. The problem arises when setGamma is invoked: for some reason, it multiplies the input gamma value by the original gamma value before storing it. This is fine when the original gamma is 1, but if it is, say, .5, then every call to setGamma will actually decrease the gamma, even if you're trying to increase it. @asmagill & @cmsj, can you clarify why setGamma is scaled by the original gamma instead of the gamma capacity?
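To make the compounding concrete, here is a pure-Lua simulation of the scaling just described (originalGamma and simulatedSetGamma are stand-ins for the module's internal behavior, not Hammerspoon's actual code):

```lua
-- Simulate the behavior described above: the module captures an
-- "original" gamma when the screen appears, and setGamma multiplies the
-- requested value by it before applying.
originalGamma = 0.5  -- captured after reconnecting a previously dimmed screen

function simulatedSetGamma(requested)
  return requested * originalGamma
end

-- Requesting .5 again actually yields .25, and requesting the full 1.0
-- only reaches .5: every call can only shrink the effective gamma.
print(simulatedSetGamma(0.5))  -- 0.25
print(simulatedSetGamma(1.0))  -- 0.5, the maximum reachable value
```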

@cmsj
Member

cmsj commented May 17, 2017

I think I see why this is happening, and I think you could work around it with the following:

  • Keep track of the gamma you want for each display (hopefully your displays are uniquely identifiable somehow)
  • Run an hs.screen.watcher that:
    • calls restoreGamma() whenever the screen layout changes
    • re-applies your preferred gamma values for each display that is actually present

Hope that helps!
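A rough sketch of that watcher approach (the preferredGamma table, its keys, and the values in it are my assumptions; hs.screen.watcher, hs.screen.restoreGamma, hs.screen.allScreens, getUUID, and setGamma are the real Hammerspoon APIs discussed in this thread):

```lua
-- Remember the gamma you want per display, keyed by screen UUID
-- (assumed to uniquely identify a display across reconnects).
preferredGamma = {
  ["hypothetical-uuid-1234"] = {
    whitepoint = { red = 0.5, green = 0.5, blue = 0.5 },
    blackpoint = { red = 0.0, green = 0.0, blue = 0.0 },
  },
}

-- On any screen-layout change: reset every display to defaults, then
-- reapply the preferred gamma for each display that is actually present.
screenWatcher = hs.screen.watcher.new(function()
  hs.screen.restoreGamma()
  for _, screen in ipairs(hs.screen.allScreens()) do
    local wanted = preferredGamma[screen:getUUID()]
    if wanted then
      screen:setGamma(wanted.whitepoint, wanted.blackpoint)
    end
  end
end)
screenWatcher:start()
```

This is an init.lua config fragment requiring the Hammerspoon runtime, so it can't run standalone. (As the next comment explains, it still runs into the stored-original-gamma scaling.)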

@jjf4
Author

jjf4 commented May 17, 2017

Yeah, I managed to find a workaround. Unfortunately, the solution you suggested will not work. Say, for example, the previous gamma was set to .5, and then I disconnected and reconnected the monitor. I can call restoreGamma (actually it appears to be called automatically), which restores the gamma to 1. However, when the screen was attached, Hammerspoon retrieved the gamma value of .5 and stored it as the original gamma. So when I try to adjust the gamma back to .5 with setGamma, it actually sets it to .25.
Instead, my workaround is to keep track of the original gamma values that Hammerspoon stores, by retrieving them myself when a monitor is added. Then, when I call setGamma, I first divide the value I want to set by this original gamma to counteract the scaling done by setGamma.

-- origVals maps screen IDs to the whitepoint gamma tables captured when
-- each screen was attached; it is populated elsewhere (e.g. from a
-- screen watcher) by calling screen:getGamma() as soon as a screen appears.
function changeColors(screen, offset)
  local newWhite = screen:getGamma()['whitepoint']
  local newBlack = screen:getGamma()['blackpoint']
  local origWhite = origVals[screen:id()]
  -- Divide by the stored "original" gamma to cancel out setGamma's
  -- internal multiplication by that same value.
  newWhite['red'] = (newWhite['red'] + offset)/origWhite['red']
  newWhite['blue'] = (newWhite['blue'] + offset)/origWhite['blue']
  newWhite['green'] = (newWhite['green'] + offset)/origWhite['green']
  screen:setGamma(newWhite, newBlack)
end

hs.hotkey.bind({}, "f1", function()
  changeColors(hs.screen.mainScreen(), -.1)
end)
hs.hotkey.bind({}, "f2", function()
  changeColors(hs.screen.mainScreen(), .1)
end)
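For completeness, the origVals table referenced above could be populated like this (a sketch; snapshotting on config load and on every layout change is my assumption about when Hammerspoon captures its "original" gamma, and origValsWatcher is a hypothetical name):

```lua
-- Capture each screen's whitepoint gamma as soon as it appears, before
-- adjusting it, so changeColors can divide out the value Hammerspoon
-- stored internally for that screen.
origVals = {}

local function snapshotGammas()
  for _, screen in ipairs(hs.screen.allScreens()) do
    if not origVals[screen:id()] then
      origVals[screen:id()] = screen:getGamma()['whitepoint']
    end
  end
end

snapshotGammas()  -- screens already present at config load
origValsWatcher = hs.screen.watcher.new(snapshotGammas)
origValsWatcher:start()
```

Like the watcher sketch earlier, this is a Hammerspoon config fragment and only runs inside the hs runtime.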

This solution is pretty hacky, and the behavior of setGamma seems inconsistent. If I call screen:setGamma(screen:getGamma()['whitepoint'], screen:getGamma()['blackpoint']), ideally nothing should change, but with the current logic, if the original gamma is not 1, the gamma will change. Is there some use case for multiplying by the original gamma instead of the gamma capacity?
