playbackRate computation for slowdown seems to be wrong #230

Open
chrisguttandin opened this issue Feb 5, 2023 · 3 comments

@chrisguttandin

I think the playbackRate computation is wrong when the media element needs to slow down. At the very least it's not symmetric with the speed-up case, but maybe that's intentional.

Let's imagine the media element has a currentTime of 8.8 but it should be at the targetTime of 8.2. The diff computed as targetTime - currentTime is then -0.6. That means the following code ...

const rate = Math.max(
  0,
  ((diff + this._correctionTime) / this._correctionTime) *
    sourcePlaybackRate
);

... evaluates like this:

const rate = Math.max(0, ((-0.6 + 0.5) / 0.5) * sourcePlaybackRate);
const rate = Math.max(0, -0.2 * sourcePlaybackRate);
const rate = 0;

I guess when the media element has to slow down it should instead be something like the following:

const rate = sourcePlaybackRate / ((Math.abs(-0.6) + 0.5) / 0.5);
const rate = sourcePlaybackRate / 2.2;

What about changing the rate computation to something like this?

const rate = diff > 0
    ? ((diff + this._correctionTime) / this._correctionTime) * sourcePlaybackRate
    : sourcePlaybackRate / ((this._correctionTime - diff) / this._correctionTime);
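To make the comparison concrete, here is a self-contained sketch of both formulas (a correctionTime of 0.5 and a sourcePlaybackRate of 1 are assumed example values, not the library's defaults):

```javascript
// Sketch comparing the current and proposed rate formulas.
// correctionTime = 0.5 and sourcePlaybackRate = 1 are assumed example values.
const correctionTime = 0.5;
const sourcePlaybackRate = 1;

// Current approach: treat the correction time as wall-clock time,
// clamping to 0 so playback never goes backwards.
function currentRate(diff) {
  return Math.max(
    0,
    ((diff + correctionTime) / correctionTime) * sourcePlaybackRate
  );
}

// Proposed approach: divide instead of multiply when slowing down.
function proposedRate(diff) {
  return diff > 0
    ? ((diff + correctionTime) / correctionTime) * sourcePlaybackRate
    : sourcePlaybackRate / ((correctionTime - diff) / correctionTime);
}

console.log(currentRate(-0.6));  // 0: playback would pause entirely
console.log(proposedRate(-0.6)); // ~0.4545, i.e. sourcePlaybackRate / 2.2
```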
@tjenkinson
Owner

tjenkinson commented Feb 5, 2023

Hey @chrisguttandin. The idea was that at a given moment in time we figure out how far off the time is, and change the rate so that after the correction time it would be exactly the right time.

If we pretend that the correction time is 1000ms and the source playback rate is 1 to make the numbers nicer, then the current approach gives

[-1000, -800, -600, -400, -200, 0, 200, 400, 600, 800, 1000].map((diff) => (diff + 1000) / 1000)

which becomes

[0, 0.2, 0.4, 0.6, 0.8, 1, 1.2, 1.4, 1.6, 1.8, 2]

and your suggestion

[-1000, -800, -600, -400, -200, 0, 200, 400, 600, 800, 1000].map((diff) => (diff > 0 ? (diff + 1000) / 1000 : 1 / ((1000 - diff) / 1000)))

which becomes

[0.5, 0.5555555555555556, 0.625, 0.7142857142857143, 0.8333333333333334, 1, 1.2, 1.4, 1.6, 1.8, 2]

So if we were 400ms ahead (-400), then with the current approach the rate would be 0.6, so that after 1000ms we would have saved 400ms and be in sync again. If we were 1000ms ahead then it would be 0, so that 1000ms later it would be in sync. If we were more than 1000ms ahead then it would be impossible to be in sync after 1000ms, and the clamp to 0 is there so that it doesn't attempt to play backwards (if browsers even supported that).
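This catch-up arithmetic can be sanity-checked with a small sketch (a 1000ms correction time and a source rate of 1 assumed, as in the example above):

```javascript
// Verify that the current formula brings playback back in sync after
// exactly the correction time (1000ms and a source rate of 1 assumed).
const correctionTime = 1000;

function rate(diff) {
  return Math.max(0, (diff + correctionTime) / correctionTime);
}

// Playhead 400ms ahead: diff = targetTime - currentTime = -400.
const diff = -400;
const r = rate(diff); // 0.6

// Over the next 1000ms of wall-clock time the media advances r * 1000ms
// while the target advances 1000ms, so the new offset is:
const newDiff = diff + (correctionTime - r * correctionTime);
console.log(r, newDiff); // 0.6 0 -> back in sync after the correction time
```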

So it looks like with your idea, if we were 400ms ahead, it wouldn't actually slow down as much (0.7142857142857143) and would therefore be back in sync later than 1 second into the future? This means the recovery time would be different if you are X ms ahead vs X ms behind?

@chrisguttandin
Author

Thanks a lot for looking into this, Tom.

I think I understand now what the difference is. Your approach applies the correction time as wall-clock time, whereas I tried to change the formula to apply the correction time as if it were media time, but only when the playhead is ahead of time.

I still think it might be a good idea to do that. Thinking of the correction time as wall-clock time is really nice when the playhead needs to catch up, but it results in very extreme playbackRate values when the playhead is ahead of time, with 0 as the most extreme value. On the other hand, thinking of the correction time as media time results in moderate playbackRate changes when the playhead is ahead of time, but the playbackRate can grow towards infinity when the playhead needs to catch up. Combining both formulas might be a good idea.
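The extremes described here can be sketched directly (again assuming a 1000ms correction time and a source rate of 1):

```javascript
// Illustrate the extremes of the two interpretations of the correction time.
// correctionTime = 1000 and a source rate of 1 are assumed example values.
const correctionTime = 1000;

// Correction time as wall-clock time (the library's current formula).
const wallClock = (diff) => Math.max(0, (diff + correctionTime) / correctionTime);

// Correction time as media time (the alternative discussed in this thread).
const mediaTime = (diff) => 1 / ((correctionTime - diff) / correctionTime);

console.log(wallClock(-1000)); // 0: playback stops entirely
console.log(mediaTime(-1000)); // 0.5: only a moderate slowdown
console.log(wallClock(900));   // 1.9: moderate speed-up
console.log(mediaTime(900));   // 10: grows without bound as diff approaches correctionTime
```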

But I get that this was not the intention of this library. Should I close the issue?

@tjenkinson
Owner

hey @chrisguttandin I can see how it means the rate isn’t as abrupt when playback is ahead and needs to slow down but I’m not really understanding why that’s better 🤔

Is playback at 0.8x worse than playback at 1.2x? I’m still not really sure why the behaviour for speeding up and slowing down shouldn’t be symmetrical

I think it could be quite subjective, so maybe we could extract the code that takes the diff and the source playback rate and calculates the new rate into a function? Then the existing behaviour could be the default, but people could provide a custom calculator if they like.
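A minimal sketch of what such a pluggable calculator could look like; all names here (`createRateCalculator`, `calculate`) are hypothetical and not the library's actual API:

```javascript
// Hypothetical pluggable rate calculator. The option names are illustrative
// only; this is a sketch of the idea, not the library's real interface.
function createRateCalculator({ correctionTime, calculate } = {}) {
  // Existing behaviour: correction time as wall-clock time, clamped at 0.
  const defaultCalculate = (diff, sourcePlaybackRate) =>
    Math.max(0, ((diff + correctionTime) / correctionTime) * sourcePlaybackRate);
  return (diff, sourcePlaybackRate) =>
    (calculate ?? defaultCalculate)(diff, sourcePlaybackRate);
}

// Default behaviour:
const defaultRate = createRateCalculator({ correctionTime: 0.5 });

// Custom behaviour, e.g. the media-time slowdown from this thread:
const customRate = createRateCalculator({
  correctionTime: 0.5,
  calculate: (diff, src) =>
    diff > 0 ? ((diff + 0.5) / 0.5) * src : src / ((0.5 - diff) / 0.5),
});

console.log(defaultRate(-0.6, 1)); // 0: current clamped behaviour
console.log(customRate(-0.6, 1));  // ~0.4545: gentler slowdown
```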
