TL;DR

  • The adaptive refresh rate (ARR) feature in Android 15 enables the display refresh rate to adapt to the frame rate of content (see the sketch after this list).
  • The ARR feature reduces power consumption and jank as it lets devices operate at lower refresh rates without the need for mode switching.
  • While previous versions of Android supported multiple refresh rates, they did so by switching between discrete display modes.
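  As a rough illustration of the app-facing side, here is a minimal Kotlin sketch, not from the article: it assumes the standard Surface.setFrameRate() hint (available since API 30) and the Android 15 Display.hasArrSupport() query are the relevant calls, and the helper name is invented for the example. The app only declares its content frame rate; whether the platform satisfies that hint with a seamless in-mode adjustment (ARR) or a discrete mode switch is up to the system.

```kotlin
import android.os.Build
import android.view.Display
import android.view.Surface

// Sketch only: the app-side half of adaptive refresh. Surface.setFrameRate()
// has existed since API 30; Display.hasArrSupport() is assumed here to be the
// Android 15 (API 35) way to ask whether the panel can adapt in-mode.
fun hintContentFrameRate(surface: Surface, display: Display, contentFps: Float): Boolean {
    // FIXED_SOURCE marks this as fixed-rate content (e.g. video playback),
    // so the system should try to align refreshes with it rather than treat
    // it like ordinary UI rendering.
    surface.setFrameRate(
        contentFps,
        Surface.FRAME_RATE_COMPATIBILITY_FIXED_SOURCE
    )

    // True when the connected display reports adaptive refresh rate support,
    // i.e. it can follow the hint without a discrete display mode switch.
    return Build.VERSION.SDK_INT >= 35 && display.hasArrSupport()
}
```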
  • bossjack@lemmy.world · 8 days ago

    Android Authority's TL;DR (conveniently) doesn't mention the actual downside to this update. But it's fine imo, since this was actually a pretty insightful read.

    My TL;DR:

    • Google’s ARR/VRR implementation is hopefully more compatible with the GKI (Generic Kernel Image) system than the current per-vendor, per-device implementations
    • To add this support, vendors must implement version 3 of the Hardware Composer (HWC) Hardware Abstraction Layer (HAL) APIs.
    • That also means undoing existing kernel changes for their devices and retooling them to support the HWC/HAL v3 APIs. Lots of engineering time.
    • This solution still isn’t perfect. There’s a notable limitation in something called the “panel’s Tearing Effect”, but I’m not an expert on displays, so CTRL+F the article for the paragraph in question.
    • Quack Doc@lemmy.world · 8 days ago

      This is not traditional VRR as we usually think of it. VRR as we think of it changes the “frame rate” of the monitor to better suit the frame pacing of the received frames, and that is not what’s happening here. This is how things like FreeSync work: it takes the display’s refresh rate, say 60 Hz, and slows it down to better match the frame pacing of the content, say 48 fps. The monitor doesn’t physically change states or anything; it just allows flexible updating to match the frame pacing.

      You don’t get that with this adaptive refresh rate method.

      Here the display effectively performs a no-op on every refresh cycle it doesn’t need. It’s still good, but not as good as what most people think of as VRR (FreeSync, VESA Adaptive-Sync, G-Sync, etc.). You are limited to the steps your display can output. For this to be useful you need a high refresh rate display, like 120 Hz, because each application’s frame rate needs to align with a refresh step.

      For example, say you have a 24 fps video on a 120 Hz display: the display won’t change its frame pacing; instead each video frame gets one real refresh followed by four no-op cycles (24 × 5 = 120). Now assume you have a 90 Hz display: 24 doesn’t divide evenly into 90, so you have to either wait for the next sync or accept tearing. The first option leads to judder (which can probably be mitigated using offset sync waits?); the second one is, well, tearing.
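      To make that arithmetic concrete, here is a toy Kotlin sketch (numbers only, not any real display API) of the alignment described above: every content frame waits for the first panel refresh at or after its ideal presentation time, so a clean divisor gives steady pacing with no-op cycles in between, while a non-divisor gives an uneven cadence, i.e. judder.

```kotlin
import kotlin.math.ceil

// Toy model of an ARR-style panel with a fixed refresh step: each content
// frame is shown at the first panel refresh at or after its ideal time, and
// the panel simply no-ops on the cycles in between.
fun cadence(contentFps: Int, panelHz: Int, framesToShow: Int = 8): List<Int> {
    val cyclesPerFrame = mutableListOf<Int>()
    var previousSync = 0
    for (frame in 1..framesToShow) {
        // Ideal presentation time of this content frame, in seconds.
        val idealTimeSec = frame.toDouble() / contentFps
        // Index of the first panel refresh at or after that time
        // (the small epsilon guards against floating-point round-up).
        val sync = ceil(idealTimeSec * panelHz - 1e-9).toInt()
        cyclesPerFrame += sync - previousSync
        previousSync = sync
    }
    return cyclesPerFrame
}

fun main() {
    // 24 fps on 120 Hz: 120 / 24 = 5, so every frame spans exactly 5 panel
    // cycles: one real scan-out plus four no-ops. Steady pacing, no judder.
    println(cadence(24, 120)) // [5, 5, 5, 5, 5, 5, 5, 5]

    // 24 fps on 90 Hz: 90 / 24 = 3.75, so frames alternate between 4 and 3
    // cycles (≈44.4 ms vs ≈33.3 ms on screen). That uneven cadence is judder.
    println(cadence(24, 90))  // [4, 4, 4, 3, 4, 4, 4, 3]
}
```

      With the FreeSync-style VRR described further up, the panel would instead simply hold each refresh interval at roughly 1000/24 ≈ 41.7 ms, so neither the no-op padding nor the uneven cadence would be needed.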