Many laptops today come with two GPUs (Graphics Processing Units). The GPU is the chip in the laptop that generates the display. Today’s GPUs are extremely powerful chips that render 3D images, handling shading, video decompression and other graphical tasks. Unfortunately for laptops, they also tend to draw large amounts of power.

To combat the power drain of modern GPUs, some laptops provide two different chips. One draws little power and is relatively anemic: modern games would crawl on it, but Word documents would scroll quickly. The other chip is powerful but draws considerably more power.
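
To make the two-chip setup concrete, here is a minimal C++ sketch using the Windows DXGI API to list the display adapters a machine exposes. On a dual-GPU laptop it will typically print both the integrated and the discrete chip, along with their dedicated video memory. This is purely illustrative; it is not part of any vendor's switching machinery.

```cpp
// Minimal sketch: enumerate the GPUs Windows knows about via DXGI.
// On a dual-GPU laptop this typically lists both the integrated and
// the discrete adapter. Compile with MSVC and link against dxgi.lib.
#include <dxgi.h>
#include <cstdio>

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n",
                i, desc.Description, desc.DedicatedVideoMemory / (1024 * 1024));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```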

Some early examples of these setups required manual user intervention. Some laptops had a physical switch the user could throw; Apple’s computers required the user to select which GPU to use, then log out and log back in.

Graphics vendors looked for a better solution. ATI (owned by AMD) created a system that monitored the laptop’s power source. When the laptop was plugged in, the high-power GPU was activated. When the laptop was on battery power, the low-power GPU was turned on.
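
The policy itself is simple enough to express in a few lines. The sketch below uses the real Win32 GetSystemPowerStatus call to read the power source; SelectGpu is hypothetical, a stand-in for the switch that actually lives inside the driver.

```cpp
// Illustration of a battery-based switching policy, not ATI's actual
// driver code. GetSystemPowerStatus is a real Win32 API; SelectGpu is
// a hypothetical stand-in for the driver-internal GPU switch.
#include <windows.h>
#include <cstdio>

enum class Gpu { LowPower, HighPower };

void SelectGpu(Gpu gpu) {  // hypothetical: the real switch lives in the driver
    printf("Switching to the %s GPU\n",
           gpu == Gpu::HighPower ? "high-power" : "low-power");
}

int main() {
    SYSTEM_POWER_STATUS status;
    if (GetSystemPowerStatus(&status)) {
        // ACLineStatus: 0 = on battery, 1 = plugged in, 255 = unknown
        SelectGpu(status.ACLineStatus == 1 ? Gpu::HighPower : Gpu::LowPower);
    }
    return 0;
}
```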

NVIDIA came out with a more comprehensive solution, called Optimus. Optimus works by monitoring the calls being made to the Windows graphics subsystem. When it sees calls that might benefit from the high-power GPU, it powers that GPU up. When the calls stop coming, it powers it back down. On Windows this isn’t foolproof, so NVIDIA maintains a constantly updated set of profiles describing which applications need the high-power GPU. These profiles are downloaded in the background to users’ laptops.
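
For applications the profiles miss, NVIDIA also documents a developer opt-in: a program can export a well-known global variable, and the Optimus driver will route it to the high-power GPU when the process starts, no profile required. The snippet below follows NVIDIA’s published guidance:

```cpp
// Exporting this symbol asks the Optimus driver to run the process on
// the high-power GPU, even if no application profile exists for it.
#include <windows.h>

extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
```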

The ATI solution, which sometimes causes visible screen flickering when it switches, also means users have to activate the high-power GPU themselves if they need it while on battery power. The NVIDIA solution is much more seamless: in most cases users will never even know it’s happening; they’ll just see improved battery life. In the worst cases they’ll see application crashes or corrupted images, hopefully resolved when new profiles download.

Apple’s newest MacBooks work like Optimus: any call into the macOS graphics libraries that would benefit from the high-power GPU triggers a switch to it. Where the process differs from Optimus is that there are no application profiles. I don’t know whether macOS has a graphics stack more amenable to this kind of tracking, or whether Apple simply widened the net, figuring the occasional erroneous activation of the high-power GPU was an acceptable power cost.
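
On the Mac the trigger is at least partly visible to developers. On dual-GPU MacBook Pros, creating an OpenGL context historically activated the high-power GPU unless the application declared it could run on either chip. Here is a minimal sketch of that declaration using the CGL C API, assuming the behavior Apple documented for those machines; app bundles can make the same declaration with the NSSupportsAutomaticGraphicsSwitching Info.plist key.

```cpp
// Minimal sketch: ask macOS for a pixel format that permits automatic
// GPU switching, so creating a context doesn't force the high-power GPU.
// Build with: clang++ example.cpp -framework OpenGL
#include <OpenGL/OpenGL.h>
#include <cstdio>

int main() {
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFAAccelerated,
        kCGLPFASupportsAutomaticGraphicsSwitching,  // "don't pin me to the big GPU"
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = nullptr;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attrs, &pix, &npix) == kCGLNoError && pix != nullptr) {
        printf("Pixel format supports automatic graphics switching\n");
        CGLDestroyPixelFormat(pix);
    }
    return 0;
}
```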

The other minor difference is that Apple turns off the low-power GPU while the high-power GPU is active. It seems odd, but the ATI and NVIDIA solutions don’t do that: while the high-power GPU is active in those systems, both GPUs are drawing power.

The long-term question is whether this trend will continue, or whether mobile GPUs will start running in dual modes, automatically ramping performance (and power consumption) up and down based on load. My guess is that’s exactly where they’re headed.