When Google launched its Pixel 2 flagship smartphone[1] last year, it included something of a surprise: a co-processor called Pixel Visual Core, the company's first homegrown, consumer-facing piece of silicon. And while that feels like a momentous foray into chip design, the co-processor has lain dormant for months. Monday, Pixel Visual Core goes to work.

As it turns out, and as Google had previously hinted[2], the hidden chip inside every Pixel 2 serves a narrow but critical purpose. It will put its eight custom cores and its ability to crunch 3 trillion operations per second to work making your photos look better. Specifically, the photos you take through third-party apps like Instagram, WhatsApp, and Snapchat.

Those are the three launch partners for Pixel Visual Core's switch-flipping; since the chip is open to all developers, more will presumably follow. They'll all gain the power to produce Google's HDR+ images, photos that rely on a series of post-processing tricks to make shots from the Pixel appear more balanced and lifelike. Photos taken with the Pixel Camera app have benefited from HDR+ since launch; that's one reason the Pixel 2 earned the highest marks yet given to a smartphone by industry-standard photo-rater DxOMark[3]. But Pixel Visual Core will extend the feature to the streams, feeds, and snaps of Pixel owners as well, after an update that rolls out early this week.

HDR+

To understand why Google would devote its first homemade smartphone processor to a relatively narrow function (not just photography, but HDR+ specifically), it helps to know how central HDR+ is to the Pixel's photo prowess. For starters, it's not the HDR you're used to.

“HDR+ actually works shockingly differently,” says Isaac Reynolds, product manager for Pixel Camera. Where HDR...
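Google's published burst-photography research describes the basic recipe: instead of bracketing long and short exposures the way conventional HDR does, HDR+ captures a rapid burst of identically underexposed frames, aligns them, and averages them to beat down noise before tone mapping lifts the result. Here is a minimal Python sketch of that merge-and-brighten idea; it is illustrative only, not Google's actual pipeline, and the frame data, gamma value, and function names are all placeholders:

```python
import numpy as np

def merge_burst(frames):
    """Average an aligned burst of underexposed frames.

    Averaging N frames suppresses per-pixel noise by roughly sqrt(N),
    which is what lets HDR+ start from dark, blur-free exposures.
    """
    return np.stack(frames, axis=0).mean(axis=0)

def tone_map(image, gamma=0.5):
    """Brighten the merged result with a simple global gamma curve.

    The real pipeline uses far more sophisticated local tone mapping;
    a global gamma is just the simplest stand-in.
    """
    return np.clip(image, 0.0, 1.0) ** gamma

# Hypothetical usage: simulate ten noisy, underexposed frames of a flat
# gray scene and recover a cleaner, brighter image from the burst.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 0.2, dtype=np.float32)  # deliberately dark
burst = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(10)]
print(tone_map(merge_burst(burst)).round(3))
```

The point of the sketch is only the shape of the computation: many dark frames in, one clean, bright frame out.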

Read more from our friends at Wired