In other places — like when initializing a CGContext for other drawing operations — it’s common to use CGColorSpaceCreateDeviceRGB when creating a CGColorSpaceRef. This will create an sRGB colorspace on most devices, and we’ll lose our wide color information. Most of the initial work for wide color on Instagram was tracking down everywhere that this color space was hard-coded.
Instead, we can see if our screen supports wide color (using UIScreen.mainScreen.traitCollection.displayGamut), and if so, use CGColorSpaceCreateWithName(kCGColorSpaceDisplayP3). Again, we found that creating a wrapper that returns the appropriate colorspace for that device was helpful.
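A wrapper along these lines might look like the sketch below. The helper name and the sRGB fallback are assumptions for illustration, not Instagram's actual code:

```objc
#import <UIKit/UIKit.h>
#import <CoreGraphics/CoreGraphics.h>

// Hypothetical helper: returns a Display P3 colorspace on wide-gamut
// screens, falling back to sRGB everywhere else. Follows the Core
// Foundation "Create" rule, so the caller must CGColorSpaceRelease it.
static CGColorSpaceRef IGCreateDisplayColorSpace(void)
{
    if (UIScreen.mainScreen.traitCollection.displayGamut == UIDisplayGamutP3) {
        return CGColorSpaceCreateWithName(kCGColorSpaceDisplayP3);
    }
    return CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
}
```

Funneling every CGColorSpaceRef through one helper like this also makes the hard-coded-colorspace audit a one-time job: future call sites pick up the device-appropriate colorspace automatically.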
Instagram uses OpenGL for most of its image editing and filtering. OpenGL isn’t color managed; it operates on a numeric range (say, 0.0 to 1.0), and it’s up to the output surface to determine which colors those values actually map to.
The good news is that this meant we had to make very few changes to make our GL pipeline wide-color compatible. The biggest change was to ensure that when we extracted pixel buffers from our GL surface, we were using the appropriate colorspace before converting from a
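One sketch of that extraction step, under assumed details (glReadPixels into an RGBA8 buffer, wrapped via CGBitmapContextCreate — not Instagram's actual pipeline code):

```objc
#import <OpenGLES/ES2/gl.h>
#import <CoreGraphics/CoreGraphics.h>

// Sketch: read RGBA pixels back from the current GL framebuffer and wrap
// them in a CGImage tagged with the given colorspace, so the raw 0.0–1.0
// values are interpreted as Display P3 (or sRGB) instead of being
// assumed sRGB downstream.
static CGImageRef IGCreateImageFromGLSurface(GLint width, GLint height,
                                             CGColorSpaceRef colorSpace)
{
    size_t bytesPerRow = (size_t)width * 4;
    void *pixels = malloc(bytesPerRow * (size_t)height);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    CGContextRef context = CGBitmapContextCreate(pixels,
                                                 (size_t)width, (size_t)height,
                                                 8, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    // CGBitmapContextCreateImage copies the bits, so the buffer can be freed.
    CGImageRef image = CGBitmapContextCreateImage(context);
    CGContextRelease(context);
    free(pixels);
    return image;
}
```

The key point is the colorSpace argument: the pixel bytes are identical either way, and only the colorspace tag tells downstream consumers how to interpret them.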