
iPhone 7 & Colour Management

Apple’s iPhone 7 and 7 Plus continue the company’s plan to let us live in a more colourful world. Not only do both phones have screens with a wider colour gamut than previous models, but their cameras also capture more colours. Ever since the 1990s, desktop operating systems have had to employ system-level technology for dealing with varying colour gamuts between input, display and output. Apple were the first to tackle the problem with ColorSync for MacOS, and they seem to be leading the way for mobile operating systems as well.

Until recently the mobile world was largely one captured and displayed in sRGB, that venerable colour space based on old CRT monitor technology. Mobile phone camera systems created sRGB JPEGs and displayed them on roughly sRGB displays, so mobile apps could assume all RGB data was sRGB and simply leave the colour values alone. sRGB has always been the lowest common denominator for colour, and it is used in that way for many consumer-level imaging products and web browsing. However, over the last few years manufacturers like Samsung have begun to introduce phones and tablets that can display more vivid colours, but there has been little standardisation behind one colour space or operating-system-level colour management technology.

Apple, however, has begun to adopt the wider Digital Cinema Initiatives (DCI) colour gamut for its devices, and to back that up with changes to apps and operating systems. Every monitor, display or projector has a range of colours that it can reproduce, based on the colour of the red, green and blue pixels in the LCD matrix, the colour and strength of the backlight and a host of other factors.

As I said above, sRGB was based on the range of colours from an old-style CRT monitor, but technology has moved on a bit since the last century. LCD displays are capable of a much wider colour gamut, and it is partly the adoption of sRGB as a de facto colour standard that has, until recently, limited wider-gamut display technology to specialist photographic and video-editing monitors, such as those from EIZO. Apple’s first display to break away from sRGB was in the recently updated iMac. From the wide array of possible wider-gamut colour spaces available, they opted for the DCI standard. They have since used the same standard for the 9.7-inch iPad Pro and now the iPhone 7. Apple, as always, have come up with their own name for an existing technology, in this case the DCI colour space; they are calling it Wide Color.

The iPhone 7 goes further than the iPad Pro, though, in that not only does its display support the Wide Color gamut but its camera does as well, allowing it to capture a broader and deeper range of colours. Indeed, it can produce raw image files, not just processed and compressed JPEGs. These raw files can then be processed in the desktop versions of Adobe Lightroom or Photoshop and take advantage of the comprehensive colour management in those applications, including custom raw profiling. However, Apple has also made changes to iOS to allow mobile apps to capture, edit and display in the DCI gamut.

Displaying a larger range of colours would seem to have obvious advantages in terms of image quality: colours can be more saturated, and gradations and tonal detail improved. But there is a potential downside, in that some system-level way of managing colour needs to be implemented, as it can no longer be assumed that all RGB data is sRGB and all displays are sRGB. sRGB data will have to be rendered correctly to DCI, and DCI data (from the new camera especially) rendered correctly to sRGB when shared with other devices or posted via apps onto sites such as Facebook or Instagram. It’s not just on an iPhone 7 that changes will need to be made, either. If an iPhone 7 user uploads an image to Instagram and a friend then views that image on an iPhone 6, the image will have to be rendered from DCI to sRGB. The images below show the possible reduction in colour accuracy if the image data is not correctly colour managed.
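To make that rendering step concrete, here is a minimal Python sketch of how sRGB values can be mapped into Display P3 (Apple’s Wide Color, built on the DCI-P3 primaries with a D65 white point). The matrices are the standard published ones; this illustrates the maths only, and is not Apple’s actual ColorSync implementation.

```python
# Illustrative sRGB -> Display P3 conversion (not Apple's ColorSync code).

def srgb_decode(c):
    """Undo the sRGB transfer curve (sRGB and Display P3 share this curve)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_encode(c):
    """Re-apply the transfer curve."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Standard published 3x3 matrices (D65 white point).
SRGB_TO_XYZ = [[0.4124564, 0.3575761, 0.1804375],
               [0.2126729, 0.7151522, 0.0721750],
               [0.0193339, 0.1191920, 0.9503041]]
XYZ_TO_P3   = [[ 2.4934969, -0.9313836, -0.4027108],
               [-0.8294890,  1.7626641,  0.0236247],
               [ 0.0358458, -0.0761724,  0.9568845]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def srgb_to_display_p3(rgb):
    linear = [srgb_decode(c) for c in rgb]
    p3_linear = apply(XYZ_TO_P3, apply(SRGB_TO_XYZ, linear))
    return [srgb_encode(c) for c in p3_linear]

# Fully saturated sRGB red is a less extreme colour inside the wider P3 gamut:
print(srgb_to_display_p3([1.0, 0.0, 0.0]))  # roughly [0.917, 0.200, 0.139]
```

Because both spaces share the same transfer curve and white point, the whole conversion is just decode, a pair of 3×3 matrix multiplies, and re-encode; it is skipping exactly this step that produces the mismatches shown below.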

This is how my test image should look when correctly colour managed.

This is how my test image would look as a DCI gamut file incorrectly colour managed on an sRGB display.

And this is how an sRGB gamut version of the file would look when viewed on a DCI gamut display without correct colour management.

Thankfully, Apple have incorporated system-wide colour management into iOS since version 9.3, to the extent that it is now functionally similar to ColorSync on MacOS. All the frameworks that deal with colour and images, such as UIKit, Core Graphics and WebKit, have been updated to help developers support Wide Color correctly, so that the kinds of problems shown above don’t happen.

However, I’ve been involved in desktop colour management since the late 1990s, and in the early days the results were far from reliable. Even now it’s possible to open up a Mac or Windows web browser on a wide-gamut display and see garishly oversaturated colours, because the browser is assuming that the display is sRGB and not colour managing correctly. My guess is that it will take a few years yet before all the kinks are ironed out of mobile colour management. Camera and display technology will continue to advance, leaving sRGB far behind and making good system-level colour management crucial to image quality.
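That oversaturation can be quantified with the same matrix maths. If an un-managed app sends raw sRGB numbers straight to a P3 panel, the colour the panel actually emits, expressed back in sRGB coordinates, falls outside the sRGB gamut entirely. A hypothetical sketch, using the standard published D65 matrices rather than any browser’s real code:

```python
# What a viewer sees when raw sRGB values are mis-displayed as P3 values.
# Illustrative only; matrices are the standard published D65 ones.

P3_TO_XYZ   = [[0.4865709, 0.2656677, 0.1982173],
               [0.2289746, 0.6917385, 0.0792869],
               [0.0000000, 0.0451134, 1.0439444]]
XYZ_TO_SRGB = [[ 3.2404542, -1.5371385, -0.4985314],
               [-0.9692660,  1.8760108,  0.0415560],
               [ 0.0556434, -0.2040259,  1.0572252]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def decode(c):
    """Shared sRGB/Display P3 transfer curve."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Fully saturated sRGB red (1, 0, 0), wrongly driven as if it were P3 red.
# The panel emits P3 red; re-expressed in linear sRGB coordinates:
emitted_linear = apply(XYZ_TO_SRGB, apply(P3_TO_XYZ, [decode(1.0), 0.0, 0.0]))
print(emitted_linear)  # ~[1.22, -0.04, -0.02]
```

The red channel comes out above 1 and the others below 0, i.e. the emitted colour cannot be represented in sRGB at all: it is more saturated than anything the image author could have intended, which is exactly the garish look described above.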

Rob

Comments

  • Hi Rob,

    I just got a 5K Retina iMac and have since discovered that it uses the DCI P3 colour space.

    I have calibrated this monitor with a Spyder 4, and when I edit my photos on this machine, then convert them to sRGB and then view them on my iPhone 6, which I'm assuming has an sRGB display, the colours look very different.

    When I was using my previous iMac which had a display closer to sRGB I didn't encounter this issue.

    What do you suggest now when I edit my photos on the 5K iMac, to get them close to looking the same when I view them on my iPhone 6? Do I need to sync them to my phone, then tweak these images on my phone using Snapseed or some other app? It's just a real nuisance and I'm not sure what to do.

    Any advice would be appreciated.

    Thanks,
    Brian

    • Hi Brian

      If you calibrate the iMac to 2.2 gamma, 6500K colour temperature, and 120 cd/m2 luminance you should be fairly close to the phone. Obviously make sure you are viewing the pic on the Mac in a fully colour managed app like Photoshop, and also ensure you have not changed the colour temp of the phone with Night Shift. You can vary the brightness on the phone to help get a match. But at the end of the day one is a phone and the other a much higher end display device so you may not get a perfect match.

      Rob

    • The key thing may be the application you are using to view the images on the iMac. Something fully colour managed, like Photoshop, should take the profile the image is tagged with (sRGB) and then use the monitor profile to help display those sRGB colours accurately. What you should see is the sRGB image and not DCI. However, poorly colour managed apps can sometimes display the RGB values without conversion to the monitor profile, and so you'd see the colours as more intense on a DCI display. Make sure you embed the sRGB profile into the image as it will help apps like Safari colour manage better.

      However, even if everything is colour managed correctly on the Mac there is no guarantee that it will match an essentially un-colour managed iPhone. You may have just been lucky with the last Mac. The most likely area of difference is luminance - so try calibrating and profiling the iMac with varying levels of brightness.