Tuesday, October 22, 2024

iPhone 16 Adds JPEG XL

Jeremy Gray (Hacker News, Reddit):

Apple and its various software iterations have supported JPEG XL for at least a year, including in Finder, Preview, Final Cut Pro, Pages, Photos, Mail, Safari, and more. Adobe has also supported the format for a while, including in Adobe Camera Raw and Lightroom Classic.

Despite JPEG XL supporting reversible JPEG transcoding and being superior to JPEG in terms of quality and efficiency, the format has yet to be widely adopted. Neither Chrome nor Firefox, two very popular web browsers, support the format natively, for example. Extensions are available to support JPEG XL files, but they’re not installed by default.

The JPEG XL community website cites the format’s ability to reduce file size while delivering “unmatched quality-per-byte.” Compared to a standard JPEG, a JPEG XL file is up to 55% smaller while providing a cleaner image that is visually lossless. Gone are typical JPEG artifacts.

[…]

As Apple explains on the new iPhone models, JPEG XL files are supported on iOS 17 and later and macOS 14 and later. However, as mentioned, these .jxl files are wrapped in a DNG container, so you can’t just fire off .jxl files from the iPhone 16 Pro.

Juli Clover (Reddit):

Compared to the HEIC format that Apple introduced several years ago, JPEG-XL supports both lossy and lossless compression. HEIC is a lossy format, and while it retains better quality than JPG images, pros will likely prefer JPEG-XL for zero image degradation. HEIC has never gained wide support, which has hindered its usefulness.

It sounds like Apple has only enabled Camera support for JPEG XL with the iPhone 16 family, not with iOS 18 generally. Is this because it depends on hardware acceleration that’s only available with the A18? However, iOS 17 and macOS 14 can read the files.

Also, although JPEG XL seems to be superior to HEIC, Apple is not offering it as a general choice alongside JPEG and HEIC. It’s only available when using ProRAW. This is all rather confusing.

Ryan Jones:

Photo settings have gone too far. WTF is happening.

Collin Donnell:

So JPEG XL seems flat out better than HEIC for images? I’m going to start saving all my film scans as 16 bit JPEG XL.

praseodym:

JPEG XL also supports re-encoding existing JPEG files to decrease file size while keeping the original file quality. That really seems like a useful feature, but so far I haven’t seen any tooling (in macOS) to re-encode my existing photo library.
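
That gap is fillable today with the libjxl reference tools, even though nothing ships in Photos. Below is a minimal sketch, assuming the cjxl command-line encoder is installed (e.g. via Homebrew’s jpeg-xl package); when given a JPEG input, cjxl performs the reversible JPEG transcoding by default. The folder path and the Homebrew install location are assumptions for illustration, not anything Apple or libjxl prescribes.

```swift
import Foundation

// Sketch: losslessly transcode every .jpg in a folder to .jxl using libjxl's
// `cjxl` tool. Assumes cjxl is installed (e.g. `brew install jpeg-xl`); the
// path below is where Homebrew typically puts it on Apple silicon.
let cjxl = "/opt/homebrew/bin/cjxl"
let folder = URL(fileURLWithPath: "/Users/you/Pictures/ToTranscode") // hypothetical path

do {
    let jpegs = try FileManager.default
        .contentsOfDirectory(at: folder, includingPropertiesForKeys: nil)
        .filter { ["jpg", "jpeg"].contains($0.pathExtension.lowercased()) }

    for source in jpegs {
        let destination = source.deletingPathExtension().appendingPathExtension("jxl")

        let process = Process()
        process.executableURL = URL(fileURLWithPath: cjxl)
        // For JPEG input, cjxl defaults to reversible transcoding, so the
        // original .jpg bytes can be reconstructed from the .jxl later.
        process.arguments = [source.path, destination.path]

        try process.run()
        process.waitUntilExit()
        print("\(source.lastPathComponent): exit \(process.terminationStatus)")
    }
} catch {
    print("Transcoding failed: \(error)")
}
```

For non-JPEG originals, such as Donnell’s 16-bit film scans, adding "-d 0" to the arguments requests mathematically lossless encoding rather than JPEG transcoding.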

Update (2024-10-23): Florian Pircher:

A lot of confusion, as you mentioned, since the new iPhones don’t use JXL for storing regular images, just for raw images. It used to be that regular images are either JPEG or HEIF and raw images are DNG with the pixel data stored as lossless JPEG. Now, regular images are the same as before, but for raw images you can choose how the pixel data inside the DNG should be stored: lossless JPEG (as before), lossless JXL, or lossy JXL.

Most people who have heard of JPEG XL have only seen it used for regular (non-raw) images. And few people know about the lossless JPEG format that was used before for DNG pixel data.
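
If you want to see how a particular DNG stores its pixel data, ImageIO will at least surface the container metadata. Here is a minimal sketch, assuming a ProRAW .dng of your own; whether the TIFF Compression field is actually exposed for a given file isn’t guaranteed, and the compression codes should be checked against the DNG specification rather than taken from the comments here.

```swift
import Foundation
import ImageIO

// Sketch: dump a DNG's container metadata to see how its pixel data is stored.
// The path is hypothetical; point it at one of your own ProRAW captures.
let url = URL(fileURLWithPath: "/Users/you/Pictures/IMG_0001.DNG") as CFURL

guard let source = CGImageSourceCreateWithURL(url, nil),
      let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] else {
    fatalError("Could not read image properties")
}

// The TIFF dictionary, when present, carries the Compression tag. DNG 1.7 is
// understood to assign a dedicated code to JPEG XL (verify against the spec);
// older ProRAW DNGs used lossless JPEG for the raw pixel data.
if let tiff = properties[kCGImagePropertyTIFFDictionary] as? [CFString: Any] {
    print("TIFF metadata:", tiff)
    if let compression = tiff[kCGImagePropertyTIFFCompression] {
        print("Compression code:", compression)
    }
} else {
    print("No TIFF dictionary exposed; full properties:", properties)
}
```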

Update (2024-10-28): Jon Sneyers (2020, via Hacker News):

This section highlights the important features that distinguish JPEG XL from other state-of-the-art image codecs like HEIC and AVIF.

morpheuskafka:

I found this interesting note in the article:

HEIC and AVIF can handle larger [than 35MP, 8MP respectively] images but not directly in a single code stream. You must decompose the image into a grid of independently encoded tiles, which could cause discontinuities at the grid boundaries. [demo image follows].

[…]

The newest Fujifilm X cameras have HEIC support but also added 40MP sensors--does this mean they are having to split their HEIC outputs into two encoding grids?

It seems like the iPhone avoided this, as 48MP output is only available as a “ProRAW” i.e. RAW+JPEG, which previously used regular JPEG and now JPEG-XL, but never HEIC.
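
The megapixel limits quoted above make the back-of-the-envelope math easy. The sketch below uses approximate sensor dimensions and treats the 35 MP (HEIC) and 8 MP (AVIF) figures from the article as hard per-stream caps, giving a lower bound on how many independently encoded tiles each output would need; real encoders also cap per-dimension sizes, so the actual grids could be larger.

```swift
import Foundation

// Back-of-the-envelope sketch: given the per-code-stream limits cited above
// (roughly 35 MP for HEIC and 8 MP for AVIF), how many independently encoded
// tiles would a large sensor's output need, at minimum?
// Sensor dimensions are approximate and used only for illustration.
func minimumTiles(width: Int, height: Int, maxPixelsPerStream: Int) -> Int {
    let totalPixels = width * height
    // Lower bound only: real encoders also limit per-dimension sizes, so the
    // actual tile grid may need more entries than this.
    return Int((Double(totalPixels) / Double(maxPixelsPerStream)).rounded(.up))
}

let sensors = [
    ("Fujifilm X-T5-class 40 MP", 7728, 5152),
    ("GFX100 II-class 102 MP", 11648, 8736),
    ("iPhone 48 MP", 8064, 6048),
]

for (name, width, height) in sensors {
    let megapixels = Double(width * height) / 1_000_000
    let heicTiles = minimumTiles(width: width, height: height, maxPixelsPerStream: 35_000_000)
    let avifTiles = minimumTiles(width: width, height: height, maxPixelsPerStream: 8_000_000)
    print("\(name): \(String(format: "%.1f", megapixels)) MP -> at least \(heicTiles) HEIC tile(s), \(avifTiles) AVIF tile(s)")
}
```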

13 Comments


As far as I can tell, ordinary JPEG is still better than HEIC on iPhone (from the last time I compared), but only because the HEIC files are compressed a bit too much – at the same size, HEIC would have been better. So I’m still taking JPEG photos on iPhone, since I don’t care about the size, but I would buy a new iPhone if I could set it to use JPEG-XL as the standard format.


It seems that JPEG XL encodes and decodes on the CPU just as fast as standard JPEG (if not faster), so I don’t know why Apple’s limiting it to the current Pro phones.

https://cloudinary.com/blog/how_jpeg_xl_compares_to_other_image_codecs


@remmah Well, I think Apple does regular JPEG on the GPU.


Interesting… now that makes me wonder what the actual energy savings are of doing JPEG on the GPU.


@mjtsai my info may be out of date, but I don't think they do JPEG on the GPU; they have dedicated hardware just for it. In the past, though, it was only used under limited circumstances; it was often faster to just use the CPU.


@nolen Maybe the A18 has dedicated hardware for JPEG XL?


@mjtsai Plausible, no idea. Or on the GPU for all I know. But regular JPEG has a bitstream that has to be decoded serially, so is usually not a good candidate for GPU.


Old Unix Geek

My guess would be that they updated the hardware to now support variable sized DCT. JPEG only uses 8x8 blocks. JPEG XL uses 2x2 up to 256x256 including non-square shapes. That part of the algorithm is parallel and probably would save energy. Color conversion was probably already supported in hardware for JPEG. Decoding the bitstream is an irregular serial task and seems better left to software.
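
For readers who haven’t run into it, here is what the variable-sized DCT point means in code. The sketch below is a deliberately naive 2D DCT-II over an N×N block, just to show that the block size is a parameter (8 for JPEG; anywhere from 2 up to 256, including non-square shapes, in JPEG XL’s VarDCT mode) and that each block is independent and therefore parallelizable. It is not how libjxl or any hardware actually implements the transform.

```swift
import Foundation

// Naive 2D DCT-II over an N x N block. JPEG fixes N = 8; JPEG XL's VarDCT
// mode picks from a range of block sizes (and non-square shapes, which this
// toy does not handle). Each block is independent, so blocks can be
// transformed in parallel.
func dct2D(_ block: [[Double]]) -> [[Double]] {
    let n = block.count
    func alpha(_ k: Int) -> Double { k == 0 ? sqrt(1.0 / Double(n)) : sqrt(2.0 / Double(n)) }

    var out = Array(repeating: Array(repeating: 0.0, count: n), count: n)
    for u in 0..<n {
        for v in 0..<n {
            var sum = 0.0
            for i in 0..<n {
                for j in 0..<n {
                    sum += block[i][j]
                        * cos(Double(2 * i + 1) * Double(u) * .pi / Double(2 * n))
                        * cos(Double(2 * j + 1) * Double(v) * .pi / Double(2 * n))
                }
            }
            out[u][v] = alpha(u) * alpha(v) * sum
        }
    }
    return out
}

// The same routine handles an 8x8 JPEG-style block or a larger JPEG XL block.
let block8 = (0..<8).map { i in (0..<8).map { j in Double((i + j) % 16) } }
print("DC coefficient:", dct2D(block8)[0][0])
```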


@Old Unix Geek decoding the bitstream was done on hardware too, but I don't know how that speed compared to using software. Color conversion and scaling were done in a separate dedicated hardware unit, if they needed to be done at all -- you could send the raw 420 or whatever it was directly to the screen in some cases.


Old Unix Geek

@nolen: If you're right about the bitstream decoding, I think it's probably still software, but running on a small CPU specialized for that sort of task.

Implementing irregular tasks (corresponding to branchy code) directly in hardware gates rarely makes sense, since it's error-prone, impossible to update, and seldom worthwhile (power-wise or speed-wise).

So I could imagine implementing stuff like entropy decoding in hardware, since it's also used in video decode, but not the actual parsing of the bitstream...

Think of good instruction sets: they are very regular. And think about what a pain parsing x86 is... so much of a pain that most modern x86s convert x86 instructions into a set of microinstructions that are a lot simpler to interpret. The only reason the translation is not done in software (something Transmeta tried) is that it's too slow... and even companies that are successful at implementing it in hardware, AMD and Intel, are talking about dropping older 16-bit instructions to reduce the pain involved in implementing ID (instruction decode).


@Old Unix Geek
I just meant that when the hardware JPEG decoder was used, none of the decoding process took place on the main CPU. I have no idea what happened on the dedicated hardware unit, but you would feed it a raw JPEG bytestream and get back a raw YUV bytestream, without anything running on the main CPU except sending commands to the driver.


That makes sense, thanks @nolen.


> The newest Fujifilm X cameras have HEIC support but also added 40MP sensors--does this mean they are having to split their HEIC outputs into two encoding grids?

I thought that was close, so maybe they didn't use the full sensor. But their newest camera actually has a 102 MP sensor: https://fujifilm-x.com/global/products/cameras/gfx100-ii/specifications/ lists a bunch of large, medium, and small image dimensions, and not even the medium ones fit in 35 MP. So either HEIF is only used for the "small" images, or yes, they split them up. In light of the boundary artefacts shown, that doesn't seem ideal.
