Submitted by leo_sk5 t3_ygtq5z in technology
happyscrappy t1_iub89vg wrote
WebP seems like the smarter play by far. You already have the hardware to decode it on so many platforms.
So JPEG-XL doesn't fly. It'll join JPEG-2000 on the pile. No big deal.
candreacchio t1_iub9dpt wrote
JPEG 2000 actually has quite a bit of adoption, just not in the spaces you might expect.
All digital cinemas use JPEG 2000 as their compression method (wrapped in an MXF container, in the XYZ colour space)
happyscrappy t1_iubaplq wrote
Your picture is stored in the chip on your passport using JPEG-2000 (US passports).
It's still not much though.
DirectControlAssumed t1_iud7fru wrote
However, as I said in another comment, if you store your images as JP2, there is a much better chance of being able to open them in 20 or 30 years than with AVIF/JPEG XL/HEIF and other newborn megacorp-backed standards, because JPEG 2000 is 20 years old, alive, and even still being developed (e.g. ISO/IEC 15444 Part 15 and Part 16 were published in 2019)!
Have you ever heard of JPEG XR? It was Microsoft's attempt at a JPEG replacement that was standardized by ISO too, and I'm not sure you can even open such images on anything but Windows (I'm not even sure the latest Windows versions still support it). There is a reference codec on jpeg.org that hasn't been updated since 2012, and that's it.
emfiliane t1_iufg1po wrote
The good thing about dead formats is that they're frozen where they stood, instead of having dozens of proprietary and incompatible variations. (Cough, anything IFF-based.) Unless they already made it that far before dying, like PCX. It's the proprietary ones that really painfully disappear, like exclusive Adobe-only formats.
I doubt JPEG XL (or XR or maybe XT/XS, or any other also-rans) will end up in that situation, since they're incorporated into the main Swiss army knife libraries, so some utility or another will be around to deal with them as long as the C language survives. It may not be convenient, let alone integrated into your favorite tools, but nor are most dead formats.
DirectControlAssumed t1_iufhbtu wrote
>some utility or another will be around to deal with them as long as the C language survives
...or some nasty security vulnerability is found, and the related code is easier to throw away to reduce attack surface than to maintain, because nobody wants to (or knows how to) deal with it.
This was less of a problem with historical dead formats that are basically as dumb as P(B,G,P)M (or BMP), but the new ones are very complex because of their advanced compression algorithms, metadata and so on, and require a lot of code to work.
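To illustrate just how dumb those old formats are: the whole "codec" for an ASCII PBM fits in a handful of lines of pure Python (this sketch skips PBM's `#` comment lines for brevity):

```python
# Minimal sketch of writing and reading an ASCII PBM (P1) image.
# No library needed: the format is a two-line header plus 0/1 pixels.

def write_pbm(path, pixels):
    """pixels: list of rows, each a list of 0/1 values."""
    h, w = len(pixels), len(pixels[0])
    with open(path, "w") as f:
        f.write(f"P1\n{w} {h}\n")
        for row in pixels:
            f.write(" ".join(str(p) for p in row) + "\n")

def read_pbm(path):
    with open(path) as f:
        tokens = f.read().split()
    assert tokens[0] == "P1"          # magic number for ASCII PBM
    w, h = int(tokens[1]), int(tokens[2])
    bits = [int(t) for t in tokens[3:3 + w * h]]
    return [bits[r * w:(r + 1) * w] for r in range(h)]

image = [[1, 0], [0, 1]]
write_pbm("checker.pbm", image)
assert read_pbm("checker.pbm") == image
```

Compare that with the thousands of lines a wavelet or AV1-based codec needs, and the maintenance asymmetry is obvious.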
emfiliane t1_iufo7gc wrote
That's not how kitchen sink libraries work, though; they support a lot of obscure and dead formats with known security problems in the implementation, but none of them are enabled by default. If you want to make a tool that's an everything-to-anything, you turn on all the compile options, and if you make it public, hopefully point out that here there be dragons.
Some binary-only remains might very well require virtualization in the future, the way accessing and converting old Pagemaker files does, but that's something retro enthusiasts seem to relish.
DirectControlAssumed t1_iufqu7f wrote
Well, compiling stuff with the appropriate flags isn't easy for people who aren't programmers. Most of them would probably just give up.
Even if that weren't a problem, I still don't like the idea of putting my precious images in a shaky state of dependence on a format that is subject to the whims of a single company.
At the end of the day, Google never really wanted an ultimate rule-them-all image format; they just needed something that requires less bandwidth on their networks than JPEG. AVIF and WebP seem to be good enough for that role now. But this is basically an endless battle - tomorrow they'll start thinking about even more compact formats and declare JPEG XL/AVIF/WebP obsolete, effectively abandoning them. If nobody else takes up the burden of maintaining them, files in these formats will become a large PITA for their owners.
JPEG 2000 is already here, and one of its primary uses is exactly digital preservation - e.g. it is one of the Library of Congress's preferred formats (along with TIFF, JPG and PNG).
EDIT: https://www.phoronix.com/news/Chrome-Dropping-JPEG-XL-Reasons
RIP JPEG XL, even your own creator hasn't really liked you.
emfiliane t1_iufs7yq wrote
Sure, I use J2K (and DjVu) every day. They're one possible format to transfer archived files and scans to, although in a business setting PDF/A is preferred over any raw image format in this group.
I'm just saying that major public formats don't just disappear, even if they become inconvenient to use; most major obsolete undocumented formats are still usable in some inconvenient way or another.
DirectControlAssumed t1_iuftz7f wrote
>I'm just saying that major public formats don't just disappear, even if they become inconvenient to use; most major obsolete undocumented formats are still usable in some inconvenient way or another.
I agree with that.
> They're one possible format to transfer archived files and scans to, although being business, PDF/A is preferred over any raw image format in this group.
AFAIK, PDF/A-2 and later allow J2K images, so it works there too (if you want it), and, as you obviously know, PDF/A exists precisely for digital preservation.
Also, AFAIK, Adobe hasn't allowed any other "JPEG successors" in their PDF standard, either.
ApertureNext t1_iuhy78x wrote
JPEG XL allows lossless conversion between JPEG and JPEG XL, it quite literally can't become more compatible.
DirectControlAssumed t1_iui0sle wrote
>it quite literally can't become more compatible.
BTW, it can. There is JPEG XT, which is just JPEG plus additional data that adds new features. Existing JPEG software that doesn't know about JPEG XT can still read its plain old JPEG part.
DirectControlAssumed t1_iuhz5ou wrote
You can't open JPEG XL re-compressed JPEGs with code that only supports JPEG (which is basically omnipresent and will without any doubt be supported for the foreseeable future). If you want your JPEG back, you have to decompress it with djxl first.
So, you still have to rely on JPEG XL specific code which can start to "rot" (due to various reasons) with time if nobody maintains it.
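To make the dependency concrete, here's a rough sketch of the round trip via the reference libjxl CLI tools (cjxl/djxl are real tools, but this assumes they're on your PATH, and the file names are made up; the commands only run if the tools and input actually exist):

```python
# Sketch of the JPEG <-> JPEG XL round trip using the reference libjxl
# command-line tools. cjxl stores the original JPEG bitstream losslessly
# by default; djxl can then reconstruct the original .jpg from the .jxl.
import os
import shutil
import subprocess

def roundtrip_commands(jpeg_in, jxl_tmp, jpeg_out):
    # Two steps: recompress to .jxl, then reconstruct the JPEG.
    return [
        ["cjxl", jpeg_in, jxl_tmp],
        ["djxl", jxl_tmp, jpeg_out],
    ]

cmds = roundtrip_commands("photo.jpg", "photo.jxl", "photo.roundtrip.jpg")

if shutil.which("cjxl") and shutil.which("djxl") and os.path.exists("photo.jpg"):
    for cmd in cmds:
        subprocess.run(cmd, check=True)
    # photo.roundtrip.jpg should now decode identically to photo.jpg.
else:
    print("libjxl tools or input missing; would run:", cmds)
```

The point being: every step here depends on JPEG XL-specific code staying installable.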
ApertureNext t1_iui0xvy wrote
Exactly, so you aren't losing quality if JPEG XL ends up flopping and you need to convert back to a more compatible format.
Browsers not supporting the format and now dropping support aren't helping.
DirectControlAssumed t1_iuji4g1 wrote
You're not wrong; I was talking about the "some long-forgotten DVD in the attic" scenario, where you suddenly find that you used some unusual image format to store your data for archival purposes (because, e.g., you wanted to fit more images on that DVD) and now you don't know how to get it back, because the only software that supports it is some Linux CLI tool that requires compilation with the right flags to work. Or something even more arcane, who knows.
See "digital dark age", though I am not talking about the intergenerational problem - seeing how fast technology changes today, and how much more complex it becomes every day, makes me think such problems can happen even within our lifetime.
Leiryn t1_iubr89t wrote
Fuck webp, I always have to rename it to jpeg because a bunch of apps refuse to work with it (notably Google voice)
gurenkagurenda t1_iubsvnb wrote
That has nothing to do with webp specifically. It’s just a typical transition pain for a new format.
FineAunts t1_iuc2eaq wrote
If you make apps at scale there is a clear advantage of webp. Many of the @2x files we serve get an instant 50% weight reduction with zero noticeable quality loss (as compared to the tweaked jpeg). Most are in the 20-30% range but that's still pretty great. It's like going from ttf to woff2 or gzip to brotli.
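For anyone curious what that comparison looks like in practice, here's a rough Pillow sketch (assuming a Pillow build with WebP support, which is typical; the quality values are illustrative, and the two formats' quality scales aren't directly comparable):

```python
# Rough sketch of comparing JPEG vs WebP output size for the same image,
# using Pillow. A solid-color stand-in is used here instead of a real
# @2x asset, so the absolute numbers mean little; real photos are where
# the 20-50% savings show up.
from io import BytesIO
from PIL import Image

img = Image.new("RGB", (320, 240), "navy")   # stand-in for a real asset

jpeg_buf, webp_buf = BytesIO(), BytesIO()
img.save(jpeg_buf, format="JPEG", quality=85)
img.save(webp_buf, format="WEBP", quality=80)

print(f"JPEG: {jpeg_buf.tell()} bytes, WebP: {webp_buf.tell()} bytes")
```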
I wish it was the standard over jpeg at this point. At least on the consumer/user side of things.
siscorskiy t1_iuc5sa6 wrote
Is this a Shopify thing? I notice a lot of the images I scrape from Shopify pages are resized with names like that appended to their filename
DirectControlAssumed t1_iud43ru wrote
>It'll join JPEG-2000 on the pile.
IMHO, JPEG 2000 seems the safest bet among all these "JPEG replacements", in the sense that if you store photos with it for a long time, you won't find yourself in a situation where the format has been abandoned and there is literally no software left to open or transcode them.
JPEG 2000 is already 20 years old (i.e. the U.S. patents on the core coding have expired, in addition to the fact that it was supposed to be royalty-free from the beginning); there are many codec implementations out there, image viewers that support the format, and industries that use it very extensively: digital cinema, medical imaging, geographic information systems, digital archives, etc. JPEG/ISO keeps publishing new parts (new features) of the format's standard, which means there are people who actually care about it.
It is mostly absent from consumer devices, yes, but who knows - maybe it will become more prominent now that the patents are dead. It is also vendor-neutral, unlike AVIF/JPEG XL/HEIF - there is no megacorporation behind it constantly trying to overtake its competitors in format wars. Google (the main sponsor of JPEG XL) is especially notorious for ruthlessly killing stuff people use.
I have recently played with JPEG 2000 and found that it has some very nice features. E.g., unlike plain JPEG, you can make your image fit specific size constraints - want to get a 1 MB image out of a 5 MB JPG? Easy! I also found that its lossless compression consistently beats PNG.
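If anyone wants to try the size-constraint trick, here's a rough Pillow sketch (this assumes your Pillow was built against OpenJPEG, hence the feature check; the requested "rate" is just uncompressed size divided by target size, and the encoder treats it as an approximate upper bound):

```python
# Sketch of JPEG 2000 rate control via Pillow's JPEG2000 plugin.
# A solid-color stand-in is used instead of a real photo, so the actual
# output will land far below the target; real photos fill the budget.
from io import BytesIO
from PIL import Image, features

img = Image.new("RGB", (1024, 768), "teal")  # stand-in for a real photo

raw_bytes = img.width * img.height * 3       # uncompressed RGB size
target_bytes = 64 * 1024                     # e.g. "fit it in 64 KiB"
rate = raw_bytes / target_bytes              # compression ratio to request

if features.check("jpg_2000"):
    buf = BytesIO()
    # irreversible=True selects the "real" (9/7 wavelet) lossy mode
    # discussed below, which is the one that actually competes.
    img.save(buf, format="JPEG2000", irreversible=True,
             quality_mode="rates", quality_layers=[rate])
    print(f"requested <= {target_bytes} B, got {buf.tell()} B")
else:
    print("Pillow lacks OpenJPEG support; would request rate", rate)
```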
My very limited test also made me question JPEG XL's superiority. While lossless JPEG XL consistently beat lossless JPEG 2000, I am not so sure about lossy compression. There are two modes of JPEG 2000 lossy compression: integer/real, a.k.a. reversible/irreversible. While the default integer (reversible) lossy JPEG 2000 compression was usually worse than lossy JPEG XL at the same size, the real (irreversible) mode was much better according to PSNR/NCC/AE metrics and directly competed with AVIF (though AVIF was usually better). JPEG 2000's artifacts also looked less ugly than JPEG XL's (though that may be due to the immaturity of the JPEG XL codec).
I don't know which JPEG 2000 lossy mode people use to make comparisons, but I wouldn't be surprised if one of the reasons they find JPEG 2000's lossy performance lacking is that they use the default compression settings.
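For reference, the PSNR metric I used above is nothing exotic - just log-scaled mean squared error. A minimal NumPy version:

```python
# Minimal sketch of the PSNR (peak signal-to-noise ratio) metric.
import numpy as np

def psnr(original, compressed, peak=255.0):
    """PSNR in dB; higher means the compressed image is closer to the original."""
    mse = np.mean((original.astype(np.float64) - compressed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical images
    return 10 * np.log10(peak ** 2 / mse)

a = np.zeros((8, 8), dtype=np.uint8)
b = a.copy()
b[0, 0] = 16                         # one wrong pixel: MSE = 16**2 / 64 = 4
print(psnr(a, b))                    # 10 * log10(255**2 / 4) ≈ 42.1 dB
```

NCC and AE work similarly as whole-image error summaries, which is also why none of them fully capture how "ugly" the artifacts look.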
There is also a very interesting thing called "arithmetic coded JPEG". It is a plain JPEG that has its lossless stage (Huffman coding) replaced with more efficient arithmetic coding. It shows significant disk usage improvements over plain JPEG, especially for large images. You can convert JPEGs between Huffman and arithmetic coding without any quality loss - the images are mathematically equal. This is similar to JPEG XL's lossless JPEG re-compression, but unlike JPEG XL it is not a brand new format - it has been part of the JPEG specification for a very long time! The reason it wasn't used widely was patents, which have long since expired. Reference codecs like libjpeg have supported it for a long time, but app developers often omitted the feature because of the patents, so many image viewers still do not support it directly. The lossless nature of the recompression (and the fact that it is part of the plain old JPEG specification) means you can use it to truly losslessly compress JPEGs for long-term storage.
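The conversion itself is a one-liner with jpegtran. A sketch (jpegtran and its -arithmetic/-optimize switches are real, but this assumes a libjpeg/libjpeg-turbo build with arithmetic coding enabled; the file names are made up, and the commands only run if the tool and input exist):

```python
# Sketch of losslessly toggling a JPEG between Huffman and arithmetic
# coding with jpegtran. Both directions are lossless transcodes: the
# decoded pixels of all three files are identical.
import os
import shutil
import subprocess

to_arith = ["jpegtran", "-arithmetic", "-outfile", "photo.ari.jpg", "photo.jpg"]
to_huff  = ["jpegtran", "-optimize",   "-outfile", "photo.huf.jpg", "photo.ari.jpg"]

if shutil.which("jpegtran") and os.path.exists("photo.jpg"):
    subprocess.run(to_arith, check=True)   # Huffman -> arithmetic
    subprocess.run(to_huff, check=True)    # arithmetic -> optimized Huffman
else:
    print("would run:", to_arith, "then:", to_huff)
```

Note the way back uses -optimize rather than plain Huffman, so you get the best of both when you need compatibility again.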
I have done a limited comparison of arithmetic coding and JPEG XL's JPEG recompression and haven't found an absolute winner - sometimes arithmetic JPEG was better, sometimes JPEG XL was.