To increase adoption they should not have limited this to the latest iPhone models. Why on earth can a one-year-old iPhone 15’s CPU not handle encoding JXL? It can encode 4K video in real time, so this should be no problem at all, right?
This is how they sell new phones. The vast majority of new features don't require the latest hardware, but artificially restricting them increases sales of new phones. Non-technical people usually can't tell the difference between hardware and software clearly enough to understand this nasty trick.
To be fair, it’s less bad than what some other vendors do, which is to stop providing updates. Not saying Apple is an angel; they intentionally generate e-waste by withholding such features…
I spend a lot of time using my phone, and can afford the latest & greatest - but I use a 6-year-old Apple phone, because it's _perfectly fine_. A 10-year-old Apple phone is "just barely usable".
When I was in the Android ecosystem, a 4-year-old phone was "just barely" usable, and phones stopped being "fine" after about two years.
I'll give Apple shit for a lot of things, and they could definitely do _more_ about e-waste... but this is just as likely to be "we put the image encoder on the camera module, so it would have been slightly more work to backport to old phones".
> As Apple explains, on the new iPhone models JPEG XL files are supported on iOS 17 and later and macOS 14 and later.
JPEG XL isn't limited to the latest phones; just a phone that can run iOS 17 or later. I have used JPEG XL on my iPhone 13 mini with no issues. iOS 17 runs on the iPhone XS (2018) or newer.
The difference is that JPEG XL is now part of Apple's image pipeline for the camera in the iPhone 16.
Any 3rd party photo app developer can support JPEG XL if they wish.
One of the tricks for achieving the target battery life is that photo and video formats are offloaded to dedicated and very power-efficient hardware in all mobile devices. The iPhone 16 is the first to get hardware offload for AV1 and JXL, which is why it'll support these formats.
It's not just software, unlike in the PC world where going from 5W hardware decode to 50W software decode basically doesn't matter.
They aren't doing JXL encoding in hardware, though. Zero chance that the new iPhone chip has hardware acceleration for JXL; it's definitely just plain old CPU encoding.
And we're not talking about video anyway, this is about ProRAW, a still image format.
I don't know, but my guess is that the DNG/ProRAW/JXL support comes with compatibility challenges. Limiting the launch to well-informed photography prosumers and professionals will help iron out those challenges, rather than making all confused consumers face them at once.
I don't think hardware support plays a role here. The fastest encoding modes of JPEG XL are ridiculously fast in software, and Apple's CPUs seem powerful enough.
In lossy mode I think there is no difference between AVIF, HEIC or JXL. AVIF is even a little bit ahead.
For lossless mode, JXL's fast modes (-e1 and -e2) are fast, but their compression ratio is terrible. The higher effort levels are not usable in a camera in terms of speed.
Of course, my favorite (and many people's favorite) in this regard is HALIC (High Availability Lossless Image Compression). It is a speed/compression monster. The problem is that, for now, it is closed source and there is no Google or similar company behind it.
The point of reference here is not PNG but lossless JPEG, which was the best available option in DNG before version 1.7 of the DNG spec. Lossless JPEG compresses worse (but faster) than PNG.
I don't know how HALIC works but if there is no FOSS implementation available that seems like a no-go.
> In lossy mode I think there is no difference between AVIF, HEIC or JXL. AVIF is even a little bit ahead.
AVIF is definitely not ahead for the high quality levels you'd use in photography. AVIF is ahead at lower quality levels.
> For lossless mode, JXL's fast modes (-e1 and -e2) are fast. But their compression ratio is terrible.
JXL lossless e1 is still a lot better than the lossless compression people tend to use for photos these days. Like Apple has been using Lossless JPEG, which sucks.
Agreed. At camera quality, JPEG XL is far ahead. Apple's latest announcement is about camera-quality, near-lossless lossy compression, and in that domain JPEG XL has no real competition. AVIF and HEIC don't even support more than 12 bits of dynamic range, and they become slow as quality is increased.
It’s good in a corporate setting: you don’t want to suddenly have to deal with a new file format that you won’t have an app installed on your company PCs to view.
1. As another commenter points out, your device works exactly as it did before.
2. Nobody on the face of the earth is making a decision about whether or not to buy a new phone based on JPEG-XL support. The fact that you’d even entertain that either means you’re in too much of a bubble or you’re so blinded by Apple hatred that you’re willing to believe any contrived thing that paints the company in a negative light.
What the ... You are ignoring what I wrote while not contributing anything. What did you fail to comprehend? You should be addressing what I wrote. Apparently many other commenters did understand me. What does that tell us? You are just being a jerk.
> Stop it.
Seriously. You are not discussing things, and you are not sharing an opinion here. You should be the one taking a break and considering your behavior.
How does it feel defending one of the richest companies in the world, and realizing they wouldn't care if you dropped dead?
I will give you another, unrelated example to demonstrate fake obsolescence on an iPhone. On the iPhone 13 Pro Max, you cannot set the battery to stop charging at 80%. You can do this with the new iPhone (I don't remember whether the 14 or 15 supports it, but the 13 Pro Max doesn't). So can you really say the iPhone 13 Pro Max is supported for so many years with the latest and greatest iOS when Apple doesn't actually bring these new features back to the older iPhones?
FYI, you can get most of the same effect on your 13 Pro Max by plugging your charger into a smart plug, and using Shortcuts to make an automation that turns off the smart plug whenever the phone battery goes above 80%.
I use that on my 10th generation iPad and used it on the iPhone X that I had up until about two months ago when it got replaced with a 15 which does support the "charge to 80%" option. It works great.
The only minor annoyance I ran into is that on phones where the OS supports the 80% limit, it will occasionally charge to 100% anyway, which Apple says is necessary to keep the battery level indicator calibrated. With the smart plug method you'll have to handle that yourself by occasionally disabling the shortcut (or charging with the charger plugged straight into the outlet).
Just make sure to pick a smart plug that can be turned off from Shortcuts.
I think you are thinking of the "optimized charging" option, which tries to guess how long you will leave the phone on the charger, pauses charging when it reaches 80%, and then tops up to 100% shortly before the time it thinks you will take the phone off the charger.
Advancements in compression algorithms have also come with advancements in decompression speed. Newer techniques like tANS both compress well and have very fast implementations.
And generally smaller files decompress faster, because there's just less data to process.
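A toy illustration of this with a generic compressor (Python's zlib here, purely as a stand-in for image codecs, with made-up sample data): the harder-compressed output is smaller, and both settings decompress back to the identical original bytes, so the better-compressed file leaves less data for the decoder to chew through.

```python
import zlib

# Hypothetical sample data: highly repetitive text, as an illustration.
data = b"the quick brown fox jumps over the lazy dog " * 10_000

fast = zlib.compress(data, level=1)   # fast, weaker compression
best = zlib.compress(data, level=9)   # slower, stronger compression

# The better-compressed output is smaller...
assert len(best) <= len(fast) < len(data)

# ...and both decompress back to the identical original bytes.
assert zlib.decompress(fast) == data
assert zlib.decompress(best) == data

print(len(data), len(fast), len(best))
```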
But how does the ecological benefit of space savings compare with the extra power consumption from compressing and decompressing?
And will people take more pictures because of the space savings leading to more power consumption from compressing and decompressing the photos?
Is this just greenwashing by Apple?
But I have now decided to take my photos off of Apple's servers, and to take far fewer photographs, if any. The climate of my near future is way more important than a photograph of my cat.
You have an invalid assumption that extra power is spent on better compression or decompression. It generally takes less energy to decompress a better-compressed file, because the decompression work is mostly proportional to the file size. Energy for compression varies greatly depending on codec and settings, but JPEG XL is among the fastest (cheapest) ones to compress.
Secondly, you have an invalid assumption that the amounts of energy spent on compression have any real-world significance. Phones can take thousands of photos when working off a tiny battery, and most of it is spent on the screen. My rough calculation is that taking over a million photos takes less energy than there is in a gallon of gas.
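That rough calculation can be sketched with assumed (not measured) figures: a gallon of gasoline holds roughly 33.7 kWh, a phone battery around 13 Wh, and one charge is good for a couple thousand photos.

```python
# Back-of-the-envelope check; all figures below are rough assumptions.
GALLON_OF_GAS_J = 33.7 * 3.6e6       # ~33.7 kWh per gallon, in joules (~121 MJ)

battery_wh = 13.0                    # assumed phone battery capacity, ~13 Wh
photos_per_charge = 2_000            # assumed: "thousands of photos" per charge
joules_per_photo = battery_wh * 3600 / photos_per_charge   # ~23 J per photo

million_photos_j = 1_000_000 * joules_per_photo            # ~23 MJ

print(million_photos_j / 1e6, "MJ for a million photos, vs",
      GALLON_OF_GAS_J / 1e6, "MJ in a gallon of gas")
assert million_photos_j < GALLON_OF_GAS_J
```

Even if every assumed number here is off by a factor of a few, a million photos still comes in well under one gallon of gas.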
Apart from that, compression cost is generally ignored entirely, because files are created only once but viewed (decompressed) many, many times over.
Smaller files save storage space. Storage has costs in energy and hardware.
Smaller files are quicker to transfer, and transfer itself can be more energy intensive than compression. It's still small in absolute numbers.
Is there a cooldown between bursts, or can you do 10 a second for as many consecutive seconds as you like? You only have to get the images encoded and stored before the next burst starts, or at least before the user runs out of memory.
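The constraint can be modeled as a bounded buffer (a toy Python sketch with made-up numbers, not Apple's actual pipeline): bursts fill the buffer, the encoder drains it, and capture has to pause once the buffer is full.

```python
import queue

# Toy model: a bounded buffer of raw frames. The encoder must drain it
# faster than bursts refill it, or capture has to stall.
buffer = queue.Queue(maxsize=30)     # assumed: room for ~3 s of 10 fps bursts

def capture_burst(n_frames):
    """Try to enqueue a burst; return how many frames actually fit."""
    stored = 0
    for i in range(n_frames):
        try:
            buffer.put_nowait(f"raw-frame-{i}")
            stored += 1
        except queue.Full:           # buffer full: capture must pause
            break
    return stored

def encode_some(n_frames):
    """Drain up to n_frames from the buffer (stand-in for the encoder)."""
    done = 0
    while done < n_frames and not buffer.empty():
        buffer.get_nowait()
        done += 1
    return done

print(capture_burst(10))   # first burst fits entirely
print(encode_some(5))      # encoder only keeps up with half of it
print(capture_burst(30))   # next burst hits the buffer limit early
```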
There is a pretty usable migration tool built into Angular CLI at least for switching over to the new template syntax. Requires some manual cleanup here and there but overall it’s not painful at all.
That is true for AirPlay, but AirPlay 2 works differently: It controls playback on a device (say, a HomePod) and all devices on the network see what is playing on that device and can control playback.
Wow, this looks so cool. Do you have a BOM or some schematics or even source code lying around? I would be interested to try something like this myself.
There are ad blockers on iOS, too. They are implemented in such a way that they cannot spy on you: they simply provide a "block list" that is executed by the browser. Works pretty well in my experience.
Ultimately, you have to trust someone. Unless you write your own compilers in machine code, and get the source, review it, etc. you have to trust someone along the chain. Hell, Intel/AMD could be fucking with you via the CPU.
If I have to trust someone, I trust the multi-trillion dollar company built on privacy. If they are found to be spying on people, the hit to the wealth of everyone who works there will be massive.
Made me laugh. For nine months I gave a full-time Android phone a chance, and every two weeks or so it showed me ads based on things I had only discussed with my coworker out loud (I mean mouth and ears, not voice messaging) and never googled. If Apple does that, they at least pretend not to.
I think using an adblocker and uninstalling the YouTube app should be sufficient. The downside being that the playback quality is worse (720p max, I think).
You could put the payload into the fragment part of the URL (after the '#' symbol). This part is not passed to the server and is not subject to the limitations of intermediate systems you have no control over. I assume the payload is only needed client-side anyway.
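For illustration (made-up URL and payload): Python's urllib splits a URL the same way a browser does, and the fragment never appears in the HTTP request line sent to the server, only the path and query do.

```python
from urllib.parse import urlsplit

# Hypothetical URL carrying the payload in the fragment.
url = "https://example.com/app?session=abc#eyJwYXlsb2FkIjoiZGF0YSJ9"

parts = urlsplit(url)
# Servers and intermediaries only ever see the path and query;
# the fragment stays on the client for JavaScript to read.
print(parts.path, parts.query)   # sent to the server
print(parts.fragment)            # client-side only
```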