ARM has a submarine patent on ASTC encoding. Looks like there will never be a decent compressed texture format that isn’t ruined by greed.
Do you have any specific information on the issue? The principal thing I’ve discovered about this is this blog post about licensing issues surrounding ARM’s ASTC stuff.
Rich also pointed this out:
Uses Arm patent pending method.
Why is it that nVidia hasn’t touched anything ASTC encoding related since 2015?
Why is there still no ASTC on desktop GL? It’s the neatest texture compression format proposed so far, yet everyone keeps it at barge-pole distance. Khronos claims ASTC is “royalty-free” (Rich points this out too), but apparently no one believes them?
No known issues.
Of course the spec should technically only cover decoding rather than encoding, but it would be disingenuous to claim there are ‘no known issues’, since the astcenc tool explicitly forces an EULA and discourages alternative implementations (via patent threat).
Nobody else has written an alternative (e.g. FOSS) ASTC encoder in 5 years. What’s going on?
Can Khronos please make a public statement regarding IP on ASTC?
There is ASTC on desktop GL. Intel implements it in some of their drivers. Given the ASTC usage in the mobile space, it seems therefore that the only holdouts are NVIDIA and AMD.
Right, so ‘only’ all of the meaningful (performant) desktop hardware…
I don’t think it’s because the vendors are lazy, something’s fishy about this.
And what about Vulkan (which I’m not that familiar with)? Wasn’t ASTC supposed to be the core compressed texture format? Whatever happened to that? A quick search doesn’t turn up anything conclusive.
Surprisingly, even the Raspberry Pi 2 B lists the ASTC_ldr extension on GLES2 with updated drivers, but uploading an ASTC-compressed image results in a GL error. It could be a bug in Mesa, though, as I doubt the BCM2836 even supports ASTC; and ETC1 extension support has mysteriously disappeared in the latest driver, which can’t be right.
The funny thing is, I can’t even confirm that the ASTC test images I’m using aren’t garbage, because I can’t find any functional software or hardware that supports displaying compressed ASTC images…
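For what it’s worth, one sanity check that needs no GL support at all is validating the .astc container header that astcenc writes. A minimal sketch, assuming the usual 16-byte layout (4-byte magic 0x5CA1AB13, three block-dimension bytes, then three 24-bit little-endian image sizes); the fabricated header at the bottom is just an illustration:

```python
import struct

ASTC_MAGIC = 0x5CA1AB13  # little-endian magic of astcenc's .astc container

def parse_astc_header(data: bytes):
    """Parse the 16-byte .astc file header; returns (block_dims, image_dims)."""
    if len(data) < 16:
        raise ValueError("file too short for an .astc header")
    magic, = struct.unpack_from("<I", data, 0)
    if magic != ASTC_MAGIC:
        raise ValueError("not an .astc file (bad magic)")
    bx, by, bz = data[4], data[5], data[6]
    def u24(off):  # image sizes are stored as 24-bit little-endian integers
        return data[off] | data[off + 1] << 8 | data[off + 2] << 16
    return (bx, by, bz), (u24(7), u24(10), u24(13))

# Fabricate a header for a 256x128 2D image with 6x6 blocks and check it.
hdr = (struct.pack("<I", ASTC_MAGIC) + bytes([6, 6, 1]) +
       (256).to_bytes(3, "little") + (128).to_bytes(3, "little") +
       (1).to_bytes(3, "little"))
print(parse_astc_header(hdr))  # ((6, 6, 1), (256, 128, 1))
```

At least that rules out a mangled file before blaming the driver.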
You didn’t say “no ASTC on meaningful (performing) desktop GL.” You said “desktop GL” with no qualifiers. Which was incorrect.
So. Your title claims that ASTC is dead, and your first post claims that there is a submarine patent involved. The evidence you have for this is:
- A dodgy license.
- A comment in the ASTC compressor.
- AMD/NVIDIA not having implemented ASTC in their desktop GPUs.
That evidence seems decidedly thin to support those claims. I would hope that we would reserve accusations of submarine patents for something more than it seeming “fishy”. I’m not saying that everything in ASTC land is OK or whatever. I’m saying that you need more evidence to be convincing that ARM is trying to pull a fast one here.
One thing that could be keeping AMD and NVIDIA from implementing ASTC is lack of D3D support. If 3D engine developers can’t rely on ASTC for their D3D engine core, then they’re likely not going to bother providing ASTC versions of their textures for the OpenGL or Vulkan ports. And if major 3D engines aren’t going to support it, then AMD/NVIDIA are probably not going to spend the silicon to make it work.
Now, that’s not to say that I’m certain this is the reason why. But it’s a plausible explanation.
I don’t recall anything to that effect being said about Vulkan. Obviously, I didn’t see every piece of media about Vulkan, so it’s entirely possible I’m wrong about that. But I was watching Vulkan’s development fairly closely from the outside, and I can’t recall hearing anything of the sort.
Indeed, once there was the statement that any ES 3.1 or desktop GL 4.x hardware would be able to run Vulkan, it seems unlikely that ASTC would ever have been a requirement. Khronos wasn’t going to limit Vulkan to only the newest hardware (assuming said hardware had ASTC support to begin with); they want people to use Vulkan with what they already have just as much as what they might get in the future.
Sorry, I sounded a bit harsh. You are correct of course, and I’m rather surprised some Intel hardware supports it. I should have checked the various GL hardware databases first.
So. Your title claims that ASTC is dead, and your first post claims that there is a submarine patent involved. The evidence you have for this is […]
I do totally admit it was just a wild guess. It was more of an impatient* attempt to elicit some sort of clarity out of someone, anyone (preferably someone ‘inside’ the industry). I really just don’t understand the apparent lack of interest in this really neat compression scheme, which offers just about every format you’d ever want to use. ASTC is really clever and offers so much better quality than the current standard, which is still mostly hacks built on crappy old DXT.
Also, against my own theory: it doesn’t make sense for Arm NOT to license their supposed encoding patent, because that’s what Arm does: design & license. And licensed hardware only decodes, whereas the patent (may) apply only to encoding.
That doesn’t excuse the bizarre EULA, though, on their encoding tool AND documentation AND the ASTC spec. It doesn’t look like the typical boilerplate they forgot to remove when they put it on GitHub; someone specifically wrote it. If it puts Rich off going near it, what other damage has it done?
For example, what was the reason nVidia abandoned their hardware accelerated ASTC encoder?
And I still stand by my assertion that the statement “no known IP issues” in the ASTC extension spec is misleading because of that. What’s the point of having hardware decode if you can’t encode in the first place?
Does anyone know of other cases where the ‘IP’ mentioned in a spec applied specifically to software (e.g. the driver or GL client side) but not hardware (i.e. the GL server side)? For example, the S3TC extension only applied to hardware vendors, but have there ever been software-side IP issues in the history of GL(ES)?
I still stand by my topic title (the ‘dead’ part):
a) we don’t have the freedom to encode or to develop encoders/tools/SDKs (which makes ASTC moot on ALL platforms, not just desktop)
b) desktop ASTC is not widely supported
One thing that could be keeping AMD and NVIDIA from implementing ASTC is lack of D3D support.
Good point, but IMO that only makes it stranger? Another party (MS) that doesn’t touch ASTC?
But indeed it could be nothing more than a chicken-and-egg problem. If that’s the case, then I will call the vendors (including MS) lazy.
I don’t recall anything to that effect being said about Vulkan.
I must have seen it in some slides, but that was before the Vulkan spec was final. I remember feeling pleased, because it would mean ASTC being cast into silicon and thus eventually appearing across desktop GL.
Then I guess just nothing came of it all? I expected more progress.
*ASTC was announced in September 2013; that’s 7 years ago! Some mobile devices support it, but not the high-end graphics hardware where we really need it. It just makes me a sad developer.
I also came across this weirdness:
So apparently nVidia does have some hidden support for ASTC, but the driver decodes the texture in software and then uploads it uncompressed.
*Edit: I just tested this; unfortunately it doesn’t work on driver version 390.116 (Linux 64-bit), resulting in GL_INVALID_ENUM (as it should).
ASTC does date back to 2013, but you have to look at what it competes against.
In the mobile space, ASTC in 2013 was competing against PVRTC and ETC2/EAC. These formats were not great; they didn’t have good quality and many of them were really limited in what they could do. Furthermore, some of them were locked to certain hardware. As such, ASTC was a Godsend: a compression format that could work with anything, provided variable channels, handled HDR, and wasn’t terrible.
Compare this to the desktop space. S3TC isn’t the best format on the planet, but it was decent enough with a good compressor. BC 4&5 gave us multi-channel support. But the really big thing was BPTC/BC6H&7. These gave us HDR compression as well as improved LDR compression relative to S3TC.
Is ASTC better than BCs 4-7? Maybe. Is it significantly better than these? Unlikely.
The only thing ASTC really has over BC6H&7 is variable block size. But that really boils down to being able to make a tradeoff between quality and memory/bandwidth. And while nobody likes to use more memory/bandwidth than they need to, desktop GPUs are not so starved for either that dropping quality significantly is a great tradeoff in most circumstances. In the desktop space, memory isn’t as restrictive, so the need for greater compression is less.
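To put rough numbers on that tradeoff: both ASTC and BC7 use fixed 128-bit blocks, so the compressed footprint is just a ceil-division over the image. A quick sketch (the 4096×4096 example and the block sizes chosen are my own illustration):

```python
import math

def compressed_size(width, height, block_w, block_h, block_bytes=16):
    """Bytes needed for a compressed texture: one fixed-size 128-bit block
    per block_w x block_h tile (true of both ASTC and BC7)."""
    blocks = math.ceil(width / block_w) * math.ceil(height / block_h)
    return blocks * block_bytes

for name, (bw, bh) in {"BC7 / ASTC 4x4": (4, 4),
                       "ASTC 6x6": (6, 6),
                       "ASTC 8x8": (8, 8)}.items():
    size = compressed_size(4096, 4096, bw, bh)
    print(f"{name}: {size / 2**20:.2f} MiB ({128 / (bw * bh):.2f} bpp)")
```

So a 4096×4096 texture drops from 16 MiB at BC7’s fixed 8 bpp to about 7.1 MiB at ASTC 6×6 and 4 MiB at 8×8; real, but hardly existential on a desktop GPU.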
I don’t see this as laziness so much as already having a “good enough” solution in hand.
Good news, everyone: ARM has decided to relicense the ASTC encoder under the Apache 2.0 license:
That doesn’t offer new info on the patent situation, but at least ASTC can be legally encoded now.
ASTC LDR 6x6 and 8x8 provide excellent compression for diffuse textures, including images with alpha channels, at 3.56 and 2 bits per pixel respectively. I regularly use them in my Android apps, and the quality is close to uncompressed images. While PC GPUs can simply use 12+ GB of RAM instead of supporting new compression formats, the new-gen consoles, which have limited RAM, could really make use of ASTC.
The only console that uses ASTC is the Nintendo Switch (well, because it’s a handheld device with a mobile chip in it).