I know teraflops are real and what they measure
I still refuse to believe they’re not a fake term used to fluff up tech announcements and make shit sound more powerful than it is because that’s a fucking stupid name that nobody should use
‘They are’
It's usually measured as the rate of floating-point fused multiply-add (FMA) operations - that's it.
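To make that concrete, here's a minimal Python sketch of the operation being counted. The 2-FLOPs-per-FMA counting convention is the standard one; the function itself is just illustration, not how any benchmark actually runs:

```python
# One FMA, d = a*b + c, is conventionally counted as 2 FLOPs
# (one multiply, one add), even when the hardware does it in a
# single instruction.
def fma(a, b, c):
    # Plain Python rounds after the * and again after the +; a real
    # hardware FMA rounds only once. The FLOP count is 2 either way.
    return a * b + c

print(fma(2.0, 3.0, 1.0))  # 7.0
```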
But also, multiply then add is the cornerstone of 3D graphics…
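For example, transforming a vertex by a 4x4 matrix, about as bread-and-butter as 3D graphics gets, is nothing but chained multiply-adds. A quick sketch (the matrix and vertex values here are made up for illustration):

```python
# Each output component is a dot product: 4 multiply-adds per row,
# so 16 FMAs to transform one vertex.
def transform(matrix, vertex):
    out = []
    for row in matrix:
        acc = 0.0
        for m, v in zip(row, vertex):
            acc = m * v + acc  # one multiply-add per term
        out.append(acc)
    return out

identity = [[1.0, 0.0, 0.0, 0.0],
            [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]
print(transform(identity, [1.0, 2.0, 3.0, 1.0]))  # [1.0, 2.0, 3.0, 1.0]
```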
That’s like saying clock rate and core count are fake terms. Sure, by themselves they might not mean much, but they’re part of a system that directly benefits from them being high.
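Back-of-envelope version of how those combine into a teraflops figure, assuming one FMA (2 FLOPs) per core per cycle; the core count and clock below are invented, not any real GPU:

```python
cores = 2560          # shader cores (hypothetical)
clock_hz = 1.8e9      # boost clock in Hz (hypothetical)
flops_per_cycle = 2   # one FMA = 1 multiply + 1 add

peak_flops = cores * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e12:.2f} TFLOPS")  # 9.22 TFLOPS
```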
The issue with the teraflops metric is that it's roughly inversely proportional to the bit-width of the data, meaning teraflops at 8-bit is about 2x teraflops at 16-bit. So quoting teraflops without specifying the precision is almost useless. That said, 8-bit is arguably too low for modern games and 64-bit trades away too much performance for the accuracy gain, so you can reasonably assume the teraflops a gaming company quotes are based on 16-bit or 32-bit performance.
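Rough sketch of that scaling, assuming the common case where halving the precision doubles the throughput (the baseline FP32 rating is made up):

```python
fp32_tflops = 10.0  # hypothetical FP32 rating

for bits in (32, 16, 8):
    scale = 32 // bits  # 1x at 32-bit, 2x at 16-bit, 4x at 8-bit
    print(f"{bits:>2}-bit: ~{fp32_tflops * scale:.0f} TFLOPS")
```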