Samsung has announced a new NPU technology that will allow on-device AI to be faster, more energy efficient, and take up less space on the chip. Thanks to Quantization Interval Learning, 4-bit neural networks can be created that retain the accuracy of a 32-bit network. Using fewer bits significantly reduces the number of computations and the hardware that carries them out - Samsung says it can achieve the same results 8x faster while reducing the number of transistors 40x to 120x. This will make for NPUs that are faster and use less power, but can carry out familiar tasks such as object...
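Samsung hasn't published the details of Quantization Interval Learning here, but the general idea behind 4-bit quantization can be sketched in a few lines: 32-bit floating-point weights are clipped to an interval and mapped onto just 16 discrete levels, so each weight needs only 4 bits of storage and the arithmetic hardware can be much simpler. The `quantize_4bit` function and its `interval` parameter below are illustrative assumptions, not Samsung's actual method (QIL learns the intervals during training).

```python
import numpy as np

def quantize_4bit(weights, interval):
    """Illustrative uniform 4-bit quantization (not Samsung's QIL):
    clip weights to [-interval, +interval], then snap them onto
    the 16 evenly spaced levels a 4-bit code can represent."""
    levels = 2 ** 4 - 1  # 15 steps -> 16 representable values
    clipped = np.clip(weights, -interval, interval)
    # map [-interval, +interval] onto integer codes 0..15
    codes = np.round((clipped + interval) / (2 * interval) * levels)
    # map the codes back to representative float values for inference
    return codes / levels * 2 * interval - interval

w = np.array([-1.2, -0.3, 0.0, 0.41, 0.9], dtype=np.float32)
q = quantize_4bit(w, interval=1.0)  # every value now lies on one of 16 levels
```

The key trade-off QIL addresses is that the clipping interval matters enormously: too wide and the 16 levels are spread thinly over values that rarely occur; too narrow and large weights are clipped away. Learning the interval per layer is what lets a 4-bit network approach 32-bit accuracy.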