News

Matrix multiplication advancement could lead to faster, ... In October 2022, we covered a new technique discovered by a Google DeepMind AI model called AlphaTensor, ...
To evaluate their approach, the researchers compared their MatMul-free LM against a reproduced Llama-2-style model (which they call "Transformer++") across three model sizes: 370M, 1.3B, and 2.7B ...
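The central trick behind MatMul-free language models is constraining weights to the ternary values {-1, 0, +1}, so that every multiply in a matrix product collapses into an addition, a subtraction, or a skip. A minimal sketch of that idea, assuming a ternary weight matrix (the function name `ternary_matvec` is illustrative, not from the paper):

```python
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product where W contains only {-1, 0, +1}.

    Because every weight is -1, 0, or +1, each output element is just
    a signed sum of selected input entries -- no multiplications.
    """
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            if W[i, j] == 1:
                out[i] += x[j]     # weight +1: add the input
            elif W[i, j] == -1:
                out[i] -= x[j]     # weight -1: subtract the input
            # weight 0: skip entirely
    return out
```

The result matches an ordinary `W @ x`, but the inner loop never executes a scalar multiply, which is what makes dedicated low-power hardware attractive.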
In collaboration with NVIDIA, researchers from SGLang have published early benchmarks of the GB200 (Grace Blackwell) NVL72 ...
By eliminating matrix multiplication and running their algorithm on custom hardware, the researchers found that they could run a billion-parameter-scale language model on just 13 watts, about ...
While matrix multiplication is one of algebra’s simplest operations, taught in high school math, it is also one of the most fundamental computational tasks and, as it turns out, ...
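For reference, the schoolbook algorithm computes each entry of the product of two n-by-n matrices as a dot product of a row and a column, for n scalar multiplications per entry and n³ in total, which is why so much effort goes into beating it. A plain sketch:

```python
def naive_matmul(A, B):
    """Schoolbook multiplication of two n x n matrices.

    Each of the n*n output entries takes n multiplications and
    n-1 additions, so the total cost grows as n**3.
    """
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C
```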
A Laser Focus. In 1986, Strassen had another big breakthrough when he introduced what’s called the laser method for matrix multiplication. Strassen used it to establish an upper bound for omega of ...
Generally, the number of additions is equal to the number of entries in the matrix, so four for the two-by-two matrices and 16 for the four-by-four matrices. This difference between addition and ...
Researchers at MIT's Computer Science & Artificial Intelligence Lab (CSAIL) have open-sourced Multiply-ADDitioN-lESS (MADDNESS), an algorithm that speeds up machine learning using approximate matrix m ...
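MADDNESS avoids multiplications by mapping inputs to learned prototypes with a fast hashing scheme and reading precomputed products out of lookup tables. As a loose toy illustration of that lookup-table flavor (this is not the paper's actual hashing algorithm), one can snap each row of A to its nearest prototype and reuse that prototype's precomputed product with B:

```python
import numpy as np

def approx_matmul(A, B, prototypes):
    """Toy approximate matmul via vector quantization.

    Each row of A is replaced by its nearest prototype, whose product
    with B was precomputed once -- so per-row cost is a table lookup
    plus a nearest-neighbor search, not a full dot product with B.
    """
    table = prototypes @ B  # precompute prototype products once per B
    # squared Euclidean distance from every row of A to every prototype
    d = ((A[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    idx = d.argmin(axis=1)  # nearest prototype per row
    return table[idx]
```

The approximation error depends entirely on how well the prototypes cover the rows of A; when every row of A is itself a prototype, the result is exact.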
The algorithm is able to re-discover older matrix multiplication algorithms and improve upon its own to discover newer and faster algorithms. “AlphaTensor is the first AI system for discovering novel, ...