Engineers Develop AI Language Models Without Matrix Multiplication
A team of researchers from the University of California, Soochow University, and LuxiTech has developed a method for implementing AI language models without matrix multiplication (MatMul), an operation usually seen as a major computational bottleneck. In a paper pre-printed to arXiv, their method replaces 16-bit floating-point weights with ternary values {-1, 0, 1}, and uses…
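To see why ternary weights remove the need for multiplication, consider a simple sketch (not the team's actual implementation): when every weight is -1, 0, or +1, each output of a matrix-vector product reduces to a sum and difference of selected inputs.

```python
import numpy as np

def ternary_matvec(W, x):
    """Compute W @ x using only additions and subtractions.

    Assumes every entry of W is -1, 0, or +1 (ternary), so no
    multiplications are needed -- this is the key property the
    MatMul-free approach exploits.
    """
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            if W[i, j] == 1:
                out[i] += x[j]      # +1 weight: add the input element
            elif W[i, j] == -1:
                out[i] -= x[j]      # -1 weight: subtract the input element
            # 0 weight: skip entirely
    return out

W = np.array([[1, 0, -1],
              [-1, 1, 0]])
x = np.array([2.0, 3.0, 5.0])
print(ternary_matvec(W, x))  # same result as W @ x, with no multiplies
```

In practice such kernels are fused and vectorized on GPU or FPGA hardware rather than written as explicit loops, but the arithmetic saving is the same: every multiply-accumulate becomes a plain add or subtract.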
