Engineers Run AI Models Without Matrix Multiplication

Updated on 10 Jul 2024 | 1 min read

A team of researchers from the University of California, Soochow University, and LuxiTec has developed a method for running AI language models without matrix multiplication (MatMul), an operation usually seen as a core bottleneck.

In a preprint posted to arXiv, the team describes replacing a model's 16-bit floating-point weights with ternary values {-1, 0, +1}, combined with quantization techniques that speed up processing. Within standard transformer blocks, the researchers replaced the self-attention mechanism with a MatMul-free linear gated recurrent unit. They found that their models scale as well as current state-of-the-art transformers while requiring markedly less computing power and electricity.
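The ternary-weight idea can be sketched in a few lines of NumPy. This is an illustrative absmean-style quantization scheme, not the paper's exact implementation; the function names and shapes are our own. The point is that once weights are restricted to {-1, 0, +1}, a matrix-vector product reduces to additions and subtractions:

```python
import numpy as np

def ternarize(w, eps=1e-6):
    """Illustrative absmean ternary quantization: scale by the mean
    absolute weight, round, and clip to {-1, 0, +1}."""
    scale = np.abs(w).mean() + eps
    q = np.clip(np.round(w / scale), -1, 1)
    return q.astype(np.int8), scale

def ternary_matvec(q, scale, x):
    """With ternary weights the 'matmul' needs no multiplications by W:
    add x[j] where q == +1, subtract it where q == -1, skip zeros."""
    out = np.zeros(q.shape[0], dtype=x.dtype)
    for i in range(q.shape[0]):
        out[i] = x[q[i] == 1].sum() - x[q[i] == -1].sum()
    return out * scale

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
x = rng.normal(size=8)

q, s = ternarize(W)
approx = ternary_matvec(q, s, x)  # additions/subtractions only
exact = (q * s) @ x               # reference result via a real matmul
print(np.allclose(approx, exact))
```

On real hardware the gain comes from skipping multiply units entirely; the loop above is only meant to make the add/subtract structure explicit.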


