Authors
Yvan Tortorella, Luca Bertaccini, Luca Benini, Davide Rossi, Francesco Conti
Publication date
2023/12/1
Journal
Future Generation Computer Systems
Volume
149
Pages
122-135
Publisher
North-Holland
Description
The increasing interest in TinyML, i.e., near-sensor machine learning on power budgets of a few tens of mW, is currently pushing toward enabling TinyML-class training as opposed to inference only. Current training algorithms, based on various forms of error and gradient backpropagation, rely on floating-point matrix operations to meet the precision and dynamic range requirements. So far, the energy and power cost of these operations has been considered too high for TinyML scenarios. This paper addresses the open challenge of near-sensor training on a few-mW power budget and presents RedMulE (Reduced-Precision Matrix Multiplication Engine), a low-power specialized accelerator conceived for multi-precision floating-point General Matrix-Matrix Operations (GEMM-Ops) acceleration, supporting FP16 as well as hybrid FP8 formats, with {sign, exponent, mantissa} = ({1, 4, 3}, {1, 5, 2}). We integrate …
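To make the two hybrid FP8 bit layouts concrete, the sketch below decodes an 8-bit minifloat under the stated {sign, exponent, mantissa} splits. It assumes generic IEEE-754-style semantics (subnormals, Inf/NaN at all-ones exponent); the actual RedMulE hardware encoding, in particular NaN/Inf handling in the {1, 4, 3} format, may differ.

```python
def decode_minifloat(byte, exp_bits, man_bits, bias=None):
    """Decode an 8-bit minifloat: 1 sign bit, `exp_bits` exponent bits,
    `man_bits` mantissa bits (illustrative, IEEE-754-style semantics)."""
    if bias is None:
        # Standard exponent bias: 2^(exp_bits - 1) - 1
        bias = (1 << (exp_bits - 1)) - 1
    sign = -1.0 if (byte >> (exp_bits + man_bits)) & 1 else 1.0
    exp = (byte >> man_bits) & ((1 << exp_bits) - 1)
    man = byte & ((1 << man_bits) - 1)
    if exp == 0:
        # Subnormal: no implicit leading 1
        return sign * man * 2.0 ** (1 - bias - man_bits)
    if exp == (1 << exp_bits) - 1:
        # All-ones exponent: Inf or NaN (IEEE-style assumption)
        return sign * float("inf") if man == 0 else float("nan")
    return sign * (1.0 + man / (1 << man_bits)) * 2.0 ** (exp - bias)

# {1, 4, 3} format: 0b0_0111_000 encodes 1.0 (exponent at bias 7)
print(decode_minifloat(0x38, 4, 3))
# {1, 5, 2} format: 0b0_01111_00 encodes 1.0 (exponent at bias 15)
print(decode_minifloat(0x3C, 5, 2))
```

The {1, 5, 2} split trades mantissa precision for the wider dynamic range that gradients need, while {1, 4, 3} keeps more precision for weights and activations.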