Ultra-Low Power IoT Smart Visual Sensing Devices for Always-ON Applications

Rusci, Manuele (2018) Ultra-Low Power IoT Smart Visual Sensing Devices for Always-ON Applications, [Doctoral thesis], Alma Mater Studiorum Università di Bologna. PhD programme in Electronic Engineering, Telecommunications and Information Technology, Cycle 30. DOI 10.6092/unibo/amsdottorato/8628.
Full-text documents available:
PDF document (English)
Available under license: Creative Commons Attribution Non-commercial No Derivatives 3.0 (CC BY-NC-ND 3.0).
Download (4MB)

Abstract

This work presents the design of a smart ultra-low power visual sensor architecture that couples an ultra-low power event-based image sensor with a parallel, power-optimized digital architecture for data processing. By means of mixed-signal circuits, the imager generates a stream of address events after the extraction and binarization of spatial gradients. When targeting monitoring applications, the sensing and processing energy costs can be reduced by two orders of magnitude thanks to the combination of mixed-signal imaging technology, event-based data compression, and event-driven computing approaches. From a system-level point of view, a context-aware power management scheme is enabled by a power-optimized sensor peripheral block that requests processor activation only when relevant information is detected within the focal plane of the imager. When targeting a smart visual node for triggering purposes, the event-driven approach brings a 10x power reduction with respect to previously presented visual systems, while achieving comparable detection accuracy. To further enhance the recognition capabilities of the smart camera system, this work introduces the concept of event-based binarized neural networks. By coupling the theory of binarized neural networks with focal-plane processing, a 17.8% energy reduction is demonstrated on a real-world data classification task, with a performance drop of 3% with respect to a baseline system featuring a commercial visual sensor and a Binary Neural Network engine. Moreover, when the BNN engine is coupled with the event-driven triggering detection flow, the average power consumption can be as low as the sleep power of 0.3 mW in the case of infrequent events, which is 8x lower than a smart camera system featuring a commercial RGB imager.
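
As an illustration of the ideas summarized above, and not code taken from the thesis, the following minimal Python sketch mimics the described pipeline: spatial gradients of a frame are extracted and binarized into address events, the processor is woken only when enough events indicate relevant activity, and a binarized fully-connected layer is then evaluated with an XNOR-popcount dot product. All function names, thresholds, and tensor sizes here are hypothetical choices for the example.

import numpy as np

def extract_binary_events(frame, threshold=0.3):
    # Hypothetical focal-plane step: spatial gradients of a grayscale frame
    # are computed and binarized; pixels whose gradient magnitude exceeds
    # the threshold are emitted as (x, y) address events.
    gx = np.abs(np.diff(frame, axis=1, prepend=frame[:, :1]))
    gy = np.abs(np.diff(frame, axis=0, prepend=frame[:1, :]))
    binary_map = ((gx + gy) > threshold).astype(np.uint8)
    ys, xs = np.nonzero(binary_map)
    return binary_map, list(zip(xs.tolist(), ys.tolist()))

def should_wake_processor(events, min_events=50):
    # Context-aware power management: the sensor peripheral requests the
    # processor only when enough events signal relevant activity.
    return len(events) >= min_events

def xnor_popcount_dense(binary_input, binary_weights):
    # Binarized fully-connected layer: with activations and weights in
    # {-1, +1} encoded as {0, 1} bits, the dot product reduces to
    # XNOR + popcount, mapped back to a signed sum.
    matches = np.logical_not(np.logical_xor(binary_input, binary_weights))
    popcount = matches.sum(axis=1)
    return 2 * popcount - binary_weights.shape[1]

# Toy usage on a synthetic 64x64 frame with random binary weights.
rng = np.random.default_rng(0)
frame = rng.random((64, 64), dtype=np.float32)
event_map, events = extract_binary_events(frame)
if should_wake_processor(events):
    weights = rng.integers(0, 2, size=(10, event_map.size), dtype=np.uint8)
    scores = xnor_popcount_dense(event_map.flatten(), weights)
    print("class scores:", scores)

The XNOR-popcount step reflects why binarized networks suit ultra-low power digital engines: with activations and weights constrained to single bits, each multiply-accumulate collapses to a bitwise XNOR followed by a population count.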

Document type: Doctoral thesis
Author: Rusci, Manuele
Supervisor:
Co-supervisor:
PhD programme:
Cycle: 30
Coordinator:
Disciplinary sector:
Competition sector:
Keywords: ultra-low power, embedded, smart camera, event-based sensing, event-driven, binarized neural networks
URN:NBN:
DOI: 10.6092/unibo/amsdottorato/8628
Date of defence: 27 April 2018
URI:
