SAVE

Embedded Vision Autonomous System

SAVE aims to study new autonomous vision systems for the Internet of Things. New, versatile applications require smart objects that can transmit pictures and video while offering long-lasting autonomy. The goal is to develop vision systems with a power consumption of a few µW, comparable to that of simple sensors (temperature, smoke detection, etc.).

Objectives

The SAVE project (Système autonome de vision embarquée) will develop an intelligent vision system targeting embedded and autonomous applications. The envisioned domains include home automation, transport, traffic supervision, urban life monitoring, industrial manufacturing, environmental monitoring and eHealth. Functionally, the system will cover picture/video capture, processing and communication. In terms of energy, the project targets months or years of autonomy, up to perpetual operation thanks to energy harvesting or renewable energy.
The technical requirements range from still-picture capture for smart metering to frame rates above 2 fps with a mean consumption of about 50 µW in continuous mode. The consortium will optimise such systems globally through co-design techniques spanning several technologies: electronics, algorithms and micro-electronics (development of Ultra Low Power CMOS image sensors).
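
To put the 50 µW target in perspective, the short sketch below estimates how long such a system could run from a single coin cell. The battery parameters (a 225 mAh, 3 V CR2032-class cell) and the assumption of no energy harvesting are illustrative only, not figures from the project.

    # Rough energy-budget sketch for the 50 uW continuous-mode target.
    # Assumptions (not from the project text): a single CR2032-class coin
    # cell with a nominal capacity of ~225 mAh at 3 V, and no harvesting.

    AVG_POWER_W = 50e-6         # project target: ~50 uW mean consumption
    BATTERY_VOLTAGE_V = 3.0     # assumed coin-cell voltage
    BATTERY_CAPACITY_MAH = 225  # assumed CR2032 capacity

    def lifetime_years(power_w: float, voltage_v: float, capacity_mah: float) -> float:
        """Estimate battery lifetime in years for a constant average power draw."""
        avg_current_a = power_w / voltage_v       # ~16.7 uA at 50 uW / 3 V
        capacity_ah = capacity_mah / 1000.0
        lifetime_h = capacity_ah / avg_current_a  # hours of operation
        return lifetime_h / (24 * 365)

    if __name__ == "__main__":
        years = lifetime_years(AVG_POWER_W, BATTERY_VOLTAGE_V, BATTERY_CAPACITY_MAH)
        print(f"Estimated lifetime: {years:.1f} years")  # roughly 1.5 years

Under these assumed numbers, a 50 µW average budget corresponds to roughly a year and a half on a single coin cell, which is consistent with the months-to-years autonomy targeted by the project; energy harvesting would extend this further.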

Results

Apart from the technological studies, the project will focus on specific use cases for demonstration:

  • optical reading of energy meters
  • traffic monitoring

Added value

The valuable outcomes of the project are twofold: the Ultra Low Power sensor developed by the partners will address the needs of specific applications that are not covered today, and the demonstrators are expected to lead to new versatile vision systems for the Internet of Things.