Run TensorFlow models on edge devices
On the Edge
Conclusion
TensorFlow Lite allows accurate prediction of the harvesting date of lettuce growing in a vertical farm, without the need for a powerful server and with only an intermittent connection to the cloud. Because TensorFlow Lite reduces both the computation cost and the size of the deep neural network model, inference can run at the edge on a Raspberry Pi with the TensorFlow Lite runtime. The inference task has a small footprint, which leaves the Raspberry Pi free to manage all the other critical tasks of a working vertical farm. Furthermore, TensorFlow Lite can be used on IoT devices as small as ARM Cortex-M microcontrollers.
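To give a rough idea of how little is needed on the device, the following sketch loads a converted .tflite model with the lightweight tflite_runtime package and runs a single prediction. The model file name, the input data, and the interpretation of the output are hypothetical placeholders for illustration, not the exact code used in the project.

```python
# Minimal sketch: run a converted .tflite model with the lightweight
# tflite_runtime package on a Raspberry Pi. The model file and input
# shape are placeholders (assumptions), not the project's actual model.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="lettuce_harvest.tflite")  # hypothetical file name
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy tensor standing in for a preprocessed camera frame or sensor reading.
frame = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

prediction = interpreter.get_tensor(output_details[0]["index"])
print("Predicted days until harvest:", prediction)
```

The tflite_runtime package contains only the interpreter, so it installs and runs with a far smaller footprint than the full TensorFlow distribution.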
TensorFlow Lite has several limitations, though. Only a subset of operators is supported, which constrains the design of the model; recurrent neural networks, for instance, are not fully supported, although a Google team appears to be working on closing this gap for all models that TensorFlow currently offers. Another weakness is the lack of support for reinforcement learning, which demands more computation because the incremental training steps are performed on the edge. This challenge can nonetheless be overcome with an Edge TPU device such as the Google Coral [7] Dev Board or the Coral USB Accelerator, which can be plugged into a Raspberry Pi.
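To show how the operator limitation surfaces in practice, the following conversion sketch quantizes a small Keras model and lets the converter fall back to "select" TensorFlow ops for layers that have no built-in TensorFlow Lite equivalent. The model architecture here is only an example, not the one used in the article.

```python
# Sketch of a conversion step that falls back to TensorFlow "select" ops
# when a layer (e.g., part of a recurrent network) has no built-in
# TensorFlow Lite operator. The model below is a stand-in example.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(24, 8)),  # hypothetical architecture
    tf.keras.layers.Dense(1),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # shrink the model via quantization
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # prefer native TensorFlow Lite operators
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TensorFlow ops where needed
]

with open("model.tflite", "wb") as f:
    f.write(converter.convert())
```

Keep in mind that select ops require the larger Flex delegate at run time, so this fallback trades some of the small footprint for broader model coverage.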
Infos
[1] TensorFlow Lite: https://www.tensorflow.org/lite
[2] pandas: https://pandas.pydata.org/
[3] NumPy: https://numpy.org/
[4] OpenCV: https://opencv.org/
[5] ELK: https://www.elastic.co/what-is/elk-stack
[6] virtualenv: https://virtualenv.pypa.io/en/latest/
[7] Coral: https://coral.ai/