A basic example demonstrating how to use the Zest_Sensor_Camera board with a TensorFlow Lite for Microcontrollers (TFLM) CNN that can recognise handwritten digits.
The following boards are required:
- Zest Core STM32L4A6RG
- Zest Sensor Camera
- Zest Battery LiPo (optional)
This demo makes use of the following libraries:
- Zest Sensor Camera (includes the LM3405 LED flash and OV5640 sensor drivers)
- TensorFlow Lite for Microcontrollers
To clone and deploy the project in one command, use `mbed import` and skip to the target and toolchain definition:

```shell
mbed import https://gitlab.com/eg-julien/zest-sensor-camera-ai.git zest-sensor-camera-ai
```
Alternatively:

- Clone to "zest-sensor-camera-ai" and enter it:

  ```shell
  git clone https://gitlab.com/eg-julien/zest-sensor-camera-ai.git zest-sensor-camera-ai
  cd zest-sensor-camera-ai
  ```

- Create an empty Mbed CLI configuration file:

  - On Linux/macOS:

    ```shell
    touch .mbed
    ```

  - On Windows:

    ```shell
    echo.> .mbed
    ```

- Deploy software requirements with:

  ```shell
  mbed deploy
  ```
Define your target (e.g. `ZEST_CORE_STM32L4A6RG`) and toolchain:

```shell
mbed target ZEST_CORE_STM32L4A6RG
mbed toolchain GCC_ARM
```
Export to Eclipse IDE with:

```shell
mbed export -i eclipse_6tron
```
Compile the project:

```shell
mbed compile
```
Program the target device (e.g. `STM32L4A6RG` for the Zest_Core_STM32L4A6RG) with a J-Link debug probe:

```shell
python dist/program.py STM32L4A6RG BUILD/ZEST_CORE_STM32L4A6RG/GCC_ARM/zest-sensor-camera-demo.elf
```
Debug on the target device (e.g. `STM32L4A6RG` for the Zest_Core_STM32L4A6RG) with a J-Link debug probe:

- First, start the GDB server:

  ```shell
  JLinkGDBServer -device STM32L4A6RG
  ```

- Then, in another terminal, start the debugger:

  ```shell
  arm-none-eabi-gdb BUILD/ZEST_CORE_STM32L4A6RG/GCC_ARM/zest-sensor-camera-demo.elf
  ```
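Once `arm-none-eabi-gdb` is running, a typical session connects to the J-Link GDB server, flashes the firmware, and resumes execution. A minimal sketch, assuming the server listens on its default port 2331:

```
(gdb) target remote localhost:2331
(gdb) monitor reset
(gdb) load
(gdb) continue
```

`monitor reset` is forwarded to the J-Link GDB server, and `load` programs the ELF file that was passed to GDB on the command line.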
Note: You may have to adjust your GDB auto-load safe path, or disable the check completely by adding a `.gdbinit` file in your `$HOME` folder containing:

```
set auto-load safe-path /
```
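The `.gdbinit` file above can be created from the shell; a minimal sketch, which appends to any existing file rather than overwriting it:

```shell
# Append the auto-load safe-path setting to ~/.gdbinit,
# creating the file if it does not already exist.
cat >> "$HOME/.gdbinit" <<'EOF'
set auto-load safe-path /
EOF
```

Note that `set auto-load safe-path /` disables the safe-path check entirely; to stay restrictive, replace `/` with the path of your project directory.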