diff --git a/AzureSetup.md b/AzureSetup.md
index 01e3217..abc90ca 100644
--- a/AzureSetup.md
+++ b/AzureSetup.md
@@ -2,7 +2,7 @@
 
 ## Provision IoT Hub/Device in Azure
 
-The following operations can all be done from the Azure Portal. I'm showing the commands using the Azure CLI which I find more convenient.
+The following operations can all be done from the Azure Portal. I'm showing the commands using the [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/?view=azure-cli-latest), which I find more convenient.
 
 1. Create a resource group with the command: `az group create -l northeurope -g RES_GROUP_NAME`
 2. Create an IoT Hub with the command: `az iot hub create -l northeurope -g RES_GROUP_NAME --sku B1 -n IOTHUB_NAME`
@@ -12,3 +12,7 @@ The following operations can all be done from the Azure Portal. I'm showing the
 3. Create an IoT Device with the command `az iot hub device-identity create -g RES_GROUP_NAME -n IOTHUB_NAME --device-id NAME_YOUR_DEVICE`. This creates an IoT device configuration on the IoT Hub, which you'll use to push readings to it. By default the authentication method is Shared Access Key.
 
 4. Copy the IoT device's Connection String with the command: `az iot hub device-identity show-connection-string -g RES_GROUP_NAME -n IOTHUB_NAME --device-id NAME_YOUR_DEVICE`. Copy the string starting with "Hostname=..." to a text editor for later use.
+
+5. You'll also need to create a consumer group, which I'm using in Stream Analytics. I named mine "bme680consumers" (you can find this under the "Built-in endpoints" option on the IoT Hub's page, or create it with the CLI, as sketched below - https://docs.microsoft.com/en-us/cli/azure/iot/hub/consumer-group?view=azure-cli-latest ).
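+
+A minimal sketch of the CLI route (the command shape comes from the consumer-group docs linked above; `RES_GROUP_NAME`/`IOTHUB_NAME` are the same placeholders as in the earlier steps, and `bme680consumers` is just the name I picked):
+
+```bash
+# Create a consumer group on the IoT Hub's built-in (Event Hub-compatible) endpoint
+az iot hub consumer-group create -g RES_GROUP_NAME --hub-name IOTHUB_NAME --name bme680consumers
+```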
+
+These links can be informative if you want to know more: https://docs.microsoft.com/en-gb/learn/modules/remotely-monitor-devices-with-azure-iot-hub/2-create-iot-hub-device-id?pivots=csharp and https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-builtin .
diff --git a/DeviceUploadData.md b/DeviceUploadData.md
index 46d92d1..7464c78 100644
--- a/DeviceUploadData.md
+++ b/DeviceUploadData.md
@@ -12,7 +12,7 @@
 
 2. After this, compile the program again by calling `./make.sh` as before.
 3. Create a `data` folder as a subfolder of the one from which you'll be running the compiled `bsec_bme680` (or you'll get a friendly `Segmentation fault` when you run it).
-3. Now type `./bsec_bme680 &` to run the application in the background. If you list the contents of the `data` folder, you'll start seeing new csv files being generated, one every 3 seconds.
+4. Now type `./bsec_bme680 &` to run the application in the background. If you list the contents of the `data` folder, you'll start seeing new CSV files being generated, one every 3 seconds. You'll have to run this manually every time you reboot/restart your Raspberry, as I haven't configured auto-start yet (**TBD**).
 
 ## Upload the CSV files to an Azure IoT Hub
 
@@ -26,7 +26,7 @@ The steps to follow are:
 2. Edit the file to change the value of the `iothub_connstring` variable. This is a string looking like `"HostName=NAME_OF_YOUR_IOTHUB.azure-devices.net;DeviceId=NAME_OF_YOUR_DEVICE_IN_THE_IOTHUB;SharedAccessKey=LOTS_OF_ALPHANUM_CHARACTERS"`, which you can obtain from the Azure portal.
 3. To do a test run, call `python3 scoop_up_data.py ./data/`.
 This will upload all your already captured CSV files to the Azure IoT Hub, in chronological order, and print out something like this as it uploads them:
-```
+```text
 pi@rpi0:~/bsec_bme680_linux $ python3 scoop_up_data.py ./data/
 Starting iothub client
 Reading files from /home/pi/bsec_bme680_linux/data:
@@ -45,10 +45,13 @@ Files uploaded: 378
 
 When you save and exit, the command above will be executed every minute and upload the readings (typically 20 files at a time, considering they are recorded every 3 seconds).
 
-## TBD
+6. Clean up the uploaded files - as above, run `crontab -e` and add this at the end:
+
+`*/2 * * * * rm /home/pi/bsec_bme680_linux/data/uploaded*.csv`
+
+This will remove the uploaded files every two minutes. See here for more detail on the crontab string: https://crontab.guru/every-2-minutes .
 
-**TBD** clear up the uploaded files.
-**TBD** - how to set up your Azure IotHub -- create it and an IoT Device. Add a new instructions step. Maybe link to this: https://docs.microsoft.com/en-gb/learn/modules/remotely-monitor-devices-with-azure-iot-hub/2-create-iot-hub-device-id?pivots=csharp
+## To be done
 
-**TBD** avoid race condition as per here: https://www.cyberciti.biz/faq/how-to-run-cron-job-every-minute-on-linuxunix/
+How to avoid race conditions in the cron jobs, as per here: https://www.cyberciti.biz/faq/how-to-run-cron-job-every-minute-on-linuxunix/ . This is a minor detail.
diff --git a/README.md b/README.md
index 68386c6..80b0dfb 100644
--- a/README.md
+++ b/README.md
@@ -48,12 +48,13 @@ To do this, follow steps [4 - Save and upload readings](DeviceUploadData.md).
 
 The desired processing steps over the incoming data stream are essentially filtering and aggregation. I store the "bronze" data (i.e. the data as received, without any changes) in one table, then do the filtering/aggregation and store the result in another table. I decided to aggregate/average every 30 seconds, but more aggregations can be added simply by adding their configuration.
 
-For this processing I'm using Azure Stream Analytics, and [the detail of what I'm doing is here](StreamProcessing.md).
+For this processing I'm using Azure Stream Analytics, and here's [5 - how to configure stream processing](StreamProcessing.md).
 
-## Display data
+## Displaying the data
 
-Here's a simple chart to look at Temperature and Humidity, in PowerBI Desktop:
-![](simple-pbi-chart.png)
+Here's a simple chart to look at Temperature and Humidity (silver data), in Power BI Desktop:
+
+![PBI Dashboard](simple-pbi-chart.png)
 
 Also started working on an Android App; that's **TBD** atm.
\ No newline at end of file
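+
+One way to sanity-check the whole pipeline end to end is to watch messages arriving at the hub while the device is uploading. A minimal sketch, assuming the `azure-iot` CLI extension is installed (the `az iot hub device-identity` commands above come from the same extension):
+
+```bash
+# Stream incoming device-to-cloud messages from the hub to the console; stop with Ctrl+C
+az iot hub monitor-events -n IOTHUB_NAME --device-id NAME_YOUR_DEVICE
+```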