Merge pull request #6 from m4dm4rtig4n/0.3
rework ha autodisco
m4dm4rtig4n committed Sep 28, 2021
2 parents 667799e + 9debdbb commit 48b9b3b
Showing 30 changed files with 806 additions and 333 deletions.
Empty file modified .github/workflows/build_push_docker.yml
100644 → 100755
3 changes: 2 additions & 1 deletion .gitignore
100644 → 100755
@@ -1 +1,2 @@
.idea
.idea
/data
1 change: 1 addition & 0 deletions Dockerfile
100644 → 100755
@@ -2,6 +2,7 @@ FROM python:3.9.7

COPY ./app /app

RUN mkdir -p /data
RUN pip install -r /app/requirement.txt

CMD ["python", "-u", "/app/main.py"]
55 changes: 44 additions & 11 deletions README.md
100644 → 100755
@@ -26,6 +26,8 @@ Enedis Gateway limits you to 50 calls per day per PDL.

If you reach this limit, you will be banned for 24 hours!

See the [persistence](#persistance) section to reduce the number of API calls.

## Environment variable

| Variable | Information | Mandatory/Default |
@@ -40,9 +42,11 @@ If you reach this limit, you will be banned for 24 hours!
| MQTT_PASSWORD | Mosquitto Password (leave empty if no user) | |
| RETAIN | Retain data in MQTT | False |
| QOS | Quality Of Service MQTT | 0 |
| HA_AUTODISCOVERY | Enable auto-discovery | false |
| GET_CONSUMPTION | Enable API call to get your consumption | True |
| GET_PRODUCTION | Enable API call to get your production | False |
| HA_AUTODISCOVERY | Enable auto-discovery | False |
| HA_AUTODISCOVERY_PREFIX | Home Assistant auto discovery prefix | homeassistant |
| BASE_PRICE | Price of kWh in base plan | False |
| BASE_PRICE | Price of kWh in base plan | 0 |
| CYCLE | Data refresh cycle (3600s minimum) | 3600 |
| YEARS | Number of years of history to import | 1 |
| ADDRESSES | Get all addresses information | False |
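All of the values above arrive as environment-variable strings ("True", "0", ...), so they must be coerced before use. A minimal sketch of that coercion; the `env_bool`/`env_int` helpers are hypothetical, not code from this project:

```python
# Environment variables are always strings; coerce them before handing the
# values to the MQTT client. These helpers are illustrative only.
import os

def env_bool(name, default=False):
    # "1", "true", "yes" (any casing) -> True; anything else -> False
    return os.environ.get(name, str(default)).strip().lower() in ("1", "true", "yes")

def env_int(name, default=0):
    return int(os.environ.get(name, default))

os.environ["RETAIN"] = "True"
os.environ["QOS"] = "0"

retain = env_bool("RETAIN")  # -> True
qos = env_int("QOS")         # -> 0
```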
@@ -51,12 +55,20 @@ If you reach this limit, you will be banned for 24 hours!

The HC/HP calculations require many API calls, so the limit is reached very quickly.

> Need database => Roadmap
**WARNING:**

**The following options generate additional API calls; be careful not to exceed the daily call limit!**
- YEARS (one call per year)
- GET_CONSUMPTION (one call per year in YEARS)
- GET_PRODUCTION (one call per year in YEARS)
- ADDRESSES
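The arithmetic behind this warning can be sketched as follows (an illustration, not code from this repo): each enabled option costs one call per year imported, ADDRESSES costs one call, and the total repeats every refresh cycle.

```python
# Estimate API calls per day from the options above (illustrative helper).
def calls_per_day(years, get_consumption=True, get_production=False,
                  addresses=False, cycles_per_day=1):
    per_cycle = 0
    if get_consumption:
        per_cycle += years      # one call per year of consumption history
    if get_production:
        per_cycle += years      # one call per year of production history
    if addresses:
        per_cycle += 1          # one extra call for address details
    return per_cycle * cycles_per_day

# YEARS=3 with consumption and production on a 3600 s cycle (24 cycles/day)
# already makes 144 calls per day, far beyond the 50-call limit.
```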

## Persistance

Since v0.3, EnedisGateway2MQTT uses a SQLite database to store all data and reduce the number of API calls.
Don't forget to mount /data to keep the database persistent!
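The persistence idea can be sketched as a simple response cache: store each API payload in SQLite so a restart reuses stored data instead of spending calls again. Table and key names below are illustrative, not the schema this project actually uses.

```python
# Minimal sketch of caching API responses in SQLite (illustrative schema).
import json
import sqlite3

con = sqlite3.connect(":memory:")  # the container would use a file under /data
con.execute("CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, payload TEXT)")

def get_or_fetch(key, fetch):
    row = con.execute("SELECT payload FROM cache WHERE key = ?", (key,)).fetchone()
    if row:                     # cache hit: no API call spent
        return json.loads(row[0])
    data = fetch()              # cache miss: one API call, stored for next time
    con.execute("INSERT INTO cache (key, payload) VALUES (?, ?)", (key, json.dumps(data)))
    con.commit()
    return data

api_calls = []
first = get_or_fetch("pdl/2021", lambda: api_calls.append(1) or {"kwh": 42})
second = get_or_fetch("pdl/2021", lambda: api_calls.append(1) or {"kwh": 42})
# second run hits the cache, so only one "API call" was made
```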

## Usage

```
@@ -68,15 +80,17 @@ MQTT_PREFIX="enedis_gateway"
MQTT_CLIENT_ID="enedis_gateway"
MQTT_USERNAME='enedis_gateway_username'
MQTT_PASSWORD='enedis_gateway_password'
RETAIN='False'
QOS='0'
HA_AUTODISCOVERY='False'
RETAIN="True"
QOS=0
GET_CONSUMPTION="True"
GET_PRODUCTION="False"
HA_AUTODISCOVERY="False"
HA_AUTODISCOVERY_PREFIX='homeassistant'
CYCLE=86400
YEARS=1
BASE_PRICE=1
BASE_PRICE=0
docker run -it -restart=unless-stopped \
docker run -it --restart=unless-stopped \
-e ACCESS_TOKEN="$ACCESS_TOKEN" \
-e PDL="$PDL" \
-e MQTT_HOST="$MQTT_HOST" \
@@ -87,11 +101,14 @@ docker run -it -restart=unless-stopped \
-e MQTT_PASSWORD="$MQTT_PASSWORD" \
-e RETAIN="$RETAIN" \
-e QOS="$QOS" \
-e GET_CONSUMPTION="$GET_CONSUMPTION" \
-e GET_PRODUCTION="$GET_PRODUCTION" \
-e HA_AUTODISCOVERY="$HA_AUTODISCOVERY" \
-e HA_AUTODISCOVERY_PREFIX="$HA_AUTODISCOVERY_PREFIX" \
-e CYCLE="$CYCLE" \
-e YEARS=$YEARS \
-e BASE_PRICE="$BASE_PRICE" \
-v $(pwd):/data \
m4dm4rtig4n/enedisgateway2mqtt:latest
```

@@ -102,6 +119,8 @@ services:
enedisgateway2mqtt:
image: m4dm4rtig4n/enedisgateway2mqtt:latest
restart: unless-stopped
volumes:
- mydata:/data
environment:
ACCESS_TOKEN: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
PDL: "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
@@ -111,21 +130,35 @@ services:
MQTT_CLIENT_ID: "enedis_gateway"
MQTT_USERNAME: 'enedis_gateway_username'
MQTT_PASSWORD: 'enedis_gateway_password'
RETAIN: 'False'
QOS: '0'
HA_AUTODISCOVERY: 'False'
RETAIN: True
QOS: 0
GET_CONSUMPTION: True
GET_PRODUCTION: False
HA_AUTODISCOVERY: False
HA_AUTODISCOVERY_PREFIX: 'homeassistant'
CYCLE: 86400
YEARS: 1
BASE_PRICE: 0.1445
volumes:
mydata:
```

## Roadmap

- Add **DJU18**
- Add HC/HP
- Create Home Assistant OS Addons
- Add Postgres/MariaDB connector

## Change log:

### [0.3] - 2021-09-27

- Rework HA auto-discovery to reduce the number of items
- Fix HA_AUTODISCOVERY always being enabled
- Add production data retrieval
- Add SQLite database to store data and reduce the number of API calls

### [0.2] - 2021-09-25

- Helm chart
2 changes: 1 addition & 1 deletion VERSION
100644 → 100755
@@ -1 +1 @@
0.2
0.3
Binary file added app/__pycache__/addresses.cpython-39.pyc
Binary file added app/__pycache__/contract.cpython-39.pyc
Binary file added app/__pycache__/daily_consumption.cpython-39.pyc
Binary file added app/__pycache__/daily_production.cpython-39.pyc
Binary file added app/__pycache__/function.cpython-39.pyc
Binary file added app/__pycache__/home_assistant.cpython-39.pyc
Binary file added app/__pycache__/main.cpython-39.pyc
28 changes: 28 additions & 0 deletions app/addresses.py
@@ -0,0 +1,28 @@
import requests
import json
from dateutil.relativedelta import *

from importlib import import_module
main = import_module("main")
f = import_module("function")

def getAddresses(client):
    pdl = main.pdl
    url = main.url
    headers = main.headers

    data = {
        "type": "addresses",
        "usage_point_id": str(pdl),
    }
    addresses = requests.request("POST", url=f"{url}", headers=headers, data=json.dumps(data)).json()
    customer = addresses["customer"]
    f.publish(client, f"{pdl}/details/customer_id", str(customer["customer_id"]))
    for usage_points in customer['usage_points']:
        for usage_point_key, usage_point_data in usage_points['usage_point'].items():
            if isinstance(usage_point_data, dict):
                for usage_point_data_key, usage_point_data_data in usage_point_data.items():
                    f.publish(client, f"{pdl}/details/usage_points/usage_point/{usage_point_key}/{usage_point_data_key}",
                              str(usage_point_data_data))
            else:
                f.publish(client, f"{pdl}/details/usage_points/usage_point/{usage_point_key}", str(usage_point_data))
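The getAddresses function above flattens the nested usage_point structure into one MQTT topic per leaf value. That mapping can be isolated as below, with publishing replaced by collecting (topic, value) pairs and a sample payload that is invented for illustration:

```python
# Isolated version of the topic flattening done in getAddresses; the sample
# usage_point payload is invented, not real Enedis data.
def flatten_usage_point(pdl, usage_point):
    topics = {}
    for key, value in usage_point.items():
        if isinstance(value, dict):
            # nested dicts become one topic per sub-key
            for sub_key, sub_value in value.items():
                topics[f"{pdl}/details/usage_points/usage_point/{key}/{sub_key}"] = str(sub_value)
        else:
            topics[f"{pdl}/details/usage_points/usage_point/{key}"] = str(value)
    return topics

topics = flatten_usage_point("12345678901234", {
    "usage_point_id": "12345678901234",
    "usage_point_addresses": {"city": "Paris", "postal_code": "75001"},
})
```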
62 changes: 62 additions & 0 deletions app/contract.py
@@ -0,0 +1,62 @@
import requests
import json
from dateutil.relativedelta import *

from importlib import import_module
main = import_module("main")
f = import_module("function")

def getContract(client):
    pdl = main.pdl
    headers = main.headers
    url = main.url

    ha_discovery = {
        pdl: {}
    }

    data = {
        "type": "contracts",
        "usage_point_id": str(pdl),
    }
    contract = requests.request("POST", url=f"{url}", headers=headers, data=json.dumps(data)).json()
    if "customer" in contract:
        customer = contract["customer"]
        f.publish(client, f"{pdl}/details/customer_id", str(customer["customer_id"]))
        for usage_points in customer['usage_points']:
            for usage_point_key, usage_point_data in usage_points['usage_point'].items():
                f.publish(client, f"{pdl}/details/usage_points/usage_point/{usage_point_key}", str(usage_point_data))
            for contracts_key, contracts_data in usage_points['contracts'].items():
                f.publish(client, f"{pdl}/details/usage_points/contracts/{contracts_key}", str(contracts_data))
                if contracts_key == "last_activation_date":
                    f.publish(client, f"{pdl}/activation_date", str(contracts_data))
                if contracts_key == "subscribed_power":
                    f.publish(client, f"{pdl}/subscribed_power", str(contracts_data.split()[0]))
                    ha_discovery[pdl].update({
                        "subscribed_power": {
                            'value': str(contracts_data.split()[0])
                        }
                    })
                if contracts_key == "offpeak_hours":
                    offpeak_hours = contracts_data[contracts_data.find("(") + 1:contracts_data.find(")")].split(';')
                    index = 0
                    for oh in offpeak_hours:
                        f.publish(client, f"{pdl}/offpeak_hours/{index}/start", str(oh.split('-')[0]))
                        f.publish(client, f"{pdl}/offpeak_hours/{index}/stop", str(oh.split('-')[1]))
                        index = index + 1
                    f.publish(client, f"{pdl}/offpeak_hours", str(contracts_data))
                    ha_discovery[pdl].update({
                        "offpeak_hours": {
                            'value': str(contracts_data)
                        }
                    })
        retour = {
            "ha_discovery": ha_discovery,
            "last_activation_date": contracts_data
        }
    else:
        retour = {
            "error": True,
            "errorMsg": contract
        }
    return retour
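The offpeak_hours branch of getContract above slices the time ranges out of the contract string. In isolation, with a sample value invented for illustration:

```python
# The offpeak-hours parsing from getContract, isolated. The contract field is
# a string like "HC (22H30-6H30;12H30-14H00)" (sample value invented): slice
# out the part between parentheses, then split each "start-stop" range.
contracts_data = "HC (22H30-6H30;12H30-14H00)"
offpeak_hours = contracts_data[contracts_data.find("(") + 1:contracts_data.find(")")].split(';')
ranges = [(oh.split('-')[0], oh.split('-')[1]) for oh in offpeak_hours]
# ranges -> [("22H30", "6H30"), ("12H30", "14H00")]
```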
