added shit

Angoosh Leviocki 2024-07-07 10:08:19 +02:00
parent 3bdc429850
commit ef6e17b19d
Signed by: angoosh
GPG Key ID: 2DAE446D291BD8D3
12 changed files with 643 additions and 1 deletion


@ -1,3 +1,3 @@
 # Metrics_sobkovice
-Docker stuff and other stuff concerning metrics in Sobkovice
+Metrics stuff for Sobkovice server


@ -0,0 +1,7 @@
FROM python:3.8-alpine
ENV PYTHONUNBUFFERED=1
WORKDIR /app
COPY src/exporter.py .
RUN python -m pip install --upgrade pip
# asyncio is part of the standard library; installing the PyPI backport can shadow it
RUN pip install --root-user-action=ignore --upgrade goodwe aiohttp prometheus_client
ENTRYPOINT ["python", "exporter.py"]


@ -0,0 +1,165 @@
# goodwe-prometheus-exporter
Exporter for prometheus to export metrics from GoodWe Inverter
</br>
This exporter should work on the GoodWe ET, EH, BT, BH, ES, EM, BP, DT, MS, and D-NS families of inverters. It may work on other inverters as well, as long as they listen on UDP port 8899 and respond to one of the supported communication protocols.
The inverters communicate via UDP, by default on port 8899. They use a native 'AA55' protocol and (some models) the ModBus protocol. ET inverters support both protocols; other models may support only one of them.
(If you can't communicate with the inverter even though your model is listed above, you may have an old ARM firmware version. Ask manufacturer support to upgrade your ARM firmware (not just the inverter firmware) to enable UDP communication with the inverter.)
More info about the python goodwe library: https://github.com/marcelblijleven/goodwe
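As a side note, the probe frames used later in `scripts/inverter_scan.py` suggest how the native 'AA55' framing is built: each frame ends with a 16-bit additive checksum over the preceding bytes. A small sketch, inferred from those probe commands rather than from an official protocol reference:

```
def aa55_checksum(payload_hex: str) -> str:
    """Return the 16-bit big-endian sum of all payload bytes, hex-encoded."""
    payload = bytes.fromhex(payload_hex)
    return format(sum(payload) & 0xFFFF, '04x')

def build_aa55_command(payload_hex: str) -> str:
    """Append the additive checksum to an AA55 payload."""
    return payload_hex + aa55_checksum(payload_hex)

# reproduces the EM/ES probe frame used by scripts/inverter_scan.py
print(build_aa55_command("AA55C07F010200"))  # -> AA55C07F0102000241
```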
</br>
### Pre-requisites
1. Configured inverter:
Inverter must be connected to a 2.4GHz WiFi network (if it has a WiFi module).
If not, you can configure it as follows:</br>
1.1 Connect to the WiFi network called `Solar-Wifi`. The default login is `admin` and the password `admin`</br>
note: the default password is sometimes `12345678`</br>
1.2 Open your browser and go to http://10.10.100.253 or http://10.10.100.254</br>
1.3 Enter `admin` as username and `admin` as password</br>
1.4 Click `Start setup` and select your router's SSID (must be a 2.4GHz network) and enter its password</br>
1.5 Click `Complete` to finish the setup process</br>
</br>
2. Installed python (tested with python 3.8, 3.9, 3.10):
for Ubuntu:
```
sudo apt update
sudo apt install software-properties-common
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt-get install python3.8 python3.8-dev python3.8-distutils python3.8-venv
```
for RHEL/CentOS:
```
yum install python3.8
```
check:
```
python3.8 --version
```
</br>
3. Installed required python modules (`asyncio` is part of the standard library, so it doesn't need to be installed separately; `aiohttp` is needed by the exporter):
```
python -m pip install prometheus_client aiohttp goodwe
```
</br>
### How to get the IP Address of the inverter
*note: the inverter must be on the same network*
To get the IP address of the inverter, run:
```
python scripts/inverter_scan.py
```
you will see something like:</br>
`Located inverter at IP: 192.168.2.35, mac: 289C6E05xxxx, name: Solar-WiFi222W0782`
</br>
### How to get test data
Edit the file `scripts/get-inverter-data.py` and set the `ip_address` variable to the IP address of your inverter,
then run it with:
```
python scripts/get-inverter-data.py
```
and you should get all the data your inverter is exposing
</br></br>
## For standalone installation
check that you have:
- installed python
- installed goodwe modules
(see [Pre-requisites](https://github.com/gustonator/goodwe-prometheus-exporter#Pre-requisites) )
</br>
### Run/test
To test, start the exporter with minimal configuration:
```
python src/exporter.py --port <desired port> --interval <interval (s)> --inverter <inverterIP>
ie.
python src/exporter.py --port 8787 --interval 30 --inverter 192.168.2.35
```
(for more settings, see [Supported parameters](https://github.com/gustonator/goodwe-prometheus-exporter#supported-parameters))
</br>
Now you can call it via curl from another terminal to see if it exports metrics:
```
curl http://127.0.0.1:8787
```
</br>
To show help, run the script with the `--help` parameter:
```
python src/exporter.py --help
```
</br>
If everything is OK, you can set up the script as a service:
For Ubuntu:
<documentation for debian system will follow>
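Until that documentation lands, a minimal systemd unit along these lines should work on Debian/Ubuntu (the install path `/opt/goodwe-prometheus-exporter`, the `python3.8` binary, and the `goodwe` user are assumptions; adjust them to your setup):

```
[Unit]
Description=GoodWe Prometheus exporter
Wants=network-online.target
After=network-online.target

[Service]
# path and interpreter are examples; point them at your checkout
ExecStart=/usr/bin/python3.8 /opt/goodwe-prometheus-exporter/src/exporter.py --port 8787 --interval 30 --inverter 192.168.2.35
Restart=on-failure
User=goodwe

[Install]
WantedBy=multi-user.target
```

Save it e.g. as `/etc/systemd/system/goodwe-exporter.service`, then run `sudo systemctl daemon-reload && sudo systemctl enable --now goodwe-exporter`.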
</br>
## For Docker Installation
check that you have:
- installed python (needed for the script that gets the IP address) (see [Pre-requisites](https://github.com/gustonator/goodwe-prometheus-exporter#Pre-requisites))
- installed docker compose ([Docker compose installation](https://docs.docker.com/compose/install/))
</br>
### Install/Run
1. Edit the docker-compose.yml file and set the correct inverter IP (other values are optional).
- To get the IP address, see the section "How to get the IP Address of the inverter".
2. From the command line, run:
```
docker compose up -d
```
</br>
### Check
3. Get the IP address of the container:
```
docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' goodwe-exporter
```
</br>
4. Check via curl that the exporter serves metrics, using the IP address from step 3:
```
curl http://<IP>:8787
```
</br>
### Supported parameters
`--inverter <inverterIP>` - [required] IP address of the inverter. To get it, see section [How to get the inverter IP address](https://github.com/gustonator/goodwe-prometheus-exporter#how-to-get-the-ip-address-of-the-inverter).</br>
`--port <desired port>` - [optional][default: 8787] port on which the exporter exposes the metrics</br>
`--interval <seconds>` - [optional][default: 30] interval between scrapes (in seconds)</br>
`--energy-price <value>` - [optional][default: 0] energy price per kWh (in EUR). If `--scrape-spot-price` is set to True, the `--energy-price` value is ignored</br>
`--PVpower <value>` - [optional][default: 5670] maximum power in watts your PV can generate (e.g. 5670 = 5.67 kW)</br>
`--scrape-spot-price <bool>` - [optional][default: False] True/False; whether the exporter should scrape the spot price from https://www.ote-cr.cz. If set to True, the exporter uses the scraped spot price as the energy price (`--energy-price` is ignored)</br>
`--spot-scrape-interval <minutes>` - [optional][default: 30] scrape interval of spot prices (in minutes). If you set it too low, ote-cr.cz will block your requests</br></br>
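As an illustration of what `--scrape-spot-price` does under the hood: the exporter queries the OTE public SOAP service and converts the day-ahead price for the current hour from EUR/MWh to EUR/kWh. A simplified, self-contained sketch of that parsing step (the XML below is a hand-made stand-in for a real OTE response):

```
import xml.etree.ElementTree as ET
from decimal import Decimal

NS = '{http://www.ote-cr.cz/schema/service/public}'

def price_for_hour(xml_response: str, hour: int) -> Decimal:
    """Pick the day-ahead price for `hour` (0-23) and convert EUR/MWh -> EUR/kWh.

    OTE numbers hours 1-24, hence the `- 1` when matching."""
    root = ET.fromstring(xml_response)
    for item in root.iter(NS + 'Item'):
        if int(item.find(NS + 'Hour').text) - 1 == hour:
            return Decimal(item.find(NS + 'Price').text) / Decimal(1000)
    raise ValueError('no price for hour %d' % hour)

# hand-made stand-in for an OTE response (two hourly items, prices in EUR/MWh)
sample = """<Result xmlns="http://www.ote-cr.cz/schema/service/public">
  <Item><Hour>1</Hour><Price>95.50</Price></Item>
  <Item><Hour>2</Hour><Price>102.00</Price></Item>
</Result>"""

print(price_for_hour(sample, 0))  # -> 0.0955
```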


@ -0,0 +1 @@
docker build -t gitea.angoosh.com/angoosh/goodwe-prometheus-exporter:latest .


@ -0,0 +1,24 @@
version: '3.8'
services:
  goodwe-exporter:
    container_name: goodwe-exporter
    build: .
    ports:
      - 8787:8787
    command:
      - "--port=8787"
      - "--interval=30"
      - "--inverter=192.168.88.26"
      - "--energy-price=4.5"
      - "--PVpower=10000"
      - "--scrape-spot-price=False"
    networks:
      - internal
    restart: unless-stopped

networks:
  internal:
    external: false
    driver: bridge


@ -0,0 +1 @@


@ -0,0 +1 @@
docker push gitea.angoosh.com/angoosh/goodwe-prometheus-exporter:latest


@ -0,0 +1,17 @@
import asyncio

import goodwe


async def get_runtime_data():
    #ip_address = '2a00:1028:c000:fdb:730b:1770:8783:6eac'
    ip_address = '192.168.2.35'

    inverter = await goodwe.connect(ip_address)
    runtime_data = await inverter.read_runtime_data()

    for sensor in inverter.sensors():
        if sensor.id_ in runtime_data:
            print(f"{sensor.id_}: \t\t {sensor.name} = {runtime_data[sensor.id_]} {sensor.unit}")


asyncio.run(get_runtime_data())


@ -0,0 +1,66 @@
"""Simple test script to scan inverter present on local network"""
import asyncio
import binascii
import logging
import sys
import goodwe
from goodwe.exceptions import InverterError
from goodwe.protocol import ProtocolCommand
logging.basicConfig(
format="%(asctime)-15s %(funcName)s(%(lineno)d) - %(levelname)s: %(message)s",
stream=sys.stderr,
level=getattr(logging, "INFO", None),
)
def try_command(command, ip):
print(f"Trying command: {command}")
try:
response = asyncio.run(ProtocolCommand(bytes.fromhex(command), lambda x: True).execute(result[0], timeout=2, retries=0))
print(f"Response to {command} command: {response.hex()}")
except InverterError as err:
print(f"No response to {command} command")
def omnik_command(logger_sn):
# frame = (headCode) + (dataFieldLength) + (contrlCode) + (sn) + (sn) + (command) + (checksum) + (endCode)
frame_hdr = binascii.unhexlify('680241b1') # from SolarMan / new Omnik app
command = binascii.unhexlify('0100')
defchk = binascii.unhexlify('87')
endCode = binascii.unhexlify('16')
# tar = bytearray.fromhex(hex(logger_sn)[8:10] + hex(logger_sn)[6:8] + hex(logger_sn)[4:6] + hex(logger_sn)[2:4])
# frame = bytearray(frame_hdr + tar + tar + command + defchk + endCode)
frame = bytearray(frame_hdr + binascii.unhexlify(logger_sn) + command + defchk + endCode)
checksum = 0
frame_bytes = bytearray(frame)
for i in range(1, len(frame_bytes) - 2, 1):
checksum += frame_bytes[i] & 255
frame_bytes[len(frame_bytes) - 2] = int((checksum & 255))
return frame_bytes.hex()
result = asyncio.run(goodwe.search_inverters()).decode("utf-8").split(",")
print(f"Located inverter at IP: {result[0]}, mac: {result[1]}, name: {result[2]}")
# EM/ES
try_command("AA55C07F0102000241", result[0])
# DT (SolarGo)
try_command("7F03753100280409", result[0])
# Omnik v5 ?
try_command("197d0001000dff045e50303036564657f6e60d", result[0])
# Omnik 4 ?
sn = bytes(result[2][10:], 'utf-8').hex()
try_command(omnik_command(sn), result[0])
# Omnik 4 reversed ?
sn = "".join(reversed([sn[i:i + 2] for i in range(0, len(sn), 2)]))
try_command(omnik_command(sn), result[0])
print(f"\n\nIdentifying inverter at IP: {result[0]}, name: {result[2]} mac: {result[1]}")
inverter = asyncio.run(goodwe.discover(result[0]))
print(
f"Identified inverter model: {inverter.model_name}, serialNr: {inverter.serial_number}"
)


@ -0,0 +1,250 @@
from prometheus_client import CollectorRegistry, Gauge, Counter, Info
from datetime import date, datetime, timedelta
from decimal import Decimal
import prometheus_client as prometheus
import xml.etree.ElementTree as ET
import traceback
import logging
import sys
import getopt
import time
import asyncio
import aiohttp
import goodwe

#logger = logging.getLogger(__name__)

print("\nGOODWE DATA EXPORTER v1.4.3\n")

QUERY = '''<?xml version="1.0" encoding="UTF-8" ?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:pub="http://www.ote-cr.cz/schema/service/public">
    <soapenv:Header/>
    <soapenv:Body>
        <pub:GetDamPriceE>
            <pub:StartDate>{start}</pub:StartDate>
            <pub:EndDate>{end}</pub:EndDate>
            <pub:InEur>{in_eur}</pub:InEur>
        </pub:GetDamPriceE>
    </soapenv:Body>
</soapenv:Envelope>
'''


class OTEFault(Exception):
    pass


class InvalidFormat(OTEFault):
    pass


def checkArgs(argv):
    global EXPORTER_PORT
    global POLLING_INTERVAL
    global INVERTER_IP
    global ENERGY_PRICE
    global PV_POWER
    global SCRAPE_SPOT_PRICE
    global SPOT_SCRAPE_INTERVAL
    global LAST_SPOT_UPDATE

    # set default values
    EXPORTER_PORT = 8787
    POLLING_INTERVAL = 30
    ENERGY_PRICE = 0.20
    PV_POWER = 5670
    INVERTER_IP = ""
    SCRAPE_SPOT_PRICE = False
    SPOT_SCRAPE_INTERVAL = timedelta(minutes=30)
    LAST_SPOT_UPDATE = datetime.now() - SPOT_SCRAPE_INTERVAL

    # help
    arg_help = (
        "\nREQUIRED PARAMETERS:\n"
        "\t-i, --inverter\n\t\tIP address of the inverter\n\n"
        "OPTIONAL PARAMETERS:\n"
        "\t-h, --help\n\t\tShows this menu\n"
        "\t-p, --port\n\t\texporter port - on which port the exporter should expose data [default: 8787]\n"
        "\t-t, --interval\n\t\tscrape interval (in seconds) [default: 30]\n"
        "\t-e, --energy-price\n\t\tprice per kWh in EUR [default: 0.20]\n"
        "\t-w, --PVpower\n\t\tmaximum power in watts your PV can produce [default: 5670]\n"
        "\t-s, --scrape-spot-price\n\t\t[True/False] set to True to scrape spot prices from www.ote-cr.cz [default: False]\n"
        "\t-x, --spot-scrape-interval\n\t\tscrape interval of spot prices in minutes; if set too low, ote-cr.cz will block your requests [default: 30]"
    )

    try:
        opts, args = getopt.getopt(argv[1:], "hp:t:i:e:w:s:x:", ["help", "port=", "interval=", "inverter=", "energy-price=", "PVpower=", "scrape-spot-price=", "spot-scrape-interval="])
    except getopt.GetoptError:
        print(arg_help)
        sys.exit(2)

    for opt, arg in opts:
        if opt in ("-h", "--help"):
            print(arg_help)
            sys.exit(2)
        elif opt in ("-p", "--port"):
            EXPORTER_PORT = arg
        elif opt in ("-t", "--interval"):
            POLLING_INTERVAL = arg
        elif opt in ("-i", "--inverter"):
            INVERTER_IP = arg
        elif opt in ("-e", "--energy-price"):
            ENERGY_PRICE = arg
        elif opt in ("-w", "--PVpower"):
            PV_POWER = arg
        elif opt in ("-s", "--scrape-spot-price"):
            # honour the passed value instead of enabling on mere presence,
            # so "--scrape-spot-price=False" really disables scraping
            SCRAPE_SPOT_PRICE = str(arg).strip().lower() == "true"
        elif opt in ("-x", "--spot-scrape-interval"):
            # define time for spot price scrape interval
            SPOT_SCRAPE_INTERVAL = timedelta(minutes=int(arg))
            # subtract the defined interval so the first loop always scrapes
            LAST_SPOT_UPDATE = datetime.now() - SPOT_SCRAPE_INTERVAL

    # check if inverter IP is set
    if not INVERTER_IP:
        print("ERROR: missing IP address of inverter!\n")
        print(arg_help)
        sys.exit(2)


class InverterMetrics:
    ELECTRICITY_PRICE_URL = 'https://www.ote-cr.cz/services/PublicDataService'

    def __init__(self, POLLING_INTERVAL, ENERGY_PRICE, PV_POWER, SCRAPE_SPOT_PRICE, SPOT_SCRAPE_INTERVAL, LAST_SPOT_UPDATE):
        self.POLLING_INTERVAL = POLLING_INTERVAL
        self.ENERGY_PRICE = ENERGY_PRICE
        self.PV_POWER = PV_POWER
        self.SCRAPE_SPOT_PRICE = SCRAPE_SPOT_PRICE
        self.SPOT_SCRAPE_INTERVAL = SPOT_SCRAPE_INTERVAL
        self.LAST_SPOT_UPDATE = LAST_SPOT_UPDATE
        self.metricsCount = 0
        self.g = []
        self.i = []

    # build the query - fill in the variables
    def get_query(self, start: date, end: date, in_eur: bool) -> str:
        return QUERY.format(start=start.isoformat(), end=end.isoformat(), in_eur='true' if in_eur else 'false')

    # download data from the OTE web service
    async def _download(self, query: str) -> str:
        try:
            async with aiohttp.ClientSession() as session:
                # initial GET obtains session cookies before the SOAP POST
                async with session.get('https://www.ote-cr.cz'):
                    pass
                async with session.post(self.ELECTRICITY_PRICE_URL, data=query) as response:
                    return await response.text()
        except aiohttp.ClientConnectorError as e:
            print(f"Connection error occurred: {e}")

    def parse_spot_data(self, xmlResponse):
        root = ET.fromstring(xmlResponse)
        current_hour = datetime.now().hour
        for item in root.findall('.//{http://www.ote-cr.cz/schema/service/public}Item'):
            hour_el = item.find('{http://www.ote-cr.cz/schema/service/public}Hour')
            price_el = item.find('{http://www.ote-cr.cz/schema/service/public}Price')
            # OTE numbers hours 1-24
            if (int(hour_el.text) - 1) == current_hour:
                price = Decimal(price_el.text)
                price /= Decimal(1000)  # convert EUR/MWh -> EUR/kWh
                return price

    # create placeholders for the metrics in the register
    def collector_register(self):
        async def create_collector_registers():
            inverter = await goodwe.connect(INVERTER_IP)
            runtime_data = await inverter.read_runtime_data()
            for sensor in inverter.sensors():
                if sensor.id_ in runtime_data and isinstance(runtime_data[sensor.id_], (int, float)):
                    self.g.append(Gauge(sensor.id_, sensor.name))
                elif sensor.id_ in runtime_data and sensor.id_ != "timestamp":
                    self.i.append(Info(sensor.id_, sensor.name))
            # additional metric: energy price
            self.g.append(Gauge("energy_price", "Energy price per kWh"))
            # additional metric: PV power
            self.g.append(Gauge("pv_total_power", "Total power in watts that can be produced by the PV"))
        asyncio.run(create_collector_registers())

    # scrape loop
    def run_metrics_loop(self):
        self.collector_register()
        while True:
            self.fetch_data()
            time.sleep(self.POLLING_INTERVAL)

    # scrape the metrics and write them to the prepared metrics register
    def fetch_data(self):
        self.metricsCount = 0

        # get spot prices
        if self.SCRAPE_SPOT_PRICE:
            now = datetime.now()
            # if the last spot price update is older than the interval, scrape again
            if now - self.LAST_SPOT_UPDATE > self.SPOT_SCRAPE_INTERVAL:
                query = self.get_query(date.today(), date.today(), in_eur=True)
                xmlResponse = asyncio.run(self._download(query))
                self.ENERGY_PRICE = self.parse_spot_data(xmlResponse)
                self.LAST_SPOT_UPDATE = now

        async def fetch_inverter():
            inverter = await goodwe.connect(INVERTER_IP)
            runtime_data = await inverter.read_runtime_data()
            countID = 0
            for sensor in inverter.sensors():
                # same condition as in collector_register(), so the gauge indices line up
                if sensor.id_ in runtime_data and isinstance(runtime_data[sensor.id_], (int, float)):
                    self.g[countID].set(float(runtime_data[sensor.id_]))
                    countID += 1
            # set values for the additional metrics
            self.g[countID].set(float(self.ENERGY_PRICE))
            self.g[countID + 1].set(float(self.PV_POWER))
            self.metricsCount = len(self.g)
        asyncio.run(fetch_inverter())

        # print the number of metrics and the scrape time
        print('-------------------------------------------------------')
        if self.SCRAPE_SPOT_PRICE:
            print("energy price (spot):\t\t" + str(self.ENERGY_PRICE) + " EUR/kWh")
            print("last spot price scrape:\t\t" + str(self.LAST_SPOT_UPDATE))
        else:
            print("energy price (fixed):\t\t" + str(self.ENERGY_PRICE) + " EUR/kWh")
        print("number of metrics:\t\t" + str(self.metricsCount))
        print("last scrape:\t\t\t" + str(datetime.now().strftime("%d.%m.%Y %H:%M:%S")))


def main():
    try:
        # set up logging
        logging.basicConfig(filename='exporter.log', level=logging.WARNING, format='%(asctime)s %(name)-14s %(levelname)-10s %(message)s', filemode='a')
        checkArgs(sys.argv)
        print("polling interval:\t\t" + str(POLLING_INTERVAL) + "s")
        print("inverter scrape IP:\t\t" + str(INVERTER_IP))
        print("total PV power: \t\t" + str(PV_POWER) + "W")
        if SCRAPE_SPOT_PRICE:
            print("spot price scrape: \t\tEnabled")
            print("spot price scrape interval: \t" + str(SPOT_SCRAPE_INTERVAL))
        else:
            print("spot price scrape: \t\tDisabled")
            print("fixed energy price: \t\t" + str(ENERGY_PRICE) + " EUR/kWh")

        inverter_metrics = InverterMetrics(
            POLLING_INTERVAL=int(POLLING_INTERVAL),
            ENERGY_PRICE=ENERGY_PRICE,
            PV_POWER=PV_POWER,
            SCRAPE_SPOT_PRICE=SCRAPE_SPOT_PRICE,
            SPOT_SCRAPE_INTERVAL=SPOT_SCRAPE_INTERVAL,
            LAST_SPOT_UPDATE=LAST_SPOT_UPDATE,
        )

        # start the server to expose the metrics
        prometheus.start_http_server(int(EXPORTER_PORT))
        print("exporter started on port:\t" + str(EXPORTER_PORT) + "\n")
        inverter_metrics.run_metrics_loop()
    except KeyboardInterrupt:
        key_message = 'Manually interrupted by keyboard'
        print("\n" + key_message + "\n")
        logging.warning(key_message)
    except Exception as e:
        logging.error("An error occurred: %s", e)
        traceback.print_exc()


if __name__ == "__main__":
    main()


@ -0,0 +1,79 @@
services:
  goodwe-exporter-garaz:
    image: "gitea.angoosh.com/angoosh/goodwe-prometheus-exporter:latest"
    restart: always
    ports:
      - 8787:8787
    command:
      - "--port=8787"
      - "--interval=30"
      - "--inverter=192.168.88.26"
      - "--energy-price=4.5"
      - "--PVpower=10000"
      - "--scrape-spot-price=False"
    networks:
      - grafana

  goodwe-exporter-bouda:
    image: "gitea.angoosh.com/angoosh/goodwe-prometheus-exporter:latest"
    restart: always
    ports:
      - 8788:8787
    command:
      - "--port=8787"
      - "--interval=30"
      - "--inverter=192.168.88.14"
      - "--energy-price=4.5"
      - "--PVpower=10000"
      - "--scrape-spot-price=False"
    networks:
      - grafana

  prometheus:
    image: prom/prometheus:v2.33.5
    restart: always
    volumes:
      - ./prometheus.yml:/etc/prometheus/prometheus.yml:ro
      - prometheus-data:/prometheus
    ports:
      - 9090:9090
    command:
      - "--config.file=/etc/prometheus/prometheus.yml"
      - "--storage.tsdb.path=/prometheus"
      - '--storage.tsdb.retention.time=1y'
      - "--web.console.libraries=/usr/share/prometheus/console_libraries"
      - "--web.console.templates=/usr/share/prometheus/consoles"
      - "--query.lookback-delta=40m"
    networks:
      - grafana

  grafana:
    image: grafana/grafana-oss:9.5.20
    restart: always
    volumes:
      - grafana-cfg:/etc/grafana
      - grafana-data:/var/lib/grafana
    environment:
      GF_FEATURE_TOGGLES_PUBLICDASHBOARDS: "true"
    ports:
      - 3001:3000
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.grafana.rule=Host(`grafana.angoosh.com`)"
      - "traefik.http.routers.grafana.entrypoints=websecure"
      - "traefik.http.routers.grafana.tls.certresolver=letsencrypt"
      - "traefik.http.services.grafana.loadbalancer.server.port=3000"
    networks:
      - grafana
      - default

volumes:
  prometheus-data:
  grafana-cfg:
  grafana-data:

networks:
  default:
    name: gateway
    external: true
  grafana:

grafana/prometheus.yml (new file, 31 lines)

@ -0,0 +1,31 @@
global:
  scrape_interval: 15s # By default, scrape targets every 15 seconds.
  # scrape_timeout: 15s
  # evaluation_interval: 1m

  # Attach these labels to any time series or alerts when communicating with
  # external systems (federation, remote storage, Alertmanager).
  # external_labels:
  #   monitor: 'PV'

scrape_configs:
  # The job name is added as a label `job=<job_name>` to any timeseries scraped from this config.
  - job_name: 'prometheus'
    # scrape_interval: 5s
    static_configs:
      - targets: ['localhost:9090']

  - job_name: 'PV-garaz'
    scrape_interval: 5s
    static_configs:
      - targets: ['goodwe-exporter-garaz:8787']
        labels:
          group: 'PV'

  - job_name: 'PV-bouda'
    scrape_interval: 5s
    static_configs:
      # inside the docker network the exporter listens on its container port 8787
      # (8788 is only the published host port)
      - targets: ['goodwe-exporter-bouda:8787']
        labels:
          group: 'PV'