Reading the APsystems ECU local web data of a photovoltaic Grid Tie Inverter
My 2 PV panels are driven by an APsystems YC-500 Grid Tie Inverter (GTI) using an MPPT algorithm (see my previous post about the system), together with an ECU that gathers the data from the GTI over power-line communication (PLC).
If I connect to the local web server of the ECU at
http://192.168.1.130/index.php/realtimedata
I get the table below:
Inverter ID | Current Power | Grid Frequency | Grid Voltage | Temperature | Reporting Time |
---|---|---|---|---|---|
40400024xxxx | - | - | - | - | - |
40400024xxxx | - | - | - | - | - |
40400024xxxx | - | - | - | - | - |
40400012yyyy-A | 11 W | 50.1 Hz | 207 V | 18 °C | 2019-02-01 15:26:10 |
40400012yyyy-B | 10 W | | 207 V | | |
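Before writing a full scraper, it is worth checking that the page is reachable with a plain HTTP request and that the table is served as static HTML (no login or JavaScript needed, which is why the script below works with urllib and BeautifulSoup). A minimal sketch, assuming the same URL as above (adapt the IP address to your own ECU):

```python
import urllib.request

# Same realtime data URL as above; adjust the IP address to your own ECU
url = 'http://192.168.1.130/index.php/realtimedata'

# Fetch the page once and show the HTTP status plus the size of the HTML,
# just to confirm the ECU answers before trying to parse the table.
with urllib.request.urlopen(url, timeout=10) as response:
    status = response.status
    html = response.read().decode('utf-8', errors='replace')

print('HTTP status:', status, '-', len(html), 'bytes received')
print('table found' if 'table-condensed' in html else 'table not found')
```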
I am interested in extracting the power of the 2 PV panels and the data log time.
I wrote the following Python script to automatically extract the power of my 2 PV panels.
The ECU updates the data every 5 minutes.
    import time
    import json
    import re
    import urllib.request
    from bs4 import BeautifulSoup

    url = 'http://192.168.1.130/index.php/realtimedata'

    print("initialisation")

    # Poll the ECU at a fixed interval (in seconds).
    # Note that the ECU itself only refreshes its values every 5 minutes.
    fixed_interval = 30

    while 1:
        try:
            # current time and date
            datetime = time.strftime('%Y/%m/%d %H:%M:%S')

            # Get the info from the PV panels gateway (ECU)
            response = urllib.request.urlopen(url)
            html = response.read()
            soup = BeautifulSoup(html, 'html.parser')
            table = soup.find("table", attrs={"class": "table table-condensed table-bordered"})

            # The first tr contains the field names.
            #headings = [th.get_text().strip() for th in table.find("tr").find_all("th")]
            #print(headings)

            # Collect the cells of every data row
            datasets = []
            for row in table.find_all("tr")[1:]:
                #dataset = dict(zip(headings, (td.get_text() for td in row.find_all("td"))))
                dataset = [td.get_text() for td in row.find_all("td")]
                datasets.append(dataset)
                print(tuple(dataset))
            print("___________________________________\n")

            # Rows 0-2 are the other inverter IDs (no data); rows 3 and 4 are
            # the two channels (A and B) of the YC-500 driving my 2 PV panels.
            coldatetime = datasets[3][5]
            lastdatetime = (coldatetime[1:len(coldatetime) - 3]).strip()

            # Strip the surrounding spaces and the "W" unit to get integer watts
            W1 = datasets[3][1]
            W2 = datasets[4][1]
            W1int = int((W1[1:len(W1) - 2]).strip())
            W2int = int((W2[1:len(W2) - 2]).strip())
            Ps = W1int + W2int
            print(W1int, W2int, "=> total", Ps, "W\n")

            time.sleep(fixed_interval)
        except IOError:
            print('Error! Something went wrong.')
            time.sleep(fixed_interval)
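The string slicing used above (e.g. W1[1:len(W1)-2]) depends on the exact padding of the table cells. A slightly more tolerant variant is to pull the number out with a regular expression and to turn the reporting time into a real datetime object. A small sketch of that idea, using the same cell strings the script scrapes (the helper functions are my own, not part of the ECU page or the script above):

```python
import re
from datetime import datetime

def parse_power(cell):
    """Extract the integer watt value from a cell such as ' 16 W '."""
    match = re.search(r'(\d+)\s*W', cell)
    return int(match.group(1)) if match else None

def parse_report_time(cell):
    """Parse a reporting time cell such as ' 2019-02-01 15:26:10\n ' into a datetime."""
    return datetime.strptime(cell.strip(), '%Y-%m-%d %H:%M:%S')

print(parse_power(' 16 W '))                          # 16
print(parse_report_time(' 2019-02-01 15:26:10\n '))   # 2019-02-01 15:26:10
```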
The output of the script is:
    initialisation
    ('40400024xxxx ', '- ', '- ', '- ', '- ', ' -\n ')
    ('40400024xxxx ', '- ', '- ', '- ', '- ', ' -\n ')
    ('40400024xxxx ', '- ', '- ', '- ', '- ', ' -\n ')
    ('40400012yyyy-A ', ' 16 W ', ' 50.0 Hz ', ' 211 V ', ' 20 °C ', ' 2019-02-01 15:01:10\n ')
    ('40400012yyyy-B ', ' 14 W ', ' 211 V ')
    ___________________________________

    16 14 => total 30 W

    ('40400024xxxx ', '- ', '- ', '- ', '- ', ' -\n ')
    ('40400024xxxx ', '- ', '- ', '- ', '- ', ' -\n ')
    ('40400024xxxx ', '- ', '- ', '- ', '- ', ' -\n ')
    ('40400012yyyy-A ', ' 16 W ', ' 50.0 Hz ', ' 211 V ', ' 20 °C ', ' 2019-02-01 15:01:10\n ')
    ('40400012yyyy-B ', ' 14 W ', ' 211 V ')
    ___________________________________

    16 14 => total 30 W
As you can see, we are far from the 2 x 230 W because it is cloudy today ;-)
You can adapt it to record or send the data using a Raspberry Pi 3, as in the sketch below.
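For example, here is a minimal sketch of the "record" part (my own addition, with a hypothetical file name): it appends the timestamp and the two power values to a CSV file, which works fine on a Raspberry Pi 3. In the main loop you would call it right after Ps is computed:

```python
import csv
import time

LOGFILE = 'pv_power.csv'   # hypothetical file name, change as needed

def log_power(w1, w2, logfile=LOGFILE):
    """Append one CSV line: local timestamp, power of each channel and their sum (W)."""
    with open(logfile, 'a', newline='') as f:
        csv.writer(f).writerow([time.strftime('%Y/%m/%d %H:%M:%S'), w1, w2, w1 + w2])

# In the main loop, right after Ps = W1int + W2int:
#     log_power(W1int, W2int)
log_power(16, 14)   # standalone example with the values from the output above
```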
Hope it helps