Prometheus export

Hi all,

In case it is useful to anyone, I wanted to share that I wrote an app that takes the data exported by the IHD (either cloud or local MQTT) and exposes it as Prometheus metrics.

You can find the source code here, and it's also available as a Docker container here. Hopefully it's pretty self-explanatory how to run it, and once it's up you just have to point Prometheus at it and use Grafana to create lots of lovely graphs 😀

Hope this is useful, and if you have any problems feel free to raise an issue on GitHub or just post here.

Cheers,

Andrew

Comments

  • I've got most of the way Andrew but am picking up this error on the container? I'm a novice so bear with me please.

    '<=' not supported between instances of 'str' and 'int'

  • Ah, that's interesting. Does it print out a full traceback, including line numbers? I need that to be able to work out where the problem is.

    Andrew
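As a hedged aside, this particular TypeError usually means a value that should be an int (such as a port number read from an environment variable) is still a string when it reaches a comparison. A minimal sketch with hypothetical names (not glowprom's actual code):

```python
import os

def check_port(port):
    # A typical range check; raises TypeError if port is still a str,
    # because str and int cannot be compared with <=.
    if port <= 0 or port > 65535:
        raise ValueError("port out of range")
    return port

os.environ["GLOWPROM_PORT"] = "9100"
raw = os.environ["GLOWPROM_PORT"]  # environment variables are always str

try:
    check_port(raw)
except TypeError as e:
    print(e)  # '<=' not supported between instances of 'str' and 'int'

check_port(int(raw))  # converting to int first avoids the error
```

The traceback's line number would show which comparison is being hit in glowprom itself.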

  • Not that I can see, I'll dig into the logs later and see if there's anything else. One question: without a topic, how does it know where to look on the local MQTT server? That's running on Home Assistant; Prometheus and your app are running on my QNAP NAS.

  • Thanks, I will see if I can work out how the error is being thrown and make it clearer.

    It just listens to everything under the glow/ topic, so it should pick everything up.

    Andrew
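For anyone unfamiliar with MQTT wildcards, "listening to everything under glow/" relies on topic-filter matching: + matches exactly one level and # matches any number of trailing levels. A small illustrative matcher (not glowprom's code) shows why one subscription covers every device's topics:

```python
# Sketch of MQTT topic-filter matching semantics:
# '+' matches exactly one level, '#' matches any remaining levels.
def topic_matches(filter_, topic):
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True            # '#' swallows the rest of the topic
        if i >= len(t_parts):
            return False
        if f != "+" and f != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)

# Subscribing to glow/# therefore picks up both sensor topics
# mentioned later in this thread, but not the cloud-style topic:
print(topic_matches("glow/#", "glow/XXXXXXXXXXXX/SENSOR/electricitymeter"))  # True
print(topic_matches("glow/#", "glow/XXXXXXXXXXXX/SENSOR/gasmeter"))          # True
print(topic_matches("glow/#", "SMART/HILD/XXXXXXXXXX"))                      # False
```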

  • Ta, is the bind variable in the format xx.xx.xx.xx:port? Also, does localhost work? I’ve had problems in the past between the two but usually just bridge the container to my LAN so it’s exposed. I’ll switch it to cloud later and see if it’s the container or my network topology.

  • Yes, bind is ipaddress:port. I've not tested localhost, but it should work.

  • Connected with result code 0 using Cloud, which I assume is normal? Nothing is showing under Prometheus though, so I assume I've got something wrong in the binding. Is this the address of the Prometheus server?

  • The bind address is where it exposes the metrics; you then need to add a line to the Prometheus configuration file to scrape it.

    You should be able to visit http://bindaddress:port/metrics in a web browser to confirm it's working.

    In your prometheus.yml file you can add something like:

    scrape_configs:
      - job_name: 'glowprom'
        static_configs:
          - targets: ['bindaddress:port']
    


  • I'm definitely making a hash of the networking, can't connect to anything! I've tried NAT, Bridge, various ports.

  • Sounds like you might have bigger Docker issues then, not specifically glowprom. You could try asking somewhere like Stack Overflow to see if they can help get Docker working.

    Andrew

  • I'll have a play tomorrow; Portainer is working along with Gluetun and qBittorrent. I think it's entirely a networking issue. Thanks Andrew.

  • We've made some progress. I removed the BIND option and let it default as per the code, then port forwarded 9100 behind the virtual switch. The webpage is showing now but /metrics is giving a 404 error. It's also connected to my MQTT server as seen in the logs.

  • That’s definitely good progress :-) I will have to test the bind parameter to make sure it works correctly.

    The metrics will return 404 until it receives some data. Are you able to confirm the Glow IHD is sending data to your mqtt server?

    Andrew
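A sketch of that 404-until-data behaviour (an assumption about the design, not glowprom's actual server code): the handler only serves /metrics once the MQTT side has stored something.

```python
# Illustrative only: /metrics returns 404 until at least one MQTT
# message has been seen, then serves the current metrics as text.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading

metrics = {}  # filled in by a (hypothetical) MQTT listener

class MetricsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/metrics" or not metrics:
            self.send_error(404)   # no data received yet -> 404
            return
        body = "".join(f"{k} {v}\n" for k, v in metrics.items()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def serve(port=0):
    # port=0 picks a free ephemeral port; the real exporter uses 9100.
    server = HTTPServer(("127.0.0.1", port), MetricsHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Once the listener stores something like `metrics["glowprom_import_watts"] = 350`, the endpoint starts answering 200 with that sample.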

  • Yep, getting a 404 error. The MQTT server receives from the cloud using an MQTT bridge. This could be something to do with storage on the NAS and permissions. That's my next task. I was getting the same error using cloud.

  • Ah, I hadn’t planned for that use case. The topic names and message formats are different for cloud and local mqtt modes. It won’t work with cloud format via a local server.

    If you configure it to point to the cloud directly it should work.
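To make the two modes concrete, the topics mentioned in this thread differ like this (an illustrative helper with hypothetical names; the actual message payload formats differ too):

```python
# Cloud mode:  SMART/HILD/<device-id>
# Local mode:  glow/<device-id>/SENSOR/electricitymeter (and .../gasmeter)
def detect_mode(topic):
    if topic.startswith("SMART/HILD/"):
        return "cloud"
    if topic.startswith("glow/") and "/SENSOR/" in topic:
        return "local"
    return "unknown"

print(detect_mode("SMART/HILD/XXXXXXXXXX"))                      # cloud
print(detect_mode("glow/XXXXXXXXXXXX/SENSOR/electricitymeter"))  # local
```

This is why cloud-format messages relayed through a local broker confuse a consumer expecting local-format topics.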

  • Quick question, it's probably me: in server.py, line 39, is self.wfile.write(""" formatted correctly, with no close brackets?

    I've tried local and cloud and I still get 404?

    That line is ok. Triple quotes delimit a multi-line string in Python; it's closed on line 46.

    If it's connecting to MQTT ok then it must be listening to the wrong topic. That's handled in mqtt.py line 47. You probably want to experiment with different topic names.

    Andrew
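For reference, a quick demonstration of why that line is valid Python: triple quotes start a multi-line string whose closing quotes can appear several lines further down.

```python
# The opening """ on one line is closed by a matching """ later on,
# so a line like wfile.write(""" is perfectly valid mid-expression.
header = """# HELP example_metric An example
# TYPE example_metric gauge
"""
print(header.count("\n"))  # 2: the string contains two complete lines
```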

  • Ta, told you, it was me.

    I have:

    SMART/HILD/XXXXXXXXXX

  • I think if you change DEFAULT_MQTT in arguments.py line 22 to your local server, and pass --topic SMART/HILD/+ it might work.

  • On local the device is at glow/XXXXXXXXXXXX/SENSOR/electricitymeter & glow/XXXXXXXXXXXX/SENSOR/gasmeter

  • That should match the default topic name; if you pass --mqtt serverip I would expect it to work.

  • Sorry Andrew, that topic is what I have for cloud, SMART/HILD/XXXXXXXXXX.

  • Cloud is working, thanks for all your help Andrew. Looks like I was chasing rainbows yesterday as cloud MQTT was down!

  • Local working too, looks like the bridge isn't an issue. I'll post a docker compose yaml at some point.

  • Yay! So glad it’s working! I was very confused :-)

  • My docker compose yaml, if anyone is interested. This works on QNAP using Portainer stacks. I had to manually import the image; also 'latest' didn't work for some reason.

    version: "3.9"

    services:
      glowprom:
        image: andrewjw/glowprom:0.5.0
        container_name: glowprom
        environment:
          - PGID=100
          - PUID=1000
          # Use below options for local MQTT (comment out whichever option is not needed)
          - GLOWPROM_MQTT=X.X.X.X # MQTT server address
          - GLOWPROM_PORT=1883 # or whatever local port is used
          - GLOWPROM_USER={mqtt local username}
          - GLOWPROM_PASSWD={mqtt local password}
          # Use below options for cloud MQTT (comment out whichever option is not needed)
          - GLOWPROM_USER={cloud username}
          - GLOWPROM_PASSWD={cloud password}
          - GLOWPROM_TOPIC=SMART/HILD/{glow device ID}
        ports:
          - 9100:9100 # Prometheus
        restart: unless-stopped

    Below is the end of the Prometheus yaml file; I added everything from - job_name: 'glowprom' down:

      static_configs:
        - targets: ["localhost:9090"]

    - job_name: 'glowprom'
      static_configs:
        - targets: ['localhost:9100']
