Syslog (sort of!): Maker-API, Logstash, and Graylog

I went into this looking for syslog output from the hub, but tangling with Node, websockets, and the like didn't get me where I wanted to go. I wanted to share what I've learned and what's working for me.

First, my requirements: Docker containers, a syslog server that I already have, and limiting the number of new machines/nodes/computers/piles-of-code to maintain. Also, I'm only really working with the Aeotec Home Energy Meter Gen5 device for now.

I have had good success with Logstash, specifically the following pipeline definition, logstash.conf:

input {
  # Listen for HTTP POSTs from the Hubitat Maker API
  http {
    host => "0.0.0.0"
    port => 8080
  }
}

filter {
  # GELF wants a short_message field, so stamp one onto every event
  mutate {
    add_field => { "short_message" => "Hubitat Event" }
  }
}

output {
  # Forward everything to the Graylog GELF TCP input
  gelf {
    host => "192.168.1.199"
    id => "logstash-199"
    protocol => "TCP"
  }
}

I did try to use the Logstash websockets input plugin for the /logsocket URL but was not successful. Instead, I use the POST URL from the Maker API to post messages to the machine running this particular Logstash instance. I also specifically switched to the GELF output instead of syslog, mostly because I know I'm going to connect to Graylog. I also configured my Graylog input to accept raw TCP, and that worked straight from the Maker API too. But I know GELF pretty well, so rather than fiddle with raw TCP or HTTP on the Graylog side, I went with the GELF format.
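If you want to sanity-check the HTTP input before pointing the Maker API at it, a quick curl from anywhere on the LAN does the trick. This is just a sketch: the host and port match my setup above, and the JSON body only approximates the shape of a Maker API event, so treat the field names and values as illustrative.

# Fake a device event against the Logstash HTTP input
curl -X POST "http://192.168.1.199:8080/" \
  -H "Content-Type: application/json" \
  -d '{"content":{"name":"power","value":"1.567","displayName":"Home Energy Meter","deviceId":"42"}}'

If everything is wired up, the event shows up in Graylog a moment later with the "Hubitat Event" short_message added by the filter.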

I run it on the same machine I have Graylog running on:

sudo docker run --restart always -d -p 8080:8080 \
-v /var/logstash/pipeline/:/usr/share/logstash/pipeline/ \
--name logstash logstash/logstash-oss:7.8.1 \
sh -c "logstash-plugin install logstash-output-gelf ; /usr/local/bin/docker-entrypoint --config.reload.automatic"

Put the logstash.conf file in /var/logstash/pipeline and Bob's your uncle.
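If it doesn't seem to be picking up the pipeline, tailing the container output is the quickest check (plain Docker here, nothing specific to Logstash):

sudo docker logs -f logstash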

So this works for me because I've already got a dedicated syslog server for all the rest of my home infrastructure (NASes, Proxmox, Ubiquiti, etc.), plus the Graylog server(s) themselves with Filebeat. :slight_smile:

If you're familiar with Graylog: I've created an extractor to split and convert the content_value, because sometimes it's a float (1.567) and sometimes it's a string ("3.54 Days"). Graylog's grok and regex support is outstanding for this type of thing. I didn't do anything clever with Graylog: just running the Docker versions. I am persisting the mongo backend to a disk volume, but everything else is ephemeral.
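For the curious, the gist of it is a grok pattern applied to content_value, something along these lines (a sketch rather than my exact extractor; the capture names content_value_num and content_value_unit are just ones I picked for illustration):

%{NUMBER:content_value_num}(\s+%{GREEDYDATA:content_value_unit})?

Against "1.567" that yields content_value_num = 1.567, and against "3.54 Days" it yields content_value_num = 3.54 plus content_value_unit = Days, which keeps the numeric piece usable for aggregation.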

I've had some luck making the pretty pictures with Grafana (another tool I :heart:), but NB: the storage backend for Graylog is Elasticsearch, and it does work. ES just isn't a time-series database like InfluxDB or Prometheus, so the experience can be a bit funky. The Graylog dashboards aren't much to write home about, but they do work with aggregation and alerting, which is what I'm looking for now with power monitoring.


Have you messed around with the grafana/Loki log server?

I hadn't done so, but now I have! Those folks have my interests in mind, since I'm also running most of my things on Podman and Kubernetes, so pod monitoring is great with Loki. I'm still sifting through it all, and it seems like the Logstash side is pretty straightforward. I've moved most of my grok and field chunking into Graylog instead of Logstash and Ruby filters: no good reason, I just like being in an application for extractors, grok, and stream work. Six of one, half a dozen of the other.
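I haven't actually wired this up yet, but Grafana publishes a logstash-output-loki plugin, so on paper it's just a different output block in the same pipeline. A rough sketch, with the Loki URL as a placeholder for wherever your instance lives:

output {
  loki {
    # Loki push endpoint (placeholder host)
    url => "http://loki.example.local:3100/loki/api/v1/push"
  }
}

The plugin would get installed the same way I install the GELF one in the docker run command above, e.g. logstash-plugin install logstash-output-loki.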

I've been tangling with Helm for many years now, so I'm also really interested in their Tanka and groovy work. Interesting project, and I'll keep an eye on it. Thank you for the lead.

And I said groovy (device drivers on the brain) but meant json/yaml. :slight_smile:

@staylorx Would you mind sharing your extractors for Graylog?

So, is there a good log-shipping method out of this? Ideally I'd like to send this to Loki.
