openLuup: Data Historian

I knew someone would manage this one day. It just had to be you!

I would have to think hard about the effects of that, but at the very least, I think that devices in the wrong-numbered block will show the wrong node name. What duplicate device numbers do, I hate to think…

…time to tidy up your Vera device numbering.

…time to tidy up your Vera device numbering.
LOL.

I wish I could somehow reset the Vera device numbering. At one point the WorldWeather plugin went haywire and kept recreating its child devices. I did once change all the device numbers to over 10000, but when you add a device the Vera just keeps numbering up. I guess I need to bite the bullet on the openLuup side.

For the historian view, maybe you can test with and without the on-disk archiving and see if all bridged variables show a hyperlink.

Cheers Rene

[quote=“reneboer, post:86, topic:199464”]Correct,

Could it be because my first Vera has device numbers going over 10000, and thus shows with device IDs on the VeraBridge from 10000 to 23500? The second bridged Vera has the 20000 range. I guess I need to up that? However, there goes my DataMine data and scenes…

Cheers Rene[/quote]

Rene,
While wasting your time with my response, I’m also laughing at myself.
My devices are up to 93, but being an obsessive/compulsive type (read anal), I’m having heart palpitations because of the missing numbers in between.

I need to calm down now.

:-[

Chris
Karma for all you do here.

There’s a system attribute called Device_Num_Next. After backing up, you could try changing this in the user_data.json file, or even on the fly with luup.attr_set().

I don’t know what would happen if you ran into an existing device, but I imagine that you must have some fairly large gaps in the device numbering!


Edit: tried luup.attr_set(), but it didn’t work.
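Since luup.attr_set() didn’t take, the remaining route is the one mentioned above: back up, stop Luup, edit the attribute directly in user_data.json, and restart. A minimal sketch of the relevant top-level attribute (the value 94 here is purely illustrative, and all the other attributes normally present in the file are omitted):

```json
{
  "Device_Num_Next": 94
}
```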

Thanks, I’ll give it a shot. Indeed, all devices could fit in two digits.

Cheers Rene

Hi akbooer,

It looks simpler (and less risky) if I can change the offset in VeraBridge. Looking at the current code, this gets calculated again after each restart. Could it be made so that the initial offset is calculated at first install and then taken from the offset variable? This way you can also remove a VeraBridge without impacting the numbering of any remaining ones. I must admit I do not understand the role of the Primary Bridge (OFFSET == BLOCKSIZE).

Cheers Rene

That’s an interesting thought, and certainly would fix that particular problem. It may raise other issues - I’ll have to look closer at that.

I must admit I do not understand the role of the Primary Bridge (OFFSET == BLOCKSIZE).

Not sure I understand this? Are you asking why the first bridge has an offset at all? Clearly, native plugins and virtual devices start numbering in openLuup using low numbers, so any bridge variables have to be offset. The only concession made to the first bridge is that the Zwave controller gets mapped to device #1 so that some exotic plugins, which directly use the Zwave controller, work.

The only concession made to the first bridge is that the Zwave controller gets mapped to device #1 so that some exotic plugins, which directly use the Zwave controller, work.
No, I get why even the first bridge needs an offset so it won’t clash with the openLuup native devices. It was that bit of special logic I did not get the reason for. But that Zwave device explains it. Learning again :smiley:

Cheers Rene

An obvious way to skip a VeraBridge block of device numbers is to install another copy of the bridge. The blocks are currently assigned from lowest to highest VeraBridge plugin id, so you could, for example, use bridges 1 and 3, simply not mapping bridge 2 to a remote Vera at all.

Hi akbooer,

I fear it is a bit more tricky than that. Looking at the Historian files, I see that the devices from the first bridge with numbers over 10000 show up with just the last four digits, prefixed with the second bridge’s Vera number. I.e. device ID 13523 of Vera xxxx1711 shows on the History DB page as xxxx0073.3523.EnergyMetering1.Watts, where xxxx0073 is the Vera number of the second bridge. Maybe getting the device numbers on my Vera down is indeed the best option. Unless you want me to be your guinea pig for this scenario.
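To make the clash concrete, here’s some illustrative arithmetic (not actual VeraBridge or Historian code; the block size and offsets are the usual 10000 assumption from this thread):

```lua
-- Illustrative only: how a remote device number larger than BLOCKSIZE
-- spills into the next bridge's block of openLuup device numbers.
local BLOCKSIZE = 10000
local offset1   = 10000                    -- first bridge's block
local offset2   = 20000                    -- second bridge's block

local remote_id = 13523                    -- device on the first Vera
local local_id  = offset1 + remote_id      -- 23523 in openLuup

-- recovering the bridge block from the local number by integer division
-- now lands in the SECOND bridge's range, not the first's...
local block  = math.floor (local_id / BLOCKSIZE) * BLOCKSIZE   -- 20000
local device = local_id % BLOCKSIZE                            -- 3523
```

…which is exactly why device 13523 appears under the second bridge’s Vera number as .3523.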

Also, looking at the History DB page, I see links like 20m:30d, 3h:1y, 1d:10y. However, clicking them only produces empty graphs, also for the openLuup native devices. The same parameters do show a graph on the Historian page. Any idea? Or do I need to install a Grafana system or so?

Cheers Rene

Yes, I knew that something like that would happen with the Historian. What would be needed here is to identify children of bridge devices by their parent. I didn’t do that initially because it is harder - you have to search up the tree for grandparents, etc. (OK, not too hard.) However, these namespace/number conversions actually take place in both directions between three different naming conventions. I haven’t yet worked out whether this is actually possible to do…

Also, looking at the History DB page, I see links like 20m:30d, 3h:1y, 1d:10y. However, clicking them only produces empty graphs, also for the openLuup native devices. The same parameters do show a graph on the Historian page. Any idea? Or do I need to install a Grafana system or so?

That’s much more concerning. Do none of them work? Do those links show the number of points updated? There should be a counter.

That’s much more concerning. Do none of them work? Do those links show the number of points updated? There should be a counter.
Go figure, today there is data showing! It seems to have needed an extra Luup restart or so, as it has been recording from the time I updated ALTUI yesterday.

Cheers Rene

Well, that’s a relief!

Data Historian now supports mirroring to external Grafana Graphite and InfluxDB databases (as of Development branch v18.11.26).

Great! I’m going to try it on InfluxDB!

Does that also support custom date ranges (not only “… so far” ranges) in Grafana, do you know?

Another question, I’ve been logging my sensors with Historian in a Whisper database for quite some time now.

If I look at graphs in Grafana and look at larger timespans, I notice that sometimes a sensor logs an invalid value (like a way too high temperature or light measurement) which gives a big peak in the graph, flattening all the other measurements.

My question is twofold:

  • Is there a way to remove invalid data from the Whisper files? I’ve tried this with whisper-tools, but it gives me errors about unreadable metadata.
  • Is there a way to specify a range for the logged variables, thus ignoring the invalid values and preventing them from getting stored in the DB in the first place?

Thanks,

Joris

Not sure I fully understand. The date range as applied by the Grafana menu bar works, as do the per-graph duration and timeshift fields. I haven’t tried an explicit time interval in the metric query line on a graph, since I’m not really SQL-savvy.

Yes, this is, IMHO, a ZWave transmission problem. It’s particularly prevalent with some devices, notably, for me, energy meters. It is not an historian problem, per se. It is, however, a nuisance.

- is there a way to remove invalid data from the whisper files? I've tried this with whisper-tools, but it gives me errors about not readable metadata.

Yes, there is. And I use it about once a week to fix such problems. You can’t use the Whisper tools, since the database is not binary compatible. I have, however, written an openLuup CGI, originally for DataYours, which allows editing of the data. I’ve made a version which is tailored towards the Historian database. You exercise the basic editing functionality through an HTML form - I have a crude webpage which suffices, but I really would like to do better. However, I’m thinking that Graphite/InfluxDB mirroring means that you could use the standard tools instead.

- is there a way to specify a range for the logged variables, thus ignoring and eliminating the invalid values from getting stored in the DB in the first place?

I wrote a user-defined processing module for DataYours which would allow just this but, in the spirit of recording what’s actually received, rather than what you would like to be received, I haven’t implemented that for the Historian. There’s a philosophical discussion to be had there (one which has been going on here on this forum for at least five years in the context of the dataMine plugin; see: http://forum.micasaverde.com/index.php/topic,14692.0.html).
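For what it’s worth, the kind of range check such a processing module might apply can be sketched in a few lines of Lua (entirely hypothetical - the function name and limits here are invented for illustration, and this is not part of the Historian):

```lua
-- Hypothetical range filter: accept a sample only if it is numeric
-- and falls within plausible limits for the variable concerned.
local function in_range (value, lo, hi)
  local v = tonumber (value)
  return v ~= nil and v >= lo and v <= hi
end

-- e.g. reject a spurious spike from a temperature sensor
local ok    = in_range ("21.5",   -40, 60)   -- true
local spike = in_range ("3276.7", -40, 60)   -- false: would not be stored
```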

I’m open for good suggestions as to how to proceed.

So, to recap, there is a simple (and crude) solution, which I’m happy to share, if that addresses your need.

Thanks for your reply akbooer.

I’m sure it’s a Zwave transmission glitch or a sensor glitch (they are getting smaller and cheaper, so I think the ICs are prone to measurement errors once in a while).

If you could share the crude clean-up code that would be awesome. I’m going to test the InfluxDB route in the near future as well.

I’m looking at improving this massively, possibly adding it as an editable HTML table to the Console > History DB page, along with the graphic. However, for the moment here’s a file [tt]graphite-editor.lua[/tt] which you should put into [tt]cmh-ludl/cgi/[/tt].

I actually invoke this from a link on my Grafana pages:

http://openLuupIP:3480/cgi/graphite-editor.lua

This brings up an HTML page with three sections:

[ul][li]READ - a form with three fields and a “Read” button
[list]
[li]Target - the finder pattern of the metric you want to edit, e.g. openLuup.2*..Memo[/li]
[li]From - the start time, can be Graphite relative time (-1d, -3h) or ISO datetime (2018-11-28T16:45)[/li]
[li]Until - the stop time, same format (can also have the value ‘now’)[/li]
[/list]
[/li]
[li]Data field - initially blank, filled with the data from the above time interval once you press the Read button. This request goes directly to the Historian’s Graphite finder, so can include fully qualified metric names, or wildcards. It can, therefore, return data from multiple metrics, but try just to stick to one. The returned JSON data format is the standard Graphite render one.[/li]
[li]WRITE - a form with a single text field and a Write button[/li]
[list]
[li]POST content - here is where you put the revised data to write. The format is exactly the same as returned in the Data field, so the easiest thing by far is simply to cut and paste everything from the returned data to this field. Find and fix the incorrect data values, taking care not to screw up the JSON format or change any of the times. For this reason, it’s best to have homed in on the required part of the data by fine-tuning the time intervals on the read request (which you can repeat as many times as you like.) Once you press the Write button, the revised data is sent to the relevant archive file, and in the data area the points which have actually been changed are returned by way of verification.[/li]
[/list][/ul]

There are some subtleties. All archives older than the one retrieved and edited get updated using the correct aggregation function - this means that everything is taken care of for you, and you don’t need to do separate edits for each individual archive within the Whisper file, which will remain self-consistent. BUT, the question is “which archive did you retrieve?” The answer depends on the OLDEST time interval that you asked for, and on the definition of the file’s retentions, so you are best to set the From time relatively close to the time of the errant sample, and to do so as soon as you spot the error. It’s not a problem, though, because after a time all the younger archives will have been overwritten.
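To illustrate the “which archive?” question using the retentions quoted earlier (20m:30d, 3h:1y, 1d:10y), the way the age of the From time selects an archive can be sketched like this (illustrative only, not the actual Whisper/Historian code):

```lua
local DAY = 24 * 3600
local retentions = {                        -- {seconds per point, total period kept}
  {secs = 20 * 60,   period =   30 * DAY},  -- 20m:30d
  {secs =  3 * 3600, period =  365 * DAY},  -- 3h:1y
  {secs = 24 * 3600, period = 3650 * DAY},  -- 1d:10y
}

-- pick the highest-resolution archive whose retention reaches back far enough
local function archive_for (age)            -- age of the From time, in seconds
  for _, a in ipairs (retentions) do
    if age <= a.period then return a.secs end
  end
end

local a1 = archive_for ( 2 * DAY)   -- 1200:  the 20-minute archive serves this read
local a2 = archive_for (90 * DAY)   -- 10800: the 3-hour archive serves this one
```

So a From time of a couple of days ago edits the high-resolution archive, while a From time of months ago edits a coarser one - hence the advice to keep From close to the errant sample.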

So, as I warned you, it is very crude, but quite easy to use - the explanation takes far longer to read than it does to make an edit.