DataMine 2.0

[quote=“ConstantSphere, post:37, topic:190421”]@A.Ardon - I can see from the logs that there is an error reading alpha data for channel 40 that isn’t configured as an alpha file.

I also suspect that error message reporting isn’t working properly if it is just saying “handler failed”. When working locally to your Vera, I’d recommend accessing the dataMine2 web interface via http://vera-ip/dm rather than the port_3480 way.[/quote]

@ConstantSphere,
last night I found out that channel 40 was not working, and that deleting the data for channel 40 solved the problem.
So I deleted channel 40 and added it back through the configuration panel. I did not use the ‘Y-Axis lookup’ at all, but there could be a problem there that I couldn’t spot.

I have changed my bookmarks to the '/dm/' directory. Now it works fine as always.

@A.Ardon - great - pleased to see you are up and running! Deleting the data and restarting would certainly do the job; I presume you didn’t have a lot of data to lose.

[quote=“jonator, post:40, topic:190421”]Ok then :slight_smile:
I am thinking about the disk space it will take up if logging every 5 seconds. But maybe there are no worries if you have a 16 GB stick anyway? I don’t remember the size of my rrd database from my old 1-Wire setup when it logged every 2.5 minutes…[/quote]

dataMine stores around 16 bytes per sample, which is about 100 MB a year at a 5-second sample rate. Of more concern to me would be the rate at which you can read the data back, as my Vera box can only aggregate around 8,000 samples a second. That’s a maximum of about 5 days of data on a single chart before it times out.
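As a quick sanity check on those numbers, here is a rough Python sketch of the arithmetic. The 16 bytes/sample and 8,000 samples/second figures are the ones above; the ~10 second request timeout is only an assumption on my part.

[code]
# Back-of-the-envelope check of the figures above.
BYTES_PER_SAMPLE = 16        # from the post above
SAMPLE_INTERVAL_S = 5        # logging every 5 seconds
AGGREGATION_RATE = 8000      # samples/second a Vera can aggregate (figure above)
REQUEST_TIMEOUT_S = 10       # assumed chart request timeout (hypothetical)

samples_per_year = 365 * 24 * 3600 // SAMPLE_INTERVAL_S
print(samples_per_year * BYTES_PER_SAMPLE / 1e6)   # ~101 MB per year

max_samples = AGGREGATION_RATE * REQUEST_TIMEOUT_S
print(max_samples * SAMPLE_INTERVAL_S / 86400)     # ~4.6 days of data per chart
[/code]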

Can I ask what the device is? And can you not configure the polling frequency (Poll this node at most once every x seconds) under the device settings?

*edit - corrected the incorrectly stated processing speed*

[quote=“ConstantSphere, post:43, topic:190421”][quote=“jonator, post:40, topic:190421”]Ok then :slight_smile:
I am thinking about the disk space it will take up if logging every 5 seconds. But maybe there are no worries if you have a 16 GB stick anyway? I don’t remember the size of my rrd database from my old 1-Wire setup when it logged every 2.5 minutes…[/quote]

dataMine stores around 16 bytes per sample which is about 100MB a year at a 5 second sample rate. Of more concern to me would be the rate at which you could read the data back as my Vera box can only aggregate around 6,000 samples a second. That’s a maximum of 4 days of data on a single chart before it times out.

Can I ask what the device is? And can you not configure the polling frequency (Poll this node at most once every x seconds) under the device settings?[/quote]

OK, that’s not much, but the other limit, as you say, might cause some issues using the plugin.

I am using a Fibaro Universal Sensor with some DS18B20 sensors.

I can see that parameters 10, 11 and 12 for the Universal Sensor might be useful:

10: Interval between successive readings of temperature from all sensors connected to the device. Default 20 seconds. (0-255)
11: Interval between forced reports of the temperature conditions. The forced report is sent immediately after the next reading of temperature from the sensor, irrespective of the settings of parameter no. 12. Default 200 s.
12: Sensitivity to temperature changes. This is the maximum acceptable difference between the last reported temperature and the current temperature taken from the sensor. If the temperatures differ by the set value or more, then a report with the current temperature value is sent to the device assigned to association group no. 3. Intervals between taking readings from the sensors are specified by parameter no. 10. Default 8 [0.5 °C].

But I have checked the default settings above, and since dataMine is still logging a value every 5 seconds, they most probably don’t affect this.
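Just to put those defaults in context, here is a rough Python sketch of why they shouldn’t, on their own, produce a new value every 5 seconds. The numbers are simply the defaults quoted above; nothing else is implied about what the device is actually doing.

[code]
# Defaults quoted above for Fibaro Universal Sensor parameters 10-12.
READ_INTERVAL_S = 20          # param 10: temperature read every 20 s
FORCED_REPORT_S = 200         # param 11: forced report roughly every 200 s
CHANGE_THRESHOLD_C = 8 * 0.5  # param 12: report on a change of 4.0 degrees or more

# With these defaults an unsolicited report only happens on a 4-degree swing
# between reads, or at the ~200 s forced report - so a new value every 5 s
# can't come from the defaults alone.
print(READ_INTERVAL_S, FORCED_REPORT_S, CHANGE_THRESHOLD_C)
[/code]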

@jonator could you PM me one of your dataMine log files for the device (e.g. /dataMine/database/xx/raw/2401.txt) as I’d be interested to take a closer look. It doesn’t make much sense to me that your room temperature should be changing by more than 0.5 degrees every 5 seconds! To find out what xx should be, plot a chart of the variable concerned and then look in the debug log for something like “Finished reading files - no more data available/dataMine/database/1/raw/2402.txt”
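If it helps, here is a minimal Python sketch that pulls those raw file paths out of a saved copy of the debug log. The local file name is hypothetical, and I’m assuming the log lines contain paths in the same format as the example above.

[code]
import re

# Pull the raw data file paths out of a saved copy of the dataMine debug log
# (assumes the log lines contain paths like the example quoted above).
pattern = re.compile(r"/dataMine/database/\d+/raw/\d+\.txt")

paths = set()
with open("dataMine_debug.log") as log:  # hypothetical saved copy of the debug output
    for line in log:
        paths.update(pattern.findall(line))

for path in sorted(paths):
    print(path)
[/code]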

@jonator - thanks for the PM. It looks like it is all working well within the bounds of what dataMine2 can handle. Here is another quote from the manual, about the main graphing screen, that may help others…

Clicking the (i) icon in the top right of the main Chart screen shows information about the raw data used to create the graph such as processing time and the number of data points read. In the example below dataMine2 has read 44,565 data points and aggregated them to plot 13 columns on the chart.

[center]Figure 7. Graph Information[/center]
If Aggregate By is set to anything other than None (raw data) then intermediate data points may be created meaning it is possible to have more output data points plotted than there are data samples.

More background reading…

[b]Configuring Aggregations[/b] dataMine2 only records changes to the variables being watched, so there is no need to set any sampling frequency. This has the advantage of being easy to configure and of retaining all data without ever throwing anything away. However, whilst the detail of the data is retained, it can be difficult to see the bigger picture without using aggregations. dataMine2 performs all aggregation calculations at the time of the request without attempting to pre-calculate or alter the underlying recorded data. This saves on storage space, but charts that have to trawl through large amounts of data may take a few seconds (a typical VeraLite box can aggregate around 8,000 data points per second). Aggregation Period can be set to Auto (default), None (raw data), Hour, Day, Week or Month.

[ul][li]Auto (default) will cause data to be collected in such a way as to have up to 100 data points displayed on the graph at any time. As the scale of the graph changes, the number of data points sampled will change to attempt to keep 100 points on the screen (see the sketch below).[/li]
[li]None (raw data) will not aggregate the data and will cause data to be displayed exactly as it has been recorded. Note that a maximum of 2,500 points can be displayed on any chart and any more than this will be truncated. This is useful for data that changes rapidly over a short period but infrequently over a long period such as a light switching on or off.[/li]
[li]Hour - aggregate the data by the hour[/li]
[li]Day - aggregate the data by the day[/li]
[li]Week - aggregate the data over 7 days - note that the week starts on a Thursday on most systems and this cannot be configured[/li]
[li]Month - aggregate by the calendar month - beware that values may be aggregated over 28 to 31 days depending on the month, making comparisons difficult.[/li][/ul]

Due to daylight saving time, the length of a day, week or month can vary by an hour.
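As a rough illustration of the Auto setting, here is a minimal Python sketch of the idea. The target of roughly 100 points comes from the description above; everything else (including rounding to whole seconds) is an assumption rather than dataMine2’s actual code.

[code]
# Sketch of the "Auto" idea: pick a bucket size so that roughly 100 points
# cover the visible time range (assumed behaviour, not the plugin's code).
TARGET_POINTS = 100

def auto_bucket_seconds(range_start: int, range_end: int) -> int:
    span = range_end - range_start
    return max(1, span // TARGET_POINTS)

print(auto_bucket_seconds(0, 7 * 24 * 3600))  # one week on screen -> ~6048 s buckets
[/code]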
Aggregation formula can be set to Sample (default), Difference (Max - Min), Sum, Weighted Average, Minimum, Maximum and On Duration (Hours).

[ul][li]Sample (default) - takes a sample of the data at the aggregation period - any values between samples are completely ignored. This is useful for data that doesn’t change much over the short term.[/li]
[li]Difference (Max - Min) - finds the maximum and the minimum data values within the aggregation period and calculates the difference. This is useful for calculating the amount something has changed such as the amount of electricity used during the period.[/li]
[li]Sum - this adds up all the data values found during the period. This is useful for counting the number of times an infra-red sensor has been activated during a day, for example.[/li]
[li]Weighted average - this calculates the average of the data values during the period taking into account the amount of time the data point was at that value. E.g. if a value was zero for 8 hours and 1 for 16 hours, the weighted average for the day would be 0.667 (see the sketch after this list).[/li]
[li]Minimum - the minimum data value found during the aggregation period. This is useful for plotting low temperatures.[/li]
[li]Maximum - the maximum data value found during the aggregation period. This is useful for plotting high temperatures.[/li]
[li]On Duration (Hours) - the number of hours the variable was greater or equal to 1. This is useful for determining how long a device such as a light switch or heating controller was on during the day.[/li][/ul]
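To illustrate the last two formulas, here is a minimal Python sketch working over the change-only, timestamp/value samples described earlier. It is not the plugin’s actual code, just a demonstration of the calculation using the weighted-average example above.

[code]
# Weighted average and on-duration over change-only samples.
# Each sample is (timestamp, value); a value holds until the next sample.
def weighted_average(samples, period_start, period_end):
    total = 0.0
    for (t, v), (t_next, _) in zip(samples, samples[1:] + [(period_end, None)]):
        start, end = max(t, period_start), min(t_next, period_end)
        if end > start:
            total += v * (end - start)
    return total / (period_end - period_start)

def on_duration_hours(samples, period_start, period_end):
    seconds = 0
    for (t, v), (t_next, _) in zip(samples, samples[1:] + [(period_end, None)]):
        start, end = max(t, period_start), min(t_next, period_end)
        if end > start and v >= 1:
            seconds += end - start
    return seconds / 3600

# The example from above: 0 for 8 hours, then 1 for 16 hours.
day = [(0, 0), (8 * 3600, 1)]
print(round(weighted_average(day, 0, 24 * 3600), 3))  # 0.667
print(on_duration_hours(day, 0, 24 * 3600))           # 16.0
[/code]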

Once the aggregation value has been set it must be saved. All further graphs generated will use the new aggregation values and none of the underlying data will be changed.

@klamath - the log file you posted contained a reference to the D_DataMine2.xml download error. Have you had any success with the new version of the file I attached a few posts back?

The latest version of dataMine2 (1.897) has just been approved on the app store and this includes the small fix I did for the D_DataMine2.xml file. I’d be interested to see if this fixes people’s install problems. Your feedback is appreciated. Thanks!

The release also includes an additional link found in the device properties so you can go to either the local or remote graphing page directly.

I’ve also attached a link to the User Guide I have been working on to the first post in this thread. Let me know if you have any suggestions for improvements.

On the advice of @akbooer I’ve created a separate forum for discussions on the documentation here: http://forum.micasaverde.com/index.php/topic,35724.0.html

I have updated dataMine2 to 1.897 and have run into a problem.
When I open the web application it just stops during the loading process of the dataMine web application. I tried to uninstall and reinstall the app, but that didn't help.

Any ideas?

@STAIK - it looks like the dataMine app didn’t even start properly as there are no messages in the log. Could you post your entire Vera log file? If you don’t have ssh access to your box, you should be able to get the entire log file through this URL: http://vera-ip:3480/data_request?id=file&parameters=../../tmp/log/cmh/LuaUPnP.log.

It might also be worth checking that you have enough disk space to download the dataMine app and unpack all the files. If you can ssh into your box, can you send me the output of the df command? Also try power cycling your Vera box in case there are any memory issues.

@ConstantSphere I tried to reinstall the app again and now dataMine2 starts, but I get to the USB Configuration page at start-up. The mounting doesn't work, so I get to this page over and over again. I removed the USB stick, re-formatted it and restarted the Vera, but that didn't help. So I replaced the USB stick with another one, reinstalled the app and restarted the Vera. But it's still the same.

The Vera log file is larger than 512 KB, so I can't post it.

The DataMine Debug file

Thanks @STAIK. It’s a bit of a muddy picture but here’s what I can see in the log…
You have 2 USB sticks:
/dev/sdb1: LABEL=“APP” UUID=“B4E2-BE23” (Win95 FAT32)
/dev/sda1: LABEL=“MiOS” UUID=“349967f9-c8e0-4717-8b3f-fe9eb1620d63” (Linux ext3)

and you would like to use APP for dataMine and MiOS for the Vera logs. Currently MiOS is successfully mounted to /tmp/log/cmh. At 16:51 and 19:02 you successfully mounted the MiOS stick to /datamine, but the APP stick has never mounted successfully.

As the MiOS stick has mounted successfully and the APP stick hasn’t, I think we have to assume that there is something about that stick that Vera doesn’t like. If you have root access, can you try reformatting it from within Vera using the command “mkfs.vfat /dev/sdb1”, or put it in another machine and reformat it from there.

I also notice a couple of other things not related to the USB stick…
You have (or did have) the advanced variable SetEventsEnable set to 1 (not surprising as this is the default). Set that to 0 (unless you really need to see live notifications) as it is a big performance drain - I will get that to default to 0 in a future version.

You also have a bunch of Vera errors (not dataMine errors) like “Device_LuaUPnP::LoadDeviceDoc can’t load…”. I’m not really sure why that is, but I suspect that your box is having difficulty communicating with the Micasaverde servers (either at your end or theirs).

Let me know how you get on.

@ConstantSphere Hi, I’ve been trying to get DataMine 2 (1.896) working on a VeraEdge 1.7.1598. BTW, I never could get DataMine 1 going on the VeraEdge. DataMine 1 did work on my prior Vera Lite UI5, but I upgraded to the VeraEdge UI7 for some reason I can’t now recall.

I’ve attached a copy of the debug page produced by http://veraip/port_3480/data_request?id=lr_dmCtrl&control=debug. I notice that there are some commands missing from the VeraEdge that are called in the debug page, namely item 2 (output of blkid) and item 4 (output of fdisk). Trying these commands using PuTTY to access the VeraEdge yields a command “not found” error message.

The automatic pre-populated USB pick-list in the Datamine setup never worked on the VeraEdge. It was always empty. It seems that it does not work because the VeraEdge does not have the blkid command, nor the fdisk command for that matter.

I’m just so elated that I just got DataMine 2 to work on the VeraEdge by doing these things:

  1. using WinSCP to create a dataMine directory at the root level on the VeraEdge;
  2. I had an 8 GB FAT32 USB stick in the VeraEdge (the same one that I used to have in the VeraLite, but now empty) and determined the USB mount UUID by using PuTTY to the Vera console and entering the lsusb command, which in my case was 054c:02a5, and then
  3. manually setting these 3 DataMine 2 variables: SetMountUUID to “054c:02a5”, SetDataDirectory to “/dataMine/” and SetMountPoint to “dev/sda1”.

Reload/Reboot etc and all is working.

Don’t know if it is possible or worth it to add the missing commands blkid and fdisk to the VeraEdge, but I do hope this helps anyone else who may want Datamine on a VeraEdge.

Is there any way to update an already saved graph with new data?
If not, here is an example of how it could perhaps be implemented:
Click on a saved graph.
Click on Edit saved graph
A new drop-down list with the graph channels
Pick the one you need in the drop-down list
Save

Is that possible?

@Terryleier - that’s awesome! And thanks for sharing, I’m sure it will help others.

Looking into it a bit further, there shouldn’t be any need to set SetMountUUID on a VeraEdge system as it is only used in conjunction with the missing blkid command. Setting SetDataDirectory to “/dataMine/” and SetMountPoint to “dev/sda1” is what’s key here.

I’ve added that to one of the sections in the latest user guide, which I’ll publish soon.

The whole UUID thing is only there to ensure that the correct mount location is found when the USB stick is moved to a different USB slot or plugged into a USB hub.

@Vera3 user - that sounds like a nice idea.
One of the great things I love about dataMine, and the reason I decided to modify it, was that I found the web interface really easy to use but the back-end “engine” needed changing. However, when it came to the minor modifications required for the front-end interface, I found it really difficult as it’s done using a framework called Ext JS that’s completely unfamiliar to me. I’ll take a quick look but I might have to decline on the grounds of inability :-\ .

@ConstantSphere

I had some problems with the ‘handler failed’ error. See the following link:
[url=http://forum.micasaverde.com/index.php/topic,35592.msg263965.html#msg263965]http://forum.micasaverde.com/index.php/topic,35592.msg263965.html#msg263965[/url]
After deleting the channel and creating a new one, it started working again.

But now I have the same problem on another channel, and I know what I did:
I want to store the outside temperature from the ‘Opentherm Gateway’. Sometimes it stores a faulty value. An example:

1451705301,4.00 1451705690,5.00 1451705742,4.00 1451707964,124.00 1451708020,4.00 1451716052,4 1451719698,4.01 1451719752,4.00 1451728186,4.00
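For what it’s worth, here is a small Python sketch that would flag readings like that faulty ‘124’ in a raw dataMine file. The local file name and the 20-degree jump threshold are just assumptions for the example.

[code]
# Read a raw dataMine data file ("timestamp,value" pairs) and flag samples
# that jump a long way from the previous reading.
JUMP_THRESHOLD = 20.0  # degrees; an arbitrary guess, tune to taste

def read_samples(path):
    with open(path) as f:
        # Split on any whitespace so it works whether the pairs sit one per
        # line or space-separated as shown above.
        pairs = f.read().split()
    return [(int(t), float(v)) for t, v in (p.split(",") for p in pairs)]

samples = read_samples("2401.txt")  # hypothetical local copy of the raw file
for (t_prev, v_prev), (t, v) in zip(samples, samples[1:]):
    if abs(v - v_prev) >= JUMP_THRESHOLD:
        print(f"suspicious jump at {t}: {v_prev} -> {v}")
[/code]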

I think that now and then the faulty ‘124’ is created by the boiler or OTG and not by dataMine; in the past I just corrected the value and it was fine.
Yesterday I changed a value and I got the same error again.
Is there a chance that changing a value in e.g. ‘2401.txt’ causes an error?

There is nothing added under the configuration tab ‘Y-Axis lookup’.

Andre