This blog post is part of a series; check out the other posts in this series:
Now that we have successfully set up the connection between the Flukso and our MySQL database, basically following the same route as for the Nest thermostat, data is pouring into our own database on the same device.
The only thing left to do is get this data into SCOM as well. I've created a separate management pack and PowerShell script for this to give people the ability to install them separately from each other, but the goal is to create one big management pack in the end.
This blog post will explain how to retrieve the data with PowerShell (of course) and dump it into a property bag which is readable by SCOM. This is the second phase in our schematic example:
We basically have the same requirements as for the Nest thermostat monitoring, as we are using the same route:
What do we need to retrieve the data out of the MySQL database?
There's no additional install required on the MySQL server itself, although on the machine that runs the script you will need the MySQL Connector/Net to connect; its MySQL.Data.dll is loaded at the top of the script below.
I’m using this on a virtual Win2012 machine without any issues.
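If you want to rule out connectivity or firewall issues before involving SCOM, you can test the connector and the connection string from a plain PowerShell prompt on that machine first. This is a minimal sketch using the same DLL path and connection string format as the script below, with placeholders where your own values go:

#Load the MySQL .NET connector (adjust the path to the version you installed)
[void][System.Reflection.Assembly]::LoadFrom("C:\Program Files (x86)\MySQL\MySQL Connector Net 6.8.3\Assemblies\v2.0\MySQL.Data.dll")
#Open a connection with the same connection string format used in the script
$conn = New-Object MySql.Data.MySqlClient.MySqlConnection
$conn.ConnectionString = "database=flukso;server=<fill in ip of server>;user id=<user>;pwd=<password>"
$conn.Open()
$conn.State   #should report Open
$conn.Close()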
Retrieve the data from MySQL using a PowerShell script
This is the script I created to get the data out of MySQL.
Note that this script only retrieves one value. It's possible to retrieve multiple values at once, but I preferred to use different scripts to get the different parameters out of the database.
The script has some prep work for water consumption in there as well, but that is not yet fully operational as I still need to convert the pulses to l/min; more on that later.
The database is filled with data every minute, so I run the PowerShell script below every 120 seconds to get data in. The data is measured in watts.
The script used:
It can be downloaded here: http://scug.be/dieter/files/2014/02/perfdatafrommysqlelectricity.rar
Param($energysort)
[void][system.reflection.Assembly]::LoadFrom("C:\Program Files (x86)\MySQL\MySQL Connector Net 6.8.3\Assemblies\v2.0\MySQL.Data.dll")
#Create a variable to hold the connection:
$myconnection = New-Object MySql.Data.MySqlClient.MySqlConnection
#Set the connection string:
$myconnection.ConnectionString = "database=flukso;server=<fill in ip of server>;user id=<user>;pwd=<password>"
#Call the Connection object’s Open() method:
$myconnection.Open()
#uncomment this to print connection properties to the console
#echo $myconnection
$API = New-Object -ComObject "MOM.ScriptAPI"
$PropertyBag = $API.CreatePropertyBag()
#The dataset must be created before it can be used in the script:
$dataSet = New-Object System.Data.DataSet
$command = $myconnection.CreateCommand()
#$command.CommandText = "select date, time, sensor_1 from fluksodata";
$command.CommandText = "SELECT Sensor_1 FROM fluksodata ORDER BY IDTimestamp DESC LIMIT 1";
$reader = $command.ExecuteReader()
#echo $reader
#The data reader will now contain the results from the database query.
#Processing the Contents of a Data Reader
#The contents of a data reader is processed row by row:
while ($reader.Read()) {
#And then field by field:
for ($i= 0; $i -lt $reader.FieldCount; $i++) {
$value = $reader.GetValue($i).ToString()
}
}
#echo $value (uncomment to check the value when testing outside SCOM)
$myconnection.Close()
$PropertyBag.AddValue("energysort", $energysort)
$PropertyBag.AddValue("electricity", $value)
#Return the property bag so SCOM can read the values
$PropertyBag
This script will basically do the following: load the MySQL .NET connector, open a connection to the flukso database, query the most recent Sensor_1 reading, and add that value together with the $energysort label to a SCOM property bag.
Note: I'm also using a variable $energysort to identify the Flukso sensor.
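Before wiring it into SCOM, you can run the script by hand to verify that a value comes back. The script name here is based on the download link above, so adjust it to whatever you saved it as; the $energysort value is just the label you want to see in SCOM. Note that MOM.ScriptAPI is only available on machines with a SCOM agent (or a management server) installed, so test it on the intended watcher node:

#Run the script manually on the watcher node
.\perfdatafrommysqlelectricity.ps1 -energysort "electricity"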
Now, to get all the different parameters mentioned above, the only things you need to change are the query in $command.CommandText, the name of the value you add to the property bag, and the $energysort label you pass in.
If everything goes well you have your data in your MySQL dbase now and can retrieve it remotely via PowerShell to pass it on to SCOM.
Now that we have the PowerShell in place, check out this blog post to create a management pack for it: http://scug.be/dieter/2014/02/19/nest-thermostat-monitoring-pack-part-3-create-the-mp/
Download the MP here: http://scug.be/dieter/files/2014/02/flukso.energymeter.rar
This blog post is part of a series; check out the other posts in this series:
So after all this hard work to get the data into my MySQL database and into SCOM, what can I actually do with it?
This is just the beginning of a far greater monitoring project I'm building to basically monitor my house, but for now I have a handle on the temperature and heating in my home.
I've created views in the Nest folder for humidity, heating status, and a separate view for target and current temperature.
Humidity view:
Not much we can do with this view, as it's actually giving me a good reading. Everything between 30 and 60 percent is a healthy range, so no complaints here.
Next in line is the Heating status:
This is basically a Boolean (on or off). The standard graph in Nest tells me this as well, but I have to click through some views to get there. Now I can get it in a simple graph in my console wherever I want it.
Saving the best for last: the temperature graph.
The first two graphs are nice to have, but this one is actually pretty cool. It shows the relationship between the target temperature asked by my household at a given time and the actual temperature in my house. Here I can clearly see that it takes approximately 2 hours to get my home heated up (radiant floor heating), but the heat stays constant for a long time. This is due to the nature of radiant floor heating and because my house is well insulated. If I overlay the two graphs I can clearly see that the temperature rises as soon as my heating is working…
So now I have the data in there. The next step in the process is to create a console task that uses the API to actually change the target temperature. This is possible via the API I've mentioned, so it will be added to the MP in a short while.
This blog post is part of a series; check out the other posts in this series:
The Nest thermostat monitoring pack is part of the broader "monitor your home with SCOM" series, which can be found here: http://scug.be/dieter/2014/02/19/monitor-your-home-with-scom/
Download the MP I’ve created here: http://scug.be/dieter/?p=1204
Now that we have discussed how to get the data from the Nest website via an API call into our database, and how to read it from there into a property bag with PowerShell, it's time to get SCOM working with this data.
I'm gathering a sample from the database every 5 minutes, while the database itself is only filled with data when the device calls home to the Nest website. As some of you probably already know, it's not possible to use a PowerShell script for a performance collection rule created in the console itself; there you need a VBScript. In fact it's not a good idea at all to create a management pack in the console, as it will be filled with GUIDs and such.
I've used the SCOM 2007 Authoring Console for quite a long time and am still using it, but its biggest disadvantage is that it cannot interpret SCOM 2012 MPs. If you create an MP with the Authoring Console it will work on both SCOM 2007 and SCOM 2012, but if you try to load a SCOM 2012 MP into the Authoring Console you'll get a schema mismatch because it just can't cope with the new schema. This makes it impossible to create the MP in the Authoring Console, load it into the SCOM 2012 management group, make minor modifications in the console, and then load it back into the Authoring Console…
Well then you should use the Visual Studio Authoring Extensions… True… But I don't know Visual Studio. It's still on my to-do list, but hey, there are many things on there.
So for now I've used another great tool: Silect MP Author, which is freely available. I created the core rules in there to get the PowerShell performance collection rules in place and then made view modifications and such in the console itself. OK, it's not pretty, but it's just to showcase the possibilities of SCOM, and I plan to integrate this bit into a larger "monitor your home" MP which I'll probably rebuild from scratch using VSAE.
So enough chatting. Let's create these performance rules to get the data we have in our database into our management group via PowerShell.
First things first. Download the free Silect MP author tool here: https://bridgeways.com/mp-author-landing-page
Install the tool and open MP Author
Create a new management pack:
Give it a proper name:
Save in a location (I’ll do it by default on my SkyDrive so I can work on my project anytime from anywhere)
Leave the references as is
Choose Empty
Create
Now create a target for our watcher node, which is identified by the registry key HKLM\software\nest\watchernode (a quick way to create this key on the watcher node itself is shown after these steps).
Right click target and choose group
Point it at a server where the reg key already exists; it's easier to browse to the key than to type it in.
Supply credentials.
and locate the key: HKLM\Software\nest\watchernode
Give the attribute a name
identify the discovery
and we only want to check whether it exists. We don’t care about the content.
We run this every day.
Create.
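The discovery can of course only find machines where that registry key actually exists, so create it on the machine you want to use as the watcher node first. A quick way to do that with PowerShell (run it elevated; the key name comes from the steps above):

#Create the marker key the discovery looks for; the content doesn't matter, only its existence
if (-not (Test-Path "HKLM:\SOFTWARE\nest")) { New-Item -Path "HKLM:\SOFTWARE" -Name "nest" | Out-Null }
if (-not (Test-Path "HKLM:\SOFTWARE\nest\watchernode")) { New-Item -Path "HKLM:\SOFTWARE\nest" -Name "watchernode" | Out-Null }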
Next thing we need to do is create the group of watcher nodes
Fill in the desired expression and click next (note this changed in the final mp I’ve uploaded)
Create the group
So now we need to create the performance rule with the PowerShell script we tested earlier on:
Copy-paste the script into the script body window.
Make sure to change the credentials in the connection string as discussed in part 2 of this blog series.
Fill in the location variable
Identify the performance rule
Map the content of the property bag to the instance and value SCOM can use to create the performance data set (see the example after these steps).
Leave the schedule as is (more on this later)
Create the script
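For reference, the instance and value mappings use the standard SCOM property bag syntax. With the property names from the humidity script in part 2 they look like this; swap the names for the other scripts:

Instance: $Data/Property[@Name='location']$
Value: $Data/Property[@Name='humidity']$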
So now we have a rule to get the humidity value into our environment. The same process needs to be followed to get current temperature, target temperature, and heating status in as well.
I already did this in the MP I've uploaded, but I really wanted to show you the ease of use of MP Author. I plan to do a more thorough blog series on this great tool, but that's out of scope for this series of course.
So now the scheduler part… We want to collect data more frequently than once a day, of course. It turns out it's not possible to change this in MP Author, nor in the console after you have loaded the MP. You need to change the XML code itself.
The collection rule is configured in the MP as daily only:
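In the exported XML this sits in the rule's data source configuration. In the MP as generated on my machine it boils down to an interval in seconds, so treat this snippet as an illustration (the exact element name depends on the data source MP Author picked for you):

<IntervalSeconds>86400</IntervalSeconds>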
You need to change this to:
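For example, to collect every five minutes, which matches how often I want to sample the database:

<IntervalSeconds>300</IntervalSeconds>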
And you need to do this for all rules you have created of course.
Note: like I said before, this is just a small showcase of how the management pack is constructed. The management pack attached here is slightly different and has some configuration done in the console, so it's not as clean as far as IDs are concerned. Again, I'm planning to rebuild a full MP once I have all my different monitoring aspects in place.
If you want to use the management pack I have created, make sure to change the connection string values to your database location, username, and password.
So all that's left to do now is load the MP into your management group and check whether everything is running.
This blog post is part of a series; check out the other posts in this series:
Now that we have successfully set up the connection between the Nest and our MySQL database, data is pouring into our own database. So how do we get that data into SCOM so we can graph it and monitor it?
This blog post will explain how to retrieve the data with PowerShell (of course) and dump it into a property bag which is readable by SCOM. This is the second phase in our schematic example:
What do we need to retrieve the data out of the MySQL database?
There's no additional install required on the MySQL server itself, although on the machine that runs the script you will need the MySQL Connector/Net to connect; its MySQL.Data.dll is loaded at the top of the script below.
I’m using this on a virtual Win2012 machine without any issues.
This is the script I created to get the data out of MySQL.
Note that this script only retrieves one value. It's possible to retrieve multiple values at once, but I preferred to use different scripts to get the different parameters out of the database.
The Nest parameters I read in: humidity, current temperature, target temperature, and heating status.
The script used:
It can be downloaded here: http://gallery.technet.microsoft.com/SCOM-Retrieve-performance-507293f1
#===================================================================================================
# AUTHOR: Dieter Wijckmans
# DATE: 18/02/2014
# Name: Nest_humidity.PS1
# Version: 1.0
# COMMENT: Get the current humidity value from the nest device from the mysql dbase
#
# Usage: .\Nest_humidity.ps1
#
#===================================================================================================
param($location)
#Load the connector but make sure to check the path if you are using a different version
[void][system.reflection.Assembly]::LoadFrom("C:\Program Files (x86)\MySQL\MySQL Connector Net 6.8.3\Assemblies\v2.0\MySQL.Data.dll")
#Create a variable to hold the connection:
$myconnection = New-Object MySql.Data.MySqlClient.MySqlConnection
#Set the connection string:
$myconnection.ConnectionString = "database=<Fill in your dbase name>;server=<Fill in your server ip>;user id=<fill in user id>;pwd=<not 1234 right?>"
#Call the Connection object's Open() method:
$myconnection.Open()
#Prepare the property bag
$API = New-Object -ComObject "MOM.ScriptAPI"
$PropertyBag = $API.CreatePropertyBag()
#The dataset must be created before it can be used in the script:
$dataSet = New-Object System.Data.DataSet
#Run the actual query
$command = $myconnection.CreateCommand()
$command.CommandText = "SELECT humidity FROM data ORDER BY timestamp DESC LIMIT 1";
$reader = $command.ExecuteReader()
#Processing the contents of the data reader; we only want the last value
while ($reader.Read()) {
    for ($i= 0; $i -lt $reader.FieldCount; $i++) {
        $value = $reader.GetValue($i).ToString()
    }
}
$myconnection.Close()
$PropertyBag.AddValue("location", $location)
$PropertyBag.AddValue("humidity", $value)
$PropertyBag
This script will basically do the following: load the MySQL .NET connector, open a connection to the database, query the most recent humidity reading, and add that value together with the $location label to a SCOM property bag, which is returned at the end.
Note: I'm also using a variable $location to identify the Nest thermostat, in case you have more than one.
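As with the other scripts, you can run it manually first on a machine that has a SCOM agent installed (MOM.ScriptAPI ships with the agent). The location value is just the label you want to see as the instance in SCOM:

#Run the script manually to check that the property bag is filled
.\Nest_humidity.ps1 -location "Living room"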
Now, to get all the different parameters mentioned above, the only things you need to change are the query in $command.CommandText and the name of the value you add to the property bag.
If everything goes well you have your data in your MySQL dbase now and can retrieve it remotely via PowerShell to pass it on to SCOM.
Next blog post we’ll get everything in SCOM.
Stay Tuned.
This blog post is part of a series; check out the other posts in this series:
I recently bought myself a Nest thermostat. For those who are not familiar with this device: it's basically an internet-connected thermostat which learns your heating patterns from different variables, all packaged in a nice futuristic design… Cool…
More info here: https://nest.com/thermostat/life-with-nest-thermostat/
Although it is not officially released in Europe, nor supported there, it's actually quite easy to install. If you are reading this and need more info, make sure to check out this blog post: http://www.fousa.be/blog/nest-thermostat
So I survived the installation…
This "round eye thing", as my wife refers to it, is doing its job nicely. I get a lot of feedback from the app and even on the website. Everything looks shiny and cool… But… What if… What if I could get this data into SCOM and see whether I can monitor and control it from there… Imagine the fun.
Well this blog post will document my way of getting data into SCOM. To give you a high level idea this is actually what I did:
- Retrieve data from the nest.com website (not from the actual device, as my first attempts failed miserably)
- Get this data into a MySQL dbase
This is a schematic overview of what components were used:
In this blog post we’ll discuss how to get data from the nest into a MySQL dbase
So how was I going to get data out of this nice circle on the wall which is controlling my heating? My first attempt was to sniff all the traffic coming from the IP of the Nest thermostat and see whether there were valuable entries in the traffic between the device and the Nest web service. A lot of data was sent, but it would take a lot of work to make sure everything was decoded correctly. So I opened Google and Bing (always nice to compare both results) and started searching the web. A lot of users had the same question about getting data out, but not many had the answer. Until I stumbled upon the GitHub library and did a search there.
I found this API by Guillaume Boudreau on GitHub: https://github.com/gboudreau/nest-api
This nice gentleman wrote a PHP-based API which can retrieve all the info and return it with a simple REST call. I'm not a dev, so this was unknown territory for me, but I gave it a shot. My aim was to get this data stored locally so I have full control over how it's stored and retrieved by SCOM, which makes my life afterwards a bit easier.
I could inject it into a SQL database, but that would require my server to be switched on all the time. I would also need a machine running a web server to take care of all the different PHP requests to the API and such. Sure, I could build a dedicated machine on my Hyper-V host, but that server would have to be on all the time for the entire process to work; not a good idea with current energy rates. But wait a minute: I have another device running all the time, my Synology DS412+. Up until now it only served as a storage device for both entertainment and iSCSI LUNs for Hyper-V, but this thing is capable of doing much more. I checked the different fan blogs and forums and realized right away that this device was the way to go. Without any knowledge of MySQL and/or PHP I set off on an adventure…
Note: All this is configured on the beta version of DSM 5.0 so screenshots may differ from previous versions of DSM.
I had a fun time configuring the NAS to the point where everything was working. I'll link to some blog posts that got me going, as covering it all here would take me too far. If you have issues, please do not hesitate to comment or connect on Twitter (@dieterwijckmans) to get more specific on this topic:
Install MySQL and phpMyAdmin on the Synology NAS
Install WinSCP to browse your Synology's built-in FTP server (via SCP actually)
Install PuTTY to get terminal access to the Linux distro underneath your Synology OS. Note: just install it; you do not have to set up a key chain to access your Synology.
A couple of quick pointers (I will write up a more detailed process if I ever find the time or if there's a huge demand for it):
Again, this will be a fairly straightforward process if you've done this before. For me all of this was quite new as a 99% Windows-oriented tech (OK, it's out in the open, I admit it), but it proved to be a challenging yet rewarding route.
First things first. These were my resources to get everything into MySQL. I had started down a different path, writing my own scripts, when I stumbled upon a post on GitHub by chriseng. He actually wrote a nice front-end webpage to present the Nest data in a comprehensible way… and he writes the data to a MySQL database first. Exactly what I needed!
https://github.com/chriseng/nestgraph
So follow the instructions on there and get the API + script in place.
If all goes well, open up phpMyAdmin by browsing to http://<address of your synology>/phpmyadmin. You should see a database created and the first line of data in there:
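If you prefer the SQL tab in phpMyAdmin over the table view, a quick query like this (table and column names as the nestgraph script creates them; adjust if yours differ) should return the most recent readings:

SELECT * FROM data ORDER BY timestamp DESC LIMIT 5;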
I know this is a high-level and quick manual, but this is in fact how I got it running without any knowledge of MySQL, PHP and so on.
Now that we have one line in the database (mine has more in there because I had already automated the process), we need to schedule the command to poll the web service on a regular basis.
This took me quite some time to figure out, so if you made it this far I'll save you the time of figuring it out yourself.
High level steps:
Cron is basically the Linux equivalent of Task Scheduler: you put in the different parameters and Linux will run the command for you at a regular interval. More info can be found here: http://en.wikipedia.org/wiki/Cron
On the Synology, the cron configuration is located in the folder /etc and the file is called crontab.
It should look like this:
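Mine ends up along these lines. The path to the nestgraph collection script is just a placeholder (use wherever you placed it, and check where the php binary lives on your DSM version), and note that Synology's crontab wants tabs, not spaces, between the fields:

*/5	*	*	*	*	root	/usr/bin/php /volume1/web/nestgraph/<your collection script>.php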
Some considerations that kept me searching for quite a while as to why it wasn't working:
So… you wait patiently to see the data coming into your database. But nothing happens. You check again a while later. Nope. I can see myself struggling with this all over again. It turned out you still need to restart the cron service on your Linux distro. All so complicated for us Windows guys… sigh…
To do this on DSM 4X run:
To do this on DSM 5 run:
Do this in a PuTTY session, as these are both Linux commands:
After this the data should be flowing in…
Still with me?
The next thing we need to do is get the data queried out of our MySQL database and into a property bag to pass on to SCOM.
Check out part 2 (link in top of blog post)