This blog post is part of a series on how to back up your SCOM environment.
You can find the other parts here:
The next step in our backup process is to take a backup of our unsealed management packs, to make sure we don’t lose all the customizations we’ve made to the environment.
First, a little explanation about the difference between sealed and unsealed management packs, because at my clients I sometimes see misunderstandings about these two kinds of management packs.
Difference between sealed management packs and unsealed management packs.
The difference is rather simple. All the management packs you download from vendors such as Microsoft, Dell, HP,… are sealed ones. They have been developed by the vendor and sealed to prevent any further customization. These management packs often include a variety of rules, monitors, views and even reports which are installed when you import the management pack.
All the management packs you create yourself are unsealed by default. This is where you store all your customizations: overrides on the sealed management packs, custom reports, custom-made rules, custom-made monitors…
Notice the word “custom”… In my book, “custom” means a lot of time and effort was spent creating them… You don’t want to lose them in case of a disaster!
So how do we back these up? There are basically two ways: manually or automated.
If you have one unsealed management pack that you want to backup or you want to quickly back it up while working in the console you can use the following method:
Open the console and navigate to Administration > Management Packs > select the management pack you wish to back up, right click it > choose Export Management Pack…
Select a location for your backup:
Click OK and your management pack is exported successfully.
If you check your location you’ll see the management pack in XML format.
While the above method works like a charm for a quick backup before changing something in a management pack, it becomes a hassle when you want to back up several management packs. Not to forget the human factor… you have to remember to take the backups in the first place…
Therefore the preferred way is to automate the backup with a PowerShell script.
Microsoft has actually given us the proper tools to do so in the PowerShell cmdlet set for SCOM.
The command to use:
# Get all unsealed management packs
$mps = Get-ManagementPack | Where-Object {$_.Sealed -eq $false}
foreach ($mp in $mps)
{
    # Export each one as XML to the backup folder
    Export-ManagementPack -ManagementPack $mp -Path "C:\Backups"
}
There are basically two approaches to automate this: a SCOM scheduled rule, or a scheduled task on the RMS itself.
You can make your choice based on this nice discussion comparing the two: http://www.systemcentercentral.com/BlogDetails/tabid/143/IndexId/56798/Default.aspx
I chose the scheduled task method to avoid the extra (albeit minimal) load on the server, and I create a management pack to monitor the process.
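Below is a minimal sketch of the kind of script you could schedule on the RMS with the Task Scheduler. The snap-in and provider names follow the standard SCOM 2007 PowerShell pattern, but the RMS name and the backup share are placeholders for illustration, so adapt them to your own environment.

# Export all unsealed management packs to a date-stamped backup folder (sketch).
Add-PSSnapin "Microsoft.EnterpriseManagement.OperationsManager.Client"
Set-Location "OperationsManagerMonitoring::"
New-ManagementGroupConnection -ConnectionString "RMS01.domain.local"
Set-Location "RMS01.domain.local"

$backupPath = "\\FILESERVER\SCOMBackups\MPs\{0:yyyyMMdd}" -f (Get-Date)
New-Item -Path $backupPath -ItemType Directory -Force | Out-Null

Get-ManagementPack | Where-Object {$_.Sealed -eq $false} |
    ForEach-Object { Export-ManagementPack -ManagementPack $_ -Path $backupPath }

Saved as, for example, C:\Scripts\Backup-UnsealedMPs.ps1, you could then register it with something like: schtasks /create /tn "SCOM MP backup" /sc daily /st 06:00 /tr "powershell.exe -File C:\Scripts\Backup-UnsealedMPs.ps1" (the task name, schedule and script path are examples).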
This blog post is part of a series on how to back up your SCOM environment.
You can find the other parts here:
There are two versions of IIS supported by SCOM, and each needs a different backup approach.
IIS 6 is normally used if you are running SCOM 2007 R2 on a Windows 2003 platform. It supports the components of the web console and SQL Server Reporting Services. If you are using SQL Server Reporting Services from SQL 2008, IIS is not used anymore and it is sufficient to just back up the databases.
IIS 7 is normally used if you are running SCOM 2007 R2 on a Windows 2008 platform (it can also run on a Windows 2003 server but is not installed by default). The approach to back up IIS 7 is somewhat different as it stores its data differently. The files you need to back up are the web.config files and the applicationHost.config file. For more info on how to back up IIS 7 you can read this nice reference: http://blogs.iis.net/bills/archive/2008/03/24/how-to-backup-restore-iis7-configuration.aspx
So let’s start with the backup, shall we?
Connect to the RMS and navigate to Start > Administrative Tools > Internet Information Services (IIS) Manager.
Right click the server name and navigate to “All Tasks” > “Backup/restore Configuration…”
The configuration backup dialog box will come up.
Fill in a backup name and if you want to make a secure backup you can tick the box “Encrypt Backup using password” and supply a password for the backup.
Click OK and your backup is made…
The actual backup file is stored in “%systemroot%\system32\inetsrv\MetaBack”
I suggest you take a copy of this file to another location in case your RMS is unrecoverable. Otherwise the backup file, which resides on the server itself, will be gone as well, and what’s the point of backing up then?
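A quick sketch of that copy step, assuming a file share that survives an RMS failure (the share path is just an example):

# Copy the IIS 6 metabase backup files off the RMS to a file share.
Copy-Item -Path "$env:systemroot\system32\inetsrv\MetaBack\*" -Destination "\\FILESERVER\SCOMBackups\IIS6" -Force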
Open an elevated prompt on your RMS
Navigate to %windir%\system32\inetsrv
Run the command appcmd add backup "name of your backup set". If you do not specify a name, one will be generated based on the date and time.
If you are using IIS 7, your life just became a little bit easier. On Vista SP1 or later / Windows Server 2008, backups are also taken automatically on top of the initial backup you create as shown above: IIS automatically makes a history snapshot of applicationHost.config each time it detects a change, so you can easily restore a prior version. By default it keeps 10 prior versions and checks every 2 minutes.
The files are stored in “%systemdrive%\inetpub\history”
Pretty cool feature if you ask me and a big improvement from the previous version.
To enumerate a list of backups and configuration history files, use the following command:
“%windir%\system32\inetsrv\appcmd.exe list backup”
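If you prefer to do this from PowerShell, a small sketch could look like the one below. The backup name, the share path and the extra step of copying the history folder off the box are my own additions for illustration.

# Create a named IIS 7 configuration backup and list the existing backups.
& "$env:windir\system32\inetsrv\appcmd.exe" add backup "SCOM-IIS7-backup"
& "$env:windir\system32\inetsrv\appcmd.exe" list backup

# Copy the automatic configuration history off the server as well.
Copy-Item -Path "$env:systemdrive\inetpub\history" -Destination "\\FILESERVER\SCOMBackups\IIS7-history" -Recurse -Force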
This blog post is part of a series on how to back up your SCOM environment.
You can find the other parts here:
One of the key factors in a successful restore of your environment is the SCOM encryption key.
This encryption key is used to store data in the Operations Manager database securely: it ensures that sensitive data in the database remains confidential and encrypted. The RMS uses this key to read and write that data to the Operations Manager database.
Pretty severe, actually. If you don’t have the key, you can’t connect a freshly installed RMS to your existing Operations Manager database, and therefore you lose all your settings and customizations and have to start all over again.
Please note that it’s a best practice to take this backup once after installation of the environment and again after ANY change to the Run As accounts in the environment.
So how do you back up this key in case Murphy pays you a visit?
There are actually two ways: the GUI or the command line.
Log on to your RMS with an account with admin privileges
Open an elevated command prompt and navigate to your Operations Manager install folder. In this case I kept the default, so c:\program files\system center operations manager 2007\
Note: SecureStorageBackup.exe is only present if you have installed a console on your RMS. If not, you need to copy SecureStorageBackup.exe from the SupportTools folder on the installation media.
The Encryption Key Backup or Restore Wizard pops up:
Click continue and select Backup the Encryption key.
A dialog box appears asking where to save your .bin file. Best practice is not to save the file on the RMS itself. This makes perfect sense: you’ll need the file precisely when there’s an issue with your RMS, so there’s a big chance you wouldn’t be able to reach it.
I always save it on my file server and keep an extra copy somewhere else just to be safe. As soon as you have exported the key, make a copy of the .bin file and store it in two different locations.
So the location is set; let’s continue.
Fill in a password to secure the backup .bin file. Make sure you will still remember that password whenever you need it to restore the key.
It takes no more than a few seconds to back up the key, and if all goes well a nice completion message appears.
Log on to your RMS with an account with admin privileges
Open an elevated command prompt and navigate to your Operations Manager install folder. In this case I kept the default, so c:\program files\system center operations manager 2007\
Run securestoragebackup.exe /? to get the syntax of the command.
The command used: securestoragebackup backup <filename>
You need to supply the password twice
and the second time
And the key was successfully backed up.
The downside is that you cannot automate this process without further scripting, because you need to type in the password interactively. It would be nice if the exe had an option to pass the password as a parameter, but maybe in another release…
Recently I was working on a migration from MOM 2005 to SCOM 2007 R2. Unfortunately the MOM 2005 to SCOM 2007 migration tool included on the SCOM 2007 install media was not working anymore, so I had to convert the management packs manually.
This post is mainly for my own reference whenever I need it again, but since I did not find many good write-ups in my search on the web, I wrote one of my own.
I needed a Microsoft BizTalk 2004 management pack in SCOM 2007. Unfortunately Microsoft never released a SCOM 2007 version, only a MOM 2005 one.
So I had to convert it and load it into SCOM.
I’ve downloaded the management pack here: http://www.microsoft.com/download/en/details.aspx?id=14417
It’s an exe file which you need to extract to a folder of your choice. In this example I’m going to put it in c:\managementpacks.
The file you need for the initial conversion is “c:\managementpacks\microsoft biztalk server 2004 management pack\Microsoft Biztalk Server 2004.AKM”
Next we need to convert the management pack to the XML file format used by SCOM 2007.
You need MP2XML.exe to perform this conversion. It’s part of the MOM 2005 Resource Kit.
The syntax: C:\program files\microsoft operations manager resource kit\tools\convert management packs to xml\mp2xml.exe “inputfile.akm” “outputfile.xml”
so in this case C:\program files\microsoft operations manager resource kit\tools\convert management packs to xml\mp2xml.exe “c:\managementpacks\microsoft biztalk server 2004.akm” “c:\managementpacks\microsoft_biztalk_server_2004.xml”
Note the underscores in the naming. It’s not allowed to have spaces in the name of the XML file.
Next thing we need to do is convert the actual XML file to SCOM2007 format.
This is achieved with MPConvert.exe, which resides on the RMS in c:\program files\System Center Operations Manager 2007\mpconvert.exe
The syntax in our example is: c:\program files\system center operations manager 2007\mpconvert.exe “c:\managementpacks\microsoft_biztalk_server_2004.xml” “c:\managementpacks\microsoft_biztalk_server_2004_converted.xml”
Now you can load the xml in SCOM like you load any other management pack.
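If you prefer to run the two conversion steps from PowerShell instead of typing them one by one, a minimal sketch using the tool locations and file paths from this example could look like this:

# Convert the MOM 2005 AKM file to XML and then to the SCOM 2007 format.
$mp2xml    = "C:\Program Files\Microsoft Operations Manager Resource Kit\Tools\Convert Management Packs to XML\MP2XML.exe"
$mpconvert = "C:\Program Files\System Center Operations Manager 2007\MPConvert.exe"

& $mp2xml "c:\managementpacks\microsoft biztalk server 2004.akm" "c:\managementpacks\microsoft_biztalk_server_2004.xml"
& $mpconvert "c:\managementpacks\microsoft_biztalk_server_2004.xml" "c:\managementpacks\microsoft_biztalk_server_2004_converted.xml"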
However, there was an error with this particular management pack, resulting in a failure of the import. It turned out there was an issue with the XML.
When you try to import the management pack you get the following error:
XSD Verification failed for management pack. The ‘Name’ element is invalid. This is an issue in the XML itself. A parameter which needs to be present in the XML is either corrupt or missing.
I installed XML Notepad 2007 on my machine to check the XML file.
You can download this nice tool here: http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=7973
Now if you look at the XML and browse down to the Manifest > Name field, you immediately notice that the field is empty:
You can just fill in a name at the right like I did below:
Save the file and try to import the management pack again.
This time it works and lets you continue with the import.
In this example a standard Microsoft management pack without any customization was used. However, during my migration I had a lot of management packs full of custom rules.
I’ll create a PowerShell script to automate this process and post it as a follow-up on this blog.
If you have any questions please feel free to leave a comment or shoot me a mail.
This blog post is part of a series on how to back up your SCOM environment.
You can find the other parts here:
To be able to restore your SCOM environment in case of a disaster, you need to make sure that you have a good backup of your databases.
Your databases hold all the information about your environment, so it’s crucial that you don’t lose these valuable assets.
A good backup strategy for your SQL databases is therefore crucial: you need to make sure that you always have a recent copy at hand.
If you have a backup admin in your environment and are using a backup product like Data Protection Manager, it’s best to meet with the admin to check how the SQL backup schedule is configured. If he’s confident that he has your backups covered, there’s no need to back them up twice…
If not, you’ll need to perform the backups yourself.
First of all, a small word about the different options you have to back up a SQL database (this applies to all SQL databases and is not SCOM specific): a full backup, a differential backup and a transaction log backup.
A good mix of these three methods is a good strategy. I always take a full backup of the operational database once a week (data warehouse once a month), a differential backup once a day (for the data warehouse once a week) and transaction log backups every 2 hours.
Again, you can skip the transaction log backups, but then you risk losing a maximum of 23h59m of data.
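As a rough illustration of those three backup types, here is what they look like as plain T-SQL run through sqlcmd from PowerShell. The instance name and backup paths are examples only, and keep in mind that transaction log backups are only possible when the database uses the Full recovery model.

$instance = "SQLSERVER01"

# Weekly full backup of the OperationsManager database
& sqlcmd -S $instance -E -Q "BACKUP DATABASE [OperationsManager] TO DISK = N'D:\SQLBackups\OperationsManager_Full.bak' WITH INIT, CHECKSUM"

# Daily differential backup
& sqlcmd -S $instance -E -Q "BACKUP DATABASE [OperationsManager] TO DISK = N'D:\SQLBackups\OperationsManager_Diff.bak' WITH DIFFERENTIAL, INIT, CHECKSUM"

# Transaction log backup every 2 hours (requires the Full recovery model)
& sqlcmd -S $instance -E -Q "BACKUP LOG [OperationsManager] TO DISK = N'D:\SQLBackups\OperationsManager_Log.trn' WITH INIT, CHECKSUM"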
So let’s get the backups up and running:
We’re going to create the full backup schedule for the OperationsManager database in this example:
Connect to the database instance.
Expand the tree and go to the “OperationsManager” database > right click > Tasks > Back Up…
Choose Options in the left pane and set:
Verify backup when finished
When all the settings are correctly configured, click the Script button at the top of the page and choose “Script Action to Job”.
This generates a SQL Server Agent job which you can schedule so the backup fires when needed.
Name the job: In this case “Back up Database – OperationsManager_weekly”
Choose Schedules in the left pane and select New at the bottom:
Name the schedule and define the frequency. The job will then be scheduled by the SQL Server Agent.
When your schedule is defined, click OK and the job is created.
You can check this job in the SQL Server Agent under the tree Jobs…
This takes care of the weekly full backup of the OperationsManager database.
You need to repeat the same steps to create the backup schedules for your other databases.
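If you’d rather script this than click through the wizard for every database, a small sketch along these lines could do the trick. The instance name, paths and database list are assumptions, so check which database names are actually used in your environment.

$instance  = "SQLSERVER01"
$backupDir = "D:\SQLBackups"

# Typical SCOM 2007 database names - adjust to your environment.
$databases = "OperationsManager", "OperationsManagerDW"

foreach ($db in $databases)
{
    # Build a date-stamped file name and run a full backup for each database.
    $file = Join-Path $backupDir ("{0}_Full_{1}.bak" -f $db, (Get-Date -Format "yyyyMMdd"))
    & sqlcmd -S $instance -E -Q "BACKUP DATABASE [$db] TO DISK = N'$file' WITH INIT, CHECKSUM"
}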
The funny thing is that most admins only think of backups right after they’ve had a major crash and there were no backups available.
Most admins think backups are a hassle: they take some, but lose interest in the long term. When disaster strikes they are missing a vital piece needed to restore their environment, have an outdated backup, or even worse… no backup at all.
In this series of blog posts I’ll go over the different aspects of backing up your SCOM 2007 environment, to make sure that when Murphy picks you, you’re prepared…
One of my favorite cartoons to illustrate backups…
So let’s get started and get you prepared when disaster strikes.
Which components do you need for a successful restore of your environment? In short: the SCOM databases, your unsealed management packs, the RMS encryption key and the IIS configuration.
This blog post is part of a series on how to back up your SCOM environment.
This series will be divided into the categories shown above, and each part will be linked back to this post.
Today I had to explain to a customer how to target a rule or monitor at a specific computer group.
This is not a very intuitive process, and if you are used to working with MOM 2005, the process is different and can have big implications for the behavior of the rule or monitor you’ve created.
This is the only correct way to target a rule or monitor at a select group of servers.
Open your console, go to the Authoring pane and navigate to Rules. Right click > Create a new rule…
In the “Create Rule Wizard” select the desired rule. In this example I’m going to create an Event Based rule in the NT Event Log (Alert).
CAUTION: make sure to change the destination management pack to a custom management pack and NOT the default management pack.
Give the rule a name and click the “Select” button just behind Rule Target:
Here you need to target a class that you are certain all the servers you want to target are part of. In this case I chose “Windows Server”, but if, for example, you are sure they are all SQL servers, you can target the “SQL Server” class.
When you have selected the appropriate class, hit OK, but do not hit Next on this page yet.
Make sure the “Rule is enabled” tick box is off!
Now choose the event log to target your rule at. In our case it’s the Application log.
Next, the filter. In this example I’m searching for Event ID 150 created by the source “Eventcreate”.
Next thing is to specify the information that will be generated by the alert:
Now click create.
So far the rule has been created but is disabled. The next thing we need to do is create the group containing the specific set of servers that need to be targeted. In the Authoring pane choose “Groups” > right click > choose “Create a new Group…”
Choose a name for the group and again CHANGE the destination management pack so it is not the default management pack.
NOTE: Choose the same management pack you want to create your override in later on. It’s not possible to reference one unsealed management pack from another unsealed management pack, so either use the same management pack for both your group and your override, or seal the management pack your group is created in.
The next option is to specify the explicit group members.
There are actually 2 approaches to populating the group (which can be combined).
The first one is to specify the explicit members of the group. They will always be included in the group, no matter what criteria you specify later on. The disadvantage is that if you install a new server which needs to be targeted, you have to manually include it here.
The second approach to populate your group is dynamic inclusion rules. These rules use a set of conditions to add servers, for example all servers that are SQL servers based on their class, or all servers whose name starts with “SERVER0”.
You can also specify servers to be included in this group which reside in another group.
Finally, you can specifically exclude objects from being included in the group:
When you are confident you have included all the servers in the group, click Create.
At this point go back to the Authoring pane > Rules > search for your newly created rule.
In this example you can see our newly created rule in disabled state:
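You can also confirm this from the Operations Manager Command Shell; the display name below is just a placeholder for whatever you named your rule:

# List the rule and whether it is enabled (Get-Rule can be slow on large environments).
Get-Rule | Where-Object {$_.DisplayName -eq "My custom event rule"} | Select-Object DisplayName, Enabled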
Right click the rule and choose Overrides > Override the Rule > For a Group…
Now choose the group we created earlier on:
In the override parameter locate the “Enabled” parameter and tick the box in the “Override” column. In the Override Value choose “True” , click Apply and OK.
At this point the rule we have created is enabled only for the servers you’ve added to the computer group and not for all the other servers. This is in fact a totally different approach from the way of working in MOM 2005.
This is because computer groups (the class of objects that are computer groups) only exist on the RMS. If you target a rule directly at a computer group, it will try to collect info from the RMS instead of from the computers you intended.
So now that you should have a clear view of the assessment and of what people expect, you can start writing your design document.
I’ll point out how I usually write my design docs. You can use these guidelines or create your own, totally different, layout and structure. Feel free to do so.
First of all, you are writing a design doc for people who are not familiar with the product. You already have some insight into the technology, but be aware that most managers do not, so you have to educate a little bit as well.
Therefore it’s a good idea to briefly explain each component of your SCOM structure before pointing out your decision concerning that component.
This is a brief overview of the framework I use for my design docs. Again, it’s my framework; feel free to use it or alter it as you please:
These are in general the 7 chapters you need to cover. Let’s start with the first one:
This chapter is an easy one. Here you formulate what you learned during the assessment phase. It’s best to sum up where you want to be when the project finishes. Don’t go into much detail here yet; there’s plenty of room for that later on.
Describe briefly the purpose of the design document. Again, don’t go into much detail here yet.
Again, you can use your notes from the assessment meetings to summarize what the current situation is and why the decision was made to switch to SCOM. Here you can already make a small high-level comparison between the current system and the new SCOM environment.
Include maps of the current topology of the network environment and/or the old monitoring system.
4. Explanation of the Operations Environment and different components.
Here the hard work begins. Luckily you only have to do this once: you can reuse this section later on, because the explanation of the components will not change, only the design decisions will.
Everybody who has worked with SCOM remembers the first time they opened the console. It can be overwhelming, and you’ve only just gotten past the experience of designing and installing the SCOM system.
So you lean back and ask yourself… Where do we go from here? Where to start? Where to find the info needed?… So many questions, and so many answers to be found online, posted by user groups, team blogs, white papers,…
Overload…
In the next series of blog posts I’ll try to set up a step-by-step guide to get things going to a level where you can already showcase the environment and fine-tune further. I’ve been working with SCOM 2007 for 2 years, so the memory of starting out is still fresh, but it fades fast once you dig deeper into the product.
First things first: the different phases of a SCOM project and the pitfalls they bring along:
These are subject to change of course, but I always keep more or less to these six phases. I call it my SCOM framework.
This series will be based on an install of a SCOM 2007 R2 environment.
I’ll walk through the different phases of the process. If you have any suggestions down the road, please do not hesitate to leave a reply or contact me via the contact info on the front page.
Recently I got a mail from a user stating that he’s not receiving his reports via mail anymore. They were created way back, and normally these reports fall into my “set it and forget it” category…
When I checked the scheduled reports pane, I instantly noticed that all the reports were showing an error, as shown below:
The error message “The Subscription Contains parameter values that are not valid” appears in the status field.
During my search on the web, the most common solution was to recreate the report, which I did for one of them. But since there are about 20 of these reports, it would be a lot of work to recreate them all, with the risk that they break again without knowing when or why.
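To get an overview of how many subscriptions are actually affected, you can query the ReportServer database directly. This is a rough sketch that assumes the default ReportServer database name and schema; the server name is an example.

# List all report subscriptions with their last known status.
$query = "SELECT c.Name AS ReportName, s.LastStatus, s.LastRunTime FROM dbo.Subscriptions s JOIN dbo.Catalog c ON c.ItemID = s.Report_OID ORDER BY s.LastRunTime DESC"
& sqlcmd -S "SRSSERVER01" -E -d "ReportServer" -Q $query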
So the next step in my troubleshooting was to see whether I could fill in the missing parameters in the report, which resides in a custom management pack holding all these special reports.
When I tried to run the report I noticed the following: Data Aggregation and Histogram are greyed out, and it’s impossible to change them.
When I tried to run the report the following error message came up:
So there is an issue with the ‘Data Aggregation’ parameter. There’s no way to troubleshoot this any further in the SCOM environment, so we’ll have to dig deeper and turn our attention to the underlying SQL Server Reporting Services (SRS) install.
Connect to the SRS server and open SQL Server Management Studio.
Note: If you’re not sure where your SRS install resides, navigate to SCOM console > Administration > Reporting. The reporting server URL is listed there, so you can retrieve the server name / alias from it.
Make sure you select “Reporting Services” in the Server Type and select the server name you’ve retrieved from your console.
Navigate to Home > “Your management pack” > reports > Subscriptions.
In this example we’re troubleshooting the “PROD3_IOReport”.
Right click and choose view report.
The web browser opens and generates the report. However, in this case the following error shows up:
Didn’t we have an issue with “Data Aggregation”? The error above shows we actually have an issue with our “ManagementGroupId”.
Let’s take a look at the report properties to find out.
Right click the report and choose Properties.
The familiar SQL properties page pops up.
Behind “ManagementGroupID” (the sixth item in the screenshot above) it’s indicated that there are multiple values… We only have one management group, so why would there be multiple?
If you open the value, you get a drop-down box with the two IDs listed.
So which one is the correct one…
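One way to double-check is to ask the OperationsManager database itself which management group ID it belongs to. This is a hedged sketch that assumes a default SCOM 2007 install where the dbo.__MOMManagementGroupInfo__ view is available; the server name is an example.

# Return the name and ID of the management group stored in the OperationsManager database.
& sqlcmd -S "SQLSERVER01" -E -d "OperationsManager" -Q "SELECT ManagementGroupName, ManagementGroupId FROM dbo.__MOMManagementGroupInfo__"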
I opened a newly created report in the same management pack (the one I recreated to solve the issue with the first report), and there only one ID is listed:
This report works with all its parameters, so this ID is the correct one for our management group.
The next step is to delete the “wrong” ID from the parameters of the broken report and click OK:
Now we go back to our SCOM console and check the report once more.
Open the report and now it’s possible to check the Data Aggregation and Histogram again.
After clicking “run” the report is generated successfully.
So all we need to do is change the parameters in our scheduled report.
Navigate back to the scheduled reports list, right click the report and choose edit.
Check the parameters and fill in the correct Data Aggregation / Histogram settings (and check the other settings as well while you’re at it).
Click finish and check back at the scheduled report view.
The report status has gone from the error to “Ready”, and the report will be processed when the scheduled time arrives…
In this particular case, the issue apparently arose when agents were temporarily multi-homed to a test environment, and that test environment was deleted afterwards.
Although this was a mistake on our side, I posted this blog post to illustrate that the error message in SCOM did not point at the real problem, which was hidden in the SRS installation. This threw me off while troubleshooting, because I was focusing on the wrong error, and it cost me a lot of valuable troubleshooting time.
Hopefully my write-up saves you some time in troubleshooting this issue.