SCUG Belgium is holding its first offsite event after the summer.
Join us on Thursday 29 September 2011 to discuss the brand new version of SCORCH 2012 (the product formerly known as Opalis).
This event will consist of two one-hour sessions:
Session 1: System Center Orchestrator 2012 overview will go over the new version of SCORCH and how it can help you automate tasks and create workflows. This session is given by Kurt Van Hoecke, a member of the SCUG and an SCSM specialist. You can find his blog here: http://scug.be/blogs/scsm/default.aspx
Session 2: System Center Orchestrator 2012 migration plan will give you an overview of what to consider when migrating from a previous version to the new one. This session is given by Christophe Keyaert, a member of the SCUG who specializes in SCORCH / Opalis. You can find his blog here: http://scug.be/blogs/christopher/default.aspx
Don't miss this great opportunity to see this new release in action, and sign up here: http://scorch.eventbrite.com/
Update: We've added a Live Meeting as well, so if you can't make it to Belgium, make sure to follow along remotely!
See you there!
SCOM is a great product, but from time to time you need a custom-built tool or script to do just that one thing, or change just that bit that's not possible in the SCOM console.
I'm personally a huge fan of the PowerShell cmdlets supplied with SCOM. For most tasks (whether automating or extending SCOM) they do the trick quickly and easily.
From time to time a tool passes by on the world wide web that fills a gap and makes our lives as SCOM admins easier.
Yesterday another of these fine tools emerged: http://systemcentercentral.com/forums/tabid/60/indexid/88501/default.aspx?tag=
Note: You need to register to download the tool.
This is the first version of a nice tool that counts the instances per management group, which can be helpful when troubleshooting your environment. The PowerShell script posted in the community a while back sometimes took 3 hours to complete the task, while this nice .NET program takes minutes…
You need .NET Framework 4 to run the tool.
Keep an eye on the topic, because I'm sure it will progress in the coming days, as the authors mentioned in the topic itself.
I came across an excellent list, written by Sonda (an MPFE in the UK), of all the resources you'll need to get up to speed with SCOM 2007 R2.
There's plenty of info for the absolute rookie and the novice alike.
You can find the exhaustive list here:
If you are looking for a similar list of links to resources for Service Manager, look no further. Kurt Van Hoecke created a nice list of all the resources you'll need to get you going.
Happy reading!
Yesterday Microsoft released Cumulative Update 5 (CU5) for SCOM 2007 R2.
This new update contains some additional fixes for Operations Manager 2007 R2, and support is added for Red Hat Enterprise Linux 6.
You can download the CU5 package (948.0 MB) here: http://www.microsoft.com/download/en/details.aspx?id=26938
The KB2495674 article apparently isn't online yet, but it should become available here: http://support.microsoft.com/kb/2495674
For instructions on how to install a CU package in general (the blog is written for CU4, but the best practices apply to CU5 as well), check this: http://blogs.technet.com/b/kevinholman/archive/2011/02/01/opsmgr-2007-r2-cu4-rollup-hotfix-ships-and-my-experience-installing-it.aspx
This blog post is part of a series how to backup your SCOM environment.
You can find the other parts here:
Another part of backing up your environment, and thus making sure all the data is available to restore it, is backing up the Reporting Services database, which basically contains all your reports.
The standard reports can easily be recreated by reimporting the management packs, but any custom reports you made will be lost if you do not have a backup.
This process consists of 4 steps:
Let’s get started!
The two databases to back up are ReportServer and ReportServerTempDB. Although it's not absolutely necessary to back up ReportServerTempDB to restore your environment, it will definitely save you some time in the process. If you lose ReportServerTempDB you'll have to recreate it… So while you're backing up anyway, take a backup of ReportServerTempDB as well.
You can use any method allowed by SQL Server to back up these databases, whether it's System Center Data Protection Manager, third-party software, or the built-in SQL backup process.
I'll be using the built-in SQL backup:
Open Microsoft SQL Server Management Studio and browse to your server / database:
Right-click your reporting database and choose Tasks > Back Up…
Leave the backup type as Full, change the name if you like (otherwise use the default), and check the location of the file.
Caution: make sure you choose a file location that is included in your normal day-to-day file backups, so you still have it in your backup system if the server is completely lost.
If all goes well you'll get the message that your backup was successful.
Now repeat the steps above for ReportServerTempDB and save it in the same location as your ReportServer backup.
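The GUI steps above can also be scripted. Here's a minimal sketch using sqlcmd, assuming a default local SQL instance and an existing C:\Backups folder (both are examples, so adjust them to your environment):

```powershell
# Back up both Reporting Services databases to .bak files.
# Assumptions: default local SQL instance (".") and an existing
# C:\Backups folder covered by your file backup system.
$databases = @("ReportServer", "ReportServerTempDB")
$stamp = Get-Date -Format "yyyyMMdd"
foreach ($db in $databases)
{
    $file = "C:\Backups\" + $db + "_" + $stamp + ".bak"
    sqlcmd -S "." -Q "BACKUP DATABASE [$db] TO DISK = N'$file' WITH INIT"
}
```

Scheduling this alongside your other backups gives you the same result as the GUI steps, without the clicking.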
This encryption key is used to encrypt sensitive information in the database and ensure the safety of the data in it. You normally only have to save this key once, as there is a one-to-one relationship between the database and the symmetric key.
The key needs to be restored in the following cases:
Open the Reporting Services Configuration Manager by choosing Start > All Programs > Microsoft SQL Server 'version' > Configuration Tools > Reporting Services Configuration Manager.
A dialog box will appear to check the server name and the Report Server instance:
If they are correct, click Connect.
On the next page choose Encryption Keys and, in the right pane, click the Backup button.
Choose the file location and name by clicking the … button.
Fill in a password of your choice. This password is used to encrypt the file, so make sure it's one you'll remember: there's no way to restore the key without it, and no way to reset the password on the exported SNK file.
If all goes well, the key has been backed up and you receive the "Creating Encryption Key Backup" successful message at the bottom.
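If you prefer the command line, the same backup can be made with the rskeymgmt utility that ships with Reporting Services. A sketch, where the file path and password are example placeholders:

```powershell
# Extract (-e) the symmetric key to a password-protected .snk file.
# The path and password below are placeholders - use your own.
rskeymgmt.exe -e -f "C:\Backups\rsencryptionkey.snk" -p "YourStrongPassword"
```

Handy if you want to include the key backup in the same scheduled job as your database backups.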
Reporting Services uses several files to store its application settings. It's very important to have these config files handy when disaster strikes, because they contain all your settings and customizations.
Best practice is to take a backup of these files when you install the server, when you deploy custom extensions, or when you run a full backup of your environment for DRP reasons.
The following files must be included in a backup location that is covered by your file backup system:
Rssrvpolicy.config
Rsmgrpolicy.config
ReportingServicesService.exe.config
Web.config for both the Report Server and Report Manager ASP.NET applications
Machine.config for ASP.NET
Backup the files that you create and maintain in Report Designer and Model Designer. These include report definition (.rdl) files, report model (.smdl) files, shared data source (.rds) files, data view (.dv) files, data source (.ds) files, report server project (.rptproj) files, and report solution (.sln) files.
Remember to back up any script files (.rss) that you created for administration or deployment tasks.
Verify that you have a backup copy of any custom extensions and custom assemblies you are using.
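Copying the config files listed above can be scripted as well. A sketch assuming a SQL Server 2008 R2 default instance; the install path varies per version and instance, so adjust it to yours:

```powershell
# Copy the Reporting Services config files to a backup folder.
# The instance path is an example for SQL 2008 R2 - adjust to yours.
$rsRoot = "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services"
$dest   = "C:\Backups\RSConfig"
New-Item -ItemType Directory -Path $dest -Force | Out-Null
$files = @("ReportServer\rssrvpolicy.config",
           "ReportServer\rsmgrpolicy.config",
           "ReportServer\bin\ReportingServicesService.exe.config",
           "ReportServer\web.config",
           "ReportManager\web.config")
foreach ($f in $files)
{
    Copy-Item -Path (Join-Path $rsRoot $f) -Destination $dest
}
```

Point $dest at a folder that your file backup system already covers, so the copies end up off the server automatically.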
This was the last blog post in the series "How to backup your SCOM environment". If you follow these guidelines, you'll have a pretty good chance of recovering from a disaster with as little downtime and data loss as possible.
In the next series I'll be posting how to recover it all using the backups we took, so stay tuned. As usual, if you have remarks or feedback, you can reach me on Facebook / Twitter.
This blog post is part of a series how to backup your SCOM environment.
You can find the other parts here:
After backing up all the other necessary bits of our environment in the previous blog posts, just a few bits remain to make sure we can successfully restore our environment after a disaster.
This can save you a lot of time and can even be useful when you just have to pull up a report on what actually gets monitored in the environment.
I've based my script on a nice blog post by Kristopher Bash, which can be found here:
http://operatingquadrant.com/2009/08/19/scom-automating-management-pack-documentation/
I've adapted the script so it's in line with the script I used in this post: How to backup your unsealed management packs
You could combine the two into one script, but for now I'll leave them separate, just in case you only need to document the environment instead of backing up and documenting everything.
The script: open the pictures to read it, or download the script here: List mp script download location
The markings in yellow need to be modified to your own liking and environment.
Variables to change in the next section:
Note that in the next section the mailing capability is completely commented out. If you would like to send out a mail with the result of the script, you can activate this feature by removing the '#' in front of each line.
Variables to change in the next section:
Schedule this with a scheduled task in Windows on your RMS and you're ready to go.
Tip: If you need both the documentation and the backup, you can combine this script with the script featured in this post: How to backup your unsealed management packs
Hmmm, here in Belgium our pools look like this these days:
So maybe it's a good chance to dive into the "Virtual summer" with Microsoft!
If you dive in here http://technet.microsoft.com/nl-be/ff793346?ocid=ban-n-be-loc—sumitprovirt1tnug you can find all the necessary goodies on virtualization with Microsoft to keep you occupied inside while you enjoy the nice view of your pool in the rain.
There are tracks for:
So for every weather type there's something in there. Get ready for the next virtualization wave and read up on all the different techniques and components used to get you going.
http://www.microsoftvirtualacademy.com
When you are done browsing through the vast amount of info, do not hesitate to put your knowledge to the test and visit the Microsoft Virtual Academy, another great resource on the various virtualization topics. Here you'll find different tracks on:
First check out the training material and then take the test. The tracks are nicely built up and you can take them at your own pace.
They give you a nice overview of the different techniques and provide a good mix of in-depth knowledge and product and feature overviews.
Make sure to check out the virtualization tracks. They also hold a lot of info on the upcoming new version, SCVMM 2012.
Each track also provides you with the materials and links to related eval versions of the software.
You see, enough to keep you busy when things aren't going as planned weather-wise!
A new milestone in the development of System Center Operations Manager 2012 (SCOM 2012) today: the release of the beta to the public.
More info on the team blog here:
A small portion of the SCOM 2012 FAQ which is something most people are very curious about:
“We’ve made significant investments to help our customers build more comprehensive monitoring for their private cloud environments, while integrating their existing datacenter investments.
Here’s the direct download link:
http://www.microsoft.com/download/en/details.aspx?id=26804
Technet info link:
http://technet.microsoft.com/nl-be/library/hh205987(en-us).aspx
The final release is still planned for the first half of 2012, but you can already evaluate it now.
Caution:
As this is beta software it’s not supported to run this in a production environment.
We’ll be blogging more on the install process and first findings soon.
Just recently, in my never-ending quest for information, I stumbled upon a great blog that basically gathers all the different System Center product team blogs and feeds them into one place.
It's named System Center Unlimited and is maintained by J.C. Hornbeck. All the posts from the different product team blogs are fed into this ever-growing and very active blog. If you're looking for one place to get a full overview of the System Center products, this is the place to go…
However…
If you're looking for a product-specific System Center blog, make sure to check out his extensive list of all the different product team blog RSS feeds that he monitors:
I've included the links to the different product team blogs below:
Name | URL |
Application Virtualization | http://blogs.technet.com/b/appv/ |
Avicode | http://blogs.technet.com/b/Avicode/ |
Configuration Manager 2007 | http://blogs.technet.com/b/configurationmgr/ |
Data Protection Manager | http://blogs.technet.com/b/dpm/ |
Mobile Device Manager | http://blogs.technet.com/b/mdm/ |
MS Enterprise Desktop Virtualization | http://blogs.technet.com/b/MEDV/ |
Out of Band management | http://blogs.technet.com/b/oob/ |
Opalis/ Orchestrator | http://blogs.technet.com/b/orchestrator/ |
Operations Manager | http://blogs.technet.com/b/operationsmgr/ |
Service Manager | http://blogs.technet.com/b/servicemanager/ |
System Center Essentials | http://blogs.technet.com/b/systemcenteressentials/ |
System Center Virtual Machine Manager | http://blogs.technet.com/b/scvmm/ |
Windows Software Update Services | http://blogs.technet.com/b/appv/sus.aspx |
Server Application Virtualization | http://blogs.technet.com/b/serverappv/ |
The System Center Teamblog | http://blogs.technet.com/systemcenter/ |
If anything is missing, please don't hesitate to drop me a line.
Hope it helps you find your way through the System Center universe!
As part of my series on how to back up your SCOM environment, I've created a backup strategy for my unsealed management packs.
The setup I chose is a PowerShell script with error handling, run by the Task Scheduler on the RMS and monitored by a management pack in SCOM.
The advantages of this setup are:
The PowerShell script I used is based on the UnsealedMPbackup management pack which is posted here: https://skydrive.live.com/?cid=397bb61b75cc76c5&id=397BB61B75CC76C5%21217#
Although this is an excellent script, I modified it to include error handling. If you look at the script, there's also a mailer included, but it's commented out for now. If you'd like to use this as a standalone script, without SCOM monitoring the process, you can easily switch on the email function and be alerted when things go wrong.
This script will:
The parameters you will need to fill in are marked in yellow.
You can download the PowerShell script here.
In this section we are actually defining the location and exporting the management packs:
Define the backup location:
Export the unsealed management packs. With this command we'll export all the unsealed management packs to the folder. If you (for one reason or another) want to back up all your management packs, you can change this code to:
$all_mps = Get-ManagementPack
foreach ($mp in $all_mps)
{
    Export-ManagementPack -ManagementPack $mp -Path "C:\backups"
}
Thanks to Maarten Goet for providing this example.
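For comparison, the unsealed-only export that the script performs by default boils down to something like this (the folder path is an example):

```powershell
# Export only the unsealed management packs, i.e. the ones that
# hold your own overrides and customizations.
$unsealed = Get-ManagementPack | Where-Object { $_.Sealed -eq $false }
foreach ($mp in $unsealed)
{
    Export-ManagementPack -ManagementPack $mp -Path "C:\backups"
}
```

The only difference with the all-packs variant above is the filter on the Sealed property.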
Error handling
In this section the script writes an event to the Operations Manager event log, indicating whether it was successful (ID 910) or unsuccessful (ID 911). This can be used to monitor the process.
Feel free to change the IDs as you please, but don't forget to modify the supplied management pack later on accordingly.
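The pattern described above looks roughly like this; note that the event source name here is an example and must already be registered in the Operations Manager log:

```powershell
# Log event 910 on success and 911 on failure, so a SCOM monitor
# can track the backup. "Health Service Script" is an example source.
try
{
    # ... the management pack export goes here ...
    Write-EventLog -LogName "Operations Manager" -Source "Health Service Script" `
        -EventId 910 -EntryType Information -Message "Unsealed MP backup succeeded."
}
catch
{
    Write-EventLog -LogName "Operations Manager" -Source "Health Service Script" `
        -EventId 911 -EntryType Error -Message "Unsealed MP backup failed: $_"
}
```

Whatever IDs you pick, keep them in sync with the monitor in the management pack.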
The mailing section:
If you choose not to monitor this process with SCOM you can activate the mailing section that warns you about the outcome of the process.
Make sure to change the highlighted sections.
As said before, I'm scheduling this script on the RMS using the built-in Windows Task Scheduler.
The command to schedule is (if you save the file in c:\scripts):
powershell -command "& 'c:\scripts\backup_mp.ps1'"
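Creating that scheduled task can itself be scripted with schtasks; a sketch where the task name and start time are examples:

```powershell
# Register a daily task on the RMS that runs the backup script at 02:00.
# Task name and start time are examples - pick your own.
schtasks /Create /TN "SCOM Unsealed MP Backup" /SC DAILY /ST 02:00 /RU SYSTEM /TR "powershell -command c:\scripts\backup_mp.ps1"
```

Running it as SYSTEM avoids storing credentials, provided that account has the SCOM permissions the script needs.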
You can easily monitor the process with SCOM and set up notifications whenever there's an error. I've created a small management pack that contains a monitor to check the status of the backup.
This monitor is healthy by default and turns critical when event 911 is logged. However, when the next backup is successful and event 910 is logged, it returns to healthy. I don't mind missing one backup.
There's an automated recovery task included as well, to restart the backup when it failed.
You can download this small MP here.
Don't forget to set up your notifications and you're all done.