Tuesday, 1 October 2013
Had a strange one today.
The customer had previously tried to set up the connection between VMM and SCOM, but made a mistake somewhere along the line and then uninstalled the SCOM console without removing the connection, as they said the console was causing the VMM service to constantly crash.
Having not seen that behaviour before, and slightly doubting the claim, I re-installed the console and sure enough was prevented from accessing the VMM console because the service kept crashing.
As an added check, I tried running some PowerShell commands to confirm it wasn't just a GUI issue, only to be greeted by error messages complaining that the VMM service wasn't running or accessible.
So I uninstalled the console again, which gave me access back to VMM, and running Get-SCOpsMgrConnection showed me the broken connection. However, attempts to remove it via the console or PowerShell were both met by errors telling me I needed the SCOM console installed first in order to manage the connection. Ah... slight problem...
After checking everything I could think of (SPNs, SCPs, service accounts, etc.) and finding nothing that stood out (including nothing useful in the event logs), I thought I'd try a timing trick.
So I opened an SCVMM PowerShell window, kicked off the SCOM console install again and repeatedly spammed Remove-SCOpsMgrConnection -Force. Wouldn't you know it: after a few messages saying the SCOM console must be installed, just before the install completed, the command succeeded and removed the broken connection. Better still, the SCOM console installation completed and the VMM service didn't crash.
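For the curious, the "spam" was nothing cleverer than a loop; a rough sketch of the idea (run while the console installer is in flight):
# Keep retrying the removal while the SCOM console install runs;
# it only succeeds in the window where the console binaries are present
while (Get-SCOpsMgrConnection -ErrorAction SilentlyContinue) {
    Remove-SCOpsMgrConnection -Force -ErrorAction SilentlyContinue
    Start-Sleep -Seconds 2
}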
After recreating the connection everything remained stable, but even though the create connection job ran successfully, the following error was present in the connector details:
"Operations Manager discovery failed with error: Exception of type 'Microsoft.VirtualManager.EnterpriseManagement.Common.DiscoveryDataInvalidRelationshipSourceExceptionOM10' was thrown."
This is because the SCOM connection was created with PRO Tips enabled but without a SCOM monitoring agent deployed to the VMM server.
Easily fixed: untick the PRO and Maintenance Mode connection settings, deploy a SCOM agent to the VMM server and, once the agent is installed and reporting, re-enable the options.
Monday, 30 September 2013
Offloaded Data Transfer (ODX) in Windows Server 2012
I've been working this week on a nice Hyper-V cluster built on Dell R720 hosts, with a Dell Compellent array providing the storage.
One of the things I was looking forward to with this job was getting hands on with the ODX feature of the Compellent.
ODX (Offloaded Data Transfer) is a feature found on some of the newer storage arrays that helps with large file operations by (in simplified terms) keeping the transfer within the array, rather than passing the data out to the source server, on to the destination server and back to the array.
The first thing to do (assuming you know the hardware supports it) is to check that the OS and its software components support ODX.
This is a Windows Server 2012 and 2012 R2-only feature, so if you're on 2008 R2 it's time to upgrade.
From a PowerShell prompt, run the following command:
Fltmc instances
Take a note of the volume name of the drive, or in my case the CSV volume, that you want to check. Then run:
Fltmc instances -v <Volume Name>
e.g. Fltmc instances -v C:\ClusterStorage\Volume1
This will give you the filter names that you will need to check.
Run this command, replacing the <Filter Name> with those shown by the previous command.
Get-ItemProperty hklm:\system\currentcontrolset\services\<FilterName> -Name "SupportedFeatures"
So for my two filters of FsDepends and MpFilter, I ran the command against each.
The property that needs checking is "SupportedFeatures". If it has a value of 3 then ODX is supported and you're good to go. Anything else and you'll need to look into it further.
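To save running the registry query once per filter, a small loop covers them all; a minimal sketch using the two filter names from my environment:
# Check the SupportedFeatures value for each filter of interest
foreach ($filter in "FsDepends","MpFilter") {
    Get-ItemProperty "hklm:\system\currentcontrolset\services\$filter" -Name "SupportedFeatures" |
        Select-Object @{Name="Filter";Expression={$filter}}, SupportedFeatures
}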
Lastly, check if ODX is enabled on your server using this command:
Get-ItemProperty hklm:\system\currentcontrolset\control\filesystem -Name "FilterSupportedFeaturesMode"
If it returns a "FilterSupportedFeaturesMode" value other than 0, then ODX isn't enabled.
Run this to enable ODX:
Set-ItemProperty hklm:\system\currentcontrolset\control\filesystem -Name "FilterSupportedFeaturesMode" -Value 0 -Type DWord
Or this to disable ODX if needed:
Set-ItemProperty hklm:\system\currentcontrolset\control\filesystem -Name "FilterSupportedFeaturesMode" -Value 1 -Type DWord
To demonstrate to the client that ODX was indeed enabled and, more to the point, worth having, I modified the script Hans Vredevoort shows on his blog post discussing ODX testing between HP 3Par and Compellent arrays: http://www.hyper-v.nu/archives/hvredevoort/2013/07/notes-from-the-field-using-odx-with-hp-3par-storage-arrays/
I ran the script, which loops through creating 10 x 50GB and 10 x 475GB fixed disks with ODX enabled and then does the same with ODX disabled.
These were the timings from the test:
With ODX
12.6 seconds for 10 x 50GB VHDX files
84.2 seconds for 10 x 475GB VHDX files
96.8 seconds total for all VHDX files
Without ODX
1015.5 seconds (nearly 17 minutes) for 10 x 50GB VHDX files
8615.8 seconds (about 2.4 hours) for 9 x 475GB VHDX files (N.B. I ran out of disk space for the 10th)
9631 seconds (about 2.7 hours) total for all VHDX files
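For anyone wanting to reproduce the test, the timing boils down to wrapping the VHDX creation in Measure-Command. A minimal sketch of one pass, with an illustrative CSV path (the full script linked below does the real work):
# Time the creation of 10 fixed-size VHDX files (illustrative path and size)
$elapsed = Measure-Command {
    1..10 | ForEach-Object {
        New-VHD -Path "C:\ClusterStorage\Volume1\ODXTest_$_.vhdx" -Fixed -SizeBytes 50GB | Out-Null
    }
}
"{0:N1} seconds for 10 x 50GB VHDX files" -f $elapsed.TotalSeconds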
There is a MASSIVE difference in creation times!
ODX is a feature well worth having, in my opinion. What I really can't wait for is ODX support for SCVMM libraries in the SCVMM 2012 R2 release!
I've uploaded the ODX Test script to SkyDrive here: http://sdrv.ms/16QhZZE
Thursday, 26 September 2013
Using PowerShell CIM Sessions to Query Dell Hardware
I've been "playing" with some Dell hardware recently and as with everything I like to try and automate as many tasks as possible.
Dell have a really useful tool called Racadm which is a command line utility which you can call from a script to read and write various properties of Dell iDRAC and CMC (Chassis Management Controller).
However, since the latest iDRAC and CMC are built around WSMAN and DMTF standards, I prefer a more PowerShell only approach.
The key PowerShell command for querying is Get-CimInstance. Before we can use this command however we first need to establish a remote CIM Session to the hardware.
This is accomplished by using the New-CimSession and New-CimSessionOption cmdlets.
So...
Use some variables to store the IP, username and password for the iDRAC
$UserName="root"
$Password="calvin"
$DracIP="10.10.0.120"
Convert the username and password into a PS Credential
$SecurePass = ConvertTo-SecureString $Password -AsPlainText -Force
$DracCred = new-object -typename System.Management.Automation.PSCredential -argumentlist $UserName,$SecurePass
We can then create a new CimSessionOption object; for the Dell hardware, the below works nicely:
$cimop=New-CimSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck -Encoding Utf8 -UseSsl
Then, using the above variables and the new session options object, we can create a new CIM session to the iDRAC:
$Dracsession=New-CimSession -Authentication Basic -Credential $DracCred -ComputerName $DracIP -Port 443 -SessionOption $cimop -OperationTimeoutSec 10000000
Once we have the session established, we can use the Get-CimInstance cmdlet to query various properties by passing in a WSMAN/WinRM resource URI.
For example, if we just wanted to query the general BIOS properties, we could use the following URI: http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/root/dcim/DCIM_SystemView
That would form the following command (cmdlet - session - resourceuri):
Get-CimInstance -CimSession $Dracsession -ResourceUri "http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/root/dcim/DCIM_SystemView"
This returns the general system inventory. If you assign the object to a variable ($BIOSINFO = Get-CimInstance ...) you can then pull out specific items within scripts.
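For example, something along these lines; the specific property names (ServiceTag, Model) are illustrative and depend on the DCIM_SystemView output:
$BIOSINFO = Get-CimInstance -CimSession $Dracsession -ResourceUri "http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/root/dcim/DCIM_SystemView"
# Pull individual values from the returned object (property names illustrative)
$BIOSINFO.ServiceTag
$BIOSINFO.Model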
Again, you can do similar things with other hardware properties; for example, I can use the resource URI for getting the network card information from a server (http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/root/dcim/DCIM_NICView).
Drop this into a command:
$NICS=Get-CimInstance -CimSession $Dracsession -ResourceUri "http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/root/dcim/DCIM_NICView"
... and now we can get the MAC addresses of the various NICs:
$NICS[0].PermanentMACAddress
$NICS[1].PermanentMACAddress
...
$NICS[6].PermanentMACAddress
$NICS[7].PermanentMACAddress
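If you'd rather not index each NIC by hand, a loop over the collection does the same job. A quick sketch (FQDD is assumed here as the NIC identifier property in the DCIM_NICView output):
# List each NIC alongside its permanent MAC address
foreach ($nic in $NICS) {
    "{0} : {1}" -f $nic.FQDD, $nic.PermanentMACAddress
}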
Hmm... Useful for SCVMM Bare Metal deployment scripting maybe?
The only thing I struggled with in this very simple method of querying the hardware for info was finding the resource URI needed.
To help with this, the following bits of information from Dell are a godsend:
DCIM Profile Library
http://en.community.dell.com/techcenter/systems-management/w/wiki/1906.dcim-library-profile.aspx
WinRM WebServices for Lifecycle Controller
http://en.community.dell.com/techcenter/extras/m/white_papers/20066174.aspx
Next time I'll post about using PowerShell to set the values rather than just query them.
Saturday, 21 September 2013
Converting a WIM file to VHD on a UEFI system
I always use the excellent Convert-WindowsImage.ps1 script by Mike Kolitz to take the WIM files from the Windows media and convert them into bootable VHD files. It's the quickest and easiest way of creating VM templates in SCVMM.
The script can be found here: http://gallery.technet.microsoft.com/scriptcenter/Convert-WindowsImageps1-0fe23a8f/
However, I ran into a problem today with the script throwing an error: "Could not get the BootMgr object from the Virtual Disks BCDStore".
It turns out, from a couple of replies in the discussion thread on the TechNet Gallery listing, that this will generally happen when trying to run the script on a device that uses UEFI to boot, which I happened to be doing.
Thankfully the fix is relatively easy; you just need to modify the script slightly:
- Do a search in the script for $bcdBootArgs, which is usually first referenced at line 4055
- On the line a couple below (usually 4057), change "/s $Drive" to "/s $Drive /f ALL"
The /f switch is documented here: http://technet.microsoft.com/en-us/library/dd744347(v=WS.10).aspx
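For context, /f ALL tells BCDBoot to write boot files for both BIOS and UEFI firmware, so the call the script ends up making looks something like this (drive letters illustrative):
bcdboot W:\Windows /s V: /f ALL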
Save the script and you're good to go!
Monday, 24 June 2013
System Center 2012 R2 Preview - Download and Extract Script
Well, System Center 2012 R2 Preview is here, a day earlier than I expected.
Eskor Koneti posted a list of the direct download links to the preview bits here:
http://eskonr.com/2013/06/configmgr-sccm-2012-r2-preview-is-available-for-download/
So I thought I'd wrap them quickly into a PowerShell script that downloads and then extracts all the components ready for install.
I've commented out the DPM download and install as, for me, it wasn't downloading correctly (either manually or via the script), but feel free to try it.
The script has no error checking and I know it could be much smoother but, hey, it's not even 8am here in the UK, so what do you expect!
$dwnld = "E:\System_Center_2012_R2"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$SCCMurl = 'http://care.dlservice.microsoft.com/dl/download/evalx/sc2012/SC2012_R2_PREVIEW_SCCM_SCEP.exe'
$object.DownloadFile($SCCMurl, "$dwnld\SC2012_R2_PREVIEW_SCCM_SCEP.EXE")
$object = New-Object Net.WebClient
$SCCMurl = 'http://care.dlservice.microsoft.com/dl/download/evalx/sc2012/SC2012_R2_PREVIEW_SCOM.exe'
$object.DownloadFile($SCCMurl, "$dwnld\SC2012_R2_PREVIEW_SCOM.EXE")
$object = New-Object Net.WebClient
$SCCMurl = 'http://care.dlservice.microsoft.com/dl/download/evalx/sc2012/SC2012_R2_PREVIEW_SCVMM.exe'
$object.DownloadFile($SCCMurl, "$dwnld\SC2012_R2_PREVIEW_SCVMM.EXE")
$object = New-Object Net.WebClient
$SCCMurl = 'http://care.dlservice.microsoft.com/dl/download/evalx/sc2012/SC2012_R2_PREVIEW_SCSM.exe'
$object.DownloadFile($SCCMurl, "$dwnld\SC2012_R2_PREVIEW_SCSM.EXE")
$object = New-Object Net.WebClient
$SCCMurl = 'http://care.dlservice.microsoft.com/dl/download/evalx/sc2012/SC2012_R2_PREVIEW_SCO.exe'
$object.DownloadFile($SCCMurl, "$dwnld\SC2012_R2_PREVIEW_SCO.EXE")
$object = New-Object Net.WebClient
$SCCMurl = 'http://care.dlservice.microsoft.com/dl/download/evalx/sc2012/SC2012_R2_PREVIEW_SCAC.exe'
$object.DownloadFile($SCCMurl, "$dwnld\SC2012_R2_PREVIEW_SCAC.EXE")
#$object = New-Object Net.WebClient
# $SCCMurl = 'http://care.dlservice.microsoft.com/dl/download/evalx/sc2012/SC2012_R2_PREVIEW_SCDPM.exe'
# $object.DownloadFile($SCCMurl, "$dwnld\SC2012_R2_PREVIEW_SCDPM.EXE")
Start-Process -FilePath "$dwnld\SC2012_R2_PREVIEW_SCAC.EXE" -Wait -ArgumentList /DIR="$dwnld\SCAC", /VERYSILENT
#Start-Process -FilePath "$dwnld\SC2012_R2_PREVIEW_SCDPM.EXE" -Wait -ArgumentList /DIR="$dwnld\SCDPM", /VERYSILENT
Start-Process -FilePath "$dwnld\SC2012_R2_PREVIEW_SCO.EXE" -Wait -ArgumentList /DIR="$dwnld\SCO", /VERYSILENT
Start-Process -FilePath "$dwnld\SC2012_R2_PREVIEW_SCOM.EXE" -Wait -ArgumentList /DIR="$dwnld\SCOM", /VERYSILENT
Start-Process -FilePath "$dwnld\SC2012_R2_PREVIEW_SCSM.EXE" -Wait -ArgumentList /DIR="$dwnld\SCSM", /VERYSILENT
Start-Process -FilePath "$dwnld\SC2012_R2_PREVIEW_SCVMM.EXE" -Wait -ArgumentList /DIR="$dwnld\SCVMM", /VERYSILENT
Start-Process -FilePath "$dwnld\SC2012_R2_PREVIEW_SCCM_SCEP.EXE" -Wait -ArgumentList /Auto, "$dwnld\SCCM"
Wednesday, 12 June 2013
Seize FSMO roles in Server 2012
One of the beautiful things about a test lab is getting to try things you might not get the chance to do in a production environment. So when my main Domain Controller went pop the other day, rather than work on bringing it back online, I saw a good chance to test seizing the FSMO roles with PowerShell.
Previously the main way to seize the roles was using Ntdsutil in Server 2003 and 2008.
Since PowerShell is now my weapon of choice, I thought it would be useful to quickly document the method.
Move-ADDirectoryServerOperationMasterRole is the cmdlet used for this task. More information on it can be found here:
http://technet.microsoft.com/en-us/library/ee617229.aspx
You can use either the Role Name or Number to specify which role to move, this table shows the details:
Operation Master Role Name | Number
PDCEmulator | 0
RIDMaster | 1
InfrastructureMaster | 2
SchemaMaster | 3
DomainNamingMaster | 4
Use the -Identity parameter to specify the target Domain Controller and -OperationMasterRole to specify which roles to transfer. I've also used the -Force parameter as my current FSMO holder is offline.
I'll be moving all the roles to a target DC called TLDC02.
N.B. To move the SchemaMaster role you'll need to be a member of the Schema Admins group. My account was also a member of Enterprise Admins when I ran this.
- Logon to a working Domain Controller and launch an elevated PowerShell session.
- Type: Move-ADDirectoryServerOperationMasterRole -Identity TLDC02 -OperationMasterRole 0,1,2,3,4 -Force
- Either type Y on each role move prompt, or type A to accept all prompts
- After a while, all the roles should be successfully moved.
You can verify the new role holders with:
Get-ADForest DomainName | FT SchemaMaster,DomainNamingMaster
Get-ADDomain DomainName | FT PDCEmulator,RIDMaster,InfrastructureMaster
One thing to note, only seize the roles if you have no intention of bringing the original holding Domain Controller back online. Domains don't tend to like having two FSMO role holders...
Sunday, 26 May 2013
Migrate Knowledge Base Articles from Service Manager 2010 to 2012
To help ease the migration between Service Manager 2010 and 2012 (or even just from one management group to another!), I've created a script that will export all of the Knowledge Articles, including the rich text used for the Analyst and End User content.
You can find the script here on the TechNet Gallery:
http://gallery.technet.microsoft.com/Migrate-Knowledge-Base-15b81ab6
Download and extract the zip file and put the SCSMExportKB.ps1 file in a directory you have access to.
The script also relies on the SMLets module from CodePlex, found here, which also makes this independent of which version of Service Manager you're running.
After you've installed the SMLets, launch a PowerShell session (elevated as admin) and ensure that the execution of scripts is allowed by typing:
Set-ExecutionPolicy -ExecutionPolicy Unrestricted
Next, navigate to the folder with the script and run it with the parameter of where to export the KB Articles to. If you do not specify a path, it will default to exporting the KB Articles to the user's temp folder.
For example I would run this to export to my downloads folder:
.\SCSMExportKB.ps1 C:\Users\SBAdmin\Downloads
The script will then start running and you will see its progress as it exports any Analyst or End User content to RTF files, followed by the rest of the KB Article details.
Copy these exported files to the same location on the target server (or modify the CSV to point to a new location) and then use the KBImport.xml provided in the zip file, along with the Knowledge.csv created by the PowerShell script, to import them into the target Service Manager system using the CSV Import Wizard. N.B. Before you do the import, be sure to remove the first line of the CSV, which has the headers in it!
And that should be that, one set of exported and imported Knowledge Articles.
There is one limitation however...
In this current version only the out-of-box lists are supported. I'm working on the script to handle custom list values and will update the solution when it's automated. Until then you will need to find the enumeration IDs from your target site and replace the source IDs in the CSV file with the corresponding ones.
I'd also like to thank Anton Gritsenko (aka FreemanRU) for pointing me in the right direction for this script.
Wednesday, 22 May 2013
Service Manager - Exporting the Knowledge Base Analyst and End User Comments
How can I export my knowledge base articles from Service Manager?
I've heard this question asked a few times now, especially when it comes to doing a migration rather than an upgrade from 2010 to 2012.
While it's fairly straightforward to output things like the Article ID, Title and Category using PowerShell, getting the Analyst and End User comments out is slightly more tricky due to them being stored as binary data within the database (they're RTF files, basically).
While I may update this post to give a script that will export every last detail, I'm pushed for time at the moment, so I'll just post the most complicated part.
You can use the following PowerShell script to export the comments to individual RTF files for both End User and Analyst content.
$OutDir = "C:\KnowledgeFolder"
Get-SCSMClass -DisplayName "Knowledge Article" | Get-SCSMClassInstance | ForEach-Object {
$KBName=$_.ArticleId
If($_.EndUserContent -ne $null)
{
$br = new-object io.binaryreader $_.EndUserContent
$al = new-object collections.generic.list[byte]
while (($i = $br.Read()) -ne -1)
{
$al.Add($i)
}
Set-Content ("$OutDir\$KBName"+"EndUserContent.rtf") $al.ToArray() -enc byte
}
If($_.AnalystContent -ne $null)
{
$br = new-object io.binaryreader $_.AnalystContent
$al = new-object collections.generic.list[byte]
while (($i = $br.Read()) -ne -1)
{
$al.Add($I)
}
Set-Content ("$OutDir\$KBName"+"AnalystContent.rtf") $al.ToArray() -enc byte
}
}
I've only had a quick test of this within a 2012 environment, which has the native Get-SCSMClass and Get-SCSMClassInstance cmdlets, whereas 2010 doesn't.
However with the SCSM Cmdlets from CodePlex this script should be easily adaptable for the 2010 environment.
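A minimal sketch of what the 2010 variant might look like, assuming the SMLets module (which provides Get-SCSMObject in place of the native Get-SCSMClassInstance):
Import-Module SMLets
# SMLets equivalent of the class/instance lookup used above
$kbClass = Get-SCSMClass -Name System.Knowledge.Article$
$articles = Get-SCSMObject -Class $kbClass
$articles | ForEach-Object { $_.ArticleId }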
You could then use either PowerShell to import them back into a new environment, or use Anders' CSV import method shown on his post here.
Travis also has a useful post with the Enum GUIDs that you would need when importing via CSV here.
Some useful details for working with Knowledge Articles in PowerShell:
Class Details:
DisplayName - Knowledge Article
Name - System.Knowledge.Article
ManagementPackName - System.Knowledge.Library
Some of the properties available on each instance:
Abstract
AnalystContent
ArticleId
ArticleOwner
ArticleTemplate
ArticleType
AssetStatus
Category
Comments
CreatedBy
CreatedDate
DisplayName
EndUserContent
ExternalURL
ExternalURLSource
Keywords
Notes
ObjectStatus
PrimaryLocaleID
Status
Tag
Title
VendorArticleID
#Id
#Name
#Path
#FullName
#LastModified
#TimeAdded
#LastModifiedBy
EnterpriseManagementObject
RelationshipAliases
Tuesday, 14 May 2013
Problems Clustering Virtual Machines on Windows Server 2012 Hyper-V
I was re-building our lab environment at work the other week in preparation for our big Summit13 event (that, and the lab had been trashed over the last year...).
As part of the re-build I had decided to implement a couple of virtual machine clusters, one for a scale-out file server and one as a SQL cluster.
I'd deployed the virtual machines for the cluster nodes using Service Templates in SCVMM and, as part of that template, chosen to use an availability set to ensure the VMs were separated across hosts (a cluster doesn't provide much high availability if all the nodes reside on the same host when it fails!).
When I started to create the cluster, I ran straight into a problem, with Failover Cluster Manager reporting timeouts when creating the cluster.
Creating a single-node cluster worked fine, but it would again fail when trying to add in another node.
I happened to put one of the Hyper-V hosts into maintenance mode for something, which migrated all the VMs onto the same host, at which point creating the cluster worked flawlessly. Yay!
However, when the Hyper-V host came back out of maintenance mode and the availability sets kicked in during optimisation, forcing a VM node back onto a separate physical host, the clusters broke again. Not yay :(
So after some Googling/Binging about and a shout on Twitter (thanks @hvredevoort and @WorkingHardInIT), an issue with Broadcom NICs was brought to my attention and I came across this TechNet Forum post talking about the same issue.
Sophia_whx suggested trying to use Disable-NetAdapterChecksumOffload on the NICs to help with the issue.
So, first things first: use Get-NetAdapterChecksumOffload to see just what the configuration is. Sure enough, Checksum Offload was enabled for just about all services across the majority of the NICs.
I then ran Disable-NetAdapterChecksumOffload * -TcpIPv4 to turn it off for IPv4 traffic.
A reboot later, then the same performed on the second host, and whoa...
For some reason, the virtual switch really didn't like having that done to it.
I wish I had some screenshots, but I went into "get it fixed fast" mode.
Basically, the switch was showing as up via PowerShell, the NIC Teaming GUI was showing it down with all the bound adapters failed, and SCVMM had lost all configuration for the switch altogether.
Deleting the switch from SCVMM didn't delete it from the host, but brought it back to life on the host while leaving it missing in SCVMM. SCVMM then wouldn't redetect it or let me build it again as, apparently, it was still there.
I had to manually remove the team from a remote NIC Teaming GUI (I could have PowerShell'd it, I know!) and then recreate it via SCVMM.
Anyway... at first this looked to have fixed the in-guest clustering issues, but it only delayed the symptoms, i.e. it took longer to evict nodes and randomly brought them back online.
So next I tried disabling Checksum Offload for all services, being careful not to touch the Virtual Switch this time.
Rather than going adapter by adapter I used the following command:
Get-NetAdapter | Where-Object {$_.Name -notlike "Converged*"} | Disable-NetAdapterChecksumOffload
This disabled Checksum Offload for the various services on every adapter except my virtual switch.
After doing this on the other host and giving them a reboot, my clustered virtual machines appear to be nice and stable when split across physical hosts. Yay! Problem fixed.
Just as another side note about Broadcom adapters, there have also been reports of performance issues when the Virtual Machine Queue (VMQ) setting is enabled, despite it being a recommended setting.
A quick check of my hosts showed VMQ was enabled on the Broadcom adapters.
Another quick PowerShell line later and it wasn't:
Get-NetAdapterVmq -InterfaceDescription Broad* | Disable-NetAdapterVmq
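Should a later driver update resolve things, Enable-NetAdapterVmq is the counterpart cmdlet to turn it back on:
Get-NetAdapterVmq -InterfaceDescription Broad* | Enable-NetAdapterVmq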
Wednesday, 3 April 2013
Orchestrator & VMware vSphere – Quiesce Snapshots
While there is an official VMware vSphere Integration Pack for Orchestrator available from Microsoft (here), I was working for a customer the other week who had an extra need that the Create Snapshot activity didn't seem to provide.
As part of their SOP (Standard Operating Procedure), the customer has the requirement that all snapshots are quiesced for both memory and file operations during a snapshot activity.
While the vSphere IP activity allows you to set the option for capturing the memory state with the snapshot, there is no reference as to whether it quiesces the file system by default or not.
So the quickest way to achieve this: fall back to PowerShell and script it.
VMware has a PowerShell module available for vSphere known as PowerCLI.
Once I had downloaded and installed this on both the Runbook Designer workstations and the SCORCH server, I started working on the script.
The script:
$VC = "<Insert vCenter Server>"
$VMName = "<Insert VM Name>"
$Snapshot = "<Insert VM Name>"
if(-not (Get-PSSnapin VMware.VimAutomation.Core -ErrorAction SilentlyContinue))
{
Add-PSSnapin VMware.VimAutomation.Core
}
Set-PowerCLIConfiguration -DefaultVIServerMode Single -InvalidCertificateAction Ignore -confirm:$false
Connect-VIServer -Server $VC -ErrorAction SilentlyContinue
Get-VM $VMName | New-Snapshot -Name $Snapshot -Memory:$true -Quiesce:$true -Description "Snapshot for protection while running automated process" -Confirm:$false
Basically, this script takes the input of a virtual machine name and a name for the snapshot, then performs a snapshot of the VM with the options set to quiesce both the memory and the file system.
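If you're testing the script outside a runbook, the placeholders at the top simply get real values; the names below are purely illustrative (in Orchestrator they would come from published data):
# Illustrative values only
$VC = "vcenter01.lab.local"
$VMName = "SQL01"
$Snapshot = "Pre-change protection"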
Saturday, 12 January 2013
SCVMM - Delete IP Pool
Quick PowerShell snippet for handy reference for when I'm playing in the lab and need to delete an IP pool:
## Display all IPs and the VMs they are assigned to
$ippool = Get-SCStaticIPAddressPool "Internal Network"
Get-SCIPAddress -StaticIPAddressPool $ippool | ft -property Address,Description,State
## Return all the IPs for that pool and revoke them, ready to remove the pool
$ip = Get-SCIPAddress -StaticIPAddressPool $ippool
$ip | Revoke-SCIPAddress
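The snippet revokes the assigned addresses but stops short of deleting the pool itself; with the addresses revoked, a final call to Remove-SCStaticIPAddressPool should finish the job:
## With all addresses revoked, remove the pool
$ippool | Remove-SCStaticIPAddressPool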
Friday, 4 January 2013
System Center 2012 SP1 Setup Software Prerequisites
Since I'm updating my lab environment (more like a wipe and reload) to Service Pack 1 for System Center 2012, I thought I'd quickly take the time (mainly for my own reference) to document the prerequisites needed for deploying the various components.
This is all based on installing it on Windows Server 2012 as the Server OS and SQL Server 2012 in a clustered, named instance setup.
This post is focused on software pre-reqs, not design or account pre-reqs. See my other post here about service accounts and System Center 2012. This post is also not a step by step install guide.
Virtual Machine Manager (SCVMM)
Very simple for SCVMM: before the SCVMM installation, install the Windows ADK and only choose:
- Deployment Tools
- Windows Preinstallation Environment (Windows PE)
This PowerShell snippet will download and install it for you:
$dwnld = "C:\Downloads"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$ADKurl = 'http://download.microsoft.com/download/9/9/F/99F5E440-5EB5-4952-9935-B99662C3DF70/adk/adksetup.exe'
$object.DownloadFile($ADKurl, "$dwnld\adksetup.exe")
Start-Process -FilePath "$dwnld\adksetup.exe" -Wait -ArgumentList "/quiet /features OptionId.DeploymentTools OptionId.WindowsPreinstallationEnvironment"
Configuration Manager (ConfigMgr)
There's a slight bug/feature in Server 2012 when it comes to installing .NET Framework 3.5: you'll need the install media to get it installed, using the following command.
N.B. This command line assumes you have the source media in the D drive.
dism /online /enable-feature /featurename:NetFX3 /all /Source:d:\sources\sxs /LimitAccess
To install all other features via PowerShell:
Import-Module ServerManager
Install-WindowsFeature Web-Windows-Auth,Web-ISAPI-Ext,Web-Metabase,Web-WMI,BITS,RDC, NET-Framework-Features,Web-Asp-Net,Web-Asp-Net45,NET-HTTP-Activation, NET-Non-HTTP-Activ,UpdateServices-Services,UpdateServices-RSAT
Also required is the Windows ADK; again, only choose:
- Deployment Tools
- Windows Preinstallation Environment (Windows PE)
- User State Migration Tool (USMT)
$dwnld = "C:\Downloads"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$ADKurl = 'http://download.microsoft.com/download/9/9/F/99F5E440-5EB5-4952-9935-B99662C3DF70/adk/adksetup.exe'
$object.DownloadFile($ADKurl, "$dwnld\adksetup.exe")
Start-Process -FilePath "$dwnld\adksetup.exe" -Wait -ArgumentList "/quiet /features OptionId.DeploymentTools OptionId.WindowsPreinstallationEnvironment OptionId.UserStateMigrationTool"
Service Manager (SCSM)
For all SCSM roles we need to use DISM to add .NET 3.5 before we can even start the installer:
dism /online /enable-feature /featurename:NetFX3 /all /Source:d:\sources\sxs /LimitAccess
Management Server & Data Warehouse
SQL Analysis Management Objects (AMO) from here
SQL Native Client from here. N.B. As stated earlier, this is the SQL 2012 Native Client; you'll need the version matching your SQL environment.
Both of these pre-reqs are part of the SQL 2012 Feature Pack
Report Viewer 2008 SP1 is also required, but not on the Data Warehouse.
If the server you are installing SCSM on has internet access, you could use this PowerShell snippet to download and install the prerequisites automatically.
$dwnld = "C:\Downloads"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$AMOurl = 'http://go.microsoft.com/fwlink/?LinkID=239666&clcid=0x409'
$SNCurl = 'http://go.microsoft.com/fwlink/?LinkID=239648&clcid=0x409'
$RPTurl = 'http://download.microsoft.com/download/0/4/F/04F99ADD-9E02-4C40-838E-76A95BCEFB8B/ReportViewer.exe'
$object.DownloadFile($AMOurl, "$dwnld\SQL2012AMO.msi")
$object.DownloadFile($SNCurl, "$dwnld\SQL2012NCli.msi")
$object.DownloadFile($RPTurl, "$dwnld\ReportViewer.exe")
Start-Process -FilePath msiexec -ArgumentList /i, "$dwnld\SQL2012AMO.msi", /qn -Wait
Start-Process -FilePath msiexec -ArgumentList /i, "$dwnld\SQL2012NCli.msi", /qn, IACCEPTSQLNCLILICENSETERMS=YES -Wait
Start-Process -FilePath "$dwnld\ReportViewer.exe" -ArgumentList /q -Wait
Orchestrator (SCORCH)
Again, we need to use DISM to add .NET 3.5 before we can even start the installer:
dism /online /enable-feature /featurename:NetFX3 /all /Source:d:\sources\sxs /LimitAccess
The only other pre-req is the IIS role, but if this isn't installed the SCORCH setup will do this for you anyway.
However, if you'd like to prep the server manually, use the following PowerShell command:
Add-WindowsFeature Web-Server,Web-Log-Libraries,Web-Request-Monitor,Web-Http-Tracing,Web-Digest-Auth,Web-Windows-Auth,Web-Net-Ext,Web-Asp-Net,Web-CGI,Web-Mgmt-Tools,NET-WCF-HTTP-Activation45,NET-WCF-MSMQ-Activation45,NET-WCF-Pipe-Activation45,NET-WCF-TCP-Activation45,MSMQ,RDC,WAS
Operations Manager (SCOM)
Operations Console
The Microsoft Report Viewer 2010 Redistributable Package is required on any device where the console is to be installed, which usually means at least one management server.
The following PowerShell script will download and install it for you.
$dwnld = "C:\Downloads"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$RPTurl = 'http://download.microsoft.com/download/E/A/1/EA1BF9E8-D164-4354-8959-F96843DD8F46/ReportViewer.exe'
$object.DownloadFile($RPTurl, "$dwnld\ReportViewer.exe")
Start-Process -FilePath "$dwnld\ReportViewer.exe" -ArgumentList /q -Wait
Web Console
The SP1 Operations Manager 2012 web console requires some IIS server features to be enabled.
The following PowerShell line will install and enable the required features:
Add-WindowsFeature Web-Server,Web-Request-Monitor,Web-Windows-Auth,Web-Asp-Net,Web-CGI,Web-Mgmt-Tools,NET-WCF-HTTP-Activation45,Web-Metabase
If for some reason the pre-req checker complains about "Enable the ISAPI and CGI restrictions in IIS for ASP.NET 4.x", try a reboot first; if it still complains, try the following command and then another reboot. The documentation says the command doesn't work on Server 2012/IIS 8, but it worked for me after a reboot.
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet_regiis.exe -enable -r
App Controller
The following PowerShell line will install and enable the required features:
Add-WindowsFeature NET-Framework-Features,NET-Framework-Core,Web-Mgmt-Console,Web-Static-Content,Web-Default-Doc,Web-Http-Errors,Web-Http-Logging,Web-Request-Monitor,Web-Http-Tracing,Web-Stat-Compression,Web-Filtering,Web-Basic-Auth,Web-Windows-Auth,Web-ISAPI-Filter,Web-ISAPI-Ext,Web-Net-Ext,Web-Asp-Net45
General
If you need to check if a server has some of the features installed already, use this PowerShell line:
Get-WindowsFeature | where {$_.Installed -eq "True"} | ft DisplayName, Name, Installed
This is all based on installing it on Windows Server 2012 as the Server OS and SQL Server 2012 in a clustered, named instance setup.
This post is focused on software pre-reqs, not design or account pre-reqs. See my other post here about service accounts and System Center 2012. This post is also not a step by step install guide.
Virtual Machine Manager (SCVMM)
Very simple for SCVMM, before the SCVMM installation install the Windows ADK and only choose:
- Deployment Tools
- Windows Preinstallation environment (Windows PE)
$dwnld = "C:\Downloads"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$ADKurl = 'http://download.microsoft.com/download/9/9/F/99F5E440-5EB5-4952-9935-B99662C3DF70/adk/adksetup.exe'
$object.DownloadFile($ADKurl, "$dwnld\adksetup.exe")
Start-Process -FilePath "$dwnld\adksetup.exe" -Wait -ArgumentList "/quiet /features OptionId.DeploymentTools OptionId.WindowsPreinstallationEnvironment"
Configuration Manager (ConfigMgr)
Slight bug/feature in Server 2012 when it comes to installing .NET Framework 3.5 where you'll need the install media to get it installed using the following command:
N.B. This command line assumes you have the source media in the D drive.
dism /online /enable-feature /featurename:NetFX3 /all /Source:d:\sources\sxs /LimitAccess
To install all other features via PowerShell:
Get-Module ServerManager
Install-WindowsFeature Web-Windows-Auth,Web-ISAPI-Ext,Web-Metabase,Web-WMI,BITS,RDC, NET-Framework-Features,Web-Asp-Net,Web-Asp-Net45,NET-HTTP-Activation, NET-Non-HTTP-Activ,UpdateServices-Services,UpdateServices-RSAT
Also required is the Windows ADK and only choose:
- Deployment Tools
- Windows Preinstallation environment (Windows PE)
- User State Migration Tool (USMT)
$dwnld = "C:\Downloads"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$ADKurl = 'http://download.microsoft.com/download/9/9/F/99F5E440-5EB5-4952-9935-B99662C3DF70/adk/adksetup.exe'
$object.DownloadFile($ADKurl, "$dwnld\adksetup.exe")
Start-Process -FilePath "$dwnld\adksetup.exe" -Wait -ArgumentList "/quiet /features OptionId.DeploymentTools OptionId.WindowsPreinstallationEnvironment OptionId.UserStateMigrationTool"
Service Manager (SCSM)
For all SCSM roles we need to use dism to add .Net 3.5 before we can even start the installer
dism /online /enable-feature /featurename:NetFX3 /all /Source:d:\sources\sxs /LimitAccess
Management Server & Data Warehouse
SQL Analysis Management Objects (AMO) from here
SQL Native Client from here N.B As stated earlier this is the SQL 2012 Native Client, you'll need the version matching your SQL environment.
Both of these pre-reqs are part of the SQL 2012 Feature Pack
Report Viewer 2008 SP1 is also required, but not on the Data Warehouse.
If the server you are installing SCSM on has internet access then you could use this Powershell snippet to download and install the Prerequisites automatically.
$dwnld = "C:\Downloads"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$AMOurl = 'http://go.microsoft.com/fwlink/?LinkID=239666&clcid=0x409'
$SNCurl = 'http://go.microsoft.com/fwlink/?LinkID=239648&clcid=0x409'
$RPTurl = 'http://download.microsoft.com/download/0/4/F/04F99ADD-9E02-4C40-838E-76A95BCEFB8B/ReportViewer.exe'
$object.DownloadFile($AMOurl, "$dwnld\SQL2012AMO.msi")
$object.DownloadFile($SNCurl, "$dwnld\SQL2012NCli.msi")
$object.DownloadFile($RPTurl, "$dwnld\ReportViewer.exe")
Start-Process -FilePath msiexec -ArgumentList /i, "$dwnld\SQL2012AMO.msi", /qn -Wait
Start-Process -FilePath msiexec -ArgumentList /i, "$dwnld\SQL2012NCli.msi", /qn, IACCEPTSQLNCLILICENSETERMS=YES -Wait
Start-Process -FilePath "$dwnld\ReportViewer.exe" -ArgumentList /q -Wait
Orchestrator (SCORCH)
Again, we need to use dism to add .Net 3.5 before we can even start the installer
dism /online /enable-feature /featurename:NetFX3 /all /Source:d:\sources\sxs /LimitAccess
The only other pre-req is the IIS role, but if this isn't installed the SCORCH setup will do this for you anyway.
However, if you'd like to prep the server manually, use the following PowerShell command:
Add-WindowsFeature Web-Server,Web-Log-Libraries,Web-Request-Monitor,Web-Http-Tracing,Web-Digest-Auth,Web-Windows-Auth,Web-Net-Ext,Web-Asp-Net,Web-CGI,Web-Mgmt-Tools,NET-WCF-HTTP-Activation45,NET-WCF-MSMQ-Activation45,NET-WCF-Pipe-Activation45,NET-WCF-TCP-Activation45,MSMQ,RDC,WAS
Operations Manager (SCOM)
Operations Console
Microsoft Report Viewer 2010 Redistributable Package is required on any device where the console is to be installed and that is usually at least one management server.
The following PowerShell script will download and install it for you.
$dwnld = "C:\Downloads"
if (!(Test-Path -path $dwnld))
{
New-Item $dwnld -type directory
}
$object = New-Object Net.WebClient
$RPTurl = 'http://download.microsoft.com/download/E/A/1/EA1BF9E8-D164-4354-8959-F96843DD8F46/ReportViewer.exe'
$object.DownloadFile($RPTurl, "$dwnld\ReportViewer.exe")
Start-Process -FilePath "$dwnld\ReportViewer.exe" -ArgumentList /q -Wait
Web Console
The SP1 Operations Manager 2012 web console requires some IIS server features to be enabled.
The following PowerShell line will install and enable the required features:
Add-WindowsFeature Web-Server,Web-Request-Monitor,Web-Windows-Auth,Web-Asp-Net,Web-CGI,Web-Mgmt-Tools,NET-WCF-HTTP-Activation45,Web-Metabase
If for some reason the pre-req checker complains about "Enable the ISAPI and CGI restrictions in IIS for ASP.NET 4.x", try a reboot first. If it still complains, run the following command and then reboot again. The command is documented as not working on Server 2012/IIS 8, but it worked for me after a reboot.
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\aspnet_regiis.exe -enable -r
App Controller
The following PowerShell line will install and enable the required features:
Add-WindowsFeature NET-Framework-Features,NET-Framework-Core,Web-Mgmt-Console,Web-Static-Content,Web-Default-Doc,Web-Http-Errors,Web-Http-Logging,Web-Request-Monitor,Web-Http-Tracing,Web-Stat-Compression,Web-Filtering,Web-Basic-Auth,Web-Windows-Auth,Web-ISAPI-Filter,Web-ISAPI-Ext,Web-Net-Ext,Web-Asp-Net45
General
If you need to check if a server has some of the features installed already, use this PowerShell line:
Get-WindowsFeature | Where-Object {$_.Installed} | Format-Table DisplayName, Name, Installed
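And if you just want to test for (and conditionally add) a single feature, something along these lines should work:
# Install a feature only if it's missing (the feature name here is just an example)
if (-not (Get-WindowsFeature -Name Web-Asp-Net45).Installed) {
    Install-WindowsFeature -Name Web-Asp-Net45
}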
Labels: ConfigMgr, PowerShell, SCCM, SCOM, SCSM, SCVMM, Service Pack 1, SP1, System Center
Monday, 5 March 2012
Service Manager - Bulk change the priority (Impact & Urgency)
A customer had a requirement the other day to bulk change the priority on a certain classification of incidents (they were using incidents of a certain classification in place of service requests).
I ran into a problem getting Orchestrator connected to SCSM, so it was time for a quick PowerShell script.
Import-Module SMLets
# Get the incident class and the 'Other Problems' classification enumeration
$IRClass = Get-SCSMClass -Name 'System.WorkItem.Incident$'
$EnumClass = Get-SCSMEnumeration | Where-Object {$_.DisplayName -eq 'Other Problems'}
# Find all incidents carrying that classification
$IRs = Get-SCSMObject -Class $IRClass | Where-Object {$_.Classification -eq $EnumClass}
# Set both Impact and Urgency to Low in one pass
$PropertyHashTable = @{"Impact" = "Low"; "Urgency" = "Low"}
$IRs | Set-SCSMObject -PropertyHashtable $PropertyHashTable
This basically finds all incidents of classification 'Other Problems' (just as an example!) and then changes the impact and urgency to Low, thereby setting the priority to the lowest possible value.
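If you want to preview which incidents will be touched before committing the change, something like this (reusing the $IRs variable from above) should give a quick sanity check:
$IRs | Select-Object Id, Title | Format-Table -AutoSize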
Disclaimer: This script is provided as is, with no warranties, expressed or implied and has only been tested within my test environment.
Friday, 1 July 2011
Scripted Install of System Center Orchestrator Beta and all pre-reqs
There have been a few posts floating around now showing how to do a standard setup of the new System Center Orchestrator Beta (SCORCH), so rather than doing yet another, I thought it was time to do something different.
I thought it might be a good idea to try automating the installation of SCORCH, which is rather fitting for an automation product.
So I wrote a quick PowerShell script that not only silently installs SCORCH, but also sets up a freshly built server with all the required pre-reqs and SQL, creates a service account and sets up the necessary rights for it.
I also thought I'd get it all on video as well!
So there you have it, an easy, repeatable, automated method of setting up a quick Orchestrator test server.
Basically the script will go through and install/set up the following, in this order (see the account-setup sketch below):
- .NET Framework 3.5
- .NET Framework 4.0
- Silverlight
- Required IIS role & features (Web-Common-Http, Web-Static-Content, Web-Default-Doc, Web-Dir-Browsing, Web-Http-Errors, Web-Http-Logging, Web-Request-Monitor, Web-Stat-Compression)
- SQL 2008 R2
- Create a local service account called SCORCH_SA
- Add it to the local admin group
- Assign it log on as a service rights (using ntrights.exe from the 2003 resource kit)
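A minimal sketch of those account-setup steps, assuming ntrights.exe sits alongside the script and $Password already holds the account password:
# Create the local account, add it to Administrators, then grant log on as a service
net user SCORCH_SA $Password /add
net localgroup Administrators SCORCH_SA /add
.\ntrights.exe +r SeServiceLogonRight -u SCORCH_SA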
The script then silently installs Orchestrator, with all components.
This is achieved using the following command:
setup.exe /Silent /ServiceUserName:$UserName /ServicePassword:$Password /Components:All /DbServer:$ComputerName /DbNameNew:Orchestrator /ScoAdmin:$ComputerName\Administrators /WebServicePort:81 /WebConsolePort:82
Where you see a $ prefix, a variable is passed from the script to the command line; normally these values (username, server, password, etc.) would be typed in manually.
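As a rough sketch of how that wiring might look (the values below are hypothetical placeholders; substitute your own):
# Hypothetical values - replace with your own before running
$UserName = "$env:COMPUTERNAME\SCORCH_SA"
$Password = 'P@ssw0rd1'
$ComputerName = $env:COMPUTERNAME
# Run the SCORCH setup silently with the same switches as above
Start-Process -FilePath .\setup.exe -Wait -ArgumentList "/Silent /ServiceUserName:$UserName /ServicePassword:$Password /Components:All /DbServer:$ComputerName /DbNameNew:Orchestrator /ScoAdmin:$ComputerName\Administrators /WebServicePort:81 /WebConsolePort:82"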
During the CEP kickoff meeting it was mentioned that an install of SCORCH and an import of a policy had been done in 5 minutes on a server with the pre-reqs already installed, and a mini challenge was laid down to see if it could be done quicker. Once the pre-reqs were installed, my script let me install and import in about 3 minutes!
You can find the link to the PowerShell script here:
If you want to try this yourself, you'll need to add NTRights.exe from the 2003 reskit, the full .NET 4.0 framework, Silverlight and the PS script to the extracted SCORCH folder, and have the SQL disc in drive D (or modify the script).
Thursday, 14 April 2011
Renaming files using PowerShell and a CSV
During a migration of users from an outside organisation into ours, we had to bring across all their Exchange data using ExMerge.
The problem was, all their exported PSTs were named incorrectly.
So, a quick CSV was knocked up with two columns.
Column 1 = Path and current file name
Column 2 = New file name
Add a header at the top of the CSV to give something like this:
Path,NewAlias
F:\Export\fredb.pst,fred.bloggs.pst
F:\Export\DSaster.pst,derdrie.saster.pst
Then use the following PowerShell line:
Import-Csv C:\PSTImport.csv | ForEach-Object {Rename-Item -Path $_.Path -NewName $_.NewAlias}
Job done, one nicely renamed folder of PST files ready to exmerge (shame we're still on Exchange 2003!)
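If you'd rather do a dry run first, Rename-Item supports -WhatIf, so the same line with that switch appended should preview the renames without touching the files:
Import-Csv C:\PSTImport.csv | ForEach-Object {Rename-Item -Path $_.Path -NewName $_.NewAlias -WhatIf}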
Thursday, 10 February 2011
Using Opalis to automate the gathering of Dell Warranty information and update Service Manager with it
For ages now there have been various VB scripts around that allow the Dell warranty website to be checked in a scripted fashion, by passing a service tag across in a URL and parsing the returned HTML for the required information.
One really cool usage of this was using ConfigMgr to run the script on PC's and add the information into WMI for inventorying later via mof edits.
Sherry Kissinger has an excellent article here:
http://myitforum.com/cs2/blogs/skissinger/archive/2008/12/03/dell-warranty-info-hardware-inventory-extension.aspx
While this is great for active clients, once the device is decommissioned or you have problems with the client and remove it from SCCM for one reason or another, your data is gone! Not so good for long term reporting or asset management.
Since I'm in the process of migrating all of our asset management reporting and information storage into Service Manager 2010, warranty information (expiration, start date, type, etc.) was one of the main items we needed to store, so I started to think about how I could automate this as much as possible.
Three options sprang to mind:
- Supply Dell with a list of all our service tags and then CSV import it - allows large bulk updates but takes time if large volumes of updates required again.
- Implement the SCCM scripts then create a custom connector to gather the data - overly complex for my liking and duplicates the information storage
- Use Opalis to gather the data and update Service Manager - Sounded cool ;)
Things to note:
- This is proof of concept, failure handling etc would need adding for production
- I've extended the Windows Computer class in advance with a "Warranty Expiration Date" property. You could in theory extend any class to hold it; Computer (Deployed) might be more fitting, but I've got other reasons for choosing the Windows Computer class.
So.... I created a policy that looks something like this:
First step is to get the GUIDs of the Windows Computer objects, and then query them for their relationships with Computer (Deployed) objects.
Second step is to get the related Computer (Deployed) object, which gives us access to the Serial Number (Dell service tag) via the Opalis databus so it can be passed across to a PowerShell script.
This is basically where the "magic" happens: the PowerShell script.
All credit for this goes to Marcus Oh and his blog post here:
http://marcusoh.blogspot.com/2009/06/retrieving-dell-warranty-data-via.html
I've only tweaked his PS code slightly: I added proxy details so that I could get out of our network, added an underscore ( _ ) to the $sData = $sData | Select-String "contract_" line to stop a JavaScript-related error, and flattened the output by adding $Warranty = $cMyData | foreach {$_.EndDate} so that only the warranty end date is returned. The same principle could be applied for the start date, days left, etc., or output it all and do the manipulation in Opalis.
# The {...} below is an Opalis databus subscription - Opalis substitutes the real serial number before the script runs
$sSerial = "{Serial Number from "Get Object - Computer (Deployed)"}"
$oWeb = New-Object System.Net.WebClient
# Route the web request through the corporate proxy
$proxy = New-Object System.Net.WebProxy("YourProxyServerHere:8080")
$proxy.UseDefaultCredentials = $true
$oWeb.proxy = $proxy
# Download the warranty details page for this service tag
$sUrl = "http://support.euro.dell.com/support/topics/topic.aspx/emea/shared/support/my_systems_info/en/details?c=uk&cs=RC1050265&l=en&s=pad&~ck=anavml&servicetag=$($sSerial)"
$sData = $oWeb.DownloadString($sUrl)
# Strip the markup down to the "contract_" table rows
$sData = $sData -creplace '<a style.*?>', ''
$sData = $sData | ForEach-Object { $_ -replace "<i>", "" }
$sData = $sData | ForEach-Object { $_.Split("<") }
$sData = $sData | Select-String "contract_"
$sData = $sData | ForEach-Object { $_ -replace $_,"$_`n" }
# Capture the cell values from each contract row
$oRegEx = [regex]'"contract_.*row">(.*)'
$cMatches = $oRegEx.Matches($sData)
$cMatches = $cMatches | ForEach-Object { $_.Groups[1].value }
# Walk the matches four at a time: Provider, StartDate, EndDate, DaysLeft
$cMyData = @()
foreach ($i in 0..($cMatches.count -1)) {
$cRecord = New-Object -TypeName system.Object
[void] $foreach.MoveNext()
$cRecord | Add-Member -MemberType noteProperty -Name 'Provider' $cMatches[$foreach.current]
[void] $foreach.MoveNext()
$cRecord | Add-Member -MemberType noteProperty -Name 'StartDate' $cMatches[$foreach.current]
[void] $foreach.MoveNext()
$cRecord | Add-Member -MemberType noteProperty -Name 'EndDate' $cMatches[$foreach.current]
[void] $foreach.MoveNext()
if ($cMatches[$foreach.current] -ne "") {
$cRecord | Add-Member -MemberType noteProperty -Name 'DaysLeft' $cMatches[$foreach.current]
} else {
$cRecord | Add-Member -MemberType noteProperty -Name 'DaysLeft' "0"
}
$cMyData += $cRecord
}
# Flatten the output to just the warranty end date(s) for the Opalis databus
$Warranty = $cMyData | foreach {$_.EndDate}
Final steps: we'll split the incoming data to make it easier to handle, and for this PoC we'll assume that the FIRST field returned is the correct warranty expiration date (it seemed to be in all the ones I checked manually).
Then we'll format that returned data into a date format that can be used by Service Manager.
Then finally we'll update the extended property created earlier with the information.
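Outside of Opalis, a hedged SMLets sketch of those last two steps might look like this (the 'WarrantyExpirationDate' property name, the date parsing and the 'MyServer01' computer name are all assumptions; match them to your own class extension and environment):
# Parse the first returned end date (format assumed) and write it to the extended property
$ExpiryDate = [datetime]::Parse(@($Warranty)[0])
$Computer = Get-SCSMObject -Class (Get-SCSMClass -Name 'Microsoft.Windows.Computer$') | Where-Object {$_.DisplayName -eq 'MyServer01'}
Set-SCSMObject -SMObject $Computer -PropertyHashtable @{'WarrantyExpirationDate' = $ExpiryDate}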
Et voila! This could now be set up to run whenever objects are updated by using the "Monitor Object" SCSM IP component scoped to updated serial number properties, scheduled to run at set times, or simply run manually whenever you like.
Remember, this is proof of concept, it works in my test lab, but will need developing and testing before you would use it in a production environment.