# A Bit of DSC Frustration

§ August 20, 2014 14:42 by beefarino |

DSC is like remoting – when it works it’s amazing.  When it doesn’t work… who knows why?

While prepping a Desired State Configuration demo for SQL Saturday #328, I hit a small roadblock that proved to waste several hours of my time.  I’m documenting it here in the hopes that it helps someone else.

I’m using DSC to install SQL Server onto an Azure VM.  At some point in the process, running Start-DscConfiguration started to error out, with a cryptic and altogether unhelpful message:

An old configuration is still pending. Please wait for the pending configuration to finish. If the problem persists,
execute Start-DSCConfiguration command with -Force parameter.
+ CategoryInfo          : ResourceExists: (root/Microsoft/...gurationManager:String) [], CimException
+ FullyQualifiedErrorId : MI RESULT 11
+ PSComputerName        : nottaken.cloudapp.net


This error persisted regardless of what I tried, and it eventually took a new form:

Cannot invoke the SendConfigurationApply method. The PerformRequiredConfigurationChecks method is in progress and must
return before SendConfigurationApply can be invoked.
+ CategoryInfo          : NotSpecified: (root/Microsoft/...gurationManager:String) [], CimException
+ FullyQualifiedErrorId : MI RESULT 1
+ PSComputerName        : nottaken.cloudapp.net

I finally swallowed my pride and reached out to DSC trailblazer Steve Murawski, who was able to quickly and generously point me to the fix.

# The Cause

In my case, it would seem that the DSC local service was hard at work on my target node trying to make things work.  The problem was that DSC had neglected a failed installer (or the resource in question failed to report the failure correctly).  In any case, this left DSC in a state where it thought something configurable was happening, and thus that it should prevent new things from happening.

Specifically, I was trying to install SQL Server using xSqlPs, but the password I specified for sa did not meet the strong password requirements and the installation exited.  For some reason I’m still investigating, the xSqlServerInstall resource failed to notice the error.

# The Fix

To get DSC working again, I performed the following steps on the target machine, in this order:

1. Delete c:\windows\system32\configuration\pending.mof;
2. Stop all WMI processes;
3. Restart the WinRM service;

The first step was the only one I hadn’t tried, because I didn’t know anything about that file.  And it’s the key – that file tells DSC what needs to be done.  If it isn’t there, DSC seems to think that everything is back to being cool.
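Since my target node was an Azure VM, I ran these steps remotely.  Here is a sketch of the same three steps wrapped in Invoke-Command (this assumes you have WinRM connectivity to the node; note that the last step restarts WinRM, which tears down the remoting session itself, so expect the connection to drop):

```powershell
# sketch: run the three fix steps on the remote node over WinRM
Invoke-Command -ComputerName nottaken.cloudapp.net -ScriptBlock {
    # 1. delete the pending configuration document so DSC forgets the stuck work
    Remove-Item "$env:systemRoot\system32\configuration\pending.mof" -Force -ErrorAction SilentlyContinue

    # 2. stop all WMI processes
    Get-Process *wmi* | Stop-Process -Force

    # 3. restart the WinRM service; this will disconnect the current session
    Restart-Service winrm -Force
}
```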

# The Code

Here is some code I’ve started using to automate the fix described above.

remove-item $env:systemRoot/system32/configuration/pending.mof -force;
get-process *wmi* | stop-process -force;
restart-service winrm -force

Good luck!

# If it’s Worth the Going it’s Worth the Ride

§ August 9, 2014 18:34 by beefarino |

My best friend from childhood is Lee Carpenter.  He and I lived in adjacent lots that shared a corner.  That corner never grew grass because we were always stomping back and forth between our houses.  He had a swing set in his yard and I had piney woods with trees to climb and access to the creek.  He used to let me ride his larger Big Wheel because I was bigger than him and couldn’t ride mine anymore, but that way we could both cruise around our neighborhood together looking for trouble.

I went to his First Communion even though I had to wear uncomfortable clothes and had no idea what a First Communion was or what was happening.  Our Lego collections routinely intermingled.  Once he added pretend wings to our pretend pirate ship because he knew I was deathly afraid of the ocean.  We protected each other when our neighbor’s big mean dog got out of its yard and came after us.  We took a dance class together because our moms wanted us to.  We once spent an afternoon figuring out how matchbox cars are put together by smashing ours apart.

Lee’s awesome, and I just got off the phone with his mother.  He passed away on July 24, 2014 from a rare germ cell cancer he apparently carried in him for his whole 41 years on Earth.

Getting back in touch with Lee has been on my GTD “someday” list.  For years.  I’m looking at the list right now, and it’s on there.  It just says “Lee?” but I know what it means.  I probably copied that item from list to list maybe a dozen times over the span of years.  Something I should really do at some point.

And that’s as far as my effort went.  The stupid line item doesn’t matter anymore.  It’s no longer within my purview as to whether it happens.
It’s a terribly harsh reminder that as important as goals are, they’re just meaningless noise if you don’t make the effort.

So I know a Lee whose father left, but I’ll never know the Lee who raised a son on his own.  I know a Lee who couldn’t read.  I’ll never know the Lee who taught himself to do so in his Twenties.  I know a Lee who went to some “special” school no adult would elaborate on.  I’ll never know the Lee that worked in the Smithsonian restoring historical pieces.  I know a Lee who could build homes for Star Wars action figures out of Legos.  I’ll never know the Lee who could recreate authentic furniture stains and polishes from different historical periods.

I’m happy to know the Lee I know.  But I think I’ll always wonder about the other one.  He sounds like a pretty great guy.

So hey, why don’t you stop reading this and do that thing you’ve wanted to do that you haven’t done.  Please.

# Introducing Simplex

§ July 31, 2014 10:05 by beefarino |

During my P2F talk at the PowerShell Summit NA 2014, I announced a project that would make creating a PowerShell Provider “stupid simple.”  I happily present you the first iteration of this stupid simplicity in the form of the Simplex open source project.

The goal of Simplex is to remove any barrier between the operator and the items they would like to access as a PowerShell drive.  Instead of focusing on C# types, interfaces, and cmdlet support, Simplex keeps you in script, using a simple domain-specific language (DSL) based on PowerShell to define a drive hierarchy.
Here is an example of the DSL:

root {
    # the root folder of the drive
    folder System {
        # a folder named System
        script Processes -id Id {
            # a folder named Processes that contains Process objects
            get-process
        }
        script Errors -id Index {
            # a folder named Errors containing event log entries
            get-eventLog -log application -entrytype error -newest 25
        }
    }
}

To mount this Simplex script as a PowerShell drive, you just need to use the Simplex module, like so:

import-module simplex;
new-psdrive -name s -psprovider simplex -root "c:\path\to\simplexscript.ps1"

Once the script is mounted, you can navigate the folders and script containers as if they were a filesystem:

PS C:\Windows\SysWOW64\WindowsPowerShell\v1.0> cd s:
PS s:\> dir

Container: Simplex\Simplex::C:\share\simplex.ps1

Type       Name
---------- ----       ----
d+~<       Folder     System

PS s:\> cd system
PS s:\system> dir

Container: Simplex\Simplex::C:\share\simplex.ps1\system

Type       Name
---------- ----       ----
d+~<       Script     Processes
d+~<       Script     Errors

PS s:\system> cd errors
PS s:\system\errors> dir

Index Time          EntryType Source            InstanceID Message
----- ----          --------- ------            ---------- -------
63846 Jul 31 09:05  Error     Application Error       1000 ...

# Simplex DSL

The Simplex DSL has three elements: root, folders, and scripts.  Each element defines a container location on the drive.

The root element defines the root of the drive and contains any number of script and folder elements.

root {
    # any number of script and/or folder elements
}

Folders can also contain other folders and script elements, and folders must have a name.

folder <foldername> {
    # any number of script and/or folder elements
}

Scripts are containers that use bits of PowerShell to supply the items they contain.  Script elements must also have a name, and they can optionally specify an –idField parameter to identify a property to be used as the item’s child name.
script <foldername> [-idField <propertyname>] {
    # PowerShell script to return objects from this folder
}

The DSL is “just PowerShell,” so you can actually do any PowerShell things you want to do.  In this example, the DSL generates folders on demand based on the available performance counter sets:

root {
    get-counter -list * | foreach {
        $folderName = $_.CounterSetName;
        script $folderName {
            #...
        }
    }
}

When you mount and explore the drive, you’ll find a set of folders generated from the script:

PS g:\> dir

Container: Simplex\Simplex::C:\share\gen.ps1

Type       Name
---------- ----       ----
d+~<       Script     RAS
d+~<       Script     WSMan Quota Statistics
d+~<       Script     Network QoS Policy
d+~<       Script     SMB Client Shares
d+~<       Script     SynchronizationNuma
d+~<       Script     Synchronization
d+~<       Script     Event Tracing for Win...
d+~<       Script     Thermal Zone Information
d+~<       Script     Processor Information
d+~<       Script     Event Tracing for Win...
d+~<       Script     FileSystem Disk Activity
# ...
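To round out the DSL examples, here is a sketch of how –idField might be used on its own; this is a hypothetical drive (not from the project docs) that exposes Windows services as child items named by their Name property, using the –idField parameter as described in the DSL section above:

```powershell
# hypothetical Simplex script: each service becomes a child item
# whose name comes from the service's Name property
root {
    script Services -idField Name {
        get-service
    }
}
```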


So please check out the project, submit any issues/features you find/want.  And as always, enjoy!

# Managing NuGet Packages with StudioShell

§ July 25, 2014 14:14 by beefarino |

I’ve been getting a lot of questions about doing NuGet things in StudioShell.  It would seem simple enough: import the NuGet PowerShell module into the StudioShell environment, and use the functions defined there.  Unfortunately it’s not that simple.  The NuGet module is tightly bound to the Package Manager Console host in the module’s PSD1 definition file:

# ...
# Name of the Windows PowerShell host required by this module
PowerShellHostName = 'Package Manager Host'

# Minimum version of the Windows PowerShell host required by this module
PowerShellHostVersion = '1.2'
# ...
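You can see which host a given console reports by checking the built-in $host variable in that console; in a plain powershell.exe session the name is “ConsoleHost”, which is why the import fails everywhere except the Package Manager Console:

```powershell
# check the current PowerShell host name;
# the NuGet module will only import where this matches 'Package Manager Host'
$host.Name
```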

In other words, you won’t be able to import the NuGet module outside of the Package Manager Console.  This seems rather arbitrary to me, especially considering the fact that the module itself doesn’t do anything that mandates any specific host, just that the host environment contains the $dte variable reference to the Visual Studio Application object.  Anyhoodle, I digress, and shall save the rest of the rant for another post.

So how does one combine the awesome sauce of StudioShell with the utility of NuGet?  Well, you may not be able to bring NuGet into StudioShell, but you can certainly bring StudioShell into the Package Manager Console.  Recent releases of StudioShell include specialized support for the Package Manager Console.  This allows you to use the two in tandem, and this post describes some of the crazy things you can do.

At the 2014 NA PowerShell Summit, I did a talk on using the P2F project to develop providers.  At the start of that talk, I ran a single command in the PM console:

new-providerProject -name TypeProvider

The command automates the process of setting up a new P2F provider project; specifically, it does all of the following:

1. Creates a new C# class library project.
2. Adds an assembly reference to System.Management.Automation to the new project.
3. Installs the P2F NuGet package into the newly created project.
4. Modifies the project’s debug settings to launch PowerShell.exe with a command line that imports the project’s binary into the PowerShell session.
5. Sets the new project as the solution’s current startup project.
6. Enables NuGet package restore on the solution.

Here’s the function definition in its entirety:

import-module studioshell.provider
import-module studioshell.contrib

function new-providerProject( $name )
{
    # create the project
    new-item "dte:\solution\projects\$name" -type classlibrary -language csharp

    # add necessary references
    new-item "dte:\solution\projects\$name\references" -type assembly -name System.Management.Automation

    # install the P2F nuget package
    install-package "P2F" -project $name;

    # configure the project debug settings
    $project = get-item "dte:\solution\projects\$name"
    $project.configurationmanager | foreach {
        $_.properties.item('startprogram').value = "c:\windows\system32\windowspowershell\v1.0\powershell.exe"
        $_.properties.item('startarguments').value = '-noexit -command "ls *.dll | ipmo"'
        $_.properties.item('startaction').value = 1
    }

    # set this project as the startup project
    $dte.solution.properties.item("StartupProject").value = $project.name;

    # enable nuget package restore
    enable-nugetPackageRestore;
}

The first two lines import the necessary StudioShell modules; StudioShell.Provider is the simplified NuGet distribution of the DTE provider, and StudioShell.Contrib is a community contribution module with useful wrappers around common StudioShell uses.

The first line of the function creates a new C# class library project:

# create the project
new-item "dte:\solution\projects\$name" -type classlibrary -language csharp

Here we use the common PowerShell item cmdlets against the StudioShell DTE provider, specifying a path under the solution’s projects tree where we want the project to be created.  The value for the type parameter often confounds people; that is, you may not be sure what value to use here.  StudioShell has your back – you can find a list of the project templates for various languages under the dte:/templates/projects path.
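For example, you can explore the available templates with the standard item cmdlets; this is a sketch (the exact child layout under the templates path may vary with your Visual Studio install):

```powershell
# list the project templates StudioShell knows about;
# drill into a language folder to find valid -type values
dir dte:/templates/projects
```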

The next line modifies the assembly references for the new project:

# add necessary references
new-item "dte:\solution\projects\$name\references" -type assembly -name System.Management.Automation

Again, just using the standard new-item cmdlet at the right path does the trick.  At the project’s references folder, the type parameter can be “assembly”, “project”, or “com”, depending on the type of reference you’re adding.  The name parameter specifies the assembly name, project name, or COM ProgId to reference.

Next, we leverage the NuGet module to install the P2F package:

install-package "P2F" -project $name

This command should look familiar if you’re a NuGet user.  If not, you’re using NuGet wrong and should feel bad.  This command does a lot of magic stuff – it pulls the P2F package from the main NuGet repository, modifies the project references, and so forth.  All that NuGet-ish stuff, in just one little command.

Now comes an ugly part.  Whenever I’m making a new PowerShell provider, I want to set up the project debuggery so that it launches PowerShell with a specific command line.  In the UI, I would go into the project properties debug tab and make the necessary modifications.  In the PM console, I accomplish the same thing as follows:

$project = get-item "dte:\solution\projects\$name"
$project.configurationmanager | foreach {
    $_.properties.item('startprogram').value = "c:\windows\system32\windowspowershell\v1.0\powershell.exe"
    $_.properties.item('startarguments').value = '-noexit -command "ls *.dll | ipmo"'
    $_.properties.item('startaction').value = 1
}

Honestly it took me a little while to find the specific property names I needed to modify.  And I see how ugly and unhelpful this code is, so I’ve added project properties to the DTE drive topology for the upcoming release of StudioShell.  So, HOORAY ME and you owe me a beer if you’re reading this.

And while I’m in the ugly bits, I set the solution-level property that marks this project as the startup project:

$dte.solution.properties.item("StartupProject").value = $project.name;

Again, this code is going away in favor of new DTE hives for solution properties.  Again, more beer is owed by you to me.

Finally, we have a little extra NuGet magic.  This one’s been on my plate to share for a while, but special thanks go to Attila Hajdrik for kicking me in the seat to get it done.  This last command will enable the NuGet package restore for the solution, if it is not already enabled:

# enable nuget package restore
enable-nugetPackageRestore;

The code behind this command taps into the bottomless well of woe that is the Visual Studio service provider model.  I’m not going into the details of the implementation, but you can see Attila's approach in this gist.  Because I do this enough that I want to keep it simple to automate, I’ve added Attila’s implementation to the StudioShell.Contrib project.

So there you have it: using StudioShell to manage projects, and NuGet to manage package references, all in one big automated pile of bytes.