A Feature Request for Visual Studio v.Next

§ January 9, 2013 14:56 by beefarino |

Those of you who know me well know I don’t have many strong opinions.  I tend to keep multiple perspectives in mind and work within whatever limits are provided.  So when I do express a strong opinion, it tends to be backed by experience and reason (or, at times, alcohol).

With that out of the way, let me express the only feature I really desire in the next version of Visual Studio:

Replace the format of all project and solution files with PowerShell scripts.

I hear you groaning – just hear me out.  I have many reasons for wanting this – too many to list, so I’ll stick to the highlights.  In a nutshell, they all boil down to simplicity.

Expressing Data and Logic

A build has two broad parts: data that defines WHAT to build, and logic that defines HOW to build it.  E.g., your typical C# project file contains a list of source files and project/assembly references, and a set of instructions for accomplishing specific things with that data – such as producing assemblies or deploying a website.

It’s simple to express data in PowerShell.  You can declare arrays and hashtables inline.  You can create complex object hierarchies if needed. 

It’s also simple to express data in XML.  After all, that’s what XML is for.

It’s simple to express logic in PowerShell, too; just as XML is designed for data, expressing logic is what a programming language is designed to do.  Expressing logic in XML, however, is … well, “cumbersome” is a generous word.  “Obtuse” is perhaps a better choice.
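To make that concrete, here’s a hypothetical project description in PowerShell mixing the two naturally – the property names are mine, not any real schema:

```powershell
# data: plain arrays and hashtables, declared inline
$sourceFiles = 'Class1.cs', 'Properties\AssemblyInfo.cs';
$settings = @{
    Configuration = 'Debug';
    OutputPath    = 'bin\Debug';
};

# logic: ordinary control flow, living right next to the data
if( $settings.Configuration -eq 'Debug' ) {
    $settings.DefineConstants = 'DEBUG','TRACE';
}
```

Try expressing that `if` in MSBuild XML and you end up with `Condition` attribute string comparisons instead of actual code.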

Customizing the Build

It gets worse when you require custom build steps.  Getting logic into the MSBuild XML schema requires a lot of ceremony that serves nothing outside of MSBuild.  Consider the process:

  1. Implement your build task in C# (pulling in all the necessary bits to make MSBuild recognize your task)
  2. Import the task DLL into your project using the <UsingTask /> element
  3. Add the XML necessary to get your task working in the build
  4. Reload the project in Visual Studio
  5. Address any security warnings that pop up related to the new “unknown” task you added

At this point, the logic of your custom build task is about as far away from the build as it can get – inside a binary DLL that must now be carried around with the project file.  You have literally no in-band access to the task from the build – all information about using the task (its name, dependencies, parameters, outputs, etc.) must be communicated elsewhere.

Now, look at the customization process for the PowerShell project file:

  1. Modify the PowerShell build script
  2. There is no step 2.  Again, simplicity.

Moreover, PowerShell is explicitly transparent – documentation is a get-help command away, and the built-in discovery mechanisms let you know what’s there.  Plus, the build logic stays with the build.
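To illustrate – assuming the build exposed itself as a module (I’m using the hypothetical psbuild module name from my sketch later in this post), discovery works the same as for any other PowerShell command:

```powershell
# list everything the build module exposes - no out-of-band docs needed
Get-Command -Module psbuild;

# full documentation for a custom build step, in the same band as the build
Get-Help invoke-BeforeBuild -Full;
```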

Running the Build

Here’s where my opinion really polarizes.  I get irate when a software project can only build inside Visual Studio.  Recent examples from my own experience include Azure deployments and SCOM management packs.  It’s not that I mind the experience of pushing the Deploy button and having a magical process ensue that mystically transfers and configures an entire website and database – quite the contrary: I want that experience.  The thing is, I want it everywhere, not just in Visual Studio.  What exactly constitutes “everywhere”?  For starters, I want the same build experience in:

  • Visual Studio
  • the shell
  • my co-worker’s machine
  • a fresh VM image
  • the automated CI server or build farm

If the build were “just PowerShell,” this would be a piece of cake.  In fact, I’ve taken to using psake to drive my builds these days for this very reason – so I can maintain a consistent expectation of the build from one location to the next. 
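For the curious, a minimal psake build script looks something like this – the task names and solution path are my own placeholders:

```powershell
# default.ps1 - run from any of the environments above with:
#   Invoke-psake .\default.ps1
properties {
    $configuration = 'Debug';
}

task default -depends Build;

task Clean {
    remove-item .\bin -recurse -force -erroraction silentlycontinue;
}

task Build -depends Clean {
    # exec throws if the external command fails, so CI servers notice
    exec { msbuild .\MySolution.sln /p:Configuration=$configuration };
}
```

The same script drives the build identically on my box, a co-worker’s machine, a fresh VM, or the CI server.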

What it Might Look Like

I dunno.  Let’s see what I can do with a default C# class library project.  Here’s the original MSBuild project file:

  <?xml version="1.0" encoding="utf-8"?>
  <Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <PropertyGroup>
      <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
      <Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
      <ProductVersion>8.0.30703</ProductVersion>
      <SchemaVersion>2.0</SchemaVersion>
      <ProjectGuid>{5C818DB6-C86B-4B05-AF13-A4CF46C8ACA9}</ProjectGuid>
      <OutputType>Library</OutputType>
      <AppDesignerFolder>Properties</AppDesignerFolder>
      <RootNamespace>ClassLibrary1</RootNamespace>
      <AssemblyName>ClassLibrary1</AssemblyName>
      <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
      <FileAlignment>512</FileAlignment>
    </PropertyGroup>
    <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
      <DebugSymbols>true</DebugSymbols>
      <DebugType>full</DebugType>
      <Optimize>false</Optimize>
      <OutputPath>bin\Debug\</OutputPath>
      <DefineConstants>DEBUG;TRACE</DefineConstants>
      <ErrorReport>prompt</ErrorReport>
      <WarningLevel>4</WarningLevel>
    </PropertyGroup>
    <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
      <DebugType>pdbonly</DebugType>
      <Optimize>true</Optimize>
      <OutputPath>bin\Release\</OutputPath>
      <DefineConstants>TRACE</DefineConstants>
      <ErrorReport>prompt</ErrorReport>
      <WarningLevel>4</WarningLevel>
    </PropertyGroup>
    <ItemGroup>
      <Reference Include="System" />
      <Reference Include="System.Core" />
      <Reference Include="System.Xml.Linq" />
      <Reference Include="System.Data.DataSetExtensions" />
      <Reference Include="Microsoft.CSharp" />
      <Reference Include="System.Data" />
      <Reference Include="System.Xml" />
    </ItemGroup>
    <ItemGroup>
      <Compile Include="Class1.cs" />
      <Compile Include="Properties\AssemblyInfo.cs" />
    </ItemGroup>
    <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
    <!-- To modify your build process, add your task inside one of the targets below and uncomment it. 
         Other similar extension points exist, see Microsoft.Common.targets.
    <Target Name="BeforeBuild">
    </Target>
    <Target Name="AfterBuild">
    </Target>
    -->
  </Project>

And here’s some PowerShell I hacked up to represent the same thing:

  param(
      $configuration = 'debug',
      $platform = 'anycpu'
  );

  $outputType = 'library';
  $projectGuid = [guid]'5C818DB6-C86B-4B05-AF13-A4CF46C8ACA9';
  $targetFrameworkVersion = 'v4.0';

  $errorReport = 'prompt';
  $warnLevel = 4;

  switch ( "$configuration|$platform" ) {
      'debug|anycpu' {
          $optimize = $false;
          $outputPath = 'bin\debug';
          $defines = 'debug','trace';

          $debugSymbols = $true;
          $debugType = 'full'; 
      }

      'release|anycpu' {
          $optimize = $true;
          $outputPath = 'bin\release';
          $defines = 'trace';

          $debugSymbols = $false;
          $debugType = 'pdbonly'; 
      }

      default {
          throw "$configuration|$platform is not a valid build configuration"
      }
  };

  $assemblyReferences = @(
      "System", "System.Core", "System.Xml.Linq", 
      "System.Data.DataSetExtensions", 
      "Microsoft.CSharp", "System.Data", "System.Xml"
  );

  $sourceFiles = @(
      "Class1.cs",
      "Properties\AssemblyInfo.cs"
  );

  import-module psbuild;
  function invoke-BeforeBuild {}
  function invoke-AfterBuild {}

Granted, I’m not putting much thought into this conversion – but even as is, this script lends itself to many more possibilities than its MSBuild counterpart.  For example, if you wanted to run your own static analysis on the source code for the project, you could dot-source this file and reference the $sourceFiles variable in your own script…
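E.g., here’s a quick-and-dirty sketch of that static analysis idea – the project script file name is my own assumption:

```powershell
# pull the project definition into the current session
. .\ClassLibrary1.project.ps1;

# reuse $sourceFiles for a naive analysis pass:
# flag any source file that still contains a TODO comment
$sourceFiles | where-object {
    select-string -path $_ -pattern 'TODO' -quiet
};
```

No parsing of project XML, no MSBuild object model – the project’s data is just variables in your shell.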

Anyway, opinion expressed.  Back to work.

Tickets on Sale for #pssat002

§ August 2, 2012 12:00 by beefarino |

I am pleased to announce that tickets for PowerShell Saturday #002 in Charlotte, NC on September 15, 2012 are now on sale! 

For a paltry $10 you get a full day of learning and fun.  We have fifteen sessions across three tracks - Introductory PowerShell, Advanced PowerShell, and Applied PowerShell - there will be no shortage of useful, practical PowerShell available to you on this day.

Speakers include scripting heavyweights like Scripting Guy Ed Wilson, as well as many field-tested shell hackers looking to bring you some awesome.  Presentations cover everything from getting started to managing enterprise systems, virtualization farms, Windows Server 2012, Active Directory, and SharePoint from the shell. 

In addition – and I’m a little excited about this one – you'll have a chance to compete for the first-ever Iron Scripter award, claim the title Iron Scripter – PowerShell Saturday #002, and walk away with a very special and unique prize...

For more information about the event, please see the event site at http://www.powershellsaturday.com/002.

To get tickets for the event, please register at http://www.powershellsaturday.com/002/register.

Here's hoping to see you there!

SeeShell NFR Licenses for Community Leaders and User Groups

§ July 6, 2012 17:37 by beefarino |

Here’s the skinny:

My company Code Owls LLC is proud to offer individual NFR licenses of SeeShell to technical and social community leaders.  If you fall into this category and could use a copy of our awesome software, simply contact us with your story and we’ll hook you up. 

We’re also offering NFR licenses to relevant user groups.  If you’d like a few SeeShell licenses to raffle to your members, send us your group information, including a website URL, and we’ll see about getting you the delicious PowerShell data visualization goodness your group so desperately needs.

Thinking Out Loud

SeeShell is a commercial product, and it exists to support the community.  How’s that?  Every purchase of SeeShell allows us to spend more time on our open source software solutions.  This is where we want to be focused, on making the technical world a better place for scripters and shell mages everywhere.  Selling SeeShell helps us accomplish that.

To that end, we’re working on something else as well – another PowerShell-y integration-ish visual-eye-candy-that-saves-you-tons-of-time-and-effort-making-you-more-awesome piece of commercial software.  We’re looking at a rapid time-to-market with a very surgical feature set.  Expect the announcement around the PowerShell Saturday #002 event.

PowerShell Scripting Game

§ June 9, 2012 16:24 by beefarino |

After our May meetup of the Charlotte PowerShell Users Group, one of the members offered an exciting idea: run a single Scripting Games-style event at our next script club.  Normally we use our script club meetups to help each other with PowerShell issues, but many of our group members commented on how much they learned by solving those realistic problems during this year’s Scripting Games, so I thought it would be a worthwhile effort. 

And I was right: not only did we have a blast, but everyone walked away learning something.  In fact, it was such a hit that we’ve decided to run another event at next month’s meetup.

Running the Game

The game started with me laying out the rules:

  1. We will run the game “just like” a real Scripting Games event, except the problem will be smaller and you’ll have to solve it faster.
  2. You may work alone or in a group.
  3. Scripts will be judged by two independent PowerShell smartypants on a 5-point scale.
  4. The script with the highest average rating wins a prize.

Then I presented the game parameters; these were proposed by group member Brian Wilhite.  In order to be rated, the script must do the following:

  1. Query the local computer for all installed software from Microsoft.
  2. Export the software Name, Vendor, and Version into a CSV file in the temp directory of the current user.
  3. The CSV file should not contain any extra type information.
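A passing solution along those lines might look like the following – the CSV file name is my own assumption, and Win32_Product is slow to query, but it is the most direct route:

```powershell
# query installed software, filter to Microsoft, and export
# Name/Vendor/Version to the user's temp directory as CSV
get-wmiobject -class Win32_Product |
    where-object { $_.Vendor -match 'Microsoft' } |
    select-object Name, Vendor, Version |
    export-csv -path (join-path $env:TEMP 'software.csv') -notypeinformation;
```

The -notypeinformation switch on export-csv is what satisfies rule 3, suppressing the #TYPE header line.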

Game players had 30 minutes or so to work on the script.  In true Scripting Games fashion, after 15 minutes a few of the more experienced scripters began to offer guidance to those who needed it.  After about 40 minutes total, members presented their solutions to the group for discussion and rating.

Our two judges for the evening – Brian Wilhite and Ed “Scripting Guy” Wilson – rated each script independently.  After everyone presented their scripts, the winner (member Glenn Hurley) was chosen as the script with the highest average rating.  Thanks to our sponsor SAPIEN, Glenn walked away with a license for PowerShell Studio 2012!

All in all, the event took about 2.5 hours to run, which left little time for our regular Script Club activities; no one seemed to mind, though.  In fact, they got so much out of it that they want to do another game at our next meetup.

What Worked

I think the game was a hit because 95% of our members are at the same level of PowerShell experience: little to none.  The problem was scoped correctly for them, and they walked away from the experience knowing a little more about WMI and filtering.  If they had been a more diverse group it would have been difficult to find a problem that would challenge everyone.

I also like the idea of having the members present their own scripts to the group.  Part of the user group experience is flexing those presentation muscles and putting your neck out in front of your peers in a safe and neutral environment.  Everyone seemed comfortable with it, which made me happy.

What Didn’t

A few things I want to change for next month…

I want to make members “submit” their scripts somehow.  It’s just too tempting to listen to all the critique and change your script before you present it.  Of course no one did that – I’m just thinking ahead to keep things fair.  Nothing fancy, even just emailing their script to me and presenting off of my laptop would work.

Instead of having the judges announce their ratings, I’m going to have them record them silently.  Critique will still be offered of course, since that’s why everyone is there.  But keeping the actual ratings a secret until the end builds suspense, and turns the game into a show. 

Which brings me to…

Where I Want This to Go

Other user groups in Corpus Christi and Arizona are asking about the game (which is why I’m writing this post).  I would *love* it if other groups started doing something similar.  If it catches on we can start adding some formality to the events and turn them into Iron Scripter competitions.  I think there is an opportunity there to foster learning and friendly competition year-round, with special Iron Scripter sessions at local SQL/Sharepoint/PowerShell Saturday events with prizes including the opportunity to claim one-of-a-kind titles like: “Iron Scripter – SQL Saturday #321”.

So, whatcha think interweb?