The Difference Between ~ and $home

§ February 4, 2013 10:08 by beefarino |

I keep getting burned by this one – hopefully writing this post will cement this dichotomy into my brain for the foreseeable future…

In PowerShell, there is a big difference between the “~” path shortcut and the value of the $home automatic variable:

  • “~” is a shortcut to the home path for the PowerShell provider backing the current location.  In other words, the value of “~” changes depending on what type of drive you’re currently working in (e.g., the file system, the registry, etc.).
  • $home is set to the user’s home directory; e.g.: C:\users\yourusername or whatever your corporate overseers want it to be.  $home never changes.

I keep using ~ when I mean to use $home.  Why?  Because ~ in Linux is the semantic equivalent to $home in PowerShell.  PowerShell got it close, but not quite “right.”  Consider:

~ and $home are identical if you’re working on a file system drive.  For example:

PS C:\> resolve-path ~


PS C:\> $home

When you move to a drive for a different provider, the value of ~ changes, usually to an undefined value:

PS C:\> cd cert:
PS cert:\> resolve-path ~
Resolve-Path : Home location for this provider is not set. To set the home location, call
"(get-psprovider 'Certificate').Home = 'path'".
At line:1 char:13
+ resolve-path <<<< ~
+ CategoryInfo : InvalidOperation: (:) [Resolve-Path], PSInvalidOperationEx
+ FullyQualifiedErrorId : InvalidOperation,Microsoft.PowerShell.Commands.ResolvePath

In fact, the only core PowerShell provider that defines a value for ~ is the file system provider:

PS C:\> get-psprovider | where {$_.home}

Name                 Capabilities                       Drives
----                 ------------                       ------
FileSystem           Filter, ShouldProcess              {C, D}

PS C:\>

In that respect, using ~ in scripts is fairly fragile unless you set-location to a file system path first.  $home is a better choice, since its value is an absolute path that never changes.
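
Per the error message above, you can also give a provider a home path yourself.  A quick sketch of both options (the cert: home path here is an arbitrary choice, and your $home layout may differ):

(get-psprovider 'Certificate').Home = 'cert:\CurrentUser\My';
cd cert:;
resolve-path ~;   # now resolves to cert:\CurrentUser\My

# the safer habit in scripts: anchor on $home, which never changes
join-path $home 'Documents\WindowsPowerShell';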

A Feature Request for Visual Studio v.Next

§ January 9, 2013 14:56 by beefarino |

Those of you that know me well know I don’t have many strong opinions.  I tend to keep multiple perspectives and work in whatever limits are provided.  So when I do express a strong opinion, it tends to be backed up with experience and reason (or at times, alcohol).

With that out of the way, let me express the only feature I really desire in the next version of Visual Studio:

Replace the format of all project and solution files with PowerShell scripts.

I hear you groaning – just hear me out.  I have many reasons for wanting this – too many to list here, so I’ll stick to the highlights.  In a nutshell, they all boil down to the notion of simplicity.

Expressing Data and Logic

A build has two broad parts: data that defines WHAT to build, and logic that defines HOW to build it.  E.g., your typical C# project file contains a list of source files and project/assembly references, and a set of instructions for accomplishing specific things with that data – such as producing assemblies or deploying a website.

It’s simple to express data in PowerShell.  You can declare arrays and hashtables inline.  You can create complex object hierarchies if needed. 
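
A minimal sketch of what that looks like (the names and values are illustrative):

# an array and a hashtable, declared inline
$sourceFiles = 'Class1.cs', 'Properties\AssemblyInfo.cs';
$buildSettings = @{
    Configuration = 'Debug';
    Platform      = 'AnyCPU';
    WarningLevel  = 4;
};
$buildSettings.Configuration;   # -> Debug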

It’s also simple to express data in XML.  After all, that’s what XML is for.

It’s simple to express logic in PowerShell; expressing logic is what a programming language is designed to do.  However, expressing logic in XML is … well, “cumbersome” is a generous word.  “Obtuse” is perhaps a better choice.

Customizing the Build

It gets worse when you require custom build steps.  Getting logic into the MSBuild XML schema requires a lot of ceremony that serves nothing outside of MSBuild.  Consider the process:

  1. Implement your build task in C# (pulling in all the necessary bits to make MSBuild recognize your task)
  2. Register the task DLL in your project file using the <UsingTask /> element
  3. Add the XML necessary to get your task working in the build
  4. Reload the project in Visual Studio
  5. Address any security warnings that pop up related to the new “unknown” task you added

At this point, the logic of your custom build task is about as far away from the build as it can get – inside of a binary dll that now must be packed around with the project file.  You have literally no inroad to the task in the same band as the build – all information about using the task (its name, dependencies, parameters, outputs, etc) must be communicated elsewhere.

Now, look at the customization process for the PowerShell project file:

  1. Modify the PowerShell build script
  2. There is no step 2.  Again, simplicity.

Moreover, PowerShell is explicitly transparent – documentation is a get-help command away, and the built-in discovery mechanisms let you know what’s there.  Plus, the build logic stays with the build.
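
For example – assuming the hypothetical psbuild module and function names used later in this post – discovering the build surface is a one-liner:

# what commands does the build module expose?
get-command -module psbuild;

# how do I use this build function? (the help text lives with the code)
get-help invoke-BeforeBuild -full;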

Running the Build

Here’s where my opinion really polarizes.  I get irate when a software project can only build inside Visual Studio.  Recent examples of my experiences here include Azure deployments and SCOM management packs.  It’s not that I mind the experience of pushing the Deploy button and having a magical process ensue that mystically transfers and configures an entire website and database – quite the contrary, I want that experience.  The thing is, I want it everywhere, not just in Visual Studio.  What exactly constitutes “everywhere?”  For starters, I want the same build experience in:

  • Visual Studio
  • the shell
  • my co-worker’s machine
  • a fresh VM image
  • the automated CI server or build farm

If the build were “just PowerShell,” this would be a piece of cake.  In fact, I’ve taken to using PSake to drive my builds these days for this very reason – so I can maintain a consistent expectation of the build from one location to the next.
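
To give a flavor of that: a PSake build file is just PowerShell.  A minimal sketch (the task names and project path are assumptions) looks like this:

properties {
    $configuration = 'Debug';
}

task default -depends Build

task Build {
    # exec throws if the native command exits with a non-zero code
    exec { msbuild .\ClassLibrary1.csproj /p:Configuration=$configuration }
}

You run it the same way from any of the places listed above: invoke-psake .\default.ps1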

What it Might Look Like

I dunno.  Let’s see what I can do with a default C# class library project.  Here’s the original MSBuild project file:

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
    <Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
    <ProductVersion>8.0.30703</ProductVersion>
    <SchemaVersion>2.0</SchemaVersion>
    <ProjectGuid>{5C818DB6-C86B-4B05-AF13-A4CF46C8ACA9}</ProjectGuid>
    <OutputType>Library</OutputType>
    <AppDesignerFolder>Properties</AppDesignerFolder>
    <RootNamespace>ClassLibrary1</RootNamespace>
    <AssemblyName>ClassLibrary1</AssemblyName>
    <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
    <FileAlignment>512</FileAlignment>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
    <DebugSymbols>true</DebugSymbols>
    <DebugType>full</DebugType>
    <Optimize>false</Optimize>
    <OutputPath>bin\Debug\</OutputPath>
    <DefineConstants>DEBUG;TRACE</DefineConstants>
    <ErrorReport>prompt</ErrorReport>
    <WarningLevel>4</WarningLevel>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
    <DebugType>pdbonly</DebugType>
    <Optimize>true</Optimize>
    <OutputPath>bin\Release\</OutputPath>
    <DefineConstants>TRACE</DefineConstants>
    <ErrorReport>prompt</ErrorReport>
    <WarningLevel>4</WarningLevel>
  </PropertyGroup>
  <ItemGroup>
    <Reference Include="System" />
    <Reference Include="System.Core" />
    <Reference Include="System.Xml.Linq" />
    <Reference Include="System.Data.DataSetExtensions" />
    <Reference Include="Microsoft.CSharp" />
    <Reference Include="System.Data" />
    <Reference Include="System.Xml" />
  </ItemGroup>
  <ItemGroup>
    <Compile Include="Class1.cs" />
    <Compile Include="Properties\AssemblyInfo.cs" />
  </ItemGroup>
  <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
  <!-- To modify your build process, add your task inside one of the targets below and uncomment it.
       Other similar extension points exist, see Microsoft.Common.targets.
  <Target Name="BeforeBuild">
  </Target>
  <Target Name="AfterBuild">
  </Target>
  -->
</Project>

And here’s some PowerShell I hacked up to represent the same thing:

param(
    $configuration = 'debug',
    $platform = 'anycpu'
);

$outputType = 'library'
$projectGuid = [guid]'5C818DB6-C86B-4B05-AF13-A4CF46C8ACA9';
$targetFrameworkVersion = 'v4.0'

$errorReport = 'prompt';
$warnLevel = 4;

switch ( "$configuration|$platform" ) {
    'debug|anycpu' {
        $optimize = $false;
        $outputPath = 'bin\debug';
        $defines = 'debug','trace';

        $debugSymbols = $true;
        $debugType = 'full';
    }

    'release|anycpu' {
        $optimize = $true;
        $outputPath = 'bin\release';
        $defines = ,'trace';

        $debugSymbols = $false;
        $debugType = 'pdbonly';
    }

    default {
        throw "$configuration|$platform is not a valid build configuration"
    }
};

$assemblyReferences = @(
    "System", "System.Core", "System.Xml.Linq",
    "System.Data.DataSetExtensions",
    "Microsoft.CSharp", "System.Data", "System.Xml"
);

$sourceFiles = @(
    "Class1.cs",
    "Properties\AssemblyInfo.cs"
);

import-module psbuild;
function invoke-BeforeBuild {}
function invoke-AfterBuild {}

Granted, I’m not putting much thought into this conversion – but even as is, this script lends itself to many more possibilities than its MSBuild counterpart.  For example, if you wanted to run your own static analysis on the source code for the project, you could dot-source this file and reference the $sourceFiles variable in your own script…
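
For instance, a hypothetical analysis script (the project file name and the pattern are assumptions) might look like:

# dot-source the project script to pull its variables into scope
. .\ClassLibrary1.proj.ps1;

# scan every source file in the project for lingering TODO markers
select-string -path $sourceFiles -pattern 'TODO' |
    format-table filename,linenumber,line -auto;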

Anyway, opinion expressed.  Back to work.

Tickets on Sale for #pssat002

§ August 2, 2012 12:00 by beefarino |

I am pleased to announce that tickets for PowerShell Saturday #002 in Charlotte, NC on September 15, 2012 are now on sale! 

For a paltry $10 you get a full day of learning and fun.  We have fifteen sessions across three tracks - Introductory PowerShell, Advanced PowerShell, and Applied PowerShell - so there will be no shortage of useful, practical PowerShell available to you on this day.

Speakers include scripting heavyweights like Scripting Guy Ed Wilson, as well as many field-tested shell hackers looking to bring you some awesome.  Presentations cover everything from getting started to managing enterprise systems, virtualization farms, Windows Server 2012, Active Directory, and SharePoint from the shell.

In addition – and I’m a little excited about this one - you'll have a chance to compete for the first-ever Iron Scripter award and claim the title Iron Scripter - PowerShell Saturday #002, and walk away with a very special and unique prize...

For more information about the event, please see the event site at

To get tickets for the event, please register at

Here's hoping to see you there!

SeeShell NFR Licenses for Community Leaders and User Groups

§ July 6, 2012 17:37 by beefarino |

Here’s the skinny:

My company Code Owls LLC is proud to offer individual NFR licenses of SeeShell to technical and social community leaders.  If you fall into this category and could use a copy of our awesome software, simply contact us with your story and we’ll hook you up. 

We’re also offering NFR licenses to relevant user groups.  If you’d like a few SeeShell licenses to raffle to your members, send us your group information, including a website URL, and we’ll see about getting you the delicious PowerShell data visualization goodness your group so desperately needs.

Thinking Out Loud

SeeShell is a commercial product, and it exists to support the community.  How’s that?  Every purchase of SeeShell allows us to spend more time on our open source software solutions.  This is where we want to be focused, on making the technical world a better place for scripters and shell mages everywhere.  Selling SeeShell helps us accomplish that.

To that end, we’re working on something else as well – another PowerShell-y integration-ish visual-eye-candy-that-saves-you-tons-of-time-and-effort-making-you-more-awesome piece of commercial software.  We’re looking at a rapid time-to-market with a very surgical feature set.  Expect the announcement around the PowerShell Saturday #002 event.