Why I Hate IServiceProvider

§ February 20, 2009 09:18 by beefarino |

I've worked with a lot of code that uses IServiceProvider as a way to disconnect an object from its dependencies.  I've come to loathe this interface for many reasons and have opted for a systematic pattern of dependency injection.

First reason IServiceProvider sucks: it hides an object's dependencies even as it decouples the object from them.  What do I mean by that?  Pretend you're using this black-box component from your code:

public class BlackBoxComponent
{
    public BlackBoxComponent( IServiceProvider services );
    public void DoAllTheWork();
} 

Can you tell what services are going to be requested from the service provider?  Me neither.  Now you need another way to discover what dependencies need to be available to the BlackBoxComponent - documentation, source code, something out-of-band that takes you away from your work at hand.

Compare that with some simple constructor injection:

public class BlackBoxComponent
{
    public BlackBoxComponent( IRepository< Thing > thingRepository, ILogManager logManager );
    public void DoAllTheWork();
}

With this, you know exactly what a BlackBoxComponent needs to do its job just from looking at the constructor.

Second reason IServiceProvider sucks: it adds a lot of code.  Fetching the services is cumbersome at best:

    // ...    
    public BlackBoxComponent( IServiceProvider services )
    {
        thingRepository = ( IRepository< Thing > ) services.GetService( typeof( IRepository< Thing > ) );
        logManager  = ( ILogManager ) services.GetService( typeof( ILogManager ) );
    }
    // ...

Sure you can use some syntactic sugar to work around the typeof'ing and naked casting:

public static class ServiceProviderExtension
{
    public static T GetService< T >( this IServiceProvider serviceProvider )
    {
        return ( T ) serviceProvider.GetService( typeof( T ) );
    }
}

which cleans up the code a bit:

// ...    
public BlackBoxComponent( IServiceProvider services )
{
    thingRepository = services.GetService< IRepository< Thing > >();
    logManager  = services.GetService< ILogManager >();
}
// ...

but you're still stuck having to reach out and grab every dependency you need from the service container - which implies that somewhere, some other piece of code is responsible for filling up that service container:

//...
ServiceContainer services = new ServiceContainer();
services.AddService( 
    typeof( ILogManager ),
    new Log4NetLogManager()
);
services.AddService( 
    typeof( IRepository< Thing > ),
    new ThingRepository()
);
//...

More code to write, all of it unnecessary and obsolete given the state of the art in dependency injection frameworks. 
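To make the contrast concrete, here's a toy sketch of the mechanism those frameworks automate - register interface-to-implementation mappings once, then let reflection satisfy constructor parameters recursively.  The names here are hypothetical; real containers (Castle Windsor, StructureMap, Unity, etc.) do this and much, much more:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A toy constructor-injection resolver: map services to implementations,
// then build whole object graphs by reflecting over constructors.
public class TinyContainer
{
    private readonly Dictionary<Type, Type> map = new Dictionary<Type, Type>();

    public void Register<TService, TImpl>() where TImpl : TService
    {
        map[typeof(TService)] = typeof(TImpl);
    }

    public T Resolve<T>()
    {
        return (T)Resolve(typeof(T));
    }

    private object Resolve(Type service)
    {
        // fall back to the requested type itself for unregistered concretes
        Type impl = map.ContainsKey(service) ? map[service] : service;
        var ctor = impl.GetConstructors().First();

        // recursively satisfy each constructor dependency
        object[] args = ctor.GetParameters()
                            .Select(p => Resolve(p.ParameterType))
                            .ToArray();
        return ctor.Invoke(args);
    }
}
```

With something like this, the wiring above collapses to a couple of Register calls plus a single Resolve call for the top-level component - and the component itself never has to reach into a service locator.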



Lack of Consistency in PowerShell

§ February 16, 2009 02:40 by beefarino |

Y'all know I love powershell.

But I'm getting pretty tired of the lack of consistency in the product.  I'm not speaking of quality here - just about how to get things done.  Case in point: at the moment I'm trying to figure out the new modules feature, which so far hasn't been difficult.  The most annoying thing is that I keep trying to get the list of available modules by typing this:

dir module:

which works for other powershell internals like variables:

dir variable:

and functions:

dir function:

but not for modules.  Why doesn't it work?  Well, those little drive-letter-style monikers need something called a provider to enable them.  There are providers built into powershell that enable this feature for variables and functions, but none for modules.

Not a big deal really, but one of the original selling points of powershell was its consistency - files, registry, certificates, etc., they all look like a little file system when you work with them.  So the act of adding, removing, moving, and renaming these things always looks the same.  Why should I build up this expectation when its availability is spotty?  And I'm not sure why a provider isn't managing this - modules are stored on the filesystem anyway, in a few specific places, and outside of using them you have all the basic provider operations: create, delete, rename, etc.  Having a provider around them should be a no-brainer.  The fact that one doesn't exist tells me that either it's too much effort (which, having written a few providers myself, I can say is probably the case) or it goes against the grain of the powershell "philosophy of use".

Oh well, it's still CTP3, maybe they'll have it in the RTM, right?  Or maybe I just don't "get" when something should have a provider and when it shouldn't.  Am I missing the point, or is this a case of powershell not eating its own dogfood?

Update

About two seconds after posting this I saw this line at the top of the Modules module.psm1 file:

# Create a drive for My Modules
New-PsDrive -Scope Global -Name MyMod -PSProvider FileSystem -Root (($env:PSMODULEPATH -split ";")[0]) 

which meets my general spelunking needs.

*sigh*

foot | mouth;


The Thing about Best Practices

§ February 12, 2009 03:34 by beefarino |

There's been a big stinky vine growing on the interwebs lately, rooted in some comments made by Joel and Jeff during a podcast and aggravated by a recent post on Coding Horror.  If you've been under a rock for the past week, a good summary can be found here.  Or go google stackoverflow podcast rant and bathe in it. 

I'm pretty late to the game on this meme, but since my experience and attitude seem to differ a bit from what I'm reading elsewhere, I thought I'd share.

I think I'm an above-average software engineer.  It's something I take seriously, and I spend a great deal of my time learning, experimenting, reading, and seeking out new knowledge in this area.  It's one of the things I would pursue even if I wasn't getting paid to do it.  That said, I routinely choose not to apply good engineering principles on software projects where it yields no benefit.  E.g., I don't do as much TDD or refactoring on spikes when I know the code is going to be thrown away.  

I also think I'm a mediocre woodworker, at best.  I enjoy doing it, but I don't feel the same compulsion to learn as much as I can, to hone those skills, or to sacrifice the time and resources necessary to take them to the next level.  I'm content doing it as poorly as I do.  However, two years ago when I was working on the two-story tree house that would be holding my girls eight feet off the ground, you can bet your ass I learned all I could about proper tree house design and construction and applied every bit of that knowledge to my project.

What I'm trying to say is this:

  • you don't always have to apply the best practices, but you can't make that choice without knowing what they are;
  • you don't have to be very experienced to realize what you don't know and when you need to learn it.


Convert-FromHex PowerShell Filter

§ February 11, 2009 09:20 by beefarino |

While moving some data around, I found myself in need of a powershell filter to translate a hex string into its byte array equivalent.  I've written this routine many times, but never quite this succinctly:

filter Convert-FromHex
{
    $_ -replace '^0x', '' -split "(?<=\G\w{2})(?=\w{2})" | %{ [Convert]::ToByte( $_, 16 ) }
}

My favorite part is the regex used to split the hex string - it matches nothing concrete, only zero-width lookarounds.  The \G in the lookbehind anchors each split point to the end of the previous match, so the string breaks cleanly into two-character chunks without consuming a single character.
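Since -split rides on the same .NET regex engine, you can poke at those lookarounds from C# as well.  A quick sketch (the input string is just an example):

```csharp
using System;
using System.Text.RegularExpressions;

class SplitDemo
{
    static void Main()
    {
        // \G asserts the position where the previous match ended, so the
        // zero-width split points land only after each complete pair of
        // word characters - nothing is consumed from the string.
        string hex = "1234abcd";
        string[] pairs = Regex.Split(hex, @"(?<=\G\w{2})(?=\w{2})");
        Console.WriteLine(string.Join(",", pairs)); // 12,34,ab,cd
    }
}
```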

Use it like any other pipeline filter when you have a hex string and want a byte array; e.g.:

PS >"0x1234" | convert-fromhex
18
52

Enjoy!