What is the Metric for Obtuse LINQ?

§ August 4, 2010 03:14 by beefarino |

The more I use LINQ, the more I like it.  However, I'm starting to wonder when it becomes too much.

Check out the code below.  My concern is that the other guy on the project won't have a clue what it's doing.  See if you can tell what it does before reading the explanation that follows.  I'm no LINQ expert, so any advice on making this simpler and easier to comprehend is appreciated…

public IEnumerable<DocumentField> Adapt( IEnumerable<ScanMark> marks )
{
    var fieldGroupMarkValues = from mark in marks
                               let monikerParts = mark.Name.Split( new char[] { ':' }, 2 )
                               let groupName = monikerParts[ 0 ]
                               let markValue = mark.IsMarked ? monikerParts[ 1 ] : null
                               select new
                                          {
                                              FieldGroup = groupName,
                                              Value = markValue
                                          };

    var fieldGroupValues = from markGroup in fieldGroupMarkValues
                           group markGroup by markGroup.FieldGroup into fields
                           let monikerParts = fields.Key.Split( new char[] { '!' }, 2 )
                           let fieldName = monikerParts[ 0 ]
                           let groupName = monikerParts[ 1 ]
                           let rawValue = ( from f in fields where null != f.Value select f.Value ).ToArray()
                           let zeroValue = ( ! rawValue.Any() ? " " : null )
                           let strayValue = 1 < rawValue.Count() ? "*" : null
                           let fieldValue = rawValue.FirstOrDefault()
                           select new
                                      {
                                          FieldName = fieldName,
                                          GroupName = groupName,
                                          Value = zeroValue ?? strayValue ?? fieldValue
                                      };

    return from fieldGroup in fieldGroupValues
           group fieldGroup by fieldGroup.FieldName into fields
           let fieldName = fields.Key
           let value = from g in fields select g.Value
           select new DocumentField()
                      {
                          Name = fieldName,
                          Value = String.Join( "", value.ToArray() )
                      };
}

In a nutshell, the Adapt method transforms a set of ScanMark objects into a set of DocumentField objects.  The tricky part is that there is a hierarchy encoded into the ScanMark.Name property values, which appear similar to this:

FIELDNAME!GROUPNAME:MARKNAME

Each ScanMark belongs to a named group of marks – GROUPNAME in the example above.  The collection of marks within a group determines a single string value for the group.  There can be three valid states:

  1. The group contains no marks (every ScanMark.IsMarked value is false ).  In this case, the group value is a space (“ ”).
  2. The group contains one mark (only one ScanMark.IsMarked value is true ).  In this case, the group value is the name of the selected ScanMark.
  3. The group contains more than one mark (two or more ScanMark.IsMarked values are true).  In this case, the group value is an asterisk (“*”).

Moreover, the values across a set of groups are concatenated into a single named field value (the name being FIELDNAME from the ScanMark.Name property example).

So my thought process is set-based:

  1. Collate marks across all field groups;
  2. Transform field group mark values into a single field group value;
  3. Collate field group values into a single value for the field.
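For comparison, here is one way to make those three steps read explicitly: pull the three-state rule into a named helper and label each step with a comment.  This is only a sketch – the ScanMark and DocumentField types below are minimal stand-ins for whatever the real project defines:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class ScanMark
{
    public string Name { get; set; }
    public bool IsMarked { get; set; }
}

public class DocumentField
{
    public string Name { get; set; }
    public string Value { get; set; }
}

public static class ScanMarkAdapter
{
    // the three-state rule: no marks -> " ", stray marks -> "*",
    // exactly one mark -> the mark's name
    static string GroupValue( IEnumerable<string> markedNames )
    {
        var names = markedNames.ToArray();
        if( 0 == names.Length ) { return " "; }
        if( 1 < names.Length )  { return "*"; }
        return names[ 0 ];
    }

    public static IEnumerable<DocumentField> Adapt( IEnumerable<ScanMark> marks )
    {
        // 1. collate marks across all field groups ("FIELD!GROUP:MARK")
        var groups = from mark in marks
                     let groupAndMark = mark.Name.Split( new[] { ':' }, 2 )
                     group new { mark.IsMarked, MarkName = groupAndMark[ 1 ] }
                         by groupAndMark[ 0 ];

        // 2. transform each group's marks into a single group value
        var groupValues = from g in groups
                          let fieldAndGroup = g.Key.Split( new[] { '!' }, 2 )
                          select new
                          {
                              FieldName = fieldAndGroup[ 0 ],
                              Value = GroupValue( from m in g
                                                  where m.IsMarked
                                                  select m.MarkName )
                          };

        // 3. collate group values into a single value per field
        return from gv in groupValues
               group gv.Value by gv.FieldName into field
               select new DocumentField
               {
                   Name = field.Key,
                   Value = String.Join( "", field.ToArray() )
               };
    }
}
```

With marks named "AMT!TENS:4" and "AMT!ONES:2" (both marked), this yields one DocumentField named "AMT" with the value "42"; a group with no marked ScanMarks contributes a space, and a group with several contributes an asterisk.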

I’m just not convinced the code says this clearly.  Thoughts?



Open Sourcing of the ASP.NET Membership PowerShell Provider

§ July 27, 2010 01:41 by beefarino |

Over the past year I've blogged and presented about the benefits of targeting PowerShell as a framework for supporting the applications you develop.  I firmly believe PowerShell is the most appropriate choice of platforms for creating interactive and flexible toolsets.  Today I'm proud to announce that one of the original projects that led to this belief - the ASP.NET Membership PowerShell Provider - is being released by Code Owls LLC as open source.

You can find the project hosted on CodePlex here.

My reasons for this don't really center around wanting to share the code.  That is, I've already written detailed blogs about creating this particular provider, and plan to round those out with a few more posts.

So in my mind, the code is already public.  The primary reason I wanted to get this project public and posted was to get people using it and contributing to the project.  At the moment, the glaring omission is Active Directory support, and this is where I need the most help since I don’t have ready access to an Active Directory environment.  If you’re interested in helping out, by all means contact me through this blog or through the CodePlex project page.

I realize this project may be a bit niche, but it's a niche begging to be filled.  The existing Membership toolset is atrocious, and the applicable PowerShell offering is robust, interactive, and full of chewy goodness.

Enjoy!



CodeStock 2010: PowerShell as a Tools Platform

§ July 4, 2010 01:54 by beefarino |

At long last, I'm home from my working vacation and have a chance to do some CodeStock postprocessing.  Several people have asked me for the resources from my PowerShell presentation.  You'll find downloadable RARs of the PowerPoint deck and code below.  I've also placed the deck on SlideShare for convenience.

The bulk of the code is described in the following posts.

I will be adding a few posts soon to round out the code coverage.  Feel free to drop me any questions or concerns you have.  I'd love the chance to give this talk to any .NET user groups in the area!

ASPNETMembership.rar (284.35 kb)

PowerShell as a Tools Platform.rar (1.17 mb)



Log4Net Hack: Customizing Log File Paths After Configuration

§ November 28, 2009 04:48 by beefarino |

I get a lot of log4net questions through my blog because of the tutorials I've written up.  One item that comes up frequently is how to configure a FileAppender so that multiple instances of the same application can write to a single log file.  The truth is, you can't do it.  Or more precisely, there is no way to do it without losing log events and having your application performance suffer greatly.

First let me show you how to allow multiple applications access to a single log file.  It's actually quite easy:

 <?xml version="1.0" encoding="utf-8" ?>
 <configuration>
  <configSections>
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net"/>  
  </configSections>
  <log4net>
    <appender name="FileAppender" type="log4net.Appender.FileAppender">
      <file value="log-file.txt" />
      <appendToFile value="true" />
      <lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
      <layout type="log4net.Layout.SimpleLayout" />         
    </appender>
    <root>
      <level value="DEBUG" />
      <appender-ref ref="FileAppender" />      
    </root>    
  </log4net>
  
</configuration> 

Since I discuss the parameters and pitfalls of the FileAppender elsewhere, I will leave it to you to read up on them more if you want to.  The locking model used here causes log4net to acquire the file handle before each log event, then close the handle after the event is written.  Doing this allows other applications to get write access to the file when no one else is currently logging, but the technique has a few serious flaws that should prevent you from using it:

  1. All of that file opening and closing seriously hampers performance;
  2. The log file will be shared, but access conflicts will still occur between applications attempting to log events at the same time, resulting in more performance degradation and "dropped" log events.

You may be able to address some of the performance issues using a BufferingForwardingAppender that sends large chunks of events to the minimally-locking FileAppender; however this will not resolve the contention over the single log file that is at the root of the issue.
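For reference, the buffering setup looks something like this in configuration – a sketch only, and the bufferSize value is purely illustrative:

```xml
<!-- forward events to the minimally-locking FileAppender in chunks;
     the bufferSize here is illustrative, not a recommendation -->
<appender name="BufferingForwardingAppender"
          type="log4net.Appender.BufferingForwardingAppender">
  <bufferSize value="100" />
  <appender-ref ref="FileAppender" />
</appender>
<root>
  <level value="DEBUG" />
  <appender-ref ref="BufferingForwardingAppender" />
</root>
```

Events are held until the buffer fills (or the appender is flushed), so fewer open/close cycles hit the file – but every process still contends for the same file.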

The simplest solution to this problem is to use a different type of appender that is designed for concurrent use.  For instance, the EventLogAppender or AdoNetAppender use technologies that will manage concurrency issues for you.  If you're dead set on using the filesystem, the next simplest solution is to have each application log to a unique file, thus removing any log file contention at runtime.  The separate log files can be collated once the run is over using a tool like LogParser.  The drawback to this approach is that you have to hack it in: there is no direct way to modify the filename in the FileAppender based on the runtime environment.

That said, it's not hard.  Check out this simple console application:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using log4net;
using log4net.Appender;

namespace log4netPostConfig
{
    class Program
    {       
        static void Main( string[] args )
        {
            // load the logging configuration from the app.config
            log4net.Config.XmlConfigurator.Configure();

            // process-specific postfix for the log filenames
            var filePostfix = "_" + Guid.NewGuid().ToString( "N" );
            
            // find every FileAppender in the active configuration ...
            var fileAppenders = from appender in log4net.LogManager.GetRepository().GetAppenders()
                                where appender is FileAppender
                                select appender;

            // ... and tack the postfix onto each appender's filename;
            // ActivateOptions recreates the file handle using the new name
            fileAppenders.Cast<FileAppender>()
                .ToList()
                .ForEach(
                    fa =>
                    {
                        fa.File += filePostfix;
                        fa.ActivateOptions();
                    }
                );

            ILog Log = log4net.LogManager.GetLogger( System.Reflection.MethodBase.GetCurrentMethod().DeclaringType );
            Log.InfoFormat( "this process is using log file postfix [{0}]", filePostfix );
        }
    }
} 

This example loads the logging configuration from the app.config via XmlConfigurator.Configure().  The configured appenders are then searched for FileAppender instances, and each one's filename parameter is hand-rolled with some process-specific information - a GUID in this case; the current process identifier would be another good choice.  Calling ActivateOptions on each modified appender is vital, as it recreates the file handle using the new filename configuration parameter set in the code.

The app.config for this example is just a plain vanilla logging configuration:

<?xml version="1.0" encoding="utf-8" ?>
 <configuration>
  <configSections>
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net"/>  
  </configSections>
  <log4net>
    <appender name="FileAppender" type="log4net.Appender.FileAppender">
      <file value="log-file.txt" />
      <appendToFile value="true" />
      <encoding value="utf-8" />
      <layout type="log4net.Layout.SimpleLayout" />         
    </appender>
    <root>
      <level value="DEBUG" />
      <appender-ref ref="FileAppender" />      
    </root>    
  </log4net>
  
</configuration>

Note that the log-file.txt specified in the app.config will be created when XmlConfigurator.Configure() loads the XML logging configuration, but it will never be written to.
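As a closing aside: if a process-unique file name is all you're after, log4net's PatternString can expand process-specific values directly in the file element, which sidesteps the post-configuration code entirely – assuming your log4net build supports PatternString here:

```xml
<!-- PatternString expands %processid when the configuration is parsed,
     so each process gets its own log file without any code changes -->
<file type="log4net.Util.PatternString" value="log-file-[%processid].txt" />
```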

Edit Notes

I just noticed after publishing this that a very similar example was written almost 2 months ago by Wil Peck.