Thursday, May 29, 2008

Hitler's Toothbrush

You know how anyone caught wearing a toothbrush style mustache these days would be looked at with a bit of disdain?  (Well, by most people...)  This latest article tells me that we're taking this stuff a bit too far.  I mean, really?  The iced coffee pitch is a subliminal message promoting the virtues of terrorism?  Yasser Arafat is now defining fashion in the U.S.?  Michelle Malkin, the author of the terrorism comments, ought to be ashamed of herself.  How hard do you have to be looking to see symbolism in Rachael Ray's neck scarf?  People are gullible - you say something like that and they believe you.  (See...  You believed that, didn't you?)  I mean...  To the point that they actually pulled the ads.

What if Hitler had never trimmed his mustache?  The entire mustache-wearing world would have been deeply affected.  Fortunately for the world, he was caught wearing a decidedly unattractive, caterpillar-looking patch of pubic hair that won't soon be missed.  Let's be clear...  I am not saying that caterpillars were in any way responsible for The Holocaust!  If a guy dresses like Michael Jackson, it doesn't mean he can dance.

We need to grow up.  Outside of the fashion industry, people are not defined by their facial hair or their scarves.  In the real world (shoot... even in most virtual worlds) people are defined by their actions.  What's the real problem, Malkin?  Didn't get your caffeine that morning?

Tuesday, February 12, 2008

Database Dependent Duo Delivers Dazzling Dynamic Data Display

How many of you out there - raise your hands - have databases with more than one table?  Now, how many of you - raise your hands, again - actually raised your hands?  You know I can't actually see you, right?  (And in case you answered that out loud...  I can't hear you, either.)

I do a lot of work with Microsoft's Identity Lifecycle Manager and I'm quite privileged to do much of that work with one Brad Turner - Identity Management MVP and all around nice guy.  (Funny looking, but nice.)  Brad has developed a series of SQL Server Reporting Services-based reports for viewing the status and history of the identity data and management agent processing for your ILM (MIIS) system.  Over the course of several implementations, we have used these reports as the foundation for developing an ILM Management Portal...  We are co-presenting on this topic at the upcoming DEC 2008 conference in Chicago.

But, this is not the story of our ILM portal.  This is the story of a nifty little technique I developed in order to make the data we present in our portal a bit more dynamic.  So, the only reason I bring up the history is to brag a bit about our awesome portal solution and our upcoming DEC presentation.  The rest of this is best illustrated with a simple example...

Let's go with stereotypes...  You're a business.  You have data.  You have data about your products.  You have data about your customers.  You have data about the products your customers order.  It's probably safe to assume that you might want to view data about a particular customer with a list of their orders and the ability to see the product details of each order.  But if you are using SQL Reporting Services, there's no clean way to do this...  Yes, you can create sub-reports.  You can even create reports that let you drill down into the details of a child record, but with SRS that typically means the child detail report replaces the parent, leaving just a link to navigate back up the tree.  But what you really want is to be able to click on an order and see the details on the same page.  (This is for an online reporting solution...  I'm not suggesting that you'll be able to click on a line in a printed report and have the data change.  Just want to be clear about that.)

Well, here's how you do it using Windows SharePoint Services (WSS) and SQL Reporting Services (SRS)...  (If you're a developer, and  you look at .Net 2.0 Web Parts and such, you'll probably realize that WSS is not necessarily a required piece, here, but it sure helps with the presentation...  And it's free!)  We'll also need some HTML and a bit of JavaScript, but I'll give you most of that.  (I am going to assume that you know your way around the basics of WSS and SRS - if not, then you'll probably have to do a bit of reading up on that before you can implement this stuff.)

The basics:  We have a system with WSS installed and SRS installed in SharePoint integration mode.  (This gives us a handy little Report Viewer Web Part that we use as the anchor for our dynamic report.)  We also have a few reports:

  • Customer Detail - Displays information about our customer: Name, address, customer number and so forth.
  • Customer Orders - Displays a list of orders for a particular customer.
  • Order Detail - Displays the details of a particular order.

Our desired end result for this example is a two paneled report.  The upper panel showing the customer detail, along with a list of their orders, each displayed as a hyperlink.  The lower panel showing the detail of any order that you click on.

The first step is to create the parent report for the upper panel.  (This is not a tutorial on creating SRS reports, there're plenty of them out there.  I'm just giving you the basic chunks of what needs to happen.)  This is actually two reports.  You create a customer detail report and embed a customer orders report within it.  You can develop and test this within Visual Studio.  You'll most likely have the customer detail report accept the customer name or number as a parameter, then pass that along to the embedded orders report.  (We're going to add some funky navigation to the orders report, but we'll come back to that.)

Next, create an order details report that accepts an order number as a parameter.

Now, let's move to WSS...  As part of the SRS / SharePoint integration, you'll already have a document library for your reports - configure this doc lib as your deployment target in Visual Studio.  We need a nice new Web Part page to act as the canvas for our dynamic report.  For our project, I created a new document library called Dashboards that I used to collect these pages.  I used the Web Part template that just has a single column of web parts the full width of the page.

The first web part to add is the Report Viewer Web Part.  Configure this guy to display the parent report - the one with the customer detail and embedded orders list.  Below that, add a Content Editor Web Part.  This is where a good part of the magic takes place.  This Web Part is going to contain two important elements: An HTML iframe definition and a short JavaScript function to update the source attribute of the iframe.  Make sure that when you edit the Content Editor Web Part, you use the Source editor, not the Rich Text editor.  Here's what the contents of this Web Part will look like:

Content Editor JavaScript:
<script type="text/javascript">
  // Called by the parent report's "Jump to URL" navigation.
  // Re-points the iframe below at the Order Detail report, passing the
  // order number along as a report parameter.  The rs: and rc: arguments
  // tell SRS to render the report with no toolbar or parameter prompt.
  function LoadOrderDetails(OrderNum)
  {
    document.getElementById('OrderDetail').src =
      'http://win2k3/reportserver?http://win2k3/ilm/reports/' +
      'OrderDetail.rdl&rs:Command=Render&rc:Toolbar=false' +
      '&rc:Parameters=false&OrderNum=' + OrderNum;
  }
</script>

<iframe name="OrderDetail" scrolling="no" id="OrderDetail" src="" width="100%"></iframe>

(Obviously you'll have to tweak that a bit - use URLs appropriate to your setup.  You might have to play with the URL format a bit, too, depending on where your SRS virtual directories get placed.)

A quick overview of what this code does...  It exposes a JavaScript function to the page.  In this case, the function is called LoadOrderDetails().  The function accepts a single parameter which, in our example, is the order number for the order we need to display details about.  We take this information and embed it into a URL that we assign to the src attribute of the iframe.  The document.getElementById() bit is how we get a reference to the iframe.  If you examine the URL that's being generated, you'll see that it points to the reportserver virtual directory and passes along the URL of the report that we want displayed.  That second URL has a series of parameters embedded in it.  Most of them tell SRS how to display the report: basically, render it, hide the toolbar and hide the parameters window.  The last item passed in the URL is the name of the parameter as defined in the report that we're calling, and we concatenate the value passed into the function as the value for that parameter.  In our example, our Order Detail report defines a parameter called "OrderNum".  If we didn't include this in the URL, the report would expect the user to type it in, as they do in the parent report.  But since we're hiding the parameters window, and the whole point of this exercise is to eliminate the need for manually linking the reports, we include it as part of the call to display the report.

The iframe definition is just below the script block.  Make sure that your tokens match up!  If you make a call to document.getElementById('OrderDetail') then make sure you assign "OrderDetail" as the value for the id attribute of the iframe.

At this point, we have two pieces of the puzzle assembled.  We have a report that displays the customer detail, along with a list of orders for that customer.  We also have an iframe poised to display a specific report at the whim of a JavaScript function call.  All we need to do now is connect them so the order details report displays the details of any order the viewer clicks on in the parent report.

The final step is to configure the parent report to call the JavaScript function which updates the iframe source.

In the report definition of the customer orders report (that's the one embedded into the parent report), right click on the textbox that displays the order number itself and choose Properties.  You'll get a window that looks a bit like this:

SRS Textbox Navigation Settings

As you can see, on the Navigation tab, you enter the call to the JavaScript function in the "Jump to URL:" box.  The code there inserts the value of the OrderNum field from the report's datasource.  When this is rendered in HTML as the report displays, it becomes a client side call to the JavaScript function we defined in the Content Editor Web Part in SharePoint.
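The expression entered there looks something along these lines - note this is a reconstruction, and it assumes your dataset field is named OrderNum, as in the example report:

```
="javascript:LoadOrderDetails('" & Fields!OrderNum.Value & "')"
```

SRS evaluates the VB-style concatenation as it renders each row, so every order number in the list carries its own javascript: link with its own order number baked in.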

See? ... It's a piece of cake!  Now each of the order numbers is a hyperlink that dynamically updates a sub-report displayed on the same HTML page.

So let's go over the basic steps:

  • Define a parent report in SRS that lists a collection of related values.  (Customer Detail, Listing Order Numbers)
  • Define a child report that accepts a parameter to display further detail of an item.  (Order Detail, accepts an order number)
  • Create a Web Part page to host the cooperative reports.  (I like that...  Cooperative Reports.  I have coined a new term.)
  • Configure a standard SRS Report Viewer Web Part to display the parent report.
  • Configure a Content Editor Web Part with the appropriate JavaScript and iframe definition.
  • Configure the appropriate navigation property in the parent report to call the JavaScript function.

I truly hope that folks find this useful.  I've seen quite a few inquiries on many forums about how to do something like this.  Many people are surprised (as I was) to discover that SRS does not have this capability.  But, then, it's a reporting engine, not a UI driven application.

Here are a couple of things to note about this technique:

  1. Configuring the textbox to make the JavaScript call does not mean it will appear as a hyperlink when rendered.  In the report definition, you'll have to format it in a way that catches the viewer's eye and lets them know they can click on it.  (I typically go for the standard bright blue, underlined text.)
  2. Setting this up creates a tightly bound relationship between these elements.  You couldn't use this parent report anywhere that you don't also supply the JavaScript that it's expecting to call.  (At least not without getting browser side errors.)
  3. The parent report does not have to be very complex and it does not have to have an embedded sub-report.  In one instance, I implemented a parent that does nothing but let the viewer search for users in a directory with a search string.  The resulting list of users is made up of clickable links that update a user detail panel below the "search box".
  4. You don't have to stop at one level of detail.  For our ILM portal, I use a "search box" report to get a list of metaverse identities that match a search string.  Click one to get details on the identity, including connector history and, in our case, a list of workflows that are associated with it.  Click a workflow id and you get a rendered image of the current status of the workflow in another panel.  Now, since the workflow image report is actually being called from within the iframe that the identity detail is displayed in, you have to scope the JavaScript call: ="javascript:parent.LoadOrderDetails()" (and if you want another level of detail you may find yourself calling a parent of a parent...)

Well, I certainly think that's quite enough to try and wrap your brain around for one sitting...  Please let me know if you have success (or not) with this technique.  And if you can add any more interesting angles, I'd love to see what you come up with.

Sunday, December 23, 2007

Uncovering the MIIStery of Attribute Level Deltas (In Holiday Verse)


'Twas the night before Christmas in The Keys rental house,
A blog must be written, so I warmed up my mouse.

Wrote some new code in the SQL software,
in the hopes that our deltas will be processed with flair.

No longer all changes, attributes instead
will be processed, as needed, as each row is read.

Now sync'ing the data will be quick as a snap.
Clients will smile, they might even clap.

Between data and metaverse we'll eliminate chatter.
('Though the view of the delta might get a bit fatter.)

The results, very pleasing - make it known in a flash.
Must post on my blog. On the keyboard I'll mash.

This is knowledge the field might be keen to know.
Visitor count on my site might even grow.

So what are the details? You're eager to hear?
I'll tell you right now. (First a sip of my beer.)

To process this way, had to think of a trick.
It had to be clever and it had to run quick.

Keep it simple to implement, that was the game.
So that admins could run it and not go insane.

"Now, WHERE clause! Now, LEFT JOIN! Now CASE WHEN and IF IN
On, INSERT! On, UNION!" (Some sweet code I'm mixin')

To the top of the set: ADDs, DELETEs, yes, list all.
It's the modified ones that are different, y'all.

For each change in the row we must now specify
A new row in the table.  I gave it a try.

With all of the data, list the attribute, too.
That new little column is really a clue.

This really streamlines the process, to tell you the truth.
Now ILM can focus on just what is new.

Temp tables and queries in SQL abound,
'Til each little change in the data is found.

(In the original poem, here, this line ends with "foot".
But to rhyme it in context...  Couldn't think what to put...)

On the code and technique I continued to hack
when finally all the results were on track.

So I said to myself, "Good job, there, Jerry."
Stored procedure is where you must now, this code, bury.

Input param'ters would be apropos
We must tell the script, after all, where to go.

(Now here's another hard rhyme. This one, "teeth"
I'll fake and rhyme this line with the word "beef".)

Okay, I admit it, that line was just silly.
But all of the lines in the poem I must filly.

Alright, back on track, now, I tell myself,
This quality work won't just finish itself.

The parameters, yes, this proc must be fed.
After all it can't read what's in your head.

Provide the table and view names for this thing to work.
Then the code takes your data and just goes berserk.

Nearing the end, whew, soon I can type prose.
Need to wrap this up cleanly, then, I suppose.

If you have thoughts or comments, just give me a whistle.
To the comments you leave, I'll respond like <something that rhymes with whistle>.

If this earns MVP for me, that's out of sight!
Time to end this in rhyme, so to all a good night.


Yes, yes...  'tis a bit silly, but you read it all, didn't you?  So stop your whining and let's get down to the gory details:

The stored procedure is called spCreateAttributeLevelDeltaTable.  Take that code and paste it into a new query window in SQL Server Management Studio.  (It'll look better once it's pasted there, too.)  Run it to create the stored procedure.

The proc takes four parameters:

  • KeyColumn: The column you intend to use as the key for matching rows between the current and original data sources.
  • CurrentTable: The data source for the current version of the data.  Table or view, doesn't matter.
  • OriginalTable: The data source for the original version of the data.  Table or view, doesn't matter.
  • DeltaTable: The name of the table that will be created and filled with the delta information.

The delta table that you specify will be dropped and re-created each time the procedure is run, so don't get too attached to it.  Add a call to this guy in the pre-processing stage of your delta sync script.  And make sure that you drop (or at least truncate) the delta table after a successful delta sync to eliminate any redundant change processing.  Here's what the call should look like:

Call Syntax: spCreateAttributeLevelDeltaTable
EXEC [dbo].[spCreateAttributeLevelDeltaTable]
     @KeyColumn = N'UniqueID',
     @CurrentTable = N'tDataCurrent',
     @OriginalTable = N'tDataOriginal',
     @DeltaTable = N'tDataAttrDelta'

Obviously - well if it's not obvious, then this is all above your head, anyway - you can peruse the SQL code and get a feel for the technique I employed.  You can also change the delta type keywords (ADD, DELETE, etc...) and column names to match your standard naming conventions.  Then a few changes to the MA delta configuration through the MIIS UI and you're good to go.
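If you'd rather see the gist of the technique without wading through the T-SQL, the shape of the transformation can be sketched in a few lines of JavaScript.  To be clear, this is a purely illustrative stand-in - the row layout, the column names and the MODIFY keyword are my assumptions, not the procedure's actual output:

```javascript
// Sketch of the attribute-level delta idea: ADDs and DELETEs are listed
// whole, while each modified row yields one delta row per changed
// attribute, tagged with that attribute's name.
function createAttributeLevelDelta(current, original, keyColumn) {
  const origByKey = new Map(original.map(r => [r[keyColumn], r]));
  const currKeys = new Set(current.map(r => r[keyColumn]));
  const delta = [];

  for (const row of current) {
    const orig = origByKey.get(row[keyColumn]);
    if (!orig) {
      // New row: list it whole.
      delta.push({ ...row, changeType: 'ADD', attribute: null });
      continue;
    }
    // Modified row: one delta row per attribute whose value changed.
    for (const attr of Object.keys(row)) {
      if (attr !== keyColumn && row[attr] !== orig[attr]) {
        delta.push({ ...row, changeType: 'MODIFY', attribute: attr });
      }
    }
  }
  for (const row of original) {
    if (!currKeys.has(row[keyColumn])) {
      // Removed row: list it whole.
      delta.push({ ...row, changeType: 'DELETE', attribute: null });
    }
  }
  return delta;
}
```

The stored procedure does the equivalent with joins and unions against the current and original data sources, writing the results into the delta table you name.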

This is my <Insert Holiday Name Here> gift to you.  Use it well and please let me know how it goes.  (A little feedback wouldn't kill you, you know?)

Tuesday, December 18, 2007

The Same, But Different

Okay...  A short while ago I posted a little application I wrote called the MIIS Delta View Creator.  It's neat and clean and does what it says it does.  But I also have another method up my sleeve.  (Well, it might be up my sleeve if I kept my sleeve in the hard drive bay...)  Anyway, here's another version implemented as a SQL Management Studio template.

Create the template:

  • In SSMS, enable the Template Explorer (Ctrl + Alt + T)
  • If desired (and it is desired) create a new template folder.
    • Right click on the SQL Server Templates root folder and select "New" and then, yes...  "Folder"
    • Type the new name for the folder.  I suggest either "Rosencrantz" or "MIIS Custom", but this is all you.
  • Right click on your new template folder and create a new template.
    • I suggest any name other than, "New SQL Server Template".
    • I called mine, "Create Basic MIIS Delta View Components" and I have a reference to that in the comments of the script.
  • Now, right click your new template and select "Edit".
  • Paste this code into the query editor window and save the template.

Use the template:

  • In SSMS, double click the template.
    • A copy of it will open in a new query editor window.
    • If you want to edit the template itself, right click it and select "Edit".
  • With the new query editor window as the active window, press Ctrl + Shift + M.
  • Enter the appropriate value for each parameter and click "OK".
    • You now have a script that, when run, will create the necessary delta view components.

Okay...  enough with the bullets.

  • Maybe just one more...

The resulting script will create a basic delta view setup.  All that's necessary to get started is to have one existing table/view.  This table should represent the "Current" view of the data.  The "Original" table and the delta view will be created.  If there are objects with the specified names already in the database, they will be dropped!

The script also creates a couple of additional components:  A table copy stored procedure and a post process stored procedure.  These components are useful for the way we implement a lot of the MIIS functionality.  Read through the code and all will be revealed.  If you have questions, comments or suggestions, please feel free to post them here.

All I ask is: if my readers (yes, both of you) use this code, please note where you got it from in any altered versions that you mangle (create).

Saturday, December 8, 2007

Cider, Workflows and Just Enough Knowledge...

As I recall, it was out in the country somewhere...  The kind of place you can't find without having been there before.  The sweet, musty scent of a single room log cabin that hasn't seen a carbon based life form larger than a raccoon for well over 20 years.  The only light coming from the soft blue glow of the power L.E.D from the wireless access point... Hmm...  Wait a second. I'm not remembering quite right...  Might have been a room at the DoubleTree in San Jose.

On the road and armed with Peanut M&M's, a case of Hornsby's Hard Apple Cider, and some book we'd just purchased from the local techno-geek bookshop, Mr. Turner and I decided to implement our first Windows Workflow based solution at a client.  Start simple, was our motto.  So we did.  We took one of the simplest concepts we could come up with and decided to implement it in the most complex and convoluted way possible.  All in the name of progress.  We had chosen to build a workflow based delayed events Management Agent for MIIS.

Brad's knowledge of the inner workings of MIIS is intense, and I have the ability to write code so concise that a popular compression algorithm, in a fit of jealousy and frustration, locked itself in the inner sanctum of my display driver and refuses to come out.  (The proof is in the dead pixel in the middle of the screen on my laptop.)  And while the intricacies of marrying these two skill sets have evoked solicitations from the Hollywood screenwriting elite, that's not what this blog is about.  That's just a bit of background.  The history.  A peek behind the wizard's curtain, if you will...

What I gathered you all here to talk about is this:  Never drop a goldfish into a glass of vodka.  Okay?  No, not even if it's the good vodka like the kind that comes in the fancy frosted bottle with that guy's face that you can see through the small patch of clear glass on the inside of the other side of the bottle.  (How do they get that guy in there?)

And speaking of goldfish, here's a little story on how I was blind-sided by a .Net assembly versioning conflict...

I did build that workflow engine.  'Twas my first foray into the world of Windows Workflow Foundation and I had just enough knowledge to make it all work, but not enough to understand what I was doing.  The solution consisted of four projects: The workflow engine service, the workflows assembly, a utility project and a setup project.  The workflows, themselves, were pretty simple: Delay, Notify, Delay.  Short and sweet.  The workflow engine was implemented as a service and contained the standard SQL based persistence service, the standard tracking service and an External Data Exchange service to allow the workflows to notify the host of certain events.  All of the versioning was set to auto increment the build and revision.

After creating the workflows and getting the service built and running and everything tested to a point of satisfaction, the service was rolled out and a production WorkflowMA was born.  And it was good.  But not good enough.  Had to tweak the service a bit and ended up having to deploy a couple of replacement versions.

These were long running workflows, delaying for up to 90 days at a time.  So it was quite a while before I was enlightened unto the error of my ways.  After a while we realized that things were not processing as they did in testing.  (You see, in testing, I used 90 seconds, not 90 days.  Deadlines, you know...)  So I'll skip past the details here and get to the stuff that matters...

I used strong named assemblies.  That's important.  If I hadn't, I might not have had any of these issues.  But then I couldn't have deployed signed code, either.

What was happening was that workflows were getting stuck in the persistence service.  While I didn't make changes to the workflows, I had made updates to the service.  But I always redeployed the full setup.  And when I did a full solution recompile, I inadvertently changed the version numbers of the workflows.  So when the persistence service tried to re-hydrate them, it failed as the appropriately versioned workflow classes were not available.  .Net will not automatically use a newer version of a strong named assembly.  They just sat there in the persistence store.  Orphans.

How to fix it?  Well, my first attempt at a workaround was to use a binding redirect in the app.config:

Code
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="MIISWorkflowLib" publicKeyToken="123abc456def7890" culture="neutral" />
        <bindingRedirect oldVersion="1.0.0.0-1.0.9999.0" newVersion="1.1.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

Just specify a version range big enough to cover all the previous versions and redirect them to the one new and latest version.

Awesome!  The old workflows rehydrated using the new assembly.  But then the tracking service complained:

Value cannot be null. Parameter name: profile

There was a mismatch in the profile information in the serialized workflow and the tracking database records.  Another exception.  It was like the parents had come back to claim their orphaned little workflow, but they didn't have proper ID.  Couldn't prove they were the rightful owners, so the WWF authorities intercepted the happy reunion, again leaving the workflow cold and naked in the persistence store.  (Why naked, you ask?  More dramatic.)

How to fix it?  Well, you can look at the tracking database (examine the WorkflowDefinition column of the Workflow table) and see the version numbers of the workflows that it's cataloged.  (If the WorkflowTypeID doesn't match any records in the WorkflowInstance table, you can probably skip that version.  No workflows were actually created from that assembly.)  Recompile a workflow assembly for each version, updating the AssemblyVersion before compile and copying the compiled assembly to a subfolder structure under the host's startup folder.  Then use codebase hints in the app.config file to tell the host where each version of the assembly lives.  (For some reason, I didn't use the GAC.  If you do, you can just dump each version there and be done with it.  But my solution required more typing, so it must be better.)

Folder Hierarchy (matching the codeBase entries in the app.config below):

    <host startup folder>\
        Workflows\
            1_0_0_0\
                MIISWorkflowLib.dll
            1_0_2701_23729\
                MIISWorkflowLib.dll

app.config:

Code
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="MIISWorkflowLib" publicKeyToken="123abc456def7890" culture="neutral" />
        <codeBase version="1.0.2701.23729" href="Workflows\1_0_2701_23729\MIISWorkflowLib.dll" />
        <codeBase version="1.0.0.0" href="Workflows\1_0_0_0\MIISWorkflowLib.dll" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

Awesomer!  The workflows can re-hydrate and that finicky tracking service is no longer complaining.  But then one more little demon reared its ugly head.  The ExternalDataExchange service.  See, I was using that to allow the workflows to chat with the host application.  When I put this project together I only had one workflow assembly.  I added the required interface definition to that project and just referenced it from the workflow host application.  In Visual Studio, I set a reference to the workflow project in the host application project.  This allowed me access to that interface.  But post-fix there were multiple versions of the workflow assembly.  You can't easily reference more than one assembly with the same name.  (With Reflection, all things are possible...  Well, many things.)  And I needed a reference to an interface my class would implement, not a class already defined in the other assembly.

How to fix it?  Well, this took me a while to figure out.  Not because it's an especially difficult problem, really, but because I expected it to be.  And if you're not looking for a simple solution, I can promise that you won't find one.  All I had to do here was separate the interface definition from the workflow definition.  I couldn't add the interface to the host app because the host app already had a reference to the workflow app.  If I put the interface there, the workflow app would need a reference to the host app.  That's what they call a circular dependency and, in some states, that's a felony.  (Well, Visual Studio won't let you do it, anyway.)  So enter project number five, consisting of only the interface definition.  Reference the new project from both the host application and the workflow assembly and... Voilà! A complete and working solution.  (Awesomest!)

Now...  The real lesson here is not how to fix this situation - it's that you should avoid it all to begin with.  I was so focused on messing with the new workflow gizmos that I just didn't think through the peripheral .Net stuff.  Lesson learned, and in the solution now, the workflow assembly project no longer auto-increments its version.  Yeah...  It would have been that simple.

Tuesday, December 4, 2007

Delta Dawn...

Hey, there...  How'ya doin'?  Good...  Good...  Been a while, I know, but I finally have something new to cast off into the blogosphere.  Here's my first attempt at shining some light on creating MIIS delta views.  Well, perhaps "shining some light on" is not the proper metaphor, but it does tie in nicely with the title and, yes... the theme music.  (That's good old fashioned Barbershop harmony, that is.  If you're into that sort of thing, there's a lot more of it here.  Don't be shy, gentlemen, seek out your local chapter of the Barbershop Harmony Society today!)

Introducing the MIIS Delta View Creation Wizard (version 0.9 - beta-type software).  A handy little utility to automate the creation of standard delta views.  The interface is pretty straightforward:

MIIS Delta View Creation Wizard

Download it.  Try it.  Give it as a unique holiday gift.  And by all means, please post your comments and suggestions.

"When there's darkness on the delta..." use the MDVC and brighten up your day!  (If you don't get that, you didn't listen to the theme music.)

Friday, August 10, 2007

I Need More Minutes

I lied...  I had intended to write my next entry on some user interface details with the .Net 2.0 GridView, but I haven't gotten around to it yet.  I need more minutes.  So many topics...  So little time.  I wish life was more like a mobile phone plan.  Well, in one particular way, at least.

Let's face it, for the most part, the way mobile phone providers and hardware vendors are tied together in this country - that's the USA for anyone reading this from abroad...  Well, abroad from my perspective... - makes for annoying service contracts on the service provider side and a stifling of creativity on the hardware vendor side.  But none of that is my point. 

The one thing I wish I could get in life is more minutes.  For a fee I can have minutes added to my cellular phone contract.  I can use 'em, for the most part, whenever I want.  Or not use them at all.  It's my option.  But I can request minutes.

I end up with so many ideas and interests that I want to pursue, there's so much time that needs to be dedicated to implementing a client solution properly and let's not forget about the wife and child, they deserve their time, too.  (Although Mufasa, the aforementioned child - an overly affectionate, tiny version of a lion - has no issue strolling up and sprawling himself across my keyboard or perching on my shoulder when he feels neglected...)  The only way to accomplish it all would be with more minutes.  I'd be willing to pay, no doubt.  More minutes.  I love the sound of that.

"Honey...  I know you wanted to go camping this weekend and we still need to bathe the office and paint the cat, but I have to finish this project before Monday and I have to do it from the client's site."  You know that's not going to win you any points on the home front.  But if you could just add more minutes...

My current life service provider offers 60 minutes each hour, and I use every one of 'em.  I want to upgrade my plan so I can go camping.

Imagine...  Add an additional 30 minutes per hour during peak periods and get a bonus of 500 extra night and weekend minutes.  I could finish my project and enjoy an extended camping trip with my family.

And consider this:  if I was Hindu, perhaps I could get that rollover plan and enjoy some of my unused minutes next time around...

Yeah...  I definitely need more minutes.