Sunday, December 23, 2007

Uncovering the MIIStery of Attribute Level Deltas (In Holiday Verse)


'Twas the night before Christmas in The Keys rental house,
A blog must be written, so I warmed up my mouse.

Wrote some new code in the SQL software,
in the hopes that our deltas will be processed with flair.

No longer all changes, attributes instead
will be processed, as needed, as each row is read.

Now sync'ing the data will be quick as a snap.
Clients will smile, they might even clap.

Between data and metaverse we'll eliminate chatter.
('Though the view of the delta might get a bit fatter.)

The results, very pleasing - make it known in a flash.
Must post on my blog. On the keyboard I'll mash.

This is knowledge the field might be keen to know.
Visitor count on my site might even grow.

So what are the details? You're eager to hear?
I'll tell you right now. (First a sip of my beer.)

To process this way, had to think of a trick.
It had to be clever and it had to run quick.

Keep it simple to implement, that was the game.
So that admins could run it and not go insane.

"Now, WHERE clause! Now, LEFT JOIN! Now CASE WHEN and IF IN
On, INSERT! On, UNION!" (Some sweet code I'm mixin')

To the top of the set: ADDs, DELETEs, yes, list all.
It's the modified ones that are different, y'all.

For each change in the row we must now specify
A new row in the table.  I gave it a try.

With all of the data, list the attribute, too.
That new little column is really a clue.

This really streamlines the process, to tell you the truth.
Now ILM can focus on just what is new.

Temp tables and queries in SQL abound,
'Til each little change in the data is found.

(In the original poem, here, this line ends with "foot".
But to rhyme it in context...  Couldn't think what to put...)

On the code and technique I continued to hack
when finally all the results were on track.

So I said to myself, "Good job, there, Jerry."
Stored procedure is where you must now, this code, bury.

Input param'ters would be apropos
We must tell the script, after all, where to go.

(Now here's another hard rhyme. This one, "teeth"
I'll fake and rhyme this line with the word "beef".)

Okay, I admit it, that line was just silly.
But all of the lines in the poem I must filly.

Alright, back on track, now, I tell myself,
This quality work won't just finish itself.

The parameters, yes, this proc must be fed.
After all it can't read what's in your head.

Provide the table and view names for this thing to work.
Then the code takes your data and just goes berserk.

Nearing the end, whew, soon I can type prose.
Need to wrap this up cleanly, then, I suppose.

If you have thoughts or comments, just give me a whistle.
To the comments you leave, I'll respond like <something that rhymes with whistle>.

If this earns MVP for me, that's out of sight!
Time to end this in rhyme, so to all a good night.


Yes, yes...  'tis a bit silly, but you read it all, didn't you?  So stop your whining and let's get down to the gory details:

The stored procedure is called spCreateAttributeLevelDeltaTable.  Take that code and paste it into a new query window in SQL Server Management Studio.  (It'll look better once it's pasted there, too.)  Run it to create the stored procedure.

The proc takes four parameters:

  • KeyColumn: The column you intend to use as the key for matching rows between the current and original data sources.
  • CurrentTable: The data source for the current version of the data.  Table or view, doesn't matter.
  • OriginalTable: The data source for the original version of the data.  Table or view, doesn't matter.
  • DeltaTable: The name of the table that will be created and filled with the delta information.

The delta table that you specify will be dropped and re-created each time the procedure is run, so don't get too attached to it.  Add a call to this guy in the pre-processing stage of your delta sync script.  And make sure that you drop (or at least truncate) the delta table after a successful delta sync to eliminate any redundant change processing.  Here's what the call should look like:

Call Syntax: spCreateAttributeLevelDeltaTable
EXEC [dbo].[spCreateAttributeLevelDeltaTable]
     @KeyColumn = N'UniqueID',
     @CurrentTable = N'tDataCurrent',
     @OriginalTable = N'tDataOriginal',
     @DeltaTable = N'tDataAttrDelta'
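
As for the post-processing step mentioned above, something as simple as this will do (the table name comes from the example call - substitute your own):

TRUNCATE TABLE [dbo].[tDataAttrDelta]
-- or, if you'd rather let the proc build it fresh next time:
-- DROP TABLE [dbo].[tDataAttrDelta]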

Obviously - well if it's not obvious, then this is all above your head, anyway - you can peruse the SQL code and get a feel for the technique I employed.  You can also change the delta type keywords (ADD, DELETE, etc...) and column names to match your standard naming conventions.  Then a few changes to the MA delta configuration through the MIIS UI and you're good to go.
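
If you'd rather not wade through the dynamic SQL to get the gist, here's a rough, static sketch of the technique.  The table names and key column come from the example call above; the attribute columns (FirstName, LastName), the MODIFY keyword and the ChangedAttribute column are just stand-ins for whatever the proc generates from your actual data:

-- ADDs and DELETEs are listed whole, just like a regular delta view.
SELECT c.UniqueID, c.FirstName, c.LastName,
       'ADD' AS ChangeType, NULL AS ChangedAttribute
FROM tDataCurrent c
LEFT JOIN tDataOriginal o ON o.UniqueID = c.UniqueID
WHERE o.UniqueID IS NULL

UNION ALL

SELECT o.UniqueID, o.FirstName, o.LastName,
       'DELETE', NULL
FROM tDataOriginal o
LEFT JOIN tDataCurrent c ON c.UniqueID = o.UniqueID
WHERE c.UniqueID IS NULL

UNION ALL

-- Modified rows get one delta row per attribute that actually changed,
-- with that new little column naming the attribute.
SELECT c.UniqueID, c.FirstName, c.LastName,
       'MODIFY', 'FirstName'
FROM tDataCurrent c
INNER JOIN tDataOriginal o ON o.UniqueID = c.UniqueID
WHERE ISNULL(c.FirstName, '') <> ISNULL(o.FirstName, '')

UNION ALL

SELECT c.UniqueID, c.FirstName, c.LastName,
       'MODIFY', 'LastName'
FROM tDataCurrent c
INNER JOIN tDataOriginal o ON o.UniqueID = c.UniqueID
WHERE ISNULL(c.LastName, '') <> ISNULL(o.LastName, '')

The proc does essentially the same thing, only dynamically, with the results landing in the delta table so ILM only sees the attributes that actually changed.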

This is my <Insert Holiday Name Here> gift to you.  Use it well and please let me know how it goes.  (A little feedback wouldn't kill you, you know?)

Tuesday, December 18, 2007

The Same, But Different

Okay...  A short while ago I posted a little application I wrote called the MIIS Delta View Creator.  It's neat and clean and does what it says it does.  But I also have another method up my sleeve.  (Well, it might be up my sleeve if I kept my sleeve in the hard drive bay...)  Anyway, here's another version implemented as a SQL Management Studio template.

Create the template:

  • In SSMS, enable the Template Explorer (Ctrl + Alt + T)
  • If desired (and it is desired) create a new template folder.
    • Right click on the SQL Server Templates root folder and select "New" and then, yes...  "Folder"
    • Type the new name for the folder.  I suggest either "Rosencrantz" or "MIIS Custom", but this is all you.
  • Right click on your new template folder and create a new template.
    • I suggest any name other than, "New SQL Server Template".
    • I called mine, "Create Basic MIIS Delta View Components" and I have a reference to that in the comments of the script.
  • Now, right click your new template and select "Edit".
  • Paste this code into the query editor window and save the template.

Use the template:

  • In SSMS, double click the template.
    • A copy of it will open in a new query editor window.
    • If you want to edit the template itself, right click it and select "Edit".
  • With the new query editor window as the active window, press Ctrl + Shift + M.
  • Enter the appropriate value for each parameter and click "OK".
    • You now have a script that, when run, will create the necessary delta view components.

Okay...  enough with the bullets.

  • Maybe just one more...

The resulting script will create a basic delta view setup.  All that's necessary to get started is to have one existing table/view.  This table should represent the "Current" view of the data.  The "Original" table and the delta view will be created.  If there are objects with the specified names already in the database, they will be dropped!
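
If you've never poked around in SSMS templates before: the pasted code is ordinary T-SQL sprinkled with <name, type, default> placeholders, and Ctrl + Shift + M swaps them for the values you enter.  Just to give you the flavor (this is a trimmed-down sketch, not the actual template, and the default object names are made up), the generated script is in the spirit of:

-- Create an empty "Original" copy of the current table.
SELECT *
INTO <OriginalTable, sysname, tDataOriginal>
FROM <CurrentTable, sysname, tDataCurrent>
WHERE 1 = 0
GO

-- A basic delta view comparing the two.  (Only the ADD branch is shown here;
-- the DELETE and modified-row branches follow the same pattern.)
CREATE VIEW <DeltaView, sysname, vDataDelta>
AS
SELECT c.*, 'ADD' AS ChangeType
FROM <CurrentTable, sysname, tDataCurrent> c
LEFT JOIN <OriginalTable, sysname, tDataOriginal> o
    ON o.<KeyColumn, sysname, UniqueID> = c.<KeyColumn, sysname, UniqueID>
WHERE o.<KeyColumn, sysname, UniqueID> IS NULL
GO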

The script also creates a couple of additional components:  A table copy stored procedure and a post process stored procedure.  These components are useful for the way we implement a lot of the MIIS functionality.  Read through the code and all will be revealed.  If you have questions, comments or suggestions, please feel free to post them here.
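
For a rough idea of why a table copy procedure is handy in this kind of setup (and I'm only sketching here - the real one is in the template and does a bit more than this): after a successful full sync you snapshot Current into Original, so the delta view starts the next cycle empty.  Table names below are just placeholders:

TRUNCATE TABLE dbo.tDataOriginal
GO

INSERT INTO dbo.tDataOriginal
SELECT * FROM dbo.tDataCurrent
GO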

All I ask is: if my readers (yes, both of you) use this code, please note where you got it from in any altered versions that you mangle (create).

Saturday, December 8, 2007

Cider, Workflows and Just Enough Knowledge...

As I recall, it was out in the country somewhere...  The kind of place you can't find without having been there before.  The sweet, musty scent of a single room log cabin that hasn't seen a carbon based life form larger than a raccoon for well over 20 years.  The only light coming from the soft blue glow of the power L.E.D from the wireless access point... Hmm...  Wait a second. I'm not remembering quite right...  Might have been a room at the DoubleTree in San Jose.

On the road and armed with Peanut M&M's, a case of Hornsby's Hard Apple Cider, and some book we'd just purchased from the local techno-geek bookshop, Mr. Turner and I decided to implement our first Windows Workflow based solution at a client.  Start simple, was our motto.  So we did.  We took one of the simplest concepts we could come up with and decided to implement it in the most complex and convoluted way possible.  All in the name of progress.  We had chosen to build a workflow based delayed events Management Agent for MIIS.

Brad's knowledge of the inner workings of MIIS is intense, and I have the ability to write code so concise that a popular compression algorithm, in a fit of jealousy and frustration, locked itself in the inner sanctum of my display driver and refuses to come out.  (The proof is in the dead pixel in the middle of the screen on my laptop.)  And while the intricacies of marrying these two skill sets have evoked solicitations from the Hollywood screenwriting elite, that's not what this blog is about.  That's just a bit of background.  The history.  A peek behind the wizard's curtain, if you will...

What I gathered you all here to talk about is this:  Never drop a goldfish into a glass of vodka.  Okay?  No, not even if it's the good vodka like the kind that comes in the fancy frosted bottle with that guy's face that you can see through the small patch of clear glass on the inside of the other side of the bottle.  (How do they get that guy in there?)

And speaking of goldfish, here's a little story on how I was blind-sided by a .Net assembly versioning conflict...

I did build that workflow engine.  'Twas my first foray into the world of Windows Workflow Foundation and I had just enough knowledge to make it all work, but not enough to understand what I was doing.  The solution consisted of four projects: the workflow engine service, the workflows assembly, a utility project and a setup project.  The workflows themselves were pretty simple: Delay, Notify, Delay.  Short and sweet.  The workflow engine was implemented as a service and contained the standard SQL-based persistence service, the standard tracking service and an ExternalDataExchange service to allow the workflows to notify the host of certain events.  All of the versioning was set to auto-increment the build and revision.

After creating the workflows and getting the service built and running and everything tested to a point of satisfaction, the service was rolled out and a production WorkflowMA was born.  And it was good.  But not good enough.  Had to tweak the service a bit and ended up having to deploy a couple of replacement versions.

These were long running workflows, delaying for up to 90 days at a time.  So it was quite a while before I was enlightened unto the error of my ways.  After a while we realized that things were not processing as they did in testing.  (You see, in testing, I used 90 seconds, not 90 days.  Deadlines, you know...)  So I'll skip past the details here and get to the stuff that matters...

I used strong named assemblies.  That's important.  If I hadn't, I might not have had any of these issues.  But then I couldn't have deployed signed code, either.

What was happening was that workflows were getting stuck in the persistence service.  While I didn't make changes to the workflows, I had made updates to the service.  But I always redeployed the full setup.  And when I did a full solution recompile, I inadvertently changed the version numbers of the workflows.  So when the persistence service tried to re-hydrate them, it failed as the appropriately versioned workflow classes were not available.  .Net will not automatically use a newer version of a strong named assembly.  They just sat there in the persistence store.  Orphans.

How to fix it?  Well, my first attempt at a workaround was to use a binding redirect in the app.config:

Code
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="MIISWorkflowLib" publicKeyToken="123abc456def7890" culture="neutral" />
        <bindingRedirect oldVersion="1.0.0.0-1.0.9999.0" newVersion="1.1.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

Just specify a version range big enough to cover all the previous versions and redirect them to the one new and latest version.

Awesome!  The old workflows rehydrated using the new assembly.  But then the tracking service complained:

Value cannot be null. Parameter name: profile

There was a mismatch between the profile information in the serialized workflow and the tracking database records.  Another exception.  It was like the parents had come back to claim their orphaned little workflow, but they didn't have proper ID.  Couldn't prove they were the rightful owners, so the WWF authorities intercepted the happy reunion, again leaving the workflow cold and naked in the persistence store.  (Why naked, you ask?  More dramatic.)

How to fix it?  Well, you can look at the tracking database (examine the WorkflowDefinition column of the Workflow table) and see the version numbers of the workflows that it's cataloged.  (If the WorkflowTypeID doesn't match any records in the WorkflowInstance table, you can probably skip that version.  No workflows were actually created from that assembly.)  Recompile a workflow assembly for each version, updating the AssemblyVersion before compile and copying the compiled assembly to a subfolder structure under the host's startup folder.  Then use codebase hints in the app.config file to tell the host where each version of the assembly lives.  (For some reason, I didn't use the GAC.  If you do, you can just dump each version there and be done with it.  But my solution required more typing, so it must be better.)
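
If you want to see what's actually cataloged in there, a quick query against the tracking database does the trick.  (The table and column names are the ones described above; tweak as needed if your tracking schema differs.)

-- Workflow versions that actually have instances - these are the assembly
-- versions you'll need to rebuild and point codebase hints at.
SELECT w.WorkflowTypeID,
       w.WorkflowDefinition   -- the serialized definition; the version number lives in here
FROM dbo.Workflow w
WHERE EXISTS (SELECT 1
              FROM dbo.WorkflowInstance wi
              WHERE wi.WorkflowTypeID = w.WorkflowTypeID)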

Folder Hierarchy:

(Host service startup folder)
    Workflows\1_0_0_0\MIISWorkflowLib.dll
    Workflows\1_0_2701_23729\MIISWorkflowLib.dll

app.config:

Code
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
        <dependentAssembly>
            <assemblyIdentity name="MIISWorkflowLib" publicKeyToken="123abc456def7890" culture="neutral" version="1.0.2701.23729" />
            <codeBase version="1.0.2701.23729" href="Workflows\1_0_2701_23729\MIISWorkflowLib.dll" />
        </dependentAssembly>
        <dependentAssembly>
            <assemblyIdentity name="MIISWorkflowLib" publicKeyToken="123abc456def7890" culture="neutral" version="1.0.0.0" />
            <codeBase version="1.0.0.0" href="Workflows\1_0_0_0\MIISWorkflowLib.dll" />
        </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>

Awesomer!  The workflows can re-hydrate and that finicky tracking service is no longer complaining.  But then one more little demon reared its ugly head: the ExternalDataExchange service.  See, I was using that to allow the workflows to chat with the host application.  When I put this project together I only had one workflow assembly.  I added the required interface definition to that project and just referenced it from the workflow host application.  In Visual Studio, I set a reference to the workflow project in the host application project.  This allowed me access to that interface.  But post-fix there were multiple versions of the workflow assembly.  You can't easily reference more than one assembly with the same name.  (With Reflection, all things are possible...  Well, many things.)  And I needed a reference to an interface my class would implement, not a class already defined in the other assembly.

How to fix it?  Well, this took me a while to figure out.  Not because it's an especially difficult problem, really, but because I expected it to be.  And if you're not looking for a simple solution, I can promise that you won't find one.  All I had to do here was separate the interface definition from the workflow definition.  I couldn't add the interface to the host app because the host app already had a reference to the workflow app.  If I put the interface there, the workflow app would need a reference to the host app.  That's what they call a circular dependency and, in some states, that's a felony.  (Well, Visual Studio won't let you do it, anyway.)  So enter project number five, consisting of only the interface definition.  Reference the new project from both the host application and the workflow assembly and... Voila!  A complete and working solution.  (Awesomest!)

Now...  The real lesson here is not how to fix this situation - it's that you should avoid it altogether to begin with.  I was so focused on messing with the new workflow gizmos that I just didn't think through the peripheral .Net stuff.  Lesson learned, and in the solution now the workflow assembly project no longer auto-increments its version.  Yeah...  It would have been that simple.

Tuesday, December 4, 2007

Delta Dawn...

Hey, there...  How'ya doin'?  Good...  Good...  Been a while, I know, but I finally have something new to cast off into the blogosphere.  Here's my first attempt at shining some light on creating MIIS delta views.  Well, perhaps "shining some light on" is not the proper metaphor, but it does tie in nicely with the title and, yes... the theme music.  (That's good old-fashioned Barbershop harmony, that is.  If you're into that sort of thing, there's a lot more of it here.  Don't be shy, gentlemen, seek out your local chapter of the Barbershop Harmony Society today!)

Introducing the MIIS Delta View Creation Wizard (version 0.9 - beta-type software).  A handy little utility to automate the creation of standard delta views.  The interface is pretty straightforward:

[Screenshot: the MIIS Delta View Creation Wizard interface]

Download it.  Try it.  Give it as a unique holiday gift.  And by all means, please post your comments and suggestions.

"When there's darkness on the delta..." use the MDVC and brighten up your day!  (If you don't get that, you didn't listen to the theme music.)

Friday, August 10, 2007

I Need More Minutes

I lied...  I had intended to write my next entry on some user interface details with the .Net 2.0 GridView, but I haven't gotten around to it yet.  I need more minutes.  So many topics...  So little time.  I wish life was more like a mobile phone plan.  Well, in one particular way, at least.

Let's face it, for the most part, the way mobile phone providers and hardware vendors are tied together in this country - that's the USA for anyone reading this from abroad...  Well, abroad from my perspective... - makes for annoying service contracts on the service provider side and a stifling of creativity on the hardware vendor side.  But none of that is my point. 

The one thing I wish I could get in life is more minutes.  For a fee I can have minutes added to my cellular phone contract.  I can use 'em, for the most part, whenever I want.  Or not use them at all.  It's my option.  But I can request minutes.

I end up with so many ideas and interests that I want to pursue, there's so much time that needs to be dedicated to implementing a client solution properly and let's not forget about the wife and child, they deserve their time, too.  (Although Mufasa, the aforementioned child - an overly affectionate, tiny version of a lion - has no issue strolling up and sprawling himself across my keyboard or perching on my shoulder when he feels neglected...)  The only way to accomplish it all would be with more minutes.  I'd be willing to pay, no doubt.  More minutes.  I love the sound of that.

"Honey...  I know you wanted to go camping this weekend and we still need to bathe the office and paint the cat, but I have to finish this project before Monday and I have to do it from the client's site."  You know that's not going to win you any points on the home front.  But if you could just add more minutes...

My current life service provider offers 60 minutes each hour, and I use every one of 'em.  I want to upgrade my plan so I can go camping.

Imagine...  Add an additional 30 minutes per hour during peak periods and get a bonus of 500 extra night and weekend minutes.  I could finish my project and enjoy an extended camping trip with my family.

And consider this:  if I was Hindu, perhaps I could get that rollover plan and enjoy some of my unused minutes next time around...

Yeah...  I definitely need more minutes.

Saturday, July 28, 2007

Filtering data with the ASP.NET 2.0 GridView

Finally, something else to blog about...  Are you happy, Brad?

In implementing a full Identity Management solution, there are times when you need to supply the customer with some external utilities.  In this case, we needed a simple interface to allow the customer to manage a list of location codes, mapping them to Active Directory OUs and login script paths.  We decided that, in this case, the best solution was a simple database table and a web interface to update the information.

ASP.NET, especially with its more recent incarnations, provides some mechanisms for building simple sites with little more than drag and drop in Visual Studio.  But as you try to build in a bit more functionality, you find that you sometimes have to tweak the built-in stuff.  (Our tweaking, here, is very simple - it's just a matter of adding the proper code to the proper events.)

The LocationCodes mini-project found me banging my head against the wall (drywall repairs: $73.26) trying to implement something as simple as a filter for narrowing down the list of items on the page.  Here's what I discovered through a combination of posting, searching and just plain messing around with it.  It's actually pretty simple...

First thing to note is that since a GridView merely displays the record set that it's given, filtering actually takes place in the DataSource control it's bound to.  Cool - the DataSource control has a "Filtering" event!  How much simpler could it be...?  Well, just to mess with you, the "Filtering" event of the DataSource is exactly where you don't place the code.  The Filtering event only triggers if the DataSource's filter property already has a value.

Here's our scenario: We have a web page with a DataSource, a GridView and a couple of Buttons and a TextBox for dealing with the filter.

Here's the event firing order for the first time the page loads (We have no filter set yet.):

Object         Event
Page           PreInit
DataSource     Init
GridView       Init
Page           Init
Page           InitComplete
Page           PreLoad
Page           Load
DataSource     Load
GridView       Load
Page           LoadComplete
Page           PreRender
GridView       DataBinding
GridView       RowDataBound  (once for each row)
GridView       DataBound
GridView       PreRender
Page           PreRenderComplete
DataSource     Unload
GridView       Unload
Page           Unload

When a filter is set, or cleared, or when an item is edited, the order of events is slightly different because we have to react to the user's intent, meaning that we'll trigger some events programmatically.

Eliminating most of the events that don't really affect us, here are the events that occur when a user interacts with the page:

When a filter is entered in the TextBox and the user clicks the Filter button:

Object         Event
Page           Load
DataSource     Load
GridView       Load
Filter Button  Click  (set filter, call DataSource.DataBind)
DataSource     DataBinding
Page           LoadComplete
DataSource     Filtering
GridView       DataBinding
GridView       RowDataBound  (once for each row)
GridView       DataBound
DataSource     Unload
GridView       Unload
Page           Unload

When the button is clicked, we set a Session variable to hold the filter text and force the DataSource to bind.  (For readability, the code snips presented here don't include the full method signatures.  But Visual Studio creates those for you anyway...)

Protected Sub btnFilter_Click(...)
    ' Remember the filter text for later postbacks, then force the DataSource to bind.
    Session("FilterExp") = txtFilter.Text
    LocationCodesDS.DataBind()
End Sub


In the DataBinding event, we check to see if there is any filter text.  If so, then set the FilterExpression property of the DataSource.  If there is no filter (the Session variable is not set) then clear the FilterExpression property:



Protected Sub LocationCodesDS_DataBinding(...)
    ' Apply the stored filter (if any) just before the DataSource binds.
    Dim f As String = Session("FilterExp")

    If (f Is Nothing) Then
        LocationCodesDS.FilterExpression = ""
    Else
        LocationCodesDS.FilterExpression = _
            "LocationCode Like '" & f & "%'"
    End If
End Sub


When the FilterExpression property of the DataSource is set, the text in the FilterExpression is basically added to the DataSource's SQL statement as a WHERE clause.  If the FilterExpression of the DataSource is set, then the Filtering event of the DataSource will be triggered.  This is where you can validate the filter expression and intercept the filtering process (e.Cancel = True).  The way I implemented this, I would check the filter expression in the Filter Button's click event.  But if you bound the DataSource's FilterExpression property to the TextBox directly, you'd need a way to stop the filtering process if, by chance, a user entered something that might be considered invalid or even malicious when attached to the underlying SQL.



Since the click event of the Filter button doesn't run every time the page loads (there are other ways to interact with the page besides that one button) we need to tell the DataSource about any potential filters every time we have to rebuild the page.  The DataSource's Init event is a fine place to do this.  Without this, if a user, say, clicks to edit a particular row, the DataSource will load the full record set on the postback and the GridView's row pointer will very likely end up pointing to a different row, causing the user to edit the wrong data.  The code is simple:



Protected Sub LocationCodesDS_Init(...)
    ' Re-apply any stored filter on every page load, not just when the Filter button fires.
    If Session("FilterExp") <> String.Empty Then
        LocationCodesDS.DataBind()
    End If
End Sub


VB.NET is kind enough to allow String.Empty to be compared to an actual empty string or to VB's null equivalent of 'Nothing', which lets this comparison work even if the Session variable doesn't exist.  Conversion of this bit of code to C# needs to take this into account.



The last bit of functionality required for our basic filter implementation is the ability to clear the filter.  We do this in the Click event of the ClearFilter Button:



Protected Sub btnClearFilter_Click(...)
    ' Drop the stored filter ("FilterExp" is the same key set in btnFilter_Click),
    ' rebind to show the full record set and clear the filter TextBox.
    Session.Remove("FilterExp")
    LocationCodesDS.DataBind()
    txtFilter.Text = ""
End Sub


That's it for basic filtering.  Next time, we'll talk a bit more about the GridView and how to work with it in regards to maintaining a proper user interface...

Wednesday, May 23, 2007

After Duress

Okay...  So I'm new to this blogging thing and a couple of days after my first post I realized that I forgot to mention something that is directly related.

But what's the protocol for blog updates?  Technically, this isn't a new subject.  How sacred are the posted blogs?  Can one alter a blog once posted without disturbing the natural order of things?  And since this is all digital, is the natural order simply 0 then 1?  I mean there's not much room for disturbing that particular natural order short of completely reversing it.  And certainly I don't want to be responsible for an alteration of that magnitude.

But I digress...  Or not, actually, since I haven't even started discussing the thought I was thinking before I was confronted with the whole blog alteration conundrum.  So now I <insert opposite of digress, here>...

In discussing the issue of certificate revocation and the delays caused by automatic verification of certain signed .Net assemblies, I forgot to mention that it seems the verification process is intermittent.  When I'm on an airplane (as I am now, not that it matters) and I start up SQL Server Management Studio, I don't get that long delay we experienced with the disconnected MIIS server.

So it seems that either the certificate is being verified against a published CRL, or the system is updating its local copy of the CRL or something along those lines, and once verified, or updated, it's good for some amount of time.  So it's possible that the issue can be solved by simply opening ports through the firewall to allow the system a quick peek at the published CRL to satisfy its curiosity for a while.  Once sated, the firewall can be locked down again until the system loses confidence and requires another gander at the certificate black list.

This is all speculation on my part, however.  I haven't verified this other than watching the behavior of my own system.  And digging deeper into this isn't too high on my priority list.  But if I do stumble across a definitive answer to this, I'll be sure to post it.

Or do I just update this post...?

Tuesday, May 22, 2007

Under Duress

At the last minute and at great expense, the Frobozz Magic Blog company (How many people will get that reference...?) is proud to present... My blog.

For quite some time I've been intending to start a blog, if for no other reason than to have an easily accessible repository of useful bits and pieces that I can get to from pretty much anywhere. Call it an information wallet, a portable brain or perhaps auxiliary memory, basically a place to keep all that information I can't remember just at the time I need to remember it. I'd get to it one day - if, in a moment of spare time, I could just remember to do it.

But my good friend and colleague, Brad Turner - with an evil grin on his face (I know that for a fact, as I was sitting next to him when he did this...) - has forced my hand, well both hands, actually, since I use both when typing... His recent post regarding an issue we were troubleshooting has an embedded link to what was my non-existent blog. So out of fear of Internet wide humiliation, I was forced to blog my first blog in the blogosphere.

So here's the actual blog part of this blog post...

Since most of you probably got here from the link on Brad's post, I'll not rehash the entire situation. But if, by chance, you happened upon this post via some other circuitous route, here's the basic summary:

While building a decidedly kick-ass identity management solution for a client, we ran across a situation in which one of the servers appeared a bit unresponsive at times and, in general, just didn't behave in a fashion similar to its fail-over brother in a faraway data center, despite being built as a virtual twin. The main symptoms were lethargic application startups and curious memory errors delivered by MIIS. After numerous troubleshooting attempts, staring stupidly at the monitor and asking questions along the lines of, "What the...?" we did finally figure out the problem.

In a nutshell, the problem was that the system was trying to verify certificates associated with certain applications, but that particular server did not have access to the Internet. So applications, in this case SQL Server Management Studio, take some time to start up because they're waiting for the timeout while trying to access the Microsoft Certificate Revocation List at http://crl.microsoft.com.

In the case of MIIS, the delay caused MA extension timeouts and seemingly unassociated out of memory errors.

There are a few ways to mitigate this issue. We chose, at least for now, to disable certificate verification through the advanced properties in Internet Explorer.

All of the details, including Event Log entries, etc. are in Brad's post, so if you haven't been there yet, go now.

Go on...

That's all I have to say for now...

No reason to hang out here anymore...