IdM Business Case Financial Forecasting

I think business cases are important, but it is not necessary, or even beneficial, to be overly precise and granular in the quantitative sections except as a form of organizational signaling ("look how thorough I am").  If you compared the pro formas I did as a 28-year-old with the ones I do today, you would see a lot more rigor back then.  That rigor was a complete waste of time.

Last year I did a strategic engagement for a large multinational.  The whole experience was, shall we say, less than rewarding.  When things go wrong you must ultimately blame yourself, for one simple reason: if you are responsible for all the good that happens to you, then you are responsible for all the bad.

Part of this engagement was to develop a business case, which I never quite got to, thanks to all the time wasted hand-wringing over PowerPoint slides (which seemed to be the required mode of management communication within the company) and waiting for feedback.  Regardless, I was handed a template for a business case (which looked like it had been developed for manufacturing, not IT), and it read as if it had been written by a finance nerd in an MBA program.  I did find out who wrote it for them: a large multinational consulting firm that was apparently being paid by the bit.  It had an extraordinary amount of granularity, and completing it would have required a team just to collect all the inputs.  In short, it was a complete waste of time.

Why would I say that?  Because bottom-up detail does not equal forecast accuracy.  When you develop a financial model based solely on internal estimates, it is going to be way off.  The only way to get any degree of accuracy is to have your CIO call three other CIOs at companies whose size and complexity are a rough match and ask them how long their deployment took and how much it cost.  If three comparable firms each took two years, plan on two years, whatever your internal estimate says.  That beats the vast majority of internal estimates.  The hardest hurdle to get over is IT's confidence in its own ability to do a better forecast.  Will they?  Maybe.  The statistics indicate they will not.

If the outside option is not available, then use your company's own experience with IT projects of equal complexity.  You will save a lot of time and effort, and perhaps improve your accuracy.

The other side of the article

It’s seldom that I publish more than one blog post on a single piece, but Mark Diodati’s article “Changing times for identity management” (login required) touched on two main themes that I felt needed to be discussed.  In an article on IdM Thoughtplace, I looked into some of the issues around what constitutes “New School” IdM.

In this piece, I’d like to comment on a couple of points Mark makes that I particularly agree with.

First off, Mark mentions that thorough analysis and review of IdM offerings is essential.  The selection team/steering committee needs to remember that no IdM product exists in a vacuum.  Testing against ERP, enterprise LDAP/AD, and other key systems is essential, and involving a pilot group is key as well.  I’d go a step beyond what Mark specifies by adding that your pilot group needs to be multi-disciplinary: just IT or Help Desk folks won’t cut it here.  Make sure there are some HR and ERP users along with other “typical” users in your organization.  You’ll need to do a little more hand-holding and training earlier than you’d like, but you’ll get better responses and metrics in return.

I’m also in agreement that you should review all offerings and available features/upgrades in your current infrastructure.  That “buried treasure” could be the key to keeping your infrastructure secure and compliant.  Also, find every way possible to use and reuse your current infrastructure; it can pay off in the long run.

It’s a tough economy out there, but that does not mean you should stop your review of IdM improvements.  Use the current time for evaluation and planning.  Bring some vendors in for a PoC to make sure their offering fits into your current infrastructure.  The best place to start looking is right in your server rooms and data centers.  Go to it!

The Overestimation of Knowledge

When it comes to dealing with risks and understanding the distribution of risks, we greatly overestimate what we know.  We use mathematical models derived from observable phenomena that may in fact be random or misleading.  Even worse, many take the fact that it never happened “here” as proof that the threat must be exaggerated.

Right now some are turning to their respective governments, demanding they “deal” with the current recession.  What do these men, many of whom are academics, know?  Does reading make one omniscient?  Does living your entire life on the taxpayer’s dime make you uniquely qualified to set market policy?  Like a blindfolded passenger jerking the steering wheel and stomping on the gas, they are far more likely to send an economy headed for a ditch into a tree.  All the mathematical models in the world, designed by academic geniuses, did not prepare the financial industry for the collapse that happened.

Who today is any different?  One hears many information security professionals speaking with such assuredness about their perimeter security.  I see lax practices in major corporations where, as long as it passes audit, they are happy.  One supposes that if something goes wrong they can always blame an outside auditor, or at least the junior member of the team.  What did Mel Brooks, playing Governor William J. LePetomane, say in Blazing Saddles?  “We’ve gotta protect our phoney-baloney jobs, gentlemen, we must do something about this immediately!”  That something is frequently finding a scapegoat.  Leaders who brag about their decisiveness and bark orders at subordinates, who are the epitome of knowledge and confidence, who spout advice on success to the lesser, suddenly become hapless victims, mere naïve children.  Irrespective of whom one blames, the end result is the same and the damage is done.

The drive to grow the modern enterprise quickly is the source of many of these problems: every successful quarter reinforcing the risky behaviour, every interview an opportunity to put one’s knowledge on display, a tireless parade of sycophants anxious to win trust.  When cells grow quickly in the body, it means one is a fetus.  If one is an adult, it means cancer.

Rapid growth may lead to capital appreciation and nice dividends for a decade, but it also leads to a failure to hedge against catastrophic risks, to reckless behaviour, and frequently to fraud.  When one person wins $300 million in a lottery, they say he got lucky.  When 10,000 entrepreneurs enter the market with the same basic idea and one of them succeeds, they call it genius.

Perhaps instead mankind is a blind squirrel grubbing for the proverbial nut, and only some of us have the humility to admit it.  It is impossible to identify every risk or anticipate every possible outcome, and for the last fifty years we in the West have had the benefit of being relatively free of want.  Our ancestors saved and prepared for unpredictable disaster, and braced themselves emotionally for the loss of children, because the world was uncertain.  Many of those uncertainties have been reduced, but others abound.  Dealing with risk means building robustness and redundancies, establishing financial reserves, and going slower, because mitigating risk slows you down.  This recession might have shown us who was properly prepared, by letting those who weren’t disappear into financial history; instead we socialized the risk across the whole of America, and it feels a lot like a suicide pact.  They have the knowledge; we have the exposure.

VDS Logging

The astute observer will notice that the most recent releases of SAP’s NetWeaver Virtual Directory Server are missing the logging control buttons.  There is a very good reason for this seemingly missing functionality.  Much like NetWeaver Identity Management, VDS is merging into NetWeaver, specifically into NetWeaver’s logging framework.  This means there is no longer a need for VDS to offer its own internal logging.

However, VDS also offers the ability to run in a “standalone mode,” which allows VDS to run independently of NetWeaver.  If you plan on running in this mode, you’ll need the following configuration tweak in order to access the logs:

Update the file standalonelog.prop, which can be found in the Configurations folder.  If you do not have this file, information on creating it can be found in the SAP NetWeaver Identity Management Operations Guide, available on SDN.  The file is a basic text file that sets the log level and the desired location of the log file.
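
For illustration, a minimal standalonelog.prop might look something like the following.  I should stress that the property names and path below are hypothetical; check the Operations Guide for the exact keys your version expects:

    # Hypothetical standalonelog.prop - verify the exact key names
    # against the SAP NetWeaver Identity Management Operations Guide.
    # Log verbosity, e.g. DEBUG, INFO, WARNING, ERROR
    loglevel=INFO
    # Where VDS should write the standalone log file
    logfile=C:/usr/sap/vds/logs/vds_standalone.log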

Once this file is configured, it needs to be placed in the Work Area folder (typically found underneath the Configurations folder).  Note that creating this file will not bring the buttons back; it will only write the logs to the path specified in standalonelog.prop.

From what I understand, the internal log viewer will be back in the next Service Pack for VDS.  It will be good to have it back.

IDM Scripting – tip

When creating scripts (JavaScript or VBScript) to be used within NetWeaver Identity Management, make it a point to create and refer to global constants/variables rather than referring directly to values that might change over time or between environments.  This simple tip makes it much easier to manage and maintain the scripts in your IDM configuration.  Without global constants/variables, you would have to comb through each and every script that might use or directly refer to a value that needs to be modified, and then change it within each script.  With a global constant/variable referenced from multiple scripts, you make the necessary modification in one central location and you’re done with the change; it greatly simplifies script maintenance.
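
As a minimal sketch of the idea in JavaScript (the constant names and the helper function here are made up for illustration; in NetWeaver Identity Management you would define these as global constants/variables in your configuration and reference them from each script):

    // Environment-specific values defined once, in a single shared location.
    // In NetWeaver IdM these would be global constants/variables; the names
    // below are hypothetical examples.
    var IDM_GLOBALS = {
        MAIL_DOMAIN: "example.com",           // differs per environment
        LDAP_SERVER: "ldap.example.com:389",  // differs per environment
        USERS_OU:    "ou=users,dc=example,dc=com"
    };

    // Scripts refer to the shared constant instead of a hard-coded literal...
    function buildMailAddress(userId) {
        return userId + "@" + IDM_GLOBALS.MAIL_DOMAIN;
    }

    // ...so when the mail domain changes, you edit IDM_GLOBALS in one place
    // and every script that calls buildMailAddress picks up the new value.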

Project Scope and Sustainability

(This post was written by Matt Pollicove)

One thing I’ve noticed when talking to people about Identity Management projects involves how to determine the project’s overall scope.  “How do I scope this?” they ask me.  That’s tough to answer off the cuff, especially when considering Pollicove’s Law of Provisioning, which basically says there’s no guarantee that companies in the same vertical will work the same way.

However, I think there are some best practices worth considering when implementing an Identity Management solution:

1. Make sure you have executive sponsorship.  C-level support is going to be important in balancing the needs of your stakeholders and their budget dollars.

2. Make sure you have a good plan of what you want your Identity Management solution to cover.  An essential part of this is conducting a thorough assessment.  Document everything, diagram existing processes, then take them apart and put them back together the way they should be, then do it again.  This should be done with a combination of internal and external resources.  Internal resources know how current systems are configured and interact.  External resources will offer an impartial assessment of how these systems can interact more efficiently, and will also be helpful in determining which Identity Management products will work best in your infrastructure.

3. When building your plan, know where you’re starting from.  What will be used as your authoritative store?  How will it be built?

4. Know where you’re going.  What will you provision to?  What will you control access to?

5. Figure out how you will get from 3 to 4.  How will you engineer your changes?  What will the phases of your project consist of?

Item 4 is probably the most important part of the process.  Many a project has suffered due to overreaching phase objectives.  Carefully define what you want to achieve in each phase.

Data cleansing and analysis is almost always your first phase.  If you don’t have clean data, you won’t have a clean project.  Future phases can deal with:

· Creating an Authoritative Store
· Provisioning to essential systems
· Password management
· Role management
· Provisioning to secondary systems
· Etc.

So the big question is: what order does this happen in, and how long will it take?  I always suggest going after the “low-hanging fruit” first, starting with the easiest objectives that will show the biggest net gain.  As part of number 1 above, think about solving automation needs and compliance needs, and about addressing the cost of password management requests to the help desk.

How long will it take?  As long as it has to.  This is going to be a major project that will affect many systems and departments.  Take it slow and easy.  Test thoroughly, and make sure there’s a good knowledge management/training initiative to let your users know what’s happening and how everyone will benefit.  It’s never good if your users equate an Identity Management initiative with foul-tasting medicine.  This includes your stakeholders.