Password Sync or SSO?

I’m wondering why organizations are still doing password synchronization across disparate systems rather than Single Sign-On (SSO).

It seems to me that you’re looking at roughly equal amounts of effort either way: distribute passwords via sync or set up an SSO solution.  SSO provides a much better degree of security, since even if a password gets hacked, the attacker isn’t getting the keys to the kingdom.

What makes this even more worrisome is that, given the way password sync works, some systems are easier to hack than others: an attacker can simply work on the repository that has the weakest policy.  Invariably this is a mainframe or legacy app that won’t accept mixed-case, special, or numeric characters. Even a long password’s benefits are rendered moot in these circumstances.
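To make the weakest-link problem concrete, here is a minimal sketch in Python. It is my own illustration; the system names and policies are hypothetical and not taken from any particular product. The point is that a synced password has to be accepted by every connected system at once, so the effective policy collapses to whatever the most restrictive legacy system allows.

```python
import string

# Hypothetical policies for two synced systems: a directory with a reasonably
# rich character set, and a legacy mainframe that accepts only upper-case
# letters with an eight-character maximum.
POLICIES = {
    "directory": {"allowed": set(string.ascii_letters + string.digits + "!@#$%"), "max_len": 64},
    "legacy_mainframe": {"allowed": set(string.ascii_uppercase), "max_len": 8},
}

def accepted_everywhere(password: str) -> bool:
    """A synced password must satisfy every system's policy simultaneously."""
    return all(
        len(password) <= policy["max_len"]
        and all(ch in policy["allowed"] for ch in password)
        for policy in POLICIES.values()
    )

print(accepted_everywhere("Str0ng!Passphrase"))  # False: the mainframe rejects it
print(accepted_everywhere("ABCDEFGH"))           # True: the weakest link wins
```

In other words, the directory’s stronger policy buys nothing: any password that survives synchronization is one the weakest system would have accepted anyway.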

Rendering Mediocrity

“The general tendency of things throughout the world is to render mediocrity the ascendant power among mankind.” – John Stuart Mill

Here we have John Stuart Mill discussing “best practices.” Not really, of course, but when you think about it, if everyone is doing the same thing the same way, that’s mediocrity. The first few companies that adopt or discover the best way to do something reap the benefit. Best practices are ephemeral, married to the dominant business processes of a particular age. Frequently they are not even best practices but merely fads. Anyone remember the ten-year strategic plans of the mid 1980s? I can remember when bankers would ask to see at least a five-year strategic plan. What does this have to do with Identity Management? This: I am frequently asked what the best practices are concerning IAM. This is usually an IT-centric view, and the askers are thinking about technical deployment. I believe a better question is this: “How can we use this tool to augment our competitive advantage while lowering our total risk?” It’s possible to do both with IAM. There aren’t a lot of systems in the security domain that can.

It’s all about the relationship

The morning sessions here at Burton largely revolved around the topic of relationship management.

It’s long been debated what gets done after provisioning is completed.  No one believes that IdM stops there.  In my time I’ve heard of several next steps once provisioning/de-provisioning is completed (aside from continuing maintenance of the provisioning/de-provisioning system):

  • execution of more detailed compliance work
  • further refining entitlements and role management
  • continued maintenance of the authoritative store
  • Business Process Management exercises

It seems that the Burton folks believe that the next phase of Identity Management must be the creation and maintenance of relationships.  Once a person is defined in the authoritative source, building relationships between identities and services needs to happen. Several examples were cited as to how relationships can be made and maintained, with notable ones coming from the fields of Social Networking and eCommerce. Certainly a case can be made for areas such as Education, Business-to-Business, CRM, and Vendor Relationship Management (a new one to me) as well.
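As a way of picturing what “building relationships between identities and services” might look like in practice, here is a minimal sketch. It is my own illustration, not a model Burton or any vendor prescribes; the field names and example values are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Relationship:
    identity_id: str          # key of the person in the authoritative source
    service: str              # the service or party the identity relates to
    relationship_type: str    # e.g. "customer", "student", "vendor contact"
    established: date = field(default_factory=date.today)
    attributes: dict = field(default_factory=dict)  # context-specific detail

# The same identity can hold different relationships to different services.
rels = [
    Relationship("jdoe", "web_store", "customer", attributes={"loyalty_tier": "gold"}),
    Relationship("jdoe", "partner_portal", "vendor contact"),
]
for r in rels:
    print(r.identity_id, "->", r.service, f"({r.relationship_type})")
```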

I like this from an operational standpoint.  It’s actually something outward-facing to be done in the IdM space, as opposed to inward-facing compliance and role management activities.  Not that those don’t have an important purpose, mind you, but it always seems to me that it’s nice to have something to do with the data as opposed to having it just lie there.

There are still some challenges ahead, though, as we determine the exact way that these relationships will be made and maintained.  Who the providers of the relationship service will be is probably the biggest challenge.  Will they simply be part of Web/Identity 2.0, or something else that we’re not quite aware of yet?  This will be one of those topics that will be interesting to follow in the coming months.

Burton Day 1

I am here at the Burton conference and I can already tell it’s going to be interesting. I’ve spoken to a few people about what they’re thinking about in IdM and I’ve heard a lot of different responses.

SOA and SaaS seem to be high on everyone’s radar.  There’s also a fair bit of talk about Role Management, which is a good thing since there’s a bunch of sessions scheduled on this topic for tomorrow.

I’m looking forward to hearing what Burton has to say.  I’ll be posting when I can to give some of the main points and what I’m hearing from people on the floor.

Creating An Authoritative Store

I’ve been thinking recently about what the best method is for creating an authoritative Identity Store for use in a provisioning solution.

It seems to me that there are three ways to do this:

  1. Use authoritative attributes from one or more trusted repositories to create a master authoritative store.
  2. Read each repository into a separate database table and merge them together into a temporary or holding table.
  3. Read each repository into a single holding table, overlaying information as needed.

I think all three methods are viable, but one main question needs to be asked first:

Does it make sense from an architectural or process standpoint to have a holding table? There are times where the holding table can be used for other purposes (as the basis for a Virtual Directory, as a feed for an external access application, or just for plain redundancy).  The issue, however, is that these tables take up two critical resources: time and storage.  Building holding tables requires extra operations (passes, in the SAP NetWeaver Identity Manager world), and in large organizations these tables can get large fairly quickly.

The other issue with building a holding table from multiple tables is that there must be a common key between all of the tables.  Strangely enough, this is not as common as one would think, especially when some of the tables that will make up the authoritative store come from legacy systems.
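Here is a minimal sketch of the holding-table approach (methods 2 and 3 above), assuming each source repository has already been read into a dictionary keyed by a shared employee ID; the repository and attribute names are hypothetical. The overlay only works because every source carries that common key.

```python
# Hypothetical source repositories, each keyed by the same employee ID.
hr_system = {
    "1001": {"given_name": "Jane", "surname": "Doe", "department": "Finance"},
    "1002": {"given_name": "Raj",  "surname": "Patel", "department": "IT"},
}
legacy_app = {
    "1001": {"mainframe_login": "JDOE01"},
    # 1002 has no legacy account, so nothing to overlay for that key.
}

def build_holding_table(*sources: dict) -> dict:
    """Overlay each source onto a single holding table, later sources winning."""
    holding = {}
    for source in sources:
        for key, attrs in source.items():
            holding.setdefault(key, {}).update(attrs)
    return holding

holding_table = build_holding_table(hr_system, legacy_app)
print(holding_table["1001"])  # merged HR and legacy attributes for one identity
```

If a source uses a different key, a matching or correlation step has to happen before any of this can run, which is exactly where the legacy systems cause trouble.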

Whenever possible, I prefer to use authoritative attributes, as it helps to cut down on processing time and storage.  As I was saying to a colleague the other day, it’s simply that execution time is a function of the number of repositories to be processed and the number of attributes in each repository: the more attributes in the repositories, the longer it takes, and vice versa.  This method also reduces storage space and eliminates the need for links between the repositories.
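And here is the attribute-level approach (method 1) in the same style. The map of which repository is authoritative for which attributes is hypothetical; the point is that only the trusted attributes are pulled, so processing time and storage scale with the attributes you actually use rather than with everything each repository holds.

```python
# Hypothetical map of which source is authoritative for which attributes.
AUTHORITATIVE_MAP = {
    "hr_system":    ["given_name", "surname", "department"],
    "email_system": ["mail"],
}

def build_master_store(sources: dict) -> dict:
    """sources: {repo_name: {identity_key: {attr: value}}} -> master store."""
    master = {}
    for repo, attrs in AUTHORITATIVE_MAP.items():
        for key, record in sources.get(repo, {}).items():
            entry = master.setdefault(key, {})
            for attr in attrs:
                if attr in record:   # take only the attributes this repo owns
                    entry[attr] = record[attr]
    return master

sources = {
    "hr_system":    {"1001": {"given_name": "Jane", "surname": "Doe",
                              "department": "Finance", "shoe_size": "7"}},
    "email_system": {"1001": {"mail": "jane.doe@example.com", "quota": "2GB"}},
}
print(build_master_store(sources)["1001"])  # only authoritative attributes survive
```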

The next question becomes what to do with the old sources once you have a solid authoritative store, but that’s for a future article.

Verizon Data Breach Analysis and Federation

The Verizon data breach report is just a great read. One of the stats that really stuck out was the 39% figure for intrusions originating from vendors. The slow adoption rate of federation should not be surprising. People look at their own risk levels and project them onto others. In complex societies trust levels are significantly higher than in small towns where everyone ostensibly knows each other. To make federation work between suppliers and customers, you are going to have to take the time to audit each other’s risk management practices and infrastructure. If you don’t have the time or personnel, you have to question whether federation is worth it.

Start up…

This is a new blog on SAP’s NetWeaver Identity Manager, fka Maxware, from the practitioners of Secude Global Consulting. We will use our experiences installing, deploying, and configuring this product to help you get the maximum benefit from this versatile piece of software. You may also see an occasional post on systems thinking and enterprise risk management.