Wednesday, November 24, 2010

Process Scalability and its Importance

It is often seen that a process that works well for a small organization suddenly seems to fail as the organization grows. The failure is usually attributed to new systems and to the influx of new people who do not ‘understand’ the intent and the need of the original process.

The fact, however, remains that the failure cannot be attributed to the people or the systems, but to the process itself. When a process is defined, it must factor in scalability. Scalability is the “ability to scale” or the “potential to adapt” to new dimensions.

The dimension in question here is the growth of the organization. A well-designed process should be able to absorb organizational change automatically. Here are some typical symptoms of a process that is not designed for scalability:
  1. When you find in your QMS any policy or process that has not been revisited (reviewed, modified, appended, etc.) for more than 8 months.
  2. When the quality department of the organization is talking of only one thing – “we need to be compliant with CMMI” or “we need to be compliant with ISO” – instead of “we need to see how CMMI can be used to make life easier and business cheaper” or “how can we make processes simpler”.
  3. When only a few are talking about data and metrics while the rest of the organization is fighting over late work hours and poor quality, or when there is a lot of blame-shifting happening.
  4. Management never questions the integrity of data or how the baselines are arrived at.
  5. The organization appoints “fixed” people to do all its process design; most of the time these people are invisible and become active only just before a compliance check.
On the other hand, these are the symptoms that indicate a scalable process:
  1. All employees talk about the business from the perspective of quality and metrics (data points)
  2. There are employees on rotation to design processes, and there is data to prove that the rotation works
  3. The major complaint is about lack of business, or too much business, rather than about lack of tools, overtime, redundant processes, etc.
  4. All policies and process documents show a revision at least every 8 months
  5. The quality team has a strong say and visibility in improving the business, and there are plans visible months ahead indicating all checks (internal and external), meetings, and internal and external drives – across all departments.
The time period of 8 months is crucial, as the efficiency of a designed process can be determined only after a proper gestation period. In an organization with capable and scalable systems, a check happens every 6 months to revisit the past and re-plan the future. The changes to plans, policies and processes are then completed within a fixed time after these half-yearly inputs from stakeholders across the organization – typically about two months.

Anything beyond that is an indication of a process scalability failure.
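As a rough illustration of this timing rule, here is a minimal sketch (in Python) that flags QMS documents which have gone more than 8 months without a revision. The register contents, document names and dates below are purely hypothetical.

```python
from datetime import date, timedelta

# Hypothetical QMS register: document name -> date of last revision.
# The names and dates are illustrative only.
qms_register = {
    "Project Management Process": date(2010, 1, 15),
    "Code Review Checklist": date(2009, 12, 1),
    "Configuration Management Policy": date(2010, 7, 20),
}

# ~8 months: a 6-month review cycle plus roughly two months to apply the changes.
REVIEW_WINDOW = timedelta(days=8 * 30)

def overdue_documents(register, today=None):
    """Return the documents whose last revision is older than the review window."""
    today = today or date.today()
    return [name for name, last_revised in register.items()
            if today - last_revised > REVIEW_WINDOW]

for doc in overdue_documents(qms_register, today=date(2010, 11, 24)):
    print("Scalability warning: '%s' has not been revisited in over 8 months" % doc)
```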

Process scalability is very important as it helps not just the growth of the business but also ensures that the organization is adept at meeting the ‘present’ situation.
There is an interesting quote that says “Yesterday is history, tomorrow is a mystery, but today is a gift; that is why it is called the present!”.
The processes that are designed should be scalable enough to be used today, now! If designed properly, they will automatically provide the data and inputs needed to scale for tomorrow, so that tomorrow becomes less of a mystery.
The most important aspect of scalability is that the business is not impacted in any way as the organization grows, and that the organization does not have to put in double and triple the effort to stay compliant with a standard, framework or model, or with its own internal needs.

Three tips to make the processes scalable:
  1. Have a mechanism where both SQA and SEPG members are rotated within the organization
  2. Make plans visible and managed – ensure that all process-related plans are published, communicated and referred to at every opportunity, and that they are monitored for changes regularly
  3. 24X7 Communication – ensure that there are multiple modes of communication throughout the year on the intent and the principles of the various interlinked systems within the organization
It is not enough if only a few people talk about this. ALL should talk the same language and ALL should understand the intent and its importance for this to work.

Thursday, May 6, 2010

Transcending from Process Definition to Process Design

It is now becoming very clear that the need for process definition is slowly moving away from software/systems lifecycles and operations towards entire supply-chain and business processes. The need is also moving towards ‘design’ instead of ‘define’.

The advent of technology and the change in the global scenario – better communication and far better opportunities than 20 years ago – have changed the game of process improvement.

It is no longer about how a typical SDLC (Software/Systems Development Life Cycle) should be defined, but about covering the end-to-end scope of the present and future business of a company. Many of the complex tasks of managing time, effort, productivity, defects, etc. have been converted into tool-based solutions that aid in better, faster and easier project management. The revolution of ‘tooling’ all SDLC activities has enabled good project managers to be better informed and has reduced their workload to a great extent. This tooling revolution has also triggered the question of whether the role of a project manager is still needed, since the work can now be done via a set of intelligent tools. All it requires is a technical person who doubles up as a project manager and manages the team, thus reducing the overhead of a dedicated project manager role.
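To give a feel for the kind of roll-up such tools automate, here is a minimal sketch that aggregates effort, size and defect counts from a hypothetical tool export. The record format, field names and figures are assumptions made for illustration and do not correspond to any particular tool.

```python
from collections import defaultdict

# Hypothetical export from a project tracking tool: one record per completed task.
task_records = [
    {"owner": "dev_a", "effort_hours": 12, "size_fp": 3, "defects": 1},
    {"owner": "dev_a", "effort_hours": 8,  "size_fp": 2, "defects": 0},
    {"owner": "dev_b", "effort_hours": 20, "size_fp": 4, "defects": 3},
]

def summarise(records):
    """Roll up effort, size and defects per owner - the kind of view a tool provides automatically."""
    totals = defaultdict(lambda: {"effort_hours": 0, "size_fp": 0, "defects": 0})
    for record in records:
        for key in ("effort_hours", "size_fp", "defects"):
            totals[record["owner"]][key] += record[key]
    return dict(totals)

for owner, stats in summarise(task_records).items():
    productivity = stats["size_fp"] / float(stats["effort_hours"])  # function points per hour
    print(owner, stats, "productivity=%.2f FP/hour" % productivity)
```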

The same trend is also applicable to various roles in Quality and Human Resources, where these roles would ultimately be part of the tooling revolution and become part of the day-to-day activities of the organization's employees – alongside their development tasks.

This emphasises the need to ‘design’ processes to fit this future mode of operation, so as to enable businesses and people in a much better and more efficient manner. The word ‘define’ will soon lose its significance, as it is about being concise and limited to the entity that needs definition, while ‘design’ is a more creative word. This, I believe, will be the new lexicon when it comes to processes.

Transcending from 'define' to 'design' will also require people with special skills and creative talent. The new “Quality Designers” should understand human and business behaviour, be able to apply concepts from Cognitive Psychology, Anthropology, Behavioural Economics and Interactive Design, and be comfortable with both present and futuristic technologies – so as to transform the existing paradigm into new, usable, personal processes designed to fit the business need of the company.

Sunday, April 25, 2010

Future of India and Quality

I was going through an interesting piece of writing on NASSCOM's predictions for the future. The presentation discusses some of the points in detail. You can find more here:
http://www.nasscom.in/Nasscom/templates/NormalPage.aspx?id=56269
http://www.nasscom.in/nasscom/templates/flagshipEvents.aspx?id=56362

A word of caution – my personal view:
This report projects a wonderful future scenario in which India can be a great place to be, given the projected global growth and the opportunities for improvement within the country. According to the report, there will be a surge in the healthcare, education and finance domains within the country, where the need to provide healthcare, education and bank accounts to the remotest parts of the country is expected to become a certainty.

Alongside such reports, which project healthy growth, I would also urge you to look at the statement issued by the World Bank:

“The strategy envisages total proposed lending of US$14 billion for 2009 - 2012. As private financing dries up in the wake of the global financial crisis, the Bank has agreed to provide an additional US$ 3 billion as part of the total financing envelope of US$ 14 billion.”

This statement contradicts what NASSCOM is trying to say. You can find more if you visit the World Bank site here: http://go.worldbank.org/OQ25M3AW80

Our job in the Quality World – my personal view:
We in the area of quality profess to have the required ‘stuff’ to ensure the implementation of any standard/framework/model. Our focus should shift from the current scenario, where we focus just on a typical SDLC, to the more robust area of Business Process Engineering/Re-engineering, in such a way that the overall goal, apart from meeting the needs of the business, is also to ensure that our debts to these ‘world banks’ are reduced.

I would also like to interest you in a book called “Confessions of an Economic Hit Man” by John Perkins. In this book, John explains how the economies of various countries can be changed by just a few positive projections, which in turn project high returns on investment and urge countries to take up loans. But as the false projections fall apart, the country is pushed deeper and deeper into debt with the banks that offer these loans, in effect making the people who control these banks the “Invisible Kings”. This is also the reason why the rich get richer and the poor get poorer. Do read this interesting book and arrive at your own conclusions.

In a nutshell – my personal view:
We can still make it great, provided we move beyond merely converting information into knowledge and start moving from information to knowledge, and from knowledge to wisdom.
Do let everyone know your thoughts, and do quiz me for any further information.

Sunday, March 28, 2010

An Insight into Knowledge Transfer

Folks,
Here is an interesting presentation on Knowledge Transfer. The author segregates people involved (roles/population) into five different categories:
  1. Innovators (2.5%)
  2. Early Adopters (13.5%)
  3. Early Majority (34%)
  4. Late Majority (34%)
  5. Laggards (16%)

Though this presentation is not from a typical Software Background, it does provide a good insight into how these aspects affect even the Software Industry.

I hope this presentation prompts you to place the people with whom you interact regularly into these five categories, and helps you adopt better ways of encouraging project folks to get into the habit of proper Knowledge Transfer.
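As a toy illustration of those percentages, the sketch below estimates how a team of a given size might split across the five categories. The team size and the simple rounding are assumptions; real teams will of course not follow the ratios exactly.

```python
# Adoption categories and their typical shares, as quoted in the presentation.
ADOPTION_CATEGORIES = [
    ("Innovators", 0.025),
    ("Early Adopters", 0.135),
    ("Early Majority", 0.34),
    ("Late Majority", 0.34),
    ("Laggards", 0.16),
]

def expected_split(team_size):
    """Rough headcount per category for a team of the given size (illustrative only)."""
    return {name: round(team_size * share) for name, share in ADOPTION_CATEGORIES}

print(expected_split(40))
# -> roughly 1 innovator, 5 early adopters, 14 + 14 in the two majorities, 6 laggards
```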

Sunday, March 21, 2010

Measurement, Metrics, Indicators

Here is an interesting article on the difference between Measurement, Metrics and Indicators: http://www.stsc.hill.af.mil/crosstalk/1995/03/Measure.asp

This article's distinctions can be compared with those between Standards, Frameworks and Models.
  1. A measurement is like a standard – something established globally (like the value of a $, lines of code, function points, time, etc.) or locally within an organization (like man-months).
  2. A metric is like a framework – it uses a combination of measures to arrive at a meaningful outcome (like cost variance, size variance, productivity, etc.).
  3. An indicator is like a model – it is established and interpreted based on the needs of the business (e.g. cost variance should be below 5%, size variance should be less than 2%, productivity should be greater than 8 FP per work day).
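A minimal sketch of the three layers, with purely illustrative figures; the thresholds simply mirror the examples above and would in practice come from the business objectives.

```python
# Measures (the "standards" layer): raw values in agreed units.
planned_cost = 100000       # in some agreed currency unit
actual_cost = 104000
size_fp = 250               # function points delivered
effort_days = 30            # work days spent

# Metrics (the "frameworks" layer): combinations of measures.
cost_variance_pct = (actual_cost - planned_cost) / float(planned_cost) * 100
productivity_fp_per_day = size_fp / float(effort_days)

# Indicators (the "models" layer): thresholds interpreted against the business need.
print("Cost variance %.1f%% - %s" % (cost_variance_pct, "OK" if cost_variance_pct < 5 else "needs attention"))
print("Productivity %.1f FP/day - %s" % (productivity_fp_per_day, "OK" if productivity_fp_per_day > 8 else "needs attention"))
```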

It is very important that we use the right combination of all three to ensure that the business/organization benefits from them in line with its business objectives.

Wednesday, March 17, 2010

Model, Standard or a Framework?

We are surrounded by many models, standards and frameworks to help implement various business and quality needs. But how do we differentiate between the three?

First let’s look at what the dictionary says about these words (taken from http://www.m-w.com/) – which is relevant to our context:

  1. Standard - something set up and established by authority as a rule for the measure of quantity, weight, extent, value, or quality
  2. Model – a system of postulates, data, and inferences presented as a mathematical description of an entity or state of affairs; a computer simulation based on such a system
  3. Framework – a basic conceptual structure (as of ideas) ; a skeletal, openwork or a structural frame


Here are specific definitions of these words:
  1. Standard - Very rigid, generally accepted methods of doing something. Very specific. A standard will usually only include a single element (i.e. do this, this way) whereas a framework or model defines a system of doing things.
  2. Model - This is the process of how we get from point A to point B. Using the house analogy: if we were a builder, we would construct houses the same way every single time. Other builders might do it in different ways – some might be faster, some might produce a better quality product – but they all arrive at the destination. The model is just the path we take to get there. These are our processes. We aren't going to build one house one way and another house completely differently, right?
  3. Framework - A framework is a support system. It may not be the whole picture, but it provides a strong base for building upon. I always liken it to the frame of a house. It can stand on its own, but it's really there to be added to. However, you can never just take it away.

So, as you can see, ITIL is a framework: it is used for a specific purpose and can be implemented in more than one way, but what needs to be achieved is very rigid.
All the ISOs are standards – very rigid in terms of the outcome and also in terms of how it has to be done.
CMMI, on the other hand, is a model. It is a combination of ideas and experience put together for the business to interpret, in the context of that business. The model expects certain things to be in place (like an SEPG and SQA), but is not very rigid about how they are done. All it focuses on is that the end result should benefit the business.

All three focus on how the business can be bettered, but their usage depends on what the business requires. In my view, to ensure proper rigor in the members, implementing a standard is very important – like ISO 27K, ISO 20K or ISO 9000. These standards have fixed objectives and can easily be assessed against, but the ‘how to’ can be tailored to fit the needs of the business. So standards are assessed against objectives rather than the ‘how to’.
To ensure that members abide by a set of rules that guide them on the ‘how to’ of coding, incident management and so on, ITIL, PMI or the various SDLCs can be considered. These are all frameworks that have been proven to help keep activities within certain boundaries – and so the result may vary across different businesses. The interpretation of the framework impacts both the objective and the ‘how to’. This is why we cannot assess organizations against a framework.
A model (like CMMI), on the other hand, addresses the issue differently. It provides a set of ‘how tos’, but also gives a very wide set of ideas for implementing them. The objective is to address the ‘how to’ part, which in turn is mapped to the BUSINESS OBJECTIVES – where the organization has to define those business objectives itself. The assessment in this case is based on how the various processes required to be in place map to the business objectives of the organization.

It is only the right combination of all the three that drives the true improvements in an organization.

Tuesday, March 16, 2010

Constituents of a Process and do we actually require a checklist?

Here is an interesting PDF on “the difficult process of defining a process”, which speaks of the importance of taking the ‘big picture’ of the activity into consideration so that the process is accurate. But let us step back a bit and ask ourselves – why do we require a process?

Typically a process is put in place to ensure that there is a ‘common’ way of executing an activity. The need for this is usually governed by the fact that the activity in question is performed to achieve a business or project goal, and is performed by a set of identified roles who are required to ensure that those goals are met. Having said that, let me cite an example that is common but does not seem to have a process in place – brushing your teeth. Is there a defined process that talks of brushing our teeth? Though the answer is a big ‘NO’, we still go about its execution in a nearly perfect manner, and at the same time would ridicule anyone who was asked to write a process for brushing our teeth.
The ridicule is not because it is a common way of execution, but because the so-called process has been “hard coded” into our system so deeply that we can do it with our eyes closed – which in many cases we typically do, as one does not always get up early in the morning fully awake.

So, what kind of activities actually require a process? A process should not be defined just for the sake of definition; proper due diligence has to be done before one even decides to define a process. All should be aware that every process has a cost associated with it – in terms of effort, money, or time. So defining a process just for the sake of defining it is a complete waste of time, effort and cost.

So what are the constituents of a process? Page 77 of the CMMI Development constellation V 1.2 has a neat list:
  • Purpose
  • Inputs
  • Entry criteria
  • Activities
  • Roles
  • Measures
  • Verification steps
  • Outputs
  • Exit criteria
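As a structural illustration only, the sketch below captures these constituents as a simple data structure; the example content (a hypothetical code review process) is mine and not taken from the CMMI text.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessDefinition:
    """The constituents of a process, following the list above."""
    purpose: str
    inputs: List[str] = field(default_factory=list)
    entry_criteria: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    roles: List[str] = field(default_factory=list)
    measures: List[str] = field(default_factory=list)
    verification_steps: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    exit_criteria: List[str] = field(default_factory=list)

# Hypothetical example: only the structure mirrors the list above; the content is illustrative.
code_review = ProcessDefinition(
    purpose="Find defects early through peer review of source code",
    inputs=["Completed code changes"],
    entry_criteria=["Code compiles and unit tests pass"],
    activities=["Prepare review package", "Conduct review", "Record and fix findings"],
    roles=["Author", "Reviewer"],
    measures=["Defects found per review", "Review effort"],
    verification_steps=["All findings closed or deferred with rationale"],
    outputs=["Review record"],
    exit_criteria=["Review record approved"],
)
```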
Once the process has been defined, one way to help its users to follow it is by providing templates.
A checklist, however, has a different purpose altogether. One creates a checklist only when the details of a process's constituents are too many, or when the defined process is not clear. As in the earlier example, we do not use a process to brush our teeth, and similarly we do not use a checklist to check whether the brush is available, whether the paste is there, and so on. That checklist, too, is “hard coded” into our system.
Like a process, a checklist exists to enable a quick check of compliance with a process. It is in no way a complete assurance that the process is being followed – for example, if the process has been tailored, the checklist should cover that fact, otherwise the intent of the check is lost. Our minds, for instance, are tuned to use a brush and not a stick from a neem tree; yet the intent of the process of brushing our teeth is still met, and our hard-coded mind can accept this and live by it because we ‘know’ the intent is met.
Similarly, the user of a checklist should “know” the project and the process in order to use the checklist effectively; otherwise, the intent of using the checklist is lost.
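To make the tailoring point concrete, here is a toy compliance check that respects an approved tailoring record; the checklist items, tailoring set and evidence are assumptions made for illustration.

```python
# Items on the checklist, an approved tailoring record, and the evidence gathered so far.
checklist = ["Review record exists", "Test report attached", "Traceability matrix updated"]
tailored_out = {"Traceability matrix updated"}   # waived through approved tailoring
evidence = {"Review record exists": True, "Test report attached": False}

for item in checklist:
    if item in tailored_out:
        print("SKIP (tailored): " + item)   # not flagged - the tailoring keeps the intent intact
    elif evidence.get(item, False):
        print("PASS: " + item)
    else:
        print("FAIL: " + item)
```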

Tuesday, March 2, 2010

Value of Variance

All data analysts will keep a keen eye open for variance from the norm of any process or product measure. The main objective is to keep the performance of the measure stable and capable - within the acceptable norm.
Let's quickly look at a situation where there is no variance:
  • Great quality
  • Great performance
  • Great predictability
  • Very stable and capable process
But to achieve this, we would require ‘machine’-like people to execute the process, and such people would cost the company a lot of money to hire, sustain and retain, as they typically come with perfection all around them – and taking decisions in a perfect world is almost impossible.

The downside of zero variance is:
  • No mistakes - so no learning from mistakes
  • No improvement possibility (in terms of knowledge)
  • No growth path beyond what has been achieved
Variance gives us the intent and the opportunity to dream beyond, and the hope of achieving it. It promotes mistakes, lessons learnt, knowledge sharing and so on – provided it stays within an acceptable range. Variance helps the business of an organization grow and allows the true value of the organization (its people) to be truly valuable, thus increasing the maturity of the organization.
This will in turn create more visions for the organization, based on which it can go on a mission with this enhanced value.

Variance control should not go into overdrive, as that would kill many necessary aspects of an organization's DNA.
This big picture of the value of variance is a must-have view for all professionals. It is the basic reason for ensuring that variance must exist.
Yes, it cannot go beyond a certain limit, and that is what the data analysts should be able to determine – keeping the intent of variance very much alive.
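As a small sketch of how an ‘acceptable range’ might be derived, the example below uses one common convention (mean plus or minus three standard deviations) on hypothetical sample data; the measure, the values and the choice of convention are assumptions, not a prescription.

```python
from statistics import mean, stdev

# Hypothetical monthly values of a process measure (say, review effectiveness in %).
samples = [72, 75, 70, 78, 74, 69, 76, 73]

mu = mean(samples)
sigma = stdev(samples)

# One common convention for the acceptable range: mean +/- 3 standard deviations.
lower, upper = mu - 3 * sigma, mu + 3 * sigma
print("mean=%.1f, stdev=%.1f, acceptable range=(%.1f, %.1f)" % (mu, sigma, lower, upper))

new_value = 81
if lower <= new_value <= upper:
    print("Within the acceptable range - variance is present, and that is fine")
else:
    print("Outside the acceptable range - worth investigating")
```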

Tuesday, February 23, 2010

The Quality Toolbook


This is an interesting site that gives a very good description of how to use the various tools and charts. They have a very simple and effective structure:
1) When to Use it
2) How to understand it
3) Examples
4) How to do it
5) Practical Variations

Do visit this site to get a very good insight into how to use the various quality tools and charts.