Most organizations have a strong bias toward planning, managing and executing a multitude of supposedly value-added activities, hoping that these (often isolated) activities will yield significant results. Ideally, activities are spawned by careful strategic planning, tracked regularly with performance data, reviewed for adjustments and improvements, and integrated across functions, divisions and geographies. In practice, this is the exception rather than the rule.
The emphasis on activities is especially noticeable in software development and other technology functions, due in part to the rapid and constant change in available tools, standards, procedures and best practices. In the rush to keep up with the latest technology, the current best practice or even the latest management edict, companies try new things and tinker with their processes, often without considering the effect on quality, cost, cycle time or other performance criteria. This happens at the executive level, the management level and the individual contributor level, with varying degrees of impact.
Sometimes a simple, conceptual understanding of behaviors and the performance they produce can help challenge and change the ways a company performs its work, and even push back on activities that yield little performance value.
Activity 1: Training vs. Learning
Most people have been through this. The latest corporate edict comes out: employees must attend mandatory training for something (safety, ISO, CMM, diversity, Six Sigma, etc.). Everyone is herded into an auditorium or cafeteria, the subject matter experts are assembled, and the PowerPoint slides start and continue for a few hours. Each attendee leaves with a pin on their company badge, a new pen and some new knowledge to make the company a better place to work. Like a scene straight out of Dilbert, this probably becomes the source of jokes and innuendos in short order. So why do companies do it? And, more importantly, how can this be changed?
Start with some simple facts. According to the Research Institute of America and several other sources, knowledge retention from classroom training degrades to 58 percent within 30 minutes of course completion, to 33 percent after 48 hours, and to less than 10 percent after a few weeks. The intent of company training is usually to change on-the-job behaviors. Given these statistics, it is easy to see that the activity of training alone does not meet that objective and can be a complete waste of time and money. Learning, if defined as “the application of new knowledge to drive positive, measurable changes to your activities,” is the higher-order objective. It requires the activity of training combined with the additional activities of coaching, practice, application, feedback and reinforcement. That combined process is called learning.
One of the positive contributions Six Sigma has made on this issue is the notion of driving learning, not just conducting training. When Six Sigma training is done right, a lot of focus and attention goes into the setup for success: candidates are carefully chosen and aligned with important projects, and the training is staggered and combined with coaching to ensure meaningful application and measurable results. The research indicates a 45 percent to 110 percent improvement in performance impact when these techniques are used. As with many things, a little more time up front and some good planning can yield significantly better performance. These lessons hold at the individual, management or executive level, and for any type of performance problem an organization is trying to overcome.
Lesson Learned: Focus on the process of learning, not just the activity of training.
Activity 2: Data Collection vs. Information Processing
Everyone has seen the mountain of paper – Excel spreadsheets, databases and checklists – and wondered what, if anything, will be done with it next. Most organizations are good at collecting data but poor at processing, managing and distributing useful information. And even when data is collected, its accuracy and relevance are often highly suspect: it is collected out of organizational habit, to comply with an outdated procedure or out of fear of losing some important piece of history. The Six Sigma methodology has taught many valuable lessons about the availability, use and accuracy of data. It also has exposed the old notion that “she or he who has the most data wins” as not necessarily true. What is really valuable is having accurate data, processing it in a way that adds value and presenting it in a rich visual display that drives toward an accurate conclusion. In other words, information processing.
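As a minimal sketch of that difference, the snippet below takes raw collected records and reduces them to a ranked, Pareto-style view that can actually support a decision. The field names and categories are hypothetical, standing in for a real defect log or spreadsheet:

```python
from collections import Counter

# Raw collected data: one record per defect (field names and categories
# are hypothetical, standing in for a real defect log or spreadsheet).
defects = [
    {"id": 101, "category": "requirements"},
    {"id": 102, "category": "coding"},
    {"id": 103, "category": "coding"},
    {"id": 104, "category": "design"},
    {"id": 105, "category": "coding"},
]

# Information processing: collapse the raw rows into a ranked,
# Pareto-style view that supports a decision.
counts = Counter(d["category"] for d in defects)
total = sum(counts.values())
for category, n in counts.most_common():
    print(f"{category:<14}{n:>3}  ({n / total:.0%})  {'#' * n}")
```

The point is not the tooling – a spreadsheet pivot does the same job – but that the raw rows are processed into a compact, ranked view before anyone is asked to act on them.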
Many problem-solving activities are derailed by inaccurate data. Most Black Belts agree that at least 50 percent of the processes they have worked on had data accuracy problems so severe that the data was not useful for problem solving, even with the most advanced tools. Much has been written about measurement systems analysis, and while discussing accuracy is not really the intent here, it is very important to note that if anyone is going to spend a lot of time collecting data, that activity should be guided by a higher-order process.
In software development, companies generally see a lot of data being generated in test; in many cases, too much to process. This feeds the misconception that most of the problems are with test. Test, after all, is the perceived bottleneck and one of the more expensive parts of the software development life cycle, and it gets much of the focus because this is where the defects are found. Far too little effort, however, is expended on discovering where the defects were inserted (the really important information). Given the nature of software development, insertion points are admittedly not easy to pinpoint, but part of this difficulty is a limitation set by the historic and conventional way of collecting data. If a company set out to build its data collection activity to capture insertion-point data, would it not have a higher probability of success? In some ways, companies fall into their own trap and continue to just collect the data, knowing full well it will never tell them something they really need to know.
Instead, those responsible should sit down, review the information (form, fit and function) that is actually needed, and tune the data collection activity so that it is timely and value-added for use in process and product improvement.
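One hedged illustration of what insertion-point-aware collection could look like: the sketch below assumes each defect record carries both the phase where the defect was injected and the phase where it was detected (the field names and phase list are invented for illustration), and computes a simple phase-containment view.

```python
# Hypothetical insertion-point-aware defect records: each record carries
# both the phase where the defect was injected and where it was detected.
PHASES = ["requirements", "design", "coding", "test"]

defects = [
    {"phase_injected": "requirements", "phase_detected": "test"},
    {"phase_injected": "design",       "phase_detected": "test"},
    {"phase_injected": "coding",       "phase_detected": "coding"},
    {"phase_injected": "coding",       "phase_detected": "test"},
]

# Phase containment: of the defects injected in each phase, how many
# were caught in that same phase rather than escaping downstream?
for phase in PHASES:
    injected = [d for d in defects if d["phase_injected"] == phase]
    if not injected:
        continue
    caught = sum(1 for d in injected if d["phase_detected"] == phase)
    print(f"{phase:<14} injected={len(injected)}  "
          f"contained={caught} ({caught / len(injected):.0%})")
```

A view like this shifts the conversation from “test finds all the defects” to “where are defects being inserted, and how early are they being contained?”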
Lesson Learned: Focus on the information needed to drive improvement, and tune data collection activities to yield accurate, usable data.