Mediocrity or Magnificence
Measures of Choice or a Choice of Measures
There are basically only two choices. Either you are diligently
pursuing one or more objectives consistent with your vision or
you're wandering aimlessly. And aimlessness is the sure road to
mediocrity. The choice is essentially up to you!
When an organization deploys Solution-Centered Support it should
be prepared to evolve through a number of stages of development.
During this evolution the organization should never lose sight
of the fact that Solution-Centered Support is never the objective,
but simply a strategy which will enable the organization to better
achieve whatever it is striving to accomplish.
Measures are essential for an organization to maintain an awareness
of what's happening in its environment, its response to that environment,
and the results it is producing. And at the same time it must
be noted that the measures are simply an indication of what is
being created by some activity. The numbers are a basis for determining
which aspects of the activity to focus on. If one focuses on the
numbers themselves they will change, yet the activity changes made
to alter the numbers are not likely to be the changes that most
benefit the organization. They will simply be the changes that
most benefit the numbers you were watching.
So it is critical to measure a variety of activities that will
influence meaningful and progressive changes. Here are activities
that, if measured, reported, and made the subject of management
focus and conversation, will generate improved performance,
productivity, and progress toward the organizational vision.
- Activity - activity is that
input from the external environment which drives the organization.
It is this activity which the organization must develop a response
to on an ongoing basis.
- Call Volume - call volume is simply the rate of received
calls. This should be tracked on a daily basis as it forms the
basis for the organization's response. Most organizations have
different classifications of calls because they have committed
to respond differently based on some call severity level.
- As an example consider an organization that takes 3 types
of calls, e.g., Type 1, Type 2, Type 3. It is important for the
organization to track the call volume trends over time to determine
how things are changing. Two examples are provided in the following
graphs.
- [Graphs omitted: call volume trends for Type 1, Type 2, and Type 3 calls]
- From the above graphs it appears that there is an increase
in Type 1 calls. It is appropriate to ask why this might be happening,
and what will be the impact on the organization if it continues.
- Response - response is what
the organization does based on the input activity. The organization's
response should be tracked on multiple dimensions to get an appropriate
sense of how the responses are changing. These response dimensions
are essentially leading indicators of the Results the organization
will ultimately produce.
- Hold Time & Abandon Rate - Hold time is
the time a customer holds on the phone before the call is answered
by support. It is expected that hold time is a leading indicator
of Customer Satisfaction to some extent, i.e., the longer a customer
is on hold the less likely they are to be satisfied with the
support transaction. Abandon Rate is the % of calls where the
customer decides the hold time has been too long and hangs
up, perhaps to call again later.
- [Graph omitted: hold time and abandon rate trends]
- One would expect that as the hold time increases the abandon
rate should also increase.
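- The abandon-rate arithmetic described above can be sketched as follows; the function name and call counts are hypothetical, for illustration only.

```python
def abandon_rate(calls_offered, calls_abandoned):
    """Percent of offered calls the customer abandoned while on hold."""
    if calls_offered == 0:
        return 0.0
    return 100.0 * calls_abandoned / calls_offered

# e.g., 500 calls offered in a day, 35 abandoned
rate = abandon_rate(500, 35)  # 7.0 percent
```

Tracked daily alongside hold time, this is the pairing one would plot to confirm that rising hold time drives rising abandonment.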
- First Call Resolution - first call resolution is the
% of calls which are resolved when the customer initially calls
rather than requiring one or more call backs. It is expected
that first call close rate is a leading indicator of the productivity
of the organization simply because call backs are time consuming.
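- The first call resolution percentage can be computed directly from call records; the contacts-per-call list below is invented for illustration.

```python
# Contacts needed to close each call; 1 means resolved on the initial call.
contacts_per_call = [1, 1, 2, 1, 3, 1, 1, 2, 1, 1]

# First call resolution: share of calls needing no call backs.
fcr = 100.0 * sum(1 for c in contacts_per_call if c == 1) / len(contacts_per_call)
# 70.0 percent resolved on the first call
```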
- Call Close Rate - because the organization has made
commitments for certain responses for specific types of calls
it needs to track its performance against these commitments.
To accomplish this it needs to track the trends for the Average
Time to Close (ATC) for each call type.
- [Graphs omitted: Average Time to Close (ATC) trends by call type]
- The two sample graphs provided indicate that all three call
types are running relatively steady in terms of the time to resolve.
Which doesn't mean things are ok. It simply means that things
don't appear to be trending in a troublesome direction. It also
means that things don't seem to be improving. And anytime things
aren't getting better they're probably getting worse - you just
don't know it yet!
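- The per-type Average Time to Close tracking described above might be sketched as follows; the call types follow the Type 1/2/3 example, while the durations are hypothetical.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical closed-call records: (call_type, days_to_close)
closed = [("Type 1", 2.0), ("Type 2", 5.5), ("Type 1", 3.0),
          ("Type 3", 9.0), ("Type 2", 4.5), ("Type 1", 2.5)]

# Group closure times by call type, then average each group.
by_type = defaultdict(list)
for call_type, days in closed:
    by_type[call_type].append(days)

atc = {t: mean(d) for t, d in by_type.items()}
# {'Type 1': 2.5, 'Type 2': 5.0, 'Type 3': 9.0}
```

Computed per week, these figures are the points that would be plotted to see whether ATC for any call type is trending in a troublesome direction.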
- Backlog - backlog indicates the volume of work in
progress that the organization is facing. Backlog is a trouble
light. It generally provides a leading indicator for increasing
Average Time to Close (ATC), but probably not a leading indicator
for decreasing Average Time to Close (ATC). That is, as the backlog
increases one can expect that Average Time to Close (ATC) will
begin to increase because calls in the backlog are in fact aging
while sitting in the backlog. Backlog decrease is generally not
a leading indicator for reduced Average Time to Close (ATC) because
with decreasing backlog, staff will tend to stretch out the time
it takes to work the call, unless their capacity to develop meaningful
work beyond simply responding to calls has been developed by
management. This is along the lines of work expanding to fill
the time available. As the backlog declines Average Time to Close
(ATC) will tend to remain in line with what is considered to
be the acceptable Average Time to Close (ATC).
- [Graphs omitted: backlog trends by call type]
- The sample backlog graphs above indicate that the backlog
for Type 1 calls is tending to increase. Something that should
be looked into.
- Staff - staff are the resources the organization has available
to respond to the activity in an attempt to produce results. Staff
can be a leading indicator for both Average Time to Close (ATC)
and Backlog in many instances. If staff are decreasing both Backlog
and Average Time to Close (ATC) are likely to increase soon after
- unless there was available capacity in the organization prior
to the reduction of staff.
- [Graph omitted: staff, applied rate, and effective staff]
- The above chart shows Staff, Applied Rate, and Effective
Staff as a result of the applied rate. With a reduction in Staff
and a relatively constant applied rate the effective staff declines.
This should show up shortly in terms of increased Backlog and
Average Time to Close (ATC).
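- The effective-staff relationship in the chart reduces to a single multiplication; the headcount and applied rate below are hypothetical.

```python
def effective_staff(headcount, applied_rate):
    """Effective staff: headcount scaled by the fraction of time applied to calls."""
    return headcount * applied_rate

# 20 people at a 75% applied rate yield 15 effective staff
eff = effective_staff(20, 0.75)  # 15.0
```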
- Productivity - productivity is a measure of the call volume
resolved relative to the staff required to resolve it. Productivity
is a bit difficult to calculate for a specific time because calls
are closed over a period of time. What makes more sense is to
average the calls closed over a week or so and divide that by
the average staff available for the same time period. This will
provide a sense of the average productivity over a period of time.
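- Reading the averaging above as closes per staff member per period, the calculation might look like this; all weekly figures are hypothetical.

```python
weekly_closes = [180, 200, 190, 210]  # hypothetical calls closed each week
weekly_staff = [15, 15, 14, 16]       # hypothetical staff available each week

avg_closes = sum(weekly_closes) / len(weekly_closes)  # 195.0
avg_staff = sum(weekly_staff) / len(weekly_staff)     # 15.0
productivity = avg_closes / avg_staff                 # 13.0 closes per person per week
```

Averaging over several weeks smooths out the fact that individual calls open in one period and close in another.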
- Participation Rate - participation rate is the percentage
of calls resolved using SolutionBuilder. This should be measured
both in terms of the extent to which SolutionBuilder is being
used in the workflow to solve problems and the extent to which
calls are closed with a solution linked to them.
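- A minimal sketch of the second sense of participation, the share of closed calls with a solution linked; the records are invented for illustration.

```python
# Hypothetical closed-call records: (call_id, solution_linked)
closed_calls = [(1, True), (2, True), (3, False), (4, True), (5, False)]

linked = sum(1 for _, has_solution in closed_calls if has_solution)
participation = 100.0 * linked / len(closed_calls)  # 60.0 percent
```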
- [Graph omitted: participation rate]
- There is probably no area of Solution-Centered Support which
has been the center of as much controversy as participation rate,
and what should one expect it to be. There are only two valid
answers of for this, i.e., it should be either 0% of 100%. Either
the organization is not pursuing Solution-Centered Support, which
would warrant and 0% participation, or it is pursuing Solution-Centered
Support and the target should be 100% participation. I know this
is apt to meet with immediate argument, yet what is the engineers
responsibility? Isn't it "To solve the customers problem
in such a way that others can capitalize on what they learned
during the interaction?" Engineers seem to have endless
excuses for not accepting their responsibility within the context
of Solution-Centered Support. Some of the most often expressed
excuses are:
- It takes too long - every time we have investigated
this excuse it has been found that it does in fact take too long
- because the engineer was either waiting till after they solved
the problem to access SolutionBuilder, or they simply couldn't
type, or they were searching for content incorrectly. The intent
is for engineers to capture the customer's experience in the workflow
while they are solving the problem. This enables them to solve
the problem with the benefit of what everyone knows rather than
relying on just what they know. If an engineer doesn't create
a solution for a problem when one is warranted they waste everyone's
time as the problem has to be solved over and over. If the engineer
accepts their responsibility, even if it did take longer, it
would be an appropriate action as it would be beneficial to the
group overall.
- I can't find what I'm looking for - this is generally
due to looking for the answer after the fact rather than capturing
the customer's experience in the workflow. Searching with a very
thin context generally produces too many answers
to choose from.
- I already know the answer - even if the engineer already
knows the answer their intent should be to ensure that the solution
is in the database and see if there is a way they can improve
on it based on what they just learned from their interaction
with the customer.
- It's already in the database - even if the solution
is in the database, was it as easy to find as it could be, and
is there something that can be done to make it more findable
or more complete based on what was learned from this customer
interaction?
- Solutions Created and Reused - solutions created and
reused essentially represents the two components of participation
rate. These should be tracked against the call volume with the
intent of determining how the relationship between created and
reused is changing over time. In a stable environment, i.e.,
without the release of new products, etc., creation should decline
over time as reuse increases. When new products are released,
you should be able to see the correlation in solutions created.
If product groups are working to seed the database with product
solutions prior to new product fielding, create rates will also
increase dramatically.
- [Graph omitted: solutions created and reused vs. call volume]
- The above graph shows the sum of create and reuse plotted
against call volume so the participation rate is readily apparent.
- Solutions In Progress - solutions in progress essentially
represents solutions which have not yet been completed. Solutions
in progress should be representative of the current call backlog
if an organization is doing its housekeeping in a timely manner.
- [Graph omitted: actual vs. expected solutions in progress]
- The above diagram shows that the actual solutions in progress
(SIP-Act) is greater than the expected solutions in progress
(SIP-Exp) and the organization should be paying better attention
to its housekeeping. The expected solutions in progress is computed
from the backlog times the open participation rate.
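- The expected solutions-in-progress computation stated above is a single multiplication; the backlog, rate, and actual SIP values below are hypothetical.

```python
def expected_sip(backlog, open_participation_rate):
    """Expected solutions in progress: backlog times the open participation rate."""
    return backlog * open_participation_rate

sip_exp = expected_sip(250, 0.8)       # 200.0 expected
sip_act = 240                          # hypothetical actual solutions in progress
housekeeping_gap = sip_act - sip_exp   # 40.0 solutions overdue for housekeeping
```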
- Solution Rework - each organization generally has
a set of solution status codes which represent solution rework.
These codes might represent initially completed solutions waiting
for review, or reviewed solutions on which someone has raised
a question and which need to be clarified.
- [Graph omitted: solution rework backlog]
- A growing backlog of unreviewed solutions indicates that
reviewers are not keeping up with their task. A growing backlog
of solutions in clarify status would indicate that individual
engineers are not adequately managing their content.
- Solutions Complete - completed solutions are generally
in two categories, public and internal. One should be concerned
that these two categories are maintaining an appropriate ratio.
The organization should strive to ensure the greatest portion
of their solutions are in public status so they may be accessed
by customers.
- [Graph omitted: public vs. internal completed solutions]
- The above graph indicates that internal solutions are growing
faster than public, a situation which should be looked into.
- Solution Quality - solution quality ratings should
be developed for the solution set by sampling the content on
a daily basis. Individual solution quality ratings should also
be assigned to the individual who created the solution so what
is being developed is a profile of the overall quality of the
solution set as well as an indication of an individual's ability
to create quality content. Solutions identified during the quality
review that warrant clarification should be placed back in a
clarify status for the owner to review.
- [Graph omitted: daily solution quality ratings with moving average]
- The above graph includes a moving average as well as the
individual daily quality evaluations.
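- A trailing moving average like the one plotted can be computed as follows; the daily quality ratings below are invented for illustration.

```python
def moving_average(values, window=7):
    """Trailing moving average; windows are shorter at the start of the series."""
    result = []
    for i in range(len(values)):
        window_vals = values[max(0, i - window + 1):i + 1]
        result.append(sum(window_vals) / len(window_vals))
    return result

daily_quality = [3.0, 4.0, 5.0, 4.0, 4.0]  # hypothetical daily ratings
ma = moving_average(daily_quality, window=3)
# ma[0] is just the first rating; ma[2] is the first full 3-day average, 4.0
```

The moving average damps day-to-day noise so the underlying quality trend is visible.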
- Team Profile - each team should be profiled as to
the resources available to participate in the response the team
creates. This profile should provide an indication of the competency
levels of the individuals as this presents a perspective as to
the capacity of the team itself. Categories that seem to make
sense are:
- Entry - a new person in the group who has not yet
demonstrated a level of process, technical, or content development
competency. Individuals at this level should be provided limited
privileges, such as an ability to modify their own solutions
only.
- Intermediate - a person who has demonstrated process
and content development competency, yet may have a ways to go
to demonstrate technical competency. These individuals should
have privileges to modify all solutions.
- Reviewer - a person who has demonstrated technical
competency, as well as the competencies to reach the intermediate
level. These individuals should be able to review solutions created
by new people and offer feedback. They should also be able to
delete solutions.
- Coach - a person who has reached the reviewer level
and demonstrated the interpersonal skills necessary to offer
coaching to develop the skill levels of other individuals within
the group.
- [Chart omitted: team profile by competency level]
- The above chart indicates that team profile is something
more appropriately tracked weekly rather than daily, and it also
serves to indicate the total number of resources a team has
available to produce a response.
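- The competency levels and their privileges might be encoded as a simple mapping; the privilege names and team roster below are illustrative, not from the text.

```python
# Privileges suggested for each competency level (names are hypothetical).
PRIVILEGES = {
    "Entry":        {"modify_own"},
    "Intermediate": {"modify_own", "modify_all"},
    "Reviewer":     {"modify_own", "modify_all", "review", "delete"},
    "Coach":        {"modify_own", "modify_all", "review", "delete", "coach"},
}

def can(member_level, privilege):
    """True if a member at this competency level holds the given privilege."""
    return privilege in PRIVILEGES[member_level]

# Hypothetical team roster, counted into a team profile.
team = ["Entry", "Entry", "Intermediate", "Reviewer", "Coach"]
profile = {level: team.count(level) for level in PRIVILEGES}
# {'Entry': 2, 'Intermediate': 1, 'Reviewer': 1, 'Coach': 1}
```

Counting the roster by level gives exactly the weekly team-profile numbers the chart would track.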
- Results - results are the end
product of the organization's response to activity. Regardless
of what the organization is endeavoring to accomplish it is the
results it produces which finally gets it there. And just in
case there's a question about the ordering of the 3 items in
this section, the ordering was intentional. Employee satisfaction
comes first. This is based on the premise that employees are
seldom likely to treat customers any better than the organization
treats its employees.
- Employee Satisfaction - employee satisfaction is most
easily determined via local group survey. The survey needs to
be done often enough for the findings to be acted upon in such
a way that the individuals providing feedback are able to see
that their input actually has an impact on the operation. Simply
asking for input creates an expectation on the part of those
providing the input. And if this input is not acted upon
in a visible manner, input will decline in the future.
- Input should be solicited in 4 dimensions, with an overall
evaluation of satisfaction provided by each respondent.
- What are we doing well that we should keep doing?
- What are we doing that we need to improve the way we are doing it?
- What are we not doing that we should start doing because it would be beneficial?
- What are we doing that we should stop doing because it is not providing an appropriate return on effort?
- This input should be categorized in terms of what is within
the group's purview to alter and what aspects are outside the
group and need to be addressed by a higher authority.
- [Graph omitted: weekly group satisfaction with moving average]
- The above graph, which is probably most appropriately done
weekly, depicts the overall group level of satisfaction, along
with a moving average trend line.
- Customer Satisfaction - there are basically only two
ways to assess customer satisfaction, one effective and one
ineffective. Most organizations do periodic, e.g., quarterly,
semi-annually, or annually, surveys of their entire customer
base to assess customer satisfaction levels. Although this might
produce information which makes the organization feel good, it
is the ineffective way of determining customer satisfaction.
Surveying customer satisfaction, if it is to be of any real use,
must be done close enough in time to the service provided and
discrete enough so the response can be used to influence the
actual activity responsible for producing the existing level
of customer satisfaction.
- [Graph omitted: customer satisfaction samples]
- The customer satisfaction sampling should be used to determine
the overall customer satisfaction for the group as well as the
customer satisfaction being produced by each individual responsible
for the response to the customer that was sampled. If the trends
are not positive then action should be taken to correct the activity
which is not producing the appropriate results.
- Cost per Resolution - cost per resolution is determined
from the combined cost of operation, i.e., resource cost, benefits,
and facilities cost, divided by the number of resolutions per
time period. If the organization is serious about improving this
result, and it should be, then the appropriate time frame for
tracking is weekly.
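- The cost-per-resolution formula above reduces to a division of combined weekly costs by weekly resolutions; all cost figures below are hypothetical.

```python
def cost_per_resolution(resource_cost, benefits_cost, facilities_cost, resolutions):
    """Combined weekly operating cost divided by resolutions for the same week."""
    return (resource_cost + benefits_cost + facilities_cost) / resolutions

# e.g., $45,000 total weekly cost across 900 resolutions
cpr = cost_per_resolution(30000.0, 9000.0, 6000.0, 900)  # 50.0 per resolution
```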
The comment most often made in light of all that has been presented
above is, "That's a lot of work. When will I get any work
done?" Well, for a manager, this is their work. Assessing
the performance of the group and acting in a manner which continually
improves the results of the group is the job of the manager. If
they believe their responsibility is something else then they
should discharge that responsibility with some other organization,
because they're not benefiting their current organization in a
manner consistent with what their responsibility should be.
The second question is what to do with all the information.
All of the information should be used as feedback to the group
so they know what they are accomplishing and what it's costing
to produce that result. Yet one must be very careful as was mentioned
at the beginning of this paper. Numbers are a double-edged sword,
and without appropriate care a manager will quickly fall on the
pointed end of this sword if they attempt to wield it inappropriately.
If the numbers are not trending in a desired direction the numbers
cannot become the focus. One must focus on the behaviors within
the group that are responsible for producing the numbers. The
numbers are not the behavior, they are just indicators. If the
numbers are trending in a desired direction it's not time to become
complacent. It is time to stick to one's knitting so to speak.
Continual attention to behaviors is essential to ensure that
numbers trending in the desired direction continue to do so.
With regard to measures it might be said,
"And that is that!"
theWay of Systems
Copyright © 2004 Gene Bellinger