EVM and TPM are Project Waste

I think Earned Value Management (EVM) and Technical Performance Management (TPM) are project waste.

Investors, stakeholders and end-users want to know if their investment will:

  • be delivered as specified;
  • be delivered when requested;
  • have a contained cost of acquisition;
  • always work; and
  • generate recurring return on investment.

Not enough people ask about the cost of ownership, but since many people don't seem to grasp that pursuing a low cost of acquisition often compromises long-term cost of ownership, we'll save that for another conversation. Regardless, we can all agree that if we're an employee or consultant spending someone else's money, we're hired to deliver while being a good financial steward. If we spend our own money, even more so.

So why is the money being wasted in software projects calculating EVM and TPM?

Determining Value

  • Market analysis will tell you if your idea has any consumer value, whether product or feature
  • If market analysis cannot tell you this with useful clarity, then you need to test the market with a minimum viable feature (MVF) or product (MVP)
  • Neither effort is a place for EVM or TPM
  • You simply need working software and an ability to empirically understand (e.g. via analytics harnesses) how the consumer does or does not interact with the test, letting raw data teach you where to spend money
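
An analytics harness for this purpose can start very small. As one hedged sketch (the event names and fields here are hypothetical illustrations, not a prescribed schema), recording raw interaction events and tallying them lets the data, not EVM math, say where to spend:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    user_id: str
    action: str  # e.g. "viewed", "clicked", "abandoned" -- illustrative names

def summarize(events):
    """Tally how consumers do or do not interact with the MVF under test."""
    return {
        "unique_users": len({e.user_id for e in events}),
        "actions": dict(Counter(e.action for e in events)),
    }

# A few invented interaction events from a market test
events = [
    Event("u1", "viewed"), Event("u1", "clicked"),
    Event("u2", "viewed"), Event("u2", "abandoned"),
]
print(summarize(events))
```

Real harnesses add timestamps, sessions and funnels, but the principle stays the same: the raw counts of what consumers actually did are the value signal.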

Specifying an MV(F/P)

  • This is a much larger conversation well covered in other treatises
  • Based upon your tests, decompose the MVP into MVFs (aka epics) and order them according to MoSCoW or some other talisman of the trade
  • After you have a "Must" list (which is, in effect, an assignment of value), it must be prioritized (yes, they all "must" allegedly happen, but not at the same time; so there must still exist an order to the madness)
  • We need working, tested, demonstrable software, not EVM and TPM calculations

Decomposing, Sizing and Prioritizing

  • After the "must" list is prioritized, decompose the epics into deliverable, automated testable statements (stories and acceptance criteria) using role-based stakeholders
  • Size each story (or acceptance criteria therein) using your estimation model of choice
  • Re-prioritize the epics and stories within using your new information validating value and asserting cost associated to points
  • Again, this is no place for EVM or TPM
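
To make the re-prioritization step concrete, here is a minimal sketch; the `value` and `points` fields are assumptions for illustration, not a prescribed schema. Ordering by value per point lets the new sizing information reshuffle the backlog:

```python
# Hypothetical sized stories: "value" from your market tests, "points" from
# your estimation model of choice.
stories = [
    {"name": "login",  "value": 8, "points": 5},
    {"name": "export", "value": 3, "points": 8},
    {"name": "search", "value": 8, "points": 3},
]

# Highest value per point first: cheap, valuable work rises to the top.
ordered = sorted(stories, key=lambda s: s["value"] / s["points"], reverse=True)
print([s["name"] for s in ordered])
```

The ratio is one possible tie-breaker among many; the point is that prioritization is driven by validated value against asserted cost, with no EVM or TPM anywhere in the arithmetic.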

Delivering

  • In Extreme Programming, predictable, repeatable two-week iterations are preferred so feedback arrives now, not downstream
  • However, using continuous delivery behaviours, we can know the state of the software multiple times per hour, removing any possibility of value contribution by EVM and TPM math
  • Based upon story points and capacity, each person makes a commitment and delivers on the commitment
  • Iteration planning, daily stand-ups, iteration demo (interact with the end-user, get feedback now, change now), iteration retrospective, repeat
  • Planned-versus-actual burns occur daily and per iteration, per epic and per story, visibly and transparently, and tell us about our plans and actuals
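
The planned-versus-actual burn above is plain arithmetic, not EVM math. A minimal sketch (the point totals and day counts are invented for illustration):

```python
def planned_burn(total_points, days):
    """Ideal remaining points at the end of each day of the iteration."""
    per_day = total_points / days
    return [round(total_points - per_day * d, 1) for d in range(1, days + 1)]

def actual_burn(total_points, completed_per_day):
    """Actual remaining points after each completed day so far."""
    remaining = total_points
    burn = []
    for done in completed_per_day:
        remaining -= done
        burn.append(remaining)
    return burn

plan = planned_burn(40, 10)                 # 40 committed points, 10 days
actual = actual_burn(40, [3, 5, 0, 6, 4])   # points completed each day so far
print(plan)
print(actual)
```

Plotted daily on a wall or dashboard, these two lines make the team's state visible to anyone who walks by, which is precisely the transparency EVM claims to compute in arrears.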

Continually Deliver while Proving Completeness and Correctness

  • During development, automate the acceptance test statements within each story, which may include role-based attributes of both positive and negative statements (DO, DO NOT)
  • During development, automate the unit testable elements
  • Using continuous delivery, continuously integrate, test, inspect, deploy
  • Integrate your change management, version control and continuous deploy tools for automated traceability, auditability and reportability
  • EVM and TPM serve no valuable purpose here
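
As a hedged illustration of automating a role-based DO / DO NOT acceptance statement (the `RecordStore` class and the role names are hypothetical, not drawn from any specific story), both the positive and the negative case become executable checks:

```python
# Hypothetical story acceptance criteria:
#   Admins DO delete records. Viewers DO NOT.
class RecordStore:
    def __init__(self):
        self.records = {"r1": "data"}

    def delete(self, record_id, role):
        if role != "admin":
            raise PermissionError(f"role '{role}' may not delete records")
        del self.records[record_id]

def test_admin_does_delete():
    store = RecordStore()
    store.delete("r1", role="admin")
    assert "r1" not in store.records

def test_viewer_does_not_delete():
    store = RecordStore()
    try:
        store.delete("r1", role="viewer")
        assert False, "viewer deletion should have been rejected"
    except PermissionError:
        assert "r1" in store.records  # the record survives the attempt

test_admin_does_delete()
test_viewer_does_not_delete()
print("acceptance checks passed")
```

Run on every commit in the pipeline, checks like these prove completeness and correctness continuously, rather than asserting earned value after the fact.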

Deliver to production, continually. Verify continually. Report continually. In other words, the method by which we determined what to deliver, and then delivered it, becomes the method by which production software is maintained and evolved. When we are building software, we are building the framework to manage the project, product, feature, process, teams, money and timeline.

EVM and TPM were originally created to help determine qualitative value in spending. In today's software world, they cannot be used to determine market demand, test consumer response, identify, prioritize, decompose, estimate, reprioritize or continuously deliver software. Basically, while EVM and TPM claim to determine qualitative value in arrears, the software delivery model referenced above actually delivers it in real time, on far more dimensions than EVM and TPM math can muster.

Why spend money calculating something that doesn't actually contribute to generating revenue? I look forward to hearing an answer.

Paying in Advance

Insurance companies tend to pay for things after they are already an issue.

Of course, there is no guarantee they will do so. They reserve the right to say no. So you could discover an issue after it is already a problem and be declined for coverage because it doesn't fall into the correct conditional context, even if it was originally alleged to be covered.

The only thing insurance companies guarantee is a required monthly invoice. They send it, the customer pays it.

As it relates to insurance, the customer must pay no matter what. There is no guarantee that anything ever paid will actually benefit the customer. Money in; no guarantee of ROI for the customer.

Software testing at the end of a delivery pipeline is exactly the same game.

There is no guarantee that end of pipeline testers will find anything useful. In fact, even if the end of pipeline test team communicates "this is what we test" and "this is what we do not test", there is no useful guarantee that either will happen as spoken. To test does not mean to find. And to find does not imply useful discovery.

The only thing end of lifecycle test teams guarantee is a required monthly or bi-monthly invoice. They exist, the employer pays.

As it relates to end of lifecycle test teams and efforts, the employer must pay no matter what. There is no guarantee that anything ever paid will actually benefit the employer. Money in; no guarantee of ROI for the employer.

Both ideas are risk-mitigation tactics or "just in case" choices. Wouldn't it be great if there were things we could do to prevent, mitigate or otherwise minimize the probability of downstream issues?

This isn't a dialogue about proper health habits. Although, like all of life, daily healthy choices impact software engineering and leadership effectiveness.

So how do we change the cost and risk model of paying for something at the end of a lifecycle that may or may not provide timely value?

For some people this is known. For others, this is brand new knowledge. Start with the basics and evolve from there.

Like proper diet and exercise behaviours, it requires context-driven balance, exploration and committed cadence. And yes, there's more to do after that. However, if you're trying to understand why you're paying all this money and getting nothing discernibly useful out of it other than "risk mitigation", stuff that might possibly someday save you, consider changing the model. Change your cost model by changing your risk exposure model. Prevent it now, find it now.

After all, who likes the idea of paying all that money monthly and finding out, when you needed it, you weren't covered?

If you don't understand these fundamentals, advanced topics like evolutionary design will evade you.

Is it any wonder?

Is it any wonder that many software programmers never become masters of their craft?

A young person:

  • graduates from university wanting to be a programmer
  • is hired at a company that espouses love for programmers
  • is assigned to a project and excited to create software solutions
  • is assigned work by someone else who already did the thinking
  • is asked to estimate how long to finish the work
  • does the work and is wrong about the estimate
  • feels like he failed

After this initial failure, the young person wonders how everyone views the new hire, fresh out of university. Naive? Young? Not useful? A bad hire? And the young person is now diverted from thinking about, knowing and perfecting his craft to ensuring that other people perceive him as a good hire, a useful contributor and someone they want to keep.

Another project comes around and the young programmer is again assigned as a team member. This time being a little wiser, the young programmer:

  • asks to participate in the design of the system
  • asks to be involved in some interesting parts of the development
  • is assigned work by someone else again
  • is asked how long it will take to get the job done
  • does the work and is wrong about the estimate
  • realizes that the definition of done is different from the assigned work
  • is told he needs to get better at estimating

Now the lightly seasoned programmer is wondering what he overlooked in university and what he may be missing, and is concerned that his very first performance reviews will be mediocre at best. He is no longer thinking about how to be a good programmer; now he's trying to figure out what is making his estimates wrong and how to stay employed.

Rinse. Repeat.

Fast forward ten years: our fresh-out-of-university programmer is now seasoned, if not actually jaded. Our now-seasoned programmer knows that no one is really interested in how much or how little software is written, how clever the algorithms or complex the integrations, how high the quality of the software, its reliability or extensibility, or whether he used industry standards, decreased long-term cost of ownership or eliminated hand-rolled vendor lock-in solutions. The only things anyone ever asks of the programmer are "how long will it take" and "why didn't you get it done in the time you said it would take". The seasoned programmer has learned that the key to getting promotions, staying employed and being valued is measured in how well he meets his estimates (accuracy), how agreeable he is when someone else does the thinking and hands off the actual work (good team member) and whether he participates in corporate events (corporate citizenship).

As a result of his professional experience, our seasoned software programmer pursues successful navigation of his corporate culture, agnostic of craft. No one is asking him about his craft; only whether he's part of the solution or part of the problem.

People who choose software programming as a profession need to recognize the game board onto which they are stepping. Project Managers are assigned to 'control and manage' the project according to time, scope and budget even though they may have no idea what it actually takes to deliver the scope beyond the budget and time estimates preceding project start. Ladder-climbing managers are assigned to make sure developers develop, testers test, requirements people generate requirements, time is entered weekly and people show up to meetings. Architects construct architectural directions, designs and directives that have nothing to do with solving real business problems, substitute hand-rolled standards for industry standards and encourage technical staff to do as they're told rather than perfecting their craft and increasing value to the business.

Without the software, there is no system, solution, product, revenue or company. Without the software programmer, there is no software. Ironically, everyone who is not a software programmer will attempt to manage the software programmers to their own version of productivity and perceived value.

Dear Software Professional,

You stand in the midst of people. People who have personal identity, lives and responsibilities before they get to work. People who understandably bring their personal stuff with them to work. People who have professional desires and struggle to achieve them daily. People who are developing their own personal and professional world views, at the same time, in real-time, while you're present and involuntarily involved. People who are very often defining their own value by measuring their contributions against yours and those of everyone else.

No one is going to ask you to perfect your craft. They're busy trying to discover and/or perfect their own. All of these other roles want you to be a craftsman. That is why you were hired. However, they may not know to ask for craftsmanship or even understand what it really means. And if it doesn't look like you know your craft, or that you are pursuing some form of craftsmanship, they'll step in and insert their own definition and method of done.

It is up to you to answer these questions for yourself.

  1. What is your craft?
  2. Are you a master of your craft?
    • If no, what are you doing about it?
    • If yes, how do you know?
  3. Why were you hired?

After you figure this out for yourself, next is figuring out how to be a team member who pursues his craft in an environment that may not naturally enable such a feat. Let's party. Shall we?