Goals, Mission, and Purpose

Whenever you have trouble getting up in the morning, remind yourself that you’ve been made by nature for the purpose of working with others, whereas even unthinking animals share sleeping. And it’s our own natural purpose that is more fitting and more satisfying. – Marcus Aurelius, Meditations, 8.12

When I was much younger I had an obsession with defining and achieving goals. I’d codified my approach into an unpublished workbook titled “The Goal Mapping Process.” There was nothing particularly unique about this process. All it really accomplished was laying out a method for breaking goals down into achievable tasks and then reassembling them into the larger goal. It was very tactical and it worked. At least it did for me, and perhaps that’s why I never published it. Working out the method within a frame of something that others might see forced a level of rigor that I might not have otherwise applied. In the end, it was just another way of getting things done.

Later in life the realization that completing goals had an element of dissatisfaction came into focus. Achieving goals, even big goals, wasn’t enough. The question of “What next?” frequently presented a blank slate. Figuring out how to achieve the goal kept me busy, but there was rarely any thought about what came after the goal. Or more importantly, what the underlying purpose of the goal was in the first place. What I learned was that a goal in and of itself, while often necessary, wasn’t as important to my overall satisfaction with life as the purpose or mission behind the goal. Goals are destinations. Mission and purpose are journeys.

This realization is perhaps twenty-five or more years old. It turns out, defining goals and breaking them down into their tactical pieces is relatively easy. Defining an underlying purpose that makes identifying the associated goals possible is harder. After twenty-five years I believe I have worked out a purpose and mission that have been fairly stable for the past five years.

My mission and purpose were influenced by the story of a woman named Janet. She died in 2005 at the age of 51. For ten years preceding her death she had been fighting breast cancer. For most of that time, her diagnosis was “terminal.” The battle statistics are staggering: 55 chemotherapy treatments, many of them high dose; 33 radiation treatments; 4 major surgeries; and uncounted doctor’s appointments. This and so much more is what it took to stretch a two-year survival prognosis into ten.

I know Janet’s story because I was with her for every one of her chemotherapy treatments, the recovery after, and for each of her surgeries.

I know Janet’s story because she died in my arms.

I know Janet’s story because she was my wife.

I taught her how to search for and read research articles using Usenet and the nascent World Wide Web. While I was working two jobs, Janet was searching these and many other resources for anything that might suggest viable treatment options. This effort is worthy of its own post, but does not factor so prominently in my purpose and mission. What does is something we experienced during this process of research.

Due to our heightened interest, news stories that claimed to have some angle on a “cure for cancer” caught our attention. Whenever we heard such news bites, we’d eagerly take note and then work to chase down the details. Invariably, they would end in disappointment. The news had hyper-inflated the claims of the researchers, often to the chagrin of the researchers themselves. We learned to tune out these news stories (and eventually, the news altogether).

I can recall many times during Janet’s cancer battle when I thought of these researchers. Indeed, of all the people working to solve the cancer conundrum. While Janet slept, I’d watch the milliliters slowly drip from the IV bag during the hours it would take for her chemotherapy treatments to run their course. I’d imagine dedicated individuals working long hours to solve chemical problems or design devices that would eventually replace the barbaric “suicide/salvage” strategy of contemporary chemotherapy. These were often moments of despair and feelings of extreme isolation. We were on the dark side of the moon, hoping for signals that would show the way across the cancer cure threshold and bring us home.

In the end, they never came and Janet lost the battle.

I haven’t stopped thinking about the people who work to find a cure. Fifteen years later I find myself on the sunny side of Earth and in a position to help those working to solve the cancer conundrum. And I have to say, it isn’t how I imagined it would be.

There are certainly those who work long hours with a dedication that is both inspiring and humbling. But for the most part, there are people doing what people do – complaining, fighting for turf, lashing out over imagined offenses, scratching for more pay, finding ways to game the system, sinking to the lowest expected level of effort, defensive and afraid to correct bad behavior, perpetuating bad habits, blissfully unaware of cognitive biases that adversely affect their work, unaware of yet aggressively protective of their own limitations. It’s a lengthy list.

It is, as they say, a target rich environment for applying Agile principles and practices. The room for improvement pretty much matches the amount of space between here and the dark side of the moon.

One of the primary motivational devices in this environment is the success story. And well it should be. These stories are VERY moving, and it’s impossible for me to see and hear the story of someone making it across the cancer cure threshold and not shed tears. For myself, there are also many untold stories which are similarly motivating and bring me to tears. These are the stories of those who did not make it across the cancer cure threshold but fought, like Janet, with everything they had while hoping a cure would be found before they lost the battle. The stories of the people who were fortunate to have been cured are examples of what we are trying to achieve. The stories of people who were not so fortunate are examples of why we need to find the most effective way possible for working together.

This is my purpose and my mission: Build teams that are communicating clearly and effectively, teams that understand both the value and limitations of diversity and inclusion, teams that are capable of uniting on well-reasoned goals, teams composed of compassionate individuals who are tirelessly seeking to understand themselves within the wider context and the longer view. Today, Agile principles and practices offer the greatest promise for fulfilling this purpose and achieving this mission. When something more effective emerges, I shall adapt accordingly.

Here’s to moving into 2020 with mind and eyes wide open.

How To Run an Agile Death March

Found on the Internet…

An experienced scrum master describes their work cycles as going “from being very busy during sprint end/start weeks to be [sic] very bored.” While this scrum master works very hard to fill in the gaps by holding 1:1s with team members and providing regular training opportunities, they nonetheless ask the question, “Does anyone have any suggestions of things I am maybe not doing that I should be doing?” One response included the following:

“Now, it could be that you have worked to create a hyper-performing team and there is no further room for improvement. A measure of this is that velocity (or similar metric) has increased by an order of magnitude in the last year.

However, the most likely scenario is that you and your team have become ‘comfortable’ and velocity has not increased significantly in the last few Sprints and/or there is a high variance in velocity.”

This reflects a common misunderstanding of “velocity” and its confusion with “acceleration.” (It also reflects the “more is better” and “winners vs losers” thinking derived from the scrum sports metaphor and points as a way of keeping score. I’ve written about that elsewhere.) Nor does the commenter seem to understand what “order of magnitude” implies. A velocity that increases by an order of magnitude in a year isn’t a velocity at all; it’s an acceleration. That’s a bad thing. This wouldn’t be a “hyper-performing” team. This would be a team headed for a crash, as a continual acceleration in story points completed is untenable. More and more points each sprint isn’t the goal of scrum. A product owner cannot predict when their team might complete a feature or a project if the delivery of work is accelerating throughout the project.
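To put a rough number on just how untenable that is, here is a back-of-the-envelope sketch in Python. The sprint cadence and starting velocity are assumptions of mine for illustration, not figures from the original exchange.

```python
# Rough arithmetic: what a 10x ("order of magnitude") velocity increase
# over a year would require. Two-week sprints (26 per year) and a starting
# velocity of 20 points are illustrative assumptions, not real data.

sprints_per_year = 26
target_growth = 10            # an order of magnitude
starting_velocity = 20        # story points per sprint (hypothetical)

# Per-sprint compounding growth factor needed to reach 10x within a year.
per_sprint_factor = target_growth ** (1 / sprints_per_year)
print(f"Required growth per sprint: {(per_sprint_factor - 1) * 100:.1f}%")
# => roughly 9% more points completed every single sprint

final_velocity = starting_velocity * target_growth
print(f"Velocity after a year: {final_velocity} points per sprint")
# A team that started at 20 points would need to finish 200 points per
# sprint by year's end. That is a sustained acceleration, not a velocity.
```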

Assuming a typical project, something that continues for a year or more, the team and the project will eventually crash as they’ve been pressured to work more and more hours and cut more and more corners in the interests of completing more and more points. The accumulation of bugs, small and large, will slow progress. Team fatigue will increase and morale will decrease, resulting in turnover and further delays. In common parlance, this is referred to as a “death march.”

Strictly speaking, velocity is some displacement over time. In the case of scrum, it is the number of story points completed in a sprint. We’ve “displaced” some number of story points from being “not done” to “done.” By itself, a single sprint’s velocity isn’t particularly useful. Looking at the velocity of a number of successive sprints, however, is useful. Two pieces of information from successive sprint velocities, when considered together, can reveal how well a team is performing. The first is the average over the previous 5 to 8 sprints, a rolling average. As a yardstick, this can provide a measure of predictability. Using this average, a product owner can make a rough calculation of how many sprints remain before components or the project are complete, based on the story point information in the product backlog.
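As a minimal sketch of that calculation, here is what it might look like in Python. The velocity history and backlog size below are invented for illustration; they don’t come from a real team.

```python
import math
from statistics import mean

# Invented numbers for illustration only.
velocities = [18, 22, 21, 19, 23, 20, 22, 21]   # points completed in recent sprints
window = 6                                      # rolling window of the last 5-8 sprints
remaining_backlog_points = 130                  # points still in the product backlog

# Rolling average over the most recent sprints.
rolling_average = mean(velocities[-window:])

# Rough forecast: how many sprints of work remain at the current pace.
sprints_remaining = math.ceil(remaining_backlog_points / rolling_average)

print(f"Rolling average velocity: {rolling_average:.1f} points/sprint")
print(f"Estimated sprints remaining: {sprints_remaining}")
```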

The measure of confidence for this prediction would come from an analysis of the variance demonstrated in the sprint velocity values over time. Figures 1 and 2 show the distinction between the value provided by a rolling average and the value provided by the variance in values over time.

Figure 1

Figure 2

In both cases the respective teams have an average velocity of 21 points per sprint. However, the variability in the values over time shows that the team in Figure 1 would have a much higher level of confidence in any predictions based on their past performance than the team shown in Figure 2.
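A small sketch of that comparison, using two invented velocity histories that both average 21 points per sprint (standing in for the teams in Figures 1 and 2):

```python
from statistics import mean, stdev

# Both series are made up so that they average 21 points per sprint;
# only the spread differs.
team_1 = [20, 22, 21, 20, 22, 21, 21, 21]   # low variance (like Figure 1)
team_2 = [8, 34, 15, 30, 12, 33, 18, 18]    # high variance (like Figure 2)

for name, velocities in (("Team 1", team_1), ("Team 2", team_2)):
    avg = mean(velocities)
    sd = stdev(velocities)
    cv = sd / avg   # coefficient of variation: spread relative to the average
    print(f"{name}: average = {avg:.1f}, std dev = {sd:.1f}, variation = {cv:.0%}")

# Team 1's narrow spread means a forecast based on its average is far more
# trustworthy than the same forecast made from Team 2's history.
```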

What matters is the trend, each sprint’s velocity over a number of sprints. The steady completion of story points (i.e. work) sprint to sprint is the desirable goal. Another way to say this is that a steady velocity makes it possible to predict project delivery dates. In real life, there will be a variance (up and down) of sprint velocity over time and the goal is to guide the project such that this variance is within a manageable range.

If a team were to set as its goal an increase in the number of story points completed from sprint to sprint then their performance chart might initially look like Figure 3.

Figure 3

Such a pace is unsustainable and eventually the team burns out. Fatigue, decreased morale, and overall dissatisfaction with the project cause team members to quit and progress grinds to a halt. The fallout of such a collapse is likely to include the buildup of significant technical debt and code errors, as the run-up to the crescendo forced team members to cut corners, take shortcuts, and otherwise compromise the quality of their effort. [1] The resulting performance chart would look something like Figure 4.

Figure 4

All that said, I grant that there is merit in coaching teams to make reasonable improvements in their overall sprint performance. An increase in the overall average velocity might be one way to measure this. However, to press the team into achieving an order of magnitude increase in performance is a fool’s errand, more than likely to end in disaster for the team and the project.

References

[1] Lyneis, J. M., & Ford, D. N. (2007). System dynamics applied to project management: A survey, assessment, and directions for future research. System Dynamics Review, 23(2/3), 157–189.

Relative Team Expertise and Story Sizing

In Parkinson’s Law of Triviality and Story Sizing, I touched on the issue of relative expertise among team members during collaborative efforts to size story cards. I’d like to expand on that idea by considering several types of team compositions.

Team 1 is a tight-knit band of four software developers, represented in Figure 1.

Figure 1 – Team 1

Their preferred domain and depth of experience are represented by the color and area of their respective circles. While they each have their own area of expertise, there is a significant overlap in common knowledge. All four of them understand the underlying architecture, common coding practices, and fundamental coding principles. Furthermore, there is a robust amount of inter-domain expertise. When needed, the HTML5/CSS developer can probably help out with JavaScript issues, for example. The probability of this team successfully working together to size the stories in the product backlog is high.

Team 1 represents a near-ideal team composition for a typical software-related project. However, the real world isn’t so generous in its allocation of near-ideal, let alone ideal, teams. A typical team for a software-related project is more likely to resemble Team 2, as represented in Figure 2.

Figure 2 – Team 2

In Team 2, the JavaScript developer is fresh out of college, new to the company and new to the business. His real-world experience is limited, so his circle of expertise is smaller relative to his teammates. The HTML5/CSS developer has been working for the company for 10 years and knows the business like the back of her hand, so she has a much wider view of how her work impacts the company and product development. As a team, there is much less overlap and the options for helping each other through a sprint are diminished. As for collaborative story sizing efforts, the HTML5/CSS and C# developers are likely to dominate the conversation while the JavaScript developer agrees with just about anything not JavaScript related.

As Agile practices become more ubiquitous in the business world, team composition begins to resemble Team 3, as shown in Figure 3.

Figure 3 – Team 3

The mix now includes non-technical people – content developers and editors, strategists, and designers. Even assuming an equal level of experience in their respective domains, the company, and the business environment, there is very little overlap. Arriving at a consensus during a story sizing exercise now becomes a significant challenge. But again, the real world isn’t even this kind. We are increasingly likely to encounter teams that resemble Team 4, as shown in Figure 4.

Figure 4 – Team 4

As before, the relative circle of expertise among team members can vary quite a bit. When a team resembles the composition of Team 4, the software developers (HTML5/CSS and C#) will have trouble understanding what the Learning Strategist is asking for, while the Learning Strategist may not understand why what he is asking the developers to deliver isn’t possible.

When I’ve attempted to facilitate story sizing sessions with teams that resemble Team 4, the sessions either become quite contentious (and therefore time consuming) or the team members that don’t have the expertise to understand a particular card simply accept the opinion of the stronger voices. Neither of these situations is desirable.

To counteract these possibilities, I’ve found it much more effective to have the card assignee determine the card size (points and time estimate) and to have the other team members ask questions about the work described on the card, such that the assignee and the team better understand the context in which the card is positioned. The team members that lack domain expertise, it turns out, are in a good position to help craft good acceptance criteria by asking questions such as:

  • Who will consume the work product that results from the card? (dependencies)
  • What cards need to be completed before a particular card can be worked on? (dependencies)
  • Is everything known about what a particular card needs before it can be completed? (dependencies, discovery, exploration)

At the end of a brief conversation where the entire team is working to evaluate the card for anything other than level of effort (time) and complexity (points), it is not uncommon for the assignee to reconsider their sizing, break the card into multiple cards, or determine the card shouldn’t be included in the sprint backlog. In short, it ends up being a much more productive conversation if teammates aren’t haggling over point distinctions or passively accepting what more experienced teammates are advocating. The benefit to the product owner is that they now have additional information that will undoubtedly influence the product backlog prioritization.