Crafting a Product Vision

In his book, “Crossing the Chasm,” Geoffrey Moore offers a template of sorts for crafting a product vision:

For (target customer)

Who (statement of the need or opportunity)

The (product name) is a (product category)

That (key benefit, problem-solving capability, or compelling reason to buy)

Unlike (primary competitive alternative, internal or external)

Our product/solution (statement of primary differentiation or key feature set)

To help wire this in, try the following guided exercise. Consider this product vision statement for a fictitious software program, Checkwriter 1.0:

For the bill-paying member of the family who also uses a home PC

Who is tired of filling out the same old checks month after month

Checkwriter is a home finance program for the PC

That automatically creates and tracks all your check-writing.

Unlike Managing Your Money, a financial analysis package,

Our product/solution is optimized specifically for home bill-paying.

Ask the team to raise their hands when an item on the following list of potential features does not fit the product/solution vision, and to keep them up unless they hear a later item that they feel does fit. The exercise asks the team, “At what point does the feature list begin to move outside the boundaries suggested by the product vision?” Most hands should go up around item #4 or #5. All hands should be up by #9. A facilitated discussion of the transition between “fits vision” and “doesn’t fit vision” is often quite effective after this brief exercise.

  1. Logon to bank checking account
  2. Synchronize checking data
  3. Generate reconciliation reports
  4. Send and receive email
  5. Create and manage personal budget
  6. Manage customer contacts
  7. Display tutorial videos
  8. Edit videos
  9. Display the local weather forecast for the next 5 days

It should be clear that one or more of the later items on the list do not belong in Checkwriter 1.0. This is how product visions work. They provide a filter through which potential features can be run during the life of the project to determine whether they are inside or outside the project’s scope of work. As powerful as this is, the product vision will only catch the larger features that threaten the project’s scope. To catch the finer-grained threats of scope creep, a product road map needs to be defined by the product owner.
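The vision-as-filter idea can be sketched in a few lines of code. This is a minimal Python sketch, not how any real team would do it: the keyword list is an invented proxy for the judgment a team applies against the actual vision statement.

```python
# Toy "vision filter" for Checkwriter 1.0: a feature fits the vision if it
# relates to home bill-paying.  SCOPE_KEYWORDS is a hypothetical stand-in
# for the team's real judgment against the vision statement.
SCOPE_KEYWORDS = {"check", "checking", "reconciliation", "budget", "bill"}

def fits_vision(feature):
    """Return True if the feature description touches the vision's scope."""
    return any(word in feature.lower() for word in SCOPE_KEYWORDS)

features = [
    "Logon to bank checking account",
    "Send and receive email",
    "Create and manage personal budget",
    "Edit videos",
]

in_scope = [f for f in features if fits_vision(f)]
```

Running the sample list through the filter keeps the checking and budget items and rejects email and video editing, mirroring the hand-raising exercise above.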

Show your work

A presentation I gave last week sparked the need to reach back into personal history and ask when I first programmed a computer. That would be high school. On an HP 9320 using HP Educational Basic and an optical card reader. The cards looked like this:


What occurred to me was that in the early days – before persistent storage like cassette tapes, floppy disks, and hard drives – a software developer could actually hold their program in their hands. Much like a woodworker or a glass blower or a baker or a candlestick maker, we could actually show something to friends and family! Woe to the student who literally dropped their program in the hallway.

Then that went away. Keyboards soaked up our coding thoughts and stored them in places impossible to see. We could only tell people about what we had created, often using lots of hand waving and so much jargon that it undoubtedly must have seemed as if we were speaking a foreign language. In fact, the effort pretty much resembled the same fish-that-got-away story told by Uncle Bert every Thanksgiving. “I had to parse a data file THIIIIIIIIIS BIG using nothing but Python as an ETL tool!”

Yawn.

This is at the heart of why I burned out on writing code as a profession. There was no longer anything satisfying about it. At least, not in the way one gets satisfaction from working with wood or clay or fabric or cooking ingredients. The first time I created a predictive inventory control algorithm was a lot of fun and satisfying. But there were only 4-5 people on the planet who could appreciate what I’d done, and since it was proprietary, I couldn’t share it. And just how many JavaScript-based menu systems can you write before the challenge becomes a task and eventually a tedious chore?

Way bigger yawn.

I’ve found my way back into coding. A little. Python, several JavaScript libraries, and SQL are where I spend most of my time. What I code is what serves me. Tools for my use only. Tools that free up my time or help me achieve greater things in other areas of my life.

I can compare this to woodworking. (Something I very much enjoy and from which I derive a great deal of satisfaction.) If I’m making something for someone else, I put in extra effort to make it beautiful and functional. To do that, I may need to make a number of tools to support the effort – saw fences, jigs, and clamps. These hand-made tools certainly don’t look very pretty. They may not even be distinguishable from scrap wood to anybody but myself. But they do a great job of helping me achieve greater things. Things I can actually show and handle. And if the power goes down in the neighborhood, they’ll still be there when the lights come back on.

Improving the Signal to Noise Ratio – Coda

In a Scientific American column delightfully named “The Artful Amoeba” there is an article on a little critter called the “fire chaser” beetle: How a Half-Inch Beetle Finds Fires 80 Miles Away – Fire chaser beetles’ ability to sense heat borders on the spooky

Why a creature would choose to enter a situation from which all other forest creatures are enthusiastically attempting to exit is a compelling question of natural history. But it turns out the beetle has a very good reason. Freshly burnt trees are fire chaser beetle baby food. Their only baby food.

Fire chaser beetles are thus so hell bent on that objective that they have been known to bite firefighters, mistaking them, perhaps, for unusually squishy and unpleasant-smelling trees.

This part is interesting:

A flying fire chaser beetle appears to be trying to give itself up to the authorities. Its second set of legs reach for the sky at what appears to be an awkward and uncomfortable angle.

But the beetle has a good reason. It’s getting its legs out of the way of its heat eyes, pits filled with infrared sensors tucked just behind its legs.

A strategy suggested by the fire beetle life cycle: if you want to maximize a signal-to-noise ratio, iterate through three simple steps:

  1. Work to develop a super well defined signal/goal/objective.
  2. Remove every possible barrier to receiving information about that signal – mental, emotional, even physical – that you can think of or that you discover over time.
  3. Repeat.

Also, the “Way of the Amoeba” is now the “Way of the Artful Amoeba.” Update your phrase books accordingly.

Improving the Signal to Noise Ratio – Revisited

Additional thoughts about signals and noise that have been rattling around in my brain since first posting on this topic.

At the risk of becoming too ethereal about all this, before there is signal and before there is noise, there is data. Cold, harsh, cruelly indifferent data. It is after raw data encounters some sort of filter or boundary, something that triggers a calculation to evaluate what that data means or whether it is relevant to whoever is on the other side of the filter, that it begins to be characterized as “signal” or “noise.”

Since we’re talking about humans in this series of posts, that filter is an amazingly complex system built from both physiological and psychological elements. The small amount of physical data that hits our senses and actually makes it to our brains is then filtered by beliefs, values, biases, attitudes, emotions, and those pesky unicorns that can’t seem to stop talking while I’m trying to think! It’s after all this processing that data has now been sorted according to “signal” (what’s relevant) and “noise” (what’s irrelevant) for any particular individual. Our individual systems of filters impart value judgments on the data such that each of us, essentially, creates “signal” and “noise” from the raw data.

That’s a long winded way to say:

data -> [filter] -> signal, noise

Now apply this to everyone on the planet.

data -> [filter 1] -> signal 1, noise 1

data -> [filter 2] -> signal 2, noise 2

data -> [filter n] -> signal n, noise n
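In code, that pipeline reduces to a small partition function applied per person. A minimal Python sketch, where the sample data and the two filter predicates are invented purely for illustration:

```python
# Partition raw data into "signal" and "noise" according to a
# person-specific filter (a predicate applied to each datum).
def apply_filter(data, is_relevant):
    signal = [d for d in data if is_relevant(d)]
    noise = [d for d in data if not is_relevant(d)]
    return signal, noise

# Hypothetical raw data: the same stream reaches everyone.
data = ["keto recipe", "unicorn silencing tips", "weather", "sql tuning"]

# Two people, two filters: identical data, different signal/noise splits.
bob_signal, bob_noise = apply_filter(data, lambda d: "recipe" in d)
my_signal, my_noise = apply_filter(data, lambda d: "unicorn" in d)
```

Bob’s filter keeps only the recipe; mine keeps only the unicorn tips. Everything else is, for each of us, noise, even though the raw data was identical.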

As an example, Google, itself a filter, is a useful one. Let’s assume for a moment that Google is some naturally occurring phenomenon and not a filter created by humans with their own set of filters driving what it means to create a let’s be evil good search engine. To retrieve 1,000,000 pieces of information, my friend Bob entered search criteria of interest to him, i.e., “filter 1.” Maybe he searched for “healthy keto diet recipes”. Scanning those search results, I determine (using my “filter 2”) that 100% of the search results are useless, because my filter is “how do i force the noisy unicorns in my head to shut the hell up”. The Venn diagram of those two search results is likely to show a vanishingly small overlap between the two. (Disclaimer: I have no knowledge of the carbohydrate content of unicorns nor how tasty they may be when served with capers and a lemon dill sauce.)

Google may return 1,000,000 search results. But only a small subset is viewable at a time. What of the rest of the result set that I know nothing about? Is it signal? Is it noise? Is it just data that has yet to be subjected to anyone’s system of filters? Because Google found stuff, does that make it signal? Accepting all 1,000,000 search results as signal seems to require a willingness to believe that Google knows best when it comes to determining what’s important to me. This would apply to any filter not our own.

All systems for distinguishing signal from noise are imperfect and some of us on the Intertubes are seeking ways to better tune our particular systems. The system I use lets non-relevant data fall through the sieve so that the gold nuggets are easier to find. Perhaps at some future date I’ll unwittingly re-pan the same chunk of data through an experienced-refined sieve and a newly relevant gem will emerge from the dirt. But until that time, I’ll trust my filters, let the dirt go as noise, and lurch forward.

Improving the Signal to Noise Ratio – In Defense of Noise

[This post follows from Improving the Signal to Noise Ratio.]

All signal all the time may not be a good thing. So I’d like to offer a defense for noise: It’s needed.

Signal is signal because there is noise. Without the presence of noise we risk living in the proverbial echo chamber. When we know what’s bad, we are better equipped to recognize what’s good. I deliberately tune into the noise on occasion for no other reason than to subject my ideas to a bit of rough and tumble. It’s why I blog. It’s why I participate in several select forums. “Here’s what I think, world. What say you?”

Of course, noise is noise because there is signal. Once we’ve had an experience of “better,” we are then more skilled at recognizing what’s bad. I remember the food I grew up on as being good, but today I view some of it as poison (Wonder Bread, anyone?). And there are subjects for which I no longer check out the noise. The exposure is too harmful.

There are subjects for which I seem to be swimming in noise and casting around for any sort of signal that suggests “better.” I’m recalling a joke about the two young fish who swim past an older fish. The older fish says to the younger fish, “The water sure is nice today.” A little further on, one of the young fish asks the other, “What’s water?” I’m hoping to catch that older fish in my net. He knows something I don’t.

To understand what I mean by noise being necessary it is important to understand the metaphor I’m using, where it applies and where it doesn’t.

Taking the metaphor literally, in the domain of electrical engineering, for example, the signal to noise ratio is indeed an established measure with clear unit definitions as to what is reflected by the ratio – decibels, for example. In this domain the goal is to push always for maximum signal and minimum noise.

In the world of biological systems, however, noise is most definitely needed. One of many examples I can think of is related to an underlying driver of evolution: mutations. In an evolving organism, anything that would potentially upset the genetic status quo is a threat to survival. Indeed, most mutations are at best benign or at worst lethal, such that the organism or its progeny never survive and the mutation is selected against as evolutionary “noise.”

However, some mutations are a net benefit to survival and add to the evolutionary “signal.” We, as 21st Century homo sapiens, are who we are because of an uncountable number of noisy mutations that we’ll never know about because they didn’t survive. Even so, surviving mutations are not automatically “pure” signal. There are “noisy” mutations, such as that related to sickle cell anemia. Biological systems can’t recognize a mutation as “noise” or “signal” before the mutation occurs, only after, when they’ve been tested by the rough and tumble of life. This is why I speak in terms of “net benefit.”

For humans trying to find our way in the messy, sloppy world of human interactions and thought, pure signal can be just as undesirable as pure noise. I’ll defer to John Cook, who I think expresses more succinctly the idea I was clumsily trying to convey:

If you have a crackly recording, you want to remove the crackling and leave the music. If you do it well, you can remove most of the crackling effect and reveal the music, but the music signal will be slightly diminished. If you filter too aggressively, you’ll get rid of more noise, but create a dull version of the music. In the extreme, you get a single hum that’s the average of the entire recording.

This is a metaphor for life. If you only value your own opinion, you’re an idiot in the oldest sense of the word, someone in his or her own world. Your work may have a strong signal, but it also has a lot of noise. Getting even one outside opinion greatly cuts down on the noise. But it also cuts down on the signal to some extent. If you get too many opinions, the noise may be gone and the signal with it. Trying to please too many people leads to work that is offensively bland.
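Cook’s crackly-recording metaphor maps directly onto a moving-average filter: a small window removes crackle, while a window as wide as the whole recording collapses everything toward the overall average. A minimal Python sketch, using a made-up “recording”:

```python
from statistics import mean

def smooth(samples, window):
    """Moving average: each output value is the mean of the trailing window."""
    return [mean(samples[max(0, i - window + 1): i + 1])
            for i in range(len(samples))]

# A toy "recording": a simple waveform with a bit of crackle mixed in.
recording = [0.0, 1.0, 0.2, 1.1, -0.1, 0.9, 0.1, 1.0]

light = smooth(recording, 2)               # crackle reduced, music mostly intact
heavy = smooth(recording, len(recording))  # over-filtered: flattens toward the mean
```

With the full-length window, the final output sample is exactly the average of the entire recording, Cook’s “single hum”: noise gone, and the signal with it.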

The goal in human systems is NOT to push always for maximum signal and minimum noise. For example, this is reflected in Justice Brandeis’s comment: “If there be time to expose through discussion the falsehood and fallacies, to avert the evil by the process of education, the remedy to be applied is more speech, not enforced silence.” So my amended thesis is: In the domain of human interactions and thought, noise is needed by anyone seeking to both evaluate and improve the quality of the signal they are following.

A final thought…

If we were to press for eliminating as much “noise” as possible from human systems much like the goal for electrical noise, I’m left with the question “Who decides what qualifies as noise?”

Improving the Signal to Noise Ratio

A question was posed, “Why not be an information sponge?”

I’d have to characterize myself as more of an information amoeba – (IIRC, the amoeba is, by weight, the most vicious life form on earth) – on the hunt for information and after internalizing it, going into rest mode while I decompose and reassemble it into something of use to my understanding of the world. Yum.

More generally, to be an effective and successful consumer of information these days, the Way of the Sponge (WotS, passive, information washes through them and they absorb everything) is no longer tenable and the Way of the Amoeba (WotA, active, information washes over them and they hunt down what they need) is likely to be the more successful strategy. The WotA requires considerable energy but the rewards are commensurate with the effort. WotS…well, there’s your obsessive processed food eating TV binge-watcher right there. Mr. Square Bob Sponge Pants.

What’s implied by the WotA vs the WotS is that the former has a more active role in optimizing the informational signal to noise ratio than the latter. So a few thoughts to begin with on signals and noise.

Depending on the moment and the context, one person’s signal is another person’s noise. Across the moments that make up a lifetime, one person’s noise may become the same person’s signal and vice versa. When I was in high school, I found Frank Sinatra’s voice annoying and not something to be mingled with my collection of Mozart, Bach, and Vivaldi. Today…well, to disparage the Chairman of the Board is fightin’ words in my house. Over time, at least, noise can become signal and signal become noise.

But I’m speaking here of the signal’s quality and not its quantity (i.e., volume).

Some years ago I came across Stuart Kauffman’s idea of the adjacent possible:

It may be that biospheres, as a secular trend, maximize the rate of exploration of the adjacent possible. If they did it too fast, they would destroy their own internal organization, so there may be internal gating mechanisms. This is why I call this an average secular trend, since they explore the adjacent possible as fast as they can get away with it.

This has been interpreted in a variety of ways. I carry this around in my head as a distillation from several of the more faithful versions: Expand the edge of what I know by studying the things that are close by. Over time, there is an accumulation of loosely coupled ideas and facts that begin to coalesce into a deeper meaning, a signal, if you will, relevant to my life.

With this insight, I’ve been able to be more deliberate and directed about what I want or need to know. I’ve learned to be a good custodian of the edge and what I allow to occupy space on that edge. These are my “internal gating mechanisms.” It isn’t an easy task, but there are some easy wins. For starters, learning to unplug completely. Especially from social media and what tragically passes for “news reporting” or “journalism” these days.

The task is largely one of filtering. I very rarely directly visit information sources. Rather, I leverage RSS feeds and employ filtering rules. I pull information of interest rather than have it pushed at me by “news” web sites, cable or TV channels, or newspapers. While this means I will occasionally miss some cool stuff, it’s more than compensated by the boost in signal quality achieved by excluding all the sludge from the edge. I suspect I still get the cool stuff, just in a slightly different form or revealed by a different source that makes it through the filter. In this way, it’s a matter of modulating the quantity such that the signal is easier to find.
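The pull-and-filter approach can be sketched with a few lines of stdlib-only Python. The feed entries and keyword rules below are invented stand-ins for whatever a real RSS reader and rule set would contain:

```python
# Keep entries whose titles match any "interest" keyword and drop entries
# matching any "exclude" keyword -- a crude stand-in for RSS filter rules.
INTERESTS = {"python", "woodworking", "agile"}
EXCLUDES = {"celebrity", "outrage"}

def passes_filter(title):
    """True if the title touches an interest and avoids all excluded topics."""
    words = set(title.lower().split())
    return bool(words & INTERESTS) and not (words & EXCLUDES)

# Hypothetical feed entries, as titles.
entries = [
    "New Python release notes",
    "Celebrity outrage of the day",
    "Agile retrospectives that work",
    "Ten gadgets you must buy",
]

kept = [t for t in entries if passes_filter(t)]
```

The exclusion rule wins ties: an entry that matches an interest but also an excluded topic still gets dropped, which is the same trade-off noted above — occasionally missing cool stuff in exchange for keeping the sludge off the edge.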

There is a caution to consider while optimizing a signal-to-noise ratio, something reflected in Kauffman’s comments around the rate of exploration for new ideas: “If they did it too fast, they would destroy their own internal organization…”

Before the Internet, before PCs were a commodity, before television was popular it was much, much easier to find time to think. In fact, it was never something that had to be looked for or sought out. I think that’s what is different today. It takes WORK to find a quiet space and time to think. While my humble little RSS filters do a great job of keeping a high signal-to-noise ratio with all things Internet, accomplishing the same thing in the physical world is becoming more and more difficult.

The “attention economy,” or whatever it’s being called today, is reaching a truly disturbing level of invasion. It seems I’ve used more electrician’s tape to cover up camera lenses and microphones in the past year than I’ve used on actual electrical wires. The number of appliances and gadgets in the home with glowing screens crying out for bluetooth or wifi access like leaches seeking blood are their own source of noise. This is my current battleground for finding the signal within the noise.

Enough about filtering. What about boundaries? Fences make for good neighbors, said someone wise and experienced. And there’s a good chance that applies to information organization, too. Keeping the spiritual information in my head separate from my shopping list probably helps me stop short of forming some sort of cult around Costco. (“All praise ‘Bulk,’ the God of Stuff!”)

An amoeba has a much more developed boundary between self and other than a sponge, and that’s probably a net gain even with the drawback of the extra energy required to fuel that arrangement. Intellectually, we have our beliefs and values that mark where those edges between self and other are defined.

So I’ll stop for now with the question, “What are the strategies and mental models that promote permeability for desired or needed information while keeping, as much as possible, the garbage ‘out there?’”

 

Agile and Changing Requirements or Design

I hear this (or some version) more frequently in recent years than in past:

Agile is all about changing requirements at anytime during a project, even at the very end.

I attribute the increased frequency to the increased popularity of Agile methods and practices.

That the “Responding to change over following a plan” Agile Manifesto value is cherry picked so frequently is probably due to a couple of factors:

  • It’s human nature for a person to resist being cornered into doing something they don’t want to do. So this value gets them out of performing a task.
  • The person doesn’t understand the problem or doesn’t have a solution. So this value buys them time to figure out how to solve the problem. Once they do have a solution, well, it’s time to change the design or the requirements to fit the solution. This reason isn’t necessarily bad unless it becomes the de facto solution strategy.

The intent behind the “Responding to change” value, and the way successful Agile is practiced, does not allow for constant and unending change. Taken to its logical conclusion, nothing would ever be completed and certainly nothing would ever be released to the market.

I’m not going to rehash the importance of the preposition in the value statement. Any need to explain the relativity implied by its use has become a useful signal for me to spend my energies elsewhere. But for those who are not challenged by the grammar, I’d like to say a few things about how to know when change is appropriate and when it’s important to follow a plan.

The key is recognizing and tracking decision points. With traditional project management, decisions are built-in to the project plan. Every possible bit of work is defined and laid out on a Gantt chart, like the steel rails of a train track. Deviation from this path would be actively discouraged, if it were considered at all.

Using an Agile process, decision points that consider possible changes in direction are built into the process itself: daily scrums, sprint planning, backlog refinement, reviews and demonstrations at the end of sprints and releases, retrospectives, acceptance criteria, definitions of done, continuous integration. These all reflect deliberate opportunities to evaluate progress and determine whether any changes need to be made, and they represent decisions or agreements to lock in work definitions for short periods of time.

For example, at sprint planning, a decision is made to complete a block of work in a specified period of time – often two weeks. After that, the work is reviewed and decisions are made as to whether or not that work satisfies the sprint goal and, by extension, the product vision. At this point, the product definition is specifically opened up for feedback from the stakeholders and any proposed changes are discussed. Except under unique circumstances, changes are not introduced mid-sprint and the teams stick to the plan.

Undoing decisions or agreements only happens if there is supporting information, such as technical infeasibility or a significant market shift. Undoing decisions and agreements doesn’t happen just because “Agile is all about changing requirements.” Agile supports changing requirements when there is good reason to do so, irrespective of the original plan. With traditional project management, it’s all about following the plan and change at any point is resisted.

This is the difference. With traditional project management, decisions are built-in to the project plan. With Agile they are adapted in.

How does Agile help with long term planning?

I’m often involved in discussions about Agile that question its efficacy in one way or another. This is, in my view, a very good thing and I highly encourage this line of inquiry. It’s important to challenge the assumptions behind Agile so as to counteract any complacency or expectation that it is a panacea to project management ills. Even so, with apologies to Winston Churchill, Agile is the worst form of project management…except for all the others.

Challenges like this also serve to instill a strong understanding of what an Agile mindset is, how it’s distinct from Agile frameworks, tools, and practices, and where it can best be applied. I would be the first to admit that there are projects for which a traditional waterfall approach is best. (For example, maintenance projects for nuclear power reactors. From experience, I can say traditional waterfall project management is clearly the superior approach in this context.)

A frequent challenge is the idea that with Agile it is difficult to do any long-term planning.

Consider the notion of vanity vs. actionable metrics. In many respects, large or long-term plans represent a vanity leading metric. The more detail added to a plan, the more people tend to believe and behave as if such plans are an accurate reflection of what will actually happen. “Surprised” doesn’t adequately describe the reaction when reality informs managers and leaders of the hard truth. I worked on a multi-million-dollar project many years ago for a Fortune 500 company that ended up being canceled. Years of very hard work by hundreds of people went down the drain, because projected revenues based on a software product design over seven years old were never going to materialize. Customers no longer wanted or needed what the product was offering. Our “solution” no longer had a problem to solve.

Agile – particularly more recent thinking around the values and principles in the Manifesto – acknowledges the cognitive biases in play with long-term plans and attempts to put practices in place that compensate for the risks they introduce into project management. One such bias is reflected in the planning fallacy – the further out the planning window extends into the future, the less accurate the plan. An iterative approach to solving problems (some of which just happen to use software) challenges development teams on up through managers and company leaders to reassess their direction and make much smaller course corrections to accommodate what’s being learned. As you can well imagine, we may have worked out how to do this in the highly controlled and somewhat predictable domain of software development, however, the critical areas for growth and Agile applicability are at the management and leadership levels of the business.

Another important aspect of the Agile mindset is reflected in the Cone of Uncertainty. It is a deliberate, intentional recognition of the role of uncertainty in project management. Yes, the goal is to squeeze out as much uncertainty (and therefore risk) as possible, but there are limits. With a traditional project management plan, it may look like everything has been accounted for, but the rest of the world isn’t obligated to follow the plan laid out by a team or a company. In essence, an Agile mindset says, “Lift your gaze up off of the plan (the map) and look around for better, newer, more accurate information (the territory). Then, update the plan and adjust course accordingly.” In Agile-speak, this is what is behind phrases like “delivery dates emerge.”

Final thought: You’ll probably hear me say many times that nothing in the Agile Manifesto can be taken in isolation. It’s a working system, and some parts of it are more relevant than others depending on the project and the timing. So consider what I’ve presented here in concert with the Agile practices of developing good product visions and sprint goals. Product vision and sprint goals keep the project moving in the desired direction without holding it on an iron-rails track that cannot be changed without a great deal of effort, if at all.

So, to answer the question in the post title, Agile helps with long-term planning by first recognizing the risks inherent in such plans and then implementing process changes that mitigate or eliminate those risks. Unpacking that sentence would consist of listing all the risks inherent in long-term planning and the mechanics behind and reasons why Scrum, XP, SAFe, LeSS, etc., etc., etc. have been developed.