by Christopher R. Paparone
Download the Full Article: Design and the Prospects for Decision
The proposed view of decision presented in this essay reflects quite a departure from US "PPBE" procedures, "JOPP" doctrine, and Service-equivalent procedures such as the US Army's Field Manual 5-0, The Operations Process. US Joint Forces Command recently released a pamphlet that likewise subordinates design, treating it as a process methodology subservient to a rational-analytic model of decision-making. These depictions unfortunately portray design as just another staff method for achieving sufficient commander's understanding in order eventually to reach a rational or programmatic decision.
This essay attempts to challenge that view. The artful military designer is concerned with the design of deciding based on appreciatively judging the situation at hand -- sometimes having to act before deciding -- realizing that deciding may be an elongated process, not a point in time. The messier the situation, the more unstructured deciding must become. Although US Army doctrine and pre-doctrinal joint publications may see design as a step toward the more desirable programmatic and rational-analytic forms, participative and emergent modes do not have to lead to analytic and programmatic modes. Situations may demand a blending of all four types, with emphasis on one or more. With the typology proposed in this essay, the art of deciding is facilitated by the prospect of blending more than one "color" on the military designer's palette.
The curricula in both US Service and joint staff colleges emphasize programmatic and rational-analytic models. For the most part, US military staff and senior service colleges stress rational-analytic models to teach military strategy. Can we imagine a staff or war college shifting its educational emphasis toward consensus-based decision-making (against the backdrop of social construction theory) and requiring more exploration of complexity science and chaos theory?
About the Author(s)
Christopher R. Paparone, Colonel, U.S. Army, Retired, is an associate professor in the Army Command and General Staff College's Department of Joint, Interagency and Multinational Operations at Fort Lee, Virginia. He holds a B.A. from the University of South Florida; master's degrees from the Florida Institute of Technology, the U.S. Naval War College, and the Army War College; and a Ph.D. in public administration from Pennsylvania State University. On active duty he served in various command and staff positions in the continental United States, Panama, Saudi Arabia, Germany, and Bosnia.
Comments
On your comment -- "Instead, in the military, bad news is more likely to set off a witch hunt."
I think there is a paradox between initiative and accountability, where the latter tends to dissuade the former.
Accountability is such a strong value in the military chain (perhaps the strongest toward the higher end) that it may interfere with risk-taking.
Rational decision-making schemes (such as DOD programmatics, JOPP, etc.) also act as "CYA" justifications. And for good reason: it seems the American people and Congress will not stand for failure in combat. They demand "competence" even when competence is not feasible.
Are we still viewed as professionals when we do not know what to do in the face of high VUCA?
Sir,
Thanks for the video -- I very much enjoyed it. It tracks closely with the ideas of John Boyd (OODA loops/iterative decision-making).
Two thoughts immediately come to mind.
1. The really sinister part about "thinking then doing" is that we predefine a path and an end state, which may or may not be correct. This can and does lead to bad feedback. Either we look for the wrong things, or we refuse to see the right things because they don't align with our preconceived plan (cognitive bias, cognitive dissonance, etc.).
In contrast, "doing then thinking" *depends* on feedback to move to the next iteration. In this way, it should produce better results because there is new feedback in each iteration. This allows each successive "doing" cycle to match up better with changing circumstances -- critical in a high-VUCA world.
2. The major obstacle for the military in moving toward an iterative decision-making process is that the decision maker will make mistakes. This is by design, yet there is little tolerance for mistakes. We'd rather fool ourselves as long as possible, then blame failure on something or someone else. (It was a great plan, but....)
Thomas Edison, on being asked about his many hundreds of failed attempts to invent a working light bulb, is purported to have said, "I have not failed 700 times. I have not failed once. I have succeeded in proving that those 700 ways will not work."
Failure, bad news, and the like are beautiful, wonderful feedback loops that leaders and designers should embrace as improvements to their worldview. Instead, in the military, bad news is more likely to set off a witch hunt.
ML,
Spot on. The fallacy of planning is that we seldom acknowledge this.
We claim we do through the planning of "branches," yet these also assume we can predict possible outcomes.
We also think we can pre-program learning (through commander's critical information requirements and so forth). This is another fallacy in high VUCA situations.
If you look at the lives of those who seem to have been effective strategists (like T.E. Lawrence [WWI] and John Vann [Vietnam]), you'll find that they immersed themselves in the situation (they were able to converse with the situation as they acted in it).
One of the references in this essay on decision is the link to Henry Mintzberg's short metaphor on decision-making -- http://www.youtube.com/watch?v=DyvXu3lSSG0. I would like to hear thoughts on what he says there. I think his story has some profound meaning for military practitioners.
Dayuhan,
I believe the statement was not intended to be a commentary on the "VUCA-ness" of the world.
Rather, the point is that if you design policies and strategies at Point A in time and then execute them at a later Point B, their likely efficacy goes down as "VUCA-ness" goes up.
(How about "VUCA-ness" for a made-up word?)
Based on your comment, I'd say you and the author agree.
I'm curious about this line:
"However, the more volatile, uncertain, complex and ambiguous (high VUCA) the world is, the less likely that pre-execution policies and strategies will prove to be valid."
Is this meant to suggest that the world is more volatile, uncertain, complex and ambiguous than it once was, or that it is becoming more so? I wouldn't say that's the case at all, though of course the world is, always has been, and probably always will be all of those things.
I'd submit that our VUCA issues are not a function of increasing VUCA (hey, I'm doing acronyms!) but of our own bad decisions at the policy level. When we choose to take on tasks that we haven't the capacity to accomplish, and to pursue those tasks with tools grossly unsuited to their accomplishment (e.g., trying to do "nation building" with an army), we create VUCA for ourselves. This problem cannot be successfully addressed by better decision-making in the field (though that is never a bad thing); it requires better decision-making (and a bit of @#$% common sense) in Washington, DC.