
The Analytic Solutions Group, LLC

Shannon Entropy and Productivity:


Why Big Organizations Can Seem Stupid
Dr. Rich Janow
February 28, 2004
Department of Physics
New Jersey Institute of Technology
Newark, NJ 07102-1982
janow@njit.edu
and
The Analytic Solutions Group, LLC
514 North Wyoming Avenue, South Orange NJ 07079
973-762-4987
janow@att.net

Abstract:

There is a fairly common perception that large organizations tend to behave much less intelligently than their size suggests. They often lose the decisiveness seen in small groups and may seem "stupid" to people within them who work with ideas and knowledge. This paper offers the first quantitative basis for these observations. It adapts some elements of Shannon's information theory to organizations by defining "organizational entropy" and related notions of decision complexity and productivity. Organizational entropy measures the extra decision information needed when partitioning functions onto a structure.

There appears to be a fundamental upper limit on the average per capita decision rate that an organization can sustain, depending inversely on the organization's entropy. If the limit is broached, impaired productivity among knowledge managers may result, and large organizations may be disadvantaged when performing knowledge-intensive tasks that require efficient use of intellectual capital. For an idealized model organization of n decision nodes the per capita limit on the sustainable decision rate is proportional to 1/log2(n). Additionally, a distinct saturation effect with a higher threshold switches on when individuals' decision throughput limits are reached.

Quantitative tools for managing entropy and productivity in business firms and in command and control situations are feasible as practical outgrowths of this work. They involve conscious restructuring designed to limit entropy effects.

Keywords:

Management, information theory, organizational entropy, decision networks, social networks, business reengineering, technology management, decision complexity, command and control, C3I, entropy-based warfare, knowledge management, complex systems, control theory applications.

Copyright 2003, 2004 by Rich Janow Page 1


Abstract: ............................................................ 1
Keywords: ............................................................ 1
1. Introduction ...................................................... 3
2. A model for decision processes .................................... 4
   2.1. Symbol transmission in information theory .................... 4
   2.2. Management decisions and "dits" .............................. 4
   2.3. The upper limit M(n) on the maximum management decision flow rate 4
   2.4. The maximum "dit" capacity R(n) .............................. 5
   2.5. Organizational entropy and the decision complexity A(n) ...... 5
3. Approximations .................................................... 6
   3.1. A form factor for organization structure ..................... 6
   3.2. An exactly solvable model: the fully connected organization .. 7
   3.3. Sparsifying topology, limiting choice (partitioning into sub-units) 8
4. Applications ...................................................... 8
References: .......................................................... 9
Appendix A: The choice of a definition for M(n) ...................... 10
Appendix B: The smallest fully connected organization ................ 10
About the author: .................................................... 11

1. Introduction

Knowledge managers in large organizations often come to feel a limit on their efficiency and effectiveness that they attribute to the organization's size. For example, an individual may ask his or her firm to assimilate a piece of work and make decisions based on it, but may have to wait an inordinately long time for a response - even when the issues to be decided seem straightforward. People may become frustrated by these perceived impediments and assign them to bureaucracy, resistance to fresh initiatives, turf protection, complacency, etc.

These phenomena may actually, though, be the result of individuals' own decision processing over-running the ability of the organization's collaboration channels to keep up with them. The sluggish response may become more pronounced with growth. Small groups of people sometimes self-organize and bypass the structure to gain efficiency and avoid frustration. The knowledge managers and customers who find that an organization cannot keep up with them may conclude simplistically that it (the organization) is "stupid". When a small firm merges into a large one, culture shock often results.

Decision-intensive tasks are often integral to an organization's product as well as providing control of the organization itself. These are all lumped together and called "management decisions" in the discussions that follow.

Organizations are a type of control system in which the decision network (perhaps several of them) has humans as nodes [1]. The individual decision-makers use the control network they are in as a communication channel: they insert management decisions into it and expect to eventually receive decision information back from it. The network has some maximum decision flow rate it can support; if knowledge managers try to exceed it, their work may be dropped altogether and responses may be delayed or inappropriate.

Conventional communication channels have similar capacity limits, and this prompts the speculation that an analogy may be useful. The key events for communication systems are the transmission of symbols (such as letters of the alphabet) rather than decisions. Claude Shannon [2] developed information theory for communication systems and perhaps most notably recognized and quantified entropy as information.

Organizations have seemingly analogous flows of management- and application-related decisions. Organizational entropy is defined and introduced below in analogy with the Shannon entropy; it contributes to the average collaborative decision complexity (choice) just as entropy for a communication channel measures the average information content (bits per symbol) of symbols transmitted over it. Organizational entropy defined this way increases as a decision network grows, if the complexity of the tasks themselves remains unchanged. As organizations grow, often having to take on increasingly complex tasks, the decision structure adds nodes and partitions functions among more decision makers in order to "divide and conquer". That added structure increases the network entropy and adds to the overall decision complexity, unless the tasks themselves become simpler or more specialized.

Importantly, the entropy grows fast enough to more than offset growth in the total of individuals' capacities for making basic binary decisions. There appears to be a fundamental upper limit on the total management decision flow rate. That limit grows more slowly than linearly with the number of nodes in an organization, and so the maximum per capita management decision flow rate actually shrinks as the number of decision-makers in the network grows.

The intended inference is that the widespread, anecdotal perceptions mentioned earlier may be symptoms that a shrinking limit on per capita decision throughput is being reached and perceived as impaired productivity. The limit is intrinsic to large control networks. It is emphatically not related to congestion on physical data networks that may be present: the effect has in principle been operating as long as humans have formed groups and divided up roles.

These suggestions clearly need experimental validation and experience with practical applications to become useful and accepted. They are potentially important for any organization dominated by complex cognitive tasks and teamwork. Businesses that primarily manage knowledge and create intellectual property may need to restructure to become more efficient and competitive in their markets. Defense applications include improving command and control efficiency for friendly assets or understanding how to impede it (entropy-based warfare) for opposing forces [3].

2. A model for decision processes

2.1. Symbol transmission in information theory

Claude Shannon [2] showed how to quantify the information contained in members of some symbol set (such as letters of the alphabet) that occur with particular frequencies. The entropy H of an information source or sink measures the average information content of each symbol transmitted and is expressed as the number of bits per symbol needed when using the optimal compression coding scheme. Information is intimately related to the element of choice, as is the entropy of physical systems.

Entropy pertaining to symbol transmission has the same characteristic functional form as physical entropy:

    H = - k Σ(j=1..n) pj log2(pj)

A communication channel's capacity C is the maximum rate for transmitting raw bits/second, determined by physical characteristics of the channel and modulation schemes. The Fundamental Theorem for a Noiseless Channel [2] showed that there is a maximum rate S for symbol transmission that is simply C divided by the entropy H:

    S ≤ C / H

A source that tries to go faster will over-run the channel, resulting in errors and lost information.

2.2. Management decisions and "dits"

Management decisions are composed of many independent choices - much as communication symbols are composed of many bits. In principle, management decisions can be reduced to binary decision elements [1]. The quanta of actionable decision information are analogous to the "bits" of classical information theory, and so it is natural to call them "dits" for short to emphasize the parallel.

Beer [1] noted that a dichotomous classifier can in principle generate the binary representations for decisions. A human (or a coding device) might parse statements to find actionable content, develop a set of (decidable) symbolic logic statements, map them onto a binary tree, and then code them as a string of "dits" (1). In general, the task of building such binary representations would be tedious and subjective unless it can be automated.

There is no direct relationship between the way a decision is represented via a sequence of "dits" and representations of the decision as symbol and bit strings for transmission. Dits are "actionable", not just perceptual. For example, bitmap graphics and the like may require a huge number of bits but have little or no actionable content. A message authorizing a military attack might be just a few bits long but have a huge decision complexity (many "dits").

The complexity of management decisions (i.e., their average length in "dits") says nothing at all about their importance to the organization. It measures ultimately the amount of thought decisions require, which may or may not correlate with impact.

2.3. The upper limit M(n) on the maximum management decision flow rate

The quantity analogous to Shannon's upper limit on symbol transmission is a function M(n) that represents an organization's maximum total management decision flow rate. An organization is regarded as a network of n decision nodes (knowledge workers) that create, consume, and communicate actionable information related to the organization's activities. M(n) is simply the quotient:

    M(n) ≡ R(n) / A(n)    (1a)

The function R(n) measures the maximum dit flow rate (in dits/unit time) for the entire organization - analogous to the Shannon channel capacity C. The function A(n) represents the average collaborative decision complexity (with dimensions dits/decision). Its value depends on the inherent complexity of collaboration tasks themselves, and also on an entropy component due to the decision network structure. The entropy component fits an analogy with Shannon's entropy.

The per capita limit m(n) on the management decision rate is simply M(n) divided by n, viz:

    m(n) ≡ M(n) / n    (1b)

(1) This is analogous to compression of a symbol set in information theory.
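The quantities of Section 2.1 are simple to evaluate directly. The sketch below (the function names are mine, not the paper's, and the constant k is taken as 1) computes the entropy H of a symbol distribution and the noiseless-channel rate limit S = C/H:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum_j p_j * log2(p_j): average information per symbol, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def max_symbol_rate(capacity, probs):
    """Fundamental Theorem for a Noiseless Channel: S = C / H.
    capacity is the raw channel capacity C in bits per second."""
    return capacity / shannon_entropy(probs)

# Four equally likely symbols carry log2(4) = 2 bits each,
# so a 100 bit/s channel can sustain at most 50 symbols/s.
H = shannon_entropy([0.25, 0.25, 0.25, 0.25])        # 2.0 bits/symbol
S = max_symbol_rate(100.0, [0.25, 0.25, 0.25, 0.25]) # 50.0 symbols/s
```

A perfectly predictable source (one symbol with probability 1) has H = 0, matching the paper's point that zero surprise carries zero information.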


2.4. The maximum "dit" capacity R(n)

Each of the n people in an organization can potentially have a decision path to any of the remaining n-1 others. So R(n) scales as n·(n-1), and it approximates n² for large n. For an individual node i, Ri(n) represents the maximum capacity involving as many as n-1 destination nodes. Each such linkage is represented by R0i,j (with dimensions "dits"/unit time).

The factors R0i,j are maximum (not expected value) dit rates flowing between pairs of decision-makers and might vary for each path. They depend on the people at either end of the paths, who are filling the role that codecs (coder-decoders) play in hardware communications. The coefficients ri,j toggle the flow on or off for each path and thus specify the topology of the organization: they range from 0 to 1. If for example all paths are fully open all the time, then ri,j = 1 for all values of the indices.

The maximum "dit" capacity is a sum over all the sources and destinations:

    R(n) = Σ(i=1..n) Ri(n) = Σ(i=1..n) Σ(j=1..n, j≠i) ri,j R0i,j ≤ Σ(i=1..n) Di    (2)

A detailed evaluation of this function for a large organization may require a complete social network map of who is connected to whom, with values for the dit rates.

The extreme right hand side of Equation (2) portrays a saturation limit. Individual collaborators can become saturated if the maximum rates Di (in dits/unit time) at which they can make or assimilate basic binary decisions are exceeded. Those rates depend on personal and resource characteristics. A binary dit rate Di might be spent on a few complex or many simple management decisions.

The right hand inequality above becomes the upper limit on the "dit" rate when all of the nodes in an organization reach saturation (too many paths operating at full flow rate, for example). R(n) then scales proportionally to n rather than n². If saturation is reached, a dramatic collapse of productivity follows (discussed below). It may happen, for example, through down-sizing, or if an organization grows very large but does not restructure to limit the paths and dit rates per knowledge worker. To avoid saturation the workload must be partitioned onto a larger staff (increasing n) accompanied by structural change.

2.5. Organizational entropy and the decision complexity A(n)

The connection between entropy, information, and choice has been known for a long time, and it follows that decision processes should have an associated entropy. As early as 1894, physicist Ludwig Boltzmann [4] observed that the entropy of a physical system is related to "missing information" inasmuch as it counts the number of alternative ("degenerate") microscopic states of a physical system that might be chosen consistent with a single macroscopic (observable) state. The entropy grows with the size of the phase space (microscopic system state) volume that a system can occupy. When all phase space cells are equally probable the entropy is a maximum. The system is then highly disordered and it takes a lot of information to specify which of the microscopic states it is in. By contrast, the state of a highly ordered system (say, a solid at absolute zero temperature) has low entropy and takes comparatively little information to specify.

When Shannon quantified information for communications, the notion of choice was a key ingredient. The freedom exercised when choosing a symbol from a symbol set led to a definition of information using a function identical to that of statistical entropy. Whenever a symbol sequence is highly predictable, the choice is small and very little information is conveyed.

Choice applies also to destinations for actionable information in an organization. Choice grows as tasks grow in complexity and are partitioned to more nodes, increasing the amount of collaboration. Organizational entropy measures choice ("degeneracy") in the number of collaboration states. A subset of the possible states is being used at any one time on particular knowledge-intensive tasks. When the range of decision network states is large, so is the entropy; the organization is then also something of a general-purpose tool. Conversely, when entropy is small, the organization will probably do a prescribed set of specialized tasks efficiently and others not at all. Like physical entropy, organizational entropy is an extensive quantity: it grows with the size of the decision network.

The price of having the capability to execute a wide range of complicated, multi-person tasks may be a large organizational entropy. That broad capability may impair efficiency when doing simple tasks for which a multi-purpose structure is overkill.
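Equation (2) can be evaluated directly once the topology coefficients ri,j, the pairwise rates R0i,j, and the node limits Di are known. A minimal sketch (names are illustrative, and capping each node at Di is my reading of the saturation inequality):

```python
def max_dit_capacity(r, R0, D):
    """Equation (2): R(n) = sum over i of R_i(n), where
    R_i(n) = sum over j != i of r[i][j] * R0[i][j],
    with each node's contribution capped at its saturation limit D[i]."""
    n = len(D)
    total = 0.0
    for i in range(n):
        Ri = sum(r[i][j] * R0[i][j] for j in range(n) if j != i)
        total += min(Ri, D[i])  # a node cannot exceed its own binary-decision rate D_i
    return total

# Three people, fully connected (r = 1 everywhere), unit pairwise rates:
r = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
R0 = [[1.0] * 3 for _ in range(3)]
print(max_dit_capacity(r, R0, [10.0, 10.0, 10.0]))  # 6.0: R(3) = 6*R0, as in Appendix B
print(max_dit_capacity(r, R0, [1.0, 1.0, 1.0]))     # 3.0: every node saturated at D_i
```

With all nodes saturated the total scales with n (the sum of the Di) rather than n·(n-1), reproducing the scaling change described above.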


The overall decision complexity A(n) is the average number of "dits" per collaborative management decision task. It is a sum over contributions Ai(n) from each of the n decision nodes. Each contribution is the product of the average complexity A0i (pertaining to tasks that node i performs) and a factor Hi that measures the entropy of the decision network that node i sees. The entropy measures structural choice information (in bits). A0i accounts for the content in dits per decision per bit of structure information.

The entropy is found by retracing the thread of Hamming's [5] discussion of Shannon entropy for a communication channel. The ith person chooses one of the remaining n-1 people in the organization every time he issues a decision to the network. Let pi(j) be the conditional probability that person i chooses person j as his target. If the destination is known in advance there is no surprise and no information is needed to know whom person i collaborates with. One of the pi(j) is then equal to 1 and all the other probabilities are zero. If the probabilities are all equal (and small if n is large) then the surprise when one recipient is actually picked is at its maximum and that choice carries significant information. The information needed to pick a recipient is thus related to the inverse of the probability pi(j) for making that choice.

When two decision recipients are chosen independently, the information associated with the joint event should simply be the sum of the information for each separately, viz:

    Info[(si, sj) and (si, sk)] = Info(si, sj) + Info(si, sk)

In the above, the notation (si, sj) represents the event in which source i chooses recipient j. A logarithmic function satisfies all these requirements (2), viz:

    Info(si, sj) ≡ log2(1/pi(j)) = -log2(pi(j))

Zero information is involved when there is no surprise, inasmuch as log2(1) = 0.

The expected value for the entropy hi(j) (information in "dits") associated with the pair (si, sj) is just the conditional probability of choosing j multiplied by the information associated with the choice, i.e.,

    hi(j) = -pi(j) log2(pi(j))

The entropy Hi for all the destinations that person i communicates with is just the sum of hi(j) over j:

    Hi = -Σ(j=1..n, j≠i) pi(j) log2(pi(j))

This is identical in form to physical entropy expressions and to Shannon's information expression [2]. After summing on source nodes, the result for the decision complexity is:

    A(n) = Σ(i=1..n) Ai(n) = Σ(i=1..n) A0i Hi
         = -Σ(i=1..n) A0i Σ(j=1..n, j≠i) pi(j) log2(pi(j))    (3)

For a particular organization, the probabilities might be determined by field surveys and activity reporting - a major undertaking involving mapping out the social network. It would be valuable to develop and use a family of standard modeling coefficients.

(2) With base 2 logarithms, the logarithm of a number is simply the number of binary digits (bits) needed to represent it. One can switch between base 2 and natural (base "e") logarithms without losing generality, apart from the constant 0.6931, which is the natural logarithm of 2, viz.: log2(x) = loge(x)/loge(2) = loge(x)/0.6931

3. Approximations

3.1. A form factor for organization structure

Equations (1), (2), and (3) simplify when an organization can be approximated by a set of decision-makers having identical capabilities and collaborating on tasks of the same inherent complexity. The complexity coefficients A0i factor out and can be replaced by a single average value A0. The elementary dit capacities R0i,j likewise factor out and are replaced by an average R0:

    M(n) = M0 · [Σ(i=1..n) Σ(j=1..n, j≠i) ri,j] / [-Σ(i=1..n) Σ(j=1..n, j≠i) pi(j) log2(pi(j))]    (4)

    where M0 ≡ R0 / A0
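Equation (3) is straightforward to check on small cases. A sketch (the helper names are mine): given each node's recipient probabilities pi(j), it computes the entropies Hi and the decision complexity A(n).

```python
from math import log2

def node_entropy(p_row):
    """H_i = -sum over j of p_i(j) * log2(p_i(j)), skipping zero entries."""
    return -sum(p * log2(p) for p in p_row if p > 0)

def decision_complexity(A0, P):
    """Equation (3): A(n) = sum over i of A0_i * H_i, in dits per decision task.
    A0 is the list of per-node complexity coefficients; P is the matrix of
    conditional probabilities, with P[i][j] = p_i(j) and P[i][i] = 0."""
    return sum(a0 * node_entropy(row) for a0, row in zip(A0, P))

# n = 3, fully connected: each node picks either colleague with p = 1/2,
# so H_i = 1 bit per node and A(3) = 3 * A0 (here each A0_i = 1).
P = [[0.0, 0.5, 0.5], [0.5, 0.0, 0.5], [0.5, 0.5, 0.0]]
print(decision_complexity([1.0, 1.0, 1.0], P))   # 3.0
```

When one recipient has probability 1 (no surprise), the corresponding Hi vanishes, as the derivation above requires.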


The ratio M0 is the maximum management decision flow rate per collaboration channel (below saturation). To the right of M0 is a dimensionless form factor that depends purely on the organization structure and entropy.

With the structure effects separated, equation (4) can be used for comparisons that reflect organization structure alone, and absolute measurements of the coefficients are not essential.

3.2. An exactly solvable model: the fully connected organization

A further approximation is the "fully connected model", which corresponds to setting ri,j = 1 for all i and j in Equation (4). It assumes that paths between all pairs of knowledge workers are open and have equal weighting, and that all decision-makers collaborate with equal probability. The maximum dit rate R(n) reduces to R0·n·(n-1), which approaches R0·n² for large n. All of the conditional probabilities pi(j) are equal to 1/(n-1). The organizational entropy becomes H = n·log2(n-1), while the decision complexity becomes A(n) = A0·H.

This model can be solved exactly, as all terms in the summations are the same, so the sums become trivial. But it tends to overstate the entropy effects. In large organizations this model can most realistically be applied to individual functional units or processes that are then sparsely linked to each other.

The n·log2(n) expression for the entropy can be arrived at another way: that expression is also the combinatorial complexity of the most efficient general method for sorting n objects [6]. The decision network in this model can alternatively be viewed as a sorting machine for management decisions, each of which has complexity A0. The entropy is thus proportional to the sorting time.

The maximum sustainable decision rate M(n) for an organization below the size nS that triggers saturation is:

    M(n) = M0 (n - 1) / log2(n - 1)    (5a)
         ≈ M0 n / log2(n)    for large n ≤ nS

This result grows sub-linearly; that is, more slowly than n. It makes intuitive sense that people work fastest (although not necessarily most effectively) on a task when they can proceed autonomously on their own pieces of it. When there is a synergy or scale economy to be gained by increasing collaborative coupling, it will show up as simplification of individuals' tasks: i.e., as a reduction of the decision complexity coefficients A0i.

Parallel processor arrays obey a superficially similar rule called "Amdahl's Law", which sets an upper limit on the speedup ratio [7] achievable by adding processing nodes. When an algorithm is completely parallelizable the speedup can grow linearly as processors are added, since the nodes never need to wait while another is making decisions. Otherwise the speedup per added processor is less than 1. Amdahl's Law, though, does not apply entropy to a processor network, so the analogy is limited.

The per capita maximum decision rate m(n) = M(n)/n is an organization's productivity limit for knowledge-intensive tasks. It declines as 1/log2(n) for a system below the saturation threshold.

Table (1) shows some relative values of M and m for a range of organization sizes. For example, a one-million-person organization can utilize only about 50,000 times as much management decision capacity as a single individual, neglecting saturation and assuming it is fully connected (the worst case).

Table 1: Form factors for M and m, ignoring saturation

    N          Decision Rate Factor    Per Capita Factor
               (N-1)/log2(N-1)         1/log2(N-1)
    3          2.00                    1.00
    4          1.89                    0.63
    5          2.00                    0.50
    7          2.32                    0.39
    10         2.84                    0.32
    100        14.90                   0.15
    500        56                      0.11
    1,000      100                     0.10
    10,000     753                     0.0753
    100,000    6,020                   0.0602
    200,000    11,356                  0.0568
    500,000    26,411                  0.0528
    1,000,000  50,171                  0.0502

When an organization is "saturated" the total dit rate is limited by n·D0, where D0 is an average node's own maximum internal dit capacity. For fixed R0, the number of nodes nS that initiates saturation, along with organizational thrashing and productivity implosion, satisfies D0 = R0·(nS - 1). Above saturation the total decision rate M(n) actually falls as 1/log2(n) with further growth, due to the increased structural information that increases entropy, viz:
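The form factors in Table 1 follow directly from Equation (5a); a few lines of Python (function names are mine) reproduce the table's columns:

```python
from math import log2

def decision_rate_factor(N):
    """M(N)/M0 = (N-1)/log2(N-1) for the fully connected model below saturation."""
    return (N - 1) / log2(N - 1)

def per_capita_factor(N):
    """1/log2(N-1): the per capita column of Table 1."""
    return 1.0 / log2(N - 1)

for N in (3, 10, 100, 1000, 1_000_000):
    print(f"{N:>9,}  {decision_rate_factor(N):>10.2f}  {per_capita_factor(N):.4f}")
```

The one-million-node case yields a decision rate factor of about 50,171, the figure quoted in the text.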


    M(n) = M0 (nS - 1) / log2(n - 1)    (5b)
         ≈ M0 nS / log2(n)    for large n > nS

The per capita maximum decision rate m(n) declines as 1/(n·log2(n)) in the saturated regime - a factor of n faster than before.

Figure (1) plots m(n) for organizations with populations from 3 nodes up to 1 million nodes. Small organizations growing from a few individuals to about 1,000 experience a ten-fold fall-off in their per capita productivity that should be highly noticeable. Further growth (without saturation) from 1,000 to 1 million knowledge managers would reduce productivity by only about another factor of 2 - smaller, but still with significant economic impact. A person from a small startup firm that is acquired by a large one would feel a sudden culture shock.

Productivity collapses markedly if saturation is reached, as plotted in figure (1) for a range of choices for nS. In a real organization saturation would be reached less suddenly than shown.

3.3. Sparsifying topology, limiting choice (partitioning into sub-units)

The fully connected architecture maximizes entropy. "Sparsifying" the decision paths reduces it; i.e., it subdivides an organization into subunits and lets a small fraction of the people do most of the communication between them. The entropy denominator decreases faster than the "dit capacity" numerator in Equation (1a).

Managers of large organizations often seem to intuitively understand this principle. They try to control decision complexity by subdividing into weakly coupled business units or non-hierarchical process teams, sometimes declaring the intent to emulate "small firm environments". Only a small fraction of the knowledge managers interact across unit boundaries.

Sparsification may not gain as much efficiency as is hoped for if the people who handle external decision interfaces exceed their "saturation" limits Di. This may happen easily. One estimate [8] of a typical human's conventional communication bandwidth capacity is 50 bits/sec - about 250 English words per minute (3). A human's maximum "dit" rate D0 is likely to be much smaller.

(3) Shannon and others estimate about 2.6 bits/letter and about 4.5 letters/word for English.

4. Applications

One way of applying this work to re-engineering is to use the simple modeling results as heuristics in conjunction with a set of rules. Some strategies for improving knowledge managers' productivity that are consistent with these results have been used intuitively, but in an ad hoc way without benefit of a quantitative rationale:

- Reduce choice by creating specialized, dedicated organizations whenever they can be justified. Define and automate workflows so that they follow customer-centric processes and cross "silo" boundaries.

- Match organization size and structure to the complexity of the task.

- Use information hiding aggressively. Knowledge managers should work at a level of abstraction where many low-level decisions are not seen.

- Use knowledge management systems to avoid "reinventing the wheel", especially in large firms.

- Alter the rewards system so that managers have an incentive to lower entropy. Decouple pay from the number of people supervised.

As experience provides knowledge of the model's coefficients, quantitative tools may emerge to make tuning organizations for high performance a much less chancy and ad hoc process. For example, the metrics introduced here may be used as components of cost functions in linear programming methods (4).

(4) U.S. patent pending.

In commercial markets, efficiency in managing knowledge increasingly determines competitive advantage. Bigness can be a dis-economy of scale if large, functionally diverse, general-purpose organizations compete with focused, low-entropy firms in markets where intellectual labor costs drive product economics. Where traditional scale economies (like manufacturing or distribution) are dominant, knowledge managers' productivity has little leverage on bottom-line competitiveness, and complex, high-entropy organizations may nonetheless enjoy critical strategic advantages.

Defense applications include improving the speed and productivity of command and control, intelligence and sensor fusion, and real-time battle management. These functions all depend on rapid execution of many complex decisions by networks of skilled human decision-makers, who are a limited resource.
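The curves of Figure (1) come from Equations (5a) and (5b). A sketch of the per capita form factor in both regimes (the function name and signature are mine):

```python
from math import log2

def m_per_capita(n, n_sat=None):
    """m(n)/M0 in the fully connected model (requires n >= 3).
    Below saturation, eq (5a): (n-1) / (n * log2(n-1)).
    Above the threshold n_sat, eq (5b): (n_sat-1) / (n * log2(n-1))."""
    numerator = (n - 1) if (n_sat is None or n <= n_sat) else (n_sat - 1)
    return numerator / (n * log2(n - 1))

# Per capita form factor declines from 0.67 at n = 3 to about 0.10 at n = 1,000:
print(round(m_per_capita(3), 2))       # 0.67
print(round(m_per_capita(1000), 2))    # 0.1
# Saturation at n_sat = 1,000 makes further growth fall off a factor of n faster:
print(m_per_capita(10_000, n_sat=1000) < m_per_capita(10_000))   # True
```

Plotting m_per_capita over n from 3 to 10^6 for several values of n_sat reproduces the family of curves shown in Figure (1).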


In "entropy-based warfare" [3] the goal is the opposite one of degrading all these functions for the enemy; extensions of this work may help to counter the effectiveness of combat or terror organizations.

Network entropy probably applies to any control system, not only those consisting of human decision elements. Examples may include whole societies and neural networks.

References:

1. Beer, Stafford: "Brain of the Firm", Herder & Herder, N.Y., 1972 and later editions, pp. 58-59.

2. Shannon, Claude E. and Warren Weaver: "The Mathematical Theory of Communication", University of Illinois Press, 1998, page 58. Reprinted from an article with the same title in the Bell System Technical Journal of July and October 1948.

3. Herman, Mark: "Entropy-Based Warfare - Modeling the Revolution in Military Affairs", Joint Forces Quarterly, Autumn/Winter 1998-9, pages 85-90.

4. Tolman, Richard C.: "The Principles of Statistical Mechanics", Oxford University Press, 1938, chapter VI.

5. Hamming, Richard W.: "Coding and Information Theory", Prentice-Hall, 1980, page 101ff.

6. Aho, Alfred V., John E. Hopcroft, & Jeffrey D. Ullman: "The Design and Analysis of Computer Algorithms", Addison-Wesley, 1974, page 77. The sorting literature most often uses base 2 logarithms, which differ from natural logarithms by a factor of loge(2) ≈ 0.6931.

7. Amdahl, G.M.: "Validity of the single processor approach to achieving large scale computing capabilities", AFIPS Conference Proceedings (AFIPS Press, Reston, VA), 1967, pages 483-485.

8. Lucky, Robert W.: "Silicon Dreams: Information, Man and Machine", St. Martin's Press, N.Y., 1989, page 33. Here is the quote: "After Claude Shannon and others conceived the principles of information theory in the late 1940s, a number of studies were conducted to determine the channel capacity of a human being...It seems that a human being -- you and I lest there be any doubt -- is only good for about 50 bits per second of input or output. That is all the information that we are capable of taking in or putting out."
[Figure 1 appears here: a log-log plot of the relative per-capita decision rate versus N = number of decision-makers, for N from 1 to 10^6 and rates from 1 down to 10^-6, showing a "No Saturation" curve together with curves for saturation at N = 100, 1000, 10K, and 100K.]

Figure 1: The per-capita decision rate m(N) for organizations in the fully connected model, as a function of size and of saturation threshold NS
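The no-saturation curve in Figure 1 can be generated with a short script. This is a hedged sketch rather than the paper's own code: it assumes the fully connected model's maximum rate takes the form M(N) = M0 (N-1)/log2(N-1), a form consistent with the Appendix B result M(3) = 2M0 and with the ~1/log2(N) per capita scaling quoted in the abstract. The saturated curves depend on the threshold model of the main text and are not reproduced here.

```python
import math

def per_capita_rate(n: int, m0: float = 1.0) -> float:
    """Relative per-capita decision rate m(N) = M(N)/N for the fully
    connected model with no saturation.

    Assumes M(N) = m0 * (N-1) / log2(N-1), which reproduces the
    Appendix B result M(3) = 2*M0 and falls off roughly as 1/log2(N)
    per capita, as stated in the abstract. Defined for N >= 3.
    """
    if n < 3:
        raise ValueError("the model requires at least 3 decision-makers")
    return m0 * (n - 1) / (n * math.log2(n - 1))

# Tabulate the no-saturation curve over the range plotted in Figure 1:
for n in (3, 10, 100, 1000, 10**4, 10**5, 10**6):
    print(f"N = {n:>7d}   m(N)/M0 = {per_capita_rate(n):.4f}")
```

On log-log axes this curve declines slowly, roughly as 1/log2(N), matching the gentle fall of the "No Saturation" curve in Figure 1; the saturated curves break downward much more steeply past their thresholds.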




Appendix A: The choice of a definition for M(n)

An organization is viewed abstractly as a decision channel whose throughput is limited by the maximum management decision flow rate M(n), defined simply as the total maximum dit capacity R(n) divided by the total decision complexity A(n). R(n) and A(n) are both always positive and they grow as n grows. M(n) has dimensions of decisions/unit time.

The defining equation for M(n) is written below. On the far right hand side the summation over source nodes highlights a point: each term Ri(n)/A(n) in the sum is the maximum (average) decision rate seen by the i'th node, with the total decision complexity A(n) as its denominator:

    M(n) = R(n)/A(n) = Σ_{i=1}^{n} [Ri(n)/A(n)]        (1')

The entropy of the entire organization limits the decision rate at node i, not just the contribution due to decisions at node i alone, which is smaller by roughly a factor of n. This makes intuitive sense inasmuch as the individual nodes are working on pieces of tasks that were distributed to the nodes.

An alternative definition that was rejected in favor of the above was:

    X(n) = Σ_{i=1}^{n} [Ri(n)/Ai(n)]        (1'x)

In this summation, each term is a decision rate for the i'th node that includes only the node's own decision complexity Ai(n) in the denominator. If this were the definition of M(n), the upper limit on per capita decision rates would grow with organizational size and complexity rather than decrease; i.e., we would observe synergy rather than antisynergy as the result of growth.

Such results would be a priori absurd and contrary to experience: decision-making would become faster and simpler as the tasks and the organization grow more complex. The results would also be inconsistent with a related common-sense boundary condition called Amdahl's Law [7], which simply says that an array of n parallel processors working on pieces of a problem cannot speed things up by more than a factor of n. Normally, the speed-up factor is much less than n on account of inherently serial points in a problem's algorithms - often points where decisions must be made. The largest (n-fold) gain is realized when pieces of a problem can be solved completely independently. For organizations working on tasks that have been partitioned onto decision-makers, the same limit and logic applies.

Appendix B: The smallest fully connected organization

Suppose there are just three people in an organization with all of the connections open and equally weighted. As a result, ri,j = 1 for each of the n(n-1) = 6 terms in the double summation of Equation (2). The maximum dit rate is R(3) = 6R0 dits/unit time.

The decision complexity is calculated using Equation (3). Any one of the 3 people can select two others as recipients. If both have equal probability of being chosen, pi(j) = 1/2 and log2(pi(j)) = -1, and thus Hi = 1; there is just one dit (or bit) of information in this choice, since there are 2 destinations. The total organizational entropy given by Equation (3) is H(3) = 3, since there are 3 information sources.

The decision complexity is A(3) = 3A0 dits/decision. The maximum decision rate is M(3) = 2M0 decisions/unit time, using the results above for R(3) and A(3).

Three persons is the smallest organization that can be treated using Equation (1). If n = 2 there is no choice of topology for mapping problems onto nodes, pi(j) = 1, the entropy function becomes zero, and Equation (4) fails to be mathematically well behaved.

Three-person organizations might be used as an experimental tool to evaluate the coefficient M0: it is simply one half of the maximum decision rate.
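The three-person calculation in Appendix B can be checked directly. This is a minimal sketch assuming unit coefficients R0 = A0 = 1 (so that M0 = R0/A0 = 1, an inferred normalization not spelled out in the text), with the entropy and rate sums written out explicitly:

```python
import math

# Worked check of Appendix B: the smallest fully connected organization
# (n = 3, all links open and equally weighted). R0 and A0 are the
# per-link dit rate and complexity coefficients; set to 1 here.
n = 3
R0, A0 = 1.0, 1.0

# Equation (2): maximum dit rate with r_ij = 1 over all n(n-1) = 6
# ordered pairs of distinct nodes.
R = sum(1.0 for i in range(n) for j in range(n) if i != j) * R0

# Equation (3): each node chooses among n-1 equally likely recipients,
# giving H_i = log2(n-1) = 1 dit per node and H(3) = 3 dits in total.
p = 1.0 / (n - 1)
H_i = -sum(p * math.log2(p) for _ in range(n - 1))
H = n * H_i
A = H * A0          # decision complexity A(3) = 3*A0 dits/decision

# Maximum decision rate: M(3) = R(3)/A(3) = 6*R0 / (3*A0) = 2*M0.
M = R / A
print(f"R(3) = {R}, H(3) = {H} dits, M(3) = {M} (in units of M0)")
```

The printed values reproduce the appendix: R(3) = 6, H(3) = 3 dits, and M(3) = 2, i.e., twice the coefficient M0, consistent with the closing remark that M0 is one half of the maximum decision rate.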




About the author:

Rich Janow is the principal of The Analytic Solutions Group, LLC, consulting with government, industry, and
academic organizations. He likes working on cross-disciplinary problems at the intersection of R&D,
information systems, business and technology strategy, physical science, and disruptive technology. He is a
faculty member in the Applied Physics Department at New Jersey Institute of Technology.
Rich's association with advanced technology markets and strategy, roadmaps, assessment, technological forecasting, and futures includes 18 years at Bell Laboratories, 4 years as an executive in a high technology company, and applied research in computer science, condensed matter, and surface physics. He holds a Ph.D. from CUNY and an A.B. degree from Columbia College, both in Physics, and resides in South Orange, New Jersey.
He may be reached at janow@att.net, janow@njit.edu or by phone at (973) 762-4987.

