International Journal of Intelligence and CounterIntelligence, 16: 609–637, 2003
Copyright © Taylor & Francis Inc.
ISSN: 0885-0607 print/1521-0561 online
DOI: 10.1080/08850600390198779
STEPHEN MARRIN
CIA’s Kent School: Improving Training
for New Analysts
The Central Intelligence Agency’s (CIA’s) Career Analyst Program (CAP)
for new analysts seeks to increase their on-the-job effectiveness and ability
to produce more accurate analysis. The program is located within the
Sherman Kent School for Intelligence Analysis, which CIA’s senior
managers created in 2000 to increase the expertise of officers within its
Directorate of Intelligence (DI). The school provides analyst and
managerial training for the DI, as well as housing the Kent Center, which
acquires and disseminates information regarding analytic ‘best practices.’1
According to an Agency press release, the CAP is CIA’s ‘‘first
comprehensive training program for professional intelligence analysts,’’
and is a noticeable improvement on prior training efforts.2 The CAP
provides new analysts with the knowledge and skills that enable them to
be more effective in the production of finished intelligence, as well as in
their overall job performance. It also provides them with the means to
produce more accurate analysis by teaching them the causes of — and
means to avoid— intelligence failure, and cognitive tools to assist them in
structuring their analysis more along the lines of the scientific method.
Whether the CAP will lead to an improvement in the quality of the CIA’s
analytic output will depend on whether analysts have the opportunity to
apply their newly acquired expertise when back on the job after training.
Ultimately, the institutional assessments of analytic production processes
will determine whether analytic training programs allow the CIA to
Stephen Marrin, a former Central Intelligence Agency analyst and contractor,
is in the doctoral program at the University of Virginia, specializing in
intelligence studies. An earlier version of this article was presented at the
annual meetings of the Intelligence Studies Section of the International
Studies Association, New Orleans, Louisiana, in March 2002.
improve its production of accurate, timely, and tailored intelligence that fits
the needs of the Agency’s national security policymaking customers.
IMPROVING THE CIA’S ANALYSES
Just as every reform is intended to be a solution to a particular problem, the
CIA’s leaders created the Kent School as a way to address the Agency’s
analytic weaknesses. In May 1998, Director of Central Intelligence (DCI)
George J. Tenet ‘‘gave a classified internal briefing on the CIA’s problems
and what he intends to do about them,’’ according to an article in US
News and World Report.3 In articulating his vision for the future of the
CIA — a vision that became known as his ‘Strategic Direction’ — Tenet
emphasized the importance of changing past practices to improve the
Agency’s contribution to national security policymaking. A year later, on
18 October 1999, he told an audience at Georgetown University in
Washington, D.C.:
[When] I launched a Strategic Direction plan for the CIA . . . I told our
people that we had to take charge of our destiny. That we would do all
within our power to maintain our edge and our vibrancy. That we had
to streamline and realign ourselves and adopt business practices like
the best in the private sector. That we would think big and think
different. That we would work smarter and in new ways so that we
would have the agility and the resiliency to do what the President —
this President or a future President — wants and the American people
expect.4
One week after that May 1998 briefing, the CIA failed to warn American policymakers
of India’s intention to test nuclear weapons. This failure highlighted the
Agency’s limitations and likely accelerated the implementation of reforms
stemming from Tenet’s Strategic Direction plan.5 Post hoc analyses cited
many factors that may have contributed to the failure, but notable was
the DI’s ‘‘lack of critical thinking and analytic rigor,’’ which prevented it
from piecing together all the indications of a possible nuclear test.6,7 According to The
Wall Street Journal’s Carla Anne Robbins, Admiral David Jeremiah—who
headed the official investigation into the failure — ‘‘recommended hiring
more analysts, improving their training and increasing contact with outside
experts to challenge conventional wisdom.’’8 She also reported that DCI
Tenet said he would ‘‘make it my highest priority to implement [Admiral
Jeremiah’s recommendations] as quickly as possible.’’9
Tenet subsequently commissioned a number of task forces to implement
his Strategic Direction. The ‘‘Analytic Depth and Expertise Task Force,’’
made up of eight DI officers, was given free rein to investigate ways to
‘‘develop true world class experts’’ consistent with the Strategic Direction’s
goals, according to Kent School program director Denis Stadther.10 The
officers met several times a week through the latter half of 1998,
brainstorming ideas to increase the expertise of junior, mid-level, and
senior analysts. The task force ruled out a training course for mid-level
officers because they believed that the broadening experience provided by
rotational assignments, such as those overseas or to policymaking
institutions, would provide the greatest value to the Agency. The task force
also ruled out training for senior analysts. It noted that the DI’s career
paths provided greater advancement possibilities for senior analysts willing
to switch to management, and as a result the more ambitious analysts
became managers rather than continuing to use their knowledge in an
analytic capacity. Therefore, the task force proposed the creation of a
more desirable career track in which senior analysts could continue to use
their expertise rather than shift into management. Finally, the task force
concluded that junior officers would be best served with a training
program intended to ‘‘build a common foundation upon which analytic
expertise could be built.’’
The task force submitted two recommendations to then-Deputy Director for
Intelligence (DDI) John E. McLaughlin, who approved both, and subsequently attributed their
implementation to DCI Tenet’s support and tenure that ‘‘lasted longer
than the ‘task force phase,’’’ according to Vernon Loeb of the Washington
Post.11 In 2000, the career track allowing ‘‘analysts to rise to very senior
rank without branching out into management’’ took form as the Senior
Analytic Service.12 Also that year, the improved new analyst program
began operating through the newly created Sherman Kent School. The
school was named for Kent, chairman of the CIA’s Board of National
Estimates from 1952 to 1967, and considered ‘‘a leader in developing the
profession of intelligence analysis.’’13 At the dedication of the Kent School
in May 2000, Tenet praised Kent as a ‘‘brilliant teacher to generations of
analysts,’’ and expressed his wish that ‘‘this school . . . [will] always produce
analysts of whom he would be proud.’’14
MECHANISM OF IMPROVEMENT
The Kent School was created to improve CIA’s analytic quality, in part by
increasing the expertise of its analysts. At the opening ceremonies, Tenet
stated that the Kent School ‘‘will prepare generations of men and women
for the . . . profession of intelligence analysis in the 21st Century . . . [by
teaching] the best of what we as an Agency have learned about the craft of
analysis.’’15 According to Martin Petersen, director of the CIA’s human
resources department, the Kent School’s creation ‘‘sends a very, very
powerful message about what you value, and the value here . . . is analytic
expertise.’’16 The Kent School improves the CIA’s analytic production by
providing analysts with knowledge that they otherwise would acquire
haphazardly or not at all in on-the-job training.
Its mission of increasing analyst expertise corresponds with the efforts of
many other professions, and even other intelligence organizations, that
train employees as part and parcel of organizational improvement
programs. Training programs, in general, provide information specific to
the needs of the institution, and different institutions within the
governmental foreign policy process use training programs to bolster their
unique informational needs. For example:
The State Department’s Foreign Service Institute provides area and language
studies helpful to outgoing State Department Foreign Service Officers and
other government officials;
The Department of Defense’s (DOD) command colleges — including the Army
War College, Naval War College, and National Defense University among
others — provide rising military officers with a more advanced conceptual and
informational context within which to make command decisions;
The Joint Military Intelligence Training Center (JMITC) specializes in providing
short-course training to the DOD’s military intelligence analysts, according to
Ken Olson, the chief of JMITC’s General Intelligence Training Branch. Its
companion institution, the Joint Military Intelligence College (JMIC), provides
broader education consisting of degree and certificate programs in intelligence
at the graduate and undergraduate level.17
Training is intended to increase expertise, which is ‘‘the skill of an expert,’’
and this skill is crucial to the production of accurate intelligence analysis.18
‘‘In the lexicon of US intelligence professionals,’’ observes intelligence
scholar Loch K. Johnson, ‘‘analysis’ refers to the interpretation by experts
of unevaluated (‘raw’) information collected by the intelligence
community.’’19 The more expert an analyst is, the more likely that he or
she will produce a higher quality product. DCI Tenet described the
importance of an analyst’s expertise at the Kent School’s dedication:
In our [DI] it is not enough just to make the right call. That takes luck.
You have to make the right call for the right reasons. That takes
expertise. It is expertise — built up through study and experience — that
combines with relevance and rigor to produce something that is very
important: insight. . . . [O]ur analysts blend a scholar’s mastery of detail
with a reporter’s sense of urgency and clarity. At its best, the result is
insight. And it is insight that wins the confidence of our customers and
makes them want to read our publications and listen to our briefings.20
The DI’s exploitation of analytic expertise in its production of finished
intelligence entails the integration of a complex web of analytic specialties
to produce multi-disciplinary analysis. Most analysts are hired not by the
CIA at large, or even by the DI, but by individual DI offices; they are assigned to
‘‘groups’’ which cover specific geographic areas, and then assigned a
functional specialty — ‘‘discipline’’ or ‘‘occupation’’ in DI terminology —
such as political, military, economic, leadership, or scientific, technical, and
weapons intelligence.21 For the most part, CIA analysts possess a very
small area of direct responsibility, defined by a combination of their
regional area and discipline, work in country ‘‘teams’’ with analysts of
other disciplines, and interact with other regional or disciplinary specialists
as the need arises. Therefore, an analyst’s expertise can vary, depending on
the relative possession of regional knowledge, disciplinary theory, and
intelligence methods in general:
Regional expertise is essentially area studies: a combination of the geography,
history, sociology, and political structures of a specific geographic region. The
DI’s regional offices, such as the Office of Near East, South Asia and Africa,
are responsible for an analyst’s regional expertise and develop it by providing
access to language training, regional familiarization through university courses,
or in-house seminars.
Procedural expertise is knowledge of ‘‘tradecraft’’: the sometimes unique methods
and processes required to produce intelligence analysis. The Kent School is the
primary repository for tradecraft information and its training programs are the
primary means of distribution to both analysts and managers.
Disciplinary expertise is the theory and practice that underlies the individual
analytic occupations. For example, political, military, economic, and
leadership analysis are all built on beds of theory derived from the academic
disciplines of political science, military science, economics, and political
psychology, respectively. Disciplinary expertise can be acquired through
previous academic coursework, on-the-job training, and through occupational
‘‘representatives’’ distributed throughout the offices as focal points for
disciplinary resources.
The CIA’s small analytic niches create specialists, but their specialties must
be reintegrated in order to provide high-level policymakers with a bigger
picture that is more accurate and balanced than the limited perspective or
knowledge of the niche analyst. This process of reintegration includes a
procedure known as ‘‘coordination,’’ in DI parlance, which allows analysts
of all kinds to weigh in with their niche expertise on pieces of finished
intelligence before they are disseminated. According to CIA analyst Frank
Watanabe: ‘‘We coordinate to ensure a corporate product and to bring the
substantive expertise of others to bear.’’22 Accordingly, the bureaucratic
norm is that an analyst will make every effort to coordinate a draft with
other analysts in related regional or disciplinary accounts prior to
submitting the draft to management for editing and dissemination. As a
result, while the expertise of the primary drafter of the piece of intelligence
analysis usually has primary influence on the accuracy of the final piece,
the coordination process exerts a strong influence as well.
Inaccuracies in the analytic product can occur as a result of insufficient
substantive and disciplinary expertise. Columbia University Professor
Robert Jervis points out that ‘‘a grav(e) danger lies in not having sufficient
expertise about an area of a problem to detect and interpret important
trends and developments. To make up for such deficiency, analysts tend to
impose on the information the concepts, models, and beliefs that they have
derived elsewhere.’’23 In addition, in 1991 former DCI Stansfield Turner
noted that: ‘‘Another reason for many of the analytic shortcomings is that
our analytical agencies do not have an adequate grasp of the cultures of
many countries with which we must deal.’’ He goes on to suggest that
analysts could use ‘‘a better opportunity to attend academic institutions,
participate in professional conferences, travel and live abroad, acquire
language skills and thus become true experts in their areas.’’24 Robert
Jervis adds ‘‘adequate training’’ and ‘‘first-hand exposure to the country’’
as additional methods to increase expertise.25
Expertise is not necessarily sufficient to produce accuracy or prevent
failure, though. Jervis notes that: ‘‘experts will [not] necessarily get the
right answers. Indeed, the parochialism of those who know all the facts
about a particular country that they consider to be unique, but lack the
conceptual tools for making sense of what they see, is well known.’’26
In addition, ‘‘even if the organizational problems . . . and perceptual
impediments to accurate perception were remedied or removed, we could
not expect an enormous increase in our ability to predict events’’ because
‘‘the impediments to understanding our world are so great’’ that
‘‘intelligence will often reach incorrect conclusions.’’27 Human cognitive
limitations require analysts to simplify reality through the analytic process,
but reality simplified is no longer reality. As a result, ‘‘even experts can be
wrong because their expertise is based on rules which are at best blunt
approximations of reality. In the end any analytic judgment will be an
approximation of the real world and therefore subject to some amount of
error,’’ and analytic inaccuracies—and sometimes intelligence failure—will
be inevitable.28,29
In sum, analyst expertise is an important component of analytic quality,
but even if the Kent School’s analytic training were ideal it could never
increase expertise to the point that perfection would be attainable. In
addition, the CIA’s interdependent analytic structure implies that training
can increase aggregate capabilities only through the cumulative effects of
multiple kinds of targeted training. Even so, while these factors limit the
Kent School’s potential to improve analytic quality, they do not preclude
improvement at the margins. If training can improve analytic quality it
does so by increasing the knowledge and skills at the analyst’s disposal.
The CIA’s intelligence analysts require an understanding of their role
within the foreign policy process, tools that help them structure and
analyze complicated issues, a theoretical framework to approach an issue
from a specific disciplinary perspective, such as political or economic
analysis, and the presentational skills necessary for busy policymakers to
incorporate their analysis into decisionmaking processes. Yet, prior to the Kent
School’s creation, new analysts were not receiving the information they
needed to do their jobs effectively.
OVERCOMING PREVIOUS TRAINING DEFICIENCIES
The CIA’s central role in providing high-level policymakers with analysis of
national security threats and opportunities might be expected to require
stringent training for new analysts prior to conducting analysis and writing
reports for senior policymakers. In fact, CIA analysts have been hired and
assigned a desk at Agency headquarters without any analytic training
whatsoever, with two or four weeks of analytic training coming six months
or so later.30 For example, in 1996 the introduction to the CIA for new
analysts consisted of just one week of pro forma briefings regarding
security and other administrative issues, with nothing specific to analysis.
Within weeks, and in some cases days, these newly hired analysts were
producing finished intelligence—albeit limited and under the careful watch
of more senior analysts — without any official training whatsoever in
substance or methods.
This minimal formal training appears to have historical roots in the CIA.
According to former DDI and current Deputy Director of Central
Intelligence (DDCI) John McLaughlin: ‘‘Dick Lehman, the creator of what
we now call the President’s Daily Brief, once remarked that the basic
analytic training he got back in 1949 came down to a single piece of advice
from his boss: ‘Whatever you do, just remember one thing — the Soviet
Union is up to no good!’ Simple, but that said it.’’31 In addition, Mark
Lowenthal, a former staff director of the House Permanent Select
Committee on Intelligence, has said that CIA does not ‘‘do a lot of
training. . . . They say, ‘Congratulations, you’re the Mali analyst, have a
nice day.’’’32 The training process usually relied upon the analyst’s prior
formal education, combined with an initial period of sink-or-swim
adaptation to the DI. The sink-or-swim analogy is used frequently inside
the Agency to describe its junior analyst acculturation. In May 2001, the
Kent School’s former dean, Frans Bax, likened previous DI training to
being thrown into the deep end of a pool, and added that if the training or
mentoring ‘‘missed,’’ the analyst ‘‘floundered.’’33
In 1996, the CIA improved its training for junior analysts with the creation
of a month-long survey course for new analysts entitled ‘‘Fundamentals of
DI Tradecraft’’ (FDIT). Some of FDIT’s content was based on a
tradecraft course developed under the auspices of former DDI Douglas
J. MacEachin, and delivered to all analysts and managers.34 Nonetheless,
although FDIT was intended only as a limited survey course, it received a
fair amount of internal criticism in its first few iterations owing to the slow
pace and inconsistent quality of instruction.35 It also demonstrated that a
mere month of training was too short to familiarize analysts with the
range of information that they needed to know to do their jobs correctly.
But these weaknesses became strengths, as the criticisms may have
provided the Analytic Depth and Expertise task force with indicators that
there was greater potential inherent in the new analyst training than was
being actualized through FDIT. In the end, the task force was able to use
the lessons derived from FDIT to inform the structure of the Kent School
and the content of its new analyst training program.
One member of the Analytic Depth and Expertise task force had been
intimately involved in FDIT’s development as a DI representative to the
CIA’s Office of Training and Education (OTE), and he contributed the
lessons he learned from this to the task force.36 After deciding to
recommend training for junior analysts, the task force began to prepare
the outlines of a proposal. Prior DI training programs run through OTE,
located in CIA’s Directorate of Administration, had difficulty obtaining
qualified instructors from the DI because of the prevalent belief that
non-analytic assignments in OTE were not career-enhancing. In addition,
OTE’s separate structure led to difficulties providing training tailored to
the DI’s needs. To overcome these problems, the task force recommended
that training be run through the DI so that it could be better integrated
with the DI’s requirements. Also suggested was a school structure that
would ensure continuity and a knowledge base for future training efforts.
A final suggestion was that the school be headed by a Senior Intelligence
Service officer with a presence on the DI’s corporate board in order to
provide greater ‘‘bureaucratic weight and budgetary authority’’ for the
school to meet its goals.37 Then-DDI McLaughlin implemented all the
suggestions. As a result, the Kent School’s location within the DI and
bureaucratic structure provide it with a solid base from which to deliver its
informational product.
Although the CAP used lessons derived from FDIT, it was, according to
Denis Stadther, ‘‘a clean-sheet exercise in developing new training that was
founded in the belief that comprehensive training for new analysts must
go beyond skills development to include history, mission, and values of
the DI as well as orientation to specialized knowledge and skills like
operations, intelligence collection, and counterintelligence issues.’’38
Comprehensive training could not be shoehorned into a one-month course
akin to FDIT, however, and the CAP’s length became a matter of some
dispute. According to the school’s former dean, Frans Bax, DCI Tenet had
‘‘chided the DI for not taking training as seriously as the [CIA’s
Directorate of Operations or DO]’’ — which has a year-long program for
trainees — and ‘‘instructed the DI to create a program as long as the
DO’s.’’39 As a result, the Analytic Depth and Expertise task force initially
proposed a ten-month long training program. But this was opposed by the
DI offices that were facing tight personnel resource constraints and
reluctant to lose their new analysts for that long. To accommodate their
concerns, the program was reduced from ten months to twenty-five weeks,
and yet retained ‘‘all [its] training components’’ by providing analysts with
fewer interim assignments to other components.40 The CAP’s curriculum
is continually revised on the basis of analyst and manager feedback to
make the program more effective. According to Denis Stadther, between
2000 and 2002, the CAP’s length was reduced to twenty-two weeks, as the
curriculum was refined in response to student feedback and a
comprehensive review of the curriculum by the CAP faculty.
CAP’S TRAINING CURRICULUM41
By February 2002, the CAP had become a twenty-two-week program which
‘‘teaches newly hired analysts about the CIA’s and the DI’s history and
values and develops the basic skills essential to an intelligence analyst’s
successful career in the directorate.’’42 The initial week entails an
introduction to intelligence topics, including the history, mission, and
values of the CIA, and a unit on the history and literature of intelligence
that is taught by the history staff of the Agency’s Center for the Study of
Intelligence.43 The following few weeks introduce analysts to a variety of
DI skills, including analytic thinking, writing and self-editing, briefing,
data analysis techniques, and a teamwork exercise akin to an ‘‘Outward
Bound’’ for analysts. Training in these kinds of basic skills is necessary,
even for CIA analysts with advanced degrees, because intelligence analysis
focuses on the informational needs of policymakers who ask different
questions than those encountered in academia. Also, briefing skills are not
emphasized in academia but are vital for the effective distribution of
information within government. In addition, analysts are not accustomed
to writing in the specialized writing style of the DI, which emphasizes
bottom-line judgments at the beginning of the piece, while strategically
using evidence to bolster the main conclusions. This training is reinforced
throughout the CAP, as analysts write an intelligence paper in which they
self-consciously apply lessons learned to a real-world analytic problem.44
The first five weeks of training culminate in a task force exercise that
provides analysts with the opportunity to take the skills they have learned
and practiced in a classroom setting and apply them in a more realistic one.
After five weeks in the classroom the CAP students go on a four-week
interim assignment that helps them understand how the DI relates to other
components of the CIA, the intelligence community, and the policymaking
agencies they serve. The analysts — in consultation with their managers —
identify positions within the bureaucracy that are related to their
assignments; these rotations provide working knowledge of units that each
analyst will either work with or rely upon throughout his or her career.45 In particular,
seeing how intelligence analysis is viewed and used in other components of
the policymaking bureaucracy can help new analysts understand their role
and perform it better, according to Denis Stadther. These interim
assignments can entail a stint at a military command—such as the Pacific
Command headquartered at Pearl Harbor — which provides the analyst
with insight into working with a military customer. These assignments can
have other benefits as well. In early 2001, DDCI McLaughlin spoke
enthusiastically of the insight that a new analyst can acquire from visiting
Pearl Harbor and seeing first-hand the reason the CIA was created, and
how he or she fits into a fifty-year tradition of working to prevent similar
destruction.46
After this four-week interim assignment, the analysts return to the
classroom for four more weeks of training in more advanced topics, such
as writing and editing longer papers and topical modules that address
issues such as denial and deception, and indicators and warning.47
According to Stadther, these are special kinds of analysis that require
application of more advanced and sophisticated analytic tradecraft
techniques. The analysts also broaden their knowledge of other positions
within the intelligence community by visiting other intelligence agencies
and participating in an ‘‘ops familiarization course’’ which provides a brief
exposure to the intense training that a DO case officer receives.48 The
analysts are also provided with greater information on tradecraft topics —
consisting of concepts, techniques, and methodologies—that form the core
of the intelligence analysts’ specialized skill set, which they can use to
rigorously and self-consciously analyze raw intelligence data. This skill set
includes adapting the study of intelligence issues to the needs of the user,
presenting complicated issues in simple terms stripped of long-winded
assumptions, qualifications, and background, and addressing issues related
to controversial foreign policies and policy goals without taking a policy
position.49 Specifically, the CAP uses Sherman Kent’s ‘‘Principles for
Intelligence Analysis’’ as guidelines for its students to follow as they learn
the craft of analysis.50 These guidelines — listed here in full in the
Appendix — emphasize the importance of intellectual rigor, the conscious
effort to avoid analytical biases, a willingness to consider other judgments,
and the systematic use of outside experts as a check on in-house blinders,
well as the importance of candidly admitting analytic shortcomings and
learning from mistakes.
The CAP also teaches other tradecraft skills, such as alternative analysis
techniques including, but not limited to, scenario building and competitive
‘‘A/B Team’’ approaches that entail periodic checks on analysis by
external experts.51,52 In addition, the CAP assigns Richards Heuer’s
writings to provide analysts with tips and techniques for overcoming flaws
and limitations in the human thinking process.53 The CAP also uses
analytic failure as a teaching tool for what not to do. Since much of the
intelligence analysis tradecraft has been derived from intelligence failure,
‘‘the lessons of . . . high-profile CIA screw-ups form the core of the Kent
School curriculum,’’ according to reporter Bob Drogin of the Los Angeles
Times. Drogin quotes the school’s dean as saying, ‘‘We spend a lot of time
in this course studying mistakes.’’54
The analysts then go on a second four-week interim assignment, and come
back for a final four-week classroom session that deals with more advanced
topics. For example, a session on politicization and professional ethics
teaches analysts ‘‘how to walk the fine line that separates effective
intelligence support for policymakers from prescribing policy,’’ according
to Stadther. Another session, entitled ‘‘Writing for the President,’’
taught by senior DI editors, provides insight and lessons learned in writing
for the CIA’s most senior and demanding policymakers. The CAP also
provides new analysts with a ‘‘roundtable on customer relations’’ in which
experienced analysts provide their hard-earned lessons in dealing with
Congress, the public, and the press, and a ‘‘secrets of success panel’’
providing tips of what does and does not work from analysts with a few
years of experience.
Once again, the classroom session ends with a task force exercise. This time
it entails a two-day terrorist crisis simulation that takes place outside the
classroom, and requires total immersion to handle the fifteen-hour days.
The CAP instructors, using this as an opportunity to present analysts with
situations they might see later in their careers, spike the simulation with
dilemmas to force the analysts to improvise and apply the skills they
learned throughout the course. As reporter Bob Drogin notes, it is
‘‘deliberately designed to be intensive and stressful.’’55 Finally, at the end
of the program, the CAP provides written evaluations of the students to
their home offices, documenting their skills, abilities, and performance
during training.56 In sum, the CAP’s twenty-two weeks provide analysts
with information unique to the realm of intelligence analysis, as well as
‘‘greater opportunity for individual growth in understanding of the
profession, and hopefully a greater self-conscious rigor in the use of
analytical tools.’’57
In addition to more rigorous content, the CAP utilizes interactive teaching
methods to maximize analyst learning and acquisition of expertise.
The training is intended to improve analyst performance. According to Kent
School Dean Dan Wagner, rather than risk student disengagement from
abstract lecture courses, the programs use varied and active teaching
methods which are ‘‘very important’’ to provide multiple pathways for
students to apply their unique learning styles.58 In addition, the CAP
employs the case study method to make the lessons more transferable to real-world situations.59 Former CIA instructor Thomas Shreeve has noted that
‘‘cases fit best in those parts of the CIA curriculum in which the teaching
objectives included helping students learn how to analyze complex,
ambiguous dilemmas and make decisions about how to resolve them.’’60
In sum, the CAP provides analysts with knowledge that they could not
acquire elsewhere, and improves on prior efforts in this area. If these
lessons are absorbed and subsequently applied, analytic quality may
improve. But training effectiveness and the growth of analyst expertise will
depend on several variables, including student motivation, peer group
influence, and teaching methods. The CAP’s use of interactive teaching
methods is aimed at maximizing learning, and the written evaluation sent to
the analyst’s home office functions much like a report card, with similar
incentives. Some of that knowledge will fall by the wayside, however, as
analysts discard the lessons that are not relevant to their specific accounts.
Other lessons, such as the promulgation of Sherman Kent’s principles to
new analysts, are more goals than standards, and may be modified by
inevitable workplace pressures. For example, professional practices and
promotional concerns may militate against the universal application of
‘‘candid admission of shortcomings.’’ Nonetheless, the CAP’s emphasis on
practices such as coordination, cooperation, and admission of one’s own
analytic limitations provides the impetus to change the sometimes
hidebound internal DI culture. As its former dean, Frans Bax, noted in
2001, the Kent School is ‘‘trying to embed the concept of career-long
learning within the culture of the DI.’’61 By disseminating tradecraft and
techniques widely throughout the DI, individual analyst expertise will
increase. But the conclusion that this increased expertise will improve
the DI’s analytic quality has yet to be justified.
INCREASING ANALYST EFFECTIVENESS
The CAP appears to be an upgrading of the CIA’s past efforts to provide
analysts with job-relevant expertise, although measuring this improvement
from outside the institution is impossible. Hypothetically, evaluating
the CAP’s impact is possible through a simple pretest-posttest control
group research design.62 For example, to establish the efficacy of the
school in increasing new analyst expertise a test could be administered
both before and after training to measure any increases in analyst
knowledge. A control group composed of new analysts who remain on the
job instead of receiving training could be tested as well, and any changes
in analyst knowledge between the two groups would be attributable to the
training alone.
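The logic of that comparison can be sketched in code. This is an illustrative difference-in-differences calculation under the pretest-posttest control group design described above; the test scores are invented for the example, since no real CAP evaluation data are public.

```python
# Hypothetical sketch of the pretest-posttest control group design:
# the trained cohort's gain minus the untrained cohort's gain isolates
# the change attributable to the training itself. All scores are invented.

def training_effect(treated_pre, treated_post, control_pre, control_post):
    """Return the trained group's mean gain minus the control group's
    mean gain (a simple difference-in-differences estimate)."""
    mean = lambda xs: sum(xs) / len(xs)
    treated_gain = mean(treated_post) - mean(treated_pre)
    control_gain = mean(control_post) - mean(control_pre)
    return treated_gain - control_gain

# Hypothetical knowledge-test scores (0-100) for two small cohorts.
effect = training_effect(
    treated_pre=[52, 48, 55, 50], treated_post=[71, 66, 74, 69],
    control_pre=[51, 49, 54, 50], control_post=[56, 53, 58, 55],
)
print(effect)  # gain beyond what on-the-job experience alone produced
```

Subtracting the control group's gain matters because new analysts who stay on the job also learn; only the difference between the two gains can be credited to the training.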
In October 2001, the Kent School’s director implemented a version of
this alternative design by hiring an outside contractor to interview CAP
alumni and their managers to assess the effect of training on their job
performance.63 CAP program director Denis Stadther provided an
overview of its conclusions. According to Stadther, all interviewees
considered the CAP to have been beneficial. In particular, line managers
noted that new analysts who went through the program appeared to
possess greater confidence and maturity in applying their skills on the
job. In addition, CAP analysts created lasting ties with their classmates,
and the use of these contacts increased their ability to work the system
in such job-related tasks as obtaining information from a counterpart
agency, coordinating with experts elsewhere within the government, and
finding a fellow analyst within the CIA who covers a particular issue.
Managers appreciated the analysts’ greater institutional expertise because
it meant that they required less intense mentoring to perform well on the
job. Stadther also noted that, contrary to the belief of early CAP
students that their promotion prospects might be adversely affected
by being pulled off the job for five months of training, anecdotal
evidence suggests that their promotion rates have been as good as or better
than those who did not attend the CAP. Stadther states that a good
performance by analysts in the CAP appears to translate well into good
job performance.
But he also emphasized that not all CAP student feedback was positive,
and that this led to adjustments in the curriculum. For example, when
analysts had difficulty relating to a theoretical unit intended to improve
interpersonal skills, the CAP’s program manager refocused it on the
practical application of those skills. Nonetheless, even though feedback
indicates that some aspects of the CAP’s curriculum require modification,
in the aggregate the CAP improves a new analyst’s individual effectiveness,
and this may improve the DI’s effectiveness on the margins.
Training also provides benefits to analyst effectiveness beyond those
Stadther mentioned as part of the study. For example, the CAP promotes a better
fit between analyst and assignment by encouraging analysts to pursue
transfers to units where their skills or interests may be more effectively
applied.64 The CIA’s placement policies are cumbersome due to the six-month or greater lag between the time a vacancy opens up and a qualified
person can be hired because of the background checks and other matters
necessary to obtain security clearances. This lengthy process can lead to
initial assignments that are a misfit for an analyst’s knowledge and
abilities. In 2001, Frans Bax noted that the CAP ‘‘provides the opportunity
for officers to find a better fit for themselves,’’ and that ‘‘there is an infinite
range of career possibilities and if an analyst sees something better they
should go for it.’’65 He added that, in the one year the CAP had been in
operation, a number of analysts had switched offices and even directorates,
and that this was an intentional attempt to break out of the apprenticeship
model that had formed the core of the DI’s human resource
approach.66 Therefore, the CAP provides CIA with the opportunity to shift
personnel in a way that allows for interest and skill mixes to be applied
more effectively towards institutional goals.
In sum, the CAP provides analysts with a greater knowledge of the
institution and its practices that can make their job performance more
effective, and even side benefits like providing the opportunity to refine
placement policies with transfers can improve institutional performance on
the margins. But neither addresses the core reason the Kent School was
created: to improve the CIA’s analytic quality.
TOWARD IMPROVING ACCURACY
More fundamental to the mission of intelligence analysis than bureaucratic
savvy or presentational ability, however, is the accuracy of the finished
intelligence product. According to the CIA’s Website, its mission includes
‘‘providing accurate, evidence-based, comprehensive, and timely foreign
intelligence related to national security.’’67 Directly measuring the CAP’s
impact on analytic quality is impossible from outside the institution
because intelligence analysis is inherently classified. In addition, so
far as I know, neither the CIA in general nor the school in particular
measures the accuracy of the analytic product because accuracy in
intelligence analysis is a very difficult concept to operationalize.
Since intelligence analysis can influence what American national security
policymakers decide to do, and what they do has the potential to prompt
or preclude actions of other international actors, a hit-or-miss yardstick
would not effectively capture the quality of the analysis.68 For example, if
the CIA predicts that a terrorist bombing is imminent and policymakers
implement security procedures based on CIA warnings, and the terrorists
are deterred due to the increased security measures, then the intelligence
prediction may well be considered inaccurate even though it helped prevent
the bombing. This causal dynamic exists for all intelligence issues —
including political, economic, and scientific — due to the nature of the
intelligence mission. Therefore, post-hoc assessment of intelligence
accuracy may not provide a true sense of the quality of the analysis.
Instead, both the Kent School’s and the CIA’s evaluations focus on the
soundness of the analytic process because it is implicitly a modified version of
the scientific method.
Analytic Processes Familiar
The CIA analyst uses a version of the scientific method to create intelligence
analysis. The traditional term ‘‘intelligence cycle’’ describes how an analyst
integrates information collected by numerous entities and disseminates this
information to policymakers. As former DCI William Colby noted, ‘‘At
the center of the intelligence machine lies the analyst, and he is the fellow
to whom all the information goes so that he can review it and think about
it and determine what it means.’’69 While the ‘‘intelligence cycle’’ presents
this process in sequential terms, more accurately the analyst is engaged in
never-ending conversations with collectors and policymakers over the
status of international events and their implications for United States
policy. In converting raw intelligence data into finished analysis, analysts
interpret the international environment through an information processing
methodology approximating the scientific method.70 As intelligence author
Washington Platt noted in 1957: ‘‘The so-called ‘scientific method’ means
different things to different people, but the basic features are much the
same. These features are: collection of data, formation of hypotheses,
testing the hypotheses, and so arriving at conclusions based on the
foregoing which can be used as reliable sources of prediction.’’71
An analyst builds an operating hypothesis upon constructs found
in a particular country’s history, culture, regional dynamics, governmental
workings, or whatever aspect may be relevant to determine how that
country’s leaders might respond to foreign challenges. The Kent School
bolsters this knowledge with both procedural and disciplinary expertise to
provide a methodological and theoretical base for the production of
intelligence analysis. For example, academia develops theories to explain
outcomes in the realms of economics, politics, psychology, and
international relations, and these theories form the basis for any good
intelligence analysis. The analyst’s reigning conceptual framework is
grounded in disciplinary theory modified by knowledge of regional or
country-specific substance. Analysts simultaneously — and for the most
part instinctively — use inductive reasoning to find patterns amidst the
flood of data, and use deductive reasoning to provide meaning and context
for the patterns they find. Incoming bits of intelligence data are filtered
through this framework, and those that fit are imbued with meaning and
context, while those that do not fit are set aside as potential modifiers of
the concepts. As a result, intelligence analysis, when done accurately, is a
model of social scientific inductive and deductive reasoning in action.
Yet, despite the similarities to social science research, intelligence analysis
is more complex. Two methodologists note that intelligence analysis has
difficulties — including externally imposed time constraints which
necessitate analysis prior to acquisition of sufficient information, an
inability to control the variables under study, an unknown data quality
due to imperfections in the collection process and the possibility of
deception, an emphasis on prediction, and a focus on utility — that,
combined, make intelligence analysis more difficult than social scientific
research in an academic setting.72
Also, like many specialties in the study of international relations,
intelligence analysts cannot test their hypotheses or replicate outcomes in
the real world, and instead must allow the progression of events to falsify
them. In addition, international relations writ large has a minimal
empirical base relative to other fields such as medicine or physics, meaning
that the etiology of certain kinds of activities, such as war or revolution,
cannot be aggregated into anything other than a very general and
uncertain theory. Unlike academics, though, the intelligence analyst does
not create theory, but must interpret facts that are themselves subject to
distortion and deception, at times under tight deadlines.73 This means that
high levels of uncertainty are implicit in most analytic judgments.
As a result of the high levels of uncertainty, intelligence analysis rarely has
certainty or proof and, for the most part, conclusions are tentative. In what
has become a cliché, ‘‘evidence collected by intelligence agencies is often
ambiguous and can lead to differing conclusions.’’74 In addition, Israeli
scholars Yitzhak Katz and Ygal Vardi cite an experiment demonstrating
that the same data can lead to differing conclusions, depending on the
conceptual structures applied.75 The result of such ambiguity is an
environment of conflict and debate over the interpretation and implications
of intelligence data. Loch K. Johnson says that within the institutional
context ‘‘analysis is usually a messy affair, with incomplete information
and often heated debates over what the available facts may really mean
about the intentions of secretive adversaries in distant capitals. . . .
Uncertainty, ambiguity, debate, and partial answers to tangled questions
will remain an existential condition of the analytical process.’’76
Given this complex information environment, not knowing or negligence
in applying social science techniques can lead to analytic inaccuracies. As
Robert Jervis observed in 1986: ‘‘Good intelligence demands three
interrelated conditions. Analysts should present alternatives and competing
explanations for a given even[t], develop the evidence for each of the
alternatives, and present their arguments as fully as is necessary to do
justice to the subject matter. This is not to imply, of course, that if these
conditions are met the resulting analyses will always be excellent but only
that their omission will substantially reduce the probability that the
product will be of high quality.’’77 At its most extreme outcome, analytic
inaccuracies due to failure to apply tradecraft can lead to intelligence
failures. As intelligence scholar Robert Folker notes, ‘‘the root cause of
many critical intelligence failures has been analytical failure.’’78 Katz and
Vardi also argue that, although Israeli intelligence possessed ‘‘data
pointing to the probability of war,’’ it was surprised by the 1973 Yom
Kippur War because it ‘‘was incorrectly interpreted.’’ 79 In addition,
Georgetown Professor Walter Laqueur notes that sources of analytic
failure include ‘‘bias . . . [which is] an unwillingness . . . to accept evidence
contrary to their preconceived notions, . . . ignorance, lack of political
sophistication and judgment, [l]ack of training and experience, lack of
imagination, and the assumption that other people behave more or less as
we do,’’ which is known in intelligence parlance as ‘mirror-imaging.’80 In
sum, failure to apply appropriate social science standards of rigor and
procedure can lead to analytic inaccuracy. If so, then accuracy can be
improved on the margins by providing analysts with greater knowledge of
fact, theory, or procedure, and the application of methodologies to
simplify and structure their analysis.
Kent School: Improving CIA’s Analytic Procedure
The Kent School’s CAP provides analysts with improved methodological
knowledge and analytical tools that should provide a more rigorous basis
for their analysis. This greater rigor, approximating social science methods,
should increase the chance that the analytic product will be more accurate
than it otherwise would have been without training. Just as increases in the
quality of research methodology allow a researcher to more effectively
operationalize and measure the constructs of interest, better training in
methodology would hypothetically provide an intelligence analyst with a
better foundation from which to produce more accurate analysis.
Therefore, if the theory holds, then the Kent School will provide analysts
with the capability to produce more accurate analysis, so long as its
programs provide information that enables a more effective application of
the scientific method.
Many possible methodological tools can be used to organize and simplify
the process of intelligence analysis. As Washington Platt noted in 1957: ‘‘In
intelligence production as yet we find little study of methods as such. Yet a
systematic study of methods in any given field by an open-minded expert
in that field nearly always leads to worthwhile improvement.’’81 For
example, one JMIC publication — a distillation of a student’s Master’s
thesis — demonstrates that ‘‘analysts who apply a structured method
(specifically) hypothesis testing . . . to an intelligence problem, outperform
those who rely on . . . the intuitive approach.’’82 Just as social science is
structured upon a foundation of hypothesis falsification and revision which
requires that hypotheses be compared for accuracy and relevance,
intelligence analysts self-consciously compare hypotheses in a process
termed ‘‘competitive analysis.’’ This technique can be applied in particular
circumstances — e.g., when an issue of particular importance is imbued
with partisan implications — to get at underlying assumptions that might
be biasing the analysis in one direction or the other.83 Some have even
advocated the structural incorporation of this method through an
institutionalized ‘‘Devil’s Advocate’’ position or ‘‘Team B’’ exercises.
According to Richards Heuer, other tools that help analysts structure the
most appropriate approach to a particular intelligence problem include
‘‘outlines, tables, diagrams, trees, and matrices, with many sub-species of
each. For example, trees include decision trees and fault trees. Diagrams
includes causal diagrams, influence diagrams, flow charts, and cognitive
maps. . . . [In addition, a matrix can be used] to array evidence for and
against competing hypotheses to explain what is happening now or
estimate what may happen in the future.’’84 Finally, R.V. Jones — the
‘‘father of modern scientific and technical intelligence’’ — suggests that
analysts apply Occam’s Razor — or ‘‘the simplest hypothesis that is
consistent with the information’’ — as a way to simplify the analysis of
contradictory or insufficient information.85
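The matrix Heuer describes can be sketched as a small program. This is a minimal, hypothetical illustration of arraying evidence against competing hypotheses; the hypotheses, evidence scores, and scoring convention shown here are invented for the example, not drawn from any actual analysis.

```python
# Minimal sketch of a competing-hypotheses matrix: each piece of evidence
# is scored against each hypothesis as "C" (consistent), "I" (inconsistent),
# or "N" (neutral). In the spirit of Heuer's approach, hypotheses with the
# most inconsistent evidence are the first candidates for rejection, so the
# ranking orders hypotheses from least to most contradicted.
# All hypotheses and scores below are invented for illustration.

def rank_hypotheses(matrix):
    """matrix maps hypothesis -> list of C/I/N scores, one per item of
    evidence. Returns hypotheses ordered by fewest 'I' scores first."""
    return sorted(matrix, key=lambda h: matrix[h].count("I"))

matrix = {
    "military exercise":   ["C", "I", "I", "C"],
    "imminent attack":     ["C", "C", "N", "C"],
    "political posturing": ["I", "I", "C", "I"],
}
ranking = rank_hypotheses(matrix)
print(ranking[0])  # the hypothesis least contradicted by the evidence
```

Counting inconsistencies rather than confirmations reflects the falsification logic discussed above: evidence consistent with a hypothesis is often consistent with several, so the discriminating information lies in what each hypothesis cannot explain.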
The analytic tradecraft standards and techniques taught in the CAP were
developed in the 1990s. According to former CIA officer Jack Davis, former
DDI Douglas MacEachin articulated ‘‘corporate tradecraft standards for
analysts . . . aimed . . . at ensuring that sufficient attention would be paid to
cognitive challenges in assessing complex issues.’’86 Davis added:
MacEachin advocated an approach to structured argumentation called
‘‘linchpin analysis,’’ to which he contributed muscular terms designed
to overcome many CIA professionals’ distaste for academic
nomenclature. The standard academic term ‘‘key variables’’ became
drivers. ‘‘Hypotheses’’ concerning drivers became linchpins —
assumptions underlying the argument — and these had to be explicitly
spelled out. . . . MacEachin thus worked to put in place systematic and
transparent standards for determining whether analysts had met their
responsibilities for critical thinking.87
MacEachin then formalized the promulgation of methodologies to all DI
analysts. In 1996 and 1997, ‘‘nearly all DI managers and analysts attended’’ a
new course based on the standards entitled ‘‘Tradecraft 2000.’’88 In addition,
in 1997 Jack Davis authored ‘‘a series of notes on analytical tradecraft’’ that
encapsulated the lessons from the new course.89 According to the CIA
Website — where portions of these notes can be found — the notes
‘‘elaborate on some of the skills and methods used by DI intelligence
analysts’’ and have ‘‘become a standard reference within CIA for
practitioners and teachers of intelligence analysis.’’90 These methodologies
have become, in essence, analytic doctrine, and through them CIA analysts
have become familiarized with social science methodologies, if with a
modified vocabulary. In 2001, the CIA Website noted that: ‘‘Invariably,
analysts work on the basis of incomplete and conflicting information. DI
analysts are taught to clearly articulate what is known (the facts), how it is
known (the sources), what drives the judgements (linchpin assumptions),
the impact if these drivers change (alternative outcomes), and what
remains unknown.’’91 The analytic standards were subsequently
incorporated into the existing training courses for analysts, and maintain
their place in the Kent School’s curriculum today.
In sum, the Kent School’s CAP provides analysts with the expertise
necessary to produce more accurate analysis. This conclusion, however, is
insufficient to address whether the school’s CAP actually improves the
quality of the CIA’s analytic output.
BUREAUCRATIC PROCESSES CAN STIFLE POTENTIAL
Hypothetically, the CAP could improve an analyst’s analytic accuracy, but
its potential could be stifled if the DI’s business practices prevent
application of the hard-earned expertise. Intelligence analysis takes place
within a specific organizational context, and an organization’s processes
can promote or impede the application of lessons learned in training, and
affect the ultimate accuracy of the intelligence product.
For example, the length of finished intelligence product emphasized by
the DI can impact expertise acquisition. The DI produces many types of
finished intelligence—some reportorial, some analytical, some estimative—
largely as a result of its attempt to meet policymakers’ varying needs for
information.92 The reportorial products— known as current intelligence —
tend to be shorter but more targeted to policymakers’ needs, while the
more estimative and analytic products can be longer and potentially less
relevant for policymakers due to the time of production and length of
product. Over time, the DI tends to emphasize the benefits of one product
length over another. In the 1980s, the DI emphasized the production of
longer pieces for the knowledge that they create, but after arguments
surfaced that these longer papers were of decreased utility for
policymakers, the DI promptly reversed course in the mid-1990s and
limited ‘‘most DI analytical papers . . . to 3 to 7 pages, including graphics
and illustrations.’’93 Presumably, this was done to increase the relevance of
DI products for policymakers.
But, while the production of a single piece of current intelligence has
limited effect on expertise because it draws on the knowledge and tools
that an analyst has developed through training and prior analysis, expertise
acquisition and application problems become endemic if the institution
emphasizes current intelligence over longer products. If the analyst does
not have time to think, learn, or integrate new information with old to
create new understandings, knowledge of facts and events may increase,
but the ability to accurately interpret these events decreases, as the analyst
does not have the opportunity to reinforce knowledge through the more
in-depth research and drafting processes. The reliance on brevity also
deprives analysts of the opportunity to apply ‘‘best practices,’’ including
the structured tradecraft methodologies and benefits arising from
disciplinary theory. By 1997, DI management was making efforts to
address the decreased expertise levels resulting from excessive current
intelligence production.94 In the end, the Kent School’s effectiveness in
improving analytic accuracy may depend in part on whether the CIA
provides its analysts with assignments that allow them to utilize the
expertise acquired in training, while still providing relevant current analysis.
The length of time that the DI encourages an analyst to spend on an
account can also impact an analyst’s expertise, and by extension, accuracy.
In 2001, former CIA Inspector General Frederick P. Hitz argued that
CIA’s analysts have ‘‘tended to be 18-month wonders who hopscotch from
subject to subject without developing any broad, long-term expertise in any
one area,’’ and as a result ‘‘are ill-equipped to grapple with the
information that arrives on their desks.’’95 An analyst cannot apply
sophisticated methodological tools to intelligence data until he or she
becomes knowledgeable about the specific regional area or discipline of the
account. Frequent changes will prevent an analyst from achieving the
potential inherent in the analytic tools that the Kent School provides new
analysts. Therefore, for the Kent School to improve analytic accuracy
through the application of increased expertise, ‘‘analysts should be
expected to make a career of their functional or geographic specialty so
that the daily flows of information are added to an already solid base of
knowledge,’’ as Hitz suggests.
In sum, the DI must adapt its practices to provide analysts with the
greatest opportunity to apply the expertise they have acquired both on-the-job and in training if the potential inherent in the Kent School’s
programs to improve the DI’s analytic accuracy is to be fully realized. Yet,
despite the importance of institutional context in determining the CIA’s
analytic quality, few have studied the interplay between analyst and
institution. As Robert Jervis has concluded, ‘‘perhaps the intelligence
community has paid too [little attention] . . . to how the community’s
internal structure and norms might be altered to enable intelligence to be
worth listening to.’’96 Further studies of analytic structures and processes
must be done to determine which reforms would have the greatest
improvement on analytic quality, and perhaps their implementation will take
CIA a step closer to creating ‘‘intelligence worth listening to.’’
IMPROVING TO MEET FUTURE CHALLENGES
In 1988, Loch Johnson said ‘‘probably the most important problem faced by
any intelligence agency [is] how to improve the quality of its product.’’97 The
Kent School is part of the CIA’s effort to do this. The school has apparently
achieved its goal of increasing analytic expertise by improving the Agency’s
training for new analysts. This training improves the analysts’ — and
consequently the DI’s—effectiveness, and should provide each analyst with
a greater ability to produce more accurate intelligence analysis. But, even if
increases in analyst expertise due to training can be documented, the
analyst may not have the time, inclination, or opportunity to apply
recently acquired expertise in the production of finished intelligence
analysis. Analytic training may be a necessary component for improvement
in analytic accuracy writ large, but is not sufficient in and of itself.
Therefore, complementary modifications may have to be implemented to
the DI’s business practices if the desired outcome — increased analytic
accuracy — is to occur. As such, the Kent School is a step in the right
direction, but further assessments are required to determine which changes
in business practices or organizational structure might magnify possible
improvements.
More broadly, other issues have implications for American intelligence.
The tragic events of September 2001 have highlighted the limitations of
institutionalized intelligence, including the problems inherent in intelligence
analysis, and present policymakers with the opportunity to redefine the
intelligence community of the next decades. This time, according to an
intelligence expert advising President George W. Bush on ways to improve
the workings of the intelligence community, ‘‘We are trying to treat the
intelligence community more like a corporation that should have goals and
a way to measure success or failure against those goals.’’98
Increased analytic quality, achieved through the Kent School’s Career
Analyst Program, helps CIA reach that goal. Yet, in the broader context of
national security policymaking, improving the accuracy of the CIA’s
analytic output on the margins may not necessarily lead to improved
policymaking. Intelligence analysis is only one of many information
streams that the policymaker listens to, and may not be the most
important. In addition, a policymaker’s decisions will be affected by policy
agendas, tradeoffs, and other cognitive limitations that create ‘‘bounded’’
rationality. Therefore, if the researcher’s primary concern is more effective
national security policymaking, then it may be more useful in this limited-resource environment to assess how intelligence production can be
modified to better fit policymaker needs. In the end, the utility of intelligence
analysis may be just as important a criterion of analytic quality as is
accuracy.99
Nevertheless, understanding the options available to improve
policymaking is crucial, and improving the CIA’s analytic accuracy is one
of those options. As Robert Jervis has said, ‘‘We will never be able to do
as well as we would like, but this does not mean that we cannot do better
than we are doing now.’’100
The Kent School’s Career Analyst Program is a step in the right direction,
but more steps may be necessary in order to produce a measurable
improvement in the CIA’s analytic quality.
APPENDIX
Sherman Kent’s Principles for Intelligence Analysis. The Career
Analyst Program (CAP) Teaches These Principles:
1. Intellectual Rigor
Judgments are supported by facts or credible reporting.
All sources are reviewed and evaluated for consistency and credibility.
Uncertainties or gaps in information are made explicit.
2. Conscious Effort to Avoid Analytical Biases
State working assumptions and conclusions drawn from them explicitly.
Subject assumptions and conclusions to structured challenge: what developments
would indicate that they are wrong.
If uncertainties or the stakes of being wrong are high, identify alternative
outcomes and what it would take for each to occur.
3. Willingness to Consider Other Judgments
Recognize the limits to your own expertise and avoid treating your account as
yours alone.
Seek out expertise that will complement your own as a product is being prepared.
Strong differences of view should be made explicit.
4. Collective Responsibility for Judgment
Seek out and allow time for formal coordination of your product.
Represent and defend all Agency and DI views.
Make it clear when you express individual views; do so only when asked.
5. Precision of Language
Provide your most unique or new insight or fact quickly.
Use active voice and short sentences; avoid excessive detail; minimize the use of
technical terms. Follow DI writing guidelines.
Shorter is always better.
6. Systematic Use of Outside Experts as a Check on In-House Blinders
Seek out new external studies and experts relevant to your account and discipline
on a continuing basis.
Keep up with news media treatment of your account and consider whether their
perspective offers unique insight.
On key issues, indicate where outsiders agree or disagree with your judgments.
7. Candid Admission of Shortcomings and Learning from Mistakes
Recognize that intelligence analysis will sometimes be wrong because it must
focus on the tough questions or uncertainties.
Review periodically past judgments or interpretations; what made them right or
wrong; how could they have been better.
Alert the policymaker if you determine that a previous line of analysis was
wrong. Explain why and what it means.
8. Attentiveness To and Focus On Policymaker Concerns
Deliver intelligence that is focused on and timed to the policymaker’s current
agenda.
Make clear the implications of your analysis for U.S. policy.
Provide ‘‘actionable’’ intelligence that can help the policymaker handle a threat,
make a decision, or achieve an objective.
9. Never Pursue a Policy Agenda
Personal policy preferences must not shape the information presented or the
conclusions of intelligence analysis.
Politely but clearly deflect policymaker requests for recommendations on policy.
Intelligence helps the policymaker by reducing the range of uncertainty and
risk, and by identifying opportunities for action. It does not make the
choice for him.
REFERENCES
1. CIA Website, ‘‘Tenet Dedicates New School for Intelligence Analysis,’’ CIA Press Release, 4 May 2000. http://www.cia.gov/cia/public_affairs/press_release/archives/2000/pr050400.html
2. CIA Website, ‘‘Tenet Lauds Appointment of McLaughlin as Acting DDCI,’’ Press Release, 29 June 2000. http://www.cia.gov/cia/public_affairs/press_release/archives/2000/pr06292000.html
3. Bruce B. Auster, ‘‘What’s Really Gone Wrong with the CIA,’’ U.S. News and World Report, 1 June 1998, Vol. 124, No. 21, p. 27.
4. George J. Tenet, ‘‘The CIA and the Security Challenges of the New Century,’’ International Journal of Intelligence and CounterIntelligence, Vol. 13, No. 2, Summer 2000, pp. 140–141.
5. Various newspaper articles including: Bob Drogin, ‘‘School for New Brand of Spooks,’’ The Los Angeles Times, 21 July 2000, p. A-1; Tim Weiner, ‘‘U.S. Intelligence Under Fire in Wake of India’s Nuclear Test,’’ The New York Times, 13 May 1998; Walter Pincus, ‘‘CIA Chief Cited Loss of Agency’s Capabilities; Remarks Preceded Indian Bomb Tests,’’ The Washington Post, 25 May 1998, p. A4.
6. For a good presentation of the specific indicators, see Tim Weiner, ‘‘U.S. Intelligence Under Fire in Wake of India’s Nuclear Test,’’ op. cit.
7. James R. Asker, ‘‘Same Ol’, Same Ol’,’’ Aviation Week and Space Technology, 8 June 1998.
8. Carla Anne Robbins, ‘‘Failure to Predict India’s Tests Is Tied to Systemwide Intelligence Breakdown,’’ The Wall Street Journal, 3 June 1998, p. A8.
9. Ibid.
10. Denis Stadther was a member of this task force and provided all the information on the Kent School’s creation during an interview on 14 May 2001, except where otherwise indicated.
11. Vernon Loeb, ‘‘CIA Goes Deep Into Analysis; Agency Opens School, Elevates Analysts,’’ The Washington Post, 4 May 2000, p. A23.
12. CIA Website, ‘‘Tenet Lauds Appointment of McLaughlin as Acting DDCI.’’ Also see: CIA Website, ‘‘John E. McLaughlin: Deputy Director of Central Intelligence.’’ http://www.cia.gov/cia/information/mclaughlin.html
13. CIA Website, ‘‘Tenet Dedicates New School for Intelligence Analysis,’’ CIA Press Release, 4 May 2000.
14. CIA Website, ‘‘Remarks of the Director of Central Intelligence George J. Tenet at the Dedication of the Sherman Kent School,’’ 4 May 2000. http://www.cia.gov/cia/public_affairs/speeches/archives/2000/dci_speech_05052000.html
15. CIA Website, ‘‘Remarks of the Director of Central Intelligence George J. Tenet at the Dedication of the Sherman Kent School,’’ 4 May 2000.
16. Vernon Loeb, ‘‘CIA Goes Deep Into Analysis; Agency Opens School, Elevates Analysts.’’ For more on Martin Petersen, see Vernon Loeb, ‘‘Tenet, Krongard Alter CIA Power Structure,’’ The Washington Post, 1 May 2001, p. A21.
17. Ken Olson, personal interview, 16 May 2001; Defense Intelligence Agency (DIA) Website, ‘‘About the JMIC.’’ http://www.dia.mil/Jmic/about.html
18. Merriam-Webster Website: http://www.m-w.com
19. Loch K. Johnson, ‘‘Analysis For a New Age,’’ Intelligence and National Security, Vol. 11, No. 4, October 1996, p. 657.
20. CIA Website, ‘‘Remarks of the Director of Central Intelligence George J. Tenet at the Dedication of the Sherman Kent School,’’ 4 May 2000.
21. CIA Website, Directorate of Intelligence: Organizational Components. http://www.odci.gov/cia/di/mission/components.html (see: Council of Intelligence Occupations). See also CIA Website, Intelligence Disciplines. http://www.odci.gov/cia/di/work/disciplines.html
22. Frank Watanabe, ‘‘Fifteen Axioms for Intelligence Analysts,’’ Studies in Intelligence, unclassified edition, 1997. http://www.odci.gov/csi/studies/97unclass/axioms.html
23. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?’’ International Journal of Intelligence and CounterIntelligence, Vol. 1, No. 1, Spring 1986, pp. 31–32.
24. Stansfield Turner, ‘‘Intelligence for A New World Order,’’ Foreign Affairs, Fall 1991, p. 164.
25. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 33.
26. Ibid., pp. 31–32.
27. Ibid., pp. 28, 30.
28. Stephen Marrin, ‘‘Complexity Is in the Eye of the Beholder,’’ DI Discussion Database, 6 May 1997.
29. The inevitability of intelligence failure appears to be a consensus position in the intelligence failure literature and stems from Richard Betts’s 1978 article. Richard K. Betts, ‘‘Analysis, War and Decision: Why Intelligence Failures Are Inevitable,’’ World Politics, Vol. XXXI, No. 1, October 1978, pp. 61–89.
30. Author’s personal experience, circa 1996.
31. CIA Website, ‘‘Remarks of the Deputy Director of Central Intelligence John E. McLaughlin at the Conference on CIA’s Analysis of the Soviet Union, 1947–1991, Princeton University, 9 March 2001.’’ http://www.cia.gov/cia/public_affairs/speeches/ddci_speech_03092001.html
32. Tim Weiner, ‘‘Naivete at the CIA: Every Nation’s Just Another U.S.,’’ The New York Times, 7 June 1998.
33. Frans Bax, presentation at an Association of Former Intelligence Officers (AFIO) Luncheon, Fort Myer, VA, 22 May 2001.
34. Jack Davis, ‘‘Improving Intelligence Analysis at CIA: Dick Heuer’s Contribution to Intelligence Analysis,’’ Psychology of Intelligence Analysis, Center for the Study of Intelligence, CIA, 1999, p. xix.
35. Author’s experience in one of FDIT’s earlier runnings in the fall of 1996.
36. Denis Stadther, personal interview, 14 May 2001.
37. That the Kent School dean resides on the DI’s corporate board is from CIA’s internal ‘‘newspaper,’’ What’s News at CIA, Issue number 703. Unclassified article: ‘‘DDI Establishes Sherman Kent School for Intelligence Analysis. Frans Bax Named Dean.’’ Unknown date.
38. Denis Stadther, personal correspondence, 6 September 2001.
39. Frans Bax, presentation at AFIO Luncheon.
40. Denis Stadther, personal correspondence, 6 September 2001.
41. Most of the information on CAP was provided by its program director Denis Stadther during an interview on 14 May 2001 and a second interview on 26 February 2002. All direct information on CAP stems from these two interviews, except where otherwise indicated.
42. Kent School publication, ‘‘Career Analyst Program: Preparing for a Career in the Directorate of Intelligence,’’ June 2001.
43. Frans Bax, presentation at AFIO Luncheon.
44. Bob Drogin, ‘‘School for New Brand of Spooks’’; Denis Stadther, personal interview, 14 May 2001.
45. Denis Stadther, personal interview, 14 May 2001. Re-emphasized on 26 February 2002.
46. John McLaughlin, personal conversation, 9 March 2001.
47. Frans Bax, presentation at AFIO Luncheon.
48. Ibid.
49. For greater information on intelligence tradecraft, see: Douglas J. MacEachin, et al., ‘‘The Tradecraft of Analysis: Challenge and Change in the CIA,’’ Consortium for the Study of Intelligence, Washington, DC, 1994. Also see the DI Analytic Toolkit, which contains ‘‘excerpt(s) from Notes on Analytic Tradecraft, published between 1995 and 1997, which elaborate on some of the skills and methods used by DI intelligence analysts. These notes become a standard reference within CIA for practitioners and teachers of intelligence analysis.’’ It can be found at: http://www.odci.gov/cia/di/toolkit/index.html
50. Sherman Kent’s ‘‘Principles for Intelligence Analysis’’ acquired from informational materials provided by Kent School officers, 14 May 2001. See Appendix for the full text of the document.
51. Frans Bax, presentation at AFIO Luncheon.
52. The ‘A/B Team’ approach stems from 1976, when former DCI George H. W. Bush commissioned a team of outside experts to review the same information that had led to a controversial National Intelligence Estimate on Soviet military spending. This outside ‘‘B Team’’ based its analysis on more pessimistic assumptions of Soviet intentions, interpreted intelligence of Soviet military spending accordingly, and came to a judgment more skeptical than the IC’s that was consistent with reigning conservative views. See Robert C. Reich, ‘‘Re-examining the Team A–Team B Exercise,’’ International Journal of Intelligence and CounterIntelligence, Vol. 3, No. 3, Fall 1989, pp. 387–388. See also Kevin P. Stack, ‘‘A Negative View of Competitive Analysis,’’ International Journal of Intelligence and CounterIntelligence, Vol. 10, No. 4, Winter 1997–1998, pp. 456–464.
53. Richards J. Heuer, Jr., Psychology of Intelligence Analysis (Washington, DC: Center for the Study of Intelligence, CIA, 1999).
54. Bob Drogin, ‘‘School for New Brand of Spooks.’’
55. Ibid.
56. Frans Bax, presentation at AFIO Luncheon.
57. Stephen Marrin, ‘‘The CIA’s Kent School: A Step in the Right Direction,’’ The Intelligencer: Journal of U.S. Intelligence Studies, Vol. 11, No. 2, Winter 2000, pp. 55–57.
58. Dan Wagner, personal interview, 23 August 2001.
59. Frans Bax, presentation at AFIO Luncheon; Denis Stadther, personal interview, 14 May 2001.
60. Thomas W. Shreeve and James J. Dowd, Jr., ‘‘Building a Learning Organization: Teaching with Cases at CIA,’’ International Journal of Intelligence and CounterIntelligence, Vol. 10, No. 1, Spring 1997, p. 104.
61. Frans Bax, presentation at AFIO Luncheon.
62. Donald T. Campbell and Julian C. Stanley, Experimental and Quasi-Experimental Designs for Research (Boston: Houghton Mifflin, 1963), pp. 13–22.
63. Denis Stadther, personal interview, 26 February 2002.
64. Frans Bax, personal interview, 14 May 2001.
65. Frans Bax, presentation at AFIO Luncheon.
66. Ibid.
67. CIA Website, ‘‘CIA Vision, Mission, and Values.’’ http://www.cia.gov/cia/information/mission.html
68. For more on the limitations of using accuracy to measure intelligence, see the strategic surprise and intelligence estimation literature. Of particular note, see Steve Chan, ‘‘The Intelligence of Stupidity: Understanding Failures in Strategic Warning,’’ The American Political Science Review, Vol. 73, No. 1, March 1979, pp. 171–180.
69. William Colby, ‘‘Retooling the Intelligence Industry,’’ Foreign Service Journal, January 1992, p. 21.
70. For more information on the processes of intelligence analysis, see Robert M. Clark, Intelligence Analysis: Estimation and Prediction (Baltimore, MD: American Literary Press Inc., 1996); Research: Design and Methods (Washington, DC: Joint Military Intelligence College, September 2000); David Schum, Evidence and Inference for the Intelligence Analyst (Lanham, MD: University Press of America, 1987).
71. Washington Platt, Strategic Intelligence Production: Basic Principles (New York: Frederick A. Praeger, 1957), p. 75.
72. Jerome K. Clauser and Sandra M. Weir, Intelligence Research Methodology (State College, PA: HRB Singer, Inc., 1976), pp. 37–46.
73. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 29.
74. Michael Dobbs, ‘‘U.S., Russia At Odds on Iranian Deal,’’ The Washington Post, 15 June 2001, p. A01.
75. Yitzhak Katz and Ygal Vardi, ‘‘Strategies for Data Gathering and Evaluation in the Intelligence Community,’’ International Journal of Intelligence and CounterIntelligence, Vol. 5, No. 3, Fall 1991, appendix, pp. 325–327.
76. Loch K. Johnson, ‘‘Analysis For a New Age,’’ p. 661.
77. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 33.
78. Robert D. Folker, ‘‘Intelligence in Theater Joint Intelligence Centers: An Experiment in Applying Structured Methods,’’ Joint Military Intelligence College Occasional Paper Number Seven, Washington, DC, January 2000, p. 3.
79. Yitzhak Katz and Ygal Vardi, ‘‘Strategies for Data Gathering and Evaluation in the Intelligence Community,’’ p. 313.
80. Walter Laqueur, ‘‘The Future of Intelligence,’’ Society, Vol. 35, No. 2, January–February 1998, p. 301.
81. Washington Platt, Strategic Intelligence Production: Basic Principles (New York: Frederick A. Praeger, 1957), p. 151.
82. Robert Folker, ‘‘Intelligence in Theater Joint Intelligence Centers,’’ p. 2.
83. Richards Heuer devotes an entire chapter to this approach, entitled ‘‘Analysis of Competing Hypotheses,’’ Psychology of Intelligence Analysis, pp. 95–109.
84. Ibid., p. 89.
85. R.V. Jones, Reflections on Intelligence (London: Heinemann, 1989), pp. 87–88. The characterization of R.V. Jones is attributed to former DCI James Woolsey at: http://www.odci.gov/cia/public_affairs/press_release/archives/1997/pr122997.html
86. Jack Davis, ‘‘Improving Intelligence Analysis at CIA,’’ p. xvii.
87. Douglas J. MacEachin, ‘‘The Tradecraft of Analysis: Challenge and Change in the CIA,’’ p. 1; Jack Davis, ‘‘Improving Intelligence Analysis at CIA,’’ pp. xvii–xix.
88. Jack Davis, ‘‘Improving Intelligence Analysis at CIA,’’ p. xix.
89. The reference to Davis authoring the tradecraft notes is not on the CIA’s Website. Instead, the citation was in the foreword of the hardcopy version, which can be found at: http://intellit.muskingum.edu/intellsite/analysis_folder/di_catn_Folder/foreword.html; Jack Davis, ‘‘Improving Intelligence Analysis at CIA,’’ p. xix.
90. CIA Website, ‘‘DI Analytic Toolkit.’’ http://www.odci.gov/cia/di/toolkit/index.html
91. CIA Website, ‘‘Intelligence Analysis in the DI: Frequently Asked Questions.’’ http://www.odci.gov/cia/di/work/analyst.html
92. CIA Website, ‘‘Analytical Products of the DI.’’ http://www.odci.gov/cia/di/work/major.html
93. For the emphasis in the 1980s on longer papers, see Loch K. Johnson, ‘‘Making the Intelligence ‘Cycle’ Work,’’ International Journal of Intelligence and CounterIntelligence, Vol. 1, No. 4, Winter 1986, pp. 6–7. For the benefits of longer papers, see Arthur S. Hulnick, ‘‘Managing Intelligence Analysis: Strategies for Playing the End Game,’’ International Journal of Intelligence and CounterIntelligence, Vol. 2, No. 3, Fall 1988, pp. 322–323; and Loch K. Johnson, ‘‘Analysis For a New Age,’’ p. 665. For the backlash against longer papers, see Jay T. Young, ‘‘US Intelligence Assessment in a Changing World: The Need for Reform,’’ Intelligence and National Security, Vol. 8, No. 2, April 1993, pp. 129, 134. For the DI’s change in the 1990s, see ‘‘ ‘Say It Ain’t So, Jim’: Impending Reorganization of CIA Looks Like Suppression, Politicizing of Intelligence,’’ Publications of the Center for Security Policy, No. 94-D 74, 15 July 1994. http://www.security-policy.org/papers/1994/94-D74.html. For the backlash against shorter papers, see John Gentry, ‘‘A Framework for Reform of the U.S. Intelligence Community.’’
94. John E. McLaughlin, ‘‘New Challenges and Priorities for Analysis,’’ Defense Intelligence Journal, Vol. 6, No. 2, 1997, pp. 16–17. Also on the CIA Website at http://www.cia.gov/cia/di/speeches/428149298.html
95. Frederick P. Hitz, ‘‘Not Just a Lack of Intelligence, a Lack of Skills,’’ The Washington Post, 21 October 2001, p. B3.
96. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 41.
97. Loch K. Johnson, A Season of Inquiry (Chicago: Dorsey Press, 1988), p. 197.
98. Walter Pincus, ‘‘Intelligence Shakeup Would Boost CIA,’’ The Washington Post, 8 November 2001, p. A1.
99. In fact, in 2001 the Kent School’s former dean noted his preference for a utility measure over an accuracy one. Frans Bax, personal interview, 14 May 2001.
100. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 30.
STEPHEN MARRIN
CIA’s Kent School: Improving Training
for New Analysts
The Central Intelligence Agency’s (CIA’s) Career Analyst Program (CAP)
for new analysts seeks to increase their on-the-job effectiveness and ability
to produce more accurate analysis. The program is located within the
Sherman Kent School for Intelligence Analysis, which CIA’s senior
managers created in 2000 to increase the expertise of officers within its
Directorate of Intelligence (DI). The school provides analyst and
managerial training for the DI, as well as housing the Kent Center which
acquires and disseminates information regarding analytic ‘best practices.’1
According to an Agency press release, the CAP is CIA’s ‘‘first
comprehensive training program for professional intelligence analysts,’’
and is a noticeable improvement on prior training efforts.2 The CAP
provides new analysts with the knowledge and skills that enable them to
be more effective in the production of finished intelligence, as well as in
their overall job performance. It also provides them with the means to
produce more accurate analysis by teaching them the causes of — and
means to avoid— intelligence failure, and cognitive tools to assist them in
structuring their analysis more along the lines of the scientific method.
Whether the CAP will lead to an improvement in the quality of the CIA’s
analytic output will depend on whether analysts have the opportunity to
apply their newly acquired expertise when back on the job after training.
Ultimately, the institutional assessments of analytic production processes
will determine whether analytic training programs allow the CIA to
improve its production of accurate, timely, and tailored intelligence that fits
the needs of the Agency’s national security policymaking customers.

Stephen Marrin, a former Central Intelligence Agency analyst and contractor,
is in the doctoral program at the University of Virginia, specializing in
intelligence studies. An earlier version of this article was presented at the
annual meetings of the Intelligence Studies Section of the International
Studies Association, New Orleans, Louisiana, in March 2002.
IMPROVING THE CIA’S ANALYSES
Just as every reform is intended to be a solution to a particular problem, the
CIA’s leaders created the Kent School as a way to address the Agency’s
analytic weaknesses. In May 1998, Director of Central Intelligence (DCI)
George J. Tenet ‘‘gave a classified internal briefing on the CIA’s problems
and what he intends to do about them,’’ according to an article in US
News and World Report.3 In articulating his vision for the future of the
CIA — a vision that became known as his ‘Strategic Direction’ — Tenet
emphasized the importance of changing past practices to improve the
Agency’s contribution to national security policymaking. A year later, on
18 October 1999, he told an audience at Georgetown University in
Washington, D.C.:
[When] I launched a Strategic Direction plan for the CIA . . . I told our
people that we had to take charge of our destiny. That we would do all
within our power to maintain our edge and our vibrancy. That we had
to streamline and realign ourselves and adopt business practices like
the best in the private sector. That we would think big and think
different. That we would work smarter and in new ways so that we
would have the agility and the resiliency to do what the President —
this President or a future President — wants and the American people
expect.4
One week after Tenet’s May 1998 briefing, the CIA failed to warn American
policymakers of India’s intention to test nuclear weapons. This failure highlighted the
Agency’s limitations and likely accelerated the implementation of reforms
stemming from Tenet’s Strategic Direction plan.5 Post hoc analyses cited
many factors that may have contributed to the failure; most notable was
the DI’s ‘‘lack of critical thinking and analytic rigor,’’ which kept it from
piecing together the many indications of a possible nuclear test.6,7 According to The
Wall Street Journal’s Carla Anne Robbins, Admiral David Jeremiah—who
headed the official investigation into the failure — ‘‘recommended hiring
more analysts, improving their training and increasing contact with outside
experts to challenge conventional wisdom.’’8 She also reported that DCI
Tenet said he would ‘‘make it my highest priority to implement [Admiral
Jeremiah’s recommendations] as quickly as possible.’’9
Tenet subsequently commissioned a number of task forces to implement
his Strategic Direction. The ‘‘Analytic Depth and Expertise Task Force,’’
made up of eight DI officers, was given free rein to investigate ways to
‘‘develop true world class experts’’ consistent with the Strategic Direction’s
goals, according to Kent School program director Denis Stadther.10 The
officers met several times a week through the latter half of 1998,
brainstorming ideas to increase the expertise of junior, mid-level, and
senior analysts. The task force ruled out a training course for mid-level
officers because its members believed that the broadening experience provided by
rotational assignments, such as those overseas or to policymaking
institutions, would provide the greatest value to the Agency. The task force
also ruled out training for senior analysts. It noted that the DI’s career
paths provided greater advancement possibilities for senior analysts willing
to switch to management, and as a result the more ambitious analysts
became managers rather than continuing to use their knowledge in an
analytic capacity. Therefore, the task force proposed the creation of a
more desirable career track in which senior analysts could continue to use
their expertise rather than shift into management. Finally, the task force
concluded that junior officers would be best served with a training
program intended to ‘‘build a common foundation upon which analytic
expertise could be built.’’
The task force submitted two recommendations to then-Deputy Director for
Intelligence (DDI) John E. McLaughlin, who approved both, and subsequently attributed their
implementation to DCI Tenet’s support and tenure that ‘‘lasted longer
than the ‘task force phase,’’’ according to Vernon Loeb of the Washington
Post.11 In 2000, the career track allowing ‘‘analysts to rise to very senior
rank without branching out into management’’ took form as the Senior
Analytic Service.12 Also that year, the improved new analyst program
began operating through the newly created Sherman Kent School. The
school was named for Kent, chairman of the CIA’s Board of National
Estimates from 1952 to 1967, and considered ‘‘a leader in developing the
profession of intelligence analysis.’’13 At the dedication of the Kent School
in May 2000, Tenet praised Kent as a ‘‘brilliant teacher to generations of
analysts,’’ and expressed his wish that ‘‘this school . . . [will] always produce
analysts of whom he would be proud.’’14
MECHANISM OF IMPROVEMENT
The Kent School was created to improve CIA’s analytic quality, in part by
increasing the expertise of its analysts. At the opening ceremonies, Tenet
stated that the Kent School ‘‘will prepare generations of men and women
for the . . . profession of intelligence analysis in the 21st Century . . . [by
teaching] the best of what we as an Agency have learned about the craft of
analysis.’’15 According to Martin Petersen, director of the CIA’s human
resources department, the Kent School’s creation ‘‘sends a very, very
powerful message about what you value, and the value here . . . is analytic
expertise.’’16 The Kent School improves the CIA’s analytic production by
providing analysts with knowledge that they otherwise would acquire
haphazardly or not at all in on-the-job training.
Its mission of increasing analyst expertise corresponds with the efforts of
many other professions, and even other intelligence organizations, that
train employees as part and parcel of organizational improvement
programs. Training programs, in general, provide information specific to
the needs of the institution, and different institutions within the
governmental foreign policy process use training programs to bolster their
unique informational needs. For example:
The State Department’s Foreign Service Institute provides area and language
studies helpful to outgoing State Department Foreign Service Officers and
other government officials;
The Department of Defense’s (DOD) command colleges — including the Army
War College, Naval War College, and National Defense University among
others — provide rising military officers with a more advanced conceptual and
informational context within which to make command decisions;
The Joint Military Intelligence Training Center (JMITC) specializes in providing
short-course training to the DOD’s military intelligence analysts, according to
Ken Olson, the chief of JMITC’s General Intelligence Training Branch. Its
companion institution, the Joint Military Intelligence College (JMIC), provides
broader education consisting of degree and certificate programs in intelligence
at the graduate and undergraduate level.17
Training is intended to increase expertise, which is ‘‘the skill of an expert,’’
and this skill is crucial to the production of accurate intelligence analysis.18
‘‘In the lexicon of US intelligence professionals,’’ observes intelligence
scholar Loch K. Johnson, ‘‘analysis’ refers to the interpretation by experts
of unevaluated (‘raw’) information collected by the intelligence
community.’’19 The more expert an analyst is, the more likely that he or
she will produce a higher quality product. DCI Tenet described the
importance of an analyst’s expertise at the Kent School’s dedication:
In our [DI] it is not enough just to make the right call. That takes luck.
You have to make the right call for the right reasons. That takes
expertise. It is expertise — built up through study and experience — that
combines with relevance and rigor to produce something that is very
important: insight. . . . [O]ur analysts blend a scholar’s mastery of detail
with a reporter’s sense of urgency and clarity. At its best, the result is
insight. And it is insight that wins the confidence of our customers and
makes them want to read our publications and listen to our briefings.20
The DI’s exploitation of analytic expertise in its production of finished
intelligence entails the integration of a complex web of analytic specialties
to produce multi-disciplinary analysis. Most analysts are hired not by the CIA
centrally, or even by the DI, but by individual DI offices; they are assigned to
‘‘groups’’ which cover specific geographic areas, and then assigned a
functional specialty — ‘‘discipline’’ or ‘‘occupation’’ in DI terminology —
such as political, military, economic, leadership, or scientific, technical, and
weapons intelligence.21 For the most part, CIA analysts possess a very
small area of direct responsibility, defined by a combination of their
regional area and discipline, work in country ‘‘teams’’ with analysts of
other disciplines, and interact with other regional or disciplinary specialists
as the need arises. Therefore, an analyst’s expertise can vary, depending on
the relative possession of regional knowledge, disciplinary theory, and
intelligence methods in general:
Regional expertise is essentially area studies: a combination of the geography,
history, sociology, and political structures of a specific geographic region. The
DI’s regional offices, such as the Office of Near East, South Asia and Africa,
are responsible for an analyst’s regional expertise and develop it by providing
access to language training, regional familiarization through university courses,
or in-house seminars.
Procedural expertise is knowledge of ‘‘tradecraft’’: the sometimes unique methods
and processes required to produce intelligence analysis. The Kent School is the
primary repository for tradecraft information and its training programs are the
primary means of distribution to both analysts and managers.
Disciplinary expertise is the theory and practice that underlies the individual
analytic occupations. For example, political, military, economic, and
leadership analysis are all built on beds of theory derived from the academic
disciplines of political science, military science, economics, and political
psychology, respectively. Disciplinary expertise can be acquired through
previous academic coursework, on-the-job training, and through occupational
‘‘representatives’’ distributed throughout the offices as focal points for
disciplinary resources.
The CIA’s small analytic niches create specialists, but their specialties must
be reintegrated in order to provide high-level policymakers with a bigger
picture that is more accurate and balanced than the limited perspective or
knowledge of the niche analyst. This process of reintegration includes a
procedure known as ‘‘coordination,’’ in DI parlance, which allows analysts
of all kinds to weigh in with their niche expertise on pieces of finished
intelligence before they are disseminated. According to CIA analyst Frank
Watanabe: ‘‘We coordinate to ensure a corporate product and to bring the
substantive expertise of others to bear.’’22 Accordingly, the bureaucratic
norm is that an analyst will make every effort to coordinate a draft with
other analysts in related regional or disciplinary accounts prior to
submitting the draft to management for editing and dissemination. As a
result, while the expertise of the primary drafter of the piece of intelligence
analysis usually has primary influence on the accuracy of the final piece,
the coordination process exerts a strong influence as well.
Inaccuracies in the analytic product can occur as a result of insufficient
substantive and disciplinary expertise. Columbia University Professor
Robert Jervis points out that ‘‘a grav[e] danger lies in not having sufficient
expertise about an area of a problem to detect and interpret important
trends and developments. To make up for such deficiency, analysts tend to
impose on the information the concepts, models, and beliefs that they have
derived elsewhere.’’23 In addition, in 1991 former DCI Stansfield Turner
noted that: ‘‘Another reason for many of the analytic shortcomings is that
our analytical agencies do not have an adequate grasp of the cultures of
many countries with which we must deal.’’ He goes on to suggest that
analysts could use ‘‘a better opportunity to attend academic institutions,
participate in professional conferences, travel and live abroad, acquire
language skills and thus become true experts in their areas.’’24 Robert
Jervis adds ‘‘adequate training’’ and ‘‘first-hand exposure to the country’’
as additional methods to increase expertise.25
Expertise is not necessarily sufficient to produce accuracy or prevent
failure, though. Jervis notes that: ‘‘experts will [not] necessarily get the
right answers. Indeed, the parochialism of those who know all the facts
about a particular country that they consider to be unique, but lack the
conceptual tools for making sense of what they see, is well known.’’26
In addition, ‘‘even if the organizational problems . . . and perceptual
impediments to accurate perception were remedied or removed, we could
not expect an enormous increase in our ability to predict events’’ because
‘‘the impediments to understanding our world are so great’’ that
‘‘intelligence will often reach incorrect conclusions.’’27 Human cognitive
limitations require analysts to simplify reality through the analytic process,
but reality simplified is no longer reality. As a result, ‘‘even experts can be
wrong because their expertise is based on rules which are at best blunt
approximations of reality. In the end any analytic judgment will be an
approximation of the real world and therefore subject to some amount of
error,’’ and analytic inaccuracies—and sometimes intelligence failure—will
be inevitable.28,29
In sum, analyst expertise is an important component of analytic quality,
but even if the Kent School’s analytic training were ideal it could never
increase expertise to the point that perfection would be attainable. In
addition, the CIA’s interdependent analytic structure implies that training
can increase aggregate capabilities only through the cumulative effects of
multiple kinds of targeted training. Even so, that these factors limit the
Kent School’s potential to improve analyst quality does not preclude
improvement on the margins. If training can improve analytic quality it
does so by increasing the knowledge and skills at the analyst’s disposal.
The CIA’s intelligence analysts require an understanding of their role
within the foreign policy process, tools that help them structure and
analyze complicated issues, a theoretical framework to approach an issue
from a specific disciplinary perspective, such as political or economic
analysis, and the presentational skills necessary for busy policymakers to
incorporate into their decisionmaking processes. Yet, prior to the Kent
School’s creation, new analysts were not receiving the information they
needed to do their jobs effectively.
OVERCOMING PREVIOUS TRAINING DEFICIENCIES
The CIA’s central role in providing high-level policymakers with analysis of
national security threats and opportunities might be expected to require
stringent training for new analysts prior to conducting analysis and writing
reports for senior policymakers. In fact, CIA analysts have been hired and
assigned a desk at Agency headquarters without any analytic training
whatsoever, with two or four weeks of analytic training coming six months
or so later.30 For example, in 1996 the introduction to the CIA for new
analysts consisted of just one week of pro forma briefings regarding
security and other administrative issues, with nothing specific to analysis.
Within weeks, and in some cases days, these newly hired analysts were
producing finished intelligence—albeit limited and under the careful watch
of more senior analysts — without any official training whatsoever in
substance or methods.
This minimal formal training appears to have historical roots in the CIA.
According to former DDI and current Deputy Director of Central
Intelligence (DDCI) John McLaughlin: ‘‘Dick Lehman, the creator of what
we now call the President’s Daily Brief, once remarked that the basic
analytic training he got back in 1949 came down to a single piece of advice
from his boss: ‘Whatever you do, just remember one thing — the Soviet
Union is up to no good!’ Simple, but that said it.’’31 In addition, Mark
Lowenthal, a former staff director of the House Permanent Select
Committee on Intelligence, has said that CIA does not ‘‘do a lot of
training. . . . They say, ‘Congratulations, you’re the Mali analyst, have a
nice day.’’’32 The training process usually relied upon the analyst’s prior
formal education, combined with an initial period of sink-or-swim
adaptation to the DI. The sink-or-swim analogy is used frequently inside
the Agency to describe its junior analyst acculturation. In May 2001, the
Kent School’s former dean, Frans Bax, likened previous DI training to
being thrown into the deep end of a pool, and added that if the training or
mentoring ‘‘missed,’’ the analyst ‘‘floundered.’’33
In 1996, the CIA improved its training for junior analysts with the creation
of a month-long survey course for new analysts entitled ‘‘Fundamentals of
DI Tradecraft’’ (FDIT). Some of FDIT’s content was based on a
tradecraft course developed under the auspices of former DDI Douglas
J. MacEachin, and delivered to all analysts and managers.34 Nonetheless,
although FDIT was intended only as a limited survey course, it received a
fair amount of criticism internally in its first few runnings due to the slow
pace and inconsistent quality of instruction.35 It also demonstrated that a
mere month of training was too short to familiarize analysts with the
range of information that they needed to know to do their jobs correctly.
But these weaknesses proved instructive: the criticisms may have signaled
to the Analytic Depth and Expertise task force that new analyst training
held greater potential than FDIT was realizing. In the end, the task
force was able to use
the lessons derived from FDIT to inform the structure of the Kent School
and the content of its new analyst training program.
One member of the Analytic Depth and Expertise task force had been
intimately involved in FDIT’s development as a DI representative to the
CIA’s Office of Training and Education (OTE), and he contributed the
lessons he learned from this to the task force.36 After deciding to
recommend training for junior analysts, the task force began to prepare
the outlines of a proposal. Prior DI training programs run through OTE,
located in CIA’s Directorate of Administration, had difficulty obtaining
qualified instructors from the DI because of the prevalent belief that
non-analytic assignments in OTE were not career-enhancing. In addition,
OTE’s separate structure led to difficulties providing training tailored to
the DI’s needs. To overcome these problems, the task force recommended
that training be run through the DI so that it could be better integrated
with the DI’s requirements. Also suggested was a school structure that
would ensure continuity and a knowledge base for future training efforts.
A final suggestion was that the school be headed by a Senior Intelligence
Service officer with a presence on the DI’s corporate board in order to
provide greater ‘‘bureaucratic weight and budgetary authority’’ for the
school to meet its goals.37 Then-DDI McLaughlin implemented all the
suggestions. As a result, the Kent School’s location within the DI and
bureaucratic structure provide it with a solid base from which to deliver its
informational product.
Although the CAP used lessons derived from FDIT, it was, according to
Denis Stadther, ‘‘a clean-sheet exercise in developing new training that was
founded in the belief that comprehensive training for new analysts must
go beyond skills development to include history, mission, and values of
the DI as well as orientation to specialized knowledge and skills like
operations, intelligence collection, and counterintelligence issues.’’38
Comprehensive training could not be shoehorned into a one-month course
akin to FDIT, however, and the CAP’s length became a matter of some
dispute. According to the school’s former dean, Frans Bax, DCI Tenet had
‘‘chided the DI for not taking training as seriously as the [CIA’s
Directorate of Operations or DO]’’ — which has a year-long program for
trainees — and ‘‘instructed the DI to create a program as long as the
DO’s.’’39 As a result, the Analytic Depth and Expertise task force initially
proposed a ten-month long training program. But this was opposed by the
DI offices that were facing tight personnel resource constraints and
reluctant to lose their new analysts for that long. To accommodate their
concerns, the program was reduced from ten months to twenty-five weeks,
and yet retained ‘‘all [its] training components’’ by providing analysts with
fewer interim assignments to other components.40 The CAP’s curriculum
undergoes constant revision derived from analyst and manager feedback in
a search to make the program more effective. According to Denis Stadther,
between 2000 and 2002, the CAP’s length was reduced to twenty-two
weeks, as the curriculum was refined due to student feedback and a
comprehensive review of the curriculum by the CAP faculty.
CAP’S TRAINING CURRICULUM41
By February 2002, the CAP had become a twenty-two-week program which
‘‘teaches newly hired analysts about the CIA’s and the DI’s history and
values and develops the basic skills essential to an intelligence analyst’s
successful career in the directorate.’’42 The initial week entails an
introduction to intelligence topics, including the history, mission, and
values of the CIA, and a unit on the history and literature of intelligence
that is taught by the history staff of the Agency’s Center for the Study of
Intelligence.43 The following few weeks introduce analysts to a variety of
DI skills, including analytic thinking, writing and self-editing, briefing,
data analysis techniques, and a teamwork exercise akin to an ‘‘outward
bound’’ for analysts. Training in these kinds of basic skills is necessary,
even for CIA analysts with advanced degrees, because intelligence analysis
focuses on the informational needs of policymakers who ask different
questions than those encountered in academia. Also, briefing skills are not
emphasized in academia but are vital for the effective distribution of
information within government. In addition, analysts are not accustomed
to writing in the specialized writing style of the DI, which emphasizes
bottom-line judgments at the beginning of the piece, while strategically
using evidence to bolster the main conclusions. This training is reinforced
throughout the CAP, as analysts write an intelligence paper in which they
self-consciously apply lessons learned to a real-world analytic problem.44
The first five weeks of training culminate in a task force exercise that
provides analysts with the opportunity to take the skills they have learned
and practiced in a classroom setting and apply them in a more realistic one.
After five weeks in the classroom the CAP students go on a four-week
interim assignment that helps them understand how the DI relates to other
components of the CIA, the intelligence community, and the policymaking
agencies they serve. The analysts — in consultation with their managers —
identify positions within the bureaucracy that relate to their assignments
and that provide working knowledge of units each analyst will either work
with or rely upon throughout his or her career.45
seeing how intelligence analysis is viewed and used in other components of
the policymaking bureaucracy can help new analysts understand their role
and perform it better, according to Denis Stadther. These interim
assignments can entail a stint at a military command—such as the Pacific
Command headquartered at Pearl Harbor — which provides the analyst
with insight into working with a military customer. These assignments can
have other benefits as well. In early 2001, DDCI McLaughlin spoke
enthusiastically of the insight that a new analyst can acquire from visiting
Pearl Harbor and seeing first-hand the reason the CIA was created, and
how he or she fits into a fifty-year tradition of working to prevent similar
destruction.46
After this four-week interim assignment, the analysts return to the
classroom for four more weeks of training in more advanced topics, such
as writing and editing longer papers and topical modules that address
issues such as denial and deception, and indicators and warning. 47
According to Stadther, these are special kinds of analysis that require
application of more advanced and sophisticated analytic tradecraft
techniques. The analysts also broaden their knowledge of other positions
within the intelligence community by visiting other intelligence agencies
and participating in an ‘‘ops familiarization course’’ which provides a brief
exposure to the intense training that a DO case officer receives.48 The
analysts are also provided with greater information on tradecraft topics —
consisting of concepts, techniques, and methodologies—that form the core
of the intelligence analysts’ specialized skill set, which they can use to
rigorously and self-consciously analyze raw intelligence data. This skill set
includes adapting the study of intelligence issues to the needs of the user,
presenting complicated issues in simple terms stripped of long-winded
assumptions, qualifications, and background, and addressing issues related
to controversial foreign policies and policy goals without taking a policy
position.49 Specifically, the CAP uses Sherman Kent’s ‘‘Principles for
Intelligence Analysis’’ as guidelines for its students to follow as they learn
the craft of analysis.50 These guidelines — listed here in full in the
Appendix — emphasize the importance of intellectual rigor, the conscious
effort to avoid analytical biases, a willingness to consider other judgments,
and the systematic use of outside experts as a check on in-house blinders, as
well as the importance of candidly admitting analytic shortcomings and
learning from mistakes.
The CAP also teaches other tradecraft skills, such as alternative analysis
techniques including, but not limited to, scenario building and competitive
‘‘A/B Team’’ approaches that entail periodic checks on analysis by
external experts.51,52 In addition, the CAP assigns Richards Heuer’s
writings to provide analysts with tips and techniques for overcoming flaws
and limitations in the human thinking process.53 The CAP also uses
analytic failure as a teaching tool for what not to do. Since much of the
intelligence analysis tradecraft has been derived from intelligence failure,
‘‘the lessons of . . . high-profile CIA screw-ups form the core of the Kent
School curriculum,’’ according to reporter Bob Drogin of the Los Angeles
Times. Drogin quotes the school’s dean as saying, ‘‘We spend a lot of time
in this course studying mistakes.’’54
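The core technique in Heuer's writings, Analysis of Competing Hypotheses, can be illustrated with a brief sketch. The hypotheses, evidence ratings, and scoring rule below are invented for illustration and are not drawn from any actual curriculum material or intelligence product.

```python
# Toy sketch of Heuer-style Analysis of Competing Hypotheses (ACH) scoring.
# The hypotheses and evidence ratings here are wholly hypothetical.

# Each hypothesis is rated against four pieces of evidence:
# "C" = consistent, "I" = inconsistent, "N" = neutral/ambiguous.
matrix = {
    "H1: routine military exercise": ["C", "I", "C", "I"],
    "H2: preparation for attack":    ["C", "C", "C", "N"],
    "H3: political signaling only":  ["I", "N", "C", "I"],
}

def inconsistency_score(ratings):
    # ACH weighs disconfirming evidence most heavily: the working
    # hypothesis is the one with the FEWEST inconsistencies, not the
    # one with the most confirmations.
    return ratings.count("I")

ranked = sorted(matrix, key=lambda h: inconsistency_score(matrix[h]))
for hypothesis in ranked:
    print(hypothesis, "->", inconsistency_score(matrix[hypothesis]), "inconsistencies")
```

Ranking by disconfirmation rather than confirmation is the point of the exercise: it forces the analyst to give each alternative a fair hearing instead of collecting support for a favored view.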
The analysts then go for a second four-week interim assignment, and come
back for a final four-week classroom session that deals with more advanced
topics. For example, a session on politicization and professional ethics
teaches analysts ‘‘how to walk the fine line that separates effective
intelligence support for policymakers from prescribing policy,’’ according
to Stadther. Another session, entitled ‘‘Writing for the President’’ and
taught by senior DI editors, provides insight and lessons learned in writing
for the CIA’s most senior and demanding policymakers.
provides new analysts with a ‘‘roundtable on customer relations’’ in which
experienced analysts provide their hard-earned lessons in dealing with
Congress, the public, and the press, and a ‘‘secrets of success panel’’
providing tips of what does and does not work from analysts with a few
years of experience.
Once again, the classroom session ends with a task force exercise. This time
it entails a two-day terrorist crisis simulation that takes place outside the
classroom, and requires total immersion to handle the fifteen-hour days.
The CAP instructors, using this as an opportunity to present analysts with
situations they might see later in their careers, spike the simulation with
dilemmas to force the analysts to improvise and apply the skills they
learned throughout the course. As reporter Bob Drogin notes, it is
‘‘deliberately designed to be intensive and stressful.’’55 Finally, at the end
of the program, the CAP provides written evaluations of the students to
their home offices, documenting their skills, abilities, and performance
during training.56 In sum, the CAP’s twenty-two weeks provide analysts
with information unique to the realm of intelligence analysis, as well as
‘‘greater opportunity for individual growth in understanding of the
profession, and hopefully a greater self-conscious rigor in the use of
analytical tools.’’57
In addition to more rigorous content, the CAP utilizes interactive teaching
methods to maximize analyst learning and acquisition of expertise.
The training is intended to improve analyst performance. According to Kent
School Dean Dan Wagner, rather than risk student disengagement from
abstract lecture courses, the programs use varied and active teaching
methods which are ‘‘very important’’ to provide multiple pathways for
students to apply their unique learning styles.58 The CAP also
employs the case study method to make its lessons more transferable to
real-world situations.59 Former CIA instructor Thomas Shreeve has noted that
‘‘cases fit best in those parts of the CIA curriculum in which the teaching
objectives included helping students learn how to analyze complex,
ambiguous dilemmas and make decisions about how to resolve them.’’60
In sum, the CAP provides analysts with knowledge that they could not
acquire elsewhere, and improves on prior efforts in this area. If these
lessons are absorbed and subsequently applied, analytic quality may
improve. But training effectiveness and the growth of analyst expertise will
depend on several variables, including student motivation, peer group
influence, and teaching methods. The CAP’s interactive teaching methods
aim to maximize learning, and the assessment it provides to each
analyst’s home office functions much like a report card, with similar
incentives. Some of that knowledge will fall by the wayside, however, as
analysts discard the lessons that are not relevant to their specific accounts.
Other lessons, such as the promulgation of Sherman Kent’s principles to
new analysts, are more goals than standards, and may be modified by
inevitable workplace pressures. For example, professional practices and
promotional concerns may militate against the universal application of
‘‘candid admission of shortcomings.’’ Nonetheless, the CAP’s emphasis on
practices such as coordination, cooperation, and admission of one’s own
analytic limitations provides the impetus to change the sometimes
hidebound internal DI culture. As its former dean, Frans Bax, noted in
2001, the Kent School is ‘‘trying to embed the concept of career-long
learning within the culture of the DI.’’61 By disseminating tradecraft and
techniques widely throughout the DI, individual analyst expertise will
increase. But the conclusion that this increased expertise will improve
the DI’s analytic quality has yet to be justified.
INCREASING ANALYST EFFECTIVENESS
The CAP appears to be an upgrade of the CIA’s past efforts to provide
analysts with job-relevant expertise, although measuring this improvement
from outside the institution is impossible. Hypothetically, evaluating
the CAP’s impact is possible through a simple pretest-posttest control
group research design.62 For example, to establish the efficacy of the
school in increasing new analyst expertise a test could be administered
both before and after training to measure any increases in analyst
knowledge. A control group composed of new analysts who remain on the
job instead of receiving training could be tested as well, and any changes
in analyst knowledge between the two groups would be attributable to the
training alone.
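The logic of this design can be sketched in a few lines. All test scores below are invented, and the inference assumes the trained and control groups are otherwise comparable; this is an illustration of the design's arithmetic, not a claim about any actual CAP data.

```python
# Minimal sketch of the pretest-posttest control group design described
# above, using wholly invented knowledge-test scores.

trained = {"pre": [62, 58, 70, 65], "post": [78, 74, 85, 80]}  # attended the CAP
control = {"pre": [60, 63, 68, 59], "post": [64, 66, 70, 61]}  # stayed on the job

def mean(values):
    return sum(values) / len(values)

def gain(group):
    # Average change from pretest to posttest within one group.
    return mean(group["post"]) - mean(group["pre"])

# Both groups may improve simply from time on the job; subtracting the
# control group's gain isolates the change attributable to training.
training_effect = gain(trained) - gain(control)
print(f"trained gain: {gain(trained):.2f}")
print(f"control gain: {gain(control):.2f}")
print(f"estimated training effect: {training_effect:.2f} points")
```

Subtracting the control group's gain is what distinguishes this design from a simple before-and-after comparison, which would credit training with improvement that on-the-job experience alone would have produced.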
In October 2001, the Kent School’s director implemented an alternative
evaluation design, hiring an outside contractor to interview CAP
alumni and their managers to assess the effect of training on their job
performance.63 CAP program director Denis Stadther provided an
overview of its conclusions. According to Stadther, all interviewees
considered the CAP to have been beneficial. In particular, line managers
noted that new analysts who went through the program appeared to
possess greater confidence and maturity in applying their skills on the
job. In addition, CAP analysts created lasting ties with their classmates,
and the use of these contacts increased their ability to work the system
in such job-related tasks as obtaining information from a counterpart
agency, coordinating with experts elsewhere within the government, and
finding a fellow analyst within the CIA who covers a particular issue.
Managers appreciated the analysts’ greater institutional expertise because
it meant that they required less intense mentoring to perform well on the
job. Stadther also noted that, contrary to the belief of early CAP
students that their promotion prospects might be adversely affected
by being pulled off the job for five months of training, anecdotal
evidence suggests that their promotion rates have been as good as or better
than those who did not attend the CAP. Stadther states that a good
performance by analysts in the CAP appears to translate well into good
job performance.
But he also emphasized that not all CAP student feedback was positive,
and that this led to adjustments in the curriculum. For example, when
analysts had difficulty relating to a theoretical unit intended to improve
interpersonal skills, the CAP’s program manager refocused it on the
practical application of those skills. Nonetheless, even though feedback
indicates that some aspects of the CAP’s curriculum require modification,
in the aggregate the CAP improves a new analyst’s individual effectiveness,
and this may improve the DI’s effectiveness on the margins.
Training also provides other benefits to analyst effectiveness that Stadther
did not mention as part of the study. For example, the CAP promotes a better
fit between analyst and assignment by encouraging analysts to pursue
transfers to units where their skills or interests may be more effectively
applied.64 The CIA’s placement policies are cumbersome due to the six-month
or greater lag between the time a vacancy opens up and the time a qualified
person can be hired, given the background checks and other steps
necessary to obtain security clearances. This lengthy process can lead to
initial assignments that are a misfit for an analyst’s knowledge and
abilities. In 2001, Frans Bax noted that the CAP ‘‘provides the opportunity
for officers to find a better fit for themselves,’’ and that ‘‘there is an infinite
range of career possibilities and if an analyst sees something better they
should go for it.’’65 He added that, in the one year the CAP had been in
operation, a number of analysts had switched offices and even directorates,
and that this was an intentional attempt to break out of the apprenticeship
model that had formerly formed the core human resource approach for the
DI. 66 Therefore, the CAP provides CIA with the opportunity to shift
personnel in a way that allows for interest and skill mixes to be applied
more effectively towards institutional goals.
In sum, the CAP provides analysts with a greater knowledge of the
institution and its practices that can make their job performance more
effective, and even side benefits like providing the opportunity to refine
placement policies with transfers can improve institutional performance on
the margins. But neither addresses the core reason the Kent School was
created: to improve the CIA’s analytic quality.
TOWARD IMPROVING ACCURACY
More fundamental to the mission of intelligence analysis than bureaucratic
savvy or presentational ability, however, is the accuracy of the finished
intelligence product. According to the CIA’s Website, its mission includes
‘‘providing accurate, evidence-based, comprehensive, and timely foreign
intelligence related to national security.’’67 Directly measuring the CAP’s
impact on analytic quality is impossible from outside the institution
because intelligence analysis is inherently classified. In addition, so
far as I know, neither the CIA in general nor the school in particular
measures the accuracy of the analytic product because accuracy in
intelligence analysis is a very difficult concept to operationalize.
Since intelligence analysis can influence what American national security
policymakers decide to do, and what they do has the potential to prompt
or preclude actions of other international actors, a hit-or-miss yardstick
would not effectively capture the quality of the analysis.68 For example, if
the CIA predicts that a terrorist bombing is imminent and policymakers
implement security procedures based on CIA warnings, and the terrorists
are deterred due to the increased security measures, then the intelligence
prediction may well be considered inaccurate even though it helped prevent
the bombing. This causal dynamic exists for all intelligence issues —
including political, economic, and scientific — due to the nature of the
intelligence mission. Therefore, post-hoc assessment of intelligence
accuracy may not provide a true sense of the quality of the analysis.
Instead, both the Kent School’s and the CIA’s evaluations focus on the
soundness of the analytic process because it is implicitly a modified version of
the scientific method.
Analytic Processes Familiar
The CIA analyst uses a version of the scientific method to create intelligence
analysis. The traditional term ‘‘intelligence cycle’’ describes how an analyst
integrates information collected by numerous entities and disseminates this
information to policymakers. As former DCI William Colby noted, ‘‘At
the center of the intelligence machine lies the analyst, and he is the fellow
to whom all the information goes so that he can review it and think about
it and determine what it means.’’69 While the ‘‘intelligence cycle’’ presents
this process in sequential terms, more accurately the analyst is engaged in
never-ending conversations with collectors and policymakers over the
status of international events and their implications for United States
policy. In converting raw intelligence data into finished analysis, analysts
interpret the international environment through an information processing
methodology approximating the scientific method.70 As intelligence author
Washington Platt noted in 1957: ‘‘The so-called ‘scientific method’ means
different things to different people, but the basic features are much the
same. These features are: collection of data, formation of hypotheses,
testing the hypotheses, and so arriving at conclusions based on the
foregoing which can be used as reliable sources of prediction.’’71
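Platt's four features map naturally onto a Bayesian update, in which new data shifts the weight of belief among competing hypotheses. The hypotheses, prior probabilities, and likelihoods below are invented purely for illustration.

```python
# Toy illustration of Platt's four features (data, hypotheses, testing,
# conclusions) as a single Bayesian update, with invented numbers.

priors = {
    "mobilization is defensive": 0.7,
    "mobilization precedes attack": 0.3,
}

# Assumed probability of observing one new report under each hypothesis.
likelihood = {
    "mobilization is defensive": 0.2,
    "mobilization precedes attack": 0.8,
}

def update(priors, likelihood):
    # Bayes' rule: posterior is proportional to prior times likelihood.
    unnormalized = {h: p * likelihood[h] for h, p in priors.items()}
    total = sum(unnormalized.values())
    return {h: v / total for h, v in unnormalized.items()}

# "Testing the hypotheses" against the report shifts the weight of
# belief even though neither hypothesis is proven or eliminated.
posterior = update(priors, likelihood)
for hypothesis, p in posterior.items():
    print(f"{hypothesis}: {p:.2f}")
```

The result is characteristic of intelligence judgments: a single piece of evidence rarely settles a question, but it can reverse which hypothesis deserves the greater weight.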
An analyst builds an operating hypothesis upon constructs found
in a particular country’s history, culture, regional dynamics, governmental
workings, or whatever aspect may be relevant to determine how that
country’s leaders might respond to foreign challenges. The Kent School
bolsters this knowledge with both procedural and disciplinary expertise to
provide a methodological and theoretical base for the production of
intelligence analysis. For example, academia develops theories to explain
outcomes in the realms of economics, politics, psychology, and
international relations, and these theories form the basis for any good
intelligence analysis. The analyst’s reigning conceptual framework is
grounded in disciplinary theory modified by knowledge of regional or
country-specific substance. Analysts simultaneously — and for the most
part instinctively — use inductive reasoning to find patterns amidst the
flood of data, and use deductive reasoning to provide meaning and context
for the patterns they find. Incoming bits of intelligence data are filtered
through this framework, and those that fit are imbued with meaning and
context, while those that do not fit are set aside as potential modifiers of
the concepts. As a result, intelligence analysis, when done accurately, is a
model of social scientific inductive and deductive reasoning in action.
Yet, despite the similarities to social science research, intelligence analysis
is more complex. Two methodologists note that intelligence analysis has
difficulties — including externally imposed time constraints which
necessitate analysis prior to acquisition of sufficient information, an
inability to control the variables under study, an unknown data quality
due to imperfections in the collection process and the possibility of
deception, an emphasis on prediction, and a focus on utility — that,
combined, make intelligence analysis more difficult than social scientific
research in an academic setting.72
Also, as in many specialties in the study of international relations,
intelligence analysts cannot test their hypotheses or replicate outcomes in
the real world, and instead must allow the progression of events to falsify
them. In addition, international relations writ large has a minimal
empirical base relative to other fields such as medicine or physics, meaning
that the etiology of certain kinds of activities, such as war or revolution,
cannot be aggregated into anything other than a very general and
uncertain theory. Unlike academics, though, the intelligence analyst does
not create theory, but must interpret facts that are themselves subject to
distortion and deception, at times under tight deadlines.73 This means that
high levels of uncertainty are implicit in most analytic judgments.
As a result of the high levels of uncertainty, intelligence analysis rarely has
certainty or proof and, for the most part, conclusions are tentative. In what
has become a cliché, ‘‘evidence collected by intelligence agencies is often
ambiguous and can lead to differing conclusions.’’74 In addition, Israeli
scholars Yitzhak Katz and Ygal Vardi cite an experiment demonstrating
that the same data can lead to differing conclusions, depending on the
conceptual structures applied. 75 The result of such ambiguity is an
environment of conflict and debate over the interpretation and implications
of intelligence data. Loch K. Johnson says that within the institutional
context ‘‘analysis is usually a messy affair, with incomplete information
and often heated debates over what the available facts may really mean
about the intentions of secretive adversaries in distant capitals. . . .
Uncertainty, ambiguity, debate, and partial answers to tangled questions
will remain an existential condition of the analytical process.’’76
Given this complex information environment, ignorance of, or negligence
in applying, social science techniques can lead to analytic inaccuracies. As
Robert Jervis observed in 1986: ‘‘Good intelligence demands three
interrelated conditions. Analysts should present alternatives and competing
explanations for a given even[t], develop the evidence for each of the
alternatives, and present their arguments as fully as is necessary to do
justice to the subject matter. This is not to imply, of course, that if these
conditions are met the resulting analyses will always be excellent but only
that their omission will substantially reduce the probability that the
product will be of high quality.’’77 At the extreme, analytic
inaccuracies due to a failure to apply tradecraft can lead to intelligence
failures. As intelligence scholar Robert Folker notes, ‘‘the root cause of
many critical intelligence failures has been analytical failure.’’78 Katz and
Vardi also argue that, although Israeli intelligence possessed ‘‘data
pointing to the probability of war,’’ it was surprised by the 1973 Yom
Kippur War because it ‘‘was incorrectly interpreted.’’ 79 In addition,
Georgetown Professor Walter Laqueur notes that sources of analytic
failure include ‘‘bias . . . [which is] an unwillingness . . . to accept evidence
contrary to their preconceived notions, . . . ignorance, lack of political
sophistication and judgment, [l]ack of training and experience, lack of
imagination, and the assumption that other people behave more or less as
we do,’’ which is known in intelligence parlance as ‘mirror-imaging.’80 In
sum, failure to apply appropriate social science standards of rigor and
procedure can lead to analytic inaccuracy. If so, then accuracy can be
improved on the margins by providing analysts with greater knowledge of
fact, theory, or procedure, and the application of methodologies to
simplify and structure their analysis.
Kent School: Improving CIA’s Analytic Procedure
The Kent School’s CAP provides analysts with improved methodological
knowledge and analytical tools that should provide a more rigorous basis
for their analysis. This greater rigor, approximating social science methods,
should increase the chance that the analytic product will be more accurate
than it otherwise would have been without training. Just as increases in the
quality of research methodology allow a researcher to more effectively
operationalize and measure the constructs of interest, better training in
methodology would hypothetically provide an intelligence analyst with a
better foundation from which to produce more accurate analysis.
Therefore, if the theory holds, then the Kent School will provide analysts
with the capability to produce more accurate analysis, so long as its
programs provide information that enables a more effective application of
the scientific method.
Many possible methodological tools can be used to organize and simplify
the process of intelligence analysis. As Washington Platt noted in 1957: ‘‘In
intelligence production as yet we find little study of methods as such. Yet a
systematic study of methods in any given field by an open-minded expert
in that field nearly always leads to worthwhile improvement.’’ 81 For
example, one JMIC publication — a distillation of a student’s Master’s
thesis — demonstrates that ‘‘analysts who apply a structured method
(specifically) hypothesis testing . . . to an intelligence problem, outperform
those who rely on . . . the intuitive approach.’’82 Just as social science is
structured upon a foundation of hypothesis falsification and revision which
requires that hypotheses be compared for accuracy and relevance,
intelligence analysts self-consciously compare hypotheses in a process
termed ‘‘competitive analysis.’’ This technique can be applied in particular
circumstances — e.g., when an issue of special importance is imbued
with partisan implications — to get at underlying assumptions that might
be biasing the analysis in one direction or the other. 83 Some have even
advocated the structural incorporation of this method through an
institutionalized ‘‘Devil’s Advocate’’ position or ‘‘Team B’’ exercises.
According to Richards Heuer, other tools that help analysts structure the
most appropriate approach to a particular intelligence problem include
‘‘outlines, tables, diagrams, trees, and matrices, with many sub-species of
each. For example, trees include decision trees and fault trees. Diagrams
include causal diagrams, influence diagrams, flow charts, and cognitive
maps. . . . [In addition, a matrix can be used] to array evidence for and
against competing hypotheses to explain what is happening now or
estimate what may happen in the future.’’ 84 Finally, R.V. Jones — the
‘‘father of modern scientific and technical intelligence’’ — suggests that
analysts apply Occam’s Razor — or ‘‘the simplest hypothesis that is
consistent with the information’’ — as a way to simplify the analysis of
contradictory or insufficient information.85
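Heuer’s matrix idea can be sketched in a few lines of code. This is only an illustration: the hypotheses, evidence items, and consistency scores below are invented, and the simple +1/0/−1 scoring scheme is a common didactic simplification, not the procedure taught in the CAP.

```python
# Illustrative sketch of an Analysis of Competing Hypotheses (ACH) matrix.
# All hypotheses, evidence, and scores are invented for demonstration.

# Consistency scores: +1 consistent, 0 ambiguous, -1 inconsistent.
hypotheses = ["H1: routine exercise", "H2: preparation for attack"]
evidence = {
    "troop movements near border":      [+1, +1],  # consistent with both
    "reserves not mobilized":           [+1, -1],  # cuts against H2
    "leadership communications normal": [+1, -1],  # cuts against H2
}

# Heuer's key point: rank hypotheses by how much evidence
# *contradicts* them, not by how much appears to support them.
inconsistency = {
    h: sum(1 for scores in evidence.values() if scores[i] < 0)
    for i, h in enumerate(hypotheses)
}

for h, n in sorted(inconsistency.items(), key=lambda kv: kv[1]):
    print(f"{h}: {n} piece(s) of inconsistent evidence")
```

Because most evidence is consistent with more than one hypothesis, counting supporting evidence would favor whichever hypothesis the analyst started with; counting inconsistencies instead forces attention to the evidence that can actually discriminate between explanations.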
The analytic tradecraft standards and techniques taught in the CAP were
developed in the 1990s. According to former CIA officer Jack Davis, former
DDI Douglas MacEachin articulated ‘‘corporate tradecraft standards for
analysts . . . aimed . . . at ensuring that sufficient attention would be paid to
cognitive challenges in assessing complex issues.’’86 Davis added:
MacEachin advocated an approach to structured argumentation called
‘‘linchpin analysis,’’ to which he contributed muscular terms designed
to overcome many CIA professionals’ distaste for academic
nomenclature. The standard academic term ‘‘key variables’’ became
drivers. ‘‘Hypotheses’’ concerning drivers became linchpins —
assumptions underlying the argument — and these had to be explicitly
spelled out. . . . MacEachin thus worked to put in place systematic and
transparent standards for determining whether analysts had met their
responsibilities for critical thinking.87
MacEachin then formalized the promulgation of methodologies to all DI
analysts. In 1996 and 1997, ‘‘nearly all DI managers and analysts attended’’ a
new course based on the standards entitled ‘‘Tradecraft 2000.’’88 In addition,
in 1997 Jack Davis authored ‘‘a series of notes on analytical tradecraft’’ that
encapsulated the lessons from the new course.89 According to the CIA
Website — where portions of these notes can be found — the notes
‘‘elaborate on some of the skills and methods used by DI intelligence
analysts’’ and have ‘‘become a standard reference within CIA for
practitioners and teachers of intelligence analysis.’’90 These methodologies
have become, in essence, analytic doctrine, and through them CIA analysts
have become familiarized with social science methodologies, if with a
modified vocabulary. In 2001, the CIA Website noted that: ‘‘Invariably,
analysts work on the basis of incomplete and conflicting information. DI
analysts are taught to clearly articulate what is known (the facts), how it is
known (the sources), what drives the judgements (linchpin assumptions),
the impact if these drivers change (alternative outcomes), and what
remains unknown.’’91 The analytic standards were subsequently
incorporated into the existing training courses for analysts, and maintain
their place in the Kent School’s curriculum today.
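The five tradecraft elements in the quotation above can be pictured as a simple structured record. The sketch below is hypothetical: the `AnalyticJudgment` class and its sample values are invented for illustration and do not represent an actual DI format; the fields merely mirror the five elements analysts are taught to articulate.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the DI tradecraft elements quoted above;
# not an actual DI format, just the five elements as named fields.
@dataclass
class AnalyticJudgment:
    facts: list          # what is known
    sources: list        # how it is known
    linchpins: list      # assumptions driving the judgment
    alternatives: list   # outcomes if the linchpin assumptions change
    unknowns: list       # what remains unknown
    judgment: str = ""   # the bottom-line conclusion

j = AnalyticJudgment(
    facts=["observed event X"],
    sources=["press reporting"],
    linchpins=["assumes leadership remains unified"],
    alternatives=["if leadership splits, expect outcome Y instead"],
    unknowns=["intentions of faction Z"],
    judgment="most likely outcome is W",
)

# A reviewer can verify that each linchpin has a spelled-out alternative:
assert len(j.alternatives) >= len(j.linchpins)
```

Writing the judgment down in this form makes MacEachin’s standard checkable: a draft with empty `linchpins` or `alternatives` fields has not met its critical-thinking obligations.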
In sum, the Kent School’s CAP provides analysts with the expertise
necessary to produce more accurate analysis. This conclusion, however, is
insufficient to address whether the school’s CAP actually improves the
quality of the CIA’s analytic output.
BUREAUCRATIC PROCESSES CAN STIFLE POTENTIAL
Hypothetically, the CAP could improve an analyst’s analytic accuracy, but
its potential could be stifled if the DI’s business practices prevent
application of the hard-earned expertise. Intelligence analysis takes place
within a specific organizational context, and an organization’s processes
can promote or impede the application of lessons learned in training, and
affect the ultimate accuracy of the intelligence product.
For example, the length of finished intelligence product emphasized by
the DI can impact expertise acquisition. The DI produces many types of
finished intelligence — some reportorial, some analytical, some estimative —
largely as a result of its attempt to meet policymakers’ varying needs for
information.92 The reportorial products — known as current intelligence —
tend to be shorter but more targeted to policymakers’ needs, while the
more estimative and analytic products can be longer and potentially less
relevant for policymakers due to the time of production and length of
product. Over time, the DI tends to emphasize the benefits of one product
length over another. In the 1980s, the DI emphasized the production of
longer pieces for the knowledge that they create, but after arguments
surfaced that these longer papers were of decreased utility for
policymakers, the DI promptly reversed course in the mid-1990s and
limited ‘‘most DI analytical papers . . . to 3 to 7 pages, including graphics
and illustrations.’’93 Presumably, this was done to increase the relevance of
DI products for policymakers.
But, while the production of a single piece of current intelligence has
limited effect on expertise because it draws on the knowledge and tools
that an analyst has developed through training and prior analysis, expertise
acquisition and application problems become endemic if the institution
emphasizes current intelligence over longer products. If the analyst does
not have time to think, learn, or integrate new information with old to
create new understandings, knowledge of facts and events may increase,
but the ability to accurately interpret these events decreases, as the analyst
does not have the opportunity to reinforce knowledge through the more
in-depth research and drafting processes. The reliance on brevity also
deprives analysts of the opportunity to apply ‘‘best practices,’’ including
the structured tradecraft methodologies and the benefits arising from
disciplinary theory. By 1997, DI management was making efforts to
address the decreased expertise levels resulting from excessive current
intelligence production.94 In the end, the Kent School’s effectiveness in
improving analytic accuracy may depend in part on whether the CIA
provides its analysts with assignments that allow them to utilize the
expertise acquired in training, while still providing relevant current analysis.
The length of time that the DI encourages an analyst to spend on an
account can also impact an analyst’s expertise, and by extension, accuracy.
In 2001, former CIA Inspector General Frederick P. Hitz argued that
CIA’s analysts have ‘‘tended to be 18-month wonders who hopscotch from
subject to subject without developing any broad, long-term expertise in any
one area,’’ and as a result ‘‘are ill-equipped to grapple with the
information that arrives on their desks.’’ 95 An analyst cannot apply
sophisticated methodological tools to intelligence data until he or she
becomes knowledgeable about the specific regional area or discipline of the
account. Frequent changes will prevent an analyst from achieving the
potential inherent in the analytic tools that the Kent School provides new
analysts. Therefore, for the Kent School to improve analytic accuracy
through the application of increased expertise, ‘‘analysts should be
expected to make a career of their functional or geographic specialty so
that the daily flows of information are added to an already solid base of
knowledge,’’ as Hitz suggests.
In sum, the DI must adapt its practices to provide analysts with the
greatest opportunity to apply the expertise they have acquired both on-the-job and in training if the potential inherent in the Kent School’s
programs to improve the DI’s analytic accuracy is to be fully realized. Yet,
despite the importance of institutional context in determining the CIA’s
analytic quality, few have studied the interplay between analyst and
institution. As Robert Jervis has concluded, ‘‘perhaps the intelligence
community has paid too [little attention] . . . to how the community’s
internal structure and norms might be altered to enable intelligence to be
worth listening to.’’96 Further studies of analytic structures and processes
must be done to determine which reforms would yield the greatest
improvement in analytic quality, and perhaps their implementation will take
CIA a step closer to creating ‘‘intelligence worth listening to.’’
IMPROVING TO MEET FUTURE CHALLENGES
In 1988, Loch Johnson said ‘‘probably the most important problem faced by
any intelligence agency [is] how to improve the quality of its product.’’97 The
Kent School is part of the CIA’s effort to do this. The school has apparently
achieved its goal of increasing analytic expertise by improving the Agency’s
training for new analysts. This training improves the analysts’ — and
consequently the DI’s — effectiveness, and should provide each analyst with
a greater ability to produce more accurate intelligence analysis. But, even if
increases in analyst expertise due to training can be documented, the
analyst may not have the time, inclination, or opportunity to apply
recently acquired expertise in the production of finished intelligence
analysis. Analytic training may be a necessary component for improvement
in analytic accuracy writ large, but is not sufficient in and of itself.
Therefore, complementary modifications may have to be implemented to
the DI’s business practices if the desired outcome — increased analytic
accuracy — is to occur. As such, the Kent School is a step in the right
direction, but further assessments are required to determine which changes
in business practices or organizational structure might magnify possible
improvements.
More broadly, other issues have implications for American intelligence.
The tragic events of September 2001 have highlighted the limitations of
institutionalized intelligence, including the problems inherent in intelligence
analysis, and present policymakers with the opportunity to redefine the
intelligence community of the next decades. This time, according to an
intelligence expert advising President George W. Bush on ways to improve
the workings of the intelligence community, ‘‘We are trying to treat the
intelligence community more like a corporation that should have goals and
a way to measure success or failure against those goals.’’98
Increased analytic quality, achieved through the Kent School’s Career
Analyst Program helps CIA reach that goal. Yet, in the broader context of
national security policymaking, improving the accuracy of the CIA’s
analytic output on the margins may not necessarily lead to improved
policymaking. Intelligence analysis is only one of many information
streams that the policymaker listens to, and may not be the most
important. In addition, a policymaker’s decisions will be affected by policy
agendas, tradeoffs, and other cognitive limitations that create ‘‘bounded’’
rationality. Therefore, if the researcher’s primary concern is more effective
national security policymaking, then it may be more useful in this limited-resource environment to assess how intelligence production can be
modified to better fit policymaker needs. In the end, the utility of intelligence
analysis may be just as important a criterion of analytic quality as is
accuracy.99
Nevertheless, understanding the options available to improve
policymaking is crucial, and improving the CIA’s analytic accuracy is one
of those options. As Robert Jervis has said, ‘‘We will never be able to do
as well as we would like, but this does not mean that we cannot do better
than we are doing now.’’100
The Kent School’s Career Analyst Program is a step in the right direction,
but more steps may be necessary in order to produce a measurable
improvement in the CIA’s analytic quality.
APPENDIX
Sherman Kent’s Principles for Intelligence Analysis. The Career
Analyst Program (CAP) Teaches These Principles:
1. Intellectual Rigor
Judgments are supported by facts or credible reporting.
All sources are reviewed and evaluated for consistency and credibility.
Uncertainties or gaps in information are made explicit.
2. Conscious Effort to Avoid Analytical Biases
State working assumptions and conclusions drawn from them explicitly.
Subject assumptions and conclusions to structured challenge: what developments
would indicate that they are wrong.
If uncertainties or the stakes of being wrong are high, identify alternative
outcomes and what it would take for each to occur.
3. Willingness to Consider Other Judgments
Recognize the limits to your own expertise and avoid treating your account as
yours alone.
Seek out expertise that will complement your own as a product is being prepared.
Strong differences of view should be made explicit.
4. Collective Responsibility for Judgment
Seek out and allow time for formal coordination of your product.
Represent and defend all Agency and DI views.
Make it clear when you express individual views; do so only when asked.
5. Precision of Language
Provide your most unique or new insight or fact quickly.
Use active voice and short sentences; avoid excessive detail; minimize the use of
technical terms. Follow DI writing guidelines.
Shorter is always better.
6. Systematic Use of Outside Experts as a Check on In-House Blinders
Seek out new external studies and experts relevant to your account and discipline
on a continuing basis.
Keep up with news media treatment of your account and consider whether their
perspective offers unique insight.
On key issues, indicate where outsiders agree or disagree with your judgments.
7. Candid Admission of Shortcomings and Learning from Mistakes
Recognize that intelligence analysis will sometimes be wrong because it must
focus on the tough questions or uncertainties.
Review periodically past judgments or interpretations; what made them right or
wrong; how could they have been better.
Alert the policymaker if you determine that a previous line of analysis was
wrong. Explain why and what it means.
8. Attentiveness To and Focus On Policymaker Concerns
Deliver intelligence that is focused on and timed to the policymaker’s current
agenda.
Make clear the implications of your analysis for U.S. policy.
Provide ‘‘actionable’’ intelligence that can help the policymaker handle a threat,
make a decision, or achieve an objective.
9. Never Pursue a Policy Agenda
Personal policy preferences must not shape the information presented or the
conclusions of intelligence analysis.
Politely but clearly deflect policymaker requests for recommendations on policy.
Intelligence helps the policymaker by reducing the range of uncertainty and
risk, and by identifying opportunities for action. It does not make the
choice for him.
REFERENCES
1. CIA Website, ‘‘Tenet Dedicates New School for Intelligence Analysis,’’ CIA Press Release, 4 May 2000. http://www.cia.gov/cia/public_affairs/press_release/archives/2000/pr050400.html
2. CIA Website, ‘‘Tenet Lauds Appointment of McLaughlin as Acting DDCI,’’ Press Release, 29 June 2000. http://www.cia.gov/cia/public_affairs/press_release/archives/2000/pr06292000.html
3. Bruce B. Auster, ‘‘What’s Really Gone Wrong with the CIA,’’ U.S. News and World Report, 1 June 1998, Vol. 124, No. 21, p. 27.
4. George J. Tenet, ‘‘The CIA and the Security Challenges of the New Century,’’ International Journal of Intelligence and CounterIntelligence, Vol. 13, No. 2, Summer 2000, pp. 140–141.
5. Various newspaper articles including: Bob Drogin, ‘‘School for New Brand of Spooks,’’ The Los Angeles Times, 21 July 2000, p. A-1; Tim Weiner, ‘‘U.S. Intelligence Under Fire in Wake of India’s Nuclear Test,’’ The New York Times, 13 May 1998; Walter Pincus, ‘‘CIA Chief Cited Loss of Agency’s Capabilities; Remarks Preceded Indian Bomb Tests,’’ The Washington Post, 25 May 1998, p. A4.
6. For a good presentation of the specific indicators, see Tim Weiner, ‘‘U.S. Intelligence Under Fire in Wake of India’s Nuclear Test,’’ op. cit.
7. James R. Asker, ‘‘Same Ol’, Same Ol’,’’ Aviation Week and Space Technology, 8 June 1998.
8. Carla Anne Robbins, ‘‘Failure to Predict India’s Tests Is Tied to Systemwide Intelligence Breakdown,’’ The Wall Street Journal, 3 June 1998, p. A8.
9. Ibid.
10. Denis Stadther was a member of this task force and provided all the information on the Kent School’s creation during an interview on 14 May 2001, except where otherwise indicated.
11. Vernon Loeb, ‘‘CIA Goes Deep Into Analysis; Agency Opens School, Elevates Analysts,’’ The Washington Post, 4 May 2000, p. A23.
12. CIA Website, ‘‘Tenet Lauds Appointment of McLaughlin as Acting DDCI.’’ Also see: CIA Website, ‘‘John E. McLaughlin: Deputy Director of Central Intelligence.’’ http://www.cia.gov/cia/information/mclaughlin.html
13. CIA Website, ‘‘Tenet Dedicates New School for Intelligence Analysis,’’ CIA Press Release, 4 May 2000.
14. CIA Website, ‘‘Remarks of the Director of Central Intelligence George J. Tenet at the Dedication of the Sherman Kent School,’’ 4 May 2000. http://www.cia.gov/cia/public_affairs/speeches/archives/2000/dci_speech_05052000.html
15. CIA Website, ‘‘Remarks of the Director of Central Intelligence George J. Tenet at the Dedication of the Sherman Kent School, 4 May 2000.’’
16. Vernon Loeb, ‘‘CIA Goes Deep Into Analysis; Agency Opens School, Elevates Analysts.’’ For more on Martin Petersen, see Vernon Loeb, ‘‘Tenet, Krongard Alter CIA Power Structure,’’ The Washington Post, 1 May 2001, p. A21.
17. Ken Olson, personal interview, 16 May 2001; Defense Intelligence Agency (DIA) Website, ‘‘About the JMIC.’’ http://www.dia.mil/Jmic/about.html
18. Merriam-Webster Website: http://www.m-w.com
19. Loch K. Johnson, ‘‘Analysis For a New Age,’’ Intelligence and National Security, Vol. 11, No. 4, October 1996, p. 657.
20. CIA Website, ‘‘Remarks of the Director of Central Intelligence George J. Tenet at the Dedication of the Sherman Kent School, 4 May 2000.’’
21. CIA Website, Directorate of Intelligence: Organizational Components. http://www.odci.gov/cia/di/mission/components.html (see: Council of Intelligence Occupations). See also CIA Website, Intelligence Disciplines. http://www.odci.gov/cia/di/work/disciplines.html
22. Frank Watanabe, ‘‘Fifteen Axioms for Intelligence Analysts,’’ Studies in Intelligence, unclassified edition, 1997. http://www.odci.gov/csi/studies/97unclass/axioms.html
23. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?’’ International Journal of Intelligence and CounterIntelligence, Vol. 1, No. 1, Spring 1986, pp. 31–32.
24. Stansfield Turner, ‘‘Intelligence for A New World Order,’’ Foreign Affairs, Fall 1991, p. 164.
25. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?’’, p. 33.
26. Ibid., pp. 31–32.
27. Ibid., pp. 28, 30.
28. Stephen Marrin, ‘‘Complexity Is in the Eye of the Beholder,’’ DI Discussion Database, 6 May 1997.
29. The inevitability of intelligence failure appears to be a consensus position in the intelligence failure literature and stems from Richard Betts’s 1978 article. Richard K. Betts, ‘‘Analysis, War and Decision: Why Intelligence Failures Are Inevitable,’’ World Politics, Vol. XXXI, No. 1, October 1978, pp. 61–89.
30. Author’s personal experience, circa 1996.
31. CIA Website, ‘‘Remarks of the Deputy Director of Central Intelligence John E. McLaughlin at the Conference on CIA’s Analysis of the Soviet Union, 1947–1991, Princeton University, 9 March 2001.’’ http://www.cia.gov/cia/public_affairs/speeches/ddci_speech_03092001.html
32. Tim Weiner, ‘‘Naivete at the CIA: Every Nation’s Just Another U.S.,’’ The New York Times, 7 June 1998.
33. Frans Bax, Presentation at an Association of Former Intelligence Officers (AFIO) Luncheon, Fort Myer, VA, 22 May 2001.
34. Jack Davis, ‘‘Improving Intelligence Analysis at CIA: Dick Heuer’s Contribution to Intelligence Analysis,’’ Psychology of Intelligence Analysis, Center for the Study of Intelligence, CIA, 1999, p. xix.
35. Author’s experience in one of FDIT’s earlier runnings in the fall of 1996.
36. Denis Stadther, personal interview, 14 May 2001.
37. That the Kent School dean resides on the DI’s corporate board is from CIA’s internal ‘‘newspaper,’’ What’s News at CIA, Issue number 703. Unclassified article: ‘‘DDI Establishes Sherman Kent School for Intelligence Analysis. Frans Bax Named Dean.’’ Unknown date.
38. Denis Stadther, personal correspondence, 6 September 2001.
39. Frans Bax, presentation at AFIO Luncheon.
40. Denis Stadther, personal correspondence, 6 September 2001.
41. Most of the information on CAP was provided by its program director Denis Stadther during an interview on 14 May 2001 and a second interview on 26 February 2002. All direct information on CAP stems from these two interviews, except where otherwise indicated.
42. Kent School publication, ‘‘Career Analyst Program: Preparing for a Career in the Directorate of Intelligence,’’ June 2001.
43. Frans Bax, presentation at AFIO Luncheon.
44. Bob Drogin, ‘‘School for New Brand of Spooks’’; Denis Stadther, personal interview, 14 May 2001.
45. Denis Stadther, personal interview, 14 May 2001. Re-emphasized on 26 February 2002.
46. John McLaughlin, personal conversation, 9 March 2001.
47. Frans Bax, presentation at AFIO Luncheon.
48. Ibid.
49. For greater information on intelligence tradecraft, see: Douglas J. MacEachin, et al., ‘‘The Tradecraft of Analysis: Challenge and Change in the CIA,’’ Consortium for the Study of Intelligence, Washington DC, 1994. Also see the DI Analytic Toolkit which contains ‘‘excerpt(s) from Notes on Analytic Tradecraft, published between 1995 and 1997, which elaborate on some of the skills and methods used by DI intelligence analysts. These notes become a standard reference within CIA for practitioners and teachers of intelligence analysis.’’ It can be found at: http://www.odci.gov/cia/di/toolkit/index.html
50. Sherman Kent’s ‘‘Principles for Intelligence Analysis’’ acquired from informational materials provided by Kent School officers, 14 May 2001. See Appendix for the full text of the document.
51. Frans Bax, presentation at AFIO Luncheon.
52. The ‘A/B Team’ approach stems from 1976 when former DCI George H. W. Bush commissioned a team of outside experts to review the same information that had led to a controversial National Intelligence Estimate on Soviet military spending. This outside ‘‘B Team’’ based its analysis on more pessimistic assumptions of Soviet intentions, interpreted intelligence of Soviet military spending accordingly, and came to a judgment more skeptical than the IC’s that was consistent with reigning conservative views. See Robert C. Reich, ‘‘Re-examining the Team A–Team B Exercise,’’ International Journal of Intelligence and CounterIntelligence, Vol. 3, No. 3, Fall 1989, pp. 387–388. See also Kevin P. Stack, ‘‘A Negative View of Competitive Analysis,’’ International Journal of Intelligence and CounterIntelligence, Vol. 10, No. 4, Winter 1997–1998, pp. 456–464.
53. Richards J. Heuer, Jr., Psychology of Intelligence Analysis (Washington, DC: Center for the Study of Intelligence, CIA, 1999).
54. Bob Drogin, ‘‘School for New Brand of Spooks.’’
55. Ibid.
56. Frans Bax, presentation at AFIO Luncheon.
57. Stephen Marrin, ‘‘The CIA’s Kent School: A Step in the Right Direction,’’ The Intelligencer: Journal of U.S. Intelligence Studies, Vol. 11, No. 2, Winter 2000, pp. 55–57.
58. Dan Wagner, personal interview, 23 August 2001.
59. Frans Bax, presentation at AFIO Luncheon; Denis Stadther, personal interview, 14 May 2001.
60. Thomas W. Shreeve and James J. Dowd, Jr., ‘‘Building a Learning Organization: Teaching with Cases at CIA,’’ International Journal of Intelligence and CounterIntelligence, Vol. 10, No. 1, Spring 1997, p. 104.
61. Frans Bax, presentation at AFIO Luncheon.
62. Donald T. Campbell and Julian C. Stanley, Experimental and Quasi-Experimental Designs for Research (Boston: Houghton Mifflin, 1963), pp. 13–22.
63. Denis Stadther, personal interview, 26 February 2002.
64. Frans Bax, personal interview, 14 May 2001.
65. Frans Bax, presentation at AFIO Luncheon.
66. Ibid.
67. CIA Website, ‘‘CIA Vision, Mission, and Values.’’ http://www.cia.gov/cia/information/mission.html
68. For more on the limitations of using accuracy to measure intelligence, see the strategic surprise and intelligence estimation literature. Of particular note, see Steve Chan, ‘‘The Intelligence of Stupidity: Understanding Failures in Strategic Warning,’’ The American Political Science Review, Vol. 73, No. 1, March 1979, pp. 171–180.
69. William Colby, ‘‘Retooling the Intelligence Industry,’’ Foreign Service Journal, January 1992, p. 21.
70. For more information on the processes of intelligence analysis, see Robert M. Clark, Intelligence Analysis: Estimation and Prediction (Baltimore, MD: American Literary Press Inc., 1996); Research: Design and Methods (Washington, DC: Joint Military Intelligence College, September 2000); David Schum, Evidence and Inference for the Intelligence Analyst (Lanham, MD: University Press of America, 1987).
71. Washington Platt, Strategic Intelligence Production: Basic Principles (New York: Frederick A. Praeger, 1957), p. 75.
72. Jerome K. Clauser and Sandra M. Weir, Intelligence Research Methodology (State College, PA: HRB Singer, Inc., 1976), pp. 37–46.
73. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 29.
74. Michael Dobbs, ‘‘U.S., Russia At Odds on Iranian Deal,’’ The Washington Post, 15 June 2001, p. A01.
75. Yitzhak Katz and Ygal Vardi, ‘‘Strategies for Data Gathering and Evaluation in the Intelligence Community,’’ International Journal of Intelligence and CounterIntelligence, Vol. 5, No. 3, Fall 1991, appendix, pp. 325–327.
76. Loch K. Johnson, ‘‘Analysis For a New Age,’’ p. 661.
77. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 33.
78. Robert D. Folker, ‘‘Intelligence in Theater Joint Intelligence Centers: An Experiment in Applying Structured Methods,’’ Joint Military Intelligence College Occasional Paper Number Seven, Washington, DC, January 2000, p. 3.
79. Yitzhak Katz and Ygal Vardi, ‘‘Strategies for Data Gathering and Evaluation in the Intelligence Community,’’ p. 313.
80
Walter Laqueur, ‘‘The Future of Intelligence,’’ Society, Vol. 35, No. 2, January–
February 1998, p. 301.
81
Washington Platt, Strategic Intelligence Production: Basic Principles (New York:
Frederick A. Praeger, 1957), p. 151.
82
Robert Folker, ‘‘Intelligence in Theater Joint Intelligence Centers,’’ p. 2.
83
Richards Heuer devotes an entire chapter to this approach entitled ‘‘Analysis of
Competing Hypotheses,’’ Psychology of Intelligence Analysis, pp. 95–109.
84
Ibid, p. 89.
85
R.V. Jones, Reflections on Intelligence (London: Heineman, 1989), pp. 87–88. The
characterization of R.V. Jones is attributed to former DCI James Woolsey at:
http:==www.odci.gov=cia=public_affairs=press_release=archives=1997=pr122997.html
86
Jack Davis, ‘‘Improving Intelligence Analysis at CIA,’’ p. xvii.
87. Douglas J. MacEachin, ‘‘The Tradecraft of Analysis: Challenge and Change in the CIA,’’ p. 1; Jack Davis, ‘‘Improving Intelligence Analysis at CIA,’’ pp. xvii–xix.
88. Jack Davis, ‘‘Improving Intelligence Analysis at CIA,’’ p. xix.
89. The reference to Davis authoring the tradecraft notes is not on the CIA’s Website. Instead, the citation appears in the foreword of the hardcopy version, which can be found at: http://intellit.muskingum.edu/intellsite/analysis_folder/di_catn_Folder/foreword.html; Jack Davis, ‘‘Improving Intelligence Analysis at CIA,’’ p. xix.
90. CIA Website, ‘‘DI Analytic Toolkit.’’ http://www.odci.gov/cia/di/toolkit/index.html
91. CIA Website, ‘‘Intelligence Analysis in the DI: Frequently Asked Questions.’’ http://www.odci.gov/cia/di/work/analyst.html
92. CIA Website, ‘‘Analytical Products of the DI.’’ http://www.odci.gov/cia/di/work/major.html
93. For the emphasis in the 1980s on longer papers, see Loch K. Johnson, ‘‘Making the Intelligence ‘Cycle’ Work,’’ International Journal of Intelligence and CounterIntelligence, Vol. 1, No. 4, Winter 1986, pp. 6–7. For the benefits of longer papers, see Arthur S. Hulnick, ‘‘Managing Intelligence Analysis: Strategies for Playing the End Game,’’ International Journal of Intelligence and CounterIntelligence, Vol. 2, No. 3, Fall 1988, pp. 322–323; and Loch K. Johnson, ‘‘Analysis For a New Age,’’ p. 665. For the backlash against longer papers, see Jay T. Young, ‘‘US Intelligence Assessment in a Changing World: The Need for Reform,’’ Intelligence and National Security, Vol. 8, No. 2, April 1993, pp. 129, 134. For the DI’s change in the 1990s, see ‘‘ ‘Say It Ain’t So, Jim’: Impending Reorganization of CIA Looks Like Suppression, Politicizing of Intelligence,’’ Publications of the Center for Security Policy, No. 94-D 74, 15 July 1994. http://www.security-policy.org/papers/1994/94-D74.html. For the backlash against shorter papers, see John Gentry, ‘‘A Framework for Reform of the U.S. Intelligence Community.’’
94. John E. McLaughlin, ‘‘New Challenges and Priorities for Analysis,’’ Defense Intelligence Journal, Vol. 6, No. 2, 1997, pp. 16–17. Also on the CIA Website at http://www.cia.gov/cia/di/speeches/428149298.html
95. Frederick P. Hitz, ‘‘Not Just a Lack of Intelligence, a Lack of Skills,’’ The Washington Post, 21 October 2001, p. B3.
96. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 41.
97. Loch K. Johnson, A Season of Inquiry (Chicago: Dorsey Press, 1988), p. 197.
98. Walter Pincus, ‘‘Intelligence Shakeup Would Boost CIA,’’ The Washington Post, 8 November 2001, p. A1.
99. In fact, in 2001 the Kent School’s former dean noted his preference for a utility measure over an accuracy one. Frans Bax, personal interview, 14 May 2001.
100. Robert Jervis, ‘‘What’s Wrong with the Intelligence Process?,’’ p. 30.