These class notes were composed by Dr. Tom O'Connor for his course on Homeland Security at North Carolina Wesleyan College.

THE HISTORY AND LESSONS OF INTELLIGENCE FAILURE
"The result of shielding men from the effects of folly is to fill the world with fools" (Herbert Spencer)

    An intelligence failure can be defined as any misunderstanding of a situation that leads a government or its military forces to take actions that are inappropriate and counterproductive to its own interests (Shulsky & Schmitt 2002: 63).  It is a mistake to think that any human endeavor, including intelligence, will be error-free.  Enemies may be underestimated or overestimated, and events that should have been predictable go unforeseen.  Because intelligence work is the product of a team effort, certain peculiarities common to the bureaucratic environment help explain failure.  Arguably, the worst kind of intelligence failure is surprise attack.  Surprise attack can be defined as the result of a bureaucratic neglect of responsibility, or of responsibility so poorly defined or delegated that action gets lost (Schelling 1962).  Every surprise attack involves an intelligence failure, and some of the worst surprise attacks in history are as follows:

Examples of Surprise Attack

1. OPERATION BARBAROSSA -- the 1941 German invasion of Russia, which took Soviet intelligence by surprise.  In June 1941, Hitler sent 3 million German troops pouring into Russia along a front stretching from the Arctic Circle to the Black Sea.  The Russians had plenty of information about German troop movements eastward, and could not help but notice the increased number of aerial surveillance flights Hitler was sending over Russia.  Moreover, U.S. intelligence had told Russian intelligence of Hitler's plans to invade as early as 1940, and Russia was convinced that similar intelligence passed to them by the British was really disinformation.  Hitler ran two deception schemes.  He first explained the buildup of his troops on the Russian border as being there for training purposes, to prepare for the invasion of England (Operation Sea Lion).  He then explained them as being contingency forces against possible hostile Soviet action.  Stalin bought all this because his own intelligence led him to believe that Hitler would not dare try to fight a war on two fronts.
2. PEARL HARBOR -- in 1941, a task force of 33 Japanese ships stationed itself 200 miles north of Oahu and launched two successive waves of air attack (about 350 planes).  By the time the attack was over, the U.S. had lost 18 warships, 200 airplanes, and over 2,000 personnel.  Pearl Harbor is widely regarded as the worst intelligence failure in history.  No intelligence agency had prepared a report on the possibility of an attack there, although everyone talked about it.  Naval intelligence (ONI) did not have even a minimal amount of strategic or tactical intelligence; it thought Japan would attack Thailand about that time of year.  The underlying problem was that America lacked Human Intelligence (HUMINT) on Japan.  The U.S. had a few geisha girls on the payroll, but no agents in the Japanese elite.  The U.S. had broken Japanese codes, but what it was intercepting was diplomatic and espionage traffic (the movement of spies), nothing of the nature of military plans, and in any event the Japanese changed their codes the day before the attack.  Japanese radio transmissions deceived the Americans into thinking the task force was assembling for training maneuvers.
3. SEPTEMBER 11 "TWIN TOWERS" -- From 1998 to 2001, Osama bin Laden's terrorist network, al-Qaida, slipped new operatives into the U.S. from Hamburg and Bangkok while the CIA was watching and disrupting other sleeper cells.  Intelligence officials had long speculated on the use of airliners as weapons, and they knew bin Laden was a potent adversary from the 1998 attacks on the African embassies and the 2000 attack on the USS Cole.  Even so, the infiltrators managed to avoid attention, obtaining six different kinds of fake ID, and some even attended pilot training, a fact the FBI picked up on as early as 1998 and something British intelligence tipped the U.S. about in 1999, which led to a 1999 U.S. intelligence brief that political officials later claimed they never heard of or found vague.  In late 2000, the U.S. conducted a drill in which fictional terrorists flew fictional planes into buildings.  In early 2001, a flight school alerted the FAA about a suspicious student, and in the summer of 2001, several suspected terrorists were seen drinking and partying in Las Vegas.  Also in late summer 2001, a Phoenix FBI agent sent a warning memo to his supervisors, and Russian, Jordanian, British, and Israeli intelligence all tried to warn the U.S.  In August 2001, a senior FBI counterterrorism official quit, the Minnesota FBI began working with the CIA on the detained suspect Moussaoui, and the CIA issued another intelligence brief to the President.  On September 10, 2001, a group of top Pentagon officials suddenly canceled commercial flight plans, financial centers reported a surge in money transfers from banks in the World Trade Center, and suspected terrorists were again seen in bars, getting drunk and hurling insults at "infidels."


    Day of the 9/11 Attack: 19 terrorists got past airport security with box cutters, and 4 planes were hijacked -- two out of Boston, one out of Washington DC, and one out of Newark.  The two planes out of Boston crashed into the World Trade Center towers within 20 minutes of one another.  The one out of DC crashed into the Pentagon about a half hour later, and a half hour after that, the fourth plane crashed into a field in Pennsylvania after passengers stormed the cockpit.  The FAA notified NORAD about the first flight, but nothing could be done in the six-minute timeframe.  With the second flight, two F-15s gave chase, but even at 500 mph they could not catch the airliner in time.  The third flight, headed for the Pentagon, was off radar because it was flying so low.  The fourth flight presented a thirty-minute window of opportunity, but by then passengers had received cell phone calls about the attacks on the World Trade Center and heroically rushed the cockpit.

 

THEORIES ABOUT THE 9/11 ATTACK   

   

    There are many theories about who's to blame for the 9/11 attack, and unless more evidence is forthcoming, each of the three (3) main "conspiracy" theories remains little more than speculation.  Those conspiracy theories include: (1) the CIA did it; (2) Israeli intelligence did it; and (3) the Arabs did it, and the CIA let it happen.  This is not the place to go into those conspiracy theories; suffice it to say that belief depends heavily on viewpoint [see Harris Poll on Who's To Blame for 9/11].  There are those (e.g. Hart 2003) who spread the blame widely on a kind of post-Cold War fatigue, "peace dividend" mindset, or general "lassitude" that all Americans were caught up in at the time.  What is evident and clear is that catastrophic weaknesses were exposed: the world's passport & visa system failed (background checks were not done), the FBI operated in "dumb" pencil-and-paper mode, the CIA didn't have enough linguists and translators, airport security didn't know how to do a CAPPS screening, there was too much distrust about sharing secrets, and civilian agencies (like the FAA) didn't know how to handle military threats while military agencies (like NORAD) didn't know how to handle law enforcement threats.  The 2004 Executive Summary of the 9-11 Commission Final Report catalogued the specific intelligence failures that occurred.

    Anonymous (2004), aka ex-CIA Osama expert Michael Scheuer, who also authored Imperial Hubris and Through Our Enemies' Eyes, has disagreed with the official 9/11 Commission Report, saying it was erroneously based on a portrayal of the problem as the result of budgetary, structural, and organizational issues.  In a letter to the House and Senate Intelligence Committees, he laid out ten examples of "How Not to Catch a Terrorist," which are condensed as follows:

  1. in late 1996, a report about al-Qaeda's efforts to acquire nuclear weapons was suppressed within the CIA and reduced in length, and only after 3 members of the Bin Laden unit protested was the full report issued

  2. from 1996-1998, the CIA and another IC agency refused to share half the information that was exploited via a communications conduit used by Bin Laden and al-Qaeda

  3. from 1996-1999, the CIA received only two special operations officers from the military when they requested many more, and then had to wait 18 months to get the two they got

  4. from 1996-1998, verbatim transcripts (which are much more operationally useful than summaries) were never provided to the CIA by NSA

  5. in 1997, when the CIA had one officer (on loan from another IC component) who knew cold the issue of an upcoming al-Qaeda attack in a foreign city, that officer (an extraordinarily able analyst) was ordered back to her headquarters

  6. in 1998, the CIA's Bin Laden unit was ordered disbanded, then the DCI found out about it and preserved the unit

  7. from 1998-1999, CIA forces had at least ten chances to capture or kill Bin Laden, and in all instances, the assertion was that the "intelligence was not good enough" and joint military plans were also scrapped because senior officials from CIA, the Executive Branch, and other IC components decided to accept assurances from an Islamic country that it could acquire Bin Laden from the Taliban

  8. in 1998, following the embassy bombings in Africa, the CIA's Bin Laden unit briefly started receiving verbatim transcripts, but after receiving about a dozen of them, the flow stopped

  9. in 1999, Scheuer wrote a memo, which went unanswered, to senior CIA officials describing the problems of insufficient support and personnel and the at-best mediocre performance of Western European intelligence allies

  10. for a long time, and even today (2004), there has been no systematic effort to groom al-Qaeda expertise

GENERAL REASONS FOR INTELLIGENCE FAILURE

    Although details are important in ascribing causation to historical events (Carr 1961), it is perhaps more important to take the time to reflect on the problem of intelligence failure in general.  Numerous sources have analyzed the general reasons (Laqueur 1985; Lowenthal 2003; BBC 2004) and attributed the main causes to certain tendencies inherent in most bureaucracies.  The following is a list of those causes:

Bureaucratic Reasons for Intelligence Failure

1. Overestimation -- this is perhaps the most common reason for failure, and one which, if uncorrected, can lead to the continuation of error for a long time.  Examples include the long Cold War period in which the U.S. consistently overestimated the "missile gap" between the U.S. and Soviet Union.  Critics of the Iraq War say this was the main kind of error that happened in estimating Saddam Hussein's capabilities.
2. Underestimation -- this occurs when intelligence or political leadership seems unwilling to be receptive to warnings, or completely misreads the enemy's intentions.  A classic example is Stalin in 1941, who didn't want to hear about the possibility of Hitler invading Russia, even though the British and Americans tried to tip him off.  A primary cause is distrust of what foreign intelligence services are saying, and another may be the failure to listen to lower-ranking employees.
3. Subordination of Intelligence to Policy -- this happens when judgments are made to produce results that superiors want to hear instead of what the evidence indicates.  It is the most widely discussed and analyzed type of intelligence failure, although some discussions talk about a related error, bias.  With 9/11, there is the possibility that a "hands-off" policy toward Saudi Arabia interfered with intelligence over the hijackers, many of whom were from Saudi Arabia.
4. Lack of communication -- the lack of a centralized "fusion" office often creates this problem, but it more typically results from having different officials from different agencies with different rules, different security clearances, and different procedures governing who communicates with whom and how.  It also occurs when there are too few analysts, or when analysts work on-the-fly for different agencies and don't have full-time intelligence responsibilities.
5. Unavailability of Information -- regulations and bureaucratic jealousies are sometimes the cause of this, but the most common problem involves restrictions on the circulation of sensitive information.  When there is virtually no intelligence at all, this is called something else, ignorance. 
6. Received Opinion -- this is also called "conventional wisdom" and consists of assertions and opinions that are generally regarded in a favorable light, but have never been sufficiently investigated. Sometimes the people in a bureaucracy are forced to make "best guesses" on the basis of limited information.
7. Mirror-Imaging -- this is technically defined as "the judging of unfamiliar situations on the basis of familiar ones," but most often involves assessing a threat by analogy to what you (your government or a similar government) would do in a similar position. It is also the problem of having too many area specialists, like Kremlinologists or Sovietologists.
8. Over-confidence -- this occurs when one side is so confident of its ability that it projects its reasoning onto the other side and believes that since it would not do something itself, neither will the other side.  The classic case is the Yom Kippur war of October 1973, although the whole Cold War was characterized by this.
9. Complacency -- this happens when you know the enemy might do something, though you are not sure what or when, and yet you do nothing anyway.  The classic example is the British, who did nothing in the weeks leading up to the Falklands War of 1982.  A modern example is the way the international community sat on the sidelines during the Rwandan genocide.  There's a tendency in some circles to just let things run their course.
10. Failure to connect the dots -- this occurs when the connections between bits of intelligence are not put together to make a coherent whole (see the sketch after this list).  It is most easily observed in hindsight, and is perhaps the main cause behind how the 9/11 attacks caught American officials by surprise.
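
    To make the "connecting the dots" metaphor concrete, consider the following sketch.  It is a toy illustration only: the report IDs, names, and contents are invented placeholders, not drawn from any real case file.  Each fragment of reporting mentions a few entities, and a simple graph traversal groups together every report that shares an entity with another, which is one minimal way a fusion center could surface a cluster of related fragments that no single agency, looking only at its own reporting, would ever see as a whole.

from collections import defaultdict
from itertools import combinations

# Toy "reports": each fragment of intelligence mentions a few entities.
# All identifiers and names are fictional placeholders, not real case data.
reports = {
    "FBI-001":  {"Subject A", "Flight School X"},
    "CIA-007":  {"Subject A", "Foreign Facilitator B"},
    "FAA-042":  {"Flight School X", "Suspicious Student C"},
    "NSA-113":  {"Foreign Facilitator B", "Wire Transfer D"},
    "LOCAL-99": {"Unrelated Burglary E"},
}

# Build a graph whose nodes are report IDs; two reports are linked
# (a "connected dot") whenever they mention at least one common entity.
edges = defaultdict(set)
for (id1, ents1), (id2, ents2) in combinations(reports.items(), 2):
    if ents1 & ents2:
        edges[id1].add(id2)
        edges[id2].add(id1)

def connected_cluster(start):
    """Depth-first search: every report reachable from `start` via shared entities."""
    seen, stack = set(), [start]
    while stack:
        rid = stack.pop()
        if rid not in seen:
            seen.add(rid)
            stack.extend(edges[rid] - seen)
    return seen

# A single FBI fragment, viewed in isolation, looks minor; fused with the
# others it pulls in CIA, FAA, and NSA reporting on the same network.
print(sorted(connected_cluster("FBI-001")))
# -> ['CIA-007', 'FAA-042', 'FBI-001', 'NSA-113']

    In practice, of course, the traversal is the trivial part; the hard part is entity resolution, that is, deciding that "Subject A" in one agency's report is the same person hiding behind an alias in another agency's report.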

HOMELAND SECURITY AS CONNECTING THE DOTS

    In many ways, homeland security is all about connecting the dots, or in other words, about coordination and putting bits of intelligence together.  As both Seidman (1998) and Kettl (2004) have pointed out, "contingent coordination" is that all-elusive "philosopher's stone" of government work.  Every government administrator perpetually wonders if and when the day will come when that "magic formula" will be found for reconciling the irreconcilable, harmonizing competing and divergent interests, plugging every hole in the system, and overcoming the irrationality of public policy.  Part of the appeal behind homeland security is the perception that everyone agrees on the hard policy choices that need to be made in its name.  It packages diagnosis and solution together: connect the dots, and the problems will all be solved.

    It just isn't that easy.  First of all, there are clever terrorists waiting to exploit the smallest loophole.  Secondly, to plug every loophole would probably overwhelm system resources, impinge heavily on civil liberties, and place enormous overtime costs and demands on first responders.  Thirdly, bureaucracies simply don't function well at producing "new ways of doing things differently," especially when they adopt paramilitarism or a military model.  Since any good homeland security apparatus would do well to learn which models not to follow, let's examine some standard textbook critiques of bureaucracy and paramilitarism:

    The German sociologist Max Weber first introduced the idea of bureaucracy as a way to eliminate the managerial abuses inherent in charismatic models of leadership.  Subsequent writers, like Bennis (1966), have enumerated the characteristics of bureaucracy, and Auten (1985) was one of the first to draw attention to the limitations of paramilitarism.  The two sets of characteristics are compared below:

Characteristics of bureaucracy:
1 - division of labor by functional specialization
2 - well-defined hierarchy of authority
3 - system of rules for rights and duties of staff
4 - system of procedures for work situations
5 - impersonal relations between people
6 - promotion and selection by competence

Characteristics of paramilitarism:
1 - centralized command structure
2 - rigid differences among ranks
3 - military terminology
4 - frequent use of commands and orders
5 - rules and discipline strictly enforced
6 - creativity and change not encouraged

    Functional specialization sounds good, and establishing functions via efficient-looking organization charts is easy, but real specialization is more difficult.  Management textbooks say specialization is accomplished via size; just ask anyone who works in the field of cybersecurity, and they'll tell you they can't do the job with 4 or 5 personnel handling IT security for a firm of 15,000 people.  In addition, all bureaucracies are hierarchical, and hierarchical authority determines communication channels.  Vertical, or top-down, communication always becomes more important than horizontal and upward communication in a bureaucracy.  This means endless "stovepipes" will occur, where those in the field (and in the know) never get their ideas across to those in positions of power to take action.  All sorts of other communication problems are inherent, particularly with respect to the rights and duties of staff; just ask any human resources officer.  Rules and procedures give bureaucracies their aspect of formality, but sometimes informality is what you need, especially if you're sharing secrets or doing anything covertly.  Impersonality cannot be avoided under bureaucratic circumstances: when you work in a bureaucracy, you notice that people come and go (some more deplorable than others), but look at the building at night when everyone else has gone, and you begin to realize that you work in a thing that is bigger than the sum total of the people who work there.  That sensation is the feeling of being an interchangeable part, which is exactly what bureaucracy intends, and it has consequences.  Appointment and promotion on the basis of merit or competence means that rewards are not supposed to be handed out on the basis of familiarity, favoritism, or nepotism, but ask anyone who has been passed over for promotion because some coworker's relative got the job how that tastes and how it changes their perception.  In addition to all these critiques, note that bureaucracies tend to insulate command staff from line staff and line staff from the clientele of the organization.  A typical way to baffle a bureaucracy is to give it an unforeseen problem, something that cannot be solved easily or has never come up before.  The problem is routed up the chain of command for guidance and instruction, but the upper levels of the organization insist that it be solved at the lower levels, so the problem stalls at the middle management levels.

    As harsh as this critique of bureaucracy sounds, the textbook criticisms of the military model are just as pointed.  Centralized command and rigid rank differences tend to facilitate close supervision, sometimes too-close supervision, which stifles innovation and prevents reaping the rewards of training.  Military terminology is used (or more frequently abused) to create a warrior-like mentality, which, along with frequent commands and orders, produces an enemy to hate, fear, and destroy.  This is particularly dangerous because of the civil liberty implications.  Strict discipline has many effects, but some of the negative ones are a code of silence, a discouragement of whistleblowing, and a distinct tendency to look out for one's coworkers over and above the clientele one is supposed to be protecting.  Discouragement of creativity is selective in militaristic organizations (Joint Special Operations may be the exception, since it always encourages creativity).  Nothing is discouraged if it helps the cause of fighting the enemy; however, the reality is that anything intellectual or academic without an "applied" focus is discouraged.  Because of this, one would NOT expect homeland security, for example, to be the subject of a liberal arts graduate school curriculum, and that is unfortunate.  Paramilitarism suffers from the same kinds of communication blockages and inhibitions as bureaucracy, except that decisions don't get stymied in middle management; they get implemented in about a dozen different ways, so that one hand doesn't know what the other is doing.  Military-style organizations are also full of tradition; in fact, they worship it, which produces a commitment to outmoded modes of operation.  Finally, military-style organizations have a notorious tendency to mismatch talent with job positions and/or job titles.  Bureaucratic and military models do nothing to overcome the kinds of problems that Anonymous (2004), for example, outlined in the 9/11 failure.  Serious thought probably ought to be directed at developing new models of interorganizational structure and function.

PRODUCING AND DISSEMINATING WARNING INTELLIGENCE

     A relatively simple and common solution to the problem of intelligence failure is the establishment of fusion-based, all-source, independent intelligence entities which specialize in the production and dissemination of warning intelligence for homeland security.  This approach is not altogether new, as some federal agencies have had similar entities, and such entities are usually called a CounterTerrorism Center (CTC) or Joint Terrorism Task Force (JTTF).

    The CIA's CTC has existed since 1986, but the FBI has had a technology-challenged counterterrorism JTTF for much longer.  The first JTTF started in New York City in 1980 with 11 NYPD members and 11 FBI investigators operating under a memorandum of understanding (MOU) that they would be both responsive and proactive.  Today (2005), there are 16 Joint Terrorism Task Forces nationwide which utilize law enforcement at all levels and conduct intelligence sharing (under Patriot Act authorization) designed specifically to see that the right hand knows what the left hand is doing.  The JTTF concept is widely recognized in the law enforcement community as a good idea, but civil liberty advocates are extremely critical of it (asking, for example, why a JTTF would be operating in a small town or on a college campus).  President Bush, by Executive Order of August 27, 2004, officially made the CTC concept (not the JTTF concept) the main partner of homeland security, but for various political reasons, the CTC is linked to the power of a NID (National Intelligence Director, or intelligence czar).  The CIA will probably keep its top-secret Bin Laden unit, which is a kind of CTC.  The Department of Homeland Security also has an Information Analysis and Infrastructure Protection directorate, or IAIP, which is obligated by law to analyze and integrate homeland threats to produce "actionable" intelligence.  "Fused" intelligence will most likely take place in a more analytically oriented interagency organization newly created in 2003 -- the Terrorist Threat Integration Center, or TTIC (a build-out from the CIA, which means it operates out of a CIA building in the Tysons Corner area of Virginia).  The TTIC entity ensures that there is information sharing across agency lines.  There are other differences, as follows:

Comparison of the IAIP, CTC and TTIC Approach

    The IAIP approach is intended to provide actionable intelligence, which is a military intelligence term for a precise and timely assessment of the posture, or indications, of an enemy reflecting their preparations for hostilities or battle.  A time-honored military precept holds that intelligence should not estimate the intentions of an adversary, but only their capabilities (Grabo 2002).  Actionable intelligence is intelligence that is suitable for use, and a useful analogy is criminal justice: think of "actionable" as the evidentiary standards necessary to support legal action.  However, actionable intelligence is unlike criminal justice in the sense that one does not have to wait until the last piece of the puzzle is in place.  It is extremely dangerous to equate "actionable intelligence" with "complete intelligence" or to equate intelligence with evidence.  Actionable intelligence relies upon situational awareness.

    The CTC approach is intended to provide estimative intelligence, which is an understanding of what terrorists do, their motivation, their organizational assets, and their vulnerabilities (Kauppi 2002).  This is strategic work involving the application of inference and logic to patterns and trends.  However, warning intelligence is a subset of estimative intelligence, and every CTC entity contains a large warning component (Marrin 2003).  Warning intelligence usually focuses on sudden developments that could have a deleterious effect on U.S. security.  The CTC fulfills its warning function by immediately disseminating information to those who can counter the threat, and the warning may be tactical (within hours or days), operational (within weeks or months), or strategic (within months or years). The CTC approach also facilitates information sharing through the creation of personal relationships among workers from different agencies who have been rotated into work at the center.  In this way, the problem of "stovepiping" is avoided.  Information in a "stovepipe" is information that only goes up vertically within one agency.  Sometimes, the process of producing finished warning intelligence is complicated by the need to provide support for ongoing operations, but this is a common problem within the CIA between the DI (Directorate for Intelligence) and DO (Directorate for Operations). 
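
    The tactical/operational/strategic distinction above is simply a bucketing of estimated warning lead time.  The short sketch below makes that explicit; the day thresholds are assumptions chosen to track the "hours or days / weeks or months / months or years" language, not an official standard.

def warning_category(lead_time_days: float) -> str:
    """Bucket an estimated lead time into the warning categories described above.

    The cutoffs are illustrative assumptions, not doctrine: roughly ten days
    or less reads as "hours or days," up to about six months as "weeks or
    months," and anything longer as "months or years."
    """
    if lead_time_days <= 10:
        return "tactical"      # counter the threat within hours or days
    if lead_time_days <= 180:
        return "operational"   # within weeks or months
    return "strategic"         # within months or years

for days in (2, 45, 400):
    print(days, "days of warning ->", warning_category(days))
# 2 days of warning -> tactical
# 45 days of warning -> operational
# 400 days of warning -> strategic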

    The TTIC approach is intended to provide all-source fusion intelligence, which is information collected by other agencies but overlooked or discounted in some way, and presumably includes open source information that is freely available.  It has access to some 24 information systems and databases spanning the intelligence, law enforcement, homeland security, diplomatic, and military communities.  As of mid-2004, the TTIC was tracking about 100,000 known or suspected terrorists worldwide.  As a government-wide initiative, the TTIC approach is a team approach which attempts to avoid the problem of who is a consumer and who is a producer of intelligence.  Hulnick (1986; 1997) has examined this problem and found it is just as serious a barrier to CIA-FBI cooperation as cultural factors are.  TTIC's primary means of communication with its customers is TTIC Online, perhaps the world's most top-secret website, but one that has the potential for becoming as unclassified as LEO.  TTIC's goal of becoming a one-stop shopping center for terrorist information is also being pursued by reducing the number of documents marked "ORCON," which stands for Originator Control and is a kind of copyright or intellectual property procedure that the intelligence community uses on a sizeable number of documents.  ORCON and other classification/distribution controls cause serious problems with information sharing: if a document was produced by the CIA (the originator) but consumed by the FBI, and the FBI wants to share it with state and local law enforcement, the FBI must get permission from the originator before dissemination occurs.  One solution is to use so-called "tear lines," where classified documents are broken into sections.  Some sections contain summary information and others contain detailed information such as sources and methods.  The sections containing summary information can be disseminated to those with proper clearances; the sections containing sources and methods cannot.
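
    The "tear line" mechanism described above lends itself to a short illustration.  The sketch below is hypothetical: the section texts, the single releasable/not-releasable flag, and the filtering function are simplifications invented for teaching purposes, since real dissemination controls involve many more markings and caveats.  The idea is simply that a report is stored as marked sections, and only the material below the tear line is reproduced for a consumer who is not cleared for sources and methods.

from dataclasses import dataclass

@dataclass
class Section:
    text: str
    releasable: bool  # True = summary below the tear line; False = sources and methods

# A toy report: contents are invented placeholders, not real intelligence.
report = [
    Section("Threat summary: group X is planning an attack on the transportation sector.", True),
    Section("Recommended protective measures for state and local partners.", True),
    Section("Derived from intercepted communications of source GX-12.", False),
    Section("Collection platform and tasking details.", False),
]

def tear_line_version(sections):
    """Keep only the sections that may be passed below the tear line,
    i.e., the summary material stripped of sources and methods."""
    return [s.text for s in sections if s.releasable]

# What a state or local consumer with the proper clearance would receive:
for line in tear_line_version(report):
    print(line)

    An ORCON marking would add a further check on top of this: even the releasable sections could not be passed on without going back to the originating agency for permission.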

PROBLEMS WITH USING THE "BEST" INTELLIGENCE

     Good collection and good judgment are the key ingredients in producing good warning intelligence.  The most accurate warnings are going to come from a minority of individuals, so it is probably good that the U.S. (questions of redundancy aside) is "beefing up" the number of analysts in various agencies.  A large workforce of intelligence analysts should at least statistically improve the chances of avoiding surprise attack.  However, there are deeper issues at work in safeguarding against intelligence failure, and it is worth concluding with the reasons why even the "best" intelligence cannot always be used for warning purposes.

INTERNET RESOURCES
9/11 Intelligence Failure: The 2004 Final Report of the 9-11 Commission
9/11 Intelligence Failure: The 2003 Joint Congressional Inquiry Report
BBC Article on Intelligence Failure

Clausewitzian Friction & Future War (pdf)
Criminal Justice and Actionable Intelligence
FBI Counterterrorism Partnerships
FBI Efforts to Improve the Sharing of Intelligence
FBI Press Release (photographs) of 9/11 Hijackers
Intelligence Failure during the Korean War
Law Enforcement Intelligence Guide
ORCON Creep and Government Information Sharing (pdf)
Predicting Surprise Attack
Security Clearance Process for Law Enforcement
Security Controls on the Dissemination of Intelligence Information
The 9/11 Coverup 10-Page Summary
The Big Difference between Intelligence and Evidence
The Clausewitz Homepage
The Moynihan Commission Report: A Review
The Sherman Kent Center
Warrior Cops: The Ominous Growth of Paramilitarism in Police Departments

PRINTED RESOURCES
Alexseev, M. (1997). Without Warning: Threat Assessment and Intelligence. St. Martin’s Press.
Anonymous (Michael Scheuer). (2002). Through Our Enemies' Eyes. Dulles, VA: Brassey's.
Anonymous. (2004). Imperial Hubris. Dulles, VA: Brassey's.
Anonymous. (2004). "How Not to Catch a Terrorist." Pp. 50-52 in Atlantic Monthly (December).
Auten, J. (1985). "The Paramilitary Model of Police" In The Ambivalent Force by A. Blumberg and E. Niederhoffer (eds). NY: Holt, Rinehart & Winston.
Beesly, P. (2000). Very Special Intelligence: The Story of the Admiralty's Operational Intelligence Center, 1939-1945. London: Greenhill Books.
Bennis, W. (1966). Beyond Bureaucracy. NY: McGraw-Hill.
Ben-Zvi, A. (1979). "The Study of Surprise Attacks" Brit. J. of International Studies, Vol. 5.
Betts, Richard. (1978). "Why Intelligence Failures are Inevitable" World Politics, Vol 31, No. 1.
Betts, R. (1982). Surprise Attack: Lessons for Defense Planning. Washington DC: Brookings Institution.
Bullock, J., Haddow, G., Coppola, D., Ergin, E., Westerman, L. & Yeletaysi, S. (2005). Introduction to Homeland Security. Boston: Elsevier.
Carr, E. (1961). What is History? NY: Vintage Books.
Chan, S. (1979). "The Intelligence of Stupidity: Understanding Failures" Am. Pol. Sci. Rev. 73:633-50.
Clarke, R. (2004). Against All Enemies. NY: Free Press. [sample excerpt]
Ford, Harold. (1993). Estimative Intelligence. (2nd ed.) Univ. Press of America.
Gaddis, J. (2005). Surprise, Security & The American Experience. Cambridge, MA: Harvard Univ. Press.
Grabo, Cynthia. (2002). Anticipating Surprise: Analysis for Strategic Warning. DIA-Joint Military Intelligence College.
Hart, G. (2003). "Post-Cold War Lassitude Contributed to the Attack on America." Pp. 42-46 in M. Williams (ed.) The Terrorist Attack on America: Current Controversies. San Diego: Greenhaven.
Hart, P. (1990). Groupthink in Government. Baltimore: Johns Hopkins Univ. Press.
Hughes-Wilson, J. (2004). Military Intelligence Blunders and Cover-Ups. NY: Carroll & Graf.
Hulnick, A. (1986). "The Intelligence Producer-Policy Consumer Linkage: A Theoretical Approach." Intelligence and National Security 1(2): 212-233.
Hulnick, A. (1997). "Intelligence and Law Enforcement: The 'Spies Are Not Cops' Problem." International Journal of Intelligence and Counterintelligence 10(3): 269-286.
Kam, E. (2004). Surprise Attack: The Victim's Perspective. Cambridge, MA: Harvard Univ. Press.
Kauppi, M. (2002). "Counterterrorism Analysis 101." Defense Intelligence Journal 11(1): 39-40.
Kettl, D. (2004). System Under Stress: Homeland Security and American Politics. Washington DC: CQ Press.
Laqueur, Walter. (1985). A World of Secrets: Uses & Limits of Intelligence. NY: Basic.
Lowenthal, M. (2003). Intelligence: From Secrets to Policy, 2e. Washington D.C.: CQ Press.
Marrin, S. (2003). "Homeland Security and the Analysis of Foreign Intelligence." The Intelligencer 13(2): 25-36. [online version]
Pateman, R. (2003). Residual Uncertainty: Trying to Avoid Intelligence and Policy Mistakes in the Modern World. Lanham, MD: Univ. Press of America.
Prange, G., Goldstein, D. & Dillon, K. (1982). At Dawn We Slept: The Untold Story of Pearl Harbor. NY: Penguin.
Seidman, H. (1998). Politics, Position, and Power: The Dynamics of Federal Organizations, 5e. NY: Oxford Univ. Press.
Schelling, T. (1962). "Foreword" in R. Wohlstetter, Pearl Harbor: Warning and Decision. Palo Alto, CA: Stanford Univ. Press.
Shulsky, A. & Schmitt, G. (2002). Silent Warfare: Understanding the World of Intelligence. Washington DC: Brassey's.
Wohlstetter, R. (1962). Pearl Harbor: Warning and Decision. Palo Alto, CA: Stanford Univ. Press.

Last updated: 06/15/05