A red team is an independent group that challenges an organization in order to improve its effectiveness. Sandia National Labs uses red teams that attempt malicious entry in both the physical and cyber worlds. The United States intelligence community (military and civilian) has red teams that speculate about alternative futures and write articles as if they were despotic world leaders. There is little formal doctrine or published material about Red Teaming in the military.
Private businesses such as IBM and SAIC, and government agencies like the CIA and Sandia National Labs, have long used Red Teams. Red Teams in the United States military were used much more frequently after a 2003 Defense Science Board report recommended expanding their use to help prevent the shortcomings that led up to 9/11. In response to the 2003 report, the Army stood up its service-level Red Team, the Army Directed Studies Office, in 2004. This was the first service-level Red Team and, until 2011, the largest Red Team in the DoD.
One type of Red Teaming takes the form of penetration testers who assess the security of an organization, often without the organization being aware of the team's existence or its exact assignment. This type of Red Team provides a more realistic picture of security readiness than exercises, role playing, or announced assessments. A red team may trigger active controls and countermeasures in effect within a given operational environment.
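To make the penetration-testing form of red teaming concrete, the sketch below shows one of the most basic reconnaissance primitives such a team might use: checking whether a TCP port on a host accepts connections. This is a minimal, hypothetical illustration, not a description of any particular team's tooling; the `check_port` helper is an assumption introduced here, and the demonstration deliberately targets a listener the script itself creates rather than any external system.

```python
import socket

def check_port(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demonstrate against a listener we control: bind an ephemeral local port.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
open_port = listener.getsockname()[1]

reachable_while_open = check_port("127.0.0.1", open_port)
listener.close()
reachable_after_close = check_port("127.0.0.1", open_port)
```

A real engagement would wrap primitives like this in rules of engagement and authorization, since, as noted above, probing may trigger active controls and countermeasures in the target environment.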
In wargaming, the opposing force (or OPFOR) in a simulated military conflict may be referred to as a red cell (a very narrow form of Red Teaming) and may also engage in red team activity, which is used to reveal weaknesses in military readiness. The key theme is that the aggressor employs threat actors, equipment, and techniques that are at least partially unknown to the defenders. The red cell challenges operations planning by playing the role of a thinking enemy.
Red team activities challenge preconceived notions by demonstration and help elucidate the true problem state that planners are attempting to mitigate. They can also yield a more accurate understanding of how sensitive information is externalized, and highlight exploitable patterns and instances of undue bias in controls and planning.