ActiveResponse.org

Re-Imagining Cyber Security

Category: Analysis (Page 1 of 2)

The Darker Side of Threat Intelligence: Cyber Stockholm Syndrome

Stockholm Syndrome is a psychological phenomenon described in 1973 in which hostages express empathy and sympathy and have positive feelings toward their captors, sometimes to the point of defending and identifying with the captors. - Wikipedia

Maturing as a threat intelligence analyst involves “living with your threat.”  In my interview process I ask potential analysts about threats they’ve tracked in their career.  Tracking a threat for months or years creates a unique learning environment and I look for that in analysts.  Unsurprisingly, in that environment an analyst becomes intimate with the adversary’s routines, their interests, and even begins to distinguish characteristics of individuals from within a larger group.  An analyst gets truly connected when they can successfully predict a threat’s activity.

However, while this sounds like an analytic panacea and also something threat intelligence production cells strive to build, it comes at a cost.  The risk is that analysts go beyond being closely connected and become “married” to a threat.  In living with that threat every day, spending all of your professional time studying them, spending hundreds of hours discussing them with others, it is impossible not to closely connect with the adversary on the other side of your screen.  Analysts become personally attached to the “bad guys” – a “Cyber Stockholm Syndrome.”  I personally know analysts who have fallen into depression when their threat goes away.

Not only is this unhealthy for the analyst, but the relationship also affects their communication and infects their analytic capabilities, reducing objectivity.

Symptoms of “Cyber Stockholm Syndrome”

  • An analyst gets particularly protective and defensive regarding perceived encroachment on their territory
  • An analyst unnecessarily hides intelligence and data, preventing others from knowing details in order to maintain their superiority
  • Overwhelming and obvious confirmation bias – an analyst “seeing their threat in everything”
  • An unwillingness to work on other threats even given clear direction and obvious priorities
  • An analyst continues to work on a threat even after the threat is “gone” against overwhelming evidence and analytic consensus

What may cause this?

One hypothesis: an analyst may associate their self-worth with an adversary.  As the analyst grows in mastery of an adversary, they produce spectacular intelligence and amazing insight providing great value to others; this results in praise from leaders and admiration from peers, creating a feedback loop.  The cycle strengthens the bond the analyst builds with a threat as the threat continues to provide value to the analyst.

What should happen?

When this happens, managers may respond by immediately separating the analyst from the threat.  I don’t believe that is the right answer.  Separation causes resentment and potential psychological problems such as depression.  Instead, managers should look to slowly incorporate other analysts into the equation and ultimately strive to return the analyst to a proper relationship with the threat so that valuable expertise is not lost.

Most importantly, analysts must recognize this problem in themselves – for their own professional and personal well-being.

Additional Discussion

Chris Sanders (@chrissanders88) made an excellent point that Stockholm Syndrome requires empathy with an aggressor, which is lacking in my description.  I agree that the syndrome’s description includes that requirement, but its exclusion from the DSM means there is no consistent definition.  Further, active academic discussion on the topic includes whether Stockholm Syndrome actually exists or is really one facet of a larger aggressor-bonding trait.  While empathy is not the right aspect of the bond I describe here, there is an attachment bond created, whether through the return on investment (ROI) the analyst receives from the adversary or otherwise.  This is evidenced by both the confirmation bias present and the sense of depression described by analysts.  I agree that the application of the Stockholm Syndrome label may be imprecise.

Cyber Threat Language Dilution

A “trojanized document” hides malware inside itself, but rarely do we call a webpage doing the same a “trojanized webpage”.  The word Trojan, derived from Homer’s epic poem, intended to describe a seemingly innocuous object containing damaging material, now describes almost all cyber threat delivery vectors.  The term “Trojan” in cybersecurity has become diluted to the point of nonsense.

Trojan is just one example in a diluted language space that now includes other terms like virus, rootkit, targeted, etc.  As the community grows in terms of both depth and breadth, it will carry with it historical baggage and loose terminology.  Poor phraseology will infect those writing on the topic who are not familiar with its nuances, further contributing to the problem.  Lastly, as cyber threats grow and change, the language must evolve as well, causing further issues.  For example, increased modularization of capabilities challenges attempts to clearly categorize with existing language.

This is a problem for effective threat intelligence communication.  Good threat intelligence accurately communicates the context of the threat, relating it to a risk environment.  A reliance on diluted language increases ambiguity, thereby decreasing accuracy and effectiveness.

My message to those responsible for communicating cyber threats: consider language dilution, both your own actions contributing to dilution but also leveraging diluted language and its effect on your customers.  Language dilution is a fact-of-life for any discipline, but how it’s addressed makes the difference.

 

CART: The 4 Qualities of Good Threat Intelligence

I write often of poor quality threat intelligence which pervades the security community.  Poor quality threat intelligence not only has a heavy cost on its consumers, it also threatens the confidence threat intelligence consumers place in their providers.  Confidence is the cornerstone of threat intelligence.  Nobody will take intelligence from an untrustworthy source and act – at least they shouldn’t.  It is important that the producer and consumer trust each other.  That trust needs to be based on transparency and verification.

However, how does one appropriately assess threat intelligence?  The first step must be to identify the qualities which define “good” threat intelligence.  However, these are not binary qualities – there is a clear gradient based on use case.  Timeliness is a good example of this gradient as some intelligence (likely more strategic) has a more fluid timeliness requirement while tactical threat intelligence has stricter requirements.

Further, one single threat intelligence source will not likely be able to satisfy all qualities simultaneously.  For instance, it is unlikely any one provider will have complete visibility across Diamond elements or Kill Chain phases and consumers will have to rely on more than one to achieve satisfactory completeness.

The four qualities are (CART): Completeness, Accuracy, Relevance, and Timeliness.

Completeness

Threat intelligence must be sufficiently complete to provide effective detection and (hopefully) prevention.  For instance, providing a domain indicator used in the exploitation of only one victim is not sufficient for other victims and therefore the intelligence is effectively incomplete and unhelpful.

Accuracy

Threat intelligence must save organizations more in success than it costs them in errors and mistakes.

Relevance

Threat intelligence must address a threat to the organization in a method that allows for effective action.  Intelligence addressing threats not faced by the organization is of no value.  Further, intelligence delivered in a type or method not usable by the organization is also unhelpful.

Timeliness

Threat intelligence must be received and operationalized fast enough to make an impact more valuable than the cost of the threat intelligence itself.
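To make the CART gradient concrete, here is a minimal sketch of how a consumer might score a source against the four qualities.  The `IntelReport` class, the 0–1 scores, and the take-the-minimum rule are my own illustrative assumptions, not a prescribed metric:

```python
from dataclasses import dataclass

@dataclass
class IntelReport:
    completeness: float  # covers enough of the activity to act on (0-1)
    accuracy: float      # judgments hold up against ground truth (0-1)
    relevance: float     # addresses a threat this org actually faces (0-1)
    timeliness: float    # arrived early enough to operationalize (0-1)

def cart_score(report: IntelReport) -> float:
    # Take the minimum rather than an average: a report is only as useful
    # as its weakest quality, and perfect accuracy cannot rescue zero relevance.
    return min(report.completeness, report.accuracy,
               report.relevance, report.timeliness)

report = IntelReport(completeness=0.9, accuracy=0.8, relevance=0.2, timeliness=0.9)
print(cart_score(report))  # 0.2 -- the low relevance dominates
```

The min-rule reflects the observation above that one provider rarely satisfies all qualities simultaneously: a consumer combining feeds would want each feed's weakest quality visible, not averaged away.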

The Cost of Bad Threat Intelligence

There is no doubt that threat intelligence is now “a thing.” At RSA 2015 I couldn’t help but notice how many vendor booths were hawking their relevance to threat intelligence.  I hear about a threat intelligence start-up almost weekly.  That is not surprising given venture capital is flowing and C-suite customers are now investing in “threat intelligence.”  Everyone wants a piece of the pie.

While market growth for threat intelligence produces innovations it also produces negative by-products (welcome to capitalism).  The most concerning by-product is the reduction in threat intelligence quality.

A growing number of published threat intelligence reports contain inaccuracies and poor analysis.  A growing number of indicators across a variety of producers are either stale, irrelevant, or generate so many false positives as to be useless.

What so many fail to realize is the cost of poor quality intelligence.  Here are some of the costs:

  • If a single threat intelligence-sourced alert generates $1000 worth of time to investigate a false positive, it is easy to see how that relatively small amount can multiply within an organization and across enterprises worldwide.
  • If an intelligence producer incorrectly categorizes a threat as APT (say, instead of cyber crime), an organization’s security response to the threat will be (and should be) different, likely involving a deeper investigation.  Again, this additional, and likely unnecessarily deep, investigation is costly in both time and resources.
  • Every poor quality report costs time to read and digest.  Time that could be spent understanding a high-quality report.
  • Every poor association or correlation derails an analytic effort at an organization.

Because organizational security resources are finite and already stretched thin, these mistakes, errors, and poor practices consume critical resources which could be spent on other problems, reducing the security of an organization.

Two market elements have caused this quality reduction:

  • A need to garner attention in the growing cacophony of the threat intelligence market feeding a “first to publish” mentality which usually results in a “rush to publish.”
  • A lack of customer education resulting in a poor evaluation of providers thereby incentivizing the wrong aspects of threat intelligence – such as volume of indicators over their quality or relevance

Obviously, only threat intelligence providers can solve the problem, but what pressures can help drive effective change?  Here are some:

  • Threat intelligence customers armed with evaluation criteria (particularly quality metrics) which helps them leverage threat intelligence effectively without generating unnecessary costs – this will help create market drivers for higher quality
  • Industry must self-police bad intelligence by being honest with ourselves and each other.
  • Threat intelligence aggregation platforms should have quality assessment capabilities informing the intelligence consumer of potential problems (likewise, they are also in a position to highlight timely, relevant, and unique intelligence of great value)
  • Threat intelligence analysts trained in analytic tradecraft stressing quality and accepting an ethical duty

Security professionals practicing threat intelligence must understand the implications of mistakes and poor analysis.  Bad intelligence can and does decrease the security effectiveness of an organization. Therefore it is an ethical duty of the threat intelligence practitioner to reduce errors. Threat intelligence is difficult – intelligence by definition attempts to illuminate the unknown and works by making judgments with imperfect data – errors are natural to the domain.  But, with proper practices and procedures bad intelligence can, and must, be minimized.

Discover All Websites Hosted on an IP Address

There have been many times I’ve worked to discover all websites being hosted on a single host address (e.g., IP address).  This required some effort, and none of my techniques generated anything I considered comprehensive or authoritative – usually only a list good enough to get my analysis to the next step.

I found this very useful post today on how to accomplish that exact task, easily.  There is also a bash script posted to do this from the command line.

http://robert.penz.name/722/howto-find-all-websites-running-on-a-given-ip-address/

When I saw the post I immediately recognized how obvious this is.  Of course search engines know this information!  They crawl the web constantly, visiting every website, and they must have a website-IP mapping.  But until now I didn’t know they exposed this mapping.

The post shows how to do it in Bing – simply use the following syntax:

ip:XXX.XXX.XXX.XXX

Click here to see an example using current IP address of this website, activeresponse.org: http://www.bing.com/search?q=ip%3A69.195.124.131&go=Submit+Query&qs=bs&form=QBRE
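For repeated lookups, the query URL is easy to construct programmatically.  A small sketch using only the Python standard library; the `ip:` operator is the one described above, and since Bing's result-page markup is not guaranteed to be stable, this only builds the URL rather than scraping results:

```python
from urllib.parse import urlencode

def bing_ip_search_url(ip: str) -> str:
    # Build a Bing query using the ip: search operator; urlencode
    # percent-encodes the colon for us.
    return "https://www.bing.com/search?" + urlencode({"q": f"ip:{ip}"})

print(bing_ip_search_url("69.195.124.131"))
# https://www.bing.com/search?q=ip%3A69.195.124.131
```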

 

A search to reveal websites hosted on an IP address

One challenge I see with this technique is that Bing does not expose the timestamp of this information.  Bing surely caches information for performance purposes, so I cannot guarantee that all of these sites are still hosted on that same IP address.  Given the nature of dynamic hosting and cloud services, websites can move around quickly depending on their hosting service.

Therefore, and as I’ve cautioned previously on this blog, ensure you know your data source and their biases and limitations.  In this case the data may be cached and out-of-date.

 

 

On the other hand, having a cache showing what was hosted where in the past is also helpful.

But, it’s a pretty cool and helpful capability to have.

Let me know if you have any other easy ways to accomplish this task!  I tried the task with other search engines but was unsuccessful.

DISCLAIMER: I am employed by Microsoft

Snakes and Ladders: How Intrusion Analysis and Incident Response is Like a Board Game and the Critical Role of Pivoting

Pivoting is, in my humble opinion, the most important skill of intrusion analysis and incident response.  I have been teaching/training/mentoring intrusion analysts for over 7 years.  In my experience, this is the most difficult skill to train as it requires creativity, attention to detail, and a full knowledge of one’s data sources and how to exploit them.

Pivoting is the ability to identify a critical piece of information and maximally exploit it across all of your sources to substantially increase your knowledge of the adversary and identify the next critical piece of information – which is then pivoted upon, moving you deeper into the adversary’s operation and, hopefully, earlier into the kill-chain.

An example: trolling through log files, you discover a very odd HTTP user-agent accessing your website.  You query this user-agent across all the log entries and identify a significant number of clients providing this string value.  (Pivot on the user-agent) You extract those log entries and identify a regular time pattern of access indicating automated functionality.  You also discover that all the requests target a very odd resource/page – bob.php.  (Pivot on bob.php) You take that page name and examine all HTTP traffic in your network over the last 2 days, discovering that several hosts in your network have been POSTing odd data to bob.php.  At this point you may retrieve those hosts and conduct forensic analysis on them, finally discovering that the adversary has compromised several internal hosts and has had them POSTing data to a webpage on your external-facing website, from which the adversary then extracts the data.  You now have several pieces of mitigative value: the source IP of the adversary’s infrastructure on the outside, the page deposited on your website, any malicious tools discovered on the hosts, the HTTP traffic, etc.  All of these are collectively more valuable to defense than any one piece of information independently.
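The pivoting chain in the example above can be sketched in a few lines of Python.  The log entries, the odd user-agent string, and bob.php are all fabricated stand-ins for illustration:

```python
from collections import Counter

# Toy web-server log entries: (client_ip, user_agent, resource)
log = [
    ("10.0.0.5",  "Mozilla/5.0",  "/index.html"),
    ("10.0.0.7",  "OddAgent/1.0", "/bob.php"),
    ("10.0.0.9",  "OddAgent/1.0", "/bob.php"),
    ("10.0.0.5",  "Mozilla/5.0",  "/about.html"),
    ("10.0.0.12", "OddAgent/1.0", "/bob.php"),
]

# Pivot 1: take the odd user-agent and pull every entry that used it.
suspect_entries = [e for e in log if e[1] == "OddAgent/1.0"]

# Pivot 2: the pulled entries share a resource -- pivot on it to find
# every host touching that page, regardless of user-agent.
resources = Counter(e[2] for e in suspect_entries)
pivot_resource = resources.most_common(1)[0][0]
hosts = sorted({e[0] for e in log if e[2] == pivot_resource})

print(pivot_resource)  # /bob.php
print(hosts)           # ['10.0.0.12', '10.0.0.7', '10.0.0.9']
```

Each pivot widens the query from one observed value to everything that shares it, which is exactly the step-function jump in knowledge described below.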

 

A Step Function

In this way, analysis and incident response form a step-function.  Most of the time analysis is, in a sense, rote.  It involves looking through log files, examining and validating alerts, and looking at various binaries – step by step peeling back the onion of the adversary’s operations.  At times we even move backwards, as an analyst makes an incorrect assumption or a poor hypothesis which costs time/money/resources to recover and correct the analytic path.  However, when a piece of critical information is discovered it should be exploited, achieving a deeper knowledge that moves the analysis to a “new level” of the function and substantially increases the knowledge as a whole – which, in theory, should lead to additional mitigative opportunities.

 

Chutes and Ladders

My favorite analogy is the game of “Chutes and Ladders” (or “Snakes and Ladders” for those outside the US).  A player slowly moves across the board block-by-block but then happens on a ladder which moves them substantially up the board.  Other times, they land on a chute/snake which brings them back down.  This is the process of analysis.

Why does this matter?  It matters because this understanding can help us better understand the process and model of analysis, providing an opportunity for researchers to target parts of analysis to increase the likelihood of a step-function increase in knowledge and decrease the chance of a setback.

One way is to increase the capability of analytic tools to maximize pivoting – allowing an easy and quick way to query other data sources with a new discovery and integrate it into the analytic picture.  The tools should also allow an analyst to ‘back up’ their analysis, removing a poor path once an error is discovered.

These are just a couple of ideas.  I’d love to hear yours.

15 Knowledge Areas and Skills for Cyber Analysts and Operators

Rodin’s The Thinker

 

Here are some knowledge areas which I consider necessary to conduct effective intrusion analysis and operations. In future articles I will go into further details on how to improve your skills in each of these areas (and link them from here). The knowledge areas are not listed in any particular order.

Every organization’s mission, focus, and needs are different and therefore I don’t pretend to define the ‘perfect’ analyst for any mission.

Critical Thinking and Logic

I will be forthright and say that I consider this skill the most important above all others.  It is a gateway skill which allows an analyst to become proficient in many others.  It is also the skill upon which I rely for analysts to temper their judgments and make the best decision as to how to approach a problem.  Logic is complementary to critical thinking and the two cannot be separated.  Without a proper foundation in logic critical thinking is ineffective.

US-CERT Incident Response Report

Critical Reading and Writing

Critical reading is being able to dissect the text of a document to extract the most important information and apply critical thinking skills to it.  Effective/critical writing and documentation refers to writing correctly, logically, concisely, and effectively for your audience (which likely includes yourself).  Most importantly, write in an organized manner to help others use their critical thinking skills.

History

As I have said previously: “Study History.  It provides perspective.”  Works like The Cuckoo’s Egg are a great start, but branch into other areas: military history, biographies of famous leaders, studies of famous events.  Learn how others have been able to assess strategic situations, derive tactics, and evolve their strategy in a quickly changing situation.  All of these skills are useful in intrusion analysis and incident response.  Be able to step back from a situation and apply the lessons learned from others to your own.

Research Methods

In the cyber security domain we face more unknowns than knowns.  My favorite saying is “no analyst is an island,” meaning that nobody knows it all and we need to rely on others and the greater community to help solve problems.  Therefore, a significant skill is the ability to conduct effective research on hard problems to find existing solutions – preventing, as the saying goes, “reinventing the wheel.”  This skill, more than any other, will increase your effectiveness and efficiency.

This skill can and should be mixed with the other skills described – critical reading to get through research material quicker, critical thinking to see through the B.S. and FUD, and effective writing to document your findings so you can use them again in the future.

Analytic Approaches and Methods

When facing any problem, being able to identify and evaluate the various approaches to solving it is invaluable – some would say critical.  Being knowledgeable in as many analytic approaches as possible is therefore important, and being able to create new approaches on-the-fly is even more valuable.

Learn analytic methods from others.  Look for their mixture of logic, research, tool use, and lines of critical thinking and apply them yourself.

Network Protocol Map

Network Protocol Analysis

Know your network protocols.  More importantly, be able to research, analyze, and identify new or previously unknown protocols.  Don’t be afraid of packets.  Use your research methods and critical reading skills to dissect protocol definitions and RFCs.

 

Programming

A basic knowledge and ability to write computer programs is very useful in that it practices logic skills, helps one better dissect cyber security activities, and allows one to create and/or modify tools quickly as necessary.

Psychology

An understanding of the fundamental theorems of psychology is useful when attempting to determine the intent, context, and motivations of an adversary.  For example, knowing and being able to apply the fundamentals of Maslow’s Hierarchy of Needs or Operant Conditioning will go towards influencing your adversary through operations to achieve a positive outcome and better protect your network.

See Also: A Hacker’s Hierarchy of Operational Needs based on Maslow’s Theory of Human Motivation

Hacker Tools and Methodology

Obviously, a working knowledge of hacker tools and methodologies is a must.

Binary Reverse Engineering

IDA Pro Binary Reverse Engineering

All hackers use capabilities and tools to achieve their desired effects.  Most of these are binaries, either living on command-and-control nodes or delivered to the target for operations.  Having a working knowledge of, and ability to, reverse engineer a binary is necessary for conducting effective analysis.  Even if your organization has dedicated reverse engineers, having this knowledge to effectively communicate with and ask intelligent questions of those engineers is just as important.

Host-Based Log File and Forensic Analysis

Understanding the internal workings of a host and operating system helps not only in investigations where host data is available but also serves as a learning tool to understand the adversary’s target environment.  This will further inform the analyst by providing greater context to the choices of an adversary given the host environment.

This knowledge should be coupled with that of hacker tools and methodologies and network and host configuration and administration for full effect.

Network and Host Configuration and Administration

As I’ve said in another post, 5 Intrusion Analysis Ideas in 10 Minutes, I believe that cyber security professionals should be just as proficient in understanding how networks and hosts are administered and configured as in how those systems are attacked.

Signature Writing and Detection Tools

Snort Rule Header

Example Snort Rule

Finding malicious activity on your network is important; being able to track that activity and detect when it returns is imperative.  Therefore, analysts and operators should be proficient in their organization’s particular signature and detection tools and learn how to author the best signatures.

It is just as important to understand not only how a detection tool works but also its biases and limitations – so you know when there are potential false positives and false negatives.  This is one of my 20 Questions for an Intrusion Analyst.
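To illustrate why signature precision and false-positive awareness matter, here is a toy content-match sketch in Python rather than Snort syntax; the traffic samples and patterns are fabricated:

```python
# A naive content-match "signature": does the pattern appear in the payload?
def matches(signature: str, payload: str) -> bool:
    return signature in payload

traffic = [
    "GET /bob.php?cmd=ZXhmaWw= HTTP/1.1",   # malicious (true positive)
    "GET /blog/bob.php-tutorial HTTP/1.1",  # benign page (false positive risk)
    "GET /index.html HTTP/1.1",             # benign (true negative)
]

loose_signature = "bob.php"       # also fires on the benign tutorial page
tight_signature = "bob.php?cmd="  # anchored on the odd parameter

print([matches(loose_signature, t) for t in traffic])  # [True, True, False]
print([matches(tight_signature, t) for t in traffic])  # [True, False, False]
```

The loose pattern generates a false positive; anchoring on more of the observed badness tightens it, at the risk of false negatives if the adversary changes the parameter – the exact bias/limitation trade-off noted above.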

Incident Response Methodology

Incident response methodology is obviously a requirement for anybody who is part of the incident response team in their organization.  However, incident response should be well-known by every intrusion analyst.  This is simply because they will likely be generating documentation and analysis for the incident response team.  The better they understand the methodology, the better they can tailor their documentation and feedback to the needs of response and mitigation.

Tools

Wireshark

I am fond of saying, “there is no one tool to rule them all,” meaning no single tool will do everything you need.  While I think cyber security professionals spend too much time becoming proficient in a specific tool-set, I cannot overstate the criticality of these tools to our profession.  However, I believe that over-reliance on our tools breeds ignorance of the data the tools process, leaving analysts blindly trusting, and unwilling to challenge, the output.

Therefore, it is important to know how to operate and understand the tools that are best for your mission be it OllyDbg or Wireshark.

Lastly, with a strong or competent programming background, as described previously, you are empowered to write your own tools or improve existing tools for the benefit of the community.

A Hacker’s Hierarchy of Operational Needs

Maslow’s Hierarchy of Needs

All humans have a basic set of needs which they work to satisfy – as described by Maslow in his seminal “A Theory of Human Motivation.”  Maslow did not create a true hierarchy; he describes how there are sometimes competing and/or complementary needs.  Instead of a strict hierarchy, these needs form dominating preferences or priorities.

It made me question whether there was a cyber operational equivalent: a set of hierarchical needs or requirements necessary for the adversary/hacker to meet their goal.  Like Maslow, I do not believe this hierarchy is necessarily serial in nature; rather, the needs inform priorities and dominating preferences.  Nor do I believe they must necessarily be satisfied in order.

For instance, a hacker may create a capability and then sell that capability, or their skills to use the capability, to an organization thereby gaining funding for the rest of the operation.  However, while the capability was the first achieved in the chain, it was a vehicle to achieve a more base need: funding.

 


  • Basic Necessities: Obviously, those things which allow a person to live and work effectively
  • Funding: Even the most basic funding is required for equipment (computers) and for purchasing other things like connectivity to the Internet
  • Connectivity: A hacker must be connected to a network from which s/he can reach potential targets
  • Target Vulnerabilities: A hacker must have a set of vulnerabilities and exposures which they can exploit to achieve their goals
  • Capabilities/Infrastructure: I believe these are equally important, and both are a requirement for operations – the capability to achieve their effect, and the infrastructure to deliver the capabilities to the target victims
  • Targets: A hacker must have one or more targets which they can use to achieve their intent
  • Access: A hacker must have access to the target to achieve any effects and ultimately achieve a positive outcome
  • Outcome: The successful exploitation, attack, etc. which was the entire intent of the hacker
  • Reward: The reward for a successful operation (fame, fortune, notoriety, etc.)


So, what do you think?  Do they map to your understanding of the hierarchy for the operational needs of a hacker? How would you use this model?

 

5 Intrusion Analysis Ideas in 10 Minutes

Here are 5 cyber security ideas to improve your analysis and understanding which will take no more than 10 minutes of your time.

1. Inspect all events with a sliding scale – Good, Suspicious, Bad

An analytic mind-set should move as evidence is uncovered

One of the easiest, and worst, mistakes an analyst can make is to be too firm in their judgement.  I train analysts, and myself, to use a freely sliding scale when inspecting events, packets, and binaries.  This scale moves between known good, suspicious, and known bad as uncovered evidence supports a “goodness” or “badness” final judgement.

It is natural to come to premature conclusions when analyzing data, and many preach against this.  But I have never known a perfectly objective human, and such preaching discounts our naturally occurring and helpful ability to make quick judgments that drive our desire for more data and evidence.  Instead, we should preach against the analyst who is hasty in a final judgement and unwilling to accept and synthesize new evidence in either direction.
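One way to picture the sliding scale is as a running score nudged by each piece of evidence.  The weights and thresholds below are illustrative only, not a calibrated scoring model:

```python
GOOD, SUSPICIOUS, BAD = "good", "suspicious", "bad"

def verdict(score: float) -> str:
    # Scores near zero stay "suspicious"; only strong evidence either
    # way produces a firm judgement.
    if score <= -0.5:
        return GOOD
    if score >= 0.5:
        return BAD
    return SUSPICIOUS

score = 0.0            # no judgement either way yet
score += 0.3           # odd user-agent observed
print(verdict(score))  # suspicious -- not enough to call it bad
score += 0.4           # regular, beacon-like timing discovered
print(verdict(score))  # bad
score -= 0.6           # turns out to be a known monitoring tool
print(verdict(score))  # suspicious again -- evidence slides both ways
```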

2. Be willing to accept suspicious

There will be many times when after hours or days of work and collaboration the best judgement is that the event, packet, log entry, or binary, etc. is still not known as either “good” or “bad.”  An analyst must be willing to accept this unknown middle ground of “suspicious” where final judgement is elusive.  There will be times when there is not enough evidence either way nor is it likely more evidence will be uncovered (e.g. that purged critical log file, the company will not provide a missing piece of information, etc.).  Be willing to accept suspicious as an answer and reject the pressure to render a final judgement of good or bad.

However, it is important that an analyst is willing to render an informed judgement to decision makers as to where, on the scale, the event lies and what evidence supports that judgement – and more importantly, what evidence supports a contrary judgement.

3. Goodness Outweighs Badness

Some of the best cyber security analysts I have known have been network engineers and system administrators – those that best understand how systems and users actually work rather than relying on the hypothetical or documentation.  This is because the majority of network activity is good/valid versus bad.

The most valuable skill an intrusion analyst can have is to quickly and accurately identify valid activity and separate the non-valid/malicious/bad activity from the pile.  My number one recommendation to upcoming intrusion analysts is not just to focus on courses and materials which teach intrusion techniques (e.g. SANS) but to spend an equal amount of time on the RFCs and other training opportunities which teach the valid operation and administration of modern networks and hosts.

4. Counter-Intelligence is our closest domain partner

Of all the domains I have studied to further my exploration of intrusion analysis and cyber security, it is counter-intelligence which I have found to offer the most insight and parallels to this domain.  Others may argue with this, but counter-intelligence works in a domain where a compromised environment is assumed and the focus is primarily on detection and damage limitation when compromise occurs.

Of course, counter-intelligence necessarily breeds paranoia – but that is also a good quality in an intrusion analyst, when balanced with the right amount of sanity 🙂

5. Document everything and don’t get lost in the “rabbit hole”

In the pursuit of an activity, with the gathering of evidence and shifting judgements and hypotheses, things can move quickly.  When conducting intrusion analysis, document everything – even if it seems irrelevant – you never know when a case will hinge on a small, originally overlooked, detail.  In this documentation also record all open questions and hypotheses so that when “going down the rabbit hole” of analysis towards one hypothesis, other lines of analysis are not forgotten or discounted without effective evidence gathering.
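A case file following this advice needs at least three compartments: a timeline of observations, the open questions, and the hypotheses with their status. A minimal sketch (my own illustration; the case details are invented):

```python
import datetime

# Hypothetical sketch: minimal case notes that force every observation,
# open question, and hypothesis to be written down as work progresses.
case = {
    "case_id": "IR-2024-0017",
    "timeline": [],        # (timestamp, observation) pairs
    "open_questions": [],  # lines of analysis not yet pursued
    "hypotheses": [],      # (hypothesis, status) pairs
}

def note(case, observation):
    """Record an observation with a UTC timestamp, however trivial it seems."""
    ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
    case["timeline"].append((ts, observation))

note(case, "Odd scheduled task 'updater' created 2024-03-02 03:14 UTC")
case["open_questions"].append("Why was the proxy log for 03-02 purged?")
case["hypotheses"].append(("Initial access via phishing attachment", "untested"))
```

Before closing a rabbit hole, the open_questions list is what reminds you which other lines of analysis still need evidence.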

8 Tips for Maintaining Cyber Situational Awareness

Situational awareness is the perception of your environment and comprehension of the elements within that environment, with particular focus on those critical to decision-making.

Cyber defenders, operators, and analysts must maintain “situational awareness.”  This is more than sitting in a room with several large televisions streaming Twitter and [insert management’s favorite cable news channel here].

Maintaining situational awareness is the act of continuously defining your environment and identifying (and comprehending) elements critical to decision-making.  The purpose of this act is so that one can continuously orient towards the best decision.

Those familiar with the OODA Loop will recognize this as the observe phase in the loop.

It is important to know and comprehend your environment, which means both your internal situation AND the external situation.

Knowing your internal situation usually comes from dashboards, alerts, network activity graphs, parsed log files, vulnerability scanners, updates from vendors, etc.  From this view an analyst finds particularly interesting events or anomalies and understands their organization’s exposure surface.

Most importantly, the situational awareness from these data points should provide a decision-making construct to identify necessary actions (e.g. “should we patch for that?”, “should we close that firewall hole?”, “should I explore that spike in traffic?”).

However, maintaining knowledge of the internal situation is not enough.  Just as a pilot must keep their eyes on their instruments AND the horizon, an analyst must keep their eyes on their internal sensors AND the external threat environment.

Keeping track of just ONE of these environments is hard enough; how can an analyst hope to track both environments effectively, make effective decisions on that information, and act on those decisions in time?

Both management and analysts dream of some tool that will quickly and easily integrate these disparate and complicated environments so the best decisions can be made quickly.  However, until that dream tool is created:

1. Know your organization’s mission statement, business strategy, and business rules

You’ll never know what elements or events are important if you don’t know what is important to your organization.  Be able to articulate your organization’s mission statement.  How is your organization attempting to meet its goals, and how do you support that?  How do the various business units work together to create a cohesive whole?  With this information you can make an informed decision as to the criticality of an event based on the assets being affected.

2. Be cognizant of external events affecting your organization’s mission

What is happening in your market space or the global sociopolitical space which is changing your security profile?  Will that new acquisition by a foreign competitor cause you to become a target of corporate espionage?  Will hackers target your organization in retaliation for country X expelling ambassadors from country Y?

3. Be aware of internal events

What is happening inside the organization?  Is there a new desktop load being deployed?  Who is being fired today?  What are the upcoming mergers/acquisitions?  All of these affect the exposure surface of an organization and its target profile to attackers.

4. Find and follow the best

The internet is the greatest collection of human knowledge ever assembled.  Use it.  There are great security researchers and analysts constantly updating information sources with critical knowledge.  Find these sources and follow them.  Use Twitter, Google Reader, Listorious, and other sources to help aggregate this information.  Whom are those critical sources themselves following?

5. Be aware and able to communicate what is missing

Know what is missing from your viewpoint.  Are there any data feeds which would add to the picture?  What are the biases and limitations of your data sets?  How do these affect your decision-making?  Knowing this in advance and taking it into account will help reduce poor decision-making and unexpected consequences.

6. Know the rule sets, analytics, and data sources

The better an analyst knows their own rule-sets, analytics, and data sources, the more efficiently and accurately they can distinguish critical from non-critical events.

7. Eliminate Useless Information

One must carefully balance the need for information with the danger of information overload, which causes poor or delayed decision-making.  Therefore, eliminate any useless information sources.  This includes signatures with high false-positive rates and network activity graphs which nobody pays any attention to.  It is better to have less information of higher quality than a high quantity which muddles decision-making.  Replace bad data feeds with something useful – or, better yet, don’t replace them at all.
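Identifying which signatures deserve retirement can be as simple as tracking triage outcomes per rule. A hypothetical sketch (my illustration; the signature IDs, outcomes, and threshold are all invented):

```python
from collections import Counter

# Hypothetical sketch: rank signatures by false-positive rate from triage
# outcomes, so chronically noisy rules can be tuned or retired.
# Each triaged alert is (signature_id, was_true_positive).
triaged = [
    ("SIG-1001", False), ("SIG-1001", False), ("SIG-1001", False),
    ("SIG-1001", True),
    ("SIG-2002", True), ("SIG-2002", True),
]

fires = Counter(sig for sig, _ in triaged)
false_hits = Counter(sig for sig, tp in triaged if not tp)

FP_THRESHOLD = 0.7  # illustrative cut-off; tune per environment
noisy = {sig for sig in fires if false_hits[sig] / fires[sig] > FP_THRESHOLD}
print(sorted(noisy))  # ['SIG-1001']
```

Running this over a quarter of triage data gives a defensible, numbers-backed list of feeds and rules to cut rather than an argument about gut feel.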

8. Not Everyone Requires the Same Information

It is important for organizations to understand that not everyone needs the same information to maintain situational awareness.  People think differently.  Use that to your advantage.  Don’t try to make robots.  People perceive their environment differently from one another.  Allow each to develop their own information feeds and visualizations to maximize effectiveness.
