Effective analysis
To assist decision makers, analysts must deliver effective analysis that can be understood and acted upon.
One model that illustrates the role of analysis in policing is the ‘3i model’ (Ratcliffe 2004).
This model positions the analyst as interpreting the criminal environment in order to influence the decision maker, who in turn impacts the criminal environment. The value of analysis within this model therefore lies in:
- interpreting the environment, the stage completed when analysing the problem
- influencing decision makers by producing work that can be understood and acted on
- impacting the environment, which is completed by the decision maker translating the analyst’s work into action
Writing for impact
The look and feel of an analyst’s product is important. The quality and standard of production will make an impression on those who receive it. It is important that all products aim to achieve the best standard in order to command respect, foster trust and have the analytical work acted on.
ABC
- Accuracy – analysis should be accurate – factual errors are not acceptable and will undermine analytical products.
- Brevity – analysis should be clear, short and to the point.
- Clarity – the reader should be able to understand the facts that are presented.
Bottom Line Up Front
Bottom Line Up Front (BLUF) is a good technique to use when producing written and verbal assessments. This is a technique whereby the analyst leads with their analytical conclusion to ensure that, if the customer does not have long to read the product, they will still get an understanding of the key message.
Analysts should follow their conclusion with the supporting evidence, from the strongest to the weakest, ensuring that it contains the ‘what’, ‘so what’ and ‘what’s next’ where necessary. Writing in this format ensures that the customer does not need to read the whole document in order to understand the main message.
Analysts must believe their bottom line. If not, they should write a new argument.
The 4-3-3 principle
When writing, analysts should consider using the 4-3-3 principle (commonly used by the FBI). This should be used as guidance rather than as a fixed rule, as it is not always possible to follow. This principle states that:
- no sentence should be longer than four lines
- no paragraph should be longer than three sentences
- no section should have more than three paragraphs
This style of writing results in well-structured paragraphs and concise language that is easy to read and understand.
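Where drafts are prepared electronically, a quick automated check can help flag obvious departures from the 4-3-3 principle before peer review. The sketch below is a minimal Python illustration, not a prescribed tool: the word-count threshold is an assumed stand-in for ‘four lines’, since line length depends on the final layout.

```python
import re

# Assumed proxy: 'four lines' depends on layout, so a word-count
# threshold stands in for sentence length here.
MAX_WORDS_PER_SENTENCE = 45
MAX_SENTENCES_PER_PARAGRAPH = 3
MAX_PARAGRAPHS_PER_SECTION = 3

def check_4_3_3(section_text: str) -> list[str]:
    """Return warnings where a draft section departs from the 4-3-3 principle."""
    warnings = []
    paragraphs = [p.strip() for p in section_text.split("\n\n") if p.strip()]
    if len(paragraphs) > MAX_PARAGRAPHS_PER_SECTION:
        warnings.append(f"Section has {len(paragraphs)} paragraphs (guide: 3).")
    for i, para in enumerate(paragraphs, start=1):
        # Naive sentence split on ., ! or ? followed by whitespace.
        sentences = [s for s in re.split(r"(?<=[.!?])\s+", para) if s]
        if len(sentences) > MAX_SENTENCES_PER_PARAGRAPH:
            warnings.append(f"Paragraph {i} has {len(sentences)} sentences (guide: 3).")
        for j, sentence in enumerate(sentences, start=1):
            if len(sentence.split()) > MAX_WORDS_PER_SENTENCE:
                warnings.append(f"Paragraph {i}, sentence {j} may exceed four lines.")
    return warnings

draft = "First point. Second point. Third point. A fourth sentence tips this paragraph over the guide."
for warning in check_4_3_3(draft):
    print(warning)
```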
Writing clearly
When writing reports, analysts should:
- use clear and unambiguous writing
- not make assumptions about the reader’s previous knowledge
- focus reports on the agreed scope of the terms of reference (TOR)
- establish confidence in the inferences made by showing the quality of the information sources used
- maintain unambiguous and objective reporting (avoiding adjectives such as huge, lacklustre or significant)
- invite colleagues to critically read the report to check it is fit for purpose
- ensure that enough time remains to complete the report so that it does justice to the analysis completed and the conclusions drawn
Key findings and summaries
Key findings should be within the scope of the original TOR for the analysis. There should only be six or seven key findings. Each statement should be clear, relevant and unambiguous. The analysis that supports the key findings should be visible in the text of the final report.
Products that are more visual in design, such as maps or charts, should be easily understood through effective presentation of the key elements, allowing more detail to be taken in than would be obtained by reading a supplementary report or other expanded output.
Visualisation
Graphs, tables, pictures, maps, infographics and other visual methods of presentation can all enhance a product. Indeed, in a visual world, they can form the core of a product, illustrating the relationship between key actors, for example.
It is important to include these only where they add value, considering their value and impact in each case. There will be occasions when an image or other visual aid is more effective in communicating a message to the reader than a volume of text.
Tables, spreadsheets and matrices are a useful tool for arranging large amounts of information. They can be used in the collation process and to develop complex charts at a later stage in the analysis. This might include using a table to illustrate sequences of events over a long period, where visual illustration requires too much space or creates too much complexity for the message to be understood.
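As a minimal illustration of this use of tables, the sketch below assumes the pandas library is available and arranges a handful of invented events into a chronological sequence-of-events table; the column names and values are hypothetical.

```python
import pandas as pd

# Hypothetical events for illustration only; column names are not a force standard.
events = pd.DataFrame(
    [
        {"datetime": "2024-03-01 23:05", "source": "Cell-site", "detail": "Handset B connects in city centre"},
        {"datetime": "2024-03-01 22:15", "source": "ANPR", "detail": "Vehicle A travels northbound"},
        {"datetime": "2024-03-02 01:40", "source": "Witness", "detail": "Disturbance reported at address C"},
    ]
)
events["datetime"] = pd.to_datetime(events["datetime"])

# Sort chronologically so the table reads as a timeline of the sequence of events.
timeline = events.sort_values("datetime").reset_index(drop=True)
print(timeline.to_string(index=False))
```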
Maps can help illustrate patterns of movement or demonstrate the relationship between different events. Various data can be overlaid onto maps to support this, such as:
- cell-site data
- Automatic Number Plate Recognition (ANPR) data
- points of interest such as home addresses
- locations of other key sites
Hot spot analysis in particular relies on maps to illustrate patterns of crime. In this context, it is important to choose the right type of visualisation, both for the analysis itself and for subsequently illustrating the message (examples include point data, kernel density estimation, choropleth mapping or a mixture of these).
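As an illustrative sketch of these options (assuming NumPy, SciPy and Matplotlib are available, and using synthetic incident coordinates), the example below plots point data alongside a kernel density estimation surface to highlight a hot spot pattern.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gaussian_kde

# Synthetic incident coordinates (eastings/northings) for illustration only.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(400_000, 300, 150), rng.normal(401_500, 500, 80)])
y = np.concatenate([rng.normal(300_000, 300, 150), rng.normal(300_800, 500, 80)])

# Kernel density estimate evaluated over a regular grid covering the incidents.
kde = gaussian_kde(np.vstack([x, y]))
gx, gy = np.meshgrid(
    np.linspace(x.min(), x.max(), 200),
    np.linspace(y.min(), y.max(), 200),
)
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

fig, ax = plt.subplots(figsize=(7, 6))
ax.contourf(gx, gy, density, levels=12, cmap="Reds")            # hot spot surface
ax.scatter(x, y, s=6, c="black", alpha=0.4, label="Incidents")  # point data
ax.set_xlabel("Easting")
ax.set_ylabel("Northing")
ax.set_title("Illustrative hot spot map (KDE over point data)")
ax.legend()
plt.show()
```

A choropleth alternative would aggregate the same points into administrative or grid areas before shading each area by its count or rate.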
‘Infographics’ are increasingly employed and can be an attractive and engaging medium for communicating analysis, with key points reinforced by impactful visuals. They can be highly effective in communicating a message and give the analyst a high degree of creative licence.
Many different types of infographic are produced, and it is a good idea to research examples to identify a style suitable for the subject matter. By setting out the key message or argument of the analysis (possibly as a ‘strapline’) and building the rest of the infographic to support this, an analyst can set out their work in a clear and persuasive manner.
Depending on customer requirements, an infographic could accompany a formal written report or replace it altogether. In the latter case, however, it becomes even more important to design a way of capturing research and analysis that does not appear on the infographic, in case it is needed to answer questions about the work now or in the future.
Communicating probability
Accurately communicating probability is an important element of good-quality intelligence assessment. The agreed standard for conveying probability in intelligence analysis in the UK is the ‘PHIA probability yardstick’. This is a scale of probabilistic language developed by Defence Intelligence and latterly adopted by the Professional Head of Intelligence Assessment (PHIA) for use across the government intelligence community. The scale comprises accepted intelligence terminology at a national level.
This scale demonstrates broad ranges of certainty or uncertainty that can be translated into consistent language. This language is then used in intelligence products in the context of any assessment, accompanied by the scale as an appendix to support interpretation.
PHIA probability yardstick table
| Probability range | Judgement terms | Fraction range |
| --- | --- | --- |
| ≤ ≈ 5% | Remote chance | ≤ ≈ 1/20 |
| ≈ 10% to ≈ 20% | Highly unlikely | ≈ 1/10 to ≈ 1/5 |
| ≈ 25% to ≈ 35% | Unlikely | ≈ 1/4 to ≈ 1/3 |
| ≈ 40% to < 50% | Realistic possibility | ≈ 4/10 to < 1/2 |
| ≈ 55% to ≈ 75% | Likely or probably | ≈ 4/7 to ≈ 3/4 |
| ≈ 80% to ≈ 90% | Highly likely | ≈ 4/5 to ≈ 9/10 |
| ≥ ≈ 95% | Almost certain | ≥ ≈ 19/20 |
Key
- ≈ is approximately equal to
- ≥ is greater than or equal to
- ≤ is less than or equal to
- < is less than
[PHIA probability yardstick diagram]
This consistent terminology also follows the national intelligence model (NIM) approach to common terminology and operating principles: end users can pick up an intelligence product from any individual in any force or agency and reliably understand the degree of certainty the assessment seeks to communicate in its judgements.
For further information, see PHIA Analysis Guidance (available to authorised users logged on to the restricted online College Learn).
The scale was developed in response to two key challenges to effective intelligence assessment, namely the potential for:
- misinterpretation – a lack of consistency in the use of language means that the same terms can be used to describe very different things, meaning that the findings of any assessment could be wrongly interpreted and therefore improperly acted on
- misrepresentation – vagueness through a lack of consistency can lead to an intelligence assessment being accidentally or deliberately misrepresented by the end user, particularly if the work is quoted or collated into another product, potentially leading to inappropriate action.
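As an illustration only, not part of PHIA guidance, the sketch below shows how the yardstick bands in the table above could be applied when a numeric probability is available; values falling in the deliberate gaps between bands return None so that the analyst applies their own judgement.

```python
# Approximate band boundaries taken from the yardstick table above.
# The upper bound 0.49 approximates '< 50%' for realistic possibility.
YARDSTICK_BANDS = [
    (0.00, 0.05, "remote chance"),
    (0.10, 0.20, "highly unlikely"),
    (0.25, 0.35, "unlikely"),
    (0.40, 0.49, "realistic possibility"),
    (0.55, 0.75, "likely or probably"),
    (0.80, 0.90, "highly likely"),
    (0.95, 1.00, "almost certain"),
]

def yardstick_term(probability: float) -> str | None:
    """Map a probability between 0 and 1 to a yardstick judgement term."""
    for lower, upper, term in YARDSTICK_BANDS:
        if lower <= probability <= upper:
            return term
    return None  # falls in a gap between bands - analyst judgement needed

print(yardstick_term(0.15))  # highly unlikely
print(yardstick_term(0.85))  # highly likely
print(yardstick_term(0.22))  # None (between 'highly unlikely' and 'unlikely')
```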
Writing for action
Developing recommendations
A clear set of recommendations should be included in analytical reports. These should be based on the analysis and focus on key findings or information gaps relevant to the issue being analysed.
Recommendations should be written as directional statements and be limited to perhaps six or seven of the key issues. Recommendations should follow a structure, such as SMART, where any recommendation is:
- specific – recommendations should be clear, detailed and unambiguous
- measurable – recommendations should make it clear exactly what needs to be achieved. They should help to set operational objectives and objectives for results analysis
- achievable – all recommendations should be achievable and focused on what can be done with available resources
- realistic or relevant – the recommendations should be within the scope of the original TOR for the analysis and be realistically achieved if adopted
- timely – recommendations should be presented as short, medium or long-term options. Alternatively, they could be prioritised and given a schedule
Analysts may make recommendations across a range of issues. This stage of the analysis may require analysts to engage with other analysts, researchers or other intelligence professionals, with operational specialists or with subject matter experts and to jointly deliver any recommendations, as required.
Checklist for report writing
The key points to consider when developing a report are:
- only develop key findings, information gaps and recommendations once the analysis is completed
- write up the analysis before developing the key findings, information gaps and recommendations to ensure that they follow a standard and logical order of presentation
- once the analysis is written up, highlight the points that are to be communicated as key findings, and where the analysis suggests further action
- discuss the analysis with an experienced colleague who may assist in identifying the most important findings from the analysis
- use the collection plan to identify the information that was not received or found, and assess the impact of that gap on the findings
- consider what could be added to the TOR, or done with more time or information, and whether this would improve the task that was set or operational activity
- if so, consider including the information as recommendations
- write recommendations in the analysis as a first step to ensure that they are supported by the analysis and evidenced in the text – they can then be moved to the appropriate section of the report
- use the review process (see below) to check that the key findings, information gaps and recommendations are unambiguous, clear and directly relevant to the TOR
Problem solving
Analysts within certain functions may take a direct role in operational activity related to their work, particularly in areas relating to volume crime or antisocial behaviour. This might include taking part in problem-solving activity.
Problem solving is supported by analysis and is best achieved by adopting a collaborative approach. Collaboration should include subject experts (in crime prevention, neighbourhood teams, forensic staff, analysts, investigators, intelligence staff and partners) and, where relevant, a trained facilitator or someone outside the group to run problem-solving meetings or workshops.
Prior to meeting, all those involved must have read the analysis report or intelligence product. Researching best practice resources, such as the College of Policing website, for possible responses prior to the meeting can also be beneficial in understanding what has previously been used to counter similar problems. Also consider using in-force organisational memory databases and results analysis reports. Any collaborative sessions may also benefit from using appropriate analytical techniques, including hypothesis generation and testing, SWOT, structured brainstorming or key assumptions checks.
Going beyond the descriptive
The value of analysis will most often be realised when the product provides an assessment that delivers insight, clarity and context. The purpose of the product will have been established through the TOR stage. Where a product is required to go beyond the descriptive, to explain why something has happened, to evaluate what something means or to estimate what might happen next, the analyst must add value (Pherson and Sullivan 2013).
Value may be added by using appropriate analytical techniques to make inferences, to generate and test hypotheses or to develop scenarios as required. It is important to communicate these and subsequent findings and conclusions effectively. Going beyond the descriptive should increase the potential for action to be taken by identifying actionable lines of enquiry, intelligence gaps or intelligence indicators.
Quality review
Once the product is complete, review is a key step to establish that the product is clear and concise and responds to the original TOR. Reviews take a number of formats including:
- peer review by a colleague
- formal or informal quality review by an analyst manager
- stakeholder reviews by those commissioning the work
The review might represent a planned milestone in producing the work or may be done prior to accepting the completed work. Stakeholders might include partners as diverse as the Crown Prosecution Service (CPS), for work presented in court, partner agencies for joint strategic or tactical products or government agencies which have tasked work out to forces.
Analysts should confirm that:
- the TOR have been achieved
- inferences and key findings are the correct ones
- findings are supported by the report text
When planning the timescale for producing analysis, adequate time should always be built in for review. It is important to provide the reviewer with the TOR so that they are aware of what the analysis is trying to achieve and can check that any specific questions are fully answered. When seeking peer review, the analyst should choose a colleague able to provide an objective view.
Examples of the questions that the reviewer should answer include the following.
- Are the key findings clear?
- Does the work add value, providing insight, clarity, context or direction, for example?
- Does the document make sense?
- Does it read well?
- Does the reviewer agree with the findings based on the content of the document?
- Do the arguments make sense?
- Do the key findings support the inferences and the core argument(s)?
- Did the analyst make use of appropriate analytical tools and techniques?
- Do the recommendations flow from information gaps and key findings?
- Are relevant minimum standards complied with?
- So what? What does this work achieve? What does it say, enable or inform?
The reviewer should give honest, constructive feedback, and the originating analyst should receive it in that spirit. If the answer to any of the questions listed is ‘no’, the originating analyst should consider amending the way the information is presented. The most effective way of exchanging feedback is to ask the reviewer to brief the originating analyst directly, enabling clarification of any feedback and discussion of suggested changes. Any changes can then be made immediately, without delaying dissemination of the product and without altering the authorship and analytic line of the product.
Disseminating analytical output
Dissemination should be agreed and set out in the TOR. Consideration should be given to who needs the information and the most appropriate format for dissemination. The amount of information disseminated will vary, depending on the audience. For one audience it may be appropriate to provide detail of the analysis. For others, it may be more appropriate to provide only the intelligence gaps and give direction on how they might be filled.
The analyst is responsible for producing and disseminating appropriate material. They should ensure that products are version controlled and comply with the corporate style and any minimum requirements. This ensures a professional approach and supports continual improvement of future work.
Specifically, the analyst must ensure that the:
- correct Government Security Classification (GSC) grading is considered and properly shown
- document is being disseminated in accordance with the GSC
- most appropriate media for dissemination is chosen
- recipient is able to access the report and has the software necessary to read attached charts or maps
- recipient is aware of any restrictions on the storage of the report
- original report is stored correctly and is easily accessible in the future
Briefings and presentations are often used by analysts to supplement written reports. They may also be used in place of written reports to disseminate results to some audiences. Good presentation and briefing skills are an important part of an analyst’s abilities. Some key elements to remember when preparing and giving briefings and presentations are:
- know the material and be prepared for questions
- keep it brief and keep to allotted time
- focus on three or four key messages
- keep supporting slides to a minimum (don’t read them)
- use charts and maps to support the content
- know the audience and pitch accordingly
- avoid fidgeting, standing with hands in pockets or folding arms in front of the body – use open body language
- maintain eye contact, breathe and smile
Checklist – disseminating analysis
The key points to consider when disseminating analysis are:
- identify barriers to dissemination early on in the analysis process and assess the risk of not removing them
- follow the dissemination requirements set out in the TOR
- check that the GSC is correct and is appropriately displayed on the report and supporting media
- keep a list of the recipients of the report and consider asking them for feedback if this is not done automatically
- ensure that deadlines are adhered to by leaving time to check that the customer has received the report and that any consultation processes have been followed