Interview: Bellingcat Open Source Investigations

The CSRI team were recently joined by Eliot Higgins of Bellingcat and Donara Barojan of the Atlantic Council’s Digital Forensic Research Lab (DFR Lab), who led a fascinating workshop on open source investigations. The session explored the boundaries of open source research in analysing the nature and impact of disinformation campaigns and hybrid threats in the digital domain.

1. How has open source data changed the ways in which we can create public awareness of human rights issues?

Eliot: We now have access to a vast amount of imagery that can be cross-referenced and verified against other material, such as witness statements and satellite imagery. Even in what were considered remote areas of the world, using satellite imagery to examine and verify reports of incidents is something that’s very powerful, and its usefulness has already been demonstrated in countries like Myanmar.

Donara: My area of work is not necessarily centred around human rights, but Eliot’s work is, and I am a big fan of it. I think it has added a human touch to the evidence of the human rights abuses we so often hear about: we very frequently hear the numbers but never see the human face of the suffering, and open source research that uses evidence generated by real users on social media allows us to see that.

2. What do you think are the future challenges and needs for development?

Eliot: The biggest challenge seems to be training: there’s rapidly increasing interest in the sort of work Bellingcat does, but very few people are trained to do the work in the way Bellingcat does it. This is one of the issues I’m trying to address at the moment, but we’re still in the early stages of that.

Donara: In the area of disinformation and information security I think the future challenges are related to technology and to the further polarisation of our societies. We are now seeing more and more fake videos emerging, and although for now they are mostly focused on indecent material and celebrity pornography, I think very soon they will be able to imitate a politician, and in a 24-hour news cycle that’s becoming ever more rapid that could have serious security implications and potentially cause conflict. I also think the increasing polarisation of our societies, enabled by social media platforms, is another important issue, because there is not a lot of constructive dialogue happening across the political spectrum, and that is hurting the unity of our societies and making us more vulnerable to external influence, attack or propaganda.

3. If you could change one thing to increase the impact of your work what would it be?

Eliot: More collaborations with organisations where our investigative techniques can be used to support their own investigations. We’ve done it in the past, and it’s been very successful, so being able to do it on a regular basis would be very effective.

Donara: Well, actually, I would make more videos and turn our content into more digestible forms, so that it is as interesting as the hoaxes we are capturing.

4. What technical challenges do you face in your job, and how do you verify whether images are genuine or fake?

Eliot: Broadly speaking, we explore the network of information the image exists within, so for example information we can gather from metadata, where it was shared, details in the image itself, and so on. There’s a whole range of tools and techniques used to do that, so the technical challenges are usually about which tools and resources we have to verify an image. Sometimes you know the exact tool you need for a job, sometimes you have to go and find it, or even make it yourself.
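To make the metadata step above concrete, here is a minimal sketch, not from the interview itself, assuming Python 3 with the Pillow library and a hypothetical local file photo.jpg, of reading EXIF tags such as capture time and camera model from an image. Metadata is only one signal: it can be missing, stripped by social media platforms, or deliberately forged, so it would always be cross-checked against the other sources Eliot mentions.

```python
# Minimal sketch: read EXIF metadata from an image as one early verification step.
# Assumes Python 3 and the Pillow library; "photo.jpg" is a hypothetical local file.
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(path: str) -> dict:
    """Return EXIF tags keyed by human-readable name (empty dict if none present)."""
    with Image.open(path) as img:
        exif = img.getexif()  # returns an empty Exif mapping when no metadata exists
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

if __name__ == "__main__":
    tags = read_exif("photo.jpg")
    for key in ("DateTime", "Make", "Model", "Software"):
        print(key, "->", tags.get(key, "not present"))
    # Note: GPS coordinates live in a separate EXIF block (the GPS IFD) and, like all
    # metadata, may have been stripped or altered before the image was shared.
```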

Donara: Well, because we only use open source research, we don’t really get a chance to attribute the campaigns we identify on social networks to a particular government or entity. Because we don’t have access to the back end of Twitter or Facebook, we cannot conclude with any kind of certainty who was behind an influence campaign or an influence operation, and that’s a real shame, because the best deterrent and the best protection from disinformation and the dissemination of fake news is being able to point out who did it and who might be targeting you. When we can’t do that, it presents a dilemma.

5. How large a problem do you think the spread of fake news is?

Eliot: From the perspective of discourse around a topic it’s an increasing issue, especially now that you can find anything to support your preconceived notions rather than being challenged by evidence and facts. From the perspective of using it as evidence it’s not a big issue, as any quality investigation will fact-check and verify all the evidence, so fake news will be easily identified.

Donara: I think we often tend to overestimate it, although it is a problematic area. I think what it does, more often than not, is enable people to believe what they already believe. It doesn’t change people’s hearts and minds, but it does give them the support they might be seeking and helps them confirm the biases they already have. To that end I think the problem of fake news does more to polarise societies than to change people’s opinions or affect the way they vote. So although it is an important problem, I don’t think it is as important as disinformation, which is the nudge that does try to change people’s behaviour.

6. How much of an issue do you think media credibility is?

Eliot: I think you can broadly define people as either faith-based or fact-based in their news consumption. There are plenty of people who are far more interested in having their beliefs supported by their media consumption habits than in actual facts and evidence, so in one sense media credibility is totally irrelevant to them.

7. Obviously in this type of work you must come into contact with a lot of trolls. How do you deal with this?

Eliot: Arguing with trolls is probably my only hobby, and it can sometimes be useful to see what sort of logical hoops they’re jumping through at the moment to attack your work. With the Russian government and media occasionally picking up on their nonsense, it’s in a way like having all possible counter-claims crowd-sourced before an official body can use them, so it can be pretty useful for pre-empting whatever they’ll come up with.

8. How does it feel to know that your research is being taken up by human rights charities such as Amnesty International?

Eliot: It’s good to know the field and our work are being taken more seriously. Now I’m working with the likes of the ICC and the IIIM for Syria on how open source evidence can be used, which is a complex and important topic moving forward, and it’s good to be part of that process.

Evaluating the Use of Automated Facial Recognition Technology in Major Policing Operations
Evaluating the Use of Automated Facial Recognition Technology in Major Policing Operations