7 February 2024: Signals at work

Thank you for reading this newsletter. If someone forwarded it to you, that was very kind of them; don't forget to thank them for it. You can subscribe here to receive future newsletters directly in your inbox.

For Spanish readers, I'm happy to let you know that I have uploaded Diego Morabito's Spanish translation of the second newsletter. Diego has translated the next two newsletters as well, and I will add those very soon.

Feedback and suggestions are always welcome!

Martijn
martijn@lapsedordinary.net

Signals at work

I’m a big fan of the Signal messenger and I love how it has become the de facto messenger in the digital security community.

That includes many organizations that use Signal for work, something Signal itself happily endorses. I too use Signal for various work projects (and should you have something more sensitive to discuss, feel free to use email to ask for my Signal number!).

It is not uncommon for people to take more sensitive topics to Signal, as it is more suited for such conversations than, say, Slack or Microsoft Teams. Still, if your organization wants to use Signal in this way, there are several things to keep in mind.

(As an aside, what I write about Signal below also applies to WhatsApp, whose security is pretty close to that of Signal. It largely applies to other messengers too.)

The first is to understand what work-related data can be shared on Signal. As this is a question with legal implications, you’d probably have to ask your lawyer (I’m not one), who may point out that personal data (such as resumes sent for job applications) can’t just be taken out of the system to be shared in some messaging app. There may also be legal requirements to retain certain data for a specific amount of time.

(On that latter topic, a previous UK government got in trouble after WhatsApp messages that discussed controversial topics were found to have been deleted. For very good reasons, governments are required to keep a record of the processes involved in making certain decisions.)

A second possible concern is that with Signal, you don't have any control over the devices on which it is used. Compare this to tools like Google Workspace or Microsoft 365, where administrators can set requirements on the devices from which the services are accessed, for example to ensure they are fully patched and not jailbroken or rooted. If a colleague is using Signal, there is no way to prevent them from installing it on the family laptop that runs a long out-of-date version of Windows.

Note that I don't mean someone doing that is an irresponsible member of staff, even if it isn't a good security decision. It may well be that, precisely because they are such a dedicated member of staff, they installed Signal on that laptop to get their work done on time.

A third thing to keep in mind is that, once the use of Signal (or another external messaging app) has become normal, there’s nothing to prevent people from creating their own groups on the app. That’s generally fine, but what if a work environment has become toxic and people are using this to exclude a less popular colleague from work discussions? At least on apps like Slack or Teams, an administrator can see what groups exist.

Finally, Signal accounts are currently linked to phone numbers. And unless everyone in your team has been provided with a work phone, this means a personal phone number. And by making the use of Signal a requirement for work, even if only implicitly, you make people share their personal phone number with their colleagues. Not everyone will feel comfortable doing so.

Thankfully, this latter point will soon be irrelevant once Signal introduces usernames.

None of this is meant to suggest that it is wrong to use Signal for work. It isn’t by any means. But it is a reminder that the extra security of Signal comes with a ‘cost’. And you’ll have to decide within your organization whether this cost is worth it.

What else?

Access Now published a report on 35 cases of the Pegasus spyware that have been discovered targeting civil society in Jordan. Their report is a collaborative effort with several partners including Citizen Lab, OCCRP, and Human Rights Watch; the latter two of these have staff members among those targeted.
This isn't the first time Pegasus has been found on phones in Jordan. When reading such reports, it's always good to keep in mind the real people affected, whose lives and work are hugely impacted by such spyware, as well as the people on the ground supporting them. Still, reports like this are really good news: they help hold those behind the spyware accountable, and the 35 targets had the spyware detected and thus removed.
For those who have a reason to be worried about advanced spyware like Pegasus, another piece of good news is that Apple’s Lockdown Mode successfully stopped several attacks.

The Tor Project received a code audit from Radically Open Security. Code audits aren't particularly exciting, as they don't build anything new. Yet they are crucial for any kind of software project, including open source tools like Tor: just because it's open source doesn't mean that people spend their (free) time looking for bugs. At the same time, those with malicious intent will gladly spend their (paid) time looking for exploitable vulnerabilities. Code audits help rectify this imbalance.
If you yourself run an open source project for people in a repressive context and would like its code audited, one option is to reach out to the OTF's Red Team Lab. Oh, and if you're not very familiar with Tor and its importance for human rights, Amnesty International just posted a brief introduction to Tor.

I started my security career in 2007 at a small company called Virus Bulletin, and from 2014 until 2019 I was responsible for the day-to-day running of the company. That included the programme of the annual Virus Bulletin Conference, first held in 1991, which I like to think has evolved into one of the most international events in threat intelligence. I've always loved the sharing of threat information among industry peers, and I am particularly proud that the conference has featured many talks on threats against at-risk people and groups, such as a 2017 talk on stalkerware, a 2018 talk on threats against civil society and a 2022 talk on NSO's exploits.
Now the call for papers for the 34th conference (which will take place in Dublin, October 2-4) has opened. It closes on April 5 and I for one would be super excited to see talks on threats against civil society on the programme.

Security training and education form an essential part of a good digital security culture. But if you think that security incidents wouldn't happen if only people were better informed, this blog post by author, activist and journalist Cory Doctorow, who knows a lot about security, on how he got phished is an important reminder that this mindset is rather naïve. Training and education can never stop phishing 100% of the time. If done well, though, they can raise the bar by a lot.

Non-security things

'Hectic' feels like a euphemism to describe the past ten days or so. Not in a bad way, but I am too exhausted to write about music or books — even if I didn't stop reading or listening.

One thing that I have been enjoying a lot in recent months is the Daily Art app, which, as the name suggests, shows a different piece of art (a painting, usually) every day. Those pieces include some well-known works, but I've also discovered many great pieces by lesser-known artists, from parts of the world not widely covered in Western art museums.