The coronavirus curfew has companies all over the globe scrambling to adapt to telecommuting. A massive beneficiary of this has been the teleconferencing company Zoom. But this company, in the best tradition of many a Silicon Valley startup, has a horrendous track record when it comes to security and privacy.

Howdy! Here we are with another out-of-band release of The Private Citizen. I wanted to dedicate the usual Wednesday episode to a non-coronavirus topic, but, unsurprisingly, coronavirus-related things have come up that people want me to talk about. So I decided to give you another bonus episode. Hope y’all like my reasoning here.

Today, the show is coming to you from the mobile studio in the driver’s seat of Merle, my Volkswagen T4 camper van.

My studio in the van: I’m podcasting from my camper van today.

Me during the show: Gotta wear a hat while you’re on the air, of course.

I’ve created a new $2 Patreon tier because I misunderstood some feedback from Butterbeans, who wanted a $2 per episode tier. I’m not a fan of having people support me based on the number of episodes I release, because I want to be able to put out more than one episode a week (like this week, for example) when the situation calls for it – and I want to do that without having to worry about people getting charged more. I want the Patreon support to be predictable, both for the producers of the show and for me. That said, people seem to like the new option, as a number of producers have upgraded their pledges. Butterbeans, of course, put his money where his mouth is and upgraded to the top tier and is now an Executive Producer. Thanks, mate! I do appreciate it a lot!

Civil Liberties Are Being Severely Curtailed in Germany

On to the latest coronavirus madness.

In Germany, the right to assemble for peaceful protest has effectively been suspended. On Sunday, demonstrations in many German cities in support of refugees stuck in Greece and drowning in the Mediterranean were dispersed, even though everyone was abiding by social distancing rules and there really weren’t any health reasons for doing so.

Meanwhile, German lawyer Beate Bahner [has decided to challenge the coronavirus-related measures in court](Pressemitteilung.pdf). Like me, she is of the opinion that the restrictions are unconstitutional, and she says she is prepared to take the lawsuit all the way to the Bundesverfassungsgericht (BVerfG, the German constitutional court) in Karlsruhe.

On another front, COVID-19 patient data is now being shared with a number of state organs as a matter of course (something else I had predicted in earlier episodes). Police in the federal states of Baden-Württemberg, Bremen, Lower Saxony and Mecklenburg-Vorpommern have requested data about all confirmed patients in their jurisdiction “to protect themselves”. This data also includes personal details about home quarantine situations and the names and addresses of people who came in contact with the patient. The state data protection officer of Baden-Württemberg is of the opinion that “data like this must not be handed to the executive” and that it “should be deleted immediately”.

Despite the fact that this behaviour seems to be clearly against the law, several federal state interior ministries have told health officials on city and county levels to hand names and addresses of patients over to state police commissioners.

Unprecedented Surveillance

And Germany isn’t the only country where this kind of data sharing is going on. It’s happening all over the world.

Western governments aiming to relax restrictions on movement are turning to unprecedented surveillance to track people infected with the new coronavirus and identify those with whom they have been in contact. Governments in China, Singapore, Israel and South Korea that are already using such data credit the practice with helping slow the spread of the virus. The U.S. and European nations, which have often been more protective of citizens’ data than those countries, are now looking at a similar approach, using apps and cellphone data.

“I think that everything is gravitating towards proximity tracking,” said Chris Boos, a member of Pan-European Privacy-Preserving Proximity Tracing, a project that is working to create a shared system that could take uploads from apps in different countries. “If somebody gets sick, we know who could be infected, and instead of quarantining millions, we’re quarantining 10.”

The U.S. federal government, working with the Centers for Disease Control and Prevention, is creating a portal that will compile phone geolocation data to help authorities predict where outbreaks could next occur and determine where resources are needed, though the effort faces privacy concerns. The anonymized data from the mobile-advertising industry shows which retail establishments, parks and other public spaces are still drawing crowds that could risk accelerating the transmission of the virus. Alphabet Inc.’s Google said Thursday it would share a portion of its huge trove of data on people’s movements.

Massachusetts Institute of Technology researchers have developed an app to track Covid-19 patients and the people they interact with, and are in talks with the federal government about its use, The Wall Street Journal has reported.

Some European countries are going further, creating programs to help track individuals – with their permission – who have been exposed and must be quarantined. The Czech Republic and Iceland have introduced such programs and larger countries including the U.K., Germany and Spain are studying similar efforts. Hundreds of new location-tracking apps are being developed and pitched to those governments, Mr. Boos said.

U.S. authorities are able to glean data on broad population movements from the mobile-marketing industry, which has geographic data points on hundreds of millions of U.S. mobile devices, mainly taken from apps that users have installed on their phones. Europe’s leap to collecting personal data marks a shift for the continent, where companies face more legal restrictions on what data they may collect. Authorities say they have found workarounds that don’t violate the European Union’s General Data Protection Regulation, or GDPR, which restricts how personal information can be shared.

European health agencies are gathering anonymized geolocation and cell-tower data directly from telecom companies, using agreements or laws that were swiftly passed to address the coronavirus crisis. Governments in Europe are also encouraging citizens to voluntarily download tracking apps and establishing call centers to ask people for permission to track their recent whereabouts. “We realize that this is an infringement of fundamental rights and freedoms, let’s not pretend it is not,” said Slovak Justice Minister Maria Koliková after her government passed a law last week allowing its public-health office to collect phone data. “In a democratic state, an interference with fundamental rights and freedoms is possible if the measure is proportionate to the purpose.”

I’d be more reassured if it looked like these people knew what they were doing. Here’s some gems from the technical documentation of the Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) program which has the backing of major European universities, healthcare providers, government institutions, NGOs and even a big mobile carrier:

“This anonymous proximity history remains encrypted on the phone of the PEPP-PT user and can never be viewed by anyone, not even the PEPP-PT user.” One-way encryption!

“When a proximity history is uploaded to the trusted server, the server can match that with proximity histories uploaded in the past.” We’re now back to matching never-to-be-decrypted data against other data.

“All code […] is being monitored […], line by line of the code. […] This ensures that no unintended code or loop-holes are within the CF, and privacy is guaranteed.” I mean … sure, code review is important (IMO), but it doesn’t give you these guarantees.

“PEPP-PT will be under Mozilla Open Source license or similar.” Gnaaah, at the very least get your names right.

Zoom Security Issues

Butterbeans prompted this episode with an idea in the Patreon-only Discord server:

Fab might as well do an entire episode on videoconferencing privacy issues during these troubling times. Would like to know how much of my privacy I’m giving away to Zoom since I’m on it four times a day now…

Niall Donegan has also been keeping us up to date on the topic, and with so much interest in this, I decided to dig into Zoom for today’s episode. I must say I learned about the company relatively late, as it didn’t have very many users in Germany before February of this year.

As far as I can tell, Zoom has a typical Silicon Valley startup mentality when it comes to security and privacy. Let’s start with security. They seem to never have developed a decent concept of application security at all – which is remarkable considering that their founder was originally a lead engineer on Cisco Webex.

The End-to-End Encryption Lie

The company made waves recently when their technology was used by the UK’s Prime Minister, Boris Johnson, for cabinet meetings. This caused the mainstream media to – pardon me – zoom in on several security issues that had been made public earlier.

For example, The Intercept revealed that, while Zoom touts their system as being end-to-end encrypted, it actually isn’t.

In Zoom’s white paper, there is a list of “pre-meeting security capabilities” that are available to the meeting host that starts with “Enable an end-to-end (E2E) encrypted meeting.” Later in the white paper, it lists “Secure a meeting with E2E encryption” as an “in-meeting security capability” that’s available to meeting hosts. When a host starts a meeting with the “Require Encryption for 3rd Party Endpoints” setting enabled, participants see a green padlock that says, “Zoom is using an end to end encrypted connection” when they mouse over it.

But when reached for comment about whether video meetings are actually end-to-end encrypted, a Zoom spokesperson wrote, “Currently, it is not possible to enable E2E encryption for Zoom video meetings. Zoom video meetings use a combination of TCP and UDP. TCP connections are made using TLS and UDP connections are encrypted with AES using a key negotiated over a TLS connection.”

“When we use the phrase ‘End to End’ in our literature, it is in reference to the connection being encrypted from Zoom end point to Zoom end point,” the Zoom spokesperson wrote, apparently referring to Zoom servers as “end points” even though they sit between Zoom clients. “The content is not decrypted as it transfers across the Zoom cloud” through the networking between these machines.

This is one of the biggest red flags you can raise for the infosec community, for potential attackers of your system and for everyone else who actually knows a bit about how encryption works. Claiming to provide end-to-end encryption and then quietly redefining what that term means is one of the biggest no-nos in IT security.
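To make the distinction concrete, here’s a minimal toy sketch of the difference. It uses XOR as a stand-in for a real cipher – purely illustrative, not actual cryptography – to show why it matters who holds the keys: with transport encryption, the relay server negotiates the keys and can read everything; with real end-to-end encryption, it can’t.

```python
import secrets

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' -- for illustration only, NOT real cryptography."""
    return bytes(b ^ k for b, k in zip(data, key))

# --- Transport encryption (what Zoom actually does) ---
# The key for each hop is negotiated with the *server*, so the server
# can decrypt the content before re-encrypting it for the next hop.
hop_key = secrets.token_bytes(32)
ciphertext = xor_encrypt(b"hello", hop_key[:5])
plaintext_at_server = xor_encrypt(ciphertext, hop_key[:5])  # server reads it

# --- End-to-end encryption (what the marketing implied) ---
# Only the two endpoints share the key; the server relays opaque bytes
# and never holds endpoint_key at all.
endpoint_key = secrets.token_bytes(32)
e2e_ciphertext = xor_encrypt(b"hello", endpoint_key[:5])
```

Zoom’s “Zoom endpoint to Zoom endpoint” phrasing describes the first model while borrowing the name of the second.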

The Rogue Local Web Server

In the summer of 2019, Zoom got into hot water for a shady practice used in its macOS installer.

On Mac, if you have ever installed Zoom, there is a web server running on your local machine. The local Zoom web server runs as a background process, so to exploit this, a user doesn’t even need to be “running” (in the traditional sense) the Zoom app to be vulnerable. All a website would need to do is embed [such a] script in their page and any Zoom user will be instantly connected with their video running.

If you have ever installed Zoom on your computer, this web server is installed – and it continues to run even if you uninstall Zoom. This server also supports updating and installing a new version of Zoom, in addition to launching a call. Had the registration of the installer domain been allowed to lapse, an attacker taking over that domain could have hosted an infected version of the Zoom installer there, infecting users who had previously uninstalled Zoom from their computers.

Zoom defended this poor design decision by saying that it felt running a local server in the background was a “legitimate solution to a poor user experience, enabling our users to have seamless, one-click-to-join meetings, which is our key product differentiator”. I wrote about this for heise online back in the day. This was so bad that Apple force-uninstalled the web server via the macOS anti-malware system.
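To illustrate how trivially that local server could be reached from any web page, here’s a hedged sketch. The port number 19421 comes from the public disclosure; the endpoint path and parameter name below are illustrative assumptions on my part, not Zoom’s documented API.

```python
from urllib.parse import urlencode

# Port 19421 was named in the public disclosure; the /launch path and
# the 'confno' parameter are illustrative assumptions, not the real API.
LOCAL_SERVER = "http://localhost:19421"

def forced_join_url(meeting_id: str) -> str:
    """Build the kind of localhost URL a malicious page could load
    silently (e.g. via a hidden <img> tag) to make the background
    server join a meeting with the camera on."""
    return f"{LOCAL_SERVER}/launch?{urlencode({'confno': meeting_id})}"

print(forced_join_url("123456789"))
```

The point is that the browser’s same-origin protections don’t stop a page from *sending* such a request to localhost – which is exactly why a silently installed local web server is such a dangerous design.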

They do more shady stuff on the Mac, though. Like faking an admin dialogue to gain root privileges without the user realising it.

If the App is already installed but the current user is not admin, they use a helper tool called “zoomAutenticationTool” (typo included) and the AuthorizationExecuteWithPrivileges API to spawn a password prompt identifying as “System” (!!) to gain root.

This is exactly how malware usually does it.

Zoombombing and Passwords Not-So-Much By Default

And that’s only one of a number of security issues with their software that have come to light.

The poor security of their software is so well known, and they’ve gained so many users, that “zoombombing” has become a thing – i.e. people will figure out meeting IDs and crash the teleconference to shout mean things or show hardcore porn videos. Their baseline security for meetings is so bad that there’s now an automated tool to gain access to them.

Each Zoom conference call is assigned a Meeting ID that consists of 9 to 11 digits. Naturally, hackers have figured out they can simply guess or automate the guessing of random IDs within that space of digits. Security experts at Check Point Research did exactly that last summer, and found they were able to predict approximately four percent of randomly generated Meeting IDs. The Check Point researchers said enabling passwords on each meeting was the only thing that prevented them from randomly finding a meeting.

Zoom responded by saying it was enabling passwords by default in all future scheduled meetings. Zoom also said it would block repeated attempts to scan for meeting IDs, and that it would no longer automatically indicate if a meeting ID was valid or invalid. Nevertheless, the incidence of Zoombombing has skyrocketed over the past few weeks, even prompting an alert by the FBI on how to secure meetings against eavesdroppers and mischief-makers. This suggests that many Zoom users have disabled passwords by default and/or that Zoom’s new security feature simply isn’t working as intended for all users.

New data and acknowledgments by Zoom itself suggest the latter may be more likely. Trent Lo and fellow researchers recently created zWarDial, which borrows part of its name from the old phone-based war dialing programs that called random or sequential numbers in a given telephone number prefix to search for computer modems. Lo said zWarDial evades Zoom’s attempts to block automated meeting scans by routing the searches through multiple proxies in Tor. “Having a password enabled on the meeting is the only thing that defeats it,” he said.

This tool finds so many meetings, even from security vendors and cloud companies, that it seems unlikely that all of them had switched password protection off on purpose.
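The back-of-the-envelope numbers show why such a scanner is practical. Here is a quick sketch – the digit-space sizes ignore leading-zero subtleties, and the four-percent hit rate is Check Point’s figure, not mine:

```python
# Size of the Meeting ID search space for 9- and 11-digit IDs
# (ignoring leading-zero subtleties):
ids_9_digits = 10**9
ids_11_digits = 10**11

# Check Point found that ~4% of randomly generated IDs pointed at a
# real meeting, so a scanner needs very few guesses per hit:
hit_rate = 0.04
expected_guesses = 1 / hit_rate  # ~25 random guesses per discovered meeting
print(expected_guesses)
```

With roughly 25 guesses per discovered meeting on average, even a modest scan rate through Tor proxies turns up meetings continuously – unless a password stops the guess from being enough on its own.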

Zoom said it was investigating the possibility that its password-by-default approach may fail under certain circumstances. “Passwords for new meetings have been enabled by default since late last year, unless account owners or admins opted out. We are looking into unique edge cases to determine whether, under certain circumstances, users unaffiliated with an account owner or administrator may not have had passwords switched on by default at the time that change was made.”

→ See also: “Zoom Lets Attackers Steal Windows Credentials, Run Programs via UNC Links” from Bleeping Computer

Zoom has promised to pause feature development and focus exclusively on security and privacy for a while, but I feel like it’s way too late for that. I’d have a hard time trusting this company with information as personal as video footage. They’ve clearly shown that they have little concept of how important security is when it comes to software development.

And it seems like they scarcely have more regard for people’s privacy.

Questioning Zoom’s Approach to Privacy

Zoom seems to be a classic example of the Silicon Valley modus operandi of disguising an advertising company as a service provider. The Register reported on this at the end of March:

Zoom not only has the right to extract data from its users and their meetings, it can work with Google and other ad networks to turn this personal information into targeted ads that follow them across the web.

This personal info includes, and is not limited to, names, addresses and any other identifying data, job titles and employers, Facebook profiles, and device specifications. Crucially, it also includes “the content contained in cloud recordings, and instant messages, files, whiteboards … shared while using the service.”

Zoom quietly rewrote its privacy policy after this story was published to clarify, among other things, that it does not use the contents of meetings and messages to target people with adverts.

Earlier, Vice had discovered that Zoom sent personal information about its users to Facebook, even if they had never connected the two applications:

What the company and its privacy policy don’t make clear is that the iOS version of the Zoom app is sending some analytics data to Facebook, even if Zoom users don’t have a Facebook account. The Zoom app notifies Facebook when the user opens the app, details on the user’s device such as the model, the time zone and city they are connecting from, which phone carrier they are using, and a unique advertiser identifier created by the user’s device which companies can use to target a user with advertisements.

Zoom is not forthcoming with the data collection or the transfer of it to Facebook. Zoom’s policy says the company may collect user’s “Facebook profile information (when you use Facebook to log-in to our Products or to create an account for our Products),” but doesn’t explicitly mention anything about sending data to Facebook on Zoom users who don’t have a Facebook account at all.

Vice also reported that Zoom makes dumb assumptions about users’ login email domains, which can leak their names, email addresses and photos to other users on the same domain.

The issue lies in Zoom’s “Company Directory” setting, which automatically adds other people to a user’s lists of contacts if they signed up with an email address that shares the same domain. This can make it easier to find a specific colleague to call when the domain belongs to an individual company. But multiple Zoom users say they signed up with personal email addresses, and Zoom pooled them together with thousands of other people as if they all worked for the same company, exposing their personal information to one another.

“If you subscribe to Zoom with a non-standard provider (I mean, not Gmail or Hotmail or Yahoo etc), then you get insight to ALL subscribed users of that provider: their full names, their mail addresses, their profile picture (if they have any) and their status. And you can video call them.”
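A hypothetical reconstruction of that flawed “Company Directory” logic – the provider list and the code are entirely my own illustration, not Zoom’s actual implementation – might look like this: treat every domain that isn’t a known consumer mail provider as a single company and pool its users together.

```python
# Hypothetical sketch of the flawed grouping logic, NOT Zoom's real code.
# Any domain outside a hard-coded freemail list is treated as "a company".
FREEMAIL_DOMAINS = {"gmail.com", "hotmail.com", "yahoo.com", "outlook.com"}

def directory_for(user_email, all_users):
    """Return every other user whose address shares the domain --
    exposing strangers to each other whenever the domain is merely
    a small ISP or a personal mail host, not an actual company."""
    domain = user_email.rsplit("@", 1)[-1].lower()
    if domain in FREEMAIL_DOMAINS:
        return []
    return [u for u in all_users
            if u != user_email and u.rsplit("@", 1)[-1].lower() == domain]

users = ["alice@gmail.com", "bob@smallisp.example", "carol@smallisp.example"]
print(directory_for("bob@smallisp.example", users))  # carol is exposed to bob
print(directory_for("alice@gmail.com", users))       # freemail: empty list
```

The mistake is obvious once written down: “same domain” is a terrible proxy for “same employer”, and maintaining a blocklist of consumer providers can never be complete.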

Zoom now claims to be GDPR compliant, but a lot of the things we just talked about certainly would have gotten them in hot water with EU regulators. In any case, as with their security track record, I question their general commitment to protecting their users’ privacy.

This seems to be software that was built to be as easy to use as possible for the sole purpose of amassing as many users as possible. The goal, it appears, is to convert some of those users into paid subscriptions while harvesting the data of the free users and making additional money off that. Which is typical for these kinds of startups.

The UK’s Ministry of Defence, NASA and SpaceX – among others – have already banned Zoom on confidentiality grounds. For my personal privacy, I would do the same. At least Skype, Microsoft Teams and FaceTime have big companies behind them with very different internal cultures when it comes to security and privacy. They’ve also been around for longer and historically have more to lose. And there are also open-source alternatives like Jitsi and Signal.

Having talked about all of the issues with Zoom specifically, there’s still an episode (or two) to be had discussing the general shift towards telecommuting and working from home, and the privacy problems this brings independent of any specific software solution. I feel like this is a topic that will be with us for the foreseeable future.


Jonathan M.H. prompted me to change the UI of the audio file element on the show notes page to make it easier to skip within the episode. I obliged him.

Fadi Mansour comments on episode 10:

A very interesting conversation. It was nice to hear a friendly conversation despite the difference of opinion. But I had to laugh when your colleague mentioned at some point, and I paraphrase: “…nobody will take advantage of the situation.”

Everybody is jumping on the COVID-19 bandwagon! Maybe I’m being too pessimistic, but I really hope that these measures will truly be temporary. What I really believe is exactly what you mentioned: This is setting a precedent; whenever there’s a scare, some funny legislation will be implemented. And on that note, a bit of local news: Czech Republic to start “smart quarantine” in South Moravia.

He also had some feedback on episode 11:

I applaud your suggestion to compile a guide for going off the grid. But I feel better not assuming that there is any guarantee of privacy. I understand that this could be very limiting, but there should be a cost-benefit calculation. And I have to say: In Europe, I think it’s still not a life-or-death situation, as it could be in other countries! But of course the boundaries of civil liberties are being tested, and there should definitely be a push-back against that. So, keep up the good work, stay safe, and above all stay free!

If you also have thoughts on the things discussed here, please feel free to contact me.

Toss a Coin to Your Podcaster

I am a freelance journalist and writer, volunteering my free time because I love digging into stories and because I love podcasting. If you want to help keep The Private Citizen on the air, consider becoming one of my Patreon supporters.

You can also support the show by sending money via PayPal, if you prefer.

This is entirely optional. This show operates under the value-for-value model, meaning I want you to give back only what you feel this show is worth to you. If that comes down to nothing, that’s OK with me, pard. But if you help out, it’s more likely that I’ll be able to keep doing this indefinitely.

Thanks and Credits

I like to credit everyone who’s helped with any aspect of this production and thus became a part of the show.

Aside from the people who have provided feedback and research and are credited as such above, I’m thankful to Raúl Cabezalí, who composed and recorded the show’s theme, a song called Acoustic Routes. I am also thankful to Bytemark, who are providing the hosting for this episode’s audio file.

But above all, I’d like to thank the following people, who have supported this episode through Patreon or PayPal and thus keep this show on the air: Niall Donegan, Michael Mullan-Jensen, Jonathan M. Hethey, Georges Walther, Dave, Kai Siers, Rasheed Alhimianee, Butterbeans, Mark Holland, Steve Hoos, Shelby Cruver, Fadi Mansour, Matt Jelliman, Joe Poser, Vlad, ikn, Dave Umrysh, 1i11g, Vytautas Sadauskas, RikyM, drivezero and Barry Williams.