Convincing people to use apps such as Signal is hard work and most can’t be convinced. But with those you manage to convince, do you feel happy to talk to them on Signal?
The problem is that these people use Signal on Android/iOS, which can't be trusted, and iOS has recently been in the news for having a backdoor. It has also been revealed that American feds are able to read everyone's push notifications, and they do this as mass surveillance.
So not only do you have to convince people to use Signal, which is an incredibly difficult challenge, you also have to convince them to go into settings and disable the message and sender being included in push notifications. And then there's the big question of whether the Android and iOS operating systems are doing mass surveillance anyway. And many people find it takes a lot of effort to type on the phone, so they install Signal on a computer running macOS or Windows.
So I don't think I feel comfortable sending messages in Signal, but it's better than WhatsApp.
These were some thoughts to get the discussion started and set the context.
You are just spreading misinformation! Cite your sources!
There is a strategy in use that allows the government to find out who an account belongs to. They ask the push providers (Apple/Google) for data on the push token from e.g. a messaging app. This way they associate the account from an app with an identity.
Nothing there about message content. It is still safely E2EE.
I don't know how it works in your country, but in mine, phone numbers are already associated with identities, so nothing is gained: the government can just ask Signal for the phone number of an account, instead of having to ask both Signal and the push provider to get the identity. (Edit: apparently it's hashed, so there seems to be a use for this.) Signal isn't about anonymity but privacy. There is a difference. If you have another vulnerability, cite it!
They ask the push providers (Apple/Google) for data on the push token from e.g. a messaging app. This way they associate the account from an app with an identity.
Very overlooked point. You can find privacy guides online but very few even suggest that FCM etc. might have privacy issues, let alone explain exactly why. It seems this has already been used by law enforcement in the past: https://www.wired.com/story/apple-google-push-notification-surveillance/
The Molly-FOSS fork of Signal (which aims to be even more secure/private) actually supports self-hosted push notifications using UnifiedPush.
I also found this comment:
As far as I know, FCM on Android can be configured to use a notification payload (which is piped through Google’s servers). But for a release app this is discouraged, especially if you are privacy conscious. An app would normally use FCM to receive a trigger and look up the received message from the app’s own backend. See here for more information.
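Roughly, that data-only pattern looks something like this on the app side. This is just a minimal sketch: FirebaseMessagingService and RemoteMessage are the real FCM classes, but MessageRepository.fetchAndDecryptPending() is a made-up placeholder standing in for "fetch and decrypt the message from the app's own backend".

```kotlin
import com.google.firebase.messaging.FirebaseMessagingService
import com.google.firebase.messaging.RemoteMessage

// Hypothetical stand-in for "fetch and decrypt from the app's own backend".
object MessageRepository {
    fun fetchAndDecryptPending() { /* talk to the app's own server here */ }
}

// Sketch of the privacy-friendlier "data-only" FCM pattern described above.
// The push payload carries no message content; it merely wakes the app,
// which then pulls the real (end-to-end encrypted) message from its own server.
class WakeUpMessagingService : FirebaseMessagingService() {
    override fun onMessageReceived(remoteMessage: RemoteMessage) {
        // No notification payload means nothing user-readable ever transited Google's servers.
        if (remoteMessage.notification == null) {
            MessageRepository.fetchAndDecryptPending()
        }
    }
}
```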
Good points, although the number is not saved. The phone number is hashed, so Signal could not hand out your number, just the hash.
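To picture what "just the hash" means, here's a rough sketch. It is not Signal's actual scheme (which is more involved), just a plain SHA-256 to show that a one-way hash can be stored and compared without keeping the number itself:

```kotlin
import java.security.MessageDigest

// Illustration only: store a one-way hash of the phone number instead of the number.
fun hashPhoneNumber(e164Number: String): String =
    MessageDigest.getInstance("SHA-256")
        .digest(e164Number.toByteArray())
        .joinToString("") { "%02x".format(it) }

fun main() {
    val stored = hashPhoneNumber("+15551234567") // only this hex string would be kept
    println(stored)                              // the original number cannot be read back out of it
}
```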
So how can Signal send the verification SMS?
That is only done once at the creation of an account, and it does not prove that the number isn't saved hashed.
Thanks for pointing that out to me, I wasn't aware of that.
“spreading misinformation” is a phrase mostly used by feds when they see something they consider to be “wrong think” or not “politically correct”. They use this anti-misinformation campaign to support their censorship and mass surveillance system.
When discussing advanced IT topics it would be more appropriate to just correct someone and say they are wrong, because it's easy to get a detail wrong in advanced IT topics.
And I am mostly right, I just seem to have been wrong on the detail about Signal push notifications. I admit that I made a mistake on that, but otherwise it is official that Apple and Google at least used to share push notification data with governments. This comes from Senator Wyden's letter to the DOJ saying these corporations can secretly share this data with governments, and it can include the unencrypted text which is displayed in the notification.
I think this discussion has been very constructive because when we can correct each other and learn that is great.
Misinformation is the inadvertent spread of false information without intent to harm, while disinformation is false information designed to mislead others and is deliberately spread with the intent to confuse fact and fiction. Source
This is more than a simple mistake and I am right to call it misinformation. I appreciate that you seem open to discussing that you were wrong. Nevertheless, your post still hasn't been edited to correct the statements that were proven wrong. You can use strikethrough so no context is lost, like I did in the comment you are replying to, where I was wrong.
You made a post with huge claims, basically saying that Signal is insecure and messages can be read by the government. This is such a big claim that it should have been researched by you beforehand and you should have provided sources. You don't get to hide behind "discussions", because in a discussion you actually provide sources if you make claims, especially if you are trying to start one, to give the readers a chance to read up on the topic.
Your "getting a detail wrong" has a huge impact. Some people will stumble upon this post, read that Signal is supposedly insecure, and might believe it and even spread that. It hurts the adoption of a secure encrypted messenger. It is not a small detail, but the foundation of your whole post.
And I am mostly right, I just seem to have been wrong on the detail about Signal push notifications. […] This comes from Senator Wyden's letter to the DOJ saying these corporations can secretly share this data with governments, and it can include the unencrypted text which is displayed in the notification.
No, you are mostly wrong about the claims you make! Again, your post made the connection to Signal. Push notifications for Signal NEVER contain sensitive unencrypted data & do not reveal the contents of any Signal messages or calls–not to Apple, not to Google, not to anyone but you & the people you're talking to. Source
“spreading misinformation” is a phrase mostly used by feds when they see something they consider to be “wrong think” or not “politically correct”. They use this anti-misinformation campaign to support their censorship and mass surveillance system.
I don't appreciate you trying to frame my correction of your blatant misinformation as an attempt to censor you. Don't try to play the victim.
deleted by creator
You think I'm intentionally spreading misinformation and I think you are a fed. I won't argue with you any further, but anyone fair and objective can see that the mistake I made was a simple one to make. Feds have, as a fact, been spying on our push notifications in secret, and I thought that included Signal's push notifications. A simple mistake which I already admitted to being wrong about. You are making this into a bigger deal than it has to be because you are a fed.
You are also intentionally lying (because you are a fed) by claiming that is the only thing the topic is about. For example, if someone is using Signal on Windows then I think there's a high chance the conversation isn't private. But I think you already know all this and you pretend not to.
This is hilarious and sad at the same time.
You continue to misunderstand the word "misinformation". It is incorrect information spread without intent. A mistake that leads to incorrect information spreading falls into that category, especially as it is in the starting post of the discussion, where sources should have been provided.
The need to feel victimized and a little bit of paranoia are strong in you; you should talk to someone about that. I am guessing that is caused by the lies and disinformation spread by your political party of choice. (I am only mentioning politics because you brought it up with the feds conspiracy theory.)
If you went and looked at my account history, you would see that there are a few comments in German, my account is registered on a German server, and coincidentally I am German. So much for your fed theory.
My criticism has been nothing but constructive. I implore you in the future to do research using credible sources and to cite them before making claims that could have a big impact. That goes for discussions on Lemmy as well as in real life, when you are discussing or forming an opinion on an important topic.
I hope you get the help you need!
“spreading misinformation” is a phrase mostly used by feds when they see something they consider to be “wrong think” or not “politically correct”. They use this anti-misinformation campaign to support their censorship and mass surveillance system.
Immediately jumping to discredit and dismiss instead of engaging, by way of overgeneralization and accusation, is not a good look, my man.
The way I see it, any step is better than no step at all.
Yep, none of us got to where we are all at once. We learned about things over time, and made changes over time.
It's a process just like any other type of personal development/habit building.
There are no shades of grey in encrypted communications.
Your messages are either plain text to a 3rd party or they are not.
Sometimes it appears to be encrypted, but there are loopholes that make it possible to significantly reduce decryption costs. It is effectively plain text to those who put the loopholes there, like specially crafted constants in the algorithm.
There are indeed shades of grey. Not only the presence of encryption itself matters, but the metadata, as well as details of the implementation. For example, Signal has all the messages encrypted - but it has the capability to know the identities of everyone and to build their social graph due to centralization.
deleted by creator
This may be what they are referring to: https://9to5mac.com/2023/12/27/most-sophisticated-iphone-attack-chain-ever-seen/
I was referring to the OP’s comment on “iOS having a backdoor”. I am not saying I agree with OP, just was trying to see if there was something like a backdoor.
Oh, sorry about that! Somehow missed it. Still weird for OP to claim systems are insecure just because vulnerabilities are discovered. That's the case for every system out there: vulnerabilities are discovered and fixed. And mobile operating systems typically have stricter security features and permission management than desktop OSes.
This is true. I still maintain that closed-source OSes are not as private or secure as they would be if they were open source. Something like deblobbed AOSP (DivestOS) is better because it has strong sandboxing, full-system MAC policies, and a vastly reduced attack surface compared to Google's Android (or Apple's iOS). Desktop doesn't have a strong enough threat model; I wish it were better.
Signal is not my tool of choice, so I’ll answer from a more general perspective:
Having multiple friends and social groups on an e2ee chat system for the past few years feels great. Knowing that our words aren’t being recorded and exploited by half a dozen companies, we no longer feel the need to self-censor. The depth and value of our online conversations have grown noticeably.
Yes, there is more work to do, both at the endpoints and in the protocols. No, not all of us have flipped all the switches to maximize our privacy yet. That’s okay. Migrating is a gradual process. We do it together, helping each other along the way, rather than trying to force it all at once. Every step an improvement.
This is exactly my take. It basically holds for Signal too.
The question of self-censorship is too often overlooked IMO. The knowledge that nobody is reading your messages except their intended recipients is empowering and liberating. No one is filling a database with information about you and your friends, because they can’t. You can say exactly what you would say at the dinner table and not think twice about it.
In a police state with mass surveillance (we all know the big examples) you don’t have this privilege. Whether or not you think about it consciously, you are constantly monitoring and policing what you say - and therefore ultimately, to some extent, what you think.
I’ve been in a couple of those places recently. I can tell you that just the banal act of using Signal there (sometimes over VPN) felt almost exhilarating, like jumping the prison walls.
In historical terms, free speech is a vanishingly rare thing. It absolutely is not the norm, and it bothers me that so many people in the West don't seem to know this. We should not take it for granted.
Yeah, Signal is good enough. If people use shitty operating systems like iOS or Google's version of Android, that's another problem and not really one that it's my job to care about that much. What matters is the network effect, and every user who moves from WhatsApp to Signal is one more person who gains the freedom to easily improve their digital life further if they someday choose to do so, without it costing them the ability to chat with all their friends.
The problem I have with Signal is that it itself pushes people onto the "shitty operating systems". It does not allow registering from desktop, at least officially. There are workarounds, but they're cumbersome (especially for a non-technical person, whom Signal is supposed to appeal to), and the official client outright tells you to go use a phone first. And even then, apparently the desktop client is not even full-featured, and not the priority.
I know there are degoogled OSes (running Graphene myself), but you’d need to get lucky or choose a phone with this in mind, while a random given laptop is likely to be able to run Linux.
I would certainly advise everyone to choose a phone with that in mind.
The desktop client is not great, but it works. There certainly are things Signal could do better. Its phone-centric nature is ridiculous and I have no idea why they cling to it. But it’s easier than trying to get everyone to use Matrix or whatever — mainly because more people have heard of it.
For my current phone, I did - I chose a Pixel. But I became aware of OS privacy while in the middle of using an unsupported phone, so for a while I treated it as a "public place". So making a phone private may not be viable for everyone.
Plus, the supported phones may be more expensive. Even my current one was $300, which is a lot for me, in addition to not being officially sold here.
Signal refuses to even try to accommodate UnifiedPush or MQTT for those not using Play Services, instead requiring an extra battery-draining socket to their servers. You are also still required to use one of the mobile duopoly OSes as a primary device to register (SIM still required). Good luck if you use a Linux phone, KaiOS, or just don't want an ever-present tracking beacon on you. We all know the Electron-based desktop client is shit. I would flip this on its head & say that if they choose to prioritize & mainly support the shitty mobile OS duopoly, it's their problem for providing a bad service & they are getting the criticism they deserve.
Signal is fine for almost everyone unless you’re truly doing dangerous work in a truly oppressive state.
I'm so tired of everyone telling others not to use Signal because it uses phone numbers. Everyone in here acting like they're Mr. Robot or something.
Anonymity is not the same as privacy. Privacy is good enough for me
Your country doesn’t block Signal sign up SMS to its numbers.
“Feel,” “happy,” “comfortable”… Privacy doesn’t care about your feelings.
And it has also been revealed that american feds are able to read everyone’s push notifications and they do this as mass surveillance.
Speaking of the feds, it was they who funded the creation of Signal, which is one of the reasons it ought not be trusted.
They funded encryption too. Why don’t you stop using that?
Wait until they find out who started the internet. Or who runs GPS satellites
deleted by creator
People think that govt-developed = bad. It's a consideration for sure, but if anything govt-developed is so hopelessly and inherently compromised, then many of the measures discussed here are already useless for privacy, because they almost all run over the internet, a govt-created system. Even Tor. Yet here we are anyway, because they are still useful systems.
Governments pour tons of time, money, and effort into secure communication, and not for profit, and we can still take advantage of that advancement with some caution.
History shows that you shouldn’t automatically trust encryption technologies from the US government.
- Scientific American: NSA Efforts to Evade Encryption Technology Damaged U.S. Cryptography Standard
- Washington Post: How the CIA used Crypto AG encryption devices to spy on countries for decades
Just throw your whole computer out the window.
There is plenty of space between absolute trust and its contrapositive.
Why don’t you fork Signal? Then, you’ll know the glowies aren’t funding you.
Which lines of its libre software source code are malicious? Do you know what libre software is?
Okay, be a dumbass. Why don’t you fork yourself?
Will people read these comments and leave WhatsApp or will they stop caring about privacy?
Totally pointless since the chokepoint is Signal’s US-domiciled back-end server, and Signal doesn’t allow you to self-host it.
Good luck with that 😂
Wow, the whole argument of the article is basically: funded in part by the US government = bad, plus a lot of assumptions, nothing more.
The fund is designated to: “support open technologies and communities that increase free expression, circumvent censorship, and obstruct repressive surveillance as a way to promote human rights and open societies."
One should question the commitment of a fund that dedicates itself to "obstructing surveillance" while being created by a government that runs the most expansive surveillance system in world history. And one should question how the US might define the terms "human rights" and "open society" differently from those who know the US's history in those areas.
How laughable. That is not an argument; it's nothing more than a guessing game, ignoring that there are different parts of government and that different objectives can be true at the same time.
Signal luckily never caught on with the general public in China, whose government prefers autonomy rather than letting US tech control its communication platforms, as most of the rest of the world naively allows. (For example, India's most popular social media apps are Facebook and YouTube, meaning that US surveillance giants own and control the everyday communications of a country much larger than their own.) Signal instead became used by US and Western activists, and, due to the contradictions of surveillance capitalism, now also by the general populace.
You have to be kidding, right? Championing China, which created a fucking surveillance state and is heavily monitoring its citizens, as an example?
Source for China doing what the US does?
“Feel,” “happy,” “comfortable”… Privacy doesn’t care about your feelings.
The motivation to do the work, spend time learning the risks and available mitigations, disrupt existing social relationships in order to adopt better tools, inconvenience friends and family, partially isolate one’s self by avoiding the popular systems… all of these things are part of improving privacy in the real world, and at least for many people, fueled by a person’s feelings. Don’t discount the human factors just because you can’t quantify them.
Yeah, I did use words that express feelings in this topic I created, and it was intentional. When people have to deal with something that involves uncertainty, or something so advanced they don't understand it entirely, they can become uncomfortable and scared, even though maybe there isn't anything to be scared about, or maybe the fear is justified.
My post was intended to be a discussion starter so we can dig into this, get to the truth and help everyone including myself to understand everything better.
Well, that explains how the NSA keep getting in every so often.
deleted by creator
Got to start somewhere.
I figure it’s best to assume that there is no privacy on the internet.
I've been in IT for close to 40 years and I don't say anything online that I wouldn't say in public.
Be paranoid in your estimation of how much privacy you have, but diligent in your efforts to get more of it for everyone.
Will people read this and stop using the internet or stop caring about privacy?
I’m not saying don’t use the Internet. I’m saying be aware, be careful. Don’t let companies sell your information. Use two factor authentication. Encrypt everything you can. Scan your system for malware. Don’t open suspicious emails. Be proactive, but realize at some point someone could compromise your security.
That is not “no privacy” though. Absolute privacy is probably unachievable indeed, but you can be pretty high on its spectrum.
I think a big part of it comes down to which threats exist in theory and which threats actually exist. The problem is that the theoretical threats are possible, they're not unrealistic, and that's why it doesn't feel good to be unprotected against them; but maybe we need to try to accept that they are too unlikely to be active threats. Trying to protect against theoretical threats is kind of like trying to protect your house from having an airplane fall out of the sky onto it. Or maybe this is just me trying to cope.
And how do we know which threats are theoretical vs. active? You just have to keep learning and learning; it takes a long time. Talking in privacy and security communities can help speed up the learning.
Yeah, I fully agree! Point was different though… How does it relate to your statement of “there is no privacy on the internet”? Such awareness might help gain said privacy in each area, from different threats.
We’ve had meetings spelling out to users what they should look for in a suspicious email. Then, once a week we would send out an email that was either legitimate or suspicious. We would ask them to look closely at the email and mark down on the questionnaire whether the email was suspicious or legitimate. A not insignificant number of people failed the test every week. Your average user just isn’t equipped with the mindset they need to be safe on the internet.
That is a completely different issue from “not having privacy at all” though.
Cynicism is a self-fulfilling prophecy. If everything's bad then there's no reason to care, and if nobody cares then everything will be bad.
For things to get better, or not get worse, cynics depend on others to care about those things. To me that feels terribly like freeloading.
Just because someone chooses not to be a privacy advocate, I don’t think that means it is universally accepted that they are “freeloading”.
Usually the people who I see make these kinds of arguments are the ones that don’t participate in normal society and live in a bubble, and pretend capitalism isn’t necessary for most people to live their lives.
I use Molly, a fork of Signal, in order to use ntfy push notifications.
Molly
Nice, wasn’t aware of this fork! Good share.
Is there any reason to believe the message and sender can be read from the data sent to the push service? From my understanding, that should still be encrypted.
indeed they are ☞ President of @signalapp : https://mastodon.world/@Mer__edith/111563865413484025
PSA: We’ve received questions about push notifications. First: push notifications for Signal NEVER contain sensitive unencrypted data & do not reveal the contents of any Signal messages or calls–not to Apple, not to Google, not to anyone but you & the people you’re talking to.
In Signal, push notifications simply act as a ping that tells the app to wake up. They don’t reveal who sent the message or who is calling (not to Apple, Google, or anyone). Notifications are processed entirely on your device. This is different from many other apps.
What’s the background here? Currently, in order to enable push notifications on the dominant mobile operating systems (iOS and Android) those building and maintaining apps like Signal need to use services offered by Apple and Google.
Apple simply doesn’t let you do it another way. And Google, well you could (and we’ve tried), but the cost to battery life is devastating for performance, rendering this a false option if you want to build a usable, practical, dependable app for people all over the world.*
So, while we do not love Big Tech choke points and the control that a handful of companies wield over the tech ecosystem, we do everything we can to ensure that in spite of this dynamic, if you use Signal your privacy is preserved.
*(Note, if you are among the small number of people that run alt Android-based operating systems that don’t include Google libraries, we implement the battery-destroying push option, and hope you have ways to navigate.)
Thanks, wish I’d found this earlier.
deleted by creator
Did some searching. You’re referencing a podcast in which known propagandist and liar Tucker Carlson claims that an anonymous source of his implies the NSA broke into his Signal messages. Wish you’d qualified that in the post because that’s important context.
Don’t you think it’s way more likely that the guy blew his cover some other way? Googling hotels near the Kremlin or something? You know, because he’s a dumbass?
Cite your sources if you want to make claims like these! Until then you are just spreading FUD. There is nothing supporting your statement that i could find.
Was it revealed, or did Tucker Carlson assert this with no evidence? Because what comes out of his mouth is usually bullshit.
I don’t know how the Play Store version does push notifications, but Molly, and I think the apk from their site, work just fine on degoogled phones without Google services.
I don’t remember what name it has, but missing it breaks push notifications on most “normal” apps. Many FLOSS ones are coded to have their own methods that don’t transmit data to Google, and it appears at least some versions of Signal do too.
My threat model doesn’t include state level actors taking an active interest in me, so for my purposes Signal would be secure enough, if only I got people to adopt even it.
deleted by creator
I have Signal and microG with push notifications. Signal still uses websocket on my device. So, I guess it would be fine without microG push.
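For the curious, that websocket fallback is conceptually something like the sketch below, here using OkHttp. The URL is a placeholder, not Signal's real endpoint, and the periodic keep-alive pings are exactly the battery cost mentioned in the PSA above:

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.WebSocket
import okhttp3.WebSocketListener
import java.util.concurrent.TimeUnit

fun openPushSocket(): WebSocket {
    // A long-lived connection the app holds open itself instead of relying on FCM.
    val client = OkHttpClient.Builder()
        .pingInterval(30, TimeUnit.SECONDS) // keep-alives keep the radio awake = battery drain
        .build()

    val request = Request.Builder()
        .url("wss://example.org/v1/websocket") // placeholder endpoint, not Signal's actual one
        .build()

    return client.newWebSocket(request, object : WebSocketListener() {
        override fun onMessage(webSocket: WebSocket, text: String) {
            // Each frame is just a "you have mail" ping; the message itself is
            // fetched and decrypted locally, never exposed to a third party.
        }
    })
}
```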
This is the ideal scenario as I see it, in order of importance:
- industry-standard E2E encryption using open-source software on the client (privacy)
- distributed server network controlled by many entities (resilience)
- open-source, open-standards, interoperable software on both client and server (user autonomy)
As I understand it, the goldilocks solution is therefore the Matrix stack. BUT! It’s hard to set up and nobody uses it!
The best real-world option, with feasible UX and an existing critical mass of users, is therefore Signal. It only fully meets the first criterion, yes. But personally I give it a bit of credit for the second too, in that it belongs to a non-profit foundation with multiple stakeholders, somewhat like Wikimedia. Signal will do while we’re waiting for a proper email-like open standard for secure messaging.
There are several open protocols that meet your criteria and aren't Matrix (most of them using double-ratchet encryption similar to, if not exactly like, Signal's). Due to server costs (Matrix eats a lot of RAM & storage), medium-sized entities usually bow out, so the Matrix network largely consists of a few 1–10 user servers & massive centralization around Matrix.org & the hosted servers they provide. Since almost all the messages get synced to the Matrix.org server if just one Matrix.org user is in your room or whatever, all metadata will be synced to the Matrix.org mothership, which was originally funded by Israeli intelligence.
the Matrix stack. BUT! It’s hard to set up and nobody uses it!
Is it really that hard? For me it was just downloading an app and creating an account–easier than setting up Facebook Messenger. I think it doesn’t yet have the network that Messenger/Signal/Whatsapp have, which makes it harder to use with others, but setting up has been easy in my experience.
They mean setting up your own server.
Yes it looks a bit like the Twitter-Mastodon paradigm. Nobody uses it because nobody uses it. And also because changing is hard. And also because the installation and UX is bad. Which is partly because not enough people are using it.
- distributed server network controlled by many entities (resilience)
It only fully meets the first criterion, yes. But personally I give it a bit of credit for the second too, in that it belongs to a non-profit foundation with multiple stakeholders, somewhat like Wikimedia.
These two things are not at all equivalent, or even comparable.
deleted by creator
Signal runs a service. Even if its source code is open source there’s no guarantee that that’s the code running on the server.
I don't know the protocol, but I am concerned about man-in-the-middle attacks and how safe it is from them. In this case Signal's servers must be considered a man in the middle.
The only system to trust is peer-to-peer with a proven track record of sending encrypted data over public channels.
That’s PGP and Delta Chat utilizing PGP.
Finally, someone who knows the difference between software and a service.
If the client software is open source with reproducible builds, then you don't need to care about what's running on the server. You will never have any means to confirm what's running on the server, because you don't control the server. That is why E2EE was invented.
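As a rough sketch of what a reproducible build buys you: build the client yourself and compare your artifact against the published release. (Real verification tooling typically compares the unsigned contents rather than a raw file hash, and the paths here are only examples.)

```kotlin
import java.io.File
import java.security.MessageDigest

fun sha256Of(file: File): String =
    MessageDigest.getInstance("SHA-256")
        .digest(file.readBytes())
        .joinToString("") { "%02x".format(it) }

fun main() {
    // Example paths: your own build output vs. the APK distributed to users.
    val local = sha256Of(File("build/outputs/apk/release/app-release.apk"))
    val published = sha256Of(File("downloads/official-release.apk"))
    println(if (local == published) "Build is reproducible" else "Artifacts differ: investigate")
}
```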
But if this is your argument, you could also say that Telegram is good because their client can also be built from their open source. Of course, you have to activate E2EE on a 1-on-1 chat first…
If the E2EE is enabled and the client software source is available and reproducible, then, indeed, it could be called Telegram or anything else; it doesn't matter.
The particular issue with Telegram is, as you say, the default setting. And also that its encryption algo is not universally trusted.
OK, but if the source of the server is not known, how can the client be safe?
I know how E2EE works, but couldn't a bad closed-source server still be a problem?
BTW, not trying to call you out, I just really want to know, because I can't get my head around it 🙈🙊
The message is encrypted using a key. The key exchange was done over a direct secure channel to the other client, in much the same way as you connect to your bank’s website using HTTPS. The server therefore does not have the key and can only see encrypted text.
Assuming the client software has not been compromised at either end, then the server will never see anything other than garbled ciphertext.
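If it helps, here's a bare-bones sketch of that idea: a Diffie-Hellman key agreement plus symmetric encryption, so a relaying server only ever sees ciphertext. This is not the Signal protocol (which adds a double ratchet, identity keys, forward secrecy and so on), just the core concept, using the JDK's built-in X25519 and AES-GCM:

```kotlin
import java.security.KeyPairGenerator
import java.security.SecureRandom
import javax.crypto.Cipher
import javax.crypto.KeyAgreement
import javax.crypto.spec.GCMParameterSpec
import javax.crypto.spec.SecretKeySpec

fun main() {
    // Each client generates its own key pair; only the PUBLIC halves ever cross the wire.
    val alice = KeyPairGenerator.getInstance("X25519").generateKeyPair()
    val bob = KeyPairGenerator.getInstance("X25519").generateKeyPair()

    // Both sides derive the same shared secret from their own private key + the peer's public key.
    val sharedSecret = KeyAgreement.getInstance("X25519").apply {
        init(alice.private)
        doPhase(bob.public, true)
    }.generateSecret()

    // A real protocol would run the secret through a KDF; using it directly keeps the sketch short.
    val key = SecretKeySpec(sharedSecret.copyOf(32), "AES")
    val iv = ByteArray(12).also { SecureRandom().nextBytes(it) }
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, key, GCMParameterSpec(128, iv))
    val ciphertext = cipher.doFinal("meet at noon".toByteArray())

    // This is all a relaying server would ever see: random-looking bytes, no key.
    println(ciphertext.joinToString("") { "%02x".format(it) })
}
```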
BTW, this is also the case with WhatsApp, for example. But the problem with WhatsApp is that the client software is closed source. So you have to trust them not to, for example, surreptitiously phone home with a separate copy of your message. Very unlikely, but you have no way to check when the client software is a black box.
But what’s running on the server is not the issue in either case.
Ah OK, that makes sense. So the encryption can only be broken if the secure channel of the key exchange is somehow attacked, correct? I don't ever want to use Telegram (not even in a 1-on-1 E2EE chat), but basically they are still bad since they use encryption which is not a standard and could be compromised?
(I hope that's it with all the questions I have 🙈)
Yes, compromising the key exchange would be one attack. But that’s not technically breaking the encryption, that’s just stealing the key. To do that, you need control of the client - which is a thousand times easier when it’s impossible to check the source code of the software it’s running. Otherwise, your only option is to break the encryption (i.e. discover the key) and that is gonna be very hard indeed because, unlike logins that humans use, the “password” is always completely random and very strong.
Telegram has open source client software, but it uses its own in-house encryption algorithm, which is not an industry standard. Some people think it might therefore be easier to compromise. But in any case, as you say, Telegram doesn’t even have encryption enabled by default.
The better reason not to use Telegram is because it’s a shady company with no obvious business model and therefore has an incentive to do bad things.