Submitted by ActivePersona t3_11b6wx9 in technology
Comments
Useless_Advice_Guy t1_j9wdcoe wrote
Signal won't do it 10 times
locri t1_j9we5d3 wrote
> We would absolutely exit any country
Ahah I can't explain fully but certain people (with legitimate, official power) can and will stop the UK parliament if it comes to that.
eluminx t1_j9wojwb wrote
Good to know, F any govt that advocates for this. Signal may not be perfect but it's a great alternative to the rest of the options.
citizenjones t1_j9wpp4n wrote
The letter of the law vs. The spirit of the law... Know the Difference. React accordingly.
retief1 t1_j9wq3c7 wrote
Wooo, the US isn't the only country with incompetent portions of its government!
[deleted] t1_j9wrv8w wrote
[removed]
ttustudent t1_j9wtivg wrote
Been using Signal as an alternative for a group of friends with a mix of iPhones and Android. It's been great!
Plus the desktop app is nice
1wiseguy t1_j9wua1g wrote
People in the government who say messages must be encrypted in a manner such that the government can read them don't seem to understand how encryption works.
Encryption keeps anybody from reading your messages. That's just how it works.
Edsgnat t1_j9wv2bw wrote
Or they don’t care. Governments world wide have had unfettered access to private communication for centuries. Why would they want to give that up?
velifer t1_j9wvta2 wrote
Is this that little message app that became completely irrelevant when they dropped SMS integration?
This has all the newsworthy significance of me standing up and saying the same thing.
Who fuckin cares, Signal.
BatterMyHeart t1_j9wwrbo wrote
Encryption has been legal in the west for that whole period too.
jimmyhoffa_141 t1_j9wx9wx wrote
I love Signal. I opted out of Facebook long ago, and as soon as Facebook bought WhatsApp, I switched. I gladly give them money because I truly believe in what they're doing.
hodor137 t1_j9x0ilo wrote
Not true at all. Encryption that isn't intended, and actually implemented, to be fully sender-to-receiver can easily be subverted and read by third parties. In the messaging/Signal/WhatsApp context people refer to it as "end to end encryption", but that term by itself doesn't really say anything.
I'm not sure how exactly Signal and these other messaging apps implement their encryption, but they could easily claim end to end encryption while offering governments a "back door" to decrypt and read everyone's messages. Signal is saying they won't do that.
I've never bothered to use Signal but you either have to trust their word, or they have to do a really good job proving to you that only the end users have control of their own private encryption keys. From everything I've heard, including this, they're great and trustworthy - but you still have to trust them.
duh374 t1_j9x1a2w wrote
The difference is that Signal is open source and WhatsApp is not. With Signal you can verify the encryption; with WhatsApp you just have to trust it blindly.
pucklermuskau t1_j9x1dd2 wrote
SMS lol. how quaint.
Raul_77 t1_j9x1sk4 wrote
I really like Signal, but the main issue is that my work laptop can only download apps that are in the app store, so I can't install Signal! I really wish it had a web app similar to WhatsApp's.
hodor137 t1_j9x20xo wrote
Oh yea, it's absolutely more trustworthy than Whatsapp
alsu2launda t1_j9x2dyf wrote
No need to trust Signal's word; the source code is open. You can see how every bit of data is processed in the version you are using, and in what format it leaves the device.
If you don't trust Play Store app distribution (which ideally you should not), compile the app from source and you have complete control of it, as if you had made the app yourself.
Even Signal itself can't do anything fishy. At most it can give a government basic metadata, like the times my app connected to its servers and my IP address.
TL;DR: It's not based on a proprietary model where you need to trust the app about what it is doing.
With Signal, privacy is entirely in the user's hands and messages are encrypted before leaving the app. By design, it's not possible for Signal's servers to know message content.
happyscrappy t1_j9x2it3 wrote
You can't offer a back door without risk that others will use it.
If you produce messages that can be read with two keys (the recipient's key, or a key XYZ held by the UK government), then anyone who gets that XYZ key can decrypt every message.
On top of that, the politics of back doors are just too problematic. If you give the UK a back door, then Russia can come to you and demand one too. Any government that establishes the precedent of getting a back door also opens the door to services granting access to its enemies.
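To make the second-key problem concrete, here is a minimal Python sketch (using the `cryptography` package) of a hypothetical escrowed design rather than any real messenger's protocol: each per-message key is wrapped both for the recipient and for an escrow "XYZ" key, so whoever holds the escrow private key can read every message ever sent this way.

```python
# Hypothetical escrowed-messaging sketch -- NOT how Signal or any real E2E app works.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)

recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # the "XYZ" key

def send(plaintext: bytes) -> dict:
    msg_key = AESGCM.generate_key(bit_length=256)          # fresh symmetric key per message
    nonce = os.urandom(12)
    ciphertext = AESGCM(msg_key).encrypt(nonce, plaintext, None)
    return {
        "nonce": nonce,
        "ciphertext": ciphertext,
        # the same message key is wrapped once per party allowed to read it
        "for_recipient": recipient.public_key().encrypt(msg_key, oaep),
        "for_escrow": escrow.public_key().encrypt(msg_key, oaep),
    }

msg = send(b"meet at noon")
# Anyone who obtains the escrow key -- lawfully or otherwise -- reads everything:
leaked_key = escrow.decrypt(msg["for_escrow"], oaep)
print(AESGCM(leaked_key).decrypt(msg["nonce"], msg["ciphertext"], None))  # b'meet at noon'
```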
hodor137 t1_j9x2p2w wrote
But you DO need to trust what they're doing, unless you take the steps you mentioned.
Krazy_Kitchen t1_j9x2yr9 wrote
Signal is my go-to messenger, and will continue to be.
It really irritates me that for work purposes fucking WhatsApp has assumed the default position for messaging, although more and more people are on Signal now.
Governments have no business poking into our lives by limiting or weakening encryption.
beautifulgirl789 t1_j9x32en wrote
People still use SMS?
alsu2launda t1_j9x47q2 wrote
You are not forced to trust them. You can if you want to do things conveniently.
If you truly want its full value, then you must do it the correct way: ideally compile your apps yourself if you don't want to trust the distributor. There is no other option.
Proprietary apps don't have their source code public, so they could be collecting God knows what data and sending it back to the company's servers.
It shows a flaw in the Google Play Store's way of distributing apps rather than in Signal's.
RaktPipasu t1_j9x5vzy wrote
Is it possible to use homebrew?
throwaway83756 t1_j9x6c6s wrote
Well considering you can exfiltrate data through the app I suspect your security team would remove it anyway.
glacialthinker t1_j9x7gpr wrote
Spokesperson for the Home Office, defending the bill: "It is not a choice between privacy or child safety—we can and we must have both."
If you must have both, then the work to be done is for child safety, not compromising encryption or privacy.
Prestigious_Push_947 t1_j9x81jn wrote
In this case, they're talking about scanning on the endpoint, which does get around the issue of breaking the encryption. The messages are only encrypted in transit, not on the endpoints, so this isn't an issue of misunderstanding crypto. I support Signal's stance of non-participation, but you should probably read the article before commenting.
bpastore t1_j9x83vx wrote
I don't know, Signal. I was sort of hoping that you 10,000% won't participate.
Not the level of devotion to privacy that I was hoping for but, I guess it'll do.
drawkbox t1_j9x90wo wrote
They have the ability to attach ghost users, the reason they say is moderation/spam, but no backdoor needed with that. The ghost user is able to decrypt like a regular user and syphon out the info.
This was proven with WhatsApp not too long ago and Signal also has the ability to attach users.
Any "secure" encrypted messenger that allows more than 1 to 1 connections will always have the potential for the "ghost user" problem.
At the system level, some use additional connections/recipients for spam/moderation, and the moment you allow any invisible or visible group users in, there is massive potential for an exploit.
Additionally, you have the potential for forking off messaging to other users at the system level, for either oversight or spam/moderation/other. Some of the compromised systems out there use this very well.
A sneaky way some of these "secure" messaging apps are also doing this is ghost participants in the chat that can essentially syphon off the messages even without a compromised client. The ghost participant is always under the guise of moderation or anti-spam or telemetry or some other proprietary shim.
Lots of "secure" messaging apps do this for intel and surveillance and not just the white hats.
Other areas where "secure" messaging apps have holes are the anti-spam/moderation systems that need to view messages, and the clients themselves, which have access to the unencrypted content. This is also taking place in other client apps as well: VPNs, password managers, extensions, wallets, even build systems and more. Many, like VPNs, have logs sent elsewhere but deleted locally -- with access to the entire machine and all network traffic. People are way too trusting of "secure" systems/apps that are very common today and based on trust.
All of these apps/systems would pass code checks, reviews, security inspections and essentially be encrypted/"secure" though a copy is sent off to another area for review. At runtime the leak is in the direction of the data.
Then you also have governmental oversight that opens up holes that can be exploited.
On Ghost Users and Messaging Backdoors
> to add a “ghost user” (or in some cases, a “ghost device”) to an existing group chat or calling session. In systems where group membership can be modified by the provider infrastructure, this could mostly be done via changes to the server-side components of the provider’s system.
> I say that it could mostly be done server-side, because there’s a wrinkle. Even if you modify the provider infrastructure to add unauthorized users to a conversation, most existing E2E systems do notify users when a new participant (or device) joins a conversation. Generally speaking, having a stranger wander into your conversation is a great way to notify criminals that the game’s afoot or what have you, so you’ll absolutely want to block this warning.
> While the GCHQ proposal doesn’t go into great detail, it seems to follow that any workable proposal will require providers to suppress those warning messages at the target’s device. This means the proposal will also require changes to the client application as well as the server-side infrastructure.
> (Certain apps like Signal are already somewhat hardened against these changes, because group chat setup is handled in an end-to-end encrypted/authenticated fashion by clients. This prevents the server from inserting new users without the collaboration of at least one group participant. At the moment, however, both WhatsApp and iMessage seem vulnerable to GCHQ’s proposed approach.)
Other messengers also have issues.
Signal + Telegram:
- Default settings in Telegram aren't encrypted, same with Signal
- Both sides of a Signal or Telegram conversation have to have encryption turned on
- The anti-spam filter has to check actual content (proprietary and third party in some cases)
- Shrouded spectator connections to your chat that may not be visible to you -- part of proprietary moderation/spam hooks. You could have a perfectly clean, secure software platform that can still be exposed via normal usage to get data on the client, or by someone who has access to your comms unencrypted.
- Connected through your phone number and also your location, which narrows it down to exactly you; this is more damning than using ADID, UDID or MAC, as this WILL follow you across everything.
- Users have to be identity validated before they use the app, beyond ID bridging.
- They might be bought someday by someone more unscrupulous with data, all that history going to a private equity firm.
- Clients have full access to unencrypted data, as does the server with private keys
- Even if you trust them now, they may not be trustable in the future; see LastPass for an example, or Auth0, or ad blockers/extensions, or VPNs, or even password managers that you trust. All of those need a client on your machine that will have elevated permissions and access to your unencrypted data, because they are clients.
- Source code is published with a delay after builds. Open doesn't mean much for the end binary if they are putting in proprietary areas and the hash/checksum will be different all the time. Who knows what is in it.
- Signal gets location, number, identity and more, including where you are. Extreme example: if they know when you shit, they can technically stage a robbery via third-party actors and Craigslist-style contractors while you're taking a dump. They know when you're out for the evening.
- Even if you have location tracking off, they still have your IP and device identifier, as well as geofenced notifications that don't need the location permission to be always on. A geofenced location can wake up the app at any time.
- Signal is recommended by Edward Snowden, Glenn Greenwald, Jack Dorsey and Elon Musk, as well as many other potentially sketchy people. Originally these guys played nice, but the people behind them are sketch (Elon being authoritarian-funded, for instance). Edward Snowden is in Russia and Glenn Greenwald can't say a bad word about Putin. Sketchy that they are the featured testimonials, as well as the people connected to them.
- Telegram is funded by Pavel Durov, who is essentially Russia's Zuckerberg and who is also authoritarian-funded. Durov made VK (Russia's Facebook, from the same Mail.ru/DST Global funding) and then made their "secure" messenger. Brian Acton ran WhatsApp, which was bought by Zuckerberg, then made Signal, a "secure" messenger. Similar story, same sketchiness, even if Signal is less sketchy than Facebook/WhatsApp/Telegram. If someone from Facebook/Meta broke off now and created a "secure" messenger, would you believe it and use it? Nah. You think the guys that built social media surveillance aren't just better at it with messengers? A big risk. Alarm bells should be going off if you have good opsec.
There are NO secure messaging apps, none, unless you wrote your own encryption, shared it with the other party, and encrypted before sending, entirely outside of that system. If you send email, something like PGP would have worked for a while until the backdoor (Phil Zimmermann was in decades-long cases related to this). But if you make your own encryption and send messages in the clear, you will get visits, so really only military/intel are allowed that. Spy/intel agencies do it all the time, but they shroud the messages in innocuous content, as in the Illegals Program.
There is a reason why these "secure" messengers all exploded in the 2010s...
If you think that there are any secure messengers, you are naive. There is always a way to get access to the input, a side channel, or a temporary/targeted hole, like how Russia/Saudis/MBS/Trump did with Bezos and WhatsApp. That is another area where these "secure" messengers are compromised: targeted attacks or temporary holes, which just happened recently when 1,900 people were compromised and the attackers were targeting 3 numbers among them. There is also the social hole, where any member of that chat would also have copies.
> Among the 1,900 phone numbers, the attacker explicitly searched for three numbers, and we’ve received a report from one of those three users that their account was re-registered.
OcculusSniffed t1_j9x95v6 wrote
What are you using a work laptop for non-work messaging for?
drawkbox t1_j9x9cnn wrote
Being open source doesn't make it secure. You can just view the code. There are tons of other attack vectors past that: CI/build, dependencies, ghost users, surveillance masquerading as moderation/spam checking, and so on.
Open source libraries have been owned right in front of everyone. OpenSSL had the Heartbleed hole for years, everyone owned. Log4j/Log4Shell owned every device with Java on it including all Android phones for over a decade...
drawkbox t1_j9x9o8o wrote
Being open source does not mean it is secure. If anything it means people will overly trust it.
Open source libraries have been owned right in front of everyone. OpenSSL had the Heartbleed hole for years, everyone owned. Log4j/Log4Shell owned every device with Java on it including all Android phones for over a decade...
Opening up private messages to a third party isn't a good idea. If you are on Apple, use iMessage. Apple can already get your info. Same on Google. Using an additional third-party client, as well as a desktop client, opens you up to all sorts of attack vectors; even if you trust the company, they can be hacked. Trust leads to intrusions.
OcculusSniffed t1_j9x9qv4 wrote
Using Signal for SMS is like trying to take a train to the corner store. It's not really for that.
1wiseguy t1_j9x9z7v wrote
There's no such thing as a secure back door.
A back door is code for "other people can read your message that you thought was secure".
ComicOzzy t1_j9xc5t1 wrote
Can we please just go back to 100% being enough... and also not using "100%" as a complete sentence to indicate agreement? I'm getting too old for this shit.
hodor137 t1_j9xcga4 wrote
I didn't say it was secure, or good. My point was that just because "encryption" is used doesn't mean there can't be a back door that lets a third party read your messages.
FriendlyDespot t1_j9xcpel wrote
> I'm not sure how exactly Signal and these other messaging apps implement their encryption, but they could easily claim end to end encryption while offering governments a "back door" to decrypt and read everyone's messages.
You should have stopped at "I'm not sure how exactly Signal and these other messaging apps implement their encryption," because you go on to say something that's completely wrong. Signal can't decrypt anyone's messages. The devices that are talking to each other across Signal's infrastructure use local public and private keys that Signal as a company doesn't possess.
The most that Signal could do is make the Signal software take the cleartext messages after decryption and send them somewhere, but the Signal applications are open and auditable, and something like that would be discovered, and would mean the death of the company.
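As a rough illustration of that claim (a minimal sketch of a Diffie-Hellman exchange, not Signal's actual X3DH/Double Ratchet protocol), only public keys ever need to cross the server, yet both devices derive the same secret the server cannot compute:

```python
# Minimal X25519 key-agreement sketch: private keys never leave the devices.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

alice_priv = X25519PrivateKey.generate()   # stays on Alice's device
bob_priv = X25519PrivateKey.generate()     # stays on Bob's device

# The server only ever relays these public keys:
alice_pub, bob_pub = alice_priv.public_key(), bob_priv.public_key()

def derive(shared: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"chat").derive(shared)

alice_key = derive(alice_priv.exchange(bob_pub))
bob_key = derive(bob_priv.exchange(alice_pub))
assert alice_key == bob_key   # same session key on both ends; the server never sees it
```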
hodor137 t1_j9xdcqg wrote
Or they could simply have the app upload your keys to their server.
But as others have pointed out, they open source their code so they can't do this without everyone finding out.
My point was really that the comment I was replying to was dumb - just because you have "encryption" doesn't mean no one will ever read your messages. The keys that can decrypt those encrypted messages must also be kept safe.
FriendlyDespot t1_j9xdldu wrote
> Or they could simply have the app upload your keys to their server.
That wouldn't make much sense, because the keys are ephemeral, so you'd have to upload about as many keys as there'd be messages.
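For intuition, here is a toy one-way key chain in the spirit of (but much simpler than) Signal's ratcheting: each message gets its own key derived from a chain key, and the chain immediately moves forward, which is why there are roughly as many keys as messages and why old keys can't be recovered later.

```python
# Toy symmetric ratchet sketch -- illustrative only, not Signal's exact construction.
import hashlib
import hmac

def kdf(chain_key: bytes, label: bytes) -> bytes:
    return hmac.new(chain_key, label, hashlib.sha256).digest()

chain_key = b"\x00" * 32            # would come from a DH handshake in a real protocol
for i in range(3):
    message_key = kdf(chain_key, b"msg")    # used for exactly one message, then discarded
    chain_key = kdf(chain_key, b"chain")    # ratchet forward; old state is thrown away
    print(i, message_key.hex()[:16])
```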
[deleted] t1_j9xebtu wrote
[removed]
kcabnazil t1_j9xf5jy wrote
I hope no one is downvoting this because they think it is inaccurate.
It is, however, missing the point.
Being open source means you can demonstrate security objectively, not through obscurity. It means others can not only analyze it for weaknesses, but also contribute fixes for those weaknesses.
Whether or not that open source code is what's really used to build an application... is another matter. I wonder if that can be objectively proved for Signal. It definitely can't be for others ;)
emote_control t1_j9xg058 wrote
So they're not allowed to "produce a document such that it is not possible for Ofcom to understand the information it contains"? Why are they writing a law that requires all online content to be written at or below a third-grade reading level?
TudorSuta t1_j9xgb0f wrote
You think the house of lords would oppose this?
Sigg3net t1_j9xhtci wrote
That's great. You don't want to give them access to it anyway.
Sigg3net t1_j9xhy92 wrote
You haven't heard about military grade encryption, apparently;)
kcabnazil t1_j9xi3m9 wrote
You make good points here. Multiple, actually. Using any software or hardware means putting your trust in whoever made it. Extrapolating that into something of a strawman argument/fallacy that is still completely true, using any device you didn't personally manufacture and write all the firmware/software for is opening yourself up to insecurity. The real question is, as you allude to, "who do you trust, and how far?"
However, I'd argue the semantics of, "if anything, it means people will overly trust it."
People will overly trust anything that sells them the message they want. That includes using products from big-name companies. It also means believing their IT friend Bob, who says anything open source is the way to go. Little do they know, Bob also happens to be making a dollar a day because his open source but rarely scrutinized dog-meme app is using their phones for cryptomining.
Apple's image of privacy for the iPhone is a mirage built on believable efforts and misleading reports. People still gulp it down eagerly. Signal's image of privacy is built on throwing themselves to the lions by being well known and showing their code; anyone and everyone with the capacity to look will look if it matters to them. It doesn't mean Signal is perfect, but it does mean they're putting everything on the line to prove they're doing the best they and every other contributor can. Both teams have track records, but only one is willing to show you what happened along the way.
That said, I find it very surprising that Signal has not gone the way of Lavabit. How have they evaded U.S. government gag orders while honoring their commitments? I assume no big company has; that's rather preposterous, honestly. Several have canaries for these situations.
kcabnazil t1_j9xiq9q wrote
Using Signal for SMS is like trying to have your cake and eat it too. Eventually, the curtains had to drop; but, that also means either getting EVERYONE YOU KNOW to use Signal or fragmenting your messaging clusters. It sucks that they dropped support, even if I understand the spirit of the decision.
1wiseguy t1_j9xiysd wrote
A back door literally means a third party can read your message.
In theory, it's a good third party, but there's no way to be sure of that.
1wiseguy t1_j9xjauw wrote
The military uses encryption that follows the same rules as other encryption.
They are generally more careful to make sure their crypto systems are secure. In theory.
The German military had a really good crypto system in WW2, and it was broken partly because humans made mistakes.
[deleted] t1_j9xjb72 wrote
[removed]
[deleted] t1_j9xjifu wrote
[removed]
TheFriendlyArtificer t1_j9xjj0z wrote
The only thing that has really been grinding my gears is the lack of love for Android tablets.
You can download Signal for iPad, Linux, etc. and link it as a regular secondary device. But try with an Android tablet and it refuses.
Multiple tickets and even a PR and the dev team seems to be giving less of a shit than ever.
I know there's only a few of us, but I refuse to use iOS devices (or non-rooted Android devices) and feel like a second class citizen.
einmaldrin_alleshin t1_j9xkeqd wrote
You can actually read out the public key from Whatsapp and use that to verify the encryption scheme.
But that would be of little use if they could extract private keys or plaintext messages from the device.
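For illustration, here is a simplified sketch of that kind of verification: both devices hash the conversation's public keys locally and the users compare the result out of band. (Real apps such as Signal and WhatsApp derive their safety numbers/security codes with their own formats, so this only shows the general idea.)

```python
# Simplified "safety number" sketch: compare a fingerprint of both public keys out of band.
import hashlib

def fingerprint(pub_a: bytes, pub_b: bytes) -> str:
    digest = hashlib.sha256(b"".join(sorted((pub_a, pub_b)))).hexdigest()
    digits = str(int(digest, 16))[:30]                   # take 30 decimal digits
    return " ".join(digits[i:i + 5] for i in range(0, 30, 5))

# Both devices run this locally on the keys they *think* belong to the conversation.
# If a middleman swapped keys, the numbers won't match when compared in person.
print(fingerprint(b"alice-public-key-bytes", b"bob-public-key-bytes"))
```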
Amazing-Cicada5536 t1_j9xkuhw wrote
If it’s e2e encrypted then by definition no middle man can access the unencrypted data, only the two ends. No matter how much the server/government/anyone listens in. Of course they can still get the data after the decryption on one end, so the sending device has to be trusted itself.
[deleted] t1_j9xl0wg wrote
[removed]
uwu2420 t1_j9xl3a4 wrote
There are two big issues with iMessage that Signal solves:
- iMessage only works with iOS devices
- iMessages are end-to-end encrypted, BUT they are stored in readable form in iOS backups, and since most people use iCloud backups, which by default are not end-to-end encrypted, this is used as a back door to defeat the protection. The option to encrypt iCloud backups, Advanced Data Protection, is new and only came out a couple of months ago; prior to this, there was no way at all to encrypt iCloud backups. Importantly, as a sender, you have no idea whether your recipient is taking the proper precautions, and no way to enforce it.
[deleted] t1_j9xl3ja wrote
[removed]
Immediate_Ability111 t1_j9xl7a6 wrote
Just donated to Signal for the first time off the back of this news.
Immediate_Ability111 t1_j9xlh9p wrote
Three prime ministers in a year and a half. It’s a shit show.
Heijoshinn t1_j9xn8e9 wrote
Bruh.. Lol
You clearly don't understand how encryption works to be commenting on the subject matter. Especially when you openly admitted:
> I'm not sure how exactly Signal and these other messaging apps implement their encryption
For starters, both Signal and WhatsApp use the Signal Protocol: an encryption standard that was engineered by Signal in-house. Also, Signal is open source meaning that anyone can verify their source code on how the app was constructed. Signal wouldn't tamper with their code and even if they did, Signal is set up in such a way that any adversary that wanted to snoop would need the device itself to discover the messages.
Do more research my friend.
carlosvega t1_j9xnebx wrote
But where is the proof that the app code is the same as GitHub code? 🤔 do they provide some hash or something?
frogking t1_j9xoe2r wrote
“Military grade” as in outsourced to the cheapest source and likely to be hugely overpriced when delivered?
These days, when it’s encrypted, a significant amount of time has to be put into decryption. So significant that the decrypted material will be irrelevant. (65 million years later, it doesn’t matter)
Exoddity t1_j9xoe8j wrote
They care in as much as they seem to believe that they alone will hold the keys and their own privacy would never be impinged. They're also very, very stupid.
Heijoshinn t1_j9xom8a wrote
I replied to another comment of yours regarding encryption. But this statement you made gives much more clarity on your issue of "trust" in [insert company here].
Encryption works depending on its implementation. Take AES, for example. It's a standard that's widely recognized and used by virtually everyone on the encryption scene. As a result, it's been tested, used in a multitude of ways, and people regularly try to break it. That's because AES is the standard. Since this is the case, it's less likely to have side-channel weaknesses, due to its widespread application and auditing.
Compare that to something like Twofish. It's strong like AES and is built differently. You could use this method of encryption and likely be safe. However, it's not widely used. This means it's likely not audited or scrutinized as much as AES, and since it's not used as much, its implementations are also at higher risk of side-channel attacks. Without players routinely running Twofish encryption, its level of progress is much lower than AES by comparison. This doesn't mean Twofish is necessarily inferior, just that it doesn't have the "run time" that AES has.
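For reference, here is a minimal round trip with AES-256-GCM from the widely audited `cryptography` package, the kind of standard, heavily exercised primitive described above (key handling is illustrative only; real systems derive and store keys far more carefully):

```python
# Minimal AES-256-GCM encrypt/decrypt round trip.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)               # a GCM nonce must never be reused with the same key
aead = AESGCM(key)

ct = aead.encrypt(nonce, b"attack at dawn", b"header")   # returns ciphertext + auth tag
assert aead.decrypt(nonce, ct, b"header") == b"attack at dawn"
```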
Heijoshinn t1_j9xpspo wrote
> requires providers of encrypted communications to alter their products to ensure user messages are free of material that’s harmful to children.
Oh for f***'s sake, Lovejoy arguments?? I swear, any time I hear about regulation "for the sake of the children", it's got to be conservatives. Every freaking time, they scapegoat the idea of child protection in order to impose oversight or controls on people.
Sounds like the U.S. EARN IT Act.
ArcherBoy27 t1_j9xpt95 wrote
Yup, indeed.
From 00:20 on encryption. It would be funny if it weren't so dystopian.
fooey t1_j9xpw45 wrote
They understand it
They're attempting to scare the people who don't understand it
ArcherBoy27 t1_j9xpzay wrote
Encryption in transit is HTTPS, that's not end to end.
What the bill is saying is we can't read your letters in the middle so we will read them over your shoulder instead. How comforting...
[deleted] t1_j9xqkv0 wrote
[deleted]
SirCB85 t1_j9xso2m wrote
They allow you to compile your own executable of the app from the code visible on GitHub (for systems that allow sideloading, sorry Apple fans).
drawkbox t1_j9xtbji wrote
Though the iMessage/iCloud backups are linked to a user, and everything is keyed on that. Now they can additionally encrypt, but it was always encrypted under the user.
I see this same complaint with browser password managers built into the browser (not an extension): they encrypt now, but they used to rely just on the user context. You'd have to log in as the user to be able to decrypt or access anything. Things like Signal, Telegram, LastPass, Bitwarden and other third-party-style systems don't encrypt by user; the data is encrypted, but you can break it outside the context of the user, which is not possible with backups, iMessage, or Chrome/Safari/Edge passwords.
> Importantly, as a sender, you have no idea if your recipient is taking the proper precautions, and no way to enforce it.
By default Signal/Telegram both use your number, and if one participant of the chat (even a 'ghost' user) isn't taking those precautions, or even if they are, all that data is wide open. Telegram by default has encryption off. If one of your recipients is that way, well, you are wide open.
kudoistas1 t1_j9xtph7 wrote
Respect to Signal, hold your ground!
drawkbox t1_j9xtvb4 wrote
I am more zero-trust, but if you are going to trust, trust fewer third parties, even if they seem trustable. Third parties get sold. Third parties need to make money from that because they have no other way (Apple/Google, for instance, don't need you using messaging to survive).
If you are already on a browser, a password store/generator is safer without a third party involved. The OS, browser and company already have you, why involve a third party?
Same with messaging... Trusting WhatsApp/Signal/Telegram is not only another level third party, it is your most private content... why trust a funded/private equity/questionable source system if you don't have to.
Signal does appear to be the best of them, however being open is not safer always.
The new trick is dependency/build attacks, sometimes so good the main company doesn't even know it is happening (see SolarWinds, which was hacked via its TeamCity CI; the bad bits were being put into the dependencies at build time while the code was fully independently verified). The problem is blanket trust. It is what led to the OpenSSL Heartbleed hole and the Log4j/Log4Shell hole, and pretty much every big hole in the last year was part of open source.
When a company gets their source code stolen (LastPass for instance) the point is to find dependencies they can manipulate, not even the code itself. Almost all closed code uses dependencies that are open or known, and have known holes, the key there is utilizing that when you know the code flow. Open source actually makes that part easier, no need to steal source code.
I am a big OSS fan, but I hate how devs are the weak link today. Devs today are so willing to trust a third party because they heard about it or it saves a day. Those are the MOST targeted dependencies...
drawkbox t1_j9xu5tw wrote
Agreed, security through obscurity is always a bad idea. Zero trust is the only way and less third parties helps you minimize the attack vectors.
My comment here addresses some of these points
While OSS has code to review openly, which is good for company-level trust, that is also a potential weak area where people will overly trust and let in a bad dependency that not even the company knows got compromised. It can also let an attacker target dependencies the code uses without even needing to steal the code. You can trust that a company that open-sources will make sure its code looks good and has fewer holes, possibly, but not always.
It has happened in OSS for decades now to the largest toolkits with the most eyes and broadest use, because that is the best way to get into systems now, via the devs who are the weak link sadly. As a dev I am blown away at the lack of awareness of devs and these issues.
uwu2420 t1_j9xugo6 wrote
No, iCloud backups up until a couple months ago were not end to end encrypted and it was explicitly used by governments as a backdoor to get around iMessage encryption. There’s a leaked law enforcement slide about it somewhere.
https://support.apple.com/en-us/HT202303
See under “data categories and encryption”, under “standard data protection” (which was the only option up until a couple months ago, and still the default option to this day), note how iCloud backups (including both the full contents of the device, and iMessages) are not end to end encrypted.
Telegram’s encryption is a homebuilt algorithm rather than a tried and true standard (never roll your own crypto…) and as you pointed out not on by default. So it was always inferior to Signal.
Signal by default doesn't keep its data in device backups. You'd need to build a custom client to get it to do that. There's no way to get Signal to not end-to-end encrypt its chats; it's on by default and can't be turned off.
Edit: some more links to back this up:
https://www.howtogeek.com/710509/apples-imessage-is-secure...-unless-you-have-icloud-enabled/
And the leaked slide as I mentioned earlier:
https://www.pcmag.com/news/fbi-document-shows-how-popular-secure-messaging-apps-stack-up
[deleted] t1_j9xuhfl wrote
[removed]
iain_1986 t1_j9xvcys wrote
>People in the government who say messages must be encrypted in a manner such that the government can read them doesn't seem to understand how encryption works.
Some of them absolutely do. They also know nothing will actually happen, BUT it wins votes from certain demographics.
YoureHereForOthers t1_j9xvnce wrote
There's a UK law to lower it? Wtf. As a world community (IEEE, NIST, etc.), we are quite literally in the final and post-final stages of setting new encryption standards to replace AES and others BECAUSE they're too weak and outdated.
idontevenknowlol t1_j9xvvrc wrote
Interested in this space... What are the new standards we're heading into / already in / where are things going?
drawkbox t1_j9xvw64 wrote
Good info. I wish someone had a good version of that leaked screenshot; it's so small.
The point with iCloud is that it is always under the user's security context, and that is encrypted. The backup files themselves weren't, but across the board the OS- and cloud-level access requires the user context; if you were to take those files outside the system you'd still need the auth context.
For law enforcement that is more accessible on iCloud, but for others it is more difficult like cyber criminals or ransomware and other things.
Phone backups also don't have to go to iCloud, it is wise to for not losing content, but you can still backup to desktop or other.
The point is, they aren't a third party, they don't make money only from messaging and they have a very vested interest in making sure their system is secure from third parties. If you add a third party into the mix like on messaging, you better trust it because your OS/device already can see that AND the third party. Adding more attack vectors is really security by obscurity, but with more obscurity.
> Signal by default doesn’t keep its data in device backups. You’d need to build a custom client to get it to do that. There’s no way to get Signal to not end-to-end encrypt it’s chats, it’s on by default and can’t be turned off.
This all falls apart when a participant is added (ghost or actual) that gets the entire convo. This is very common in messaging apps and a known issue on WhatsApp, Telegram and many others, and Signal also has the ability to attach users to convos. The moment you have another participant, all the end-to-end encryption is moot.
Focusing on just encrypted backups is probably what third parties want people to focus on, because they are third parties and want you to use them. But even if you trust them, how long can they be trusted? When they get bought out by private equity later, that can get bad; then they might sift everything. It is a lot like those VPNs that say "we retain no logs" but divert them to a third party, and when it is reviewed the logs surely aren't there, but they are still out there.
KafkaesqueBrainwaves t1_j9xw4z4 wrote
I just wish they didn't drop support for SMS/MMS
SIMPLYadumb t1_j9xw951 wrote
While I'd like to trust what Signal says, what companies say and what they do often don't align.
uwu2420 t1_j9xwdgk wrote
The backup files not being encrypted is the whole point though. What good is everything else being encrypted if you can just subpoena or get a copy of the backup where all of that stuff that was encrypted is in plaintext lol
> Phone backups also don’t have to go to iCloud
Yes, but it’s on by default, and the majority of users have it turned on. Advanced Data Protection means you’re giving up a lot of account recovery options so most users don’t have that on, plus it’s only ~3 months old.
> This all falls apart when a participant is added
Well.. then just don’t add participants you don’t trust to your group chats…
> Focusing on just encrypted backups
But it’s a big issue lol. See above and refer to the slides I linked in the earlier comment. Again, what good is the encryption if there’s an easily available plaintext version too?
> When it is bought out by private equity
Signal is open source and you can run your own server if you want.
ThimeeX t1_j9xwuig wrote
Probably not on a locked down corporate laptop. Plus they often come with corporate software that specially scans for unauthorized software, so it's not worth the risk of trying to bypass.
Lots of people don't seem to realize how locked down and monitored work laptops are, you really don't want to run personal Signal accounts or anything else for that matter on these devices.
ModernCoder t1_j9xx0pp wrote
Tbh, there has been research that would somehow allow the encryption to work ONLY if certain words/phrases are not present.
It's a stupid thing, but some genuinely want to do that. And it literally makes the encryption pointless if they do, because then some "trusted" party makes a list of words and phrases, which can be any fucking thing.
Tiggywiggler t1_j9xx8nn wrote
I'm in the UK and I use Signal all the time. I would be sad to see it go, but immensely happy to see Signal stand by its principles. I would miss it, but if they do scorch the earth on their way out I would hold the can of petrol for them. For once I just want to see a company do what's right before profit.
drawkbox t1_j9xxebn wrote
I mean pretty much anything in a cloud should be considered secure from everything but law enforcement.
The point is you still need the user context/auth. These files only work with the OS to access them. Like an iOS app or Windows Store data/settings in an app, that is specifically signature/encrypted to your user. Outside of that context it is useless. Third party ones are usually not tied to OS/browser/app for a reason.
I think most people are worried about hackers/ransomware/criminals over law enforcement so if you use the cloud that is why people are willing to make that tradeoff. The most insecure place is the local systems most likely, very easy to compromise a user compared to Apple/Google/Microsoft. It is possible but way more difficult. You almost have to be rogue state level funded for that.
> Well.. then just don’t add participants you don’t trust to your group chats…
Sure, but there is a 'ghost' user ability. In many messengers this is used to look for spam/moderation or other potentially nefarious reasons. Any chat system that has the ability to connect more than two people has the potential for you to not see the user. This is the most common use like in honeypot apps.
Encrypted messaging app used by criminals was actually an FBI honeypot
>> The encrypted messaging app in question was called ANOM, and was installed on special smartphones that couldn’t make calls or send emails. ANOM purported to be end-to-end encrypted, meaning only the sender and receiver could view messages. In reality, every single message was passed to police, who used them to make the arrests.
Ghost users is a major problem in "secure" messaging apps. There are plausible deniability reasons for them, spam detection, moderation etc, that is the rub.
> if there’s an easily available plaintext version too?
It is only plaintext in the context of the user... Taking it out of that context it is no longer. People make this claim about browser password managers but everything is tied at the system level to the user. Sure if the person gets the user context then they can get the files unencrypted, that is how it works. That would mean everything is compromised even your "secure" third party messenger like Signal.
> Signal is open source and you can run your own server if you want.
Yes. Still doesn't mean a third party that relies on messaging only is trustable.
Apple/Google/Microsoft have a vested interest in securing all your content; you don't have to worry about them stealing messages or siphoning them off.
All of them will be open to law enforcement most likely because there are so many attack vectors in systems and especially third parties that don't have the sophistication at the cyber security level simply due to cost.
YoureHereForOthers t1_j9xxi65 wrote
NIST is the entity heading the crypto competitions. The two I'm aware of / contributed to are Lightweight Cryptography (LWC) for small embedded IoT devices, and Post-Quantum Cryptography (PQC) for larger devices like computers, aimed at securing our devices against quantum attacks that can easily break RSA and cut AES's effective strength in half.
The competition sites I know of
https://csrc.nist.gov/Projects/lightweight-cryptography
https://csrc.nist.gov/projects/post-quantum-cryptography
There are links to the remaining algorithms in the finalists/round 4 respectively.
CRYSTALS-Dilithium for PQC digital signatures is already being used in some new specialized processors, among a couple of others. CRYSTALS-Kyber is likely to replace RSA.
Last I looked into it, I think Grain-128AEAD will be the new LWC algorithm.
uwu2420 t1_j9xyhri wrote
> I mean pretty much anything in a cloud should be considered secure from everything but law enforcement
Again, nope. If a cloud service is truly end to end encrypted, and designed well, nobody but the end user should be able to access the data. Yes, even if there is a subpoena.
> The point is you still need the user context
Or if you have access to the files on Apple’s server, then no user auth is required.
> These files only work with the OS to access them
Again, no. There are many commercial and open source tools that are able to read the backup file for you. Elcomsoft, iMazing and the Citizen Lab Mobile Verification Toolkit are some examples.
> Most people are worried about hackers
It wouldn’t be the first time someone’s iCloud account was hacked into.
> there is a “ghost” user ability
Show me where in the Signal code there is this functionality. Again, it’s open source, so a honeypot would be quickly found. Also, if you’re worried about state level honeypots, note that retrieving an unencrypted iCloud backup is a lot easier.
> It is only plaintext in the context of the user…
…and anyone with access to the files on Apple’s servers, which aren’t only subpoenas but also hackers, governments that don’t respect human rights, etc. which is the whole point of having end to end encryption, even the service provider themselves should not have the ability to access the data on your account.
Do you not understand the point of end to end encryption? The whole point is that nobody, not even the service provider hosting the cloud service can access your data.
drawkbox t1_j9xzevz wrote
> If a cloud service is truly end to end encrypted, and designed well, nobody but the end user should be able to access the data.
I agree this is just not the case with so many holes and side channels out there. The cloud is good for securing content from others, oversight will always find a way. Anyone that thinks otherwise is a suka.
> Or if you have access to the files on Apple’s server, then no user auth is required.
User auth is still required, but yeah, you could hack Apple I suppose and get it. Good luck though.
> There are many commercial and open source tools that are able to read the backup file for you. Elcomsoft, iMazing and the Citizen Lab Mobile Verification Toolkit are some examples.
If those apps are getting the user context then sure. If not then no.
Take Elcomsoft, for instance, with LastPass vs. browser password managers. That is why you don't install clients or extensions like LastPass.
Read this closely:
>> Windows Data Protection API Not Used
>> One may argue that extracting passwords stored by the Google Chrome browser is similarly a one-click affair with third-party tools (e.g. Elcomsoft Internet Password Breaker). The difference between Chrome and LastPass password storage is that Chrome makes use of Microsoft’s Data Protection API, while LastPass does not.
>> Google Chrome does, indeed, store user’s passwords. Similar to third-party password managers, the Windows edition of the Chrome browser encrypts passwords when stored. By default, the encrypted database is not protected with a master password; instead, Chrome employs the Data Protection API (DPAPI) introduced way back in Windows 2000. DPAPI uses AES-256 to encrypt the password data. In order to access passwords, one must sign in with the user’s Windows credentials (authenticating with a login and password, PIN code, or Windows Hello). As a result, Google Chrome password storage has the same level of protection as the user’s Windows login.
>> This, effectively, enables someone who knows the user’s login and password or hijacks the current session to access the stored passwords. This is exactly what we implemented in Elcomsoft Internet Password Breaker.
>> However, in order to extract passwords from Web browsers such as Chrome or Microsoft Edge, one must possess the user’s Windows login and password or hijack an authenticated session. Analyzing a ‘cold’ disk image without knowing the user’s password will not provide access to Chrome or Edge cached passwords.
>> This is not the case for the LastPass Chrome extension (the desktop app is seemingly not affected). For the LastPass database, the attacker will not need the user’s Windows login credentials of macOS account password. All that’s actually required is the file containing the encrypted password database, which can be easily obtained from the forensic disk image. Neither Windows credentials nor master password are required.
>> macOS has a built-in secure storage, the so-called keychain. The Mac version of Chrome does not use the native keychain to store the user’s passwords; neither does the iOS version. However, Chrome does store the master password in the corresponding macOS or iOS keychain, effectively providing the same level of protection as the system keychain. Elcomsoft Password Digger can decrypt the macOS keychain provided that the user’s logon credentials (or the separate keychain password) are known.
Elcomsoft mentions the OS level protections on these.
> It wouldn’t be the first time someone’s iCloud account was hacked into.
If someone gets into iCloud they are most likely getting into the device and again, the point of a "secure" messenger or cloud falls apart because they have access to their user. Yes, people should be careful with their user, it opens up everything.
> not even the service provider hosting the cloud service can access your data.
If you believe this then you believe in magic. Even if a provider tried to do this, software has holes... See OpenSSL/Log4j/Log4Shell/on and on and on and on... The fact that you trusted it because they said they don't look, it was probably a lie, but even if it wasn't they can get in.
megahamstertron t1_j9xzzpq wrote
As soon as she said it was to protect the children™, it was obvious it was BS.
uwu2420 t1_j9y02jf wrote
Yeah I agree there are always vulnerabilities in software, but the thing is, as far as I know, there aren’t any known bugs that would leak data from Signal so far despite all the security research attention it gets, and plenty of evidence that it’s safe.
Meanwhile, I’ve already explained how it’s trivial to get around the end to end nature of iMessage for a large majority of users.
If you don’t care about your conversation being end to end encrypted, then yes, by all means, use iMessage or even just plain SMS. Much easier. But if you do care, I’m not sure why you’d shoot yourself in the foot with the option known to have a major workaround.
JeffreyMarsalek t1_j9y1bpe wrote
Chill bosses until the next bad quarter (for public companies)/recession (for private ones)
crowsaboveme t1_j9y1fah wrote
It's the intro speech for every right taken away.
ArcherBoy27 t1_j9y1lvz wrote
My favorite part is the end of that segment.
Basically says "we will read your messages, but only to protect children, only for that purpose". Like that is any better, and it offers no guarantees that it won't be misused.
carlosvega t1_j9y2aau wrote
Yeah, that I know, but I was wondering if they publish the MD5 of the APK or compiled app so that you can test it later, or if it's possible to check the MD5 of apps downloaded from the store. I am not sure why I am downvoted; I think it is a legitimate question.
Some bad guys could fork the app, add some changes and publish it in third party stores.
Something similar to this: https://www.infosecurity-magazine.com/news/malicious-python-libraries-found/
And I am not the first one asking this question:
Edit: a colleague just shared this with me! https://signal.org/blog/reproducible-android/
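The blog post linked above describes Signal's reproducible Android builds: you build the APK yourself from the GitHub source and compare it against the APK pulled off your phone. As I understand it, their documented check compares APK contents while ignoring signing metadata, so a raw whole-file hash won't generally match; still, the digest step looks roughly like this sketch (file names are placeholders, and SHA-256 is preferable to MD5):

```python
# Rough sketch of comparing a locally built APK against the store-installed one.
import hashlib

def sha256sum(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):   # read in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest()

local = sha256sum("Signal-built-from-source.apk")   # built yourself from the GitHub tag
device = sha256sum("Signal-from-device.apk")        # pulled off the phone (e.g. via adb)
print("identical" if local == device else "differs (expected if signing data is included)")
```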
drawkbox t1_j9y2ig6 wrote
Signal definitely seems the best out of them if you are into using a third party messenger, for now.
I would still trust OS-level messaging on mobile over third parties because of the scale, future funding, incentives and trust. The OS already has access to your info. Other people getting access to your data is probably always easier on third-party systems; even if the third party is trustable, not every person or dependency is.
iMessage is secure, if you are going straight SMS yes that is more open. I also know what Apple wants and their goals fully, that is a secure platform that isn't just messaging.
The fact is though, every system has holes and security issues, so the best opsec is less third parties, big or small or open or closed...
Just ask Jeff Bezos after he got hacked via WhatsApp temp hole by something sent to him by freaking MBS.
SirensToGo t1_j9y2j5n wrote
Don't put anything on your work laptop that you don't want the company to have or, alternatively, entered into the legal record when someone sues your company and your data gets taken for discovery
HRKing505 t1_j9y306c wrote
> "[...]we want you to invest in technology to get around this [encryption] problem, just for child abuse and sexual exploitation not for anything else"
uh huh....
She then goes on to say "we're not ending encryption in any way". So to phrase it all in another way: "we're not taking away the locks on your doors, all the doors will just use the same key."
uwu2420 t1_j9y3ix1 wrote
The thing you’re missing is, if you’re using Signal you specifically want an end to end encrypted chat service.
The OS doesn’t upload that data in unencrypted form. You can analyze for yourself with a MITM proxy.
iMessage is secure, as long as you don’t care for your messages remaining end to end encrypted, and you trust Apple to have a copy of your messages. Is that reasonable to you? Maybe, maybe not.
With Signal, the whole idea is you shouldn’t have to trust Signal with your messages, because they don’t have the ability to read them, even if they wanted to. Yes, it could have holes like any software, so this can really be simplified to: use the tool with a known issue for sure, vs use the tool that might have an issue in the future but for now is known to be safe.
ArcherBoy27 t1_j9y3lmt wrote
Pretty much, "we are not banning locks, of course not. But we will be requiring there to be a cutout in the frame so we can push the door open and walk in to check your kids. We will only walk in to check on your kids, promise"
drawkbox t1_j9y44d1 wrote
> end to end encrypted chat service
End to end means nothing though when a client isn't in your control and may have another user attached. Again, common in many messaging apps.
iMessage is end to end encryption to iMessage, just like Signal to Signal. If you use SMS it is not.
> We designed iMessage to use end-to-end encryption, so there's no way for Apple to decrypt the content of your conversations when they are in transit between devices. Attachments you send over iMessage (such as photos or videos) are encrypted so that no one but the sender and receiver(s) can access them
Apple can't magically make SMS secure; it is not secure by default, as it really came from telephone diagnostics and was repurposed for messaging. So when you message Android, it goes over SMS. SMS came out of SS7 and was really only for diagnostics or test messages to customers. MMS is better but still not that great. Apple should bring iMessage to Android and do better on messengers; the current situation is leading many people to third parties that open up opsec issues.
> With Signal, the whole idea is you shouldn’t have to trust Signal with your messages, because they don’t have the ability to read them, even if they wanted to.
That is a bold statement. Yes, on the surface. Again, ghost users, compromised clients, endpoints can have problems. Also Signal does have a proprietary shim for monitoring, spam checking and other things, could easily be used to surveil or sift. There are probably a dozen ways or more to get around it currently at the dependency level or breaking their encryption as it is custom.
The best opsec is always less third parties.
the_mystery_men t1_j9y4cve wrote
Yeah, this bugged me too. It's hard to get people to switch, but I used to be able to convince them to use it as their SMS app too, which made it easier. I can't remember the reasons they dropped support; I'd assume because it's easier to develop without the unsecured bit.
uwu2420 t1_j9y4q7s wrote
We can assume the majority of people go with the default settings. iMessage conversations, by default, end up in a backup file on iCloud that is not end to end encrypted. Signal conversations, by default, do not.
> We designed iMessage to use end-to-end encryption
Which again is worthless when a plaintext version is also being uploaded in the form of iCloud backups, which are on by default. The same isn’t true for Signal.
You’ll have to provide a source for your claims about Signal’s ghost users. As far as a compromised client, that’s not something any messaging client can defend against, and if a service is end to end encrypted, it shouldn’t matter if the endpoint server is entirely compromised.
> breaking their encryption as it is custom
That’s the cool part, it’s not. It’s based on the same old algorithms that have been around for years. ECDSA, ECDH, AES, etc. if I remember correctly
It depends on your definition of opsec but just a reminder that we’re in a thread about Signal and end to end encryption specifically.
drawkbox t1_j9y5gri wrote
You dismiss user-level auth/encryption like it is nothing. If anyone had access to user-level auth, why go to the backup file? Just go to the device, load up Signal and scrape their messages.
All other messaging apps have the "ghost user" problem confirmed.
Signal has a shim for spam that is very unclear, I'll just say that. There are other things around Signal that make it sus in my opinion.
> The source code for spam detection is not public.
So there is a plausible deniability reason to hide some code... you have to trussssst. Here's the kicker, you can't check for spam if you aren't seeing the message.
Even on the self installed versions...
> Signal's servers are partially open source, but the server software's anti-spam component is proprietary and closed source due to security concerns
Signal does handle users being added better, but this could just be theater as well.
> The real problem with the GCHQ proposal is that it targets a weakness in messaging/calling systems that’s already well-known to providers, and moreover, a weakness that providers have been working to close — perhaps because they’re worried that someone just like GCHQ (or probably, much worse) will try to exploit it. By making this proposal, the folks at GCHQ have virtually guaranteed that those providers will move much, much faster on this.
> And they have quite a few options at their disposal. Over the past several years researchers have proposed several designs that offer transparency to users regarding which keys they’re obtaining from a provider’s identity service. These systems operate by having the identity service commit to the keys that are associated with individual users, such that it’s very hard for the provider to change a user’s keys (or to add a device) without everyone in the world noticing.
> As mentioned above, advanced messengers like Signal have “submerged” the group chat management into the encrypted communications flow, so that the server cannot add new users without the digitally authenticated approval of one of the existing participants. This design, if ported to in more popular services like WhatsApp, would seem to kill the GCHQ proposal dead.
I personally don't trust Signal for a few reasons beyond these items, but if you trust them then rock on.
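(For the curious, the "commit to the keys" idea in the quotes above is basically a public, append-only log of user-to-key bindings that anyone can replay. A toy sketch with made-up names; real key-transparency designs are far more involved:)

```python
# Toy append-only "key transparency" log (illustrative only, not any real system).
import hashlib, json

log = []  # public, append-only list of (user, public_key) bindings

def publish_key(user: str, public_key: str) -> str:
    prev_head = log[-1]["head"] if log else ""
    body = {"user": user, "key": public_key, "prev": prev_head}
    head = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "head": head})
    return head  # clients pin/gossip this head so history can't be rewritten silently

def audit(user: str) -> list:
    # Anyone replaying the log sees every key ever bound to a user, so the provider
    # can't quietly attach a "ghost" device/key without it showing up here.
    return [entry["key"] for entry in log if entry["user"] == user]

publish_key("alice", "pk_alice_phone")
publish_key("alice", "pk_mystery_device")  # a sneakily added key is visible to auditors
print(audit("alice"))  # ['pk_alice_phone', 'pk_mystery_device']
```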
uwu2420 t1_j9y65ub wrote
> You dismiss user level auth
To scrape my Signal messages, you need access to my physical device, and you need my passcode. To get access to iMessage messages, all you need to do is get my latest backup, or the backup of the person I talked to, off of Apple's servers (for example, with a legal request), which completely bypasses the need for any user-level auth/encryption.
Agree to disagree
drawkbox t1_j9y8ajk wrote
> To scrape my Signal messages, you need access to my physical device, and you need my passcode.
Same with getting access to your iCloud/Apple account.
> To get access to iMessage messages, all you need to do is get my latest backup, or the backup of the person I talked to, off of Apple’s servers, for example, with a legal request, which completely bypasses the need for any user level auth/encryption.
As I said, anything stored in a cloud will have some oversight. Anyone that thinks storing something in a cloud is secure from oversight is dim.
If you think anyone from law enforcement can just get into your iCloud/Apple account, that would also mean they are able to access your device and everything on it, including your “end-to-end encrypted” Signal messages that sit in plaintext on your client.
Messaging apps also get cam/mic/location/contacts permissions, Signal is no different, one more entity with your face/voice/place/friends.
You can trust Signal, a third party; rock on. Acton is involved in both: he came from WhatsApp, went to Facebook, and then, when people stopped trusting Facebook, he bankrolled the Signal Foundation to catch those leaving. The story is that he didn't trust Facebook (no one should), but can you trust Signal/Acton, or is it a front? You decide. The problem is that once you trust them that much, they've got you. I mean, Elon Musk and Edward Snowden recommend it... it must be safe /s. Signal is maybe safe, maybe even safe from some of the Five Eyes, but not from all the eyes outside the five. Even then, there are always ways in via dependencies, and devs (mainly devops) are the weak link today, sadly.
Agree to disagree, good discussion though.
[deleted] t1_j9y8hm6 wrote
[removed]
jens-2420 t1_j9y8oor wrote
100% is real. More is a lie.
uwu2420 t1_j9y8y3c wrote
> Same with getting access to your iCloud/Apple account
No you don't, because it is stored unencrypted on Apple's servers, and Apple themselves will hand it over (for example, in response to a legal request). If you go this route, you don't need the physical device or any of the user's passwords. The user won't even know it's happened until they are told.
The difference is that in one case, my password and my physical device are needed. If you want those, you'll have to physically get my device from me and then get me to tell you my passcode. The other is just stored on Apple's servers. If you don't see how one is much harder than the other, I dunno what to tell you lol
Edit: no need to believe me, believe American law enforcement instead and refer to the leaked slide I posted earlier.
Catji t1_j9y9855 wrote
British comedy.
vs. American comedy.
Clowns to the left, jokers to the right.
drawkbox t1_j9y9geb wrote
"Unencrypted," but sitting behind encryption tied to the user's account context... so yes, not double encrypted. You can encrypt it, though.
Messaging apps also get cam/mic/location/contacts permissions, Signal is no different, one more entity with your face/voice/place/friends.
We definitely agreed to disagree.
CuppaTeaThreesome t1_j9yayjo wrote
Every single call, message, email, chat: every communication the government, the PM, ministers and the rest deal in should be held to higher compliance standards than a financial body. Every purchase traced, every figure used for a report known. Nothing made up. Just the same as every financial house has had to do since 2008. Yet they are the corrupt ones calling for others not to use encryption.
They are criminals.
rdyek t1_j9ycbga wrote
Signal seems like a Russian op.
SlapNuts007 t1_j9ydw7j wrote
I wish it had more automation around downloading media to the device.
RepresentativeBid846 t1_j9yey59 wrote
Well, we know which app will be the next to be banned by our benevolent governments.
[deleted] t1_j9yfcy2 wrote
[removed]
[deleted] t1_j9yff36 wrote
[deleted]
[deleted] t1_j9yfl5j wrote
[deleted]
[deleted] t1_j9yfr9z wrote
https://gizmodo.com/authorities-claim-they-accessed-encrypted-signal-chats-1848361100
If authorities want access, they will find a way.
iByteABit t1_j9yg641 wrote
The key difference here is that it's not driven by profit. Sure, they get donations, but the developers are doing it solely out of interest and believing in a safe and private internet, I doubt the donations are even enough to make a living out of.
There's also a lack of the business layer of a company that's usually comprised of greedy snakes that wouldn't think twice about making double profits by sacrificing their morals.
Otterism t1_j9yh2qn wrote
This is most likely a case of that. If authorities got into the phone of any participant in that group chat, they could read it.
That's the thing with all these apps: they're very user friendly for the owner of the phone. Just open the app and everything is available (unless you set a separate code in the app, which is typically opt-in).
[deleted] t1_j9yh4jw wrote
[deleted]
[deleted] t1_j9yh9m1 wrote
[deleted]
[deleted] t1_j9yhh24 wrote
I have my convos set to disappear after an hour, but I still get nervous about it. I just assume anything I type on a phone/computer can be found.
Otterism t1_j9yhk18 wrote
This is all very good, but what I really like about Signal is all the pretty files it provides...
Sharknado4President t1_j9yhuc5 wrote
Public key cryptography lets anyone with the private key decrypt. Signal could modify the client app to transmit your private key to the UK government. Not sure what point you're making here; it's absolutely possible.
360_face_palm t1_j9yi7bw wrote
they 100% will though
360_face_palm t1_j9yikts wrote
This has been the perennial problem with government: they're complete fuckwits, especially when it comes to any kind of modern technology.
gamergirlpee69 t1_j9yip79 wrote
Signal is the only messenger left that hasn't turned into a steaming shit pile of bloatware.
IamfromSpace t1_j9yjlt6 wrote
I’d love to see a paper linked, because that sounds absurd from the outset, lol
ChosenMate t1_j9yk80a wrote
Your finishing sentence just isn't right, unfortunately
karmaputa t1_j9yks9t wrote
And TBH because no one outside the US still uses SMS/MMS.
the_mystery_men t1_j9ym6ak wrote
Yeah, true, but at least here in the UK it's still used by orgs and for 2FA etc., so it was easier to get someone to install it on Android since you could set it as your default SMS app. But no one wants the aggro of trying a new messaging app.
richalex2010 t1_j9ymd3x wrote
The issue is that it doesn't sound absurd to a politician or a regular person.
dethb0y t1_j9yn5xi wrote
Someone needs to start standing up to the madness going on over there.
addiktion t1_j9yo27j wrote
My one workaround is that I own the company too, haha, but yes, take this advice. It doesn't matter how well-mannered or awesome your bosses are: when shit hits the fan, however that happens, your personal shit on a company laptop is only gonna work against ya.
ttustudent t1_j9yt5o2 wrote
Well that's lame.
NextFaithlessness7 t1_j9ytgel wrote
They've only had access to private conversations for the last 100 years or so, since telephone lines have existed.
Edsgnat t1_j9yu2vs wrote
Governments couldn’t intercept mailed letters or packages?
ModernCoder t1_j9yv6zw wrote
Here is one of the papers I managed to remember: https://drops.dagstuhl.de/opus/volltexte/2022/15600/pdf/LIPIcs-ITCS-2022-4.pdf
NextFaithlessness7 t1_j9yvj3q wrote
They can't read every letter. Also, intercepting your specific letter among hundreds of others in a pile is kinda difficult.
Spanky007_bong_867 t1_j9yxyym wrote
Signal is how I keep in touch in Ukraine. Good for Signal
serilian t1_j9z02eu wrote
It's a non-profit: https://signalfoundation.org/
altrdgenetics t1_j9z5ipe wrote
And as an owner, at least in the US, it is better to keep them separate for the tax man too. It makes it easier to clearly denote personal vs. business assets.
Edsgnat t1_j9z5ns4 wrote
They can read every letter, and governments have a long history of doing exactly that. Most states (in the broad sense, not the US sense) control postal systems. And while private delivery in some form has also existed, almost all states have the ability to seize private property.
All postal services have access to your private communications because they have physical control over it. Letters and packages cannot be encrypted to the same level of sophistication that electronic communication can, meaning they’re almost always understandable by the receiver. Any deliverer of mail, at any time, can open a random letter, read it, and understand it. Any government can seize that letter and do the same. Unfettered access to private communication.
Looking for a specific letter — or specific content in a group of letters — is a question of incentive, resources, and law. For centuries (and let’s be real, millennia) states have had incentives to control messaging through censorship, seize contraband, investigate criminal activity, change private votes, you name it. Almost all states have the resources to pay large numbers of people to deliver — or intercept the delivery of — mail, and to sift through mail, read it, and censor it or what have you. In the last few hundred years, a large number of states have prohibited themselves from doing this indiscriminately, while still reserving the ability to do so. Other states have no constitutions and can do so indiscriminately.
Electronic communication of some kind or other has frustrated the state's ability to intercept private communication. So far states have had the resources to develop technologies in response, like wiretapping, for instance.
Encrypted messaging used to be a thing states did to keep secrets from each other, but now the state’s citizens can do the same on a scale unprecedented in human history. Until recently, states had the resources and time to break encrypted communication. Now technology has advanced to the point where they have neither. Thus states have incentives to intercept private communication between citizens (see above) but no ability to.
Hence, they want a back door.
altrdgenetics t1_j9z5rt1 wrote
But heaven forbid we do anything about guns getting into schools, or superintendents leaving theirs on the shitter in the bathroom... Ya know, for the children.
Mccobsta t1_j9z6htg wrote
The gov will be fucked, as they use this too.
Street_Masterpiece35 t1_j9zdgk3 wrote
Sort of. The technology didn't exist to search all phone calls, so it's a little different, but yes, a percentage of targeted people's post could be read and their phone calls tapped.
AvaX90 t1_j9zi28n wrote
1000% is better than 100%.
Sa404 t1_j9zj6bq wrote
This is a lie. If it comes down to implementing it or getting banned from the UK, they'll undoubtedly go for the first option.
Metholis t1_j9zj8zf wrote
Does Telegram have the same level of encryption?
Sa404 t1_j9zjaml wrote
Probably because it’s extremely expensive too
wim-wac t1_j9zns33 wrote
What is the ownership % for Android tablets vs. iPads, I wonder? I sort of just assumed Android tablets were no longer a thing. Do they still run Android, or is it ChromeOS?
Prestigious_Push_947 t1_j9zop4q wrote
There are a lot more kinds of encryption in transit than HTTPS; Signal is absolutely not using HTTPS as the protocol that protects your messages in transit. That said, yes, the British proposal is wrong, as I said in my post. But if you're going to criticize them without understanding the proposal, you hurt the effort to counter it.
Prestigious_Push_947 t1_j9zp767 wrote
WhatsApp doesn't encrypt metadata, though, which is a massive privacy weakness. There's absolutely no excuse to be using WhatsApp.
Prestigious_Push_947 t1_j9zpk9k wrote
You should probably read the article.
Prestigious_Push_947 t1_j9zq900 wrote
Telegram and WhatsApp are both privacy nightmares.
Tiggywiggler t1_j9zqj2a wrote
TIL. Thank you!
IsilZha t1_j9zqree wrote
Any encryption with a universal back door is bad encryption.
franklindstallone t1_j9zt0qp wrote
Problem is, most Brits like being oppressed by their government, so the bill will pass and the people will be proud.
vriska1 t1_j9zv3xd wrote
Well, the UK is about to enter a recession, meaning Ofcom is likely to be severely underfunded and unable to enforce 90% of the bill.
drever123 t1_ja00mlh wrote
Lmao what are you sending that you are so paranoid about the government finding out?
Slawtering t1_ja01gq1 wrote
These guns in the schools?
[deleted] t1_ja02xv5 wrote
[deleted]
elliotborst t1_ja038k5 wrote
Signal is good; I just wish it didn't destroy video and photo quality when you send them.
LochAwe t1_ja03t2c wrote
Fucking great. First Brexit; now Signal.
geekmoose t1_ja0589y wrote
It’s UK, not US.
BAG1 t1_ja07yfa wrote
100, due to the nature of fractions and our pesky base-10 math, is the highest percentage of effort possible. There's just no way around it. If you think you gave it your all last time but this time you really gave 110%, no, you didn't: THAT was your 100%. (And it probably wasn't even your 100%, because that's what people say when they want you to think they did more than they actually did.)
And that's what grinds my gears.
NoPriorThreat t1_ja0ffen wrote
Not really, it is more like having a camera watching your front door and checking who is entering.
BamBam-BamBam t1_ja0foa6 wrote
That's how "good" encryption works. FTFY. Somebody is telling them that symetrical encryption is "good enough."
ArcherBoy27 t1_ja0g2bk wrote
I never suggested they were, just stated that only encryption in transit isn't e2ee. If it was, this wouldn't even be an issue in the first place. I understand the proposal just fine.
veritanuda t1_ja0jogp wrote
> This was proven with WhatsApp not too long ago and Signal also has the ability to attach users.
That maybe true for Whatsapp but Signal has worked hard to tackle that.
Adding users to groups is not possible unless they are already in your contacts in the first place, since Signal pulls from your local contacts. But it never shares the number, and only if other people already have that same number in their contacts will it be shown to both parties.
edit:
> Default settings in Telegram aren’t encrypted, same with Signal
That is plain wrong. Telegram does not end-to-end encrypt by default, and not at all in channels. Signal ALWAYS encrypts one-to-one and group chats.
I am not going to go through picking apart all you said, suffice to say not all of it is accurate.
voodoovan t1_ja0k03q wrote
The UK Gov can just ask the US Gov for any data, as they already have access to Signal. Or the UK could threaten to ban them like the US does to other non-US companies.
xyzone t1_ja0kp97 wrote
>almost all states have the ability to seize private property.
What do you mean 'almost all'? All private property everywhere is enforced by a state.
Prestigious_Push_947 t1_ja0pzke wrote
Okay, so you misunderstand what encryption in transit, E2EE or HTTPS are, got it. You know the words, but you don't actually understand any of them. E2EE is by definition encryption in transit. It is encryption from end to end, between two ends, whilst transiting between them. All E2EE is encryption in transit, though not all encryption in transit is E2EE.
HTTPS is just a kind of encryption in transit that protects HTTP traffic while it moves over the wire. There are lots of other protocols that provide encryption in transit for other cleartext protocols. You can even have multiple different ways to provide encryption in transit for different protocols. Signal provides encryption in transit for message traffic, and it only provides encryption in transit. It does not provide other types of encryption (e.g. encryption at rest) for your messages.
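(To make the transport-tunnel vs. end-to-end distinction concrete, a toy model; the classes and names here are made up, not any real API:)

```python
# Toy model (made-up classes, not any real API) of transport-only encryption vs. E2EE.

class TransportOnlyServer:
    """The encrypted tunnel (e.g. TLS) terminates at the server, which sees plaintext."""
    def relay(self, tunnel_ciphertext, decrypt_tunnel, reencrypt_tunnel):
        plaintext = decrypt_tunnel(tunnel_ciphertext)  # server can read the message...
        self.could_log_or_disclose(plaintext)          # ...and log it or hand it over
        return reencrypt_tunnel(plaintext)

    def could_log_or_disclose(self, plaintext):
        pass  # stand-in for whatever the operator (or a warrant) wants done with it

class E2eeServer:
    """The payload itself was encrypted with keys that only the two endpoints hold."""
    def relay(self, e2ee_ciphertext):
        return e2ee_ciphertext  # opaque bytes; nothing useful to read or disclose
```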
You don't understand the proposal, you don't understand even the very basics of any of this.
Sporesword t1_ja0qkhz wrote
Governments worldwide would like civilian use of encryption to be illegal.
5thvoice t1_ja0wsaz wrote
Of course Signal encrypts data at rest. Why would they want other apps installed on your phone to be able to snoop on your messages?
easyjimi1974 t1_ja12j14 wrote
100% would have sufficed.
ant1992 t1_ja19il7 wrote
Just download the apk and install it manually
tickleMyBigPoop t1_ja1a2ti wrote
Laughs in PGP
Prestigious_Push_947 t1_ja1fuo9 wrote
You should look deeper into the app. It has been reported repeatedly that content is available on the endpoint either in cleartext or in a way that can be trivially recovered. Signal themselves have repeatedly stated that they do not intend to be secure against someone in control of the device. Their encryption on the device is not hardened, and it's not meant to be. They recommend using robust full-disk encryption to secure your messages at rest.
Ivanoff91 t1_ja1m95n wrote
Imagine a new government-compliant messenger: 60% of the time, we don't read your messages!
powersv2 t1_ja1p0ss wrote
Man i adopted it early and its paid dividends.
1wiseguy t1_ja1uvc9 wrote
The UK government is a vague organization.
Do you know who that is? Can you trust them with your information? Would you send them a copy of every message that you send anybody?
No, no, and no.
1wiseguy t1_ja1v48w wrote
If authorities want access (to communications), and the messages are not securely encrypted, they will find a way.
It's not difficult to encrypt data such that others will not find a way. But sometimes people don't.
Sharknado4President t1_ja1ymcy wrote
Didn't say it was a good idea. Was simply refuting the parent post which said that encryption means nobody can read it.
Bogus1989 t1_ja1z1g3 wrote
Imagine: having parents monitor their kids. It's not that hard. If I saw my kid with Signal on his phone, I'd wonder why he has it. No point in trying to limit Signal; another one will just pop up.
LMAO, also (no offense to any police in here), but the whole "keeping a key only for when its use is approved" thing. 🤣 What a joke…
Just like they did with Cellebrite?
“MumblemumbleSomethinggreatergood”
Abuse.
Guilty until proven innocent here in the US. Witnessed it first hand myself….even if you fought and bled for your country.
drawkbox t1_ja27hqu wrote
The content is accurate, the opinion is based on the findings and additional inputs.
The less third parties you have the better the opsec.
Messaging apps also get cam/mic/location/contacts permissions, Signal is no different, one more entity with your face/voice/place/friends.
You are free to keep using Signal, but trusting a third party messenger is really, really risky.
Agree to disagree.
ArcherBoy27 t1_ja2h05o wrote
Yes I know. I was just stating "just" encryption in transit isn't E2EE (I.e. https).
E2EE means encrypted from end to end: from when it is written and saved on the source device to when it is received and read on the destination. Anything other than you that can read messages before you do, without your permission, and potentially send them off somewhere breaks E2EE, and that is what they are proposing.
> It does not provide other types of encryption (i.e. encryption at rest) for your messages.
Going to need a source on that "no encryption at rest" claim; nothing I can find suggests that. I have found some claims that it can be broken with physical device access, but if the device itself is encrypted then it doesn't matter.
spektre t1_ja2lfyq wrote
Yeah, if the camera automatically unlocks the door.
spektre t1_ja2m0hx wrote
What they are saying is that they can't protect against someone for example forcing you to unlock it, installing a keylogger, or taking screenshots of your conversations. Because that would be a pretty hard problem to solve.
spektre t1_ja2oz3h wrote
That doesn't matter at all from a systematical perspective.
It could be your plans to topple the genocidal dictator, it could be your drug trafficking business, it could be your vacation photos you don't want a phone repairman getting access to.
Shaming people for wanting privacy is not cool.
1wiseguy t1_ja3aft4 wrote
There are 3 ways an adversary can read your encrypted message:

1. Somebody gave them the key, either deliberately or by mistake.
2. The algorithm is faulty or weak.
3. The adversary has the resources to do a brute-force attack. That relates somewhat to #2.

All of these can be avoided, if you do it right. Sometimes people don't do it right.
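(Back-of-the-envelope on #3, assuming a wildly generous attacker with a million machines each testing a trillion keys per second:)

```python
# Rough brute-force math for #3 (assumed attacker: 1e6 machines x 1e12 keys/sec each).
keys_per_second = 1e6 * 1e12
seconds_per_year = 60 * 60 * 24 * 365

for bits in (128, 256):
    years = 2**bits / keys_per_second / seconds_per_year
    print(f"AES-{bits}: ~{years:.2e} years to exhaust the keyspace")

# Prints roughly 1.08e+13 years for AES-128 and 3.67e+51 years for AES-256,
# so in practice #3 only works when #1 or #2 already went wrong.
```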
ArcherBoy27 t1_ja3n0ub wrote
Relevant XKCD
Prestigious_Push_947 t1_ja4max0 wrote
You're just speaking nonsense. You don't understand these concepts at all. I'm not sourcing anything for some high school kid who's taken one IT class and thinks they're hot shit because they know what "CIA" stands for. There are loads of people reporting "vulnerabilities" in Signal because the on-device data is trivially accessed. Signal's response is consistently and repeatedly that their intent is not to provide on-device security and that you should use FDE. This is an easy Google search away for you.
Prestigious_Push_947 t1_ja4n234 wrote
No, this is not a relevant XKCD, you dunce.
Prestigious_Push_947 t1_ja4nxer wrote
This really depends on a lot of scenarios. For example, if you use Signal for desktop on a Windows system without Bitlocker, your message content can be recovered easily without forcing you to unlock the device or installing any kind of keylogger. If you have FDE enabled, but your device is unlocked, then your message content can be retrieved. No keylogger or additional tooling is necessary. Signal is as secure as your device is, it provides no additional security for your messages. They have repeatedly classified bug reports for weak local security as "Won't fix" because they are up front about the fact that their intent is ONLY to secure messages in transit.
ArcherBoy27 t1_ja4uvhh wrote
Great source. Nothing I could find mentioned anything like that. You are giving me no reason to believe you.
No need to be aggressive, I asked for a source since I couldn't find one to match what you are saying.
Besides, this has nothing at all to do with client-side scanning, which is the reason Signal said what they said.
Forget it, I'm not spending time with someone that can't be civil.
ArcherBoy27 t1_ja4v4ju wrote
I'll agree to disagree.
Not sure there was a need to name call. Completely uncalled for. Comment with respect or not at all.
[deleted] t1_j9wcge7 wrote
[deleted]