Pavel Valeryevich Durov
(Russian: Павел Валерьевич Дуров; born 10 October 1984) is a Russian-born Emirati entrepreneur known for founding the social networking site VK and the app Telegram Messenger. He is the younger brother of Nikolai Durov. As of 29 September 2022, his net worth was estimated at US$15.1 billion. In 2022, Forbes recognized him as the richest expat in the United Arab Emirates. In February 2023, Arabian Business named him the most powerful entrepreneur in Dubai.
Translation of the major allegation:
“The (French) Justice system considers that the lack of moderation, cooperation with law enforcement, and the tools offered by Telegram (disposable numbers, cryptocurrencies, etc.) make it an accomplice to drug trafficking, pedocriminal offenses, and scams.”
What a fucking horseshit excuse for law enforcement.
Encrypted communication should be a human right.
The issue I see with Telegram is that they retain a certain control over the content on their platform, as they have blocked channels in the past. That’s unlike, for example, Signal, which only acts as a carrier for the encrypted data.
If they have control over what people are able to share via their platform, the relevant laws should apply, imho.
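To make that “carrier” distinction concrete, here’s a minimal sketch of end-to-end encryption, where the service in the middle only ever handles ciphertext. This is not Signal’s or Telegram’s actual protocol, just an illustration using the PyNaCl library (assumed installed via pip install pynacl); the variable names are made up for the example.

```python
# Minimal sketch of the "pure carrier" model: the relay only ever sees ciphertext.
# Assumes PyNaCl is installed (pip install pynacl). Illustrative only, not any
# real messenger's protocol.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sending_box = Box(alice_sk, bob_sk.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The carrier (server) only stores and forwards this opaque blob.
relayed_blob = bytes(ciphertext)

# Bob decrypts with his private key; the carrier never could.
receiving_box = Box(bob_sk, alice_sk.public_key)
print(receiving_box.decrypt(relayed_blob))  # b'meet at noon'
```

A pure carrier like that has nothing readable to moderate or hand over; a service that can read and block channel content clearly does, which is where I think the relevant laws come in.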
I agree, but it’s not even an end-to-end encrypted messenger by default. Almost no one uses the weak, opt-in encryption, and I’m pretty sure they offer decryption to governments, considering they were threatened with a ban in Russia and avoided it.
What has encryption got to do with it?
Most of Telegram is not encrypted. There are unencrypted channels on Telegram right now hosting child pornography. Telegram never removes them.
Good. They shouldn’t.
Unencrypted channels are the ones that are easiest to trace, and the easiest ones to successfully base a prosecution on.
The most correct response is to report them to law enforcement. Unencrypted channels make amazingly effective honeypots. It’s fairly easy to bust people using unencrypted channels, esp. because people think they’re anonymous and safe. It’s much, much harder to bust people once they move to .onion sites and the real dark net away from their phone. When you shut down all the easy channels, you push people into areas where it’s much harder, almost impossible, to root them out.
Look, if you’re going to knowingly host child porn, and not take it down even after you have been made aware, you get what’s coming to you.
The idea that Telegram would somehow better serve their customers by staging law enforcement stings in unencrypted channels is completely divorced from Telegram’s core mission.
This has nothing to do with freedom of speech or the right to encrypt. Child pornography is a criminal matter. Failure to cooperate with law enforcement while providing comms and distribution to child pornographers is going to land you in deep shit eventually.
What if Telegram refuses to cooperate with law enforcement in a timely fashion to provide details of the people sharing that material? What should law enforcement do then?
At that point they’re willingly hosting it for no reason other than to host it for their customers and they’re complicit, no?
I think that holding the executives and BoD in criminal contempt of court is a good place to start.
EDIT: AFAIK Telegram doesn’t use warrant canaries.
Telegram and VK are both CSAM cesspits. Most of Russian social media has this problem, but VK and Telegram are where you’ll end up exposed to shit just by browsing.
So shut down the internet… FOR THE CHILDREN
No. You can say that about so many laws being made, but Telegram simply hosts the most vile shit.
So you’re saying we should shut down any service that allows encrypted communication? Because any service that offers encrypted communication is going to be enticing to someone who commits crimes.
No, we should shut down any service that refuses to take action against its users when presented with proof that they are distributing child porn.
Encryption only offers so much plausible deniability. Once law enforcement gains access to one of these channels, they have proof of what’s going on, and the content is all hosted by Telegram. When Telegram refuses to cooperate and remove this content, they become complicit in distributing it.
Encryption isn’t an excuse to violate the law. If Telegram’s policy is not to remove CSAM, then they are a criminal organization.
We should shut down any organization that doesn’t cooperate with police when people are breaking laws, or just the laws you want them to help enforce?
That’s… Not at all what I’m saying…
Removed by mod
^ this comment right here, law enforcement
Sounds like they meant that for ChatGPT…
lol. Technically, if they could answer it, they should be the ones investigated.
Are you honestly asking me to link you some child porn?
What the fuck is wrong with you?
deleted by creator
Yes, but… I mean, it is being used for all of that.
It doesn’t matter in the slightest.
Making a tool that provides a private communication service literally everyone should have unrestricted access to does not make you an accomplice to anything.
So is the Internet, better go arrest my ISP.
The ISP will absolutely cooperate with law enforcement, though, unlike Telegram. That seems to be the nature of the issue: there is a lack of moderation and oversight, and anonymity is not mutually exclusive with flagging nefarious activities, ideally. I REALLY am not too keen on giving safe harbor to the likes of pedos and traffickers and what have you.
I REALLY am not too keen on giving safe harbor to the likes of pedos and traffickers and what have you.
Secure communication between individuals is a fundamental right. That nefarious activities can be conducted over secure channels can never be justification for suspending that right.
I’m not sure I yet agree with that. People can have secure communications; that’s called meeting in person and in a private room. That line gets blurred with intercontinental mass-communication that ultimately is beyond the use of the average citizen and is more frequently utilized to nefarious ends. If the damage outweighs the benefits to society, then clearly a rational limit perhaps should be considered.
Ultimately, what matters is respecting the house rules; and if the house rules of France were broken, why in the world would he travel there?
That line gets blurred with intercontinental mass-communication that ultimately is beyond the use of the average citizen and is more frequently utilized to nefarious ends.
I reject the premise of your argument: secure communication is not more frequently used for nefarious purposes than non-nefarious purposes.
But even if I accepted that premise, I would still reject your argument. The underlying principle of your argument is misanthropy: humans are inherently evil. They will always choose evil, and therefore, they must never have an ability to effectively dissent from totalitarian control.
The dangers posed just by the French government greatly exceed the dangers posed by every single person who ever has or ever will “nefariously communicate” over every communications platform that has ever been or ever will be invented.
Why? They happily hand all your data over to whoever asks, and so does everyone else. That’s why they can single them out: you’re already bought and paid for.
It is, but so are phones and computers in general. Same with cars; many crimes require transportation.
As always, there’s a lot of nuance which is lost on Lemmy users.
It’s a question of exactly what Telegram is being used for, what Telegram the company can reasonably be aware of, what they’ve been asked to do, and what they’ve done.
Gotta add that “pedocriminal” thing so people don’t argue against it. Don’t wanna be seen “supporting pedocriminals” by supporting encrypted communications
I’m pretty sure that Telegram does cooperate a lot when it comes to crimes.
Pretty sure that this is a copyright thing, as always. I get several scam attempts each month on WhatsApp and over the phone, but they don’t care about people trying to steal actual money; they just care if I use Telegram to watch a TV show without letting Disney murder my wife.
The news source of this post could not be identified. Please check the source yourself. Media Bias Fact Check | bot support
TF1 (… standing for Télévision Française 1) is a French commercial television network … TF1’s average market share of 24% makes it the most popular (French) domestic network.
Yeah but how am I supposed to know what my knee jerk reaction to this story should be if someone doesn’t tell me whether it’s a left or right publication?
I can really see how this bot helps the moderation team and somehow lowers their volume of work as they’ve claimed multiple times for keeping it around despite everyone hating it for some reason or another.
Everyone hates it because it’s bad data.
Edit: no, strike that, it’s not even data, it’s just one guy’s opinion.
What happens when the bias checker is biased?
The mbfc site should not be used for anything. It’s just the subjective opinions of the site owner (who misleadingly talks about “we” and “our” on his methodology page), aided by a few unknown volunteers who do some of the “checking”. The site claims to be objective, but there have been enough examples to show that it isn’t (e.g., it says that Fox News is as trustworthy as The Guardian, or that CNN is somehow center-left).
The so-called methodology that is used is just a lot of words that boil down to “several facets were checked by a human, that human gave a subjective rating to each facet, and we then count up those subjective ratings and claim to be objective because we use a point system”.
For checking the trustworthiness of a source, I’d say the mbfc site is about as useful as using UserBenchmark for choosing a CPU. Yes, it’s easy to read and more convenient to use than other sources, but it’s also a load of horseshit, and unless you drill down into the underlying “data”, you’re just going to draw the wrong conclusions because of how misleading the site is.
Who the hell writes a bot to fail loudly like this? If you can’t identify the source, shut the fuck up.