
[–]Jesus 4 insightful - 2 fun - (3 children)

So you're pro-censorship. Care to provide some dangerous disinfo?

[–][deleted] 1 insightful - 1 fun - (2 children)

I merely explained the legal reason for her suspension. Don't feel too sorry for her. She and others are supported (often financially) by a network of disinformation campaigns that cause people to harm themselves and others: burning down 5G towers, attacking grocery store employees while recording themselves creating problems for others in the name of freedom, and encouraging others to join a disinformation campaign that is keeping businesses closed, putting some people in hospitals with COVID, killing a few who can't recover from COVID, and helping the virus mutate. Fuck these people. They're not helping anyone. Twitter has a legal obligation to remove harmful content and people (e.g. child porn, murder, torture, promoting the spread of COVID with anti-mask campaigns, promoting violence, &c.).

[–]FlippyKing 3 insightful - 2 fun - (1 child)

You did not "merely" explain the "legal reason for her suspension". First off, "legal"? What laws did you cite, and what evidence did you provide that she broke a law? "Legal" has a meaning, and I think you used it very spuriously. What you did was state, in your own voice, that "she constantly spreads disinformation that is harmful to people", but you did not back that accusation up with anything.

Twitter is literally being sued because it did not take down CHILD PORN when the child involved contacted them to take it down. Twitter does not even seem to dispute that part of it. If I remember correctly, they only took it down when it became a "legal" (in quotes to hark back to your misuse of the word) issue for them and law enforcement contacted them.

Do you really think people could sue Twitter for misinformation and not have it thrown out of court immediately with prejudice? Why hasn't every social media platform and every major news outlet been sued by casualties of the Iraq War, or by relatives of the dead from that war? Will they be sued for taking down, or putting disclaimers in front of, anything discussing the best thing ever to happen to people with COVID: Ivermectin? Your statement is ridiculous. At least you're consistent.

[–][deleted] 1 insightful - 3 fun - (0 children)

Yes, these are natural concerns. More specifically, this is how it works:

Twitter - like other websites - responds to numerous reports of abuse daily.

Some of these reports come in the form of legal "cease and desist" requests.

If Twitter does not remove the abusive post or suspend the abusive account, one can hire an attorney to request further action (on any of a broad range of alleged abuses).

Moreover, Twitter's board and administration determine what they wish to prohibit at the website, per their policies.

Users are warned when they violate the policies, at which time, or soon thereafter, they may be suspended.

If someone wants to sue Twitter because of an abusive user, they will need to try the options above first, after which a lawsuit can potentially secure a cease and desist order against the content or user, but this is rarely necessary.

Trump's two-year Twitter suspension, for instance, was for inciting violence.

As for Wolf, she's guilty of disinformation spam and of working as a professional "influencer" spreading that disinformation, especially disinformation that can cause grievous bodily harm (GBH) and, in some cases, death. Twitter has a policy (as do other websites) that forbids this, the consequence for which is a suspension.