Forum - View topic: NEWS: WIRED: Buffalo Mass Shooting Victims' Families Sue 4chan, Good Smile Company, Others
Note: this is the discussion thread for this article
ViviP
Posts: 73
I wasn’t talking about racially motivated mass murder, I was poking fun at people trying to turn a racially motivated mass murder into capital for political sports. But I suppose sarcastic and humorous tones in a fairly negative environment can be perceived as disrespectful, so I get you.
BadNewsBlues
Posts: 6076
Yeah, which leads to a never-ending battle involving things such as racism, sexism, and intolerance towards certain groups. And that’s even if these terrible ideologies are allowed to be challenged, as countries like Iran, North Korea, Saudi Arabia, and Afghanistan perfectly encapsulate what happens when the needle moves too far to the other end of bad. People who believe in the aforementioned things don’t need a platform with which to explain or rationalize their hatred. Especially when we have about 3,000 years of history demonstrating what happens when you give them one, and countless lives lost as a consequence.

Except the Great Replacement theory the guy used to justify his actions is something that conservatives have largely promoted. They’re also the ones who constantly complain about legit racists being called out, because you’re somehow not supposed to do that, as it’s unfair or some bullshit. But these same people whine about LGBTQ people and drag queens being pedophiles and groomers based off not much of anything.
ChirashiD
Posts: 127 Location: WA
In general, I like the idea of companies that run social media communication platforms being held liable for the actions and speech of users of that platform, but only under the following conditions:

1) The user is an employee, site administrator, representative, official spokesperson, columnist, or partner of the company.

2) Regular users must have agreed to one of those user agreements that stipulates their speech becomes intellectual property of the site or the company that owns it. Usually there is no such agreement for regular users because no company wants to be held liable that easily for someone else's crap.

Unfortunately there's also the issue of censorship. If users are solely responsible for their words rather than the site taking the blame for any incidents users cause, I think a reasonable compromise is to allow the site a certain level of policing ability and control over users' speech. It is a protective measure for both abusive users and their victims, whether those victims are directly or indirectly harmed. But for places like 4chan, which are known for completely lacking any sort of control over speech and expression, with that being the reason people contribute to it, and where it's unclear whether any liability rests on the site owner for incidents, perhaps it's time for some precedent to be set via litigation or legislation.
Rentwo
Posts: 184
A lot of websites simply wouldn't be able to exist if they became liable for what their users did. YouTube being held legally liable every time someone uploaded copyright-infringing content would be a nightmare and make the site unsustainable, because no company wants to deal with that headache and litigation. If something like Section 230 were ever repealed, the internet as we know it would cease to exist. There are advantages and disadvantages to both, but most people seem to agree laws like Section 230 are necessary for the internet to function as it does now and that the benefits outweigh the negatives. Imagine a world where all someone had to do to get Anime News Network sued was post bad stuff on the forums. Nobody who owns or runs a site would want something like that.

This isn't true at all. 4chan has rules just like any other site. Their TOS and guidelines are publicly available for anyone to read. In terms of actually illegal content and abuse, you'll find a lot more on sites like Instagram, Facebook, Twitter, Tumblr, Telegram, and Reddit hosting things like child abuse images, drug and human trafficking, and other actually illegal activities. 4chan is probably the worst place to post illegal stuff, as you'll easily be caught and reported compared to other communities, which rely on automated and generally inadequate content removal systems due to the volume and scope they have by comparison.

As for content and speech that isn't illegal but that people find personally immoral or wrong, that's all subjective, so it's pretty hard to argue either way. But most sites do have varying limits of control in determining what is and isn't allowed on their platform. For example, for the longest time a lot of Black creators would choose to stream on YouTube over Twitch because Twitch has a hard ban on the n-word, and it was inconvenient for a lot of folks who casually use the word in their everyday speech because they'd end up getting banned from Twitch. YouTube, on the other hand, allows it to be said, so people like Etika and IShowSpeed found that platform more appealing and welcoming to their streaming style. Whether you personally side with Twitch or YouTube's view on the subject is up to you, but I think it's good there are options out there, and if people don't like one versus the other then they can personally choose which to watch.
Philmister978
Posts: 322
It's not lacking in control (people have been banned for minor infractions and flame wars, believe me, I have first-hand experience after flaming on /co/). It's that it's incredibly lax and inconsistent. Really, the same thing applies to most social media sites when they aren't being authoritarian towards one side or the other. YouTube is a great example of incompetent moderation: they let people with racist, disgusting, and misleading content slide, but copyright is a huge no-no, despite someone promoting armed murder being a lot worse than someone posting a song without the original creator's consent, imo (it doesn't help that their DMCA system is borked and abusable by anyone who gets easily offended).

4chan is pretty much that. The moderation is not consistent on a day-by-day basis. One day, illegal content is a-okay; the next, it's an automatic ban. To say nothing of how some mods have huge biases against content that personally offends them, or how some boards are pretty much dead to the point where they aren't moderated at all because of how pointless it'd be in their eyes. Not even /pol/, the second most controversial board on there next to /v/, has a consistently applied moderation system, and its users have complained about it. Most of its users are right-leaning (even those from outside the US, as it's the only board to use a flag system), and they've noted how bad the moderation there is. It really makes one wonder how the site hasn't imploded in on itself by now.

Generally speaking, for 4chan, the red boards are the more NSFW parts of the site; not that that really stops the blue boards from posting some nasty things too. And that all comes down to how incompetently moderated the site is.
Los Nido
Posts: 132
What exactly do you mean by "promoting armed murder"? Anything actually encouraging violence is against YouTube's TOS, in addition to also being against US law. Emphasis on the "actually" part. I feel a big problem with social media is there's a lot of hyperbole out there in the way people react to content on it.

Personally, I'm of the mind that if something isn't actually illegal, then it's fine for a site to choose not to moderate it. Technically, being racist isn't actually illegal; it's what you do with those feelings that matters, like discrimination, threats, or violence. You can't automatically arrest someone for being racist, or even for saying something racist, because that would mean stuff like comedy would be illegal. So you can take down those clips of Family Guy or South Park racist-joke compilations for copyright infringement if Comedy Central or Fox didn't upload them, but just because the jokes in them are racist isn't, I think, reason enough to do so or to disallow them or other content like that on YouTube. There's also the journalistic angle, where news channels reporting on and showcasing racist incidents or content should probably be left up as well, for documentation purposes and exposing issues to the public.
Moses34
Posts: 8
You're right, I did not read his delusional ramblings. However, I'm not surprised. I agree with Philmister978 on this point.

As for the 4chan stuff, you're also correct in the sense of it lacking shock content and dedicated shock boards. But that is an obvious issue that is thankfully gone. A much more deep-rooted issue that's impossible to control is the problematic ideology that has slowly spread throughout the site over the last ten or so years, regardless of political inclination. Sure, we don't have boards spammed with illegal content anymore, but now every single board, regardless of how "innocent" it is, has radicalized posters from all possible sides inciting violence against whoever they disagree with. A reflection of today's society, of course, but a manufactured one nonetheless, and one that has been rampant for the last ten or so years. Regardless of where you go or what board you look at, you'll find the same radical lingo and dogwhistling left and right, as even asking for gardening tips becomes inherently political to them. I think it's definitely fair to criticize that. "But it used to be worse!" isn't an excuse to let radicalization slide when it leads to disasters like this.

But back to the lawsuit: it won't affect any of the above, because there isn't really anything to be done about that. Nor do I think the families had anything concrete in mind regarding it, other than bringing attention to the issue and to the websites the guy browsed. No amount of moderation would stop that content or the flow of it anymore.
medicinodestiny
Posts: 33
It is kind of strange Twitter isn't named in this lawsuit, because if you didn't specify 4chan I'd think you were talking about them. I still remember Kenta Shinohara deleting his Twitter after being attacked by politically obsessed crazies, and other mangaka or artists who've been threatened with harm over this stuff. Some even driven to suicide... sigh.

The only time I would ever support censorship is if it's equal. As long as people could agree radicalization of any kind is bad, and not just the certain kinds they don't like, then maybe it's worth discussing. But there are a lot of people who got it into their heads that inciting violence and hate is okay as long as it's directed towards someone they think deserves it. A lot of people who tend to make a big issue about radicalization have been radicalized themselves. Either they don't realize it or they've jumped through a lot of hoops to explain why them threatening and encouraging violence is different from when other people do it. "Fighting the good fight" and all that. But I have little faith that censorship and regulation wouldn't favor one side more than the other, because a human has to do it at the end of the day, and humans are flawed.
ATastySub
Past ANN Contributor
Posts: 663
Thank you for both-sidesing the response to a racist mass shooting as just as radical as the racist mass shooting. Truly, horseshoe theory is the one guiding principle we should all strive to follow. I guess we have to wait for an equivalent to white supremacist Great Replacement rhetoric, and its accompanying violence, before anyone can do anything about it; otherwise it's just too unfair.
FinalVentCard
ANN Reviewer
Posts: 533
Yeah, I remember when the 1921 Tulsa race massacre happened and the Holocaust went down, we stopped all of that with a healthy debate.

I'd like to remind folks to be careful about arguing that hate speech should be allowed because muh freeze peaches. We have rules against that in the forums.

Anyone claiming that 4chan is "better now" than it used to be has a lot of threads about open nazism, transphobia, misogyny, and queerphobia openly out and about to answer for. Hate speech isn't considered free speech; if GoodSmile wants to make its money off of a website that spews hatred, they can answer to the people who suffer from it.
ChirashiD
Posts: 127 Location: WA
I thought the key thing about 4chan was that everyone uses aliases or the anonymous tag, which, yeah, I know isn't a feature unique to 4chan. As long as you're able to sign up to your preferred social media site without providing any identifying info, the police won't come knocking on your door no matter how illegal your words or images are. Your IP may reveal your general location, but there are ways around that too.
Fedora-san
Posts: 464
All anyone needs is your IP and they know who you are. They can contact your ISP, ask them for the account and person tied to it, and go from there. That's how people get dinged for torrenting. Every website knows your IP when you access it. Sure, you can use VPNs or proxies, but most sites can detect those and block access, and in cases where they do allow them, the police can just contact the VPN provider to get the info in the case of illegal activities, which every VPN forbids you from using their service for to begin with.

You're never truly anonymous online. Even if you do something clever like log into your neighbor's wifi and post that way, there's still your MAC address tying your specific device to that network, which they can look at. But most people don't jump through all those hoops at all and just post normally. People get arrested all the time for making threats on the internet, even on throwaway accounts with no identifying information.
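To make the "every website knows your IP" point concrete, here's a minimal sketch of what site operators actually have on hand. The log lines, the `client_ips` helper, and the regex are all hypothetical illustrations (using reserved documentation IPs), not any real site's logging, but most web servers write something very close to this common log format for every request:

```python
import re

# Hypothetical access-log lines in common log format; the first field on
# each line is the client IP the server recorded for that request.
LOG_LINES = [
    '203.0.113.7 - - [14/May/2022:12:00:01 -0500] "GET /thread/123 HTTP/1.1" 200 5120',
    '198.51.100.22 - - [14/May/2022:12:00:03 -0500] "POST /reply HTTP/1.1" 302 0',
]

# Matches a dotted-quad IPv4 address at the start of a line.
IP_RE = re.compile(r"^(\d{1,3}(?:\.\d{1,3}){3})\s")

def client_ips(lines):
    """Return the client IP recorded at the start of each log line."""
    return [m.group(1) for line in lines if (m := IP_RE.match(line))]

print(client_ips(LOG_LINES))  # ['203.0.113.7', '198.51.100.22']
```

Those recorded addresses are the starting point of the subpoena chain described above: the site hands them over, and the ISP (or VPN provider) maps each one back to a subscriber account. A VPN only changes which address lands in the log, not whether one does.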
FinalVentCard
ANN Reviewer
Posts: 533
Even other 4channers can sus each other out, as seen in the detective work anons put into tracking down an animal abuser in the case of Dusty the Cat. So yeah, all of 4chan's claims of anonymity are hot air.
GhostStalkerSA
Posts: 425 Location: NYC
So /pol/ was explicitly implemented as a containment board by moot prior to him selling the site: all the nasty white supremacist political stuff would be quarantined there and prevented from spreading to the rest of the boards, where people were just discussing anime or cosplay or vidya games or mecha or hentai or music or weapons or trad games or quests or yuri or whatever. 4chan is a lot more than /b/ and /pol/, even if those two boards get the most attention. It’s become clear in the last half decade that this containment strategy has failed, even after some of the most right-wing stuff hived off to 8chan/8kun once discussion of GamerGate was banned by moot; /pol/ has pretty much infiltrated every other board, slowly but surely. Sure, the jannies still hammer stuff that’s egregiously off topic, but the low-level stuff stays under the radar. One of the biggest problems was that /pol/ originally started off with a core of actual racists plus anons posting “ironic” racism, and then that actual racist core took over and attracted more of them as the “ironic” posting turned into actual racism and spread its noxious ideology. It doesn’t help that Hiroyuki Nishimura is notoriously hands-off with his ownership and has basically delegated moderation to one of the head jannies, who goes by RapeApe and who is apparently responsible for /pol/ content getting more prevalent over the years, starting during the lead-up to the 2016 election. However, shutting down /pol/ would arguably be worse, because then the thin veneer holding back the tide would break, and a flood of that outright crap would flow over the breached dam into the other blue “SFW” or otherwise productive discussion boards.
It’s been amusing (read: frankly off-putting) watching 4chan morph from a misunderstood boogeyman with a very edgy veneer and core in the mid-2000s (when I first started browsing it in college), one that spawned memes central to the early internet like Rickrolling, Chocolate Rain, mudkipz, lolcats, Pool’s Closed/Habbo Hotel, and the Anonymous protests against the Church of Scientology, and that drew breathless news reports which were memed to hell and back whenever things invariably broke containment of the internet, into something decried as one of the main nexuses of white supremacy, far-right thought, and radicalization since the 2016 election. I had already stepped back from 4chan around the time moot sold it to Nishimura, but it definitely got a lot worse after that.

I think one of the first times I realized this (besides the random shock imagery and porn and sometimes actual CP posted on /b/) was when I saw an anon on /b/ in the early 2010s attempt to mobilize right-wingers to support Hal Turner, an internet right-wing radio host who was being targeted by the Feds for posting threats against a number of federal judges in New Jersey dealing with a divorce case or something involving him, and who eventually went to jail for it, without knowing that Turner and his radio show were one of the targets Anonymous raided with spam calls and DDoSes in the late 2000s, before Anonymous turned its attention to Scientology and the like. Sure, that anon was shouted down, like a lot of the people who try to get the horde of anons to do something for their own benefit that doesn’t involve the lulz (remember, neither /b/ nor 4chan is your personal army and all that), but it still struck me. The turnover in anons in half a decade, and people not lurking more to understand board culture, made me realize how different a site like that could get in such a relatively short time.
Also, thinking back on this, it makes me realize how much 4chan (and its originator in Something Awful, which itself has become a lot more liberal over the past five years or so as a lot of the Trump supporters were banned and Lowtax was forced to sell the site after a user revolt following credible accusations of domestic abuse; plus 2chan) laid the memetic backbone of the early, pre-Web 2.0 internet, and how that ironic, snarking, anonymous meme culture ultimately spawned the right-wing movements that infest the web today. If I were to mark a turning point besides the run-up to the 2016 election, I would likely say GamerGate and the fallout of moot banning discussion of it, which led a large number of right-wing anons to revolt and further radicalize, leading to where we are today. I know a number of interesting books have been written on this radicalization pipeline from memes to right-wing nastiness, and I really would love to read more about the subject and see whether it squares with my memories of the early internets and how imageboard culture changed over time in the 2010s.
Powered by phpBB © 2001, 2005 phpBB Group