
Content Moderation Quotes


"Senator, our tools for identifying that kind of content are industry-leading."
"In today's environment, the conversation around free speech and content moderation is essential for the health of our democracy."
"We've always had processes for moderating content, as far back as human civilization goes."
"Matt implies it's YouTube's fault that predators are on the platform, and therefore, that YouTube deserves to be punished."
"If someone's going to remove your content, they need to give you an explicit reason why."
"It's an impossible task to actually moderate billions upon billions of posts effectively."
"Section 230... says that websites and platforms are allowed to engage in moderation without affirmatively taking on the burden of having notice of everything that is on their platform, and being liable for it."
"What's acceptable on social media is in a constant state of change, and behind each decision about what can stay and what should go, there's a big, expensive, often hidden industry of content moderation."
"You can't have a usable platform if you don't do some sort of content moderation; otherwise, every platform will just be porn and diet pills. This is expensive, this is hard."
"These are things that demand big investments. There is no cheap solution; anyone who promises that there's a cheap, easy way to solve these problems is just lying."
"Any new social media company should really take heed that your institution of content moderation strategies that allows you to be more inclusive, much more discerning, are going to make you a star in this next round of the internet."
"What makes this so hard at the end of the day is that society itself is always reevaluating what is appropriate, what is harmful, and what can be defended as free speech."
"We've got a big problem right now with social media companies and their failure to moderate content and the explosion of hate on Twitter."
"We have two responsibilities: to remove content when it can cause real danger as effectively as we can, and to fight to uphold as wide of a definition of freedom of expression as possible."
"The way to do this is to give American citizens a private right of action. Let them sue over the content moderation decisions as it relates to censorship. That would clear this up, I think, really quickly."
"Literally promoting the killing of people, isn't that some sort of violation of terms of service?"
"I would draw the line far on the side of being permissive with respect to content but I would not draw it all the way to 100."
"However, it doesn't mean the data don't imply things, and you know I think YouTube ought to think very carefully about whether it wants to confront two people who have the proper credentials."
"Matt became the sheriff and his audience were all deputized to go find more of this content and aggregate... it's a creepy thing to say but Matt should have assumed that some of his audience might not have been trying to help."
"We removed the targeting category. We removed pseudoscience as a category."
"The solution was to pressure platforms to enforce their rules both by removing content or accounts that spread disinformation and by more aggressively policing in the first place."
"The solution she concluded was to pressure platforms to enforce their rules, both by removing content or accounts that spread disinformation and by more aggressively policing it in the first place."
"The previous 20 years have been...saying what we were going to do and not doing it."
"I believe there is a line in which we do ban content and that censorship is not inherently evil."
"Wow, they're actually removing the content that's bad."
"I'm hopeful that Elon Musk is going to bend Twitter's content moderation towards a greater embrace of free speech."
"If they're platforms, they can't be selecting and banning people."
"This goes against the basic principles of behavioral learning... if creators don't know why their videos are unacceptable then they can't fix them."
"Maybe we do want to take down the Trump tweets even if we don't like them because they are inciting violence."
"There's no appeal process. There's no real appeal process. And so your post gets blurred out and then it gets buried into oblivion by the algorithm."
"In order to facilitate content moderation, these big tech companies create tools."
"Apologies for the comments section. YouTube was able to demonetize people at the blink of an eye, delete channels at the blink of a [ __ ] eye, but they can't manage this damn spam thing."
"The YouTube kids app doesn't have any cursing."
"35% of the URLs flagged were actioned under remove, reduce, or inform policies."
"Section 230 avoids what has been called the moderators dilemma ensuring that websites are free to remove objectionable content without increasing their legal risk."
"The First Amendment applies to the government... it does not apply to restrict content of private companies."
"I think these platforms have a moral responsibility to filter out content like that... without fearing increased legal liability."
"Nothing is off-limits, but if you're just trying to spew a bunch of hate or [__], then people are gonna flag that."
"What's the right remediation here? Do we interstitial the video?" - Employee grappling with moderation decisions.
"YouTube will no longer allow videos that maliciously insult someone based on protected attributes such as race, gender identity, or sexuality."
"The thing people would not expect is that if there's any algorithmic stuff going on at Twitter or there any human moderators who are being biased in their decisions you would not have that on parlour."
"Twitter executives divided into content moderation and PR camps."
"If YouTube could just tell us what they don't like and ask us to remove it, we'd gladly do it."
"Removing pornographic and kink books from middle schools is a very, very good one."
"Twitter failed to take action allowing the content to be viewed over 167,000 times."
"Deplatforming and taking content down is a very different step from not recommending that content."
"Reviews protect consumers from mods that may not contain the kind of content they want."
"I know some people don't always want to hear the cursing, which is fine. It's a lot."
"There's no list of words or key words that is going to go into our classifiers making an a priori decision about whether videos monetized or not."
"It's sad you know but there's a lot of crazy content on YouTube and I do think that sometimes YouTube does pick and choose who they want to punish because again."
"Currently got a strike on YouTube for promoting gambling when I've never gambled in my life."
"YouTube finally went scorched Earth on this type of content."
"Wow, and that's the perfect thing. It's like, because it's like if they call me like a real slur or something I couldn't even show that."
"It's not about proactively cleaning it up, it's just deciding whether or not it was valid once something has already been removed."
"Showing Musk exactly why certain content moderation and verification systems are in place has been extremely satisfying."
"Six Collective seconds of similar content cannot be weaponized into the deletion of an entire popular Channel unless they on mostly apologize when the accuser has a track record of then weaponizing the apology to inflict as much damage as possible."
"Inappropriate or explicit content has absolutely no place in our game."
"This guy needs to be stopped. This guy needs to be stopped."
"Our approach will require online platforms to eliminate illegal content."
"OpenAI released content moderation tools based on large language models."
"You have to draw a clean line that says you allow it all, allow it all the only things you can't do or direct threats of violence and daxing people and shit like that but you just you have to take a principled stand for free speech."
"If there was a way to handpick a specific amount of Reddit users or Twitter users to get rid of, you could literally decrease the amount of dogshit comments by ten thousand percent."
"There's not a YouTube channel in existence that I know of that puts as much care into moderating the comments."
"They're trying to age-restrict me twice for the same video, three times after they've said, 'Oh, after review, we found that video okay, and now they're like, 'Nah, now it's not okay.'"
"No free speech absolutism and Protect free speech you can't moderate content and Protect free speech at the same time."
"There is no hate, malice, racism on the site."
"Elon Musk paid 44 billion dollars to discover what we already knew: content moderation is messy."
"Profanity is being cut down and as you know we swear like sailors here, at least I do. I don't know Joe and, unfortunately, it's an issue with YouTube."
"Anything with the word war or room possibly or Afghanistan could be offensive to YouTube's little algorithm."
"Lines and veils are just a tool to make sure that the content in the game doesn't make anybody uncomfortable."
"There are real problems with the abuse of these platforms. We do want things like calls to genocide from people in positions of power to be removed." - Edward Snowden
"Plague moth claims his Discord server never had uncensored gore, contradicting reports."
"I'm not anti-free speech. I'm not in favor of him being banned. I think almost everyone should be allowed on YouTube."
"Your server should in fact have an 18 plus rating or in general be removed."
"Unfortunately, YouTube moderates and cuts content that's like 18 plus. I think that's why horror videos probably get hit with it."
"I hope your channel gets shut down. Yeah, like what do you call that?"
"The comments have now been removed from the video, which really just says it all."
"That was by far the bloodiest video we had ever filmed. And surprise, surprise, we had to censor it. Fighting YouTube to try to get it monetized, and then I just had to end up censoring it. I just felt like a piece of sh*t by the end of it."
"Deplatforming works to limit the spread of harmful content."
"I even clean up my comments. I have a huge filtered list inside of YouTube where I remove profanity, I remove hate speech and racist comments. I remove all of that stuff and I try to keep my comments as brand friendly and as PC friendly as possible."
"These are your own rules on your own platform these go against the rules on your platform that's why I'm asking you if you had if you said listen we allow everything but that's not what your content rule say and that's why I'm asking you why are they still there."
"Be cleverer than that. Let the stupid content remain stupid without commenting, and then it'll disappear like it never happened."
"Parents should be able to control the content their kids consume, not have the option taken away from them. This is Disney."
"If CSAM content is posted on X, we remove it and now we also remove any account that engages with CSAM content, whether it is real or computer-generated."
"You don't mess with people's platforms like that willy-nilly. You don't just tell your audience to report people's content and get it taken down because you're upset by it. That's not how this [__] works."
"We very much believe that heavy-handed content moderation on a service like ours would kill the business and would be the wrong thing for the world."
"YouTube eventually removed most of the videos, however, damage had already been done."
"What steps are you taking to improve the AI or whatever else you're doing to limit this content? For example, if you search for certain search terms, we do direct you on TikTok to safety resources. That's one of the things we have done, and we will continue to invest in this."
"The vast majority of our users come to our platform for entertaining, safe content, but there are people who do spout some dangerous misinformation, and we need to take that very seriously, invest in it, proactively identify it, and remove it from our platform."
"I'm not saying you have to agree with em but if I see any hate in the comments I will not hesitate to delete your comment and block you from my channel."
"Removing content like that would represent a significant incursion into traditional boundaries of free expression in the United States."
"The various value judgments that are embodied in its content moderation standards... embody a judgment of 'this is material we think might be of interest to our users'."
"The game-changing feature here is where it will automatically remove your bad takes."
"We've gone from proactively identifying and taking down about 20% of the hate speech on the service to now we are proactively identifying about 94% of the hate speech that we end up taking down."
"The whole reason that the content moderation is what it is has been to satisfy the advertisers."
"We do have content moderation, it actually is built in."
"We're finally cracking down on content that features minors."
"We really need more people on the inside reporting this content."
"Content that praises, glorifies, or encourages viewers to imitate anorexia or other eating disorders... should be removed."
"If content is actively hurting minors, it should be removed or at the bare minimum age restricted until it has changed."
"We are investing more and more every day into bringing down content that is dangerous or damaging."
"We have clear content policies and when we find violations on our policies, we do remove those videos."
"Luckily though, amidst the aforementioned astronomical turmoil, there are some areas where most people agree about what constitutes offensive content, especially when it comes to shielding children."
"Tik Tok told the journal that it bans the sale of underage modeling content."
"We have an API called the Vision API and it is able to do things like object detection and also able to flag explicit content."
"Placing cautionary labels and ratings on commonly agreed upon troubling content is appropriate because such markings allow others to freely choose what they like to experience."
"It's a complex topic, this is why it's going to be a very difficult thing to try and even moderate on YouTube."