Security NOT through obscurity against SEO?

Thanks! That’s a pretty long thread, but a very insightful one. So it seems the challenge in creating a successful algorithm is less about the technology and more about acceptance: people need to shift their thinking on content moderation, from focusing on the content itself to focusing on the behavior, before there can be any consensus on what constitutes an effective moderation tool.

Can you enumerate the ways in which they differ? So far the challenges look quite similar to me: SEO garbage has patterns that can be detected; it’s just that some people would protest loudly if you filtered on them (even though doing so would benefit almost all users). Perhaps an approach like Block Party’s could work, i.e., stashing the filtered sites in a box that users can check. That would avoid censoring content, let users compare the filtered sites against the search results, and let them report mistakes.
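To make the “stash rather than censor” idea concrete, here is a minimal sketch, assuming some behavioral spam detector already produces a `spam_score` per result (the score, the threshold, and the field names are all made up for illustration, not anyone’s actual system):

```python
from dataclasses import dataclass, field

@dataclass
class SearchResult:
    url: str
    # Hypothetical score from some behavioral spam detector (not specified here):
    # 0.0 means the page looks clean, 1.0 means almost certainly SEO garbage.
    spam_score: float
    matched_signals: list = field(default_factory=list)

def partition_results(results, threshold=0.8):
    """Split results into 'shown' and 'filtered' instead of deleting anything.

    Filtered results stay available so users can open the box, compare them
    against the main results, and flag mistakes.
    """
    shown, filtered = [], []
    for r in results:
        (filtered if r.spam_score >= threshold else shown).append(r)
    return shown, filtered

def report_mistake(result, feedback_log):
    """User feedback loop: record that a filtered result was actually useful."""
    feedback_log.append({"url": result.url, "verdict": "not-spam"})

if __name__ == "__main__":
    results = [
        SearchResult("https://example.com/honest-review", 0.1),
        SearchResult("https://example.net/best-top-10-cheap-deals", 0.93,
                     ["keyword-stuffing"]),
    ]
    shown, filtered = partition_results(results)

    feedback = []
    if filtered:
        report_mistake(filtered[0], feedback)  # user flags a false positive

    print(len(shown), "shown,", len(filtered), "stashed for review")
```

The point is only that nothing gets deleted: the UI would render `shown` normally and `filtered` behind a checkbox, roughly the way Block Party stashes filtered mentions for later review.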

Or are there other issues that I have overlooked? For example, is it that the content on websites is much longer than tweets and you lack the computing power, as @Josh mentioned before regarding snippets?