This might be a really stupid question, but why is Discover updating to a lower version? Is there anywhere I can read up on why this is the case?
P.S.: Yes, I could absolutely have googled this, but Lemmy is about more than just shitposts and memes imo. Asking some rather noobish questions will make us appear on Google, btw.
Nah, we're already on Google, but it depends on a lot of factors. A Lemmy frontend is just another webpage, so Google will crawl it if you allow it. If an instance specifically disallows crawling, or has an incorrect/unfamiliar sitemap, it might not work.
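If you want to check this for a specific instance, Python's standard library can parse its robots.txt directly. A minimal sketch; the instance and post URL below are just placeholder examples, not a claim about any instance's actual policy:

```python
# Minimal sketch: ask an instance's robots.txt whether crawlers may fetch a page.
# "lemmy.world" and the post URL are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://lemmy.world/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for agent in ("Googlebot", "*"):
    ok = rp.can_fetch(agent, "https://lemmy.world/post/123")
    print(f"{agent}: {'allowed' if ok else 'disallowed'}")
```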
hmm, maybe that’s the case for lemmy.world. I wasn’t able to find anything from my comment or post history, even though I copied it exactly into the search.
edit: just had the idea to put quotation marks around the thing I copied, to search for that exact string, and now I found it. Yeah, it really just seems like Lemmy pages are not very popular results, so Google pushes them all the way to the bottom.
Exactly. The reason is that Google favors pages that game the algorithm over actual content. That, and popularity.
The major difference between Lemmy and Reddit is that there are many instances for search engines to crawl, compared to a single reddit.com. They likely treat each instance separately, which leads to a lot of duplicate content, and most of Lemmy isn't search-engine optimized.
Sadly, I don't see a better way to do it than for search engines to be optimized for this kind of federated platform. It's not obvious from the outside which instance is the preferred one to show to a user.
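For what it's worth, the standard signal for "this is the preferred copy" on the regular web is a rel=canonical link in the page head; whether Lemmy frontends emit one consistently across federated copies is the open question. A minimal sketch of reading that signal, using a placeholder URL:

```python
# Minimal sketch: read the rel=canonical link a page advertises, i.e. the
# standard duplicate-content hint search engines use. Assumes the page
# actually emits one; the URL below is only a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

page = urlopen("https://lemmy.world/post/123").read().decode("utf-8", "replace")
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical or "no canonical link found")
```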
I've had some luck finding content on Lemmy by forcing a specific instance using `site:lemmy.instance.domain`, but it depends on the search engine whether it's respected.
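To combine that with the exact-phrase trick from earlier in the thread, you can build the query programmatically. A small sketch; the instance domain and phrase are placeholders:

```python
# Sketch: build a site-restricted, exact-phrase Google search URL.
# The instance domain and phrase are placeholders.
from urllib.parse import quote_plus

def lemmy_search_url(instance: str, phrase: str) -> str:
    query = f'site:{instance} "{phrase}"'
    return "https://www.google.com/search?q=" + quote_plus(query)

print(lemmy_search_url("lemmy.world", "updating to a lower version"))
```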
yeah, good point about multiple domains