User talk:Enki

From Encyc

Georgia's 6th congressional district[edit]

Encyc: Georgia's 6th congressional district is a congressional district in the U.S. state of Georgia. Georgia's 6th congressional district has existed since January 3, 2013.

Wikipedia: Georgia's 6th congressional district has existed since the 29th Congress (1845–1847), the first Congress in which U.S. representatives were elected from districts rather than at-large. Georgia gained a sixth U.S. representative for the first time in the 13th Congress (1813–1815). --TheWingedWarrior (talk) 09:47, 5 January 2022 (EST)

I took out the contested statement for now. Hopefully the original author will return and clear things up. Enki (talk) 13:53, 5 January 2022 (EST)

hi.[edit]

I would like to have moderation turned off, if you could? If not, I understand. Thanks. Piki877778 (talk) 21:36, 30 August 2023 (EDT)

Done. Thank you for your contributions. Enki (talk) 13:54, 31 August 2023 (EDT)

Why doesn't Google push rankings?[edit]

In my years working on websites, I have managed to learn a bit about search engine optimization (SEO).

The current database has plenty of links directing Google's crawlers to dead ends (the pages which are still to be created, shown as red link text). Google doesn't like wasting its crawl budget, so those dead ends will encourage Google to crawl your website less frequently.

I recommend creating a very simple robots.txt file and blocking all crawlers from those URLs (legitimate crawlers like Googlebot will respect the rules in the robots.txt file, while non-legitimate scrapers won't, so this is just to help search engines, not a security measure).

There are instructions from Google on how to do this.

The file could be something like:


  User-agent: *
  Allow: /
  Disallow: */index.php?title=
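For what it's worth, the leading `*` in that Disallow line relies on the wildcard matching that Google supports (per RFC 9309); Python's built-in robotparser doesn't handle wildcards, but here is a rough sketch of that matching, just to check which URLs a rule would cover (the example paths are illustrative, not real Encyc URLs):

```python
import re

def robots_match(pattern, path):
    """Rough sketch of RFC 9309 path matching: '*' matches any run
    of characters, '$' anchors the end of the path. Rules match
    from the start of the URL path."""
    regex = re.escape(pattern).replace(r'\*', '.*').replace(r'\$', '$')
    return re.match(regex, path) is not None

# The Disallow rule catches index.php-style URLs anywhere in the path...
print(robots_match("*/index.php?title=", "/index.php?title=Main_Page"))  # True
# ...but leaves ordinary article URLs alone.
print(robots_match("*/index.php?title=", "/wiki/Main_Page"))             # False
```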

Additionally, I would suggest creating an XML sitemap (not sure how to do that without WordPress, but I'm sure you can investigate).

The XML sitemap gives a list of your existing pages and also provides information about when each page was last updated. So when search engines visit your website, they will immediately go explore the new updates.
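For reference, a sitemap is just an XML file listing page URLs with their last-modified dates, in the format defined by the sitemaps.org protocol. A minimal sketch (the domain and page here are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per page -->
    <loc>https://example.org/index.php?title=Main_Page</loc>
    <lastmod>2023-10-01</lastmod>
  </url>
</urlset>
```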

That said, if you don't want to create a sitemap (Wikipedia doesn't have one either), you can remove the last line of the robots.txt file.

Let me know if you need any additional help with this topic and I can ask a colleague who knows more about it.

Cheers!

Thank you for the tips. We did have a sitemap for a while. MediaWiki has a way to create sitemaps, but it's a little complicated. I don't think it did any good, and it was very out of date, so I deleted it. Like you said, Wikipedia doesn't have one either. I do have Google configured to check the recent changes XML feed, which should help.
I will take another look at the robots.txt file. Enki (talk) 22:52, 11 October 2023 (EDT)
So I followed up on this and ended up copying Wikipedia's robots.txt file (CC BY-SA). There were a couple of complications, like how images are handled, and I just figured it would be easier to go with what works for Wikipedia, realizing a lot of the disallow statements will probably do nothing because we have different article names. If strange stuff starts popping up in Google results, we'll have a good starting point for fixing it. Enki (talk) 22:44, 28 October 2023 (EDT)