
Google has proposed a new Internet standard for the rules governing robots.txt files.

 

These rules, set out in the Robots Exclusion Protocol (REP), have been an unofficial standard for the past 25 years.

 

Although REP has been adopted by search engines, it has never been made official, which leaves it open to interpretation by developers. Further, it has never been updated to cover today's use cases.

 

As Google puts it, this creates a challenge for website owners, because the ambiguously written protocol makes it difficult to write the rules correctly.

New rules for robots.txt

To eliminate this challenge, Google has documented how REP is used on the modern web and submitted it for consideration to the Internet Engineering Task Force (IETF).

 

Google explains what is included in the draft:

 

"The proposed draft REP reflects over 20 years of real-world experience relying on robots.txt rules, which are also used by Googlebot and other major search engines, as well as about half a billion sites that rely on REP. These modified controls give the publisher the ability to decide what they would like to be indexed on their site and potentially displayed to interested users."

 

The draft does not change any of the rules established in 1994; it only updates them for the modern web.

Some of the updated rules include:

 

  • Any URI-based transfer protocol can use robots.txt. It is no longer limited to HTTP; it can also be used for FTP or CoAP.
  • Developers must parse at least the first 500 kibibytes of a robots.txt file.
  • The new maximum caching time is 24 hours, or the value of a cache-control directive if one is available, which gives website owners the flexibility to update their robots.txt whenever they want.
  • When a robots.txt file becomes inaccessible due to server errors, known disallowed pages are not crawled for a reasonably long period of time.
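The rule-matching behavior these changes formalize can be tried out with Python's standard-library `robotparser`. This is a minimal sketch: the rules and the example.com URLs are placeholders, not taken from any real site.

```python
from urllib import robotparser

# Build a parser from an in-memory robots.txt (placeholder rules).
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Paths under the disallowed prefix are blocked; everything else is allowed.
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))    # True
```

In practice you would call `rp.set_url(...)` and `rp.read()` to fetch a live robots.txt instead of parsing an in-memory list.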

 

Google is fully open to feedback on the proposed draft and says it is committed to getting it right.

 

Text taken from: www.searchenginejournal.com

Author Matt Southern

Made by Nebojsa Radovanovic - SEO Expert @Digitizer
