
Google: 5 Ways To Prepare For Site Closure Via @sejournal, @MattGSouthern


Are you planning to close your website for a day or longer? According to advice from Google’s Search Advocate John Mueller, here are five ways to prepare.

Mueller shares this advice in a series of tweets, linking to relevant Google help pages.

Spoiler alert — there’s no good way to close a website temporarily. You should avoid doing it if at all possible.

However, there are things you can do to keep the negative impact to a minimum.

Mueller’s recommendations include:

  • Use an HTTP 503 status code
  • Keep the 503 status code in place for no longer than a day
  • Ensure the robots.txt file returns a 200 (or 404) status code
  • Prepare for consequences if the site is down longer than a day
  • Expect reduced crawling from Googlebot

The following sections explain these recommendations in more detail, along with how to deal with the negative impact of taking a site offline.

1. HTTP 503 Status Code

When taking a website offline, ensure it serves an HTTP 503 status code to web crawlers.

When web crawlers like Googlebot encounter a 503 status code, they understand the site is unavailable and may become available later.

With a 503 code, crawlers know to check on the site again rather than drop it from Google’s search index.

Mueller explains how to check for a 503 status code using Chrome:

1. They should use HTTP 503 for the “closed” pages. You can check that in Chrome, right-click: Inspect, select “Network” on top, then refresh the page. Check the top entry, it should be red & show 503 Status. pic.twitter.com/dkH7VE7OTb

— 🌽〈link href=//johnmu.com rel=canonical 〉🌽 (@JohnMu) September 19, 2022
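If you'd rather verify the status code from a script instead of Chrome DevTools, a few lines of Python can run the same check. This is a minimal sketch using only the standard library; the URL is just a placeholder for one of your "closed" pages.

    import urllib.request
    import urllib.error

    URL = "https://example.com/some-closed-page"  # placeholder URL

    try:
        with urllib.request.urlopen(URL) as resp:
            print(resp.status)  # e.g. 200 once the site is open again
    except urllib.error.HTTPError as err:
        # urllib raises HTTPError for 4xx/5xx responses; err.code holds the status.
        print(err.code)  # should be 503 while the site is closed
        print(err.headers.get("Retry-After"))  # may be None if the header isn't set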

2. Keep 503 Status Code No Longer Than A Day

Googlebot will go back to a site after initially encountering a 503, but it won’t keep coming back forever.

If Googlebot sees a 503 code day after day, it will eventually start dropping pages from the index.

Mueller says, ideally, you should keep the 503 status code for a day at most.

“Keep the 503 status – ideally – at most for a day. I know, not everything is limited to 1 day. A “permanent” 503 can result in pages being dropped from search. Be frugal with 503 times. Don’t fret the “retry after” setting.”

3. Robots.txt – 200 Status Code

While pages of a closed website should return a 503 code, the robots.txt file should return either a 200 or 404 status code.

Robots.txt shouldn’t serve a 503, Mueller says, because Googlebot would then assume the entire site is blocked from crawling.

Additionally, Mueller recommends using Chrome DevTools to examine your website’s robots.txt file:

2. The robots.txt file should return either 200 + a proper robots.txt file, or 404. It should *not* return 503. Never believe it if the page shows “404”, it might still be a 503 – check it. pic.twitter.com/nxN2kCeyWm

— 🌽〈link href=//johnmu.com rel=canonical 〉🌽 (@JohnMu) September 19, 2022
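To tie both points together, here's a rough sketch of what a maintenance setup could look like, using only Python's standard library: every regular URL returns a 503 with a Retry-After hint, while robots.txt keeps returning 200. The port, robots.txt body, and maintenance message are all placeholders; a real site would more likely configure this in its web server or CDN.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Placeholder robots.txt body; serve your real rules here.
    ROBOTS_TXT = b"User-agent: *\nAllow: /\n"

    class MaintenanceHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/robots.txt":
                # robots.txt keeps returning 200 so crawling isn't treated as blocked.
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(ROBOTS_TXT)
            else:
                # Everything else returns 503 so crawlers treat the outage as temporary.
                self.send_response(503)
                self.send_header("Retry-After", "86400")  # hint: try again in a day
                self.send_header("Content-Type", "text/html")
                self.end_headers()
                self.wfile.write(b"<h1>Closed for maintenance</h1>")

    if __name__ == "__main__":
        HTTPServer(("", 8080), MaintenanceHandler).serve_forever()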

4. Prepare For Negative Effects

As we mentioned at the beginning of this article, there’s no way to take a website offline and avoid all negative consequences.

If your website will be offline for longer than a day, prepare accordingly.

Mueller says pages will likely drop out of search results regardless of the 503 status code:

“Hmm.. What if a site wants to close for >1 day? There will be negative effects no matter the option you choose (503, blocked, noindex, 404, 403) – pages are likely to drop out of the search results.”

When you “open” your website again, check to see if critical pages are still indexed. If they’re not, submit them for indexing.
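A quick script can help with that first check once the site is open again, at least at the HTTP level. This sketch simply confirms that critical URLs respond with 200 rather than 503; actual indexing status still has to be checked in Google Search Console. The URL list is illustrative.

    import urllib.request
    import urllib.error

    # Illustrative list; replace with the pages that matter most to your site.
    CRITICAL_URLS = [
        "https://example.com/",
        "https://example.com/products",
        "https://example.com/contact",
    ]

    for url in CRITICAL_URLS:
        try:
            with urllib.request.urlopen(url) as resp:
                status = resp.status
        except urllib.error.HTTPError as err:
            status = err.code
        print(status, url)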

5. Expect Reduced Crawling

An unavoidable side effect of serving a 503 code is reduced crawling, no matter how long the code is in place.

Mueller says on Twitter:

“A side-effect of even 1 day of 503s is that Googlebot (note: all of this is with a Google lens, I don’t know other search engines) will slow down crawling. Is it a small site? That doesn’t matter. Is it giant? The keyword is “crawl budget”.”

Reduced crawling can affect a site in several ways. The main things to be aware of are that new pages may take longer to get indexed, and that updates to existing pages may take longer to show in search results.

Once Googlebot sees your site is back online and you’re actively updating it, your crawl rate will likely return to normal.
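One way to watch that recovery is to count Googlebot requests per day in your server's access logs. The sketch below assumes a common/combined log format with the timestamp in square brackets and a hypothetical log path; adjust both for your setup.

    from collections import Counter

    LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path; adjust for your server

    hits = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            # Only count requests whose user agent mentions Googlebot.
            if "Googlebot" not in line or "[" not in line:
                continue
            # In common/combined log format the timestamp looks like [02/Oct/2022:10:15:32 +0000].
            day = line.split("[", 1)[1].split(":", 1)[0]
            hits[day] += 1

    # Print counts in the order days appear in the log.
    for day, count in hits.items():
        print(day, count)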


Source: @JohnMu on Twitter

Featured Image: BUNDITINAY/Shutterstock
