It’s now almost two weeks since I updated and optimized my BizzThemes website, and Google still hasn’t refreshed the whole site index. What the hell, I thought. So I went on to research the issue of changing Google’s crawl rate – an issue I’m sure a lot of website owners share, probably without even being aware of it.
So, what’s the big deal?
The problem with search spiders crawling your website becomes most obvious when you update your site, change its structure or transfer it to another domain. If your site is popular, you won’t have much trouble, as your crawl rate is a lot faster by default. But if your site is average, Google will treat you like a cheap hooker – it will give you some attention, but only enough to keep you on the street.
What are the hazards?
- Drop in search rankings
- Slow SEO recovery
- Decreased page rank
- Removal of your site from the index (worst-case scenario)
The solution?
I tried almost everything:
- Redirected 404 errors to valid URLs
- Refreshed sitemaps
- Re-submitted sitemaps to search engines
- Fixed all other crawl errors
- Refreshed robots.txt
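For the sitemap step, here’s a minimal sketch of how a sitemap.xml can be regenerated before re-submitting it. The page URLs below are hypothetical placeholders, not my actual site structure – swap in your own pages:

```python
# Minimal sketch: rebuild a sitemap.xml string from a list of page URLs,
# stamping every entry with today's date so crawlers see it as fresh.
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap.xml content for the given list of page URLs."""
    today = date.today().isoformat()
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(u)}</loc>\n"
        f"    <lastmod>{today}</lastmod>\n"
        "  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Hypothetical pages -- replace with your site's real URLs.
pages = [
    "https://example.com/",
    "https://example.com/themes/",
]
print(build_sitemap(pages))
```

Write the output to sitemap.xml at your site root, then re-submit it in Google Webmaster Tools so the updated `lastmod` dates get picked up.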
All with little success. So now, understanding how Google pays its hookers, I’m testing the most obvious solution, which works, guaranteed:
- I’m refreshing my updated site by linking to it from other web sources, just like this post =)
I’m not sure if this post will help you with your own site’s SEO, but I’m positive your site will become a better web hooker if you attract Google’s attention with fresh backlinks to it – pretty obvious when you think about it.