Getting Google to index your site

While Google is a fully-automated search engine that will naturally find your website with its crawlers (or 'spiders'), we aren't patient people and sometimes we want our content listed or updated as soon as possible. This article looks at some of the tricks of the trade for getting your content updated more quickly.

How does it all work?

So before we get started on how to get your content recognised, here's a brief overview of what's going on behind the scenes... Googlebot is the name given to the software that gathers information from websites to be used in Google's search engine (sometimes referred to as a 'spider'). The process of getting results listed on Google is primarily made up of two things:

  • Crawling: the process of Googlebot finding new data on websites using links.
  • Indexing: data collected from the crawling process is used to build up an 'index' which is used for Google search results.

How does Google find new domains?

Google is a domain registrar, so it has no problem picking up new domains, even those without any inbound links. Google will crawl sites already in the index, new domains and sitemaps to build up the data it requires for its index.

So Google is already fully automated, but there are still several things you can do to help your site get indexed faster... 

How do I get Google to index my site?

Webmaster Tools

If you don't have Webmaster Tools yet then this is the first place to start: go and create a Google account to open up a world of options for improving and monitoring your online presence.

With Webmaster Tools you can select your website and ask Google to crawl it. This can take up to 15 minutes, but once the fetch succeeds you can submit the result to the index. For a step-by-step guide, take a look at the Google Support Page. Note that most of the references are to fetching a specific page, but you can simply ask it to crawl your homepage and then index all the linked pages.

The fetch status will fail if you don't have a 'robots.txt' file in your site root. This file should contain the following lines to tell search engines they may crawl everything:

User-agent: *
Allow: /
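
As a side note, robots.txt can also advertise where your sitemap lives using the Sitemap directive (the URL below is a placeholder — substitute your own domain and sitemap path):

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```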

Just ask!

Google also lets you submit URLs directly if you don't have Webmaster Tools.

Submit a sitemap

Sitemap generators will create an XML file that you can upload in Webmaster Tools. This informs Google about the structure of your website and the frequency with which the pages change. Files can be uploaded under 'Crawl' > 'Sitemaps'. If you have a site that changes regularly (e.g. a blog), you might want to look at creating a sitemap that is automatically regenerated to reflect new content.
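For reference, here's what a minimal sitemap looks like — it follows the sitemaps.org protocol, and the URL and date below are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- one <url> entry per page you want crawled -->
</urlset>
```

The changefreq and priority values are hints rather than commands — crawlers may use them to schedule visits, but they don't guarantee a crawl rate.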

Create some social accounts

Given that Google finds sites via links, creating some social media accounts with the likes of Facebook, Google+, Twitter and Co is a great way to get your new site URL out there on already well-indexed sites.

Follow guidelines

Make crawling your site easy for search engines by following best practice for syntax and structure. Here are a few tips:

  • Run your site through an HTML validator to ensure the markup is valid.
  • Have a clear linking structure, with all pages accessible within two clicks.
  • Think about the keywords used, especially in key areas like heading tags - when your site does get indexed, what are your visitors going to be searching for?
  • Check your site for any broken links.
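
That last check is easy to automate. Here's a minimal sketch using only Python's standard library — it collects every link on a page and reports the ones that fail to respond (the URLs you pass in are of course your own; some servers reject HEAD requests, so treat failures as candidates to verify by hand):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen


class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def find_broken_links(page_url, html):
    """Return links in `html` that raise an error (e.g. HTTP 4xx/5xx)."""
    parser = LinkCollector()
    parser.feed(html)
    broken = []
    for link in parser.links:
        url = urljoin(page_url, link)  # resolve relative links against the page
        try:
            urlopen(Request(url, method="HEAD"), timeout=10)
        except Exception:
            broken.append(url)
    return broken
```

Run it over each page's HTML and fix anything it flags before asking Google to crawl.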



Need a web developer?

If you'd like to work with code synthesis on your next project get in touch via the contact page.