Many websites are built for the end customer rather than for search engines, which results in a large number of page-indexing errors. Sometimes the mere omission of an exclusion rule in the robots.txt file can cause problems that end up penalizing your site in the search results.
- Robots.txt : We create 'entry points' for the robots that visit your website, giving them a clear and precise path to your content while forbidding them from indexing, for example, your administration pages, 'marketing' files, or other documents unrelated to your business. The robots.txt file is used to restrict what a crawler may visit; crawlers always check this file before exploring your site.
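As a sketch, a minimal robots.txt along these lines might block crawlers from a private administration area while pointing them to the sitemap (the domain and paths here are illustrative, not taken from any real site):

```text
# Apply these rules to all crawlers
User-agent: *

# Keep robots out of non-public areas
Disallow: /admin/
Disallow: /marketing/

# Tell crawlers where to find the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Crawlers that honor the robots exclusion protocol read this file first and skip the disallowed paths, so only your public content is crawled and indexed.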
- Sitemap : Some of your URLs are essential, while others are lower priority but still deserve to be indexed in Google. We analyze your content and submit the most relevant pages to Google based on priorities we define together. This takes the form of an XML file listing all the important URLs of your website, to which useful information can be added, such as the last modification date (based on the server's response) or the 'priority' of a page relative to the others.
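To illustrate, an XML sitemap in the standard sitemaps.org format might look like the following; the domain, dates, and priority values are hypothetical examples only:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage: highest relative priority -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <priority>1.0</priority>
  </url>
  <!-- A secondary page: still indexed, but lower priority -->
  <url>
    <loc>https://www.example.com/products</loc>
    <lastmod>2024-01-10</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```

The `priority` value (between 0.0 and 1.0) only ranks your pages relative to one another; it does not affect how your site compares to other sites in the search results.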
- Webmaster tools : True dashboards showing your website's performance across the ocean of search engines, 'webmaster tools' have become indispensable for optimization. We create these accounts for you and configure them manually according to your market and the technical specifications of your site. It is possible, for example, to control which URL parameters should be excluded, in order to fight duplicate content. It is also possible to better geo-target your site, redirect it to another domain, view suggestions for your code, and above all monitor crawl errors, so that you always stay in control of your website's SEO.