Basic SEO Guide for Da Vinci: Trace Files

These files make your site easier for search engines to read, helping improve its ranking on search engine results pages (SERPs). Check out what they are and how to use them in Da Vinci.

Sitemap

 

In simple terms, Sitemap.xml is a list of all the pages (URLs) of a website. It maps out how the site is structured and what is included in it. In other words, the sitemap is a file designed to facilitate the process of indexing pages on search engines such as Yahoo!, Bing and Google.
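At its core, a sitemap.xml file is just an XML list of URLs, optionally annotated with metadata such as the last modification date. A minimal illustrative example (the URLs are placeholders, not your actual site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page of the site -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

In Da Vinci this file is generated automatically, so you normally do not need to write it by hand.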

 

In Da Vinci:

Sitemap in Da Vinci

 

To access the site's sitemap file, click Settings, Search and metadata, and Simple XML Sitemap.

 

Sitemap in Da Vinci

 

Robots

The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells internet robots which directories and pages on your website should or should not be accessed. For more detailed information, see Google's documentation on the robots.txt file.

 

The commands in robots.txt are simple plain-text directives with a syntax of their own, not HTML or a programming language. Robots follow these commands when navigating and finding the pages on your website.
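A short illustrative robots.txt (the directory names and domain are placeholders, not what Da Vinci generates):

```text
# Apply to all robots
User-agent: *

# Block crawling of one directory
Disallow: /admin/

# Tell robots where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

The `User-agent` line says which robots the rules apply to, each `Disallow` line blocks a path, and the `Sitemap` line points crawlers to the sitemap file described above.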

 

In Da Vinci: select the option “Meta Tags” and then “Advanced”. There you will find a list of robots options for the page.

 

For your content to be read and indexed by search engine robots, we recommend using the index option.
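Per-page robots options like this are typically output as a robots meta tag in the page's HTML head. A generic illustration (not necessarily the exact markup Da Vinci generates):

```html
<!-- Allows robots to index this page and follow its links -->
<meta name="robots" content="index, follow">
```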

 

Robots.txt in Da Vinci

To access your complete robots.txt file, click Settings, Search and metadata, and RobotsTxt.

 

Robots.txt in Da Vinci

 

If you want to learn more about SEO techniques in Da Vinci, check out these other links:
Meta tags, Keywords and URLs;
Heading Tags;
Optimized images.