When you use Force.com to publish a website, it is uncertain whether the site can be indexed by the Google search engine if you don't set the Site Robots.txt. Normally it is hard to get indexed.
To make sure it can be indexed, it is best to create a robots.txt file, and it is easy to use a Visualforce page to create one.
Step 1.
Create a new Visualforce page, e.g. named 'robots'.
Step 2.
Set the contentType attribute of the <apex:page> tag to 'text/plain' so that the Visualforce page 'robots' will be treated as a text file, as below:
<apex:page contentType="text/plain">
User-agent: *
Disallow:
Allow: /
</apex:page>
*If there is something you do not want indexed, you can disallow it, e.g. 'Disallow: /unindexed_page/' (see the sketch after the next block). The same applies to Allow. By the way, the site will not be indexed at all with the following:
<apex:page contentType="text/plain">
User-agent: *
Disallow: /
Allow:
</apex:page>
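For completeness, here is a rough sketch of the selective case mentioned above, assuming a hypothetical path '/unindexed_page/' that should stay out of the index while the rest of the site remains indexable:

<apex:page contentType="text/plain">
User-agent: *
Disallow: /unindexed_page/
Allow: /
</apex:page>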
Step 3.
Set the Site Robots.txt. In Setup | Develop | Sites | Edit,
use the magnifying glass icon to find the Visualforce page 'robots', or just type its name, and save.
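To check that the file is actually being served, one option is a quick callout from anonymous Apex. This is only a sketch: 'https://yoursite.force.com' is a placeholder for your own Site URL, and that URL must be added under Remote Site Settings before the callout will run.

Http http = new Http();
HttpRequest req = new HttpRequest();
// Placeholder endpoint -- replace with your own Force.com Site URL
req.setEndpoint('https://yoursite.force.com/robots.txt');
req.setMethod('GET');
HttpResponse res = http.send(req);
System.debug(res.getStatusCode()); // expect 200
System.debug(res.getBody());       // expect the User-agent / Disallow / Allow lines above

Alternatively, simply open https://yoursite.force.com/robots.txt (your own Site URL) in a browser and confirm the plain-text rules appear.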