Search Engine Visibility V1 Help

Selecting website pages

The Control Crawling option of Search Engine Visibility lets you control what pages of your site can be crawled, and generates the required file or tags.

From the Optimize tab, select Control Crawling to get started.

To Control Crawling for your website

Each tab performs the indicated function and generates the necessary file. The tabs offer the following options:

  • Allow All — Lets crawlers access all of your site's pages.
  • Block specific web pages and search engines — Click Add new or modify existing rule, then select the pages of your site to block and the search engines that should not crawl them.
  • Block All — Blocks all of your site's pages from crawlers. NOTE: This is not recommended.
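As an illustration, the three options above roughly correspond to the following robots.txt directives (a sketch; the file that Search Engine Visibility actually generates may differ, and the path shown for the per-page rule is a hypothetical example):

```
# Allow All — every crawler may access every page
User-agent: *
Disallow:

# Block All — every crawler is blocked from every page (not recommended)
User-agent: *
Disallow: /

# Block specific pages for a specific search engine
# (Googlebot is one example crawler; /private/ is a hypothetical path)
User-agent: Googlebot
Disallow: /private/
```

Note that `Disallow:` with an empty value allows everything, while `Disallow: /` blocks the entire site.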

Once you have determined how you want to control crawling for your site, click Get File, then click Create robots.txt or Create Meta Tag to generate the appropriate file or tags. For more information on robots.txt and Meta Tags, see What's the difference between robots.txt and Meta Tags?
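The Meta Tag approach achieves similar per-page control from within the page itself rather than a site-wide file. A minimal sketch of what such tags typically look like (the exact tags generated by the tool may differ):

```html
<!-- Placed in the <head> of a page you want kept out of search results -->
<meta name="robots" content="noindex, nofollow">

<!-- Targets a single crawler; Googlebot is shown as an example -->
<meta name="googlebot" content="noindex">
```

Because meta tags live inside individual pages, they are a natural fit for blocking a handful of pages, whereas robots.txt is better suited to site-wide rules.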

