Making the Facebook Comments Box Indexable and Crawlable by Search Engines

SEOmoz community member Roy Peleg writes about how to make Facebook comments (the content of the Facebook Comments Box iframe) indexable and crawlable by search engines, including Google.

He links to a PHP script (http://www.rayhe.net/fb/comments.phps) that pulls the comments from the Facebook API and inserts them into the page.
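The idea behind such a script can be sketched roughly as follows: take the comments payload returned by Facebook's API for a given URL and render it as plain HTML that any crawler can read. This is a minimal sketch only; the payload shape below is an assumption modelled on Facebook's Graph API "comments" connection, not the actual output of the linked PHP script.

```python
import html
import json

def render_comments(payload: str, page_url: str) -> str:
    """Turn a Graph-API-style JSON payload into an HTML list of comments.

    The payload is assumed to be keyed by page URL, with each comment
    carrying a "from" author and a "message" body (an assumption, see above).
    """
    data = json.loads(payload)
    items = data.get(page_url, {}).get("data", [])
    rows = [
        "<li><strong>{}</strong>: {}</li>".format(
            html.escape(c["from"]["name"]),  # escape to keep the output safe HTML
            html.escape(c["message"]),
        )
        for c in items
    ]
    return "<ul>\n{}\n</ul>".format("\n".join(rows))

# Example payload in the assumed shape, keyed by the page URL
sample = json.dumps({
    "http://example.com/post": {
        "data": [
            {"from": {"name": "Alice"}, "message": "Great article!"},
            {"from": {"name": "Bob"}, "message": "Thanks for sharing."},
        ]
    }
})

print(render_comments(sample, "http://example.com/post"))
```

The resulting HTML can then be placed into the page server-side, where it is visible without any JavaScript at all.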

He writes:

So basically you can now use Facebook Comments Box on your site and serve GoogleBot (or any other crawler/browser agent) with the comments to have them crawled & indexed. Obviously this won’t be considered as cloaking as you’re serving Google exactly what the users see (just like creating an HTML version for a Flash website).

However, cloaking, even with noble intentions, is wrong in any case, especially when standard, long-established methods already let us provide alternate content to user agents that do not run scripts, search engine crawlers included. Instead of using the easiest way, Roy Peleg recommends a technique that Google explicitly bans.

What is the easiest way I am talking about?

Using the plain old <noscript> element, which is well suited for this purpose (search engine crawlers do not execute JavaScript, so they will “see” the alternate content provided on the page):

The NOSCRIPT element allows authors to provide alternate content when a script is not executed. The content of a NOSCRIPT element should only be rendered by a script-aware user agent in the following cases:

  • The user agent is configured not to evaluate scripts.
  • The user agent doesn’t support a scripting language invoked by a SCRIPT element earlier in the document.

User agents that do not support client-side scripts must render this element’s contents.
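In practice this means keeping the regular Comments Box markup for browsers and placing the same comments, server-rendered as plain HTML, inside <noscript> for crawlers and script-less user agents. A minimal sketch of composing that markup, assuming comments_html is the server-rendered comment list (e.g. produced from the API as above); the function name and attributes here are illustrative, not part of any official API:

```python
def comments_box_markup(page_url: str, comments_html: str) -> str:
    """Compose the Comments Box div for browsers plus a <noscript>
    fallback carrying the same comments as plain, crawlable HTML."""
    return (
        '<div class="fb-comments" data-href="{url}" data-num-posts="10"></div>\n'
        "<noscript>\n{fallback}\n</noscript>"
    ).format(url=page_url, fallback=comments_html)

print(comments_box_markup(
    "http://example.com/post",
    "<ul><li><strong>Alice</strong>: Great article!</li></ul>",
))
```

Browsers with JavaScript enabled render the iframe and ignore the <noscript> content; crawlers and script-less agents get the plain HTML instead, with no user-agent sniffing involved.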

Easy peasy, and accessible too, dear Roy Peleg.

Have a comment? Join discussion on Mastodon as a reply to: @dusoft@fosstodon.org