I know that you, fair Techland readers, are always civil in the comments section. Those who aren’t might want to think twice about saying something nasty, as Googlebot (Google’s infamous web-crawling spider) can now index comments posted through systems like Facebook, Livefyre and Disqus.
Until now, those commenting systems loaded comments with JavaScript, which meant Google couldn’t read or index them. Google SEO guru Matt Cutts confirms the change via Twitter: “Googlebot keeps getting smarter. Now has the ability to execute AJAX/JS to index some dynamic comments.”
Digital Inspiration points out that you can now search for all the comments someone has made via Facebook’s commenting system by searching for something like “commenter name * commenter title.” So, internet trolls who for some reason sign in using their real names: your days are numbered!
This also has major implications for websites when it comes to SEO. As The Next Web points out, the upside is that a lot more content is suddenly searchable, meaning users could land on sites via things commenters have said. The downside? If a commenter—gasp!—says something inappropriate, that could show up in a site’s search results. And if a website is lax about rooting out comment spam, that could hurt how Google’s spiders view it.
Lesson to websites: Pay attention to what people are saying in your comments section. Lesson to commenters: Don’t drink and post ill-conceived angry rants at 3:00 am! Or at least go incognito when you do.
[via Digital Inspiration, The Next Web]