Matthew Henry

  • Pagination Tunnels – An Experiment in Crawlability and Click Depth

We’ve all seen pagination links — those little numbered links at the top and bottom of multi-page content. They’re used on blogs, e-commerce sites, webcomics, gallery pages, SERPs, and multi-page articles. From the human visitor’s point of view, pagination is pretty simple.

Portent, Inc.
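A hedged sketch of the click-depth idea behind the experiment (the schemes and numbers here are illustrative assumptions, not figures from the article): with only a “next” link, page n sits n clicks from page 1, while numbered links shown in blocks let a crawler jump ahead several pages per click.

```python
def depth_next_only(n: int) -> int:
    # Only a "next" link on each page: page n is n clicks from page 1.
    return n

def depth_numbered(n: int, block: int = 10) -> int:
    # Numbered links shown in blocks of `block` pages: each click can
    # jump to the farthest visible page number, so depth grows ~ n / block.
    clicks = 0
    page = 1
    while page < n:
        page = min(n, page + block)  # jump to the farthest numbered link
        clicks += 1
    return clicks
```

Under these assumptions, page 100 is 100 clicks deep with next-only links but only 10 clicks deep with blocks of ten numbered links — the kind of gap a crawlability experiment would measure.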
  • Robots.txt Mistakes and Best Uses: A Guide

Robots.txt is a small text file that lives in the root directory of a website. It tells well-behaved crawlers whether or not to crawl certain parts of the site. The file uses a syntax simple enough for crawlers to parse easily (which makes it easy for webmasters to write, too). Write it well, and you’ll be in indexed heaven.

Portent, Inc. in SEO
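A minimal sketch of the syntax the teaser describes (the paths and bot name are illustrative, not taken from the guide):

```
# Let all well-behaved crawlers in, except for one directory
User-agent: *
Disallow: /admin/

# Block one specific crawler entirely (name is hypothetical)
User-agent: BadBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Rules are grouped under `User-agent` lines, and each `Disallow` is a path prefix; the file only works from the site root (e.g. `https://example.com/robots.txt`).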
  • Field Guide to Spider Traps: An SEO’s Companion

If search engines can’t crawl your site, SEO efforts don’t amount to much. One of the problems I see most often is ‘spider traps’. Traps kill crawls and hurt indexation. Here’s how to find and fix them. What is a spider trap? A spider trap is a structural issue that causes a web crawler to get stuck in a loop, loading meaningless ‘junk’ pages forever.

Portent, Inc. in SEO
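One classic trap of the kind the field guide describes can be sketched in a few lines (the site and the relative link `page/` are hypothetical): if every page links to the same relative URL, each resolution yields a new, deeper address, and a naive crawler follows the chain forever.

```python
from urllib.parse import urljoin

# Hypothetical trap: every page contains a relative link to "page/".
# Resolving it against the current URL produces an endless chain of
# distinct "junk" URLs that all serve the same content.
url = "https://example.com/archive/"
seen = []
for _ in range(4):
    url = urljoin(url, "page/")
    seen.append(url)
# Each URL in `seen` is one level deeper than the last.
```

A crawler that deduplicates by URL string never notices the loop, since every URL is technically new; fixes usually involve absolute links, canonical tags, or depth limits.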