Dynamic Web Pages

How our test site had a total of 70,000+ dynamic web pages indexed by four crawling search engines


We recently released a recording of our Dynamic Web Pages clinic in Windows Media Audio format. You can listen to it here:

This research brief will answer the following questions:

  1. What are dynamic web pages?
  2. What are the advantages and disadvantages of using dynamic web pages?
  3. Can you increase the likelihood that dynamic web pages will be indexed by search engines?
  4. What are the most effective techniques you should keep in mind when optimizing dynamic web pages? (10 techniques)

1. What are dynamic web pages?

Dynamic (or database-driven) web pages are created “on the fly” for visitors as they browse a website. This is accomplished with server-side technology such as ASP, ColdFusion, Perl, etc.

Dynamic pages are created by the web server when a visitor defines a number of variables. The variables can include such things as product ID, product specs, session ID, language, geographic location, search terms, and so on. Often, these variables are set simply by the visitor clicking links on the site.

Dynamic pages don’t actually exist until they are requested by the user. This differs from static HTML pages, which exist as individual files on the server.
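As a loose illustration (not the test sites' actual code), the sketch below shows the idea in Python: the product "database," template, and parameter names are hypothetical stand-ins, and the page is assembled only at the moment a query string arrives.

```python
# Minimal sketch of a server building a dynamic page "on the fly".
# The product table, template, and parameter names are hypothetical.
from urllib.parse import parse_qs

# Stand-in for a product database table.
PRODUCTS = {
    "12254": {"name": "Fleece Jacket", "price": "49.95"},
}

TEMPLATE = (
    "<html><head><title>{name}</title></head>"
    "<body><h1>{name}</h1><p>Price: ${price}</p></body></html>"
)

def render_page(query_string):
    """Build an HTML page from the variables in a query string.

    The page exists only for the duration of this request; no
    .html file for it is ever stored on the server.
    """
    params = parse_qs(query_string)
    pid = params.get("pid", [""])[0]
    product = PRODUCTS.get(pid)
    if product is None:
        return "<html><body><h1>Not found</h1></body></html>"
    return TEMPLATE.format(**product)

print(render_page("pid=12254&color=blue"))
```

A static site would instead keep one pre-written `.html` file per product; here the same template serves every product ID in the database.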

As we will see below, dynamic web pages, while pragmatic for many reasons, do present some difficulty for certain types of search engines.

2. What are the advantages and disadvantages of using dynamic web pages?

There are a number of advantages that dynamic web pages provide. They include:

  1. Large websites do not require thousands and thousands of HTML files on the server. Instead, pages use templates and are created based on the data sent from the web browser. In the long run, this can save significant time and server space.
  2. Page or product information is stored in a database as opposed to being hard-coded into static pages. This allows updates to be made more quickly.
  3. Dynamic web pages allow for greater customization. Your server can create pages “on the fly” to match the requests, interests, and specifications of your individual visitors.

However, these features do come at a price: fewer pages indexed and lower natural search rankings.

Dynamic websites are often not indexed well by search engines that “crawl” or “spider” websites (as opposed to those based only on manual URL submissions). It is difficult to get dynamic websites properly indexed without the right kind of optimization. (We will discuss this in detail below.)

Many spiders do not read past the “?” in a URL. (The part of the URL that follows the “?” is called the “query string” and is an indicator that the page is generated by a database.) Spiders don’t like these characters because dynamic pages can create an “endless loop” or cause them to get “lost” in the database.

One of the biggest problems spiders encounter with dynamic pages is session IDs. If a spider looks at the same page twice, it will see different session IDs and assume it is looking at two different pages. This, more than any other factor, can create an “endless loop” situation.
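One way to sidestep the session-ID trap is to canonicalize URLs by stripping the session-ID variable before the URL is exposed to spiders. The sketch below shows the mapping in Python; the parameter names (sid, sessionid, PHPSESSID) are illustrative examples, not a standard list.

```python
# Sketch: strip session-ID parameters so the "two" URLs a spider sees
# for the same page collapse into one canonical URL. The parameter
# names listed here are illustrative guesses, not a standard.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

SESSION_KEYS = {"sid", "sessionid", "phpsessid"}

def canonical_url(url):
    """Return the URL with any session-ID variables removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_KEYS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# The same page, requested twice, gets two different session IDs...
a = canonical_url("/prod.asp?pid=12254&sid=A1B2C3")
b = canonical_url("/prod.asp?pid=12254&sid=X9Y8Z7")
# ...but both collapse to the same canonical URL.
print(a == b, a)
```

Without this step, a spider revisiting the page would see a "new" URL on every request, which is exactly the endless-loop situation described above.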

Recently, some search engines (notably Google) have improved their spiders to be able to index SOME dynamic pages. However, dynamic sites still risk not being indexed to their greatest potential without the right kind of optimization.

KEY POINT: The more pages a site has, the greater chance it has of appearing in results for relevant search terms. But thousands or even millions of pages won’t do you any good if the spiders aren’t indexing them.

3. Can you increase the likelihood that dynamic web pages will be indexed by search engines?

In order to answer this question, we compared Test Site A, which had undergone a number of key optimization techniques, to Test Site B, which had little or no optimization to its dynamic pages. Here are the results of that comparison:

Dynamic Web Pages – Crawling Search Engines

Test Site     Total Pages   Google   AltaVista   AllTheWeb   HotBot
Test Site A   <100,000      22,700   22,100      22,000      4,704
Test Site B   1,000,000+    19,000   1,010       1,010       255

What You Need To UNDERSTAND: Test Site A has fewer than 100,000 pages and Test Site B has well over 1,000,000. Both sites use predominantly dynamic pages, but Test Site A, despite being a much smaller site, still had more than three times as many total pages indexed by the crawling search engines (71,504 versus 21,275).

Why did Test Site A perform so much better in the indexing search engines than Test Site B?

Test Site A used a number of optimization techniques for its dynamic pages. Most notably, it rewrote characters such as “?” into more spider-friendly URL forms. We will cover this in detail in the section below.

(Note that Google indexed Test Site B much better than the other three search engines did. This is due to recent improvements in its spider that allow it to index some dynamic pages.)

The data above was generated with this tool, which you may find useful in your own page-optimization efforts.

KEY POINT: If you use dynamic web pages but do not optimize for the “crawling” search engines, you will be limited to search traffic from paid-inclusion (PPC and other) programs and the search engines that accept direct submissions. You will lose out on the “deep” indexing that a crawling engine can offer.

4. What are the most effective techniques you should keep in mind when optimizing dynamic web pages? (10 techniques)

  1. Product pages, article pages, discussion forum archives, and similar pages are all EXCELLENT keyword-rich pages that will often rank quite well in search engines. These are very often dynamically generated and will benefit greatly from the techniques in this section.
  2. Avoid “?”, “&”, and “%”. Rewrite your URLs to make them look like static pages. The next several points cover how to do this on various platforms.
     Effective URL ending: /products/12254/blue/xlarge/
     Ineffective URL ending: /prod.asp?pid=12254&color=blue&size=xlarge
  3. If you are using an Apache web server, there is a rewrite module (mod_rewrite) available for Apache 1.2 and beyond that converts requested URLs on the fly. You can rewrite URLs that contain query strings into URLs that can be indexed by search engines. For more information on the Apache rewrite module, see:
  4. If you use Active Server Pages (ASP), Exception Digital Enterprise Solutions offers a product called XQASP that is an excellent tool for converting dynamic ASP pages into search engine-compatible formats. More information on this product is available at:
  5. If you are running ColdFusion, you will have to reconfigure it on your server so that the “?” in a query string is replaced with a “/”. Here is a good place to get started:
  6. Other solutions are possible, such as writing a CGI/Perl script that utilizes “Path_Info” and “Script_Name” environment variables to rewrite URLs, or using an XML feed to send page/product information directly to the search engines. In our Literature Review below, we have included a number of articles that cover these techniques in more detail.
  7. Finally, if none of these options are feasible, you may want to continue using static web pages. While this may not be practical for large sites, it will increase your chances of achieving a “deep crawl”. Weigh the benefits of increased search engine exposure against the potential time loss of not using dynamic pages to determine if this option is right for you.
  8. If you continue to use dynamic URLs with no rewrite, limit the number of variables that you use (no more than 3). The fewer variables there are in the query string, the greater the chances of some of the smarter spiders (such as Google) indexing your site.
     KEY POINT: Eliminate session IDs if at all possible. This element in a dynamic URL will cause spiders more problems than any other.
  9. KEY POINT: You must provide an actual link path to your pages if you want a spider to find them. Utilize a “site map” or “site index” to encourage deep indexing by spiders. (This page should be static.)
  10. If there is a key area of your site that spiders have not visited, manually submit a static page that links to that area.
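For technique 2 (using the Apache module from technique 3), a rewrite rule along the following lines is one way to serve the “effective” URL form. This is a hypothetical sketch: the path pattern, script name, and parameter names are illustrative assumptions, not the test sites' actual configuration.

```apache
# Hypothetical mod_rewrite sketch (Apache): map a static-looking URL
#   /products/12254/blue/xlarge/
# internally onto the real dynamic script
#   /prod.asp?pid=12254&color=blue&size=xlarge
RewriteEngine On
RewriteRule ^products/([0-9]+)/([a-z]+)/([a-z]+)/?$ /prod.asp?pid=$1&color=$2&size=$3 [L]
```

The visitor (and the spider) only ever sees the clean path; the rewrite to the query-string form happens internally on the server, so no “?” appears in the indexed URL.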
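The path-based approach in technique 6 (using PATH_INFO rather than a query string) comes down to the same mapping step, shown here as a minimal Python sketch. The segment order and names are hypothetical, chosen to match the earlier example URL.

```python
# Sketch of the server-side half of URL rewriting: recover the
# original variables from a static-looking path such as
#   /products/12254/blue/xlarge/
# The segment names (pid, color, size) are hypothetical examples.
def parse_product_path(path):
    """Map /products/<pid>/<color>/<size>/ back to the query
    variables a dynamic script would expect."""
    segments = [s for s in path.split("/") if s]
    if len(segments) != 4 or segments[0] != "products":
        return None  # not a product URL
    return {"pid": segments[1], "color": segments[2], "size": segments[3]}

print(parse_product_path("/products/12254/blue/xlarge/"))
```

A CGI/Perl script reading the Path_Info environment variable would perform exactly this split before handing the variables to the database lookup.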

Using these techniques, you can make a significant impact on the volume of your dynamic web pages that are indexed by crawling search engines. This will result in greater exposure for your site and more traffic from these “free” search engines.

For more on the topic of “free” search engines, see our recent report on Natural Search:

Related MEC Reports:

As part of our research on this topic, we have prepared a review of the best Internet resources available.

Rating System

These sites were rated for usefulness and clarity, but alas, the rating is purely subjective.

* = Decent ** = Good *** = Excellent **** = Indispensable

Search Engine Watch Forums – Dynamic Web Site Issues ****

Search Engine Saturation Tool ****

Dynamic Site SEO Tips and Hints ***

Dynamic Pages and Search Engines ***

Dynamic Web Page Optimization and Search Engine Inclusion ***

Building Dynamic Pages with Search Engines in Mind ***

Search Engine Friendly E-Commerce Catalogs ***

Masquerading Your CGI/PHP Scripts as Static HTML Pages ***

Make Way for the Deep Crawl ***

Registering Dynamic Sites ***

Solutions for Dynamic Page Registration ***

Making Dynamic and E-Commerce Sites Search Engine Friendly ***

Optimizing Dynamic Pages – Part I ***

Optimizing Dynamic Pages – Part II ***

How to Optimize Dynamic Web Sites ***

Deep Submit Your Dynamic Pages ***

How to Convert Dynamic Pages into Static Pages ***

Getting Millions of Dynamic Pages Indexed ***

Making Your Site Search Engine Friendly **

Invite Search Engine Spiders Into Your Dynamic Web Site **

Listing Dynamic Web Pages in Search Engines **

Is Google Indexing Dynamic Pages Now? **

Dynamic Web Pages – How Do They Work? **


Editor — Flint McGlaughlin

Writer — Brian Alt

Contributor — Jimmy Ellis

HTML Designer — Cliff Rainer
