How to Optimize Dynamic Websites for Better Search Engine Rankings

By Asif Iqbal

There is a common misconception that dynamic websites are not search-engine friendly, or that they cannot rank well in the major search engines. This is simply wrong: with the right preparation, dynamic websites can achieve better and more controlled positions in search engines than static websites.

What is a dynamic website?

A dynamic website is a database-driven website in which parts of the content are generated by server-side programs (the middle tier).
A dynamic web page does not physically exist as a file/document on the (hosting) server; it is generated only when a request for it arrives. The request carries parameters, user identity, date and time, context, and so on.
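To make this concrete, here is a minimal sketch (in Python, with hypothetical page and product names) of what "generated by server-side programs" means: the HTML never exists on disk, it is assembled on demand from the request's parameters.

```python
from urllib.parse import urlparse, parse_qs

# Illustrative sketch of server-side page generation: the HTML below is
# built on demand from the parameters carried in the request URL.
def render_product_page(url: str) -> str:
    catalogue = {"101": "Blue Widget", "102": "Red Widget"}  # stands in for a database
    params = parse_qs(urlparse(url).query)
    product_id = params.get("id", [""])[0]
    name = catalogue.get(product_id, "Unknown product")
    return f"<html><body><h1>{name}</h1></body></html>"

print(render_product_page("http://example.com/product.php?id=101"))
```

The same script can produce any number of different pages, one per parameter combination, which is exactly what makes these sites hard for crawlers, as discussed below.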

Problems with Dynamic Websites according to Search Engines

It is true that search engines are not good at reading dynamic web pages, but there is a solution for every problem. First you need to understand why search engines are unable to read dynamically generated websites. What stops them from reading dynamic web pages?

  1. Dynamic web pages don't physically exist on the server
  2. Dynamic websites have complex URLs full of query-string parameters (for example, “product.php?cat=shoes&id=42”)
  3. Search engine bots/crawlers usually have difficulty reading the characters “?”, “=”, “@”, “%”, “$”, “*”, “&” and “!” in URLs
  4. A search engine may treat a dynamic website as a never-ending group of links
  5. Search engine bots/crawlers might get stuck in an infinite loop, especially if the dynamic web page carries a session ID

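Point 5 is worth a quick demonstration. The sketch below (hypothetical page name and URL scheme) shows how one and the same page can surface under an endless stream of different session-ID URLs:

```python
import uuid

# Sketch of the session-ID problem: every visit yields a "new" URL,
# so a crawler sees an endless supply of links to a single document.
def session_url(page: str) -> str:
    return f"http://example.com/{page}?sessionid={uuid.uuid4().hex}"

urls = {session_url("catalog.php") for _ in range(3)}
print(len(urls))  # 3 distinct URLs, all pointing to the same page
```

A crawler has no reliable way to tell these URLs are duplicates, which is why session IDs in URLs are singled out in the tips below.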
Tips to Optimize Dynamic Websites

Now you know what stops search engine bots/crawlers from indexing your website. What you need to know next is how to keep your valuable website indexed: the more of your web pages are indexed, the better your website will perform in search engines.

  1. Create an HTML sitemap with 100 text links or fewer. If you have more than 100 links, break the sitemap into multiple pages
  2. A Google Sitemap will also be an advantage, especially if your website is big and dynamic
  3. Get inbound links deep into your website from other relevant websites such as directories, classified directories and vertical industry portals
  4. Convert dynamic web pages into static web pages with the help of URL-rewriting techniques
  5. You can use plug-in applications that change your existing dynamic URLs into static ones; for shopping carts in particular there are plenty of applications available
  6. Avoid using session IDs in the URL, especially when the user has not logged in
  7. If you do need to include parameters, limit them to two, and limit each parameter to ten characters or fewer
  8. If you have a small dynamic website and enough time, you can apply this technique: open each page of your website, copy its source code, and save it as a new static page with a .htm or .html extension
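The kind of transformation that tips 4 and 5 aim for can be sketched as follows (the URL scheme and parameter names here are hypothetical, not a real site's):

```python
import re

# Sketch of a dynamic-to-static URL mapping: expose a parameter-laden
# dynamic URL as a clean, static-looking path that crawlers index easily.
def to_static(dynamic_url: str) -> str:
    m = re.match(r".*/product\.php\?cat=(\w+)&id=(\w+)$", dynamic_url)
    if not m:
        return dynamic_url  # leave unrecognised URLs untouched
    cat, pid = m.groups()
    return f"/products/{cat}/{pid}.html"

print(to_static("http://example.com/product.php?cat=shoes&id=42"))
# /products/shoes/42.html
```

In practice this mapping is not done in application code but by the web server's rewrite engine, as described in the next section.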

URL Rewriting Techniques and Tools

A rewrite engine is a piece of web-server software that modifies URLs, for a variety of purposes, before the requested items are fetched.

Rewrite Engine for Apache HTTP server:

The Apache HTTP server has a rewrite engine called mod_rewrite, which has been described as “the Swiss Army knife of URL manipulation”.
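As an illustration only, a mod_rewrite rule in an .htaccess file for a hypothetical product URL might look like this (the path scheme and parameter names are assumptions, not part of any particular site):

```apache
RewriteEngine On
# Serve the clean URL /products/shoes/42.html from product.php?cat=shoes&id=42
RewriteRule ^products/([^/]+)/([0-9]+)\.html$ product.php?cat=$1&id=$2 [L,QSA]
```

Visitors and crawlers see only the static-looking URL, while the dynamic script continues to do the actual work behind the scenes.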

Rewrite engines for Microsoft’s Internet Information Server (IIS):

  1. IISRewrite from Qwerksoft
  2. ISAPI_Rewrite
  3. URL Replacer from Motobit
  4. Ionic’s ISAPI Rewrite Filter (IIRF) (open source) from Ionic Shade

Rewrite HttpModule for Microsoft ASP.NET:

  1. URLRewriting.NET

Rewrite engines for Java 2 Platform, Enterprise Edition (J2EE) Servlet containers (Apache Tomcat, Resin, Orion, etc.):

  1. HttpRedirectFilter (open source)
  2. UrlRewriteFilter (open source) – allows you to rewrite URLs before they reach your Servlets, JSPs, Struts actions, etc.
  3. URL Rewriter (open source – LGPL) – a tool for rewriting URLs in Java Servlets, similar to mod_rewrite


Dynamic websites are not impossible to optimize; it just takes some fine-tuning that you need to keep in mind when developing a dynamic website. If you understand the problems search engine bots/crawlers face when crawling your website, you can prepare it better, so that they can easily index your valuable website.