Recommended Workloads

Given the evolutionary nature of the Internet and of Web servers, it is hard to determine an exact workload. Some quantitative studies evaluate the load and traffic on servers, and although it is hard to emulate exact workload scenarios, you can use the Web Capacity Analysis Tool to emulate repetitious page fetches in a random, distributed fashion. It is wise to model workloads on data from many different real-world Web servers, including the HTTP proxy servers used by corporations; gathering this information from the client side of the connection (for example, from proxy server logs) is especially useful, because it represents the client end of access to HTTP servers. Determine the workload mix based on your server's anticipated use. The following is a rough outline of a recommended workload:
  1. A series of page transfers with file sizes ranging from 512 bytes through 1 MB.

  2. A weighted mix of page transfers with file sizes from 512 bytes through 1 MB, in single and multiple directories. The size of the data set present can vary, ranging from 0.5 MB to 200 MB (a load-generation sketch illustrating such a mix follows this list).

  3. A mix of static file and ISAPI transfers.

  4. A mix of static file and CGI transfers (a minimal CGI responder sketch also follows the list).

  5. A mix of static file, ISAPI, and CGI transfers.

  6. ISAPI with ODBC and database integration.

  7. SSL and Private Communication Technology (PCT) encrypted data transfers using the above combinations.

  8. Persistent-connection (HTTP keep-alive) page transfers using the above combinations.
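
The exact mechanics depend on the load tool you use, but the following sketch shows the general shape of items 2, 7, and 8: a weighted random mix of page fetches issued over a single persistent (keep-alive) HTTPS connection. It is a minimal illustration written in Python; the host name, file names, and weights are placeholders chosen for this sketch, not parameters of the Web Capacity Analysis Tool.

import random
import ssl
import http.client

HOST = "server.example.com"   # server under test (placeholder)

# Weighted mix of test pages by approximate file size; weights are hypothetical.
PAGES = [
    ("/tests/file_512b.html",  35),
    ("/tests/file_4kb.html",   30),
    ("/tests/file_64kb.html",  20),
    ("/tests/file_512kb.html", 10),
    ("/tests/file_1mb.html",    5),
]

def run(request_count=1000):
    paths, weights = zip(*PAGES)
    # One TLS connection reused for every request (HTTP keep-alive).
    conn = http.client.HTTPSConnection(HOST, context=ssl.create_default_context())
    try:
        for _ in range(request_count):
            path = random.choices(paths, weights=weights, k=1)[0]
            conn.request("GET", path)
            response = conn.getresponse()
            response.read()    # drain the body so the connection can be reused
    finally:
        conn.close()

if __name__ == "__main__":
    run()

To cover item 1 rather than item 2, replace the weighted selection with a fixed sequence of paths; to exercise keep-alive without encryption, substitute http.client.HTTPConnection for the HTTPS connection.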
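
For the CGI portion of items 4 and 5, the dynamic transfers can be as simple as a script that generates a small page on each request. The following is a minimal sketch of such a CGI responder, again in Python; the script name and the directory it is served from are assumptions, and the server under test must be configured to execute it.

#!/usr/bin/env python3
# Minimal CGI responder used as the dynamic half of a static/CGI mix.
import os

body = "<html><body><p>Generated for {}</p></body></html>".format(
    os.environ.get("REMOTE_ADDR", "unknown client"))

print("Content-Type: text/html")
print("Content-Length: {}".format(len(body)))
print()                 # blank line ends the CGI header block
print(body)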