TPC Benchmark Status
January 1999

By Kim Shanley, TPC Administrator

TPC Benchmark Status is published about every two months. The first and primary purpose of the newsletter is to keep interested parties informed about the content, issues, and schedule of the TPC's benchmark development efforts. The second purpose is to invite new members to join these important development efforts. We've already outlined most of the reasons for joining the TPC in another article, Why Join. To receive the status report by email, please click here.

TPC Benchmark FAQs (Part 1)
Q: "Why are TPC benchmarks so expensive?"

Oh, yes, I have been asked this question frequently over the years. I've been asked by users who wonder why companies don't publish TPC benchmarks on all their systems, and by commercial and magazine technical labs who want to run TPC benchmarks for their users or readers. The answer--and this will become a familiar refrain in these FAQs--is tradeoffs.

If you want to build a good, realistic benchmark, it must be complex. The TPC's goal is to build benchmarks that are representative of production-ready environments. In the TPC's area of transaction processing (at least for OLTP), this means the system defined by the benchmark must be capable of supporting hundreds and even thousands of concurrent users. It also means the system must have the kind of failure protection and backup and recovery required by real users.

In addition to the demanding technical requirements, the benchmark results must be credible. The TPC is the only benchmarking organization I know of that requires a mandatory and completely thorough audit of each and every benchmark result by an objective third party--in our case, TPC-certified auditors. This review process is backed up by a public review and challenge process: anyone can challenge a published benchmark result (documented in a full disclosure report), and a review process will be initiated to adjudicate the challenge. So, running TPC benchmarks is an expensive proposition, but there are good reasons for making the investment.

The Next Version of TPC-D Approved
Version 2.0 of TPC-D was approved by the TPC general membership on November 23, 1998. The primary changes in Version 2.0 were the following:

  • Requiring multiple simultaneous query streams, as the vast majority of real decision support systems must process query streams from more than a single user at a time.
  • A simpler single reported performance metric, making it easier to compare results from different vendors.
  • Increasing the number of queries from 17 to 22, to add additional query functionality and complexity to the workload.
  • Reducing the number of query variants from which test sponsors are allowed to choose. This increases the comparability of the various results.
  • More realistic requirements with respect to database durability.
  • More efficient parallel generation of data (using the DBGEN program) for database load.
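The official metric definitions live in the TPC-D specification itself; purely as an illustration of how a single reported figure can be composed from per-query execution times and multi-stream throughput, here is a sketch. The function names and formula details below are illustrative assumptions, not the specification's definitions:

```python
import math

def power_metric(query_times, scale_factor):
    """Illustrative power-style metric: queries per hour derived from the
    geometric mean of single-stream query execution times (in seconds),
    scaled by database size. Not the official TPC-D formula."""
    geo_mean = math.exp(sum(math.log(t) for t in query_times) / len(query_times))
    return (3600.0 / geo_mean) * scale_factor

def throughput_metric(num_streams, queries_per_stream, interval_seconds, scale_factor):
    """Illustrative throughput-style metric: total queries completed per hour
    across all concurrent query streams, scaled by database size."""
    return (num_streams * queries_per_stream * 3600.0 / interval_seconds) * scale_factor

def composite_metric(power, throughput):
    """A single reported figure taken as the geometric mean of the
    power and throughput components."""
    return math.sqrt(power * throughput)
```

Combining a single-user (power) component with a multi-stream (throughput) component in one composite is what makes the simultaneous-query-stream requirement visible in the headline number.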
TPC-D Version 1 and 2 Milestones
Realizing that preparing to run a new major version of a benchmark takes some time, the TPC extended the life of Version 1.3 for a few more months. Version 1.3 will remain in effect (companies can still publish Version 1.3 benchmark tests) until February 15, 1999. After Version 2.0 was approved on November 23, 1998, companies could begin running TPC-D against the Version 2.0 benchmark specification if they chose to (again, they can continue to publish against Version 1.3 until February 15, 1999). As of February 16, 1999, companies must publish TPC-D results against Version 2.0. Unless the timetable is modified by the TPC, Version 1.3 results will remain official TPC results for six months after their submittal date; thereafter, these results will be removed from the TPC's official results list.

TPC-D Version 3 - Planning Stage
The TPC-D Subcommittee, chaired by Susanne Englert of Compaq, is now turning its attention to developing Version 3 of TPC-D. The Subcommittee is considering the following issues:

  • New data mart workload as well as evolving the current data warehouse workload. The Subcommittee is already reviewing one working data mart benchmark submitted by a Subcommittee member.
  • Ensuring that the benchmark's workload and execution rules are derived from a clear business model.
Right now there is no timetable set for completing Version 3.

TPC-C, Version 4 Status
TPC-C, the TPC’s on-line transaction processing (OLTP) benchmark, was approved in July 1992. Since then it has passed through two major revisions, with the revision number now standing at Version 3.4. The goals of Version 4 are as follows:

  • Provide a system level benchmark that represents balanced performance.
  • Increase CPU utilization per transaction.
  • Increase the read/write ratio to better model customer implementations.
  • Reduce required disk storage to a reasonable configuration.
  • Reduce maximum number of supported users on a given platform.
  • Increase ease of benchmarking; reduce the cost of each benchmark result; reduce the time to implement.
  • Increase the value of reported data and improve readability.
TPC-C Version 4 Milestones
At the 12/1998 General Council meeting, TPC-C Subcommittee Chair John Fowler of IBM reported that the Version 4 benchmark specification is about 90% complete. Completion of the final stages of the process awaits evaluation of further prototype data. Here are the current milestones for TPC-C Version 4.0:

  • 2/1999: Company Review
  • 4/1999: Mail Ballot
  • 8/1999: V4 becomes mandatory (last month V3 test could be submitted).
Web Commerce (TPC-W) Benchmark Status
TPC-W is designed to represent any business (retail store, software distribution, airline reservation, electronic stock trading, etc.) that markets and sells over the Internet. It also represents Intranet environments that use Web-based transactions for internal operations. The benchmark will measure the performance of systems supporting users browsing, ordering, and conducting transaction-oriented business activities. For more information, see: TPC Launches New ECommerce Transactional Web Benchmark Effort.

TPC-W features three workloads:
  • Web shopping. Primary metric: web interactions per second (WIPS).
  • Web browsing. Secondary metric: web interactions per second, browsing (WIPSB).
  • Web OLTP. Secondary metric: web interactions per second, OLTP (WIPSO).
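All three metrics are rates of the same basic form: completed web interactions divided by the length of the measurement interval. As a sketch (the interaction counts and the 10-minute interval below are made-up illustration values, not TPC-W requirements):

```python
def interactions_per_second(interaction_count, interval_seconds):
    # Generic rate: completed web interactions over the measurement interval.
    return interaction_count / interval_seconds

# Hypothetical counts from three separate workload runs, each measured
# over a 600-second interval:
shopping = interactions_per_second(72000, 600)  # WIPS-style figure
browsing = interactions_per_second(90000, 600)  # WIPSB-style figure
oltp = interactions_per_second(30000, 600)      # WIPSO-style figure
```

What distinguishes WIPS, WIPSB, and WIPSO is not the arithmetic but the mix of browse, order, and OLTP interactions driven during each run.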
Angara Database Systems joined the TPC in November 1998 to participate in the TPC-W effort.

TPC-W Milestones
At the 12/1998 TPC meeting, TPC-W Chair Jerry Buggert reported that the Subcommittee was meeting twice a week via teleconference and was making steady progress on completing the benchmark. The current schedule is:

  • 4/1999: Company/public review
  • 10/1999: Mail ballot submittal
  • 12/1999: Spec approved