The "final" ISPD-2013 Benchmark Suite was released. Please look under the ISPD-2013 Contest Benchmark section for more details.
The ISPD-2013 contest presentation with results was released.
The Benchmark release notes, Contest details, Evaluation code and metrics can be found in the Download Material section.
Please cite the following paper when you refer to these benchmarks in a publication:

M. M. Ozdal, C. Amin, A. Ayupov, S. Burns, G. Wilke, and C. Zhuo, "An Improved Benchmark Suite for the ISPD-2013 Discrete Cell Sizing Contest," Proc. of ACM International Symposium on Physical Design, pp. 168-170, 2013.
Thanks again for participating in the ISPD 2013 Discrete Gate Sizing Contest!
Mar. 06, 2013 (WED): "FINAL" ISPD2013 Benchmark Suite Released: The tarball includes the benchmarks used for evaluations. Please see benchmarks and download materials below.
Feb. 20, 2013 (WED): CORRECTIONS: There are two sets of slides related to evaluations: 1) evaluation metrics, and 2) slides explaining the scripts/code for evaluation. Please take a look at them to clear up any confusion.
Feb. 13, 2013 (WED): Evaluation Program Released: The evaluation program code was released.
Feb. 8, 2013 (FRI): Contest Deadline POSTPONED: The deadline has been postponed to Feb. 24, 2013. The alpha submission deadline remains Feb. 16, 2013. Teams that do not submit an alpha executable by February 16, 2013, will be removed from the contest.
Jan. 30, 2013 (WED): Second Evaluation Metric Updated: The second evaluation metric has been updated with a minor change.
Jan. 29, 2013 (TUE): Benchmarks Updated: The benchmarks have been updated with two bug fixes, addressing net-name mismatches and inconsistent CAP numbers in the SPEF files. The release notes have also been updated.
Jan. 25, 2013 (FRI): Alpha Submission Rule: In order to continue in the contest, contestants must complete at least one alpha submission during the alpha submission period (from 01/27/2013 to 02/16/2013). Teams that do not submit an alpha executable by February 16, 2013, will be removed from the contest.
Jan. 23, 2013 (WED): The list of supported libraries has been updated.
Jan. 22, 2013 (TUE): Evaluation Slides Released: The final submission deadline is February 20th, and you must submit an alpha binary by February 16. Some changes were made from last year's contest to this year's; please pay attention to: (1) the runtime formula was changed, and runtimes were slightly reduced; (2) the alpha submission testing process.
Jan. 22, 2013 (TUE): The ISPD contest results and details will be presented at the TAU/ISPD joint session on Mar. 27 (WED), 2013.
Jan. 9, 2013 (WED): Four New Benchmarks Released: We have released four new benchmarks in ispd2013.tgz. Please see benchmarks and download materials below.
Dec. 3, 2012 (MON): Large Benchmark Released: We have released the netcard benchmark as a separate zipped file due to its large size. Please see benchmarks and download materials below.
Nov. 26, 2012 (MON): Registration Clarifications - a) You may not register with more than one team unless you are faculty/professor/advisor for those teams. b) You may not submit more than one tool per team.
Nov. 19, 2012 (MON): Contest details have been updated and benchmarks have been released.
Oct. 8, 2012 (MON): The website has been updated with the contest details. We reserve the right to make changes to the contest rules or benchmarks in the future. It is your responsibility to check the contest announcements on the web page until the submission deadline.
Oct. 8, 2012 (MON): The detailed description of the ISPD 2013 contest is ready. Please read the file "ISPD_2013_Contest_Details" under "Contest Details". It contains important information regarding the contest.
Oct. 8, 2012 (MON): Call-for-participation (CFP) is ready:
Call for Participation
-----------------------------------------------------------------
Start Date: October 8th, 2012
Registration Deadline: December 14th, 2012
-----------------------------------------------------------------
More details can be found at CFP (pdf or txt).
Name | Description | Last Update |
Contest Final Results | Contest final results | Mar. 27, 2013 |
Final ISPD2013 Benchmark Suite | Final benchmark suite | Mar. 06, 2013 |
Evaluation Metrics | Contest evaluation rules and details | Feb. 20, 2013 |
Slides Explaining Evaluation Scripts | Contest evaluation rules and details | Feb. 20, 2013 |
Evaluation Program | Contest evaluation program code | Feb. 13, 2013 |
Benchmark Release Notes | Release notes | Jan. 29, 2013 |
Netcard Benchmark | A large benchmark (zipped file size is ~239MB) | Jan. 29, 2013 |
Benchmarks | The sample benchmarks (does not contain Netcard) | Jan. 29, 2013 |
Contest Details | The detailed description of the ISPD 2013 contest | Nov. 19, 2012 |
The detailed description of the ISPD 2013 contest can be found in this file: ISPD_2013_Contest_Details.pdf. The file covers: (1) submission information; (2) contest evaluation; (3) benchmark files & scripts; and (4) sizer/timer interaction. Teams are advised to read this file, as it contains important information regarding the benchmark suite and utility scripts.
The ISPD 2013 Discrete Gate Sizing Contest uses realistic benchmarks to evaluate all contest submissions. Each benchmark consists of a Verilog netlist, a constraint file, a parasitics (SPEF) file, and the cell library. Please read ISPD_2013_Contest_Details.pdf and ISPD_2013_Benchmark_Notes.pdf for benchmark details.
Sample benchmarks and related parsers for the ISPD 2013 contest can be found here.
Final ISPD2013 benchmarks used for evaluations can be found here.
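As a purely illustrative sketch (not the contest's parsers or file formats), here is how a sizer might represent a parsed benchmark in C++. Every structure and field name below is an assumption made for this example; the official formats are described in the documents above.

```cpp
#include <cstddef>
#include <string>
#include <vector>

struct LibCell {            // one size option from the cell library
    std::string name;       // e.g. a hypothetical "NAND2_X1"
    double leakage_power;   // leakage, a typical quality measure for sizing
    double input_cap;       // input capacitance (simplified to one value)
};

struct Instance {           // one gate from the Verilog netlist
    std::string name;
    std::size_t lib_cell;                  // currently assigned size (library index)
    std::vector<std::size_t> size_options; // legal alternatives in the library
};

struct Net {                // one net, with parasitics lumped from the SPEF
    std::string name;
    double total_cap;       // simplified: a single lumped capacitance
    std::size_t driver;                // index of the driving instance
    std::vector<std::size_t> sinks;    // indices of the driven instances
};

struct Design {
    std::vector<LibCell>  library;
    std::vector<Instance> instances;
    std::vector<Net>      nets;
    double clock_period;    // from the constraint file
};

// Total leakage of the current sizing solution: a sum over instances.
double totalLeakage(const Design& d) {
    double sum = 0.0;
    for (const Instance& g : d.instances)
        sum += d.library[g.lib_cell].leakage_power;
    return sum;
}
```

With a representation like this, a sizing move is simply a change of an instance's lib_cell index, after which power and timing are re-evaluated.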
Please send emails to ispd2013contest@gmail.com.
Please add "ISPD2013" to your subject line to get a quick response from the contest administrator!
Oct. 8, 2012 | Official start date. The sample benchmark was released. |
Dec. 14, 2012 | Registration deadline. For registration or any other inquiries, please send emails to ispd2013contest@gmail.com. For registration, please send an email including the following information: affiliation of the team/contestant, names of team members, Synopsys PrimeTime* license availability, and one email address as the corresponding address of the team. |
Feb. 24, 2013 | Receive submissions from all teams |
Mar. 24-27, 2013 | ISPD 2013, announce contest results |
Please try to submit a static binary as it helps with portability.
Each team is allowed to submit a single binary that should run on all benchmarks.
The submitted binary may be either single-threaded or multi-threaded.
In order to continue in the contest, contestants must complete at least one alpha submission during the alpha submission period (from 01/27/2013 to 02/16/2013). Please check the evaluation metrics for details. Teams that do not submit an alpha executable by February 16, 2013, will be removed from the contest.
No pre-computed information can be used to influence the current run. The run directory will be cleaned prior to each run.
The officially supported programming language will be C/C++. For other languages, please check with the contest organizers first.
The library for the contest is the standard C/C++ library. For any parallel library or third party public domain library, please check with the contest organizers first.
You are allowed to implement your own timer, but the final timing evaluation will be done using Synopsys PrimeTime*.
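Since you may implement your own timer for internal use, the following heavily simplified C++ sketch shows the core of one: propagating arrival times through a combinational DAG in topological order. It assumes a single lumped delay per gate and ignores slews, per-arc library delays, and interconnect, all of which PrimeTime models; the Gate structure and worstArrival function are illustrative assumptions, not the official evaluation flow.

```cpp
#include <algorithm>
#include <cstddef>
#include <queue>
#include <vector>

struct Gate {
    double delay = 0.0;                 // assumed lumped gate delay
    std::vector<std::size_t> fanout;    // indices of downstream gates
    std::size_t fanin_count = 0;        // number of upstream gates
};

// Returns the worst (latest) arrival time at any gate output.
double worstArrival(const std::vector<Gate>& gates) {
    std::vector<double> arrival(gates.size(), 0.0);
    std::vector<std::size_t> pending(gates.size());
    std::queue<std::size_t> ready;
    for (std::size_t i = 0; i < gates.size(); ++i) {
        pending[i] = gates[i].fanin_count;
        if (pending[i] == 0) ready.push(i);   // gates fed only by primary inputs
    }
    double worst = 0.0;
    while (!ready.empty()) {                  // Kahn-style topological sweep
        std::size_t g = ready.front(); ready.pop();
        arrival[g] += gates[g].delay;         // output arrival = latest input + delay
        worst = std::max(worst, arrival[g]);
        for (std::size_t s : gates[g].fanout) {
            arrival[s] = std::max(arrival[s], arrival[g]); // latest input wins
            if (--pending[s] == 0) ready.push(s);
        }
    }
    return worst;
}
```

Comparing the returned worst arrival time against the clock period from the constraint file gives a crude feasibility check between expensive full-accuracy timing runs.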
System Specification:
* Other names and brands may be claimed as the property of others.
There will be two separate rankings.
Primary ranking: Solution quality will be the main metric. Runtime will be used for tie-breaking.
Secondary ranking: Both solution quality and runtime will be important. Multi-core implementations are encouraged!
The details can be found in the evaluation metrics slides. The final submission deadline is February 24th, and you must submit an alpha binary by February 16. Some changes were made from last year's contest to this year's; please pay attention to: (1) the runtime formula was changed, and runtimes were slightly reduced; (2) the alpha submission testing process.
Evaluations will be run on a machine with the following configuration. The evaluation program can be found here and is explained in evaluation scripts slides.
Below is the list of C/C++ libraries allowed and supported by the contest organizers. If you wish to use a library not listed below, please contact the contest organizers first. Supported libraries:
For the contest announcement and call for participation, please see pdf or txt.