ROADEF'2003 Challenge: How to participate?
Summary
- The scenarios provided by ONERA-CNES
- Information to be provided by the participants
- The testbed environments
- The evaluation and ranking procedure
- Timetable (updated on January 29th, 2003)
The scenarios provided by ONERA-CNES
Three scenario databases will be provided; they give the candidates the opportunity to tune their programs:
- TestSet A: available since the beginning of this challenge (the data instances are here). These data are used by the jury to select the finalists. This is the qualification TestSet.
- TestSet B: will be provided to the finalists to help them better tune their programs for TestSet X.
- TestSet X: will be used together with TestSet A to rank the finalists at the ROADEF'2003 conference. This is the final TestSet; it will remain unknown to the finalists until the end of the challenge.
Information to be provided by the participants
Participants in the ROADEF Challenge must provide the following information to ONERA-CNES:
- description of the team:
- name(s)
- position(s) (student, post doc, professor, private sector employee,...)
- affiliation
- short description of the team and their past experience
- a description of the methods used, with motivation and references (fewer than 6 pages), including any commercial software and programming languages used
- a table with the best solutions obtained within the given time, i.e. 300 seconds of CPU time (on a machine similar to the Sun-Blade-1000 described hereafter)
- the solution and result files corresponding to the different instances, to be provided in ASCII format.
- A program which will be used to rank the participants. Either:
- the C or C++ source code, including compilation instructions and the associated makefile allowing ONERA-CNES to create an executable, or
- an executable that can be run directly on one of the two machines described by ONERA-CNES below.
Organization of a participant's file hierarchy:
For a participant indexed NN, the file hierarchy is:
- main directory: Candidate-NN/
- sub-directories:
- Candidate-NN/Team-description/
- Candidate-NN/Method-description/
- Candidate-NN/Result-synthesis/
- Candidate-NN/Instances/
- Candidate-NN/Solutions/
- Candidate-NN/Results/
- Candidate-NN/Program/
Whether furnished as an executable or generated from source, the program must be runnable from the command line by typing, in the directory Candidate-NN/Program/:
executable-name instance-name -t cpu-time
where:
- instance-name: the scenario name
- cpu-time: the run time in seconds, limited to 300
The program should look for the data file instance-name in the directory Candidate-NN/Instances/ and write the solution obtained within cpu-time to the directory Candidate-NN/Solutions/.
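For illustration only, here is a minimal C sketch of this calling convention. The relative paths, the solution file name (instance-name.sol), and the argument checking are assumptions; only the command line and the two directories above are specified by the organizers.

/* Minimal sketch of the calling convention:
 *   executable-name instance-name -t cpu-time
 * run from Candidate-NN/Program/. Paths and the solution file name are
 * illustrative assumptions, not part of the official specification. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char *argv[])
{
    if (argc != 4 || strcmp(argv[2], "-t") != 0) {
        fprintf(stderr, "usage: %s instance-name -t cpu-time\n", argv[0]);
        return EXIT_FAILURE;
    }
    const char *instance = argv[1];
    int cpu_time = atoi(argv[3]);             /* limited to 300 seconds */
    if (cpu_time <= 0 || cpu_time > 300)
        cpu_time = 300;

    char in_path[512], out_path[512];
    snprintf(in_path,  sizeof in_path,  "../Instances/%s", instance);
    snprintf(out_path, sizeof out_path, "../Solutions/%s.sol", instance);

    FILE *in = fopen(in_path, "r");
    if (!in) {
        fprintf(stderr, "cannot open %s\n", in_path);
        return EXIT_FAILURE;
    }
    /* ... read the scenario, run the optimization for at most cpu_time
       seconds, then keep the best solution found ... */
    fclose(in);

    FILE *out = fopen(out_path, "w");
    if (!out) {
        fprintf(stderr, "cannot write %s\n", out_path);
        return EXIT_FAILURE;
    }
    /* write the best solution in the required ASCII format here */
    fclose(out);
    return EXIT_SUCCESS;
}

For example, a program named my-solver (a hypothetical name) would be launched from Candidate-NN/Program/ as: my-solver instance-name -t 300.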
Participants could be excluded for the following reasons:
- exceeding the deadline
- incomplete application (any of the required elements missing)
- code that cannot be evaluated (compilation problems, data format problems, ...)
If a participant wishes, and only with his or her agreement, his or her code can be made accessible on the Internet.
The testbed environments
The programs will be tested on either:
- a Sun-Blade-1000 workstation (the reference machine), 750 MHz, 512 MB RAM, equipped with Unix SunOS 5.8 and gcc version 3.0 for C/C++.
- a PC Pentium MMX, 233 MHz, 256 MB RAM, equipped with Linux RedHat 7.2 (Enigma) EDT 2000 version (kernel 2.4.7-10) and gcc version 3.0 for C/C++.
The jury will take into account the relative speeds of these machines.
The evaluation and ranking procedure
There are two categories:
- Senior category: all the participants are grouped in this category.
- Junior category: limited to student projects selected by the jury (a project is a student project if the majority of participants are students, possibly guided by their professors).
Remark: a student project can win the Senior category prize if it is the best of all submitted projects. Conversely, a Senior project cannot win the Junior category prize.
Important points:
- The evaluation program provided by ONERA-CNES will basically be a shell script. For each data instance, this script will call the program of a participant, let it run 300 seconds, look for the solution obtained in the directory Candidate-NN/Solutions/, check and evaluate it, and write the related cost in the directory Candidate-NN/Results/.
- For non-deterministic programs, 10 runs per instance are required in order to compute the mean, the median, and the standard deviation (see the sketch after this list).
- The 300 seconds of CPU time correspond to operational constraints, and the only evaluation criterion is the solution obtained after this time has been spent.
- Based on the results on TestSet A, at most ten participants will qualify for the final.
- The finalists' programs will be evaluated only on TestSets A and X; TestSet B is provided only to help the finalists tune their programs.
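As an illustration of these statistics, the following C sketch computes the mean, the median, and the standard deviation of the costs of 10 runs on one instance. The cost values are placeholders, and the use of the population (rather than sample) standard deviation is an assumption.

/* Sketch: mean, median and standard deviation over 10 runs of a
 * non-deterministic program on one instance. The cost values below
 * are placeholders, not real challenge results. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

static int cmp_double(const void *a, const void *b)
{
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

int main(void)
{
    double cost[10] = { /* the 10 run costs for one instance go here */
        101.0, 98.5, 103.2, 99.9, 100.4, 97.8, 102.1, 100.0, 99.1, 101.7 };
    const int n = 10;

    double sum = 0.0;
    for (int i = 0; i < n; i++) sum += cost[i];
    double mean = sum / n;

    qsort(cost, n, sizeof cost[0], cmp_double);
    double median = (cost[n / 2 - 1] + cost[n / 2]) / 2.0;   /* n is even */

    double var = 0.0;
    for (int i = 0; i < n; i++) var += (cost[i] - mean) * (cost[i] - mean);
    double stddev = sqrt(var / n);    /* population standard deviation */

    printf("mean=%.3f median=%.3f stddev=%.3f\n", mean, median, stddev);
    return 0;
}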
Timetable
- 08/04/02: Beginning of the qualification stage
- Problems available on the WEB.
- 01/07/02: extended to September 30th, 2002
- Application deadline of the candidates.
- Please send your complete affiliations to Van-Dat CUNG.
- 18/11/02: End of the qualification stage
- Deadline for participant submissions with results and programs on TestSet A.
- 18/12/02: Beginning of the final stage postponed to January 15th, 2003
- Qualification stage results announced.
- Selection of the finalists who will be invited to present their works at the ROADEF'2003 conference.
- TestSet B problems will be provided to the finalists to tune their programs.
- 20/01/03: postponed to January 31st, 2003, then further delayed to February 3rd, 2003, midnight.
- Finalists who so wish may send an improved version of their programs.
- ONERA-CNES will test the finalists' programs on the TestSet X instances.
- February 26-28th, 2003, at the ROADEF'2003 conference:
- Announcement of the final results on TestSets A and X.
- The winner in each of the two categories will receive a prize awarded by the ROADEF.