Organizer's Guide



For a brief overview of challenge organization, see Quick Tutorial.

What is a Challenge?

A challenge is an online data mining competition run on the TunedIT website. It can be launched by any registered user of TunedIT - the organizer - who defines the task, timetable and rules of participation. After the challenge opens, users of TunedIT may register as participants and submit solutions, which are then evaluated by the organizer using TunedTester. Challenges are interactive: preliminary results are published immediately on the leaderboard, so participants know their standings while the challenge is still open and have a chance to improve their solutions.


Challenge lifetime

A challenge can be in one of three states with respect to its functionality:

  • New - partially configured; does not yet accept participant registrations or submissions.
  • Open - fully configured; accepts registrations and submissions, and solutions undergo evaluation.
  • Closed - registrations and submissions are no longer accepted, but pending evaluations may still be executed. When all evaluations are completed, final results are published on the Leaderboard.

Additionally, the challenge can be either:

  • Draft - visible on the Challenges list to no one but the organizer, who can perform dry-run tests before publicly disclosing the challenge; or
  • Published - visible to everyone on the Challenges list.

Thus, there are six different states in total. Transitions between them are invoked either manually by the organizer (publishing and revoking publication) or automatically by TunedIT when the Start or End dates defined by the organizer pass. All possible states and transitions are shown below:



A new challenge can be created on the New Challenge page. After creation it is empty - a New Draft - and requires configuration before it can be published and opened.

As the challenge owner, you have access to multiple settings that can be tuned to your needs. Most of them are located in the place where they are needed or displayed to participants. For example, the registration code can be changed on the Register subpage, and the start/end dates on the Fact sheet (the block of basic facts located below the side menu). Some of the settings are grouped on separate administration pages, like Evaluation Settings. A summary of all the settings and their current values can be found on the Checklist subpage.


Typically, you will configure most of the settings before publishing and opening the challenge, while it is still in the New Draft state. However, it is sometimes necessary to correct a setting later on, when the challenge is already published or open. TunedIT allows such modifications, as long as they do not break the consistency of the challenge.

Below is a list of the available settings, their locations on the web pages, and the states in which each is editable:

  • Title - Human-readable title of the challenge. (Editable: Draft)
  • Name - Computer-readable name of the challenge. You will pass it to TunedTester to identify the challenge for solution evaluation purposes; it is also used to build the URL of the challenge web pages. (Editable: New Draft)
  • Folder - Location in the Repository where challenge files - datasets, evaluation procedures, submitted solutions - will be stored. You may also upload any other files there manually, through the Repository pages. (Editable: New Draft)
  • Start - The date when registration of participants and submission of solutions becomes open. If you wish, you may set a date in the past, but not earlier than the date when the challenge was created. (Editable: Draft or New Published)
  • End - The date when registration of new participants and submission of solutions becomes closed and final results are published. The hour is always 23:59:59. (Editable: Draft, New Published or Open Published)
  • Overview - General description of the challenge: purpose, organizers, schedule, awards. May contain formatting and links to other pages. (Editable: anytime)
  • Task - Detailed description of the task: input data, expected output, format of the solution, method of evaluation. (Editable: anytime)
  • Evaluation procedure - The evaluation procedure(s) you wish to use to assess solutions submitted by participants. Can be distinct for preliminary and final tests. Read more... (Editable: New or Open)
  • Datasets - The dataset(s) you wish to use to evaluate solutions. Can be distinct for preliminary and final tests. Read more... (Editable: New or Open)
  • Timeout - Time limit for solution evaluation. If evaluation of a given solution lasts longer, it is terminated with a timeout result. (Editable: New)
  • Precision - Number of digits after the decimal point displayed on the Leaderboard. May influence the ordering of solutions. (Editable: New or Open)
  • Remember best - Which of a participant's solutions should be the active one, presented on the Leaderboard and considered for final evaluation: the best or the most recent. (Editable: New or Open)
  • Limit submissions - Maximum number of solutions a participant may submit over the whole duration of the challenge. If unchecked or zero, there is no limit. (Editable: New or Open)
  • Registration code - The secret code every user has to enter when registering for the challenge. Used for security purposes, to limit access to the challenge. (Editable: New)
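To see why the Precision setting can influence ordering, consider two scores that differ only in the fourth decimal place: with too few displayed digits they appear tied. The sketch below uses printf rounding with invented score values.

```shell
# Two hypothetical participant scores (invented values)
a=0.91321
b=0.91317
# With Precision = 3, both round to the same displayed value - an apparent tie:
printf '%.3f %.3f\n' "$a" "$b"    # prints: 0.913 0.913
# With Precision = 5, the true ordering becomes visible:
printf '%.5f %.5f\n' "$a" "$b"    # prints: 0.91321 0.91317
```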

Dataset files and JARs with evaluation procedures are put into the challenge folder in the Repository as private resources, accessible only by the challenge owner.


See Data Files for manual configuration of datasets or Data Wizard for automatic configuration.

Evaluation procedure

See the Evaluation Settings page.

Public resources

Apart from the resources required for evaluation - evaluation procedure(s) and test dataset(s) - you may upload any other files to the challenge folder and set their access rights as public or private, according to your needs. You may also create subfolders. This can be done directly on the Repository page of the challenge folder - just click the folder name shown on the fact sheet and you will be redirected there.

Additionally, if you want to distribute files that should be accessible only to registered participants and no one else, like training datasets, put them into the public subfolder of the challenge folder and leave their access rights as restricted - these special rights mean that only participants registered in the challenge can access the files.


If you wish to include images in Overview and Task pages, use bbcode "img" tags: [img]http://....[/img].

If the image doesn't have a URL yet because it is stored only on your local computer, you can upload it to the Repository and use its Repository URL in the page text. Keep in mind that:

  1. You must use the "download" address of the file, not the regular "repo" address (which links to the file's description page, not to the image itself).
  2. Watch the access rights. Typically, you will set public access on image files; otherwise they won't be visible to other users. In special cases you may wish to upload images to the public subfolder and apply restricted rights, so that the images are visible to registered participants only.
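As an illustration, an "img" tag built from a hypothetical download URL might look like the following (the path and URL shape are made up - copy the actual "download" link from the file's Repository page):

```
[img]http://tunedit.org/download?path=/MyChallenge/public/logo.png[/img]
```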

Dry-run Tests

Before publication of the challenge, you may wish to check if submission and evaluation of solutions works as expected. To do this, you should:

  • Open the challenge by setting the Start date in the past
  • Register yourself as a participant
  • Create and submit an example solution
  • Check whether the correct result appears on the Leaderboard. Do not forget to start TunedTester for the challenge beforehand.

As long as the challenge is not published, you can open and close it multiple times by changing the Start and End dates, and you can change all the settings. It can also be deleted.

When you are done with testing, set the Start date to a date in the future; this resets the challenge to a clean, empty state and removes all submissions. Then you can publish it.

Evaluation of Solutions

To evaluate solutions submitted by participants, download and unzip the TunedTester application, then open a console and run the following command in the TunedTester directory:

  tunedtester.bat -u YourUsername -p YourPassword -c YourChallengeName      (on Windows)


  ./ -u YourUsername -p YourPassword -c YourChallengeName      (on Linux)

That's all!

From now on, TunedTester will work in a loop: it will automatically query the server for new submissions, download them, evaluate them, and send the results back to TunedIT to be published on the Leaderboard. It will only stop if you make it do so by pressing Ctrl+C. The specification of the test - which evaluation procedure and dataset to use, and what the time limit is - is taken from the server automatically as well.

The same TT instance can execute both preliminary and final tests, without additional configuration. By default, final tests begin one week before the end of the challenge. Their results are gathered on the server and published only when the challenge is finished.

Note that at least one instance of TT should be running at all times during the contest. If it is stopped, new solutions will remain "not tested yet" until execution is resumed or another instance of TT is started. To start TT on a remote machine, you may need an application like screen (see the man page and tutorial), which keeps your software running after you log out.
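For instance, TT could be kept alive on a remote machine inside a detached screen session (the session name and script path below are illustrative; substitute the actual TunedTester command and your own credentials):

```shell
# Start TunedTester inside a detached screen session named "tt"
# (replace ./tunedtester.sh with the actual TunedTester command)
screen -dmS tt ./tunedtester.sh -u YourUsername -p YourPassword -c YourChallengeName
# Later, re-attach to inspect its output:
screen -r tt
# Detach again with Ctrl+A then D - TunedTester keeps running after you log out.
```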

You can start more than one instance of TT, on multiple machines - for redundancy (evaluation continues when one of the machines fails) and for speed-up (when evaluation is time-consuming, several tests can be executed in parallel).

Publishing, Opening & Closing

When you are ready to disclose the challenge publicly, click the Publish button located below the side menu. The challenge does not have to be fully configured yet, but the Overview must be filled in and the Start and End dates must be set. The challenge title will then become visible to everybody on the Challenges list.

Opening and closing are done automatically by TunedIT when the Start or End dates pass (the automatic check runs 1 minute after every full hour). If the challenge is not yet fully configured when the Start date passes, the Start date is cleared to undefined and you receive an e-mail notification that opening was not possible.

After the end, when all final results are calculated, you will be asked to review and accept them. Only then, after your explicit acceptance, will they appear publicly on the Leaderboard (beforehand, they are visible only to you). If you wish, you may also create a Summary page and publish it after the final results are disclosed - see the ICDM Summary for an example of what can be put there.


Baseline. Submit a Baseline solution when the contest starts. Use your own account for this purpose and set the team name to "Baseline" or whatever you like.

Forum. Contact us if you want a discussion forum to be created for your challenge. After the forum is created, it's a good idea to write an invitation post, like here or here. You can also "subscribe to forum" (link at the bottom of the page) to receive e-mail notifications about new threads and posts.


See the Examples page.