Every file and folder in the Repository may have an associated description, visible on the Repository page of the given resource. The description can be modified only by the owner.
TunedTester can be downloaded from this page.
TunedTester runs tests of algorithms according to test specifications that you provide. A specification is composed of the TunedIT resource names of an evaluation procedure, a dataset and an algorithm that should be used to set up the test. You can give several test specifications at once by listing a number of datasets and/or algorithms in the text areas of the TunedTester window. In that case, TunedTester will run tests for all possible combinations of the given items.
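The "all possible combinations" behaviour can be sketched as a Cartesian product of the listed datasets and algorithms. The resource names below are hypothetical examples, not necessarily real Repository entries:

```python
from itertools import product

# Hypothetical resource names; actual names come from the TunedIT Repository.
evaluation_procedure = "ClassificationTT70"
datasets = ["UCI/iris.arff", "UCI/glass.arff"]
algorithms = ["Weka/J48", "Weka/NaiveBayes"]

# TunedTester runs one test for every (dataset, algorithm) combination,
# so 2 datasets x 2 algorithms yields 4 test specifications.
specs = [(evaluation_procedure, d, a) for d, a in product(datasets, algorithms)]
for spec in specs:
    print(spec)
```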
In order to download the necessary resources from the Repository or send results to the Knowledge Base, you must be authenticated by the TunedIT server. For this purpose, enter your username and password in the TunedTester window before starting test execution.
TunedTester creates a cache folder in the local file system to keep copies of resources downloaded from the Repository. This folder may become large at some point and require manual cleaning. To do this, simply remove the folder with all its contents; it will be automatically recreated, empty, on the next run of TunedTester. The cache folder is named tunedit-cache and is located in the user's home directory.
Knowledge Base page
The KB page shows aggregated results of the tests collected in KB. In the Filters section you can specify which results to view by defining a pattern that the test specifications of the results must match. The pattern is built as a conjunction of patterns for each part of the test specification: the names of the algorithm, dataset and evaluation procedure. An empty pattern matches all possible names. After the filters are defined, press "Show Results" to download the matching results from the TunedIT server. Please be patient, as this operation may take a few seconds. Once downloaded, the results are presented in the Results section, where you can manipulate them and change how they are presented without downloading them again.
Important: the exact meaning of "Mean Result" depends on which evaluation procedure was used. A result value can be interpreted either as a gain or a loss, so for some evaluation procedures a bigger value indicates higher quality of the algorithm, while for others a lower value does. For instance, ClassificationTT70 measures the classification accuracy of an algorithm, interpreted as a gain, while RegressionTT70 calculates Root Mean Squared Error (RMSE), interpreted as a loss. These differences must be taken into account when analysing test results. To find out how the results should be interpreted for a given evaluation procedure, it is best to go to its Repository page and read the description.
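The gain/loss distinction matters as soon as you compare algorithms programmatically. The sketch below illustrates the idea; the direction mapping is built only from the two procedures named above, and the result values are invented:

```python
# Whether a higher or lower "Mean Result" is better depends on the
# evaluation procedure. Only the two procedures mentioned in the text
# are listed here; check each procedure's Repository page for the rest.
HIGHER_IS_BETTER = {
    "ClassificationTT70": True,   # classification accuracy: gain
    "RegressionTT70": False,      # RMSE: loss
}

def best_algorithm(results, procedure):
    """Pick the best algorithm from (algorithm, mean_result) pairs."""
    pick = max if HIGHER_IS_BETTER[procedure] else min
    return pick(results, key=lambda r: r[1])[0]
```

For accuracy the highest value wins; for RMSE the lowest value wins, even though both appear in the same "Mean Result" column.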
If the "exact match" check box is on, pattern matching is case-sensitive, so watch the case of letters carefully.
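Putting the two rules together, a filter behaves roughly like the sketch below. The exact matching semantics (full-name comparison, case-insensitive unless "exact match" is on) are an assumption for illustration; only the conjunction and the empty-pattern rule come from the text:

```python
def part_matches(pattern: str, name: str, exact: bool = False) -> bool:
    # An empty pattern matches any name. With "exact match" off, matching
    # is assumed to be case-insensitive (assumed semantics, for illustration).
    if not pattern:
        return True
    return pattern == name if exact else pattern.lower() == name.lower()

def spec_matches(filters, spec, exact: bool = False) -> bool:
    # The overall filter is a conjunction over the algorithm, dataset
    # and evaluation-procedure patterns.
    return all(part_matches(p, n, exact) for p, n in zip(filters, spec))
```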
The chart presents the mean results of all algorithms that were tested on the selected dataset using the selected evaluation procedure. You can choose another dataset and evaluation procedure using the drop-down lists located above the chart. If you place the mouse over a bar on the chart, a tooltip appears in the upper-left corner of the window, displaying detailed information about the selected test.
Meaning of the columns in the result table:
- Evaluation procedure, Dataset, Algorithm: the specification of the tests whose aggregated result is presented in a given row.
- Support: the number of atomic results stored in KB for a given test specification and contributing to the presented aggregated result. Only tests that completed correctly (without error) are counted.
- Mean Result: the aggregated result - the arithmetic average of the atomic results.
- Std Dev: the aggregated result - the standard deviation of the atomic results.
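The three numeric columns can be reproduced from the atomic results. The values below are invented, and the sketch assumes the sample standard deviation; KB may use the population formula instead:

```python
from statistics import mean, stdev

# Hypothetical atomic results for one test specification, e.g. accuracies
# from repeated test runs stored in KB.
atomic = [0.94, 0.91, 0.96, 0.93]

support = len(atomic)        # "Support": count of error-free atomic results
mean_result = mean(atomic)   # "Mean Result": arithmetic average
std_dev = stdev(atomic)      # "Std Dev": sample standard deviation (assumed)
```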
The names of evaluation procedures, algorithms and datasets are hyperlinks leading to the Repository pages of the resources, so you can click a name to see all the details of a given resource.
You can sort result tables by any column, in ascending or descending order, by clicking the header of the chosen column.
You can download the results as CSV files for off-line analysis by clicking the [download as CSV] link located right above the result table.