Evaluation Metrics (Open Competition)

The open competition submission is used to determine the 3 finalist teams.

  • The most important evaluation criterion is the average Levenshtein distance (LD) obtained by testing the submitted executable on data from unknown writers. For each writer, your executable receives 5 labelled adaptation equations to adapt to before evaluation.
  • If the difference in LD between teams is below 0.1, the quality of the written report and of the source code are also taken into consideration.
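For reference, the LD between a recognized label sequence and its ground truth is the standard edit distance (insertions, deletions, substitutions), and the ranking criterion averages it over all test samples. A minimal sketch (the function names here are ours, not part of the evaluation harness):

```python
def levenshtein(a, b):
    """Edit distance between sequences a and b via dynamic programming."""
    prev = list(range(len(b) + 1))  # distances from a[:0] to every prefix of b
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (ca != cb)))  # substitution (0 if equal)
        prev = curr
    return prev[-1]

def average_ld(pairs):
    """Average LD over (recognized, ground_truth) label pairs."""
    return sum(levenshtein(r, t) for r, t in pairs) / len(pairs)
```

For example, `levenshtein("kitten", "sitting")` is 3 (two substitutions and one insertion).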

What to Submit

  • An executable of your recognizer
  • Source code necessary for preprocessing, training and testing your model (along with a common open source license of your choice). Please provide documentation.
  • A short written report (PDF, 3-6 pages) describing the algorithms that your team used.

Format of the Submitted Executable

The format will be announced after discussion with the participating teams. We will release a test script in time for you to make sure your submission is compatible.