Submissions

Evaluation Metrics

Submissions to the open competition are used to determine the three finalist teams. The same metrics are used in the final competition.

  • The most important evaluation criterion is the average Levenshtein distance (LD) obtained by testing the submitted executable with data from unknown writers. For each writer, your executable receives 5 labelled adaptation equations to adapt to before evaluation (a sketch of the metric follows this list).
  • If the difference in LD between teams is below 0.1, the quality of the written report and of the source code is taken into consideration.
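
For reference, the Levenshtein distance between two strings is the minimum number of single-character insertions, deletions, and substitutions needed to turn one into the other. The following Python sketch shows how the average LD over a set of test equations could be computed; it illustrates the metric only and is not the organisers' evaluation script, and the function names are our own.

    def levenshtein(a: str, b: str) -> int:
        """Minimum number of insertions, deletions and substitutions
        turning string a into string b (classic dynamic programming)."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, start=1):
            curr = [i]
            for j, cb in enumerate(b, start=1):
                cost = 0 if ca == cb else 1
                curr.append(min(prev[j] + 1,          # deletion
                                curr[j - 1] + 1,      # insertion
                                prev[j - 1] + cost))  # substitution
            prev = curr
        return prev[-1]

    def average_ld(predictions: list[str], references: list[str]) -> float:
        """Mean Levenshtein distance over all test equations."""
        assert len(predictions) == len(references)
        return sum(levenshtein(p, r)
                   for p, r in zip(predictions, references)) / len(predictions)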

You can download the submission template, including an evaluation script, here.
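
To make the per-writer adaptation protocol concrete, here is a hypothetical outline of what such an evaluation loop might look like. It reuses the levenshtein function from the sketch above; the Writer container and the reset/adapt/recognize interface are assumptions for illustration, not the API of the actual evaluation script in the template.

    from dataclasses import dataclass

    @dataclass
    class Writer:
        # 5 labelled equations handed to the system for adaptation
        adaptation_pairs: list[tuple[str, str]]  # (ink, label)
        # unseen equations from the same writer, used for scoring
        test_pairs: list[tuple[str, str]]

    def evaluate(recognizer, writers: list[Writer]) -> float:
        distances = []
        for writer in writers:
            recognizer.reset()                # back to the writer-independent model
            for ink, label in writer.adaptation_pairs:
                recognizer.adapt(ink, label)  # adapt on the 5 labelled equations
            for ink, label in writer.test_pairs:
                prediction = recognizer.recognize(ink)
                distances.append(levenshtein(prediction, label))
        return sum(distances) / len(distances)  # average LD used for ranking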

What to submit?

  • The generated Docker image (.tar)
  • All source code for preprocessing, training, adaptation and deployment
  • A common open-source license of your choice
  • Documentation
  • A short written report (3-6 pages, PDF) describing the algorithms your team used