Final Competition Results
The final ranking was released during the Ubicomp conference on 25 September 2021 (10 am CEST, 5 pm JST). The finalists also presented their methods in that session.
The final ranking:
| Open Competition Rank | Team | Prize | Avg. Levenshtein Distance |
| --- | --- | --- | --- |
Open Competition Results
We saw a wide variety of algorithms and approaches. Thanks to all teams that submitted!
The submissions were evaluated on equations written by multiple writers who were not part of the training set (writer-independent evaluation).
The 3 teams advancing to the final round are:
| Open Competition Rank | Team | Nationality | Avg. Levenshtein Distance |
| --- | --- | --- | --- |
The finalists have the chance to keep working on their algorithms until September 6th. Information on the next steps will follow via e-mail.
The open competition submissions are used to determine the 3 finalist teams. The same metrics are used in the final competition.
- The most important evaluation criterion is the average Levenshtein distance (LD, link), obtained by testing the submitted executable with data from unknown writers. For each writer, your executable receives 5 labelled adaptation equations to adapt to before evaluation.
- If the difference in LD between multiple teams is below 0.1, the quality of the written report and the quality of the source code are taken into consideration.
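To make the metric concrete, here is a minimal sketch of how the average Levenshtein distance between predicted and ground-truth equation strings could be computed. This is an illustration of the metric only, not the competition's official evaluation script; the function names and example strings are our own.

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance between strings a and b (insertions, deletions, substitutions)."""
    prev = list(range(len(b) + 1))  # DP row for the empty prefix of a
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,        # deletion
                            curr[j - 1] + 1,    # insertion
                            prev[j - 1] + cost))  # substitution (or match)
        prev = curr
    return prev[-1]

def average_ld(predictions: list[str], references: list[str]) -> float:
    """Mean Levenshtein distance over paired predicted/reference equations."""
    assert len(predictions) == len(references)
    return sum(levenshtein(p, r) for p, r in zip(predictions, references)) / len(predictions)

# Illustrative example (not competition data):
# average_ld(["1+1=2", "x^2-1"], ["1+1=2", "x^2+1"]) averages distances 0 and 1.
```

A lower average LD is better; an exact match across all test equations yields 0.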
You can download the submission template including an evaluation script here.
What to submit?
- The generated docker image (.tar)
- All source code for preprocessing, training, adaptation and deployment
- A common open-source license of your choice
- A short written report (3-6 pages, PDF) describing the algorithms your team used