R.R.C. Robust Reading Competition
    SROIE 2019
  • Task 1 - Text Localization - Method: CITlab Argus Textline Detection
  • Per sample details
Sample results (DetEval)

Sample   Recall    Precision   Hmean
  1      86.81%    84.90%      85.84%
  2      80.49%    80.49%      80.49%
  3      85.19%    78.41%      81.66%
  4      63.18%    63.56%      63.37%
  5      78.26%    76.60%      77.42%
  6      96.30%    89.47%      92.76%
  7      92.44%    93.78%      93.11%
  8      92.63%    95.38%      93.99%
  9     100.00%   100.00%     100.00%
 10      94.83%    94.13%      94.48%
 11      99.47%    98.97%      99.22%
 12      98.82%    97.78%      98.30%
 13      93.93%    93.48%      93.70%
 14      96.72%    93.33%      95.00%
 15      98.11%    98.00%      98.06%
 16      93.58%    92.20%      92.88%
 17      93.11%    94.33%      93.72%
 18      90.32%    98.25%      94.12%
 19      93.94%    93.94%      93.94%
 20      97.92%    97.69%      97.81%
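The Hmean column is the harmonic mean (F-measure) of per-sample recall and precision, both expressed in percent. A minimal sketch of that relation — the `hmean` helper is my own naming for illustration, not part of the competition's DetEval tooling:

```python
def hmean(recall: float, precision: float) -> float:
    """Harmonic mean of recall and precision, both given in percent."""
    if recall + precision == 0:
        return 0.0
    return 2 * recall * precision / (recall + precision)

# Sample 1 from the table: recall 86.81%, precision 84.90%
print(round(hmean(86.81, 84.90), 2))  # → 85.84
```

When recall and precision are equal (e.g. samples 2 and 19), the harmonic mean equals that common value, which matches the table.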