Method: SRFormer (ResNet50-#1seg)

Date: 2023-08-09

Authors: Qingwen Bu

Affiliation: Shanghai Jiao Tong University

Description: We first pre-train our model on SynthText150k, MLT17, LSVT and ICDAR19-ArT for 300k iterations and then fine-tune it on ArT for 50k iterations. No test-time augmentation (TTA) or ensembling is employed.

Method: TextFuseNet (ResNeXt-101)

Date: 2020-10-01

Authors: Jian Ye, Zhe Chen, Juhua Liu, Bo Du

Affiliation: Wuhan University

Email: leaf-yej@whu.edu.cn

Description: This is a preliminary evaluation result of TextFuseNet with a ResNeXt-101 backbone. Multi-scale training and single-scale testing are used to obtain the final results. Submitted by Sigma Lab, Wuhan University.

Ranking Table

Date         Method                                      Recall    Precision  Hmean
2023-08-09   SRFormer (ResNet50-#1seg)                   73.51%    86.08%     79.30%
2020-10-01   TextFuseNet (ResNeXt-101)                   72.77%    85.42%     78.59%
2019-04-30   CUTeOCR                                     71.56%    86.57%     78.36%
2022-07-11   DPText-DETR (ResNet-50)                     73.70%    82.97%     78.06%
2021-03-26   TextFuseNet (ResNet-50)                     69.42%    82.59%     75.44%
2019-04-30   Fudan-Supremind Detection v3                71.61%    79.26%     75.24%
2019-04-30   DMText_art                                  66.15%    85.09%     74.43%
2019-04-30   TEXT_SNIPER                                 71.45%    76.17%     73.74%
2019-04-29   CRAFT                                       68.93%    77.25%     72.85%
2019-04-30   MaskDet                                     67.04%    76.47%     71.44%
2019-04-30   CCISTD                                      60.72%    81.16%     69.47%
2019-04-25   Art detect by vivo                          57.15%    80.72%     66.92%
2019-04-30   DMCA                                        64.01%    69.08%     66.45%
2019-04-30   TMIS                                        53.49%    86.19%     66.01%
2019-04-28   Improved Progressive scale expansion Net    52.24%    75.88%     61.88%
2019-04-27   TextCohesion_1                              43.66%    68.08%     53.20%
2019-04-30   MSR                                         0.46%     0.55%      0.50%
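The Hmean column is the harmonic mean (F1 score) of recall and precision, so a method is rewarded only when both are high. A minimal sketch of the calculation, checked against the top-ranked row (the function name is illustrative, not part of any official evaluation toolkit):

```python
def hmean(recall: float, precision: float) -> float:
    """Harmonic mean (F1) of recall and precision, both in [0, 1]."""
    if recall + precision == 0:
        return 0.0
    return 2 * recall * precision / (recall + precision)

# Top row: SRFormer, Recall 73.51%, Precision 86.08%
score = hmean(0.7351, 0.8608)
print(f"{score:.2%}")  # -> 79.30%, matching the table's Hmean
```

Note how the harmonic mean penalizes imbalance: TMIS has the second-highest precision in the table (86.19%) but its low recall (53.49%) pulls its Hmean down to 66.01%.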

Ranking Graphic