method: Upstage KR (2023-04-01)

Authors: Yunsu Kim, Seung Shin, Bibek Chaudhary, Sanghoon Kim, Dahyun Kim, Sehwan Joo

Affiliation: Upstage

Description: To address hierarchical text detection, we implement a two-step approach. First, we perform multi-class semantic segmentation, where the classes are word, line, and paragraph regions. Then, we use the predicted probability map to extract these entities and organize them hierarchically. Specifically, we use an ensemble of UNets with ImageNet-pretrained EfficientNetB7/MiT-B4 backbones to extract class masks. Connected components are identified in the predicted mask to separate words from each other, and likewise for lines and paragraphs. Then, word_i is assigned as a child of line_j if line_j has the highest IoU with word_i among all lines; the same procedure links lines to paragraphs.
For training, we erode target entities and dilate predicted entities. We also ensure that target entities maintain a gap between them. We use a symmetric Lovász loss, and we pretrain our models on the SynthText dataset.
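The grouping step above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the mask shapes, helper names, and the threshold-free argmax assignment are assumptions based on the description (connected components per class mask, then child-to-parent assignment by highest IoU).

```python
import numpy as np
from scipy.ndimage import label

def extract_entities(mask):
    """Split a binary class mask (e.g. a thresholded word map) into
    connected components, returning one boolean mask per entity."""
    labeled, n = label(mask)
    return [labeled == i for i in range(1, n + 1)]

def iou(a, b):
    """Intersection-over-union of two boolean masks."""
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

def assign_children(children, parents):
    """Attach child i to the parent j with the highest IoU, mirroring the
    word->line (and line->paragraph) assignment described above."""
    tree = {j: [] for j in range(len(parents))}
    for i, child in enumerate(children):
        best = max(range(len(parents)), key=lambda j: iou(child, parents[j]))
        tree[best].append(i)
    return tree

# Toy example: two horizontal line strips, one word blob inside each.
word_mask = np.zeros((8, 8), dtype=bool)
word_mask[1, 1:3] = True   # word 0, inside the top line
word_mask[5, 1:4] = True   # word 1, inside the bottom line
line_mask = np.zeros((8, 8), dtype=bool)
line_mask[0:3, :] = True   # line 0
line_mask[4:7, :] = True   # line 1

words = extract_entities(word_mask)
lines = extract_entities(line_mask)
print(assign_children(words, lines))  # {0: [0], 1: [1]}
```

Because a word region lies inside its line region, its IoU with the containing line is strictly positive while its IoU with every other line is (near) zero, so the argmax recovers the intended parent without a tuned threshold.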

method: Upstage KR (2023-03-30)

Authors: Yunsu Kim, Seung Shin, Bibek Chaudhary, Sanghoon Kim, Dahyun Kim, Sehwan Joo

Affiliation: Upstage

Description: To address hierarchical text detection, we implement a two-step approach. First, we perform multi-class semantic segmentation, where the classes are word, line, and paragraph regions. Then, we use the predicted probability map to extract these entities and organize them hierarchically. Specifically, we use an ensemble of UNets with ImageNet-pretrained EfficientNetB7/MiT-B4 backbones to extract class masks. Connected components are identified in the predicted mask to separate words from each other, and likewise for lines and paragraphs. Then, word_i is assigned as a child of line_j if line_j has the highest IoU with word_i among all lines; the same procedure links lines to paragraphs.
For training, we erode target entities and dilate predicted entities. We also ensure that target entities maintain a gap between them. We use a symmetric Lovász loss, and we pretrain our models on the SynthText dataset.

Authors: Zhong Humen, Tang Jun, Yang zhibo, Song xiaoge

Affiliation: Alibaba DAMO OCR Team

Email: zhonghumen@gmail.com

Description: Our method is a single end-to-end model designed for hierarchical text detection. The model follows the pipeline of DETR-like methods and introduces a hierarchical decoder, so that it can detect more text instances with fewer queries and thus reduce computational cost.
The model uses an ImageNet-pretrained Swin-S backbone and is trained only on the HierText training set. Single-scale inference is used at test time. No external or synthetic data is used.

Ranking Table

Each level (Word, Line, Paragraph) reports PQ, Fscore (F), Precision (P), Recall (R), and Tightness (T).

Date | Method | Word PQ | F | P | R | T | Line PQ | F | P | R | T | Para PQ | F | P | R | T
2023-04-01 | Upstage KR | 0.7980 | 0.9188 | 0.9473 | 0.8920 | 0.8685 | 0.7640 | 0.8834 | 0.9132 | 0.8556 | 0.8648 | 0.7454 | 0.8615 | 0.8740 | 0.8494 | 0.8652
2023-03-30 | Upstage KR | 0.7948 | 0.9138 | 0.9496 | 0.8807 | 0.8697 | 0.7657 | 0.8797 | 0.9089 | 0.8523 | 0.8704 | 0.7479 | 0.8591 | 0.8711 | 0.8474 | 0.8705
2023-04-01 | hiertext_submit_0401_curve_199_v2 | 0.7671 | 0.8818 | 0.9271 | 0.8408 | 0.8699 | 0.7143 | 0.8332 | 0.8932 | 0.7807 | 0.8573 | 0.6397 | 0.7483 | 0.8125 | 0.6935 | 0.8548
2023-03-31 | hiertext_submit_curve_199 | 0.7671 | 0.8818 | 0.9271 | 0.8408 | 0.8699 | 0.7121 | 0.8314 | 0.8882 | 0.7814 | 0.8565 | 0.6406 | 0.7497 | 0.8135 | 0.6953 | 0.8545
2023-03-28 | hiertext_submit_0328 | 0.7663 | 0.8811 | 0.9254 | 0.8408 | 0.8698 | 0.7115 | 0.8309 | 0.8865 | 0.7819 | 0.8563 | 0.6383 | 0.7476 | 0.8115 | 0.6930 | 0.8538
2023-11-14 | 123 | 0.7663 | 0.8811 | 0.9254 | 0.8408 | 0.8698 | 0.7115 | 0.8309 | 0.8865 | 0.7819 | 0.8563 | 0.6383 | 0.7476 | 0.8115 | 0.6930 | 0.8538
2023-04-01 | Global and local instance segmentations for hierarchical text detection | 0.7616 | 0.9072 | 0.9345 | 0.8816 | 0.8395 | 0.6850 | 0.8222 | 0.8024 | 0.8431 | 0.8331 | 0.6255 | 0.7511 | 0.7400 | 0.7625 | 0.8328
2023-04-02 | DeepSE hierarchical detection model | 0.7530 | 0.8849 | 0.9350 | 0.8399 | 0.8510 | 0.6943 | 0.8243 | 0.8265 | 0.8221 | 0.8423 | 0.6851 | 0.8139 | 0.8169 | 0.8110 | 0.8417
2023-03-24 | NVTextSpotter | 0.7369 | 0.8707 | 0.9510 | 0.8029 | 0.8463 | 0.6776 | 0.8042 | 0.9387 | 0.7035 | 0.8425 | 0.6551 | 0.7804 | 0.8182 | 0.7460 | 0.8394
2023-03-31 | Multi Class Deformable Detr for Hierarchal Text Detection | 0.7320 | 0.8889 | 0.9065 | 0.8720 | 0.8235 | 0.6901 | 0.8413 | 0.8483 | 0.8345 | 0.8202 | 0.6380 | 0.7807 | 0.7707 | 0.7909 | 0.8173
2023-03-17 | NVTextSpotter | 0.7215 | 0.8550 | 0.9522 | 0.7758 | 0.8439 | 0.4803 | 0.5911 | 0.8283 | 0.4596 | 0.8126 | 0.6343 | 0.7599 | 0.8161 | 0.7109 | 0.8348
2023-04-01 | Clova DEER | 0.7175 | 0.9195 | 0.9309 | 0.9083 | 0.7803 | 0.6985 | 0.8900 | 0.9126 | 0.8686 | 0.7848 | 0.6531 | 0.8350 | 0.8378 | 0.8322 | 0.7822
2023-04-02 | Ensemble of three task-specific Clova DEER | 0.7154 | 0.9203 | 0.9382 | 0.9031 | 0.7774 | 0.6964 | 0.8904 | 0.9175 | 0.8649 | 0.7821 | 0.6529 | 0.8370 | 0.8417 | 0.8323 | 0.7801
2023-03-31 | Hierarchical Transformers for Text Detection | 0.7044 | 0.8609 | 0.8847 | 0.8383 | 0.8182 | 0.6930 | 0.8523 | 0.8783 | 0.8278 | 0.8131 | 0.6346 | 0.7840 | 0.7784 | 0.7897 | 0.8094
2023-04-01 | SCUT-HUAWEI | 0.7008 | 0.8958 | 0.8979 | 0.8937 | 0.7823 | 0.6770 | 0.8620 | 0.9046 | 0.8233 | 0.7853 | 0.5314 | 0.6906 | 0.7403 | 0.6472 | 0.7696
2023-12-28 | Hi-SAM | 0.6430 | 0.8286 | 0.8766 | 0.7856 | 0.7760 | 0.6696 | 0.8530 | 0.9109 | 0.8020 | 0.7850 | 0.5909 | 0.7597 | 0.8152 | 0.7113 | 0.7779
2022-11-24 | DQ-DETR | 0.6101 | 0.7727 | 0.8064 | 0.7417 | 0.7896 | 0.2696 | 0.3591 | 0.2681 | 0.5439 | 0.7507 | 0.1838 | 0.2472 | 0.1599 | 0.5441 | 0.7436
2022-11-10 | nn_l | 0.5362 | 0.7526 | 0.8823 | 0.6561 | 0.7125 | 0.5089 | 0.6978 | 0.8886 | 0.5744 | 0.7292 | 0.3758 | 0.5222 | 0.7347 | 0.4050 | 0.7197
2023-09-11 | nn_g_cloud | 0.5234 | 0.7386 | 0.8455 | 0.6558 | 0.7086 | 0.4914 | 0.6864 | 0.7973 | 0.6027 | 0.7158 | 0.3863 | 0.5495 | 0.6117 | 0.4989 | 0.7030
2023-05-15 | nn_fixed | 0.5204 | 0.7384 | 0.8464 | 0.6548 | 0.7048 | 0.4921 | 0.6880 | 0.8749 | 0.5669 | 0.7153 | 0.3484 | 0.5006 | 0.4835 | 0.5190 | 0.6959
2023-05-15 | nn_adaptive | 0.5204 | 0.7384 | 0.8464 | 0.6548 | 0.7048 | 0.4909 | 0.6859 | 0.8851 | 0.5599 | 0.7156 | 0.3867 | 0.5517 | 0.6101 | 0.5035 | 0.7009
2022-11-10 | nn | 0.5116 | 0.7316 | 0.8575 | 0.6379 | 0.6993 | 0.4577 | 0.6402 | 0.8583 | 0.5104 | 0.7150 | 0.2072 | 0.2965 | 0.6184 | 0.1950 | 0.6988
2023-09-11 | nn_g13 | 0.5108 | 0.7227 | 0.8618 | 0.6223 | 0.7067 | 0.4828 | 0.6718 | 0.8438 | 0.5581 | 0.7186 | 0.3820 | 0.5418 | 0.6471 | 0.4659 | 0.7050
2023-05-12 | adaptive_clustering | 0.4830 | 0.6680 | 0.8541 | 0.5484 | 0.7230 | 0.4323 | 0.5976 | 0.8427 | 0.4629 | 0.7234 | 0.3483 | 0.4929 | 0.5364 | 0.4560 | 0.7067
2023-05-12 | fixed_clustering | 0.4830 | 0.6680 | 0.8541 | 0.5484 | 0.7230 | 0.4266 | 0.5909 | 0.8114 | 0.4647 | 0.7220 | 0.3284 | 0.4651 | 0.4684 | 0.4618 | 0.7060
2022-08-09 | Unified Detector (CVPR 2022 version) | 0.4821 | 0.6151 | 0.6754 | 0.5647 | 0.7838 | 0.6223 | 0.7991 | 0.7964 | 0.8019 | 0.7787 | 0.5360 | 0.6858 | 0.7604 | 0.6245 | 0.7817
2023-02-06 | HierText official ckpt | 0.4799 | 0.6135 | 0.6719 | 0.5645 | 0.7822 | 0.6220 | 0.7998 | 0.8000 | 0.7996 | 0.7777 | 0.5351 | 0.6856 | 0.7654 | 0.6208 | 0.7805
2023-03-01 | test | 0.2745 | 0.4175 | 0.5182 | 0.3495 | 0.6576 | 0.2561 | 0.3904 | 0.5150 | 0.3143 | 0.6559 | 0.1632 | 0.2452 | 0.3561 | 0.1870 | 0.6657
2023-02-19 | UnifiedDetector | 0.1686 | 0.2350 | 0.3523 | 0.1763 | 0.7175 | 0.2048 | 0.2909 | 0.4668 | 0.2113 | 0.7041 | 0.1076 | 0.1569 | 0.2408 | 0.1163 | 0.6856
2023-02-06 | a | 0.0000 | 0.0000 | 0.0024 | 0.0000 | 0.5362 | 0.0001 | 0.0001 | 0.0025 | 0.0001 | 0.5129 | 0.0001 | 0.0002 | 0.0021 | 0.0001 | 0.5089
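A sanity check on these columns: in the HierText-style evaluation, PQ weights the F-score by the mean IoU of matched pairs, so PQ factors as Fscore × Tightness. This reading of the metric (the exact definition is in the benchmark's evaluation protocol) can be checked against the reported numbers:

```python
# PQ = Fscore x Tightness, where Tightness is the mean IoU over matched
# (true-positive) prediction/ground-truth pairs. Check two rows above:
word_f, word_t = 0.9188, 0.8685   # top submission, Word-level Fscore/Tightness
line_f, line_t = 0.8834, 0.8648   # same submission, Line-level
print(round(word_f * word_t, 4))  # 0.798, matching the reported Word PQ 0.7980
print(round(line_f * line_t, 4))  # 0.764, matching the reported Line PQ 0.7640
```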

Ranking Graphic

[Graphic panels, including per-method Line PQ and Paragraph PQ rankings, are not reproduced in this text version.]