---
license: mit
base_model: SCUT-DLVCLab/lilt-roberta-en-base
tags:
- generated_from_trainer
datasets:
- funsd-layoutlmv3
model-index:
- name: lilt-en-funsd
  results: []
---

# lilt-en-funsd

This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on the funsd-layoutlmv3 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5541
- Answer: {'precision': 0.8696655132641292, 'recall': 0.9228886168910648, 'f1': 0.8954869358669834, 'number': 817}
- Header: {'precision': 0.6559139784946236, 'recall': 0.5126050420168067, 'f1': 0.5754716981132076, 'number': 119}
- Question: {'precision': 0.9031963470319635, 'recall': 0.9182915506035283, 'f1': 0.9106813996316759, 'number': 1077}
- Overall Precision: 0.8779
- Overall Recall: 0.8962
- Overall F1: 0.8869
- Overall Accuracy: 0.8228

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 2500
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.4364 | 10.53 | 200 | 0.9844 | {'precision': 0.834106728538283, 'recall': 0.8800489596083231, 'f1': 0.8564621798689696, 'number': 817} | {'precision': 0.41935483870967744, 'recall': 0.6554621848739496, 'f1': 0.5114754098360655, 'number': 119} | {'precision': 0.8516833484986351, 'recall': 0.8690807799442897, 'f1': 0.860294117647059, 'number': 1077} | 0.8072 | 0.8609 | 0.8332 | 0.7792 |
| 0.0425 | 21.05 | 400 | 1.1086 | {'precision': 0.8603550295857988, 'recall': 0.8898408812729498, 'f1': 0.8748495788206979, 'number': 817} | {'precision': 0.6288659793814433, 'recall': 0.5126050420168067, 'f1': 0.5648148148148148, 'number': 119} | {'precision': 0.8763250883392226, 'recall': 0.9210770659238626, 'f1': 0.8981439565414214, 'number': 1077} | 0.8582 | 0.8843 | 0.8711 | 0.8199 |
| 0.0132 | 31.58 | 600 | 1.3613 | {'precision': 0.8785276073619632, 'recall': 0.8763769889840881, 'f1': 0.8774509803921567, 'number': 817} | {'precision': 0.5182481751824818, 'recall': 0.5966386554621849, 'f1': 0.5546875, 'number': 119} | {'precision': 0.8656195462478184, 'recall': 0.9210770659238626, 'f1': 0.8924876293297346, 'number': 1077} | 0.8480 | 0.8838 | 0.8655 | 0.8090 |
| 0.0061 | 42.11 | 800 | 1.5515 | {'precision': 0.8825, 'recall': 0.8641370869033048, 'f1': 0.8732220160791588, 'number': 817} | {'precision': 0.5689655172413793, 'recall': 0.5546218487394958, 'f1': 0.5617021276595745, 'number': 119} | {'precision': 0.8888888888888888, 'recall': 0.8987929433611885, 'f1': 0.8938134810710988, 'number': 1077} | 0.8678 | 0.8644 | 0.8661 | 0.7964 |
| 0.0041 | 52.63 | 1000 | 1.5132 | {'precision': 0.8808664259927798, 'recall': 0.8959608323133414, 'f1': 0.8883495145631067, 'number': 817} | {'precision': 0.6296296296296297, 'recall': 0.5714285714285714, 'f1': 0.5991189427312775, 'number': 119} | {'precision': 0.8793256433007985, 'recall': 0.9201485608170845, 'f1': 0.8992740471869328, 'number': 1077} | 0.8669 | 0.8897 | 0.8782 | 0.8021 |
| 0.0023 | 63.16 | 1200 | 1.6099 | {'precision': 0.8483466362599772, 'recall': 0.9106487148102815, 'f1': 0.8783943329397875, 'number': 817} | {'precision': 0.6470588235294118, 'recall': 0.46218487394957986, 'f1': 0.5392156862745099, 'number': 119} | {'precision': 0.8795718108831401, 'recall': 0.9155060352831941, 'f1': 0.897179253867152, 'number': 1077} | 0.8569 | 0.8867 | 0.8716 | 0.8007 |
| 0.0013 | 73.68 | 1400 | 1.5668 | {'precision': 0.8819277108433735, 'recall': 0.8959608323133414, 'f1': 0.8888888888888888, 'number': 817} | {'precision': 0.5775862068965517, 'recall': 0.5630252100840336, 'f1': 0.5702127659574467, 'number': 119} | {'precision': 0.8972477064220183, 'recall': 0.9080779944289693, 'f1': 0.9026303645592985, 'number': 1077} | 0.8728 | 0.8828 | 0.8777 | 0.8079 |
| 0.0009 | 84.21 | 1600 | 1.7323 | {'precision': 0.8639534883720931, 'recall': 0.9094247246022031, 'f1': 0.8861061419200955, 'number': 817} | {'precision': 0.6039603960396039, 'recall': 0.5126050420168067, 'f1': 0.5545454545454545, 'number': 119} | {'precision': 0.8962962962962963, 'recall': 0.8987929433611885, 'f1': 0.8975428836346777, 'number': 1077} | 0.8682 | 0.8803 | 0.8742 | 0.8078 |
| 0.0008 | 94.74 | 1800 | 1.5326 | {'precision': 0.8741258741258742, 'recall': 0.9179926560587516, 'f1': 0.8955223880597015, 'number': 817} | {'precision': 0.6226415094339622, 'recall': 0.5546218487394958, 'f1': 0.5866666666666668, 'number': 119} | {'precision': 0.9044117647058824, 'recall': 0.9136490250696379, 'f1': 0.9090069284064666, 'number': 1077} | 0.8772 | 0.8942 | 0.8856 | 0.8208 |
| 0.0003 | 105.26 | 2000 | 1.5560 | {'precision': 0.8625429553264605, 'recall': 0.9216646266829865, 'f1': 0.8911242603550296, 'number': 817} | {'precision': 0.616822429906542, 'recall': 0.5546218487394958, 'f1': 0.5840707964601769, 'number': 119} | {'precision': 0.9008264462809917, 'recall': 0.9108635097493036, 'f1': 0.9058171745152355, 'number': 1077} | 0.8700 | 0.8942 | 0.8819 | 0.8189 |
| 0.0002 | 115.79 | 2200 | 1.5541 | {'precision': 0.8696655132641292, 'recall': 0.9228886168910648, 'f1': 0.8954869358669834, 'number': 817} | {'precision': 0.6559139784946236, 'recall': 0.5126050420168067, 'f1': 0.5754716981132076, 'number': 119} | {'precision': 0.9031963470319635, 'recall': 0.9182915506035283, 'f1': 0.9106813996316759, 'number': 1077} | 0.8779 | 0.8962 | 0.8869 | 0.8228 |
| 0.0002 | 126.32 | 2400 | 1.5664 | {'precision': 0.8670520231213873, 'recall': 0.9179926560587516, 'f1': 0.89179548156956, 'number': 817} | {'precision': 0.6595744680851063, 'recall': 0.5210084033613446, 'f1': 0.5821596244131456, 'number': 119} | {'precision': 0.9135687732342007, 'recall': 0.9127205199628597, 'f1': 0.913144449605202, 'number': 1077} | 0.8821 | 0.8917 | 0.8869 | 0.8262 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
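As a usage note not present in the original card: LiLT, like other LayoutLM-family models, consumes word-level bounding boxes normalized to an integer 0-1000 scale relative to the page dimensions, which callers must compute before passing `bbox` tensors to the model. A minimal sketch of that normalization (the helper name and page sizes below are illustrative, not from this repository):

```python
# Sketch: normalize pixel-space word boxes to the 0-1000 scale that
# LiLT-family token-classification models expect in their `bbox` input.

def normalize_bbox(bbox, page_width, page_height):
    """Scale an (x0, y0, x1, y1) pixel box to integers in [0, 1000]."""
    x0, y0, x1, y1 = bbox
    return [
        int(1000 * x0 / page_width),
        int(1000 * y0 / page_height),
        int(1000 * x1 / page_width),
        int(1000 * y1 / page_height),
    ]

# Example: one word box on a hypothetical 850x1100 px scanned form.
# The resulting list would accompany that word's tokens as its `bbox`
# entry when tokenizing for LiltForTokenClassification.
print(normalize_bbox((85, 110, 170, 132), 850, 1100))  # [100, 100, 200, 120]
```

The funsd-layoutlmv3 dataset already stores boxes in this normalized form, so a step like this is only needed when running the model on raw OCR output.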