Abstract
Semantic Table Interpretation (STI) is essential for understanding and analyzing structured tabular data, and column-level entity prediction is one of its key subtasks. Recent studies have actively explored Transformer-based architectures for this task. However, these methods often incur substantial computational overhead because standard self-attention scales quadratically with input length, resulting in prolonged inference times. In this study, we propose an efficient column-level entity prediction model that leverages FlashAttention to mitigate this limitation. Our approach structurally captures the relationships between table cells and their corresponding headers, yielding an effective semantic representation of each column. Experimental results show that the proposed model achieves up to 2.1× faster inference than existing methods without sacrificing prediction accuracy.
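To make the idea concrete, the sketch below illustrates one way a column representation could be built with PyTorch's scaled_dot_product_attention, which dispatches to FlashAttention kernels on supported hardware (e.g., fp16/bf16 inputs on recent CUDA GPUs). This is a minimal illustration under stated assumptions, not the paper's implementation: the class name ColumnEncoder, the header-position pooling, and all dimensions are hypothetical choices introduced for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ColumnEncoder(nn.Module):
    """Illustrative (hypothetical) column encoder: attends cell embeddings to
    their header and pools the result into one column representation, which is
    then classified into an entity type. Not the authors' implementation."""

    def __init__(self, dim: int = 256, n_heads: int = 4, n_types: int = 100):
        super().__init__()
        self.n_heads = n_heads
        self.head_dim = dim // n_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.out = nn.Linear(dim, dim)
        self.classifier = nn.Linear(dim, n_types)

    def forward(self, header_emb: torch.Tensor, cell_embs: torch.Tensor) -> torch.Tensor:
        # header_emb: (batch, 1, dim); cell_embs: (batch, n_cells, dim)
        x = torch.cat([header_emb, cell_embs], dim=1)  # (B, 1 + n_cells, dim)
        B, L, D = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape to (B, heads, L, head_dim), the layout SDPA expects.
        q, k, v = (t.view(B, L, self.n_heads, self.head_dim).transpose(1, 2)
                   for t in (q, k, v))
        # scaled_dot_product_attention uses FlashAttention-style fused kernels
        # when available, avoiding materializing the full attention matrix.
        attn = F.scaled_dot_product_attention(q, k, v)
        attn = attn.transpose(1, 2).reshape(B, L, D)
        x = self.out(attn)
        # Use the header position as the pooled column representation.
        col_repr = x[:, 0]
        return self.classifier(col_repr)  # (B, n_types)


if __name__ == "__main__":
    enc = ColumnEncoder()
    header = torch.randn(2, 1, 256)   # one header embedding per column
    cells = torch.randn(2, 32, 256)   # 32 cell embeddings per column
    print(enc(header, cells).shape)   # torch.Size([2, 100])
```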