Tabular Transformers Meet Relational Databases.

Bibliographic Details
Title: Tabular Transformers Meet Relational Databases.
Authors: Peleška, Jakub; Šír, Gustav
Source: ACM Transactions on Intelligent Systems & Technology; Oct 2025, Vol. 16, Issue 5, p1-24, 24p
Subject Terms: Relational databases; Machine learning; Artificial neural networks; Artificial intelligence; Data structures
Abstract: Transformer models have continuously expanded into all machine learning domains convertible to the underlying sequence-to-sequence representation, including tabular data. However, while ubiquitous, this representation restricts its extension to the more general case of relational databases. In this article, we introduce a modular neural message-passing scheme that closely adheres to the formal relational model, enabling direct end-to-end learning of tabular transformers from database storage systems. We address the associated challenges of appropriate learning data representation and loading, which are critical in the database setting, and compare our approach against a number of representative models from various related fields across a wide range of datasets. Our results demonstrate superior performance of this newly proposed class of neural architectures. [ABSTRACT FROM AUTHOR]
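To make the abstract's central idea concrete, the sketch below illustrates one plausible reading of it: per-table transformers produce row embeddings, and messages are passed between rows linked by a foreign key. This is a hypothetical, minimal illustration of the general concept only, not the authors' implementation; all names (RowEncoder, FKMessagePassing), dimensions, and the toy tables are assumptions.

```python
# Hypothetical sketch (not the paper's code): row embeddings from a per-table
# transformer over column tokens, plus one round of message passing along a
# foreign key from a child table to its parent table.
import torch
import torch.nn as nn

class RowEncoder(nn.Module):
    """Encodes each row of a table as a sequence of column embeddings."""
    def __init__(self, n_cols, d_model=32, n_heads=4):
        super().__init__()
        self.col_proj = nn.Linear(1, d_model)        # one scalar feature per column
        self.col_id = nn.Embedding(n_cols, d_model)  # learned column-identity embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):                            # x: (rows, n_cols)
        cols = torch.arange(x.size(1), device=x.device)
        tokens = self.col_proj(x.unsqueeze(-1)) + self.col_id(cols)
        return self.encoder(tokens).mean(dim=1)      # (rows, d_model) row embeddings

class FKMessagePassing(nn.Module):
    """Aggregates child-row embeddings into their parent rows along a foreign key."""
    def __init__(self, d_model=32):
        super().__init__()
        self.update = nn.Linear(2 * d_model, d_model)

    def forward(self, parent_emb, child_emb, fk):    # fk[i] = parent index of child row i
        agg = torch.zeros_like(parent_emb)
        agg.index_add_(0, fk, child_emb)             # sum child messages per parent row
        counts = torch.bincount(fk, minlength=parent_emb.size(0)).clamp(min=1)
        agg = agg / counts.unsqueeze(-1)             # mean aggregation
        return torch.relu(self.update(torch.cat([parent_emb, agg], dim=-1)))

# Toy usage: 5 "customer" rows with 3 columns, 12 "order" rows with 4 columns.
customers, orders = torch.randn(5, 3), torch.randn(12, 4)
fk = torch.randint(0, 5, (12,))                      # each order points to a customer
cust_emb = RowEncoder(n_cols=3)(customers)
ord_emb = RowEncoder(n_cols=4)(orders)
cust_emb = FKMessagePassing()(cust_emb, ord_emb, fk) # one message-passing round
print(cust_emb.shape)                                # torch.Size([5, 32])
```

In this reading, the per-table transformer plays the "tabular transformer" role, while the foreign-key aggregation step stands in for the modular message-passing scheme over the relational schema; the actual architecture, aggregation, and training setup in the article may differ.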
Database: Complementary Index
ISSN: 2157-6904
DOI: 10.1145/3749991