NET-DNF: EFFECTIVE DEEP MODELING OF TABULAR DATA

Liran Katzir, Gal Elidan, Ran El-Yaniv

Research output: Contribution to conference › Paper › peer-review

17 Scopus citations

Abstract

A challenging open question in deep learning is how to handle tabular data. Unlike domains such as image and natural language processing, where deep architectures prevail, there is still no widely accepted neural architecture that dominates tabular data. As a step toward bridging this gap, we present Net-DNF, a novel generic architecture whose inductive bias elicits models whose structure corresponds to logical Boolean formulas in disjunctive normal form (DNF) over affine soft-threshold decision terms. Net-DNFs also promote localized decisions that are taken over small subsets of the features. We present extensive experiments showing that Net-DNFs significantly and consistently outperform fully connected networks over tabular data. With relatively few hyperparameters, Net-DNFs open the door to practical end-to-end handling of tabular data with neural networks. We present ablation studies, which justify the design choices of Net-DNF, including the inductive bias elements, namely, Boolean formulation, locality, and feature selection.
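To make the architectural idea concrete, below is a minimal, hypothetical sketch of a soft DNF block in PyTorch: affine soft-threshold "literals" (tanh of a linear map), soft conjunctions (AND) taken over small subsets of the literals, and a soft disjunction (OR) over the conjunctions. The gate formulas, constants, masking scheme, and the SoftDNFBlock name are illustrative assumptions chosen for clarity; they are not the paper's exact formulation.

```python
# Illustrative sketch of a DNF-style neural block: soft-threshold literals,
# soft AND over literal subsets, soft OR over conjunctions.
# Gate constants and masking below are assumptions, not taken from the paper.
import torch
import torch.nn as nn


class SoftDNFBlock(nn.Module):
    def __init__(self, in_features: int, n_literals: int = 32, n_conjunctions: int = 8):
        super().__init__()
        # Affine soft-threshold literals: tanh(Wx + b) in (-1, 1),
        # where +1 acts as "true" and -1 as "false".
        self.literals = nn.Linear(in_features, n_literals)
        # Fixed binary mask assigning a small subset of literals to each conjunction,
        # mimicking localized decisions over small feature subsets.
        mask = (torch.rand(n_conjunctions, n_literals) < 0.25).float()
        mask[mask.sum(dim=1) == 0, 0] = 1.0  # ensure every conjunction uses >= 1 literal
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        lits = torch.tanh(self.literals(x))                       # (batch, n_literals)
        # Soft AND: a conjunction is ~true only if all its selected literals are ~true.
        # Implemented as a shifted sum: sum of selected literals minus (count - 1).
        counts = self.mask.sum(dim=1)                              # (n_conjunctions,)
        conj = torch.tanh(lits @ self.mask.t() - (counts - 1.0))   # (batch, n_conjunctions)
        # Soft OR: the formula is ~true if at least one conjunction is ~true.
        # Implemented as a shifted sum: sum of conjunctions plus (count - 1).
        disj = torch.tanh(conj.sum(dim=1) + (conj.shape[1] - 1.0))
        return disj                                                # (batch,)


# Usage: score a batch of 5 tabular rows with 10 features each.
block = SoftDNFBlock(in_features=10)
print(block(torch.randn(5, 10)).shape)  # torch.Size([5])
```

In this sketch the shifted-tanh sums play the role of differentiable AND/OR gates: the conjunction output is positive only when all of its selected literals are near +1, and the disjunction is positive as soon as any conjunction is.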

Original language: English
State: Published - 2021
Event: 9th International Conference on Learning Representations, ICLR 2021 - Virtual, Online
Duration: 3 May 2021 - 7 May 2021

Conference

Conference: 9th International Conference on Learning Representations, ICLR 2021
City: Virtual, Online
Period: 3/05/21 - 7/05/21

Bibliographical note

Publisher Copyright:
© 2021 ICLR 2021 - 9th International Conference on Learning Representations. All rights reserved.
