
Efficiently Factorizing Boolean Matrices using Proximal Gradient Descent

conference contribution
posted on 2023-11-29, 18:25 authored by Sebastian Dalleiger, Jilles Vreeken
Addressing the interpretability problem of NMF on Boolean data, Boolean Matrix Factorization (BMF) uses Boolean algebra to decompose the input into low-rank Boolean factor matrices. These matrices are highly interpretable and very useful in practice, but they come at the high computational cost of solving an NP-hard combinatorial optimization problem. To reduce the computational burden, we propose to relax BMF continuously using a novel elastic-binary regularizer, from which we derive a proximal gradient algorithm. Through an extensive set of experiments, we demonstrate that our method works well in practice: On synthetic data, we show that it converges quickly, recovers the ground truth precisely, and estimates the simulated rank exactly. On real-world data, we improve upon the state of the art in recall, loss, and runtime, and a case study from the medical domain confirms that our results are easily interpretable and semantically meaningful.
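The abstract only sketches the approach. As a rough illustration of the general idea, the following minimal NumPy sketch runs proximal gradient descent on a continuously relaxed BMF objective. The clipped matrix product, the shrink-towards-{0,1} prox step, and all parameter names are illustrative assumptions; this is not the authors' elastic-binary regularizer or their actual algorithm.

import numpy as np

def relaxed_bmf(X, k, steps=500, lr=0.05, lam=0.01, seed=0):
    """Illustrative proximal-gradient sketch for continuously relaxed BMF.

    X is an (n, m) binary matrix; the factors U (n, k) and V (k, m) are
    relaxed to [0, 1]. The smooth part is a squared loss on the clipped
    product min(1, U @ V), a common relaxation of the Boolean product.
    The prox step shrinks entries towards the nearer of 0 or 1 and
    projects back onto [0, 1] -- a generic stand-in, not the paper's
    elastic-binary regularizer.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    U = rng.uniform(0, 1, (n, k))
    V = rng.uniform(0, 1, (k, m))

    def prox(A, t):
        # Shrink each entry towards whichever of {0, 1} is closer,
        # then project onto the box [0, 1].
        near_one = A > 0.5
        A = np.where(near_one,
                     1.0 - np.maximum(1.0 - A - t, 0.0),
                     np.maximum(A - t, 0.0))
        return np.clip(A, 0.0, 1.0)

    for _ in range(steps):
        Z = U @ V
        P = np.minimum(Z, 1.0)                 # relaxed Boolean product
        G = 2.0 * (P - X) * (Z < 1.0)          # gradient of the clipped squared loss
        U_new = prox(U - lr * (G @ V.T), lr * lam)
        V_new = prox(V - lr * (U.T @ G), lr * lam)
        U, V = U_new, V_new

    # Round the relaxed factors to obtain Boolean factor matrices.
    return (U > 0.5).astype(int), (V > 0.5).astype(int)

For example, calling relaxed_bmf(X, k=5) on a binary matrix X returns two Boolean factor matrices whose Boolean product approximates X; the actual paper's regularizer and convergence guarantees differ from this toy version.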

History

Preferred Citation

Sebastian Dalleiger and Jilles Vreeken. Efficiently Factorizing Boolean Matrices using Proximal Gradient Descent. In: Conference on Neural Information Processing Systems (NeurIPS). 2022.

Primary Research Area

  • Reliable Security Guarantees

Name of Conference

Conference on Neural Information Processing Systems (NeurIPS)

Legacy Posted Date

2022-11-25

Open Access Type

  • Green

BibTeX

@inproceedings{cispa_all_3885,
  title     = {Efficiently Factorizing Boolean Matrices using Proximal Gradient Descent},
  author    = {Dalleiger, Sebastian and Vreeken, Jilles},
  booktitle = {Conference on Neural Information Processing Systems (NeurIPS)},
  year      = {2022},
}
