A novel hierarchically-structured factor mixture model for cluster discovery from multi-modality data

Bing Si, Todd J. Schwedt, Catherine D. Chong, Teresa Wu, Jing Li

Research output: Contribution to journal › Article › peer-review



Advances in sensing technology have generated multi-modality datasets with complementary information in various domains. In health care, it is common to acquire images of different types/modalities for the same patient to facilitate clinical decision making. We propose a clustering method called the hierarchically-structured Factor Mixture Model (hierFMM), which enables cluster discovery from multi-modality datasets by exploiting their joint strength. HierFMM employs a novel double-L21-penalized likelihood formulation to achieve hierarchical selection of modalities and of the features nested within each modality. This formulation is proven to satisfy a Quadratic Majorization condition, which allows an efficient Group-wise Majorization Descent algorithm to be developed for model estimation. Simulation studies show that hierFMM significantly outperforms competing methods. HierFMM is applied to identify clusters/subgroups of migraine patients based on brain cortical area, thickness, and volume datasets extracted from Magnetic Resonance Imaging. Two subgroups are found, whose patients differ significantly in clinical characteristics. This finding shows the promise of using multi-modality imaging data to support patient stratification and the development of optimal treatments for different migraine subgroups.
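The paper's exact penalized likelihood is not reproduced in this record, but the general idea of a "double" L2,1 penalty can be sketched: an outer group norm over each modality's entire loading block (driving whole modalities to zero) plus an inner L2,1 norm over the feature rows nested within each modality (driving individual features to zero). The function below is a minimal illustrative sketch under that assumption; the function name, arguments, and the specific combination of norms are hypothetical and not taken from the paper.

```python
import numpy as np

def double_l21_penalty(loadings, lam_modality, lam_feature):
    """Illustrative hierarchical (double L2,1) penalty.

    loadings: list of (n_features_m, n_factors) arrays, one per modality.
    Hypothetical form: an outer group norm per modality block plus an
    inner row-wise L2,1 norm over features nested in each modality.
    """
    # Outer term: Frobenius norm of each modality's whole loading block;
    # shrinking this to zero deselects the entire modality.
    modality_term = sum(np.linalg.norm(W) for W in loadings)
    # Inner term: L2 norm of each feature's row of loadings, summed;
    # shrinking a row to zero deselects that feature within its modality.
    feature_term = sum(np.linalg.norm(W, axis=1).sum() for W in loadings)
    return lam_modality * modality_term + lam_feature * feature_term
```

Because the inner groups (feature rows) are nested inside the outer groups (modality blocks), zeroing a modality block automatically zeroes all of its features, which is the hierarchical selection behavior the abstract describes.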

Original language: English (US)
Pages (from-to): 799-811
Number of pages: 13
Journal: IISE Transactions
Issue number: 7
State: Published - 2021


Keywords

  • Factor model
  • clustering
  • health care
  • sparse learning

ASJC Scopus subject areas

  • Industrial and Manufacturing Engineering


