%0 Journal Article
%1 pu2009latent
%A Wang, Pu
%A Domeniconi, Carlotta
%A Laskey, Kathryn
%D 2009
%J Machine Learning and Knowledge Discovery in Databases
%K 2009 bayesian clustering ecml lda pkdd
%P 522--537
%T Latent Dirichlet Bayesian Co-Clustering
%U http://dx.doi.org/10.1007/978-3-642-04174-7_34
%X Co-clustering has emerged as an important technique for mining contingency data matrices. However, almost all existing co-clustering algorithms are hard partitioning, assigning each row and column of the data matrix to one cluster. Recently a Bayesian co-clustering approach has been proposed which allows a probability distribution over membership in row and column clusters. The approach uses variational inference for parameter estimation. In this work, we modify the Bayesian co-clustering model, and use collapsed Gibbs sampling and collapsed variational inference for parameter estimation. Our empirical evaluation on real data sets shows that both collapsed Gibbs sampling and collapsed variational inference are able to find more accurate likelihood estimates than the standard variational Bayesian co-clustering approach.