Showing 1–16 of 16 results for author: Muandet, K

  1. arXiv:1805.08845  [pdf, other]  stat.ML

    Counterfactual Mean Embedding: A Kernel Method for Nonparametric Causal Inference

    Authors: Krikamol Muandet, Motonobu Kanagawa, Sorawit Saengkyongam, Sanparith Marukatat

    Abstract: This paper introduces a novel Hilbert space representation of a counterfactual distribution---called counterfactual mean embedding (CME)---with applications in nonparametric causal inference. Counterfactual prediction has become a ubiquitous tool in machine learning applications, such as online advertisement, recommendation systems, and medical diagnosis, whose performance relies on certain inter…

    Submitted 22 May, 2018; originally announced May 2018.
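    The abstract above only names the counterfactual mean embedding; as a hedged illustration of the general idea, the sketch below forms an embedding of counterfactual outcomes with a conditional-mean-embedding-style ridge regression from covariates to outcome feature maps. The kernel choice, the regularization, and the helper `counterfactual_mean_embedding` are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    """Gaussian kernel matrix between the rows of a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def counterfactual_mean_embedding(x_t, y_t, x_c, y_grid, lam=1e-2, gamma=1.0):
    """Hypothetical sketch: regress outcome feature maps on treated covariates,
    then average the fitted conditional embeddings over the control covariates,
    giving an embedding of the counterfactual outcome distribution."""
    n = len(x_t)
    K_xx = rbf(x_t, x_t, gamma)                              # treated covariates
    K_xc = rbf(x_t, x_c, gamma)                              # treated vs. control covariates
    W = np.linalg.solve(K_xx + n * lam * np.eye(n), K_xc)    # (n, m) ridge weights
    K_yg = rbf(y_t, y_grid, gamma)                           # treated outcomes vs. grid
    return W.mean(axis=1) @ K_yg                             # embedding evaluated on y_grid

# toy usage
rng = np.random.default_rng(0)
x_t = rng.normal(size=(50, 1)); y_t = x_t + rng.normal(scale=0.1, size=(50, 1))
x_c = rng.normal(loc=0.5, size=(40, 1))
grid = np.linspace(-3, 3, 5).reshape(-1, 1)
print(counterfactual_mean_embedding(x_t, y_t, x_c, grid))
```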

  2. arXiv:1712.01572  [pdf, other]  math.DS

    Eigendecompositions of Transfer Operators in Reproducing Kernel Hilbert Spaces

    Authors: Stefan Klus, Ingmar Schuster, Krikamol Muandet

    Abstract: Transfer operators such as the Perron--Frobenius or Koopman operator play an important role in the global analysis of complex dynamical systems. The eigenfunctions of these operators can be used to detect metastable sets, to project the dynamics onto the dominant slow processes, or to separate superimposed signals. We extend transfer operator theory to reproducing kernel Hilbert spaces and show th…

    Submitted 16 May, 2018; v1 submitted 5 December, 2017; originally announced December 2017.
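    As a rough, hypothetical companion to this entry, the snippet below eigendecomposes an empirical transfer-operator matrix built from kernel Gram matrices of successive states, in the spirit of kernel-based EDMD. The matrix form `(G + lam*I)^{-1} A`, the kernel, and the toy dynamics are assumptions, not necessarily the construction used in the paper.

```python
import numpy as np

def gram(a, b, gamma=0.5):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# toy trajectory of a noisy linear system (purely illustrative)
rng = np.random.default_rng(1)
x = np.zeros((200, 1))
for t in range(199):
    x[t + 1] = 0.9 * x[t] + 0.05 * rng.normal()
X, Y = x[:-1], x[1:]                       # pairs (x_t, x_{t+1})

G = gram(X, X)                              # Gram matrix on current states
A = gram(X, Y)                              # cross-Gram with successor states
lam = 1e-3
K_hat = np.linalg.solve(G + lam * np.eye(len(X)), A)   # regularized empirical operator
eigvals, eigvecs = np.linalg.eig(K_hat)
order = np.argsort(-np.abs(eigvals))
print(np.abs(eigvals[order[:3]]))           # dominant spectrum: slowest processes first
```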

  3. arXiv:1708.09794  [pdf, other]  cs.DL

    Design and Analysis of the NIPS 2016 Review Process

    Authors: Nihar B. Shah, Behzad Tabibian, Krikamol Muandet, Isabelle Guyon, Ulrike von Luxburg

    Abstract: Neural Information Processing Systems (NIPS) is a top-tier annual conference in machine learning. The 2016 edition of the conference comprised more than 2,400 paper submissions, 3,000 reviewers, and 8,000 attendees. This represents a growth of nearly 40% in terms of submissions, 96% in terms of reviewers, and over 100% in terms of attendees as compared to the previous year. The massive scale as we…

    Submitted 23 April, 2018; v1 submitted 31 August, 2017; originally announced August 2017.

  4. Kernel Mean Embedding of Distributions: A Review and Beyond

    Authors: Krikamol Muandet, Kenji Fukumizu, Bharath Sriperumbudur, Bernhard Schölkopf

    Abstract: A Hilbert space embedding of a distribution---in short, a kernel mean embedding---has recently emerged as a powerful tool for machine learning and inference. The basic idea behind this framework is to map distributions into a reproducing kernel Hilbert space (RKHS) in which the whole arsenal of kernel methods can be extended to probability measures. It can be viewed as a generalization of the orig…

    Submitted 25 January, 2017; v1 submitted 31 May, 2016; originally announced May 2016.

    Comments: 147 pages; this is a version of the manuscript after the review process

    Journal ref: Foundations and Trends in Machine Learning: Vol. 10: No. 1-2, pp 1-141 (2017)
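    The basic objects behind this framework are the empirical mean embedding, $\hat{\mu}_P = \frac{1}{n}\sum_i k(\cdot, x_i)$, and the RKHS distance between two such embeddings (the maximum mean discrepancy). A minimal sketch with an arbitrary Gaussian kernel and toy data, not tied to any particular result in the review:

```python
import numpy as np

def k(a, b, gamma=0.5):
    """Gaussian kernel matrix between the rows of a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(x, y, gamma=0.5):
    """Squared RKHS distance between the empirical mean embeddings of two samples."""
    return k(x, x, gamma).mean() + k(y, y, gamma).mean() - 2 * k(x, y, gamma).mean()

rng = np.random.default_rng(0)
p = rng.normal(0.0, 1.0, size=(300, 1))
q = rng.normal(0.5, 1.0, size=(300, 1))
# same distribution vs. shifted distribution: the second value should be clearly larger
print(mmd2(p[:150], p[150:]), mmd2(p[:150], q[:150]))
```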

  5. arXiv:1602.04361  [pdf, ps, other]  math.ST

    Minimax Estimation of Kernel Mean Embeddings

    Authors: Ilya Tolstikhin, Bharath Sriperumbudur, Krikamol Muandet

    Abstract: In this paper, we study the minimax estimation of the Bochner integral $$\mu_k(P):=\int_{\mathcal{X}} k(\cdot,x)\,dP(x),$$ also called the kernel mean embedding, based on random samples drawn i.i.d.~from $P$, where $k:\mathcal{X}\times\mathcal{X}\rightarrow\mathbb{R}$ is a positive definite kernel. Various estimators (including the empirical estimator), $\hat{\theta}_n$, of $\mu_k(P)$…

    Submitted 31 July, 2017; v1 submitted 13 February, 2016; originally announced February 2016.

    MSC Class: 62G05; 62G07
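    For reference, the natural plug-in estimator of the Bochner integral named in the abstract is the empirical mean embedding, whose RKHS error decays at the familiar root-n rate that the paper examines from a minimax viewpoint. The display below only restates the abstract's definition together with that standard rate for bounded kernels; it is not a new claim about the paper's results.

```latex
\mu_k(P) := \int_{\mathcal{X}} k(\cdot, x)\, dP(x),
\qquad
\hat{\mu}_k := \frac{1}{n} \sum_{i=1}^{n} k(\cdot, x_i),
\qquad
\mathbb{E}\,\bigl\| \hat{\mu}_k - \mu_k(P) \bigr\|_{\mathcal{H}_k} = O\!\bigl(n^{-1/2}\bigr).
```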

  6. arXiv:1502.02398  [pdf, other]  stat.ML

    Towards a Learning Theory of Cause-Effect Inference

    Authors: David Lopez-Paz, Krikamol Muandet, Bernhard Schölkopf, Ilya Tolstikhin

    Abstract: We pose causal inference as the problem of learning to classify probability distributions. In particular, we assume access to a collection $\{(S_i,l_i)\}_{i=1}^n$, where each $S_i$ is a sample drawn from the probability distribution of $X_i \times Y_i$, and $l_i$ is a binary label indicating whether "$X_i \to Y_i$" or "$X_i \leftarrow Y_i$". Given these data, we build a causal inference rule in tw…

    Submitted 18 May, 2015; v1 submitted 9 February, 2015; originally announced February 2015.
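    A hedged sketch of the "classify distributions" viewpoint: summarize each sample $S_i$ by an approximate mean-embedding feature vector and train an ordinary classifier on the labels $l_i$. The random Fourier features, the toy additive-noise pairs, and the logistic-regression classifier are illustrative choices, not the paper's exact construction or guarantees.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def rff(z, W, b):
    """Random Fourier features averaged over a sample: an approximate mean embedding."""
    return np.sqrt(2.0 / W.shape[1]) * np.cos(z @ W + b).mean(axis=0)

rng = np.random.default_rng(0)
D = 200
W = rng.normal(size=(2, D)); b = rng.uniform(0, 2 * np.pi, size=D)

def make_pair(n=300):
    """Toy additive-noise pair X -> Y; label 1 if presented as (cause, effect)."""
    x = rng.normal(size=(n, 1)); y = np.tanh(x) + 0.1 * rng.normal(size=(n, 1))
    if rng.random() < 0.5:
        return np.hstack([x, y]), 1      # presented as X -> Y
    return np.hstack([y, x]), 0          # presented flipped: X <- Y

samples = [make_pair() for _ in range(400)]
F = np.array([rff(s, W, b) for s, _ in samples])
labels = np.array([lab for _, lab in samples])
clf = LogisticRegression(max_iter=1000).fit(F[:300], labels[:300])
print("held-out accuracy:", clf.score(F[300:], labels[300:]))
```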

  7. arXiv:1501.06794  [pdf, other]  stat.ML

    Computing Functions of Random Variables via Reproducing Kernel Hilbert Space Representations

    Authors: Bernhard Schölkopf, Krikamol Muandet, Kenji Fukumizu, Jonas Peters

    Abstract: We describe a method to perform functional operations on probability distributions of random variables. The method uses reproducing kernel Hilbert space representations of probability distributions, and it is applicable to all operations which can be applied to points drawn from the respective distributions. We refer to our approach as {\em kernel probabilistic programming}. We illustrate it on sy…

    Submitted 27 January, 2015; originally announced January 2015.

    ACM Class: G.3; I.2.6; D.3.3

    Journal ref: Statistics and Computing 25:755-766 (2015)
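    The core idea, as the abstract describes it, is that any operation applicable to sample points can be pushed through a sample-based representation of a distribution, and the output sample is then embedded exactly like the inputs. A minimal sketch; the kernel, the bandwidth, and the random pairing of independent samples are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(1.0, 0.2, size=500)           # sample representing p(X)
y = rng.normal(2.0, 0.3, size=500)           # sample representing p(Y), independent of X

# Apply the operation pointwise to randomly paired samples; the result is a sample
# representing the distribution of f(X, Y) = X * Y.
z = x * rng.permutation(y)

def mean_embedding(sample, t, gamma=2.0):
    """Evaluate the empirical mean embedding (1/n) * sum_i k(t, s_i) at points t."""
    return np.exp(-gamma * (t[:, None] - sample[None, :]) ** 2).mean(axis=1)

t = np.linspace(0, 4, 5)
print(mean_embedding(z, t))                   # embedding of the output distribution
```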

  8. arXiv:1411.0900  [pdf, ps, other]  stat.ML

    Kernel Mean Estimation via Spectral Filtering

    Authors: Krikamol Muandet, Bharath Sriperumbudur, Bernhard Schölkopf

    Abstract: The problem of estimating the kernel mean in a reproducing kernel Hilbert space (RKHS) is central to kernel methods in that it is used by classical approaches (e.g., when centering a kernel PCA matrix), and it also forms the core inference step of modern kernel methods (e.g., kernel-based non-parametric tests) that rely on embedding probability distributions in RKHSs. Muandet et al. (2014) has sho…

    Submitted 4 November, 2014; originally announced November 2014.

    Comments: To appear at the 28th Annual Conference on Neural Information Processing Systems (NIPS 2014). 16 pages

  9. arXiv:1409.4366  [pdf, other]  stat.ML

    The Randomized Causation Coefficient

    Authors: David Lopez-Paz, Krikamol Muandet, Benjamin Recht

    Abstract: We are interested in learning causal relationships between pairs of random variables, purely from observational data. To effectively address this task, the state-of-the-art relies on strong assumptions regarding the mechanisms mapping causes to effects, such as invertibility or the existence of additive noise, which only hold in limited situations. On the contrary, this short paper proposes to lea…

    Submitted 15 September, 2014; originally announced September 2014.

  10. arXiv:1408.2064  [pdf]  cs.LG

    One-Class Support Measure Machines for Group Anomaly Detection

    Authors: Krikamol Muandet, Bernhard Schoelkopf

    Abstract: We propose one-class support measure machines (OCSMMs) for group anomaly detection, which aims at recognizing anomalous aggregate behaviors of data points. The OCSMMs generalize well-known one-class support vector machines (OCSVMs) to a space of probability measures. By formulating the problem as quantile estimation on distributions, we can establish an interesting connection to the OCSVMs and vari…

    Submitted 9 August, 2014; originally announced August 2014.

    Comments: Appears in Proceedings of the Twenty-Ninth Conference on Uncertainty in Artificial Intelligence (UAI2013)

    Report number: UAI-P-2013-PG-449-458
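    One way to read "generalize OCSVMs to a space of probability measures" is to hand a standard one-class SVM a precomputed kernel between groups, where each kernel entry is the inner product of the groups' empirical mean embeddings. The sketch below does exactly that with scikit-learn; it is a hedged illustration of the idea, not the paper's optimization problem or solver, and the toy data are arbitrary.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def emb_kernel(A, B, gamma=0.5):
    """Inner product of the empirical mean embeddings of two groups of points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2).mean()

rng = np.random.default_rng(0)
normal_groups = [rng.normal(0, 1, size=(50, 2)) for _ in range(30)]
odd_group = rng.normal(0, 1, size=(50, 2)) @ np.array([[3.0, 0.0], [0.0, 0.1]])  # anomalous shape

K = np.array([[emb_kernel(a, b) for b in normal_groups] for a in normal_groups])
oc = OneClassSVM(kernel="precomputed", nu=0.1).fit(K)

k_new = np.array([[emb_kernel(odd_group, b) for b in normal_groups]])
print(oc.predict(k_new))   # -1 would flag the group as anomalous (outcome depends on the toy data)
```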

  11. arXiv:1405.5505  [pdf, ps, other]  stat.ML

    Kernel Mean Shrinkage Estimators

    Authors: Krikamol Muandet, Bharath Sriperumbudur, Kenji Fukumizu, Arthur Gretton, Bernhard Schölkopf

    Abstract: A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is central to kernel methods in that it is used by many classical algorithms such as kernel principal component analysis, and it also forms the core inference step of modern kernel methods that rely on embedding probability distributions in RKHSs. Given a finite sample, an empirical average has been used commonly as a…

    Submitted 25 February, 2016; v1 submitted 21 May, 2014; originally announced May 2014.

    Comments: 41 pages
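    The simplest member of this family of estimators shrinks the empirical mean embedding toward a fixed element of the RKHS, trading a little bias for lower variance. The display below is an illustrative special case; the paper studies a broader class of estimators and how to choose the shrinkage amount.

```latex
% Shrink the empirical mean embedding toward a fixed target f^* in the RKHS:
\hat{\mu}_{\alpha} := \alpha\, f^{*} + (1-\alpha)\,\hat{\mu},
\qquad
\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} k(\cdot, x_i),
\qquad 0 \le \alpha \le 1,
% for suitable \alpha the RKHS risk E\|\hat{\mu}_{\alpha} - \mu\|^2 can fall below
% that of the unshrunk estimator \hat{\mu}.
```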

  12. arXiv:1306.0842  [pdf, ps, other]  stat.ML

    Kernel Mean Estimation and Stein's Effect

    Authors: Krikamol Muandet, Kenji Fukumizu, Bharath Sriperumbudur, Arthur Gretton, Bernhard Schölkopf

    Abstract: A mean function in a reproducing kernel Hilbert space, or a kernel mean, is an important part of many applications ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given finite samples, an empirical average is the standard estimate for the true kernel mean. We show that this estimator can be improved via a well-known phenomenon in statistics called Stein'…

    Submitted 6 June, 2013; v1 submitted 4 June, 2013; originally announced June 2013.

    Comments: first draft

  13. arXiv:1303.0309  [pdf, ps, other]  stat.ML

    One-Class Support Measure Machines for Group Anomaly Detection

    Authors: Krikamol Muandet, Bernhard Schölkopf

    Abstract: We propose one-class support measure machines (OCSMMs) for group anomaly detection, which aims at recognizing anomalous aggregate behaviors of data points. The OCSMMs generalize well-known one-class support vector machines (OCSVMs) to a space of probability measures. By formulating the problem as quantile estimation on distributions, we can establish an interesting connection to the OCSVMs and vari…

    Submitted 1 June, 2013; v1 submitted 1 March, 2013; originally announced March 2013.

    Comments: Conference on Uncertainty in Artificial Intelligence (UAI2013)

  14. arXiv:1301.2115  [pdf, ps, other]  stat.ML

    Domain Generalization via Invariant Feature Representation

    Authors: Krikamol Muandet, David Balduzzi, Bernhard Schölkopf

    Abstract: This paper investigates domain generalization: How to take knowledge acquired from an arbitrary number of related domains and apply it to previously unseen domains? We propose Domain-Invariant Component Analysis (DICA), a kernel-based optimization algorithm that learns an invariant transformation by minimizing the dissimilarity across domains, whilst preserving the functional relationship between…

    Submitted 10 January, 2013; originally announced January 2013.

    Comments: The 30th International Conference on Machine Learning (ICML 2013)

  15. arXiv:1210.4347  [pdf, ps, other]  stat.ML

    Hilbert Space Embedding for Dirichlet Process Mixtures

    Authors: Krikamol Muandet

    Abstract: This paper proposes a Hilbert space embedding for Dirichlet Process mixture models via the stick-breaking construction of Sethuraman. Although Bayesian nonparametrics offers a powerful approach to construct a prior that avoids the need to specify the model size/complexity explicitly, exact inference is often intractable. On the other hand, frequentist approaches such as kernel machines, which suf…

    Submitted 16 October, 2012; originally announced October 2012.

    Comments: NIPS 2012 Workshop on the confluence between kernel methods and graphical models
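    For context, Sethuraman's stick-breaking construction named in the abstract represents a Dirichlet process draw as a discrete random measure, so its kernel mean embedding is a weighted sum of feature maps. The display is generic; the truncation to $K$ atoms and any further inference steps are illustrative assumptions, not the paper's specific procedure.

```latex
% Stick-breaking representation of a Dirichlet process draw and its mean embedding:
P = \sum_{k=1}^{\infty} \beta_k\, \delta_{\theta_k},
\qquad v_k \sim \mathrm{Beta}(1,\alpha),\quad
\beta_k = v_k \prod_{j<k}(1-v_j),\quad \theta_k \sim H,
% so that
\mu_P = \int k(\cdot, x)\, dP(x)
      = \sum_{k=1}^{\infty} \beta_k\, k(\cdot, \theta_k)
      \;\approx\; \sum_{k=1}^{K} \beta_k\, k(\cdot, \theta_k).
```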

  16. arXiv:1202.6504  [pdf, ps, other]  stat.ML

    Learning from Distributions via Support Measure Machines

    Authors: Krikamol Muandet, Kenji Fukumizu, Francesco Dinuzzo, Bernhard Schölkopf

    Abstract: This paper presents a kernel-based discriminative learning framework on probability measures. Rather than relying on large collections of vectorial training examples, our framework learns using a collection of probability distributions that have been constructed to meaningfully represent training data. By representing these probability distributions as mean embeddings in the reproducing kernel Hil…

    Submitted 12 January, 2013; v1 submitted 29 February, 2012; originally announced February 2012.

    Comments: Advances in Neural Information Processing Systems 25
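    The ingredient that lets a standard kernel machine operate on distributions themselves is a kernel between probability measures given by the inner product of their mean embeddings, estimated from samples. The display below shows the generic linear form; the specific kernels used in the paper may differ.

```latex
K(P, Q) = \langle \mu_P, \mu_Q \rangle_{\mathcal{H}}
        = \iint k(x, y)\, dP(x)\, dQ(y)
        \;\approx\; \frac{1}{nm} \sum_{i=1}^{n} \sum_{j=1}^{m} k(x_i, y_j),
% with samples \{x_i\}_{i=1}^{n} \sim P and \{y_j\}_{j=1}^{m} \sim Q.
```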