.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/mixture/plot_gmm_sin.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_mixture_plot_gmm_sin.py>`
        to download the full example code or to run this example in your browser via Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_mixture_plot_gmm_sin.py:


=================================
Gaussian Mixture Model Sine Curve
=================================

This example demonstrates the behavior of Gaussian mixture models fit on data
that was not sampled from a mixture of Gaussian random variables. The dataset
is formed by 100 points loosely spaced following a noisy sine curve. There is
therefore no ground truth value for the number of Gaussian components.

The first model is a classical Gaussian mixture model with 10 components fit
with the Expectation-Maximization algorithm.

The second model is a Bayesian Gaussian mixture model with a Dirichlet process
prior fit with variational inference. The low value of the concentration prior
makes the model favor a lower number of active components. This model "decides"
to focus its modeling power on the big picture of the structure of the dataset:
groups of points with alternating directions modeled by non-diagonal covariance
matrices. Those alternating directions roughly capture the alternating nature
of the original sine signal.

The third model is also a Bayesian Gaussian mixture model with a Dirichlet
process prior, but this time the value of the concentration prior is higher,
giving the model more liberty to model the fine-grained structure of the data.
The result is a mixture with a larger number of active components that is
similar to the first model, where we arbitrarily decided to fix the number of
components to 10.

Which model is the best is a matter of subjective judgment: do we want to favor
models that only capture the big picture, summarizing and explaining most of
the structure of the data while ignoring the details, or do we prefer models
that closely follow the high density regions of the signal?

The last two panels show how we can sample from the last two models. The
resulting sample distributions do not look exactly like the original data
distribution. The difference primarily stems from the approximation error we
made by using a model that assumes the data was generated by a finite number of
Gaussian components instead of a continuous noisy sine curve.
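
The notion of "active components" used above can be made concrete by inspecting
the mixture weights of a fitted model: components whose posterior weight is
essentially zero are left unused. The snippet below is a minimal, self-contained
sketch and is not part of the generated script that follows; the toy data, the
variable names and the ``1e-2`` weight threshold are illustrative assumptions
only.

.. code-block:: Python

    # Minimal sketch: count the components a BayesianGaussianMixture actually uses.
    # The toy data and the 1e-2 weight threshold are illustrative choices only.
    import numpy as np

    from sklearn import mixture

    rng = np.random.RandomState(0)
    X_demo = rng.randn(200, 2)  # any 2D point cloud works for this illustration

    bgmm = mixture.BayesianGaussianMixture(
        n_components=10,
        weight_concentration_prior_type="dirichlet_process",
        weight_concentration_prior=1e-2,  # low value -> fewer active components
        random_state=0,
    ).fit(X_demo)

    # Components with (near-)zero posterior weight are effectively unused.
    n_active = np.sum(bgmm.weights_ > 1e-2)
    print(f"{n_active} active components out of {bgmm.n_components}")
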

.. GENERATED FROM PYTHON SOURCE LINES 19-162

.. image-sg:: /auto_examples/mixture/images/sphx_glr_plot_gmm_sin_001.png
   :alt: Expectation-maximization, Bayesian Gaussian mixture models with a Dirichlet process prior for $\gamma_0=0.01$., Gaussian mixture with a Dirichlet process prior for $\gamma_0=0.01$ sampled with $2000$ samples., Bayesian Gaussian mixture models with a Dirichlet process prior for $\gamma_0=100$, Gaussian mixture with a Dirichlet process prior for $\gamma_0=100$ sampled with $2000$ samples.
   :srcset: /auto_examples/mixture/images/sphx_glr_plot_gmm_sin_001.png
   :class: sphx-glr-single-img

.. code-block:: Python

    import itertools

    import matplotlib as mpl
    import matplotlib.pyplot as plt
    import numpy as np
    from scipy import linalg

    from sklearn import mixture

    color_iter = itertools.cycle(["navy", "c", "cornflowerblue", "gold", "darkorange"])


    def plot_results(X, Y, means, covariances, index, title):
        splot = plt.subplot(5, 1, 1 + index)
        for i, (mean, covar, color) in enumerate(zip(means, covariances, color_iter)):
            v, w = linalg.eigh(covar)
            v = 2.0 * np.sqrt(2.0) * np.sqrt(v)
            u = w[0] / linalg.norm(w[0])
            # As the DP (Dirichlet process) will not use every component it has
            # access to unless it needs it, we shouldn't plot the redundant
            # components.
            if not np.any(Y == i):
                continue
            plt.scatter(X[Y == i, 0], X[Y == i, 1], 0.8, color=color)

            # Plot an ellipse to show the Gaussian component
            angle = np.arctan(u[1] / u[0])
            angle = 180.0 * angle / np.pi  # convert to degrees
            ell = mpl.patches.Ellipse(mean, v[0], v[1], angle=180.0 + angle, color=color)
            ell.set_clip_box(splot.bbox)
            ell.set_alpha(0.5)
            splot.add_artist(ell)

        plt.xlim(-6.0, 4.0 * np.pi - 6.0)
        plt.ylim(-5.0, 5.0)
        plt.title(title)
        plt.xticks(())
        plt.yticks(())


    def plot_samples(X, Y, n_components, index, title):
        plt.subplot(5, 1, 4 + index)
        for i, color in zip(range(n_components), color_iter):
            # As the DP (Dirichlet process) will not use every component it has
            # access to unless it needs it, we shouldn't plot the redundant
            # components.
            if not np.any(Y == i):
                continue
            plt.scatter(X[Y == i, 0], X[Y == i, 1], 0.8, color=color)

        plt.xlim(-6.0, 4.0 * np.pi - 6.0)
        plt.ylim(-5.0, 5.0)
        plt.title(title)
        plt.xticks(())
        plt.yticks(())


    # Parameters
    n_samples = 100

    # Generate random sample following a sine curve
    np.random.seed(0)
    X = np.zeros((n_samples, 2))
    step = 4.0 * np.pi / n_samples

    for i in range(X.shape[0]):
        x = i * step - 6.0
        X[i, 0] = x + np.random.normal(0, 0.1)
        X[i, 1] = 3.0 * (np.sin(x) + np.random.normal(0, 0.2))

    plt.figure(figsize=(10, 10))
    plt.subplots_adjust(
        bottom=0.04, top=0.95, hspace=0.2, wspace=0.05, left=0.03, right=0.97
    )

    # Fit a Gaussian mixture with EM using ten components
    gmm = mixture.GaussianMixture(
        n_components=10, covariance_type="full", max_iter=100
    ).fit(X)
    plot_results(
        X, gmm.predict(X), gmm.means_, gmm.covariances_, 0, "Expectation-maximization"
    )

    dpgmm = mixture.BayesianGaussianMixture(
        n_components=10,
        covariance_type="full",
        weight_concentration_prior=1e-2,
        weight_concentration_prior_type="dirichlet_process",
        mean_precision_prior=1e-2,
        covariance_prior=1e0 * np.eye(2),
        init_params="random",
        max_iter=100,
        random_state=2,
    ).fit(X)
    plot_results(
        X,
        dpgmm.predict(X),
        dpgmm.means_,
        dpgmm.covariances_,
        1,
        "Bayesian Gaussian mixture models with a Dirichlet process prior "
        r"for $\gamma_0=0.01$.",
    )

    X_s, y_s = dpgmm.sample(n_samples=2000)
    plot_samples(
        X_s,
        y_s,
        dpgmm.n_components,
        0,
        "Gaussian mixture with a Dirichlet process prior "
        r"for $\gamma_0=0.01$ sampled with $2000$ samples.",
    )

    dpgmm = mixture.BayesianGaussianMixture(
        n_components=10,
        covariance_type="full",
        weight_concentration_prior=1e2,
        weight_concentration_prior_type="dirichlet_process",
        mean_precision_prior=1e-2,
        covariance_prior=1e0 * np.eye(2),
        init_params="kmeans",
        max_iter=100,
        random_state=2,
    ).fit(X)
    plot_results(
        X,
        dpgmm.predict(X),
        dpgmm.means_,
        dpgmm.covariances_,
        2,
        "Bayesian Gaussian mixture models with a Dirichlet process prior "
        r"for $\gamma_0=100$",
    )

    X_s, y_s = dpgmm.sample(n_samples=2000)
    plot_samples(
        X_s,
        y_s,
        dpgmm.n_components,
        1,
        "Gaussian mixture with a Dirichlet process prior "
        r"for $\gamma_0=100$ sampled with $2000$ samples.",
    )

    plt.show()

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 0.189 seconds)


.. _sphx_glr_download_auto_examples_mixture_plot_gmm_sin.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/scikit-learn/scikit-learn/main?urlpath=lab/tree/notebooks/auto_examples/mixture/plot_gmm_sin.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_gmm_sin.ipynb <plot_gmm_sin.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_gmm_sin.py <plot_gmm_sin.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_gmm_sin.zip <plot_gmm_sin.zip>`

.. include:: plot_gmm_sin.recommendations

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_