.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/preprocessing/plot_map_data_to_normal.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_preprocessing_plot_map_data_to_normal.py>`
        to download the full example code or to run this example in your browser via Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_preprocessing_plot_map_data_to_normal.py:

=================================
Map data to a normal distribution
=================================

.. currentmodule:: sklearn.preprocessing

This example demonstrates the use of the Box-Cox and Yeo-Johnson transforms
through :class:`~PowerTransformer` to map data from various distributions to
a normal distribution.

Power transforms are useful in modeling problems where homoscedasticity and
normality are desired. Below are examples of Box-Cox and Yeo-Johnson applied
to six different probability distributions: lognormal, chi-squared, Weibull,
Gaussian, uniform, and bimodal.

Note that these transformations successfully map the data to a normal
distribution when applied to certain datasets, but are ineffective with
others. This highlights the importance of visualizing the data before and
after transformation.

Also note that even though Box-Cox seems to perform better than Yeo-Johnson
for the lognormal and chi-squared distributions, keep in mind that Box-Cox
does not support inputs with negative values.

For comparison, we also add the output of :class:`~QuantileTransformer`. It
can force any arbitrary distribution into a Gaussian, provided that there are
enough training samples (thousands). Because it is a non-parametric method,
it is harder to interpret than the parametric ones (Box-Cox and Yeo-Johnson).

On "small" datasets (less than a few hundred points), the quantile transformer
is prone to overfitting. The use of the power transform is then recommended.

.. GENERATED FROM PYTHON SOURCE LINES 22-130

.. image-sg:: /auto_examples/preprocessing/images/sphx_glr_plot_map_data_to_normal_001.png
   :alt: Lognormal, Chi-squared, Weibull, After Box-Cox $\lambda$ = 0.04, After Box-Cox $\lambda$ = 0.27, After Box-Cox $\lambda$ = 12.83, After Yeo-Johnson $\lambda$ = -0.79, After Yeo-Johnson $\lambda$ = -0.13, After Yeo-Johnson $\lambda$ = 25.0, After Quantile transform, After Quantile transform, After Quantile transform, Gaussian, Uniform, Bimodal, After Box-Cox $\lambda$ = 4.92, After Box-Cox $\lambda$ = 0.63, After Box-Cox $\lambda$ = -1.66, After Yeo-Johnson $\lambda$ = 4.96, After Yeo-Johnson $\lambda$ = 0.37, After Yeo-Johnson $\lambda$ = -1.69, After Quantile transform, After Quantile transform, After Quantile transform
   :srcset: /auto_examples/preprocessing/images/sphx_glr_plot_map_data_to_normal_001.png
   :class: sphx-glr-single-img
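As a quick illustration of the note above on negative inputs, here is a
minimal sketch using a small toy array (the array is illustrative, not part
of the dataset used in this example): Yeo-Johnson is defined on the whole
real line, whereas Box-Cox raises a ``ValueError`` when the data is not
strictly positive.

.. code-block:: Python

    import numpy as np

    from sklearn.preprocessing import PowerTransformer

    X = np.array([[-1.0], [0.5], [2.0]])  # contains a negative value

    # Yeo-Johnson accepts zero and negative values
    print(PowerTransformer(method="yeo-johnson").fit_transform(X).ravel())

    # Box-Cox only accepts strictly positive values, so this raises an error
    try:
        PowerTransformer(method="box-cox").fit_transform(X)
    except ValueError as exc:
        print(exc)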
.. code-block:: Python


    # Authors: The scikit-learn developers
    # SPDX-License-Identifier: BSD-3-Clause

    import matplotlib.pyplot as plt
    import numpy as np

    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import PowerTransformer, QuantileTransformer

    N_SAMPLES = 1000
    FONT_SIZE = 6
    BINS = 30


    rng = np.random.RandomState(304)
    bc = PowerTransformer(method="box-cox")
    yj = PowerTransformer(method="yeo-johnson")
    # n_quantiles is set to the training set size rather than the default value
    # to avoid a warning being raised by this example
    qt = QuantileTransformer(
        n_quantiles=500, output_distribution="normal", random_state=rng
    )
    size = (N_SAMPLES, 1)

    # lognormal distribution
    X_lognormal = rng.lognormal(size=size)

    # chi-squared distribution
    df = 3
    X_chisq = rng.chisquare(df=df, size=size)

    # weibull distribution
    a = 50
    X_weibull = rng.weibull(a=a, size=size)

    # gaussian distribution
    loc = 100
    X_gaussian = rng.normal(loc=loc, size=size)

    # uniform distribution
    X_uniform = rng.uniform(low=0, high=1, size=size)

    # bimodal distribution
    loc_a, loc_b = 100, 105
    X_a, X_b = rng.normal(loc=loc_a, size=size), rng.normal(loc=loc_b, size=size)
    X_bimodal = np.concatenate([X_a, X_b], axis=0)


    # create plots
    distributions = [
        ("Lognormal", X_lognormal),
        ("Chi-squared", X_chisq),
        ("Weibull", X_weibull),
        ("Gaussian", X_gaussian),
        ("Uniform", X_uniform),
        ("Bimodal", X_bimodal),
    ]

    colors = ["#D81B60", "#0188FF", "#FFC107", "#B7A2FF", "#000000", "#2EC5AC"]

    fig, axes = plt.subplots(nrows=8, ncols=3, figsize=plt.figaspect(2))
    axes = axes.flatten()
    axes_idxs = [
        (0, 3, 6, 9),
        (1, 4, 7, 10),
        (2, 5, 8, 11),
        (12, 15, 18, 21),
        (13, 16, 19, 22),
        (14, 17, 20, 23),
    ]
    axes_list = [(axes[i], axes[j], axes[k], axes[l]) for (i, j, k, l) in axes_idxs]


    for distribution, color, axes in zip(distributions, colors, axes_list):
        name, X = distribution
        X_train, X_test = train_test_split(X, test_size=0.5)

        # perform power transforms and quantile transform
        X_trans_bc = bc.fit(X_train).transform(X_test)
        lmbda_bc = round(bc.lambdas_[0], 2)
        X_trans_yj = yj.fit(X_train).transform(X_test)
        lmbda_yj = round(yj.lambdas_[0], 2)
        X_trans_qt = qt.fit(X_train).transform(X_test)

        ax_original, ax_bc, ax_yj, ax_qt = axes

        ax_original.hist(X_train, color=color, bins=BINS)
        ax_original.set_title(name, fontsize=FONT_SIZE)
        ax_original.tick_params(axis="both", which="major", labelsize=FONT_SIZE)

        for ax, X_trans, meth_name, lmbda in zip(
            (ax_bc, ax_yj, ax_qt),
            (X_trans_bc, X_trans_yj, X_trans_qt),
            ("Box-Cox", "Yeo-Johnson", "Quantile transform"),
            (lmbda_bc, lmbda_yj, None),
        ):
            ax.hist(X_trans, color=color, bins=BINS)
            title = "After {}".format(meth_name)
            if lmbda is not None:
                title += "\n$\\lambda$ = {}".format(lmbda)
            ax.set_title(title, fontsize=FONT_SIZE)
            ax.tick_params(axis="both", which="major", labelsize=FONT_SIZE)
            ax.set_xlim([-3.5, 3.5])


    plt.tight_layout()
    plt.show()
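Beyond eyeballing the histograms above, one way to check whether a mapping
succeeded is to run a normality test on the transformed sample. The sketch
below is an illustrative addition: it draws a fresh lognormal sample the same
way as in the example and scores each transformer with SciPy's
``scipy.stats.normaltest``; the exact p-values depend on the random seed.

.. code-block:: Python

    import numpy as np
    from scipy import stats

    from sklearn.preprocessing import PowerTransformer, QuantileTransformer

    rng = np.random.RandomState(304)
    X_lognormal = rng.lognormal(size=(1000, 1))

    transformers = [
        ("Box-Cox", PowerTransformer(method="box-cox")),
        ("Yeo-Johnson", PowerTransformer(method="yeo-johnson")),
        (
            "Quantile transform",
            QuantileTransformer(
                n_quantiles=500, output_distribution="normal", random_state=0
            ),
        ),
    ]

    for meth_name, transformer in transformers:
        X_trans = transformer.fit_transform(X_lognormal)
        # D'Agostino-Pearson test: a larger p-value means the transformed
        # sample is harder to distinguish from a Gaussian
        _, p_value = stats.normaltest(X_trans.ravel())
        print(f"{meth_name}: normaltest p-value = {p_value:.3f}")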
.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 0.736 seconds)


.. _sphx_glr_download_auto_examples_preprocessing_plot_map_data_to_normal.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/scikit-learn/scikit-learn/main?urlpath=lab/tree/notebooks/auto_examples/preprocessing/plot_map_data_to_normal.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_map_data_to_normal.ipynb <plot_map_data_to_normal.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_map_data_to_normal.py <plot_map_data_to_normal.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_map_data_to_normal.zip <plot_map_data_to_normal.zip>`

.. include:: plot_map_data_to_normal.recommendations

.. only:: html

  .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_