Jensen–Shannon divergence
The Jensen–Shannon divergence (JS divergence) is a method of measuring the similarity between two probability distributions. It is based on the Kullback–Leibler divergence (KL divergence), with some notable (and useful) differences, including that it is symmetric and always has a finite value.
The Jensen–Shannon divergence (JSD) is a symmetrized and smoothed version of the Kullback–Leibler divergence. It is defined by

$$\mathrm{JSD}(P \parallel Q) = \tfrac{1}{2}\, D_{\mathrm{KL}}(P \parallel M) + \tfrac{1}{2}\, D_{\mathrm{KL}}(Q \parallel M),$$

where $M = \tfrac{1}{2}(P + Q)$ is the mixture of the two distributions and $D_{\mathrm{KL}}$ denotes the Kullback–Leibler divergence.
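For reference, here is a minimal NumPy sketch of this definition (the function names `kl_divergence` and `js_divergence` are illustrative, not from any particular library); it computes the divergence in nats, i.e. using the natural logarithm.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats.

    Terms with p_i == 0 contribute zero by convention; q must be
    positive wherever p is positive.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence JSD(P || Q) in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # the mixture distribution M
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```

Dividing the result by $\ln 2$ converts it from nats to bits, which matters for the bounds discussed next.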
The Jensen–Shannon divergence between two probability distributions is bounded by 1 when the base 2 logarithm is used:

$$0 \le \mathrm{JSD}(P \parallel Q) \le 1.$$

For the natural logarithm (base $e$), which is commonly used in statistical thermodynamics, the upper bound is $\ln 2$:

$$0 \le \mathrm{JSD}(P \parallel Q) \le \ln 2.$$
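As a quick sanity check on these bounds, the sketch below uses SciPy's `scipy.spatial.distance.jensenshannon`, which returns the Jensen–Shannon *distance* (the square root of the divergence), and squares the result; two distributions with disjoint support attain the upper bound.

```python
from scipy.spatial.distance import jensenshannon

p = [1.0, 0.0]
q = [0.0, 1.0]

# jensenshannon returns the JS distance (square root of the divergence),
# so square it to recover the divergence itself.
print(jensenshannon(p, q) ** 2)          # ~0.6931 = ln(2), the upper bound with the natural log
print(jensenshannon(p, q, base=2) ** 2)  # ~1.0, the upper bound with log base 2
```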