Jensen–Shannon divergence


The Jensen–Shannon divergence (JS divergence) is a method of measuring the similarity between two probability distributions. It is based on the Kullback–Leibler divergence (KL divergence), with some notable (and useful) differences, including that it is symmetric and always takes a finite value.

The Jensen–Shannon divergence (JSD) is a symmetrized and smoothed version of the Kullback–Leibler divergence. It is defined by

$$\mathrm{JSD}(P \parallel Q) = \tfrac{1}{2}\, D_{\mathrm{KL}}(P \parallel M) + \tfrac{1}{2}\, D_{\mathrm{KL}}(Q \parallel M),$$

where $M = \tfrac{1}{2}(P + Q)$ is the mixture of the two distributions and $D_{\mathrm{KL}}(P \parallel M) = \sum_i P(i) \log \frac{P(i)}{M(i)}$ is the Kullback–Leibler divergence.
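To make the definition concrete, here is a minimal sketch in Python for discrete distributions given as arrays that sum to 1. The helper names kl_divergence and js_divergence are illustrative, not from any particular library:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions.

    Terms where p == 0 contribute nothing, following the convention
    0 * log(0 / q) = 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL divergence to the mixture M."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # M = (P + Q) / 2, nonzero wherever p or q is nonzero
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```

For reference, scipy.spatial.distance.jensenshannon computes the Jensen–Shannon *distance*, the square root of this divergence, so its output is not directly comparable to the value returned by the sketch above.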

The Jensen–Shannon divergence is bounded by 1 for two probability distributions, given that one uses the base-2 logarithm:

$$0 \le \mathrm{JSD}(P \parallel Q) \le 1.$$

For log base e, or ln, which is commonly used in statistical thermodynamics, the upper bound is ln(2):

$$0 \le \mathrm{JSD}(P \parallel Q) \le \ln 2.$$
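As a quick sanity check of these bounds (continuing with the js_divergence sketch above), two distributions with disjoint support attain the maximum, and converting between log bases is just a division by ln 2:

```python
# Two distributions with disjoint support attain the upper bound.
p = [1.0, 0.0]
q = [0.0, 1.0]

jsd_nats = js_divergence(p, q)   # natural log: equals ln(2) ~= 0.6931
jsd_bits = jsd_nats / np.log(2)  # base-2 log:  equals 1.0
print(jsd_nats, jsd_bits)
```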
