The Semantic Information Method for Maximum Mutual Information and Maximum Likelihood of Tests, Estimations, and Mixture Models. (arXiv:1706.07918v1 [cs.IT])
Source: arXiv
It is very difficult to solve for the Maximum Mutual Information (MMI) or
Maximum Likelihood (ML) over all possible Shannon channels, i.e., uncertain
rules for choosing hypotheses, so iterative methods are needed. According to
the Semantic Mutual Information (SMI) and the R(G) function proposed by
Chenguang Lu (1993) (where R(G) is an extension of the information
rate-distortion function R(D), and G is the lower limit of the SMI), we can
obtain a new iterative algorithm for solving the MMI and ML for tests,
estimations, and mixture models.
The SMI is defined as the average log normalized likelihood. The likelihood
function is produced from the truth function and the prior by semantic
Bayesian inference. A group of truth functions constitutes a semantic channel.
By letting the semantic channel and the Shannon channel mutually match and
iterate, we can obtain the Shannon channel that maximizes the Shannon mutual
information and the average log-likelihood. This iterative algorithm is called
the Channels' Matching algorithm.
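The mutual matching-and-iteration described above can be pictured as an EM-style alternation: a semantic-Bayes step produces likelihoods from truth functions, a matching step forms the Shannon channel P(y|x), and a refit step updates the truth-function parameters. The following is a minimal illustrative sketch for a two-component mixture on a discretized X. The Gaussian truth functions, the specific update rules, and all parameter values are assumptions made for illustration, not the paper's exact CM algorithm.

```python
import numpy as np

# Discrete support of X and an observed source P(x): an assumed two-bump density.
x = np.linspace(-4, 4, 81)
p_x = np.exp(-0.5 * (x + 1) ** 2) + np.exp(-0.5 * (x - 2) ** 2 / 0.5)
p_x /= p_x.sum()

def truth_fns(mus, sigmas):
    # Semantic channel: Gaussian truth functions T(theta_j|x) with peak value 1
    # (an assumed form; any group of truth functions defines a semantic channel).
    return np.array([np.exp(-0.5 * ((x - m) / s) ** 2) for m, s in zip(mus, sigmas)])

p_y = np.array([0.5, 0.5])                      # prior P(y_j)
mus = np.array([-2.0, 1.0])                     # initial guesses (assumed)
sigmas = np.array([1.0, 1.0])

for _ in range(50):
    T = truth_fns(mus, sigmas)                  # shape (2, len(x))
    # Semantic Bayesian inference: P(x|theta_j) is proportional to P(x) T(theta_j|x).
    lik = p_x * T
    lik /= lik.sum(axis=1, keepdims=True)
    # Match the Shannon channel to the semantic channel:
    # P(y_j|x) proportional to P(y_j) P(x|theta_j) / P(x)  (a Bayes step).
    post = (p_y[:, None] * lik) / p_x
    post /= post.sum(axis=0, keepdims=True)
    # Refit P(y) and the truth-function parameters from the joint P(y_j, x)
    # (an M-like step; the paper's exact update may differ).
    w = post * p_x
    p_y = w.sum(axis=1)
    mus = (w * x).sum(axis=1) / p_y
    sigmas = np.sqrt((w * (x - mus[:, None]) ** 2).sum(axis=1) / p_y)

# After iterating, p_y, mus, and sigmas approximate the two mixture components
# underlying P(x) (centered near -1 and 2 in this assumed example).
```

The sketch converges because each step can only increase the fitted average log-likelihood, which mirrors the abstract's claim that mutual matching of the two channels drives the Shannon mutual information and the average log-likelihood upward.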