• If a probability density p is known, then image information content can be estimated, regardless of its interpretation, using the entropy H. The concept of entropy has its roots in thermodynamics and statistical mechanics, but it took many years before entropy was related to information. The information-theoretic formulation of entropy comes from Shannon [Shannon, 1948] and is often called information entropy.

    An intuitive understanding of information entropy relates to the amount of uncertainty about an event associated with a given probability distribution. Entropy can serve as a measure of 'disorder': as the level of disorder rises, entropy increases and events become less predictable.

    The entropy is defined formally assuming a discrete random variable X with possible outcomes (also called states) x_1, ..., x_n. Let p(x_k) be the probability of the outcome x_k, k = 1, ..., n. Then the entropy is defined as

    H(X) = \sum_{k=1}^{n} p(x_k) \log_2 \frac{1}{p(x_k)} = -\sum_{k=1}^{n} p(x_k) \log_2 p(x_k)

    The entropy of the random variable X is the sum, over all possible outcomes k of X, of the product of the probability of the outcome x_k with the logarithm of the inverse of that probability. The quantity \log_2(1/p(x_k)) is also called the surprisal of the outcome x_k; the entropy of the discrete random variable X is thus the expected value of its outcomes' surprisal.
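
    As a small worked example (not in the original text), consider three outcomes with probabilities 1/2, 1/4, and 1/4; using base-2 logarithms,

    H = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 = 0.5 + 0.5 + 0.5 = 1.5,

    whereas two equally likely outcomes give H = 1, and a certain outcome (probability 1) gives H = 0.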

    The base of the logarithm in this formula determines the unit in which entropy is measured. If this base is two, the entropy is given in bits. Recall that in image analysis, the probability density p(x_k) needed to calculate the entropy is often estimated using a gray-level histogram (Section 2.3.2).
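
    A minimal sketch of such an estimate in Python, assuming an 8-bit gray-level image stored as a NumPy array (the function name image_entropy is an illustrative choice, not from the text):

    import numpy as np

    def image_entropy(gray_image):
        # Histogram over the 256 possible gray levels of an 8-bit image.
        hist, _ = np.histogram(gray_image, bins=256, range=(0, 256))
        # Normalized histogram as an estimate of the probability density p(x_k).
        p = hist / hist.sum()
        # Drop empty bins; by convention 0 * log2(0) is taken as 0.
        p = p[p > 0]
        # H = -sum_k p(x_k) * log2 p(x_k), measured in bits.
        return float(-np.sum(p * np.log2(p)))

    For a uniformly random 8-bit image the estimate approaches 8 bits per pixel; for a constant image it is 0.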

    Entropy measures the uncertainty about the realization of a random variable. For Shannon, it served as a proxy for the concept of information contained in a message, as opposed to the portion of the message that is strictly determined and predictable by its inherent structure. For example, we shall use entropy to assess redundancy in an image for image compression (Chapter 14).
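
    As a rough illustration of that use (a sketch building on the image_entropy function above, treating the entropy estimate as a lower bound on the average number of bits per pixel needed for lossless coding; this is not the book's own definition of redundancy):

    def information_redundancy(gray_image, bits_per_pixel=8):
        # Bits actually spent per pixel minus the entropy estimate;
        # a larger value suggests more room for lossless compression.
        return bits_per_pixel - image_entropy(gray_image)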

  • Original source: https://www.cnblogs.com/2008nmj/p/9218729.html