Information Entropy
Time Limit: 1 Sec
Memory Limit: 256 MB
Problem link:
http://acm.zju.edu.cn/onlinejudge/showProblem.do?problemCode=3827

Description
Information Theory is one of the most popular courses in Marjar University. In this course, there is an important chapter about information entropy.
Entropy is the average amount of information contained in each message received. Here, a message stands for an event, or a sample or a character drawn from a distribution or a data stream. Entropy thus characterizes our uncertainty about our source of information. The source is also characterized by the probability distribution of the samples drawn from it. The idea here is that the less likely an event is, the more information it provides when it occurs.
Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". We also call it Shannon entropy or information entropy to distinguish from other occurrences of the term, which appears in various parts of physics in different forms.
Named after Boltzmann's H-theorem, Shannon defined the entropy H (Greek capital letter eta) of a discrete random variable X with possible values {x1, x2, ..., xn} and probability mass function P(X) as:

H(X) = E[-log_b(P(X))]

Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as

H(X) = -Σ_{i=1..n} P(x_i) · log_b(P(x_i))
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10, respectively.
In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 · log_b(0) is taken to be a well-known limit:

0 · log_b(0) = lim_{p→0+} p · log_b(p) = 0
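As a quick worked example (this is the first sample case below): with probabilities {0.25, 0.25, 0.5} and b = 2,

H = -(0.25 · log2(0.25) + 0.25 · log2(0.25) + 0.5 · log2(0.5)) = 0.5 + 0.5 + 0.5 = 1.5 bits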
Your task is to calculate the entropy of a finite sample with N values.
Input
There are multiple test cases. The first line of input contains an integer T indicating the number of test cases. For each test case:
The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.
In the next line, there are N non-negative integers P1, P2, ..., PN. Pi means the probability of the i-th value in percentage, and the sum of Pi will be 100.
Output
For each test case, output the entropy in the corresponding unit.
Any solution with a relative or absolute error of at most 10^-8 will be accepted.
Sample Input
3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10
Sample Output
1.500000000000
1.480810832465
1.000000000000
HINT
Problem summary:
You are given n probabilities (as percentages) and asked to compute their entropy; the formula is given right in the statement.
Solution:
The statement already hands you the formula. When P(x_i) = 0, just skip that term, since its contribution is 0 by the limit above.
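In other words, the answer for each test case is

H = -Σ_{Pi > 0} (Pi / 100) · log_b(Pi / 100)

Since the C math library only provides the natural logarithm (and log10) directly, an arbitrary base b is handled with the change of base identity log_b(x) = ln(x) / ln(b); for instance, the answer in nats is just the answer in bits times ln(2).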
Code:
//qscqesze
#include <cstdio>
#include <cmath>
#include <iostream>
#include <string>
using namespace std;

int main()
{
    int t;
    cin >> t;
    while (t--)
    {
        int n;
        string s;
        cin >> n >> s;
        // choose the logarithm base from the unit name
        double b = 2.0;               // "bit"
        if (s == "nat") b = exp(1.0);
        if (s == "dit") b = 10.0;
        double ans = 0;
        for (int i = 0; i < n; i++)
        {
            double x;
            cin >> x;
            x /= 100.0;               // percentage -> probability
            if (x > 0)                // 0 * log(0) is taken as 0, so skip zero terms
                ans += -x * log(x) / log(b);   // change of base: log_b(x) = ln(x) / ln(b)
        }
        printf("%.12f\n", ans);
    }
    return 0;
}
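A note on output: the problem only requires 10^-8 precision, so printing 12 decimal places (as in the sample output) is plenty; the code prints each answer on its own line with printf("%.12f\n", ans). Running it on the sample input should reproduce the sample output.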