• Table of Contents: Matrix Differential Calculus with Applications in Statistics and Econometrics, 3rd ed. [Magnus 2019]

    Title
    Contents
    Preface
    Part One — Matrices	1
    1 Basic properties of vectors and matrices	3
    	1.1 Introduction	3
    	1.2 Sets	3
    	1.3 Matrices: addition and multiplication	4
    	1.4 The transpose of a matrix	6
    	1.5 Square matrices	6
    	1.6 Linear forms and quadratic forms	7
    	1.7 The rank of a matrix	9
    	1.8 The inverse	10
    	1.9 The determinant	10
    	1.10 The trace	11
    	1.11 Partitioned matrices	12
    	1.12 Complex matrices	14
    	1.13 Eigenvalues and eigenvectors	14
    	1.14 Schur’s decomposition theorem	17
    	1.15 The Jordan decomposition	18
    	1.16 The singular-value decomposition	20
    	1.17 Further results concerning eigenvalues	20
    	1.18 Positive (semi)definite matrices	23
    	1.19 Three further results for positive definite matrices	25
    	1.20 A useful result	26
    	1.21 Symmetric matrix functions	27
    	Miscellaneous exercises	28
    	Bibliographical notes	30
    2 Kronecker products, vec operator, and Moore-Penrose inverse	31
    	2.1 Introduction	31
    	2.2 The Kronecker product	31
    	2.3 Eigenvalues of a Kronecker product	33
    	2.4 The vec operator	34
    	2.5 The Moore-Penrose (MP) inverse	36
    	2.6 Existence and uniqueness of the MP inverse	37
    	2.7 Some properties of the MP inverse	38
    	2.8 Further properties	39
    	2.9 The solution of linear equation systems	41
    	Miscellaneous exercises	43
    	Bibliographical notes	45
    3 Miscellaneous matrix results	47
    	3.1 Introduction	47
    	3.2 The adjoint matrix	47
    	3.3 Proof of Theorem 3.1	49
    	3.4 Bordered determinants	51
    	3.5 The matrix equation AX = 0	51
    	3.6 The Hadamard product	52
    	3.7 The commutation matrix Kₘₙ	54
    	3.8 The duplication matrix Dₙ	56
    	3.9 Relationship between Dₙ₊₁ and Dₙ, I	58
    	3.10 Relationship between Dₙ₊₁ and Dₙ, II	59
    	3.11 Conditions for a quadratic form to be positive (negative) subject to linear constraints	60
    	3.12 Necessary and sufficient conditions for r(A : B) = r(A) + r(B)	63
    	3.13 The bordered Gramian matrix	65
    	3.14 The equations X₁A + X₂B′ = G₁, X₁B = G₂	67
    	Miscellaneous exercises	69
    	Bibliographical notes	70
    Part Two — Differentials: the theory	71
    4 Mathematical preliminaries	73
    	4.1 Introduction	73
    	4.2 Interior points and accumulation points	73
    	4.3 Open and closed sets	75
    	4.4 The Bolzano-Weierstrass theorem	77
    	4.5 Functions	78
    	4.6 The limit of a function	79
    	4.7 Continuous functions and compactness	80
    	4.8 Convex sets	81
    	4.9 Convex and concave functions	83
    	Bibliographical notes	86
    5 Differentials and differentiability	87
    	5.1 Introduction	87
    	5.2 Continuity	88
    	5.3 Differentiability and linear approximation	90
    	5.4 The differential of a vector function	91
    	5.5 Uniqueness of the differential	93
    	5.6 Continuity of differentiable functions	94
    	5.7 Partial derivatives	95
    	5.8 The first identification theorem	96
    	5.9 Existence of the differential, I	97
    	5.10 Existence of the differential, II	99
    	5.11 Continuous differentiability	100
    	5.12 The chain rule	100
    	5.13 Cauchy invariance	102
    	5.14 The mean-value theorem for real-valued functions	103
    	5.15 Differentiable matrix functions	104
    	5.16 Some remarks on notation	106
    	5.17 Complex differentiation	108
    	Miscellaneous exercises	110
    	Bibliographical notes	110
    6 The second differential	111
    	6.1 Introduction	111
    	6.2 Second-order partial derivatives	111
    	6.3 The Hessian matrix	112
    	6.4 Twice differentiability and second-order approximation, I	113
    	6.5 Definition of twice differentiability	114
    	6.6 The second differential	115
    	6.7 Symmetry of the Hessian matrix	117
    	6.8 The second identification theorem	119
    	6.9 Twice differentiability and second-order approximation, II	119
    	6.10 Chain rule for Hessian matrices	121
    	6.11 The analog for second differentials	123
    	6.12 Taylor’s theorem for real-valued functions	124
    	6.13 Higher-order differentials	125
    	6.14 Real analytic functions	125
    	6.15 Twice differentiable matrix functions	126
    	Bibliographical notes	127
    7 Static optimization	129
    	7.1 Introduction	129
    	7.2 Unconstrained optimization	130
    	7.3 The existence of absolute extrema	131
    	7.4 Necessary conditions for a local minimum	132
    	7.5 Sufficient conditions for a local minimum: first-derivative test	134
    	7.6 Sufficient conditions for a local minimum: second-derivative test	136
    	7.7 Characterization of differentiable convex functions	138
    	7.8 Characterization of twice differentiable convex functions	141
    	7.9 Sufficient conditions for an absolute minimum	142
    	7.10 Monotonic transformations	143
    	7.11 Optimization subject to constraints	144
    	7.12 Necessary conditions for a local minimum under constraints	145
    	7.13 Sufficient conditions for a local minimum under constraints	149
    	7.14 Sufficient conditions for an absolute minimum under constraints	154
    	7.15 A note on constraints in matrix form	155
    	7.16 Economic interpretation of Lagrange multipliers	155
    	Appendix: the implicit function theorem	157
    	Bibliographical notes	159
    Part Three — Differentials: the practice	161
    8 Some important differentials	163
    	8.1 Introduction	163
    	8.2 Fundamental rules of differential calculus	163
    	8.3 The differential of a determinant	165
    	8.4 The differential of an inverse	168
    	8.5 Differential of the Moore-Penrose inverse	169
    	8.6 The differential of the adjoint matrix	172
    	8.7 On differentiating eigenvalues and eigenvectors	174
    	8.8 The continuity of eigenprojections	176
    	8.9 The differential of eigenvalues and eigenvectors: symmetric case	180
    	8.10 Two alternative expressions for dλ	183
    	8.11 Second differential of the eigenvalue function	185
    	Miscellaneous exercises	186
    	Bibliographical notes	189
    9 First-order differentials and Jacobian matrices	191
    	9.1 Introduction	191
    	9.2 Classification	192
    	9.3 Derisatives	192
    	9.4 Derivatives	194
    	9.5 Identification of Jacobian matrices	196
    	9.6 The first identification table	197
    	9.7 Partitioning of the derivative	197
    	9.8 Scalar functions of a scalar	198
    	9.9 Scalar functions of a vector	198
    	9.10 Scalar functions of a matrix, I: trace	199
    	9.11 Scalar functions of a matrix, II: determinant	201
    	9.12 Scalar functions of a matrix, III: eigenvalue	202
    	9.13 Two examples of vector functions	203
    	9.14 Matrix functions	204
    	9.15 Kronecker products	206
    	9.16 Some other problems	208
    	9.17 Jacobians of transformations	209
    	Bibliographical notes	210
    10 Second-order differentials and Hessian matrices	211
    	10.1 Introduction	211
    	10.2 The second identification table	211
    	10.3 Linear and quadratic forms	212
    	10.4 A useful theorem	213
    	10.5 The determinant function	214
    	10.6 The eigenvalue function	215
    	10.7 Other examples	215
    	10.8 Composite functions	217
    	10.9 The eigenvector function	218
    	10.10 Hessian of matrix functions, I	219
    	10.11 Hessian of matrix functions, II	219
    	Miscellaneous exercises	220
    Part Four — Inequalities	223
    11 Inequalities	225
    	11.1 Introduction	225
    	11.2 The Cauchy-Schwarz inequality	226
    	11.3 Matrix analogs of the Cauchy-Schwarz inequality	227
    	11.4 The theorem of the arithmetic and geometric means	228
    	11.5 The Rayleigh quotient	230
    	11.6 Concavity of λ₁ and convexity of λₙ	232
    	11.7 Variational description of eigenvalues	232
    	11.8 Fischer’s min-max theorem	234
    	11.9 Monotonicity of the eigenvalues	236
    	11.10 The Poincaré separation theorem	236
    	11.11 Two corollaries of Poincaré’s theorem	237
    	11.12 Further consequences of the Poincaré theorem	238
    	11.13 Multiplicative version	239
    	11.14 The maximum of a bilinear form	241
    	11.15 Hadamard’s inequality	242
    	11.16 An interlude: Karamata’s inequality	242
    	11.17 Karamata’s inequality and eigenvalues	244
    	11.18 An inequality concerning positive semidefinite matrices	245
    	11.19 A representation theorem for (Σ aᵢᵖ)^(1/p)	246
    	11.20 A representation theorem for (tr Aᵖ)^(1/p)	247
    	11.21 Hölder’s inequality	248
    	11.22 Concavity of log|A|	250
    	11.23 Minkowski’s inequality	251
    	11.24 Quasilinear representation of |A|^(1/n)	253
    	11.25 Minkowski’s determinant theorem	255
    	11.26 Weighted means of order p	256
    	11.27 Schlömilch’s inequality	258
    	11.28 Curvature properties of Mₚ(x, a)	259
    	11.29 Least squares	260
    	11.30 Generalized least squares	261
    	11.31 Restricted least squares	262
    	11.32 Restricted least squares: matrix version	264
    	Miscellaneous exercises	265
    	Bibliographical notes	269
    Part Five — The linear model	271
    12 Statistical preliminaries	273
    	12.1 Introduction	273
    	12.2 The cumulative distribution function	273
    	12.3 The joint density function	274
    	12.4 Expectations	274
    	12.5 Variance and covariance	275
    	12.6 Independence of two random variables	277
    	12.7 Independence of n random variables	279
    	12.8 Sampling	279
    	12.9 The one-dimensional normal distribution	279
    	12.10 The multivariate normal distribution	280
    	12.11 Estimation	282
    	Miscellaneous exercises	282
    	Bibliographical notes	283
    13 The linear regression model	285
    	13.1 Introduction	285
    	13.2 Affine minimum-trace unbiased estimation	286
    	13.3 The Gauss-Markov theorem	287
    	13.4 The method of least squares	290
    	13.5 Aitken’s theorem	291
    	13.6 Multicollinearity	293
    	13.7 Estimable functions	295
    	13.8 Linear constraints: the case M(R′) ⊂ M(X′)	296
    	13.9 Linear constraints: the general case	300
    	13.10 Linear constraints: the case M(R′) ∩ M(X′) = {0}	302
    	13.11 A singular variance matrix: the case M(X) ⊂ M(V)	304
    	13.12 A singular variance matrix: the case r(X′V⁺X) = r(X)	305
    	13.13 A singular variance matrix: the general case, I	307
    	13.14 Explicit and implicit linear constraints	307
    	13.15 The general linear model, I	310
    	13.16 A singular variance matrix: the general case, II	311
    	13.17 The general linear model, II	314
    	13.18 Generalized least squares	315
    	13.19 Restricted least squares	316
    	Miscellaneous exercises	318
    	Bibliographical notes	319
    14 Further topics in the linear model	321
    	14.1 Introduction	321
    	14.2 Best quadratic unbiased estimation of σ²	322
    	14.3 The best quadratic and positive unbiased estimator of σ²	322
    	14.4 The best quadratic unbiased estimator of σ²	324
    	14.5 Best quadratic invariant estimation of σ²	326
    	14.6 The best quadratic and positive invariant estimator of σ²	327
    	14.7 The best quadratic invariant estimator of σ²	329
    	14.8 Best quadratic unbiased estimation: multivariate normal case	330
    	14.9 Bounds for the bias of the least-squares estimator of σ², I	332
    	14.10 Bounds for the bias of the least-squares estimator of σ², II	333
    	14.11 The prediction of disturbances	335
    	14.12 Best linear unbiased predictors with scalar variance matrix	336
    	14.13 Best linear unbiased predictors with fixed variance matrix, I	338
    	14.14 Best linear unbiased predictors with fixed variance matrix, II	340
    	14.15 Local sensitivity of the posterior mean	341
    	14.16 Local sensitivity of the posterior precision	342
    	Bibliographical notes	344
    Part Six — Applications to maximum likelihood estimation	345
    15 Maximum likelihood estimation	347
    	15.1 Introduction	347
    	15.2 The method of maximum likelihood (ML)	347
    	15.3 ML estimation of the multivariate normal distribution	348
    	15.4 Symmetry: implicit versus explicit treatment	350
    	15.5 The treatment of positive definiteness	351
    	15.6 The information matrix	352
    	15.7 ML estimation of the multivariate normal distribution: distinct means	354
    	15.8 The multivariate linear regression model	354
    	15.9 The errors-in-variables model	357
    	15.10 The nonlinear regression model with normal errors	359
    	15.11 Special case: functional independence of mean and variance parameters	361
    	15.12 Generalization of Theorem 15.6	362
    	Miscellaneous exercises	364
    	Bibliographical notes	365
    16 Simultaneous equations	367
    	16.1 Introduction	367
    	16.2 The simultaneous equations model	367
    	16.3 The identification problem	369
    	16.4 Identification with linear constraints on B and Γ only	371
    	16.5 Identification with linear constraints on B, Γ, and Σ	371
    	16.6 Nonlinear constraints	373
    	16.7 FIML: the information matrix (general case)	374
    	16.8 FIML: asymptotic variance matrix (special case)	376
    	16.9 LIML: first-order conditions	378
    	16.10 LIML: information matrix	381
    	16.11 LIML: asymptotic variance matrix	383
    	Bibliographical notes	388
    17 Topics in psychometrics	389
    	17.1 Introduction	389
    	17.2 Population principal components	390
    	17.3 Optimality of principal components	391
    	17.4 A related result	392
    	17.5 Sample principal components	393
    	17.6 Optimality of sample principal components	395
    	17.7 One-mode component analysis	395
    	17.8 One-mode component analysis and sample principal components	398
    	17.9 Two-mode component analysis	399
    	17.10 Multimode component analysis	400
    	17.11 Factor analysis	404
    	17.12 A zigzag routine	407
    	17.13 A Newton-Raphson routine	408
    	17.14 Kaiser’s varimax method	412
    	17.15 Canonical correlations and variates in the population	414
    	17.16 Correspondence analysis	417
    	17.17 Linear discriminant analysis	418
    	Bibliographical notes	419
    Part Seven — Summary	421
    18 Matrix calculus: the essentials	423
    	18.1 Introduction	423
    	18.2 Differentials	424
    	18.3 Vector calculus	426
    	18.4 Optimization	429
    	18.5 Least squares	431
    	18.6 Matrix calculus	432
    	18.7 Interlude on linear and quadratic forms	434
    	18.8 The second differential	434
    	18.9 Chain rule for second differentials	436
    	18.10 Four examples	438
    	18.11 The Kronecker product and vec operator	439
    	18.12 Identification	441
    	18.13 The commutation matrix	442
    	18.14 From second differential to Hessian	443
    	18.15 Symmetry and the duplication matrix	444
    	18.16 Maximum likelihood	445
    	Further reading	448
    Bibliography	449
    Index of symbols	467
    Subject index	471
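
    The book's workhorse is the first identification theorem (Section 5.8): obtain a differential such as d|X| = |X| tr(X⁻¹ dX) (Section 8.3) and read off the derivative ∂|X|/∂X = |X|(X⁻¹)′ (Section 9.11). Below is a minimal numerical sketch of that identity, assuming NumPy; the matrix size, seed, and step sizes are illustrative choices, not from the book.

```python
# Minimal sketch (NumPy assumed): numerically check d|X| = |X| tr(X^{-1} dX)
# and the derivative it identifies, d|X|/dX = |X| (X^{-1})'.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))          # a generic (almost surely invertible) matrix
dX = 1e-6 * rng.standard_normal((4, 4))  # a small perturbation

# First-order change predicted by the differential vs. the actual change
predicted = np.linalg.det(X) * np.trace(np.linalg.inv(X) @ dX)
actual = np.linalg.det(X + dX) - np.linalg.det(X)
print(predicted, actual)                 # agree to first order

# Gradient identified from the differential, checked entrywise by finite differences
grad = np.linalg.det(X) * np.linalg.inv(X).T
E = np.zeros_like(X)
E[0, 1] = 1.0                            # perturb a single entry, here (0, 1)
h = 1e-7
fd = (np.linalg.det(X + h * E) - np.linalg.det(X)) / h
print(grad[0, 1], fd)                    # approximately equal
```

    The same pattern (compute the differential, then identify the Jacobian or Hessian) runs through Parts Three to Six of the book.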
    
• Source: https://www.cnblogs.com/ourweiguan/p/10986690.html