Basic Large Sample Theory
convergence
a.s.
in probability
in r-th mean
in distribution
uniformly integrable
convergence implications
a.s. => p
p => r (needs uniform integrability)
Vitali's thm
p => d
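The implication chain above can be summarized in one display; a sketch of the standard relations, with Vitali's theorem supplying the equivalence between r-th mean convergence and convergence in probability plus uniform integrability:

```latex
X_n \xrightarrow{\text{a.s.}} X \;\Rightarrow\; X_n \xrightarrow{p} X \;\Rightarrow\; X_n \xrightarrow{d} X,
\qquad
X_n \xrightarrow{r} X \;\Rightarrow\; X_n \xrightarrow{p} X,
```

and (Vitali) X_n ->r X iff X_n ->p X and {|X_n|^r} is uniformly integrable. None of the arrows reverses in general.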
inequality
c_r inequality
Hölder's inequality
Cauchy-Schwarz inequality
Liapunov inequality
Minkowski's inequality
Basic inequality
even, increasing on [0, inf), positive
Markov's inequality
Chebyshev's inequality
Jensen's inequality
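For quick reference, the inequalities listed above in their standard forms (Cauchy-Schwarz is Hölder with p = q = 2; the "basic inequality" with g even, increasing on [0, inf), and positive specializes to Markov and Chebyshev):

```latex
\begin{aligned}
&c_r:\quad E|X+Y|^r \le c_r\,(E|X|^r + E|Y|^r),\qquad c_r = \max(1,\,2^{r-1}),\\
&\text{H\"older}:\quad E|XY| \le (E|X|^p)^{1/p}(E|Y|^q)^{1/q},\qquad \tfrac1p+\tfrac1q=1,\\
&\text{Liapunov}:\quad (E|X|^r)^{1/r}\ \text{is nondecreasing in } r,\\
&\text{Minkowski}:\quad (E|X+Y|^r)^{1/r} \le (E|X|^r)^{1/r} + (E|Y|^r)^{1/r},\qquad r\ge 1,\\
&\text{Basic}:\quad P(|X|\ge a) \le \frac{E\,g(|X|)}{g(a)},\qquad
\text{Chebyshev}:\quad P(|X-\mu|\ge a) \le \frac{\sigma^2}{a^2},\\
&\text{Jensen}:\quad g \text{ convex} \;\Rightarrow\; g(EX)\le E\,g(X).
\end{aligned}
```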
CLT
WLLN
SLLN
classical CLT
Liapunov CLT
Berry-Esseen Theorem
gives the rate of convergence in the CLT
Lindeberg-Feller CLT
Cramer-Wold Device
Mann-Wald (Continuous Mapping) Theorem
Slutsky's Theorem
The Delta Method
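A small Monte Carlo sketch of the delta method, with illustrative choices not taken from the notes: Exp(1) samples (mean 1, variance 1) and g(x) = x^2, so the predicted asymptotic variance of sqrt(n)(g(Xbar) - g(mu)) is g'(mu)^2 * sigma^2 = 4.

```python
import math
import random
import statistics

random.seed(0)

n, reps = 500, 2000
mu, sigma2 = 1.0, 1.0            # Exp(1) has mean 1 and variance 1

def g(x):                        # a smooth function of the mean
    return x * x

gprime = 2.0 * mu                # g'(mu)

vals = []
for _ in range(reps):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    vals.append(math.sqrt(n) * (g(xbar) - g(mu)))

mc_var = statistics.pvariance(vals)   # Monte Carlo variance of sqrt(n)(g(Xbar) - g(mu))
delta_var = gprime ** 2 * sigma2      # delta-method prediction: 4
```

The Monte Carlo variance should land near the delta-method prediction of 4.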
bounded in probability
Big O_p and little o_p
Polya-Cantelli lemma
Skorokhod's Theorem
def. of F^-1
properties of F^-1
continuous F
general F
inverse transformation
Skorokhod's theorem
Helly Bray theorem
g is bounded and continuous
lemmas
Fatou's lemma
MCT
DCT
Empirical Measures and Empirical Processes
uniform case
general case
Glivenko-Cantelli theorem
Gn and I are close
Theorem 5.2
for a special construction, Un and U are close
Donsker's theorem
g is ||.||_inf-continuous
g(Un) ->d g(U)
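Glivenko-Cantelli can be illustrated numerically; a sketch in the uniform case, where ||Gn - I||_inf is computed exactly at the jump points of the empirical df (the sample sizes are arbitrary choices for illustration):

```python
import random

random.seed(1)

def sup_dist(n):
    """||Gn - I||_inf for n Uniform(0,1) draws; the sup is attained
    at the jump points of the empirical df, so check both one-sided gaps."""
    xs = sorted(random.random() for _ in range(n))
    return max(max(abs((i + 1) / n - x), abs(i / n - x))
               for i, x in enumerate(xs))

d_small, d_big = sup_dist(100), sup_dist(10000)
```

The sup distance shrinks at the O_p(1/sqrt(n)) rate, consistent with the Donsker scaling above.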
Quantiles and Quantile Processes
quantiles
uniform case
general case
Gn^-1 and I are close
relationship between Vn and Un, V and U
for a special construction, Vn and V are close
Theorem 7.2
Lower Bounds for Estimation
Cramer-Rao Bounds
conditions
M1: open set
M2
A: derivative of density
B: support
M3: information
M4: can be differentiated under the integral
M5: can be twice differentiated under the integral
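Under conditions M1-M5, the bound itself takes the standard form; for T unbiased for q(theta),

```latex
\operatorname{Var}_\theta\, T \;\ge\; \dot q(\theta)^{\mathsf T}\, I(\theta)^{-1}\, \dot q(\theta),
```

which reduces to I(theta)^{-1} when q(theta) = theta.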
efficient influence function
Geometry
adaptive estimator
efficient score for v
efficient influence for v
information for v
information bound for v
projections
score
influence
Regular Estimates and Superefficiency
def. of locally regular
Hodges' estimator
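Hodges' estimator (Xbar thresholded at n^(-1/4), with N(theta, 1) data) can be checked by simulation; a sketch, with n and the evaluation points chosen only for illustration:

```python
import math
import random

random.seed(2)

def hodges_risk(theta, n, reps=2000):
    """Monte Carlo estimate of n * E[(theta_hat - theta)^2] for
    theta_hat = Xbar * 1{|Xbar| > n^(-1/4)}, with N(theta, 1) data."""
    cutoff = n ** (-0.25)
    total = 0.0
    for _ in range(reps):
        xbar = theta + random.gauss(0, 1) / math.sqrt(n)
        that = xbar if abs(xbar) > cutoff else 0.0
        total += (that - theta) ** 2
    return n * total / reps

r0 = hodges_risk(0.0, 400)   # superefficiency at theta = 0: risk well below I^{-1} = 1
r1 = hodges_risk(1.0, 400)   # away from 0: risk close to that of Xbar
```

The normalized risk collapses at theta = 0 but matches Xbar elsewhere; the price, not visible at fixed theta, shows up under local alternatives, which is what motivates the regularity restriction below.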
Hajek-Le Cam convolution and LAM
regular + LAN (DQM) => convolution
lower bounds
regular estimator + bowl-shaped loss
general estimator + LAN(DQM) + bowl-shaped loss
asymptotically linear estimator is best regular
Efficient Likelihood Estimation and Tests
Maximum likelihood
K-L divergence
motivates MLE
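The motivation in one display: by the SLLN the average log-likelihood converges to its expectation, which differs from its maximum by the K-L divergence,

```latex
\frac1n\sum_{i=1}^n \log p_\theta(X_i) \xrightarrow{\text{a.s.}}
E_{\theta_0}\log p_\theta(X)
= E_{\theta_0}\log p_{\theta_0}(X) - K(P_{\theta_0}, P_\theta),
\qquad K(P_{\theta_0}, P_\theta) \ge 0,
```

so maximizing the likelihood asymptotically minimizes K(P_{theta_0}, P_theta), whose unique zero is theta_0 by identifiability (A0).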
definitions
score function
influence function
regularity conditions
A0: identifiability
A1: support
A2: density
A3
open neighborhood
(i)
(ii)
A4
(i)
(ii)
(iii)
Theorem 1.2
existence and consistency
asymptotically linear
likelihood ratio statistics
Wald statistics
score statistics
LAN
corollary
one-step estimator
Theorem 1.3
start with moment estimator or quantile
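A sketch of the one-step construction for a standard example, the Cauchy location family, where the median is a root-n consistent starting point, the score is l'(x; t) = 2(x - t)/(1 + (x - t)^2), and the Fisher information is 1/2 (the true location and sample size here are illustrative):

```python
import math
import random
import statistics

random.seed(3)

theta0 = 2.0                      # true location (illustrative value)
n = 4000
# standard Cauchy around theta0 via the inverse-CDF (tan) transform
xs = [theta0 + math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]

tilde = statistics.median(xs)     # root-n consistent starting estimator
# Cauchy location score and information
score = sum(2 * (x - tilde) / (1 + (x - tilde) ** 2) for x in xs)
info = 0.5
onestep = tilde + score / (n * info)   # one Newton step from tilde
```

By Theorem 1.3, the one-step estimator is asymptotically equivalent to the MLE (asymptotic variance 1/(n I) = 2/n), improving on the median's pi^2/(4n).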
Three tests
test statistics
H0 is fully given
H0 is partially given
distribution
H0 is fully given
under H0
under fixed alternative
1/n (.) ->0
under local alternative
non-central chi-square
H0 is partially given
under H0
under fixed alternative
strong consistency of MLE
Wald's theorem
compact
upper semi-continuous
integrable envelope
measurability
identifiability
EM algorithm
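A minimal EM sketch for a case not spelled out in the outline: a two-component Gaussian mixture with known unit variances, estimating the mixing weight and the two means. All numerical choices (50/50 mixture of N(0,1) and N(4,1), min/max initialisation) are illustrative.

```python
import math
import random

random.seed(4)
# illustrative data: 50/50 mixture of N(0,1) and N(4,1), variances known
data = [random.gauss(0, 1) if random.random() < 0.5 else random.gauss(4, 1)
        for _ in range(1000)]

def em(data, iters=50):
    p1, mu1, mu2 = 0.5, min(data), max(data)   # crude but workable start
    for _ in range(iters):
        # E-step: posterior weight of component 1 for each observation
        w = [p1 * math.exp(-0.5 * (x - mu1) ** 2) /
             (p1 * math.exp(-0.5 * (x - mu1) ** 2)
              + (1 - p1) * math.exp(-0.5 * (x - mu2) ** 2))
             for x in data]
        # M-step: weighted maximum likelihood updates
        s = sum(w)
        p1 = s / len(data)
        mu1 = sum(wi * x for wi, x in zip(w, data)) / s
        mu2 = sum((1 - wi) * x for wi, x in zip(w, data)) / (len(data) - s)
    return p1, mu1, mu2

p1, mu1, mu2 = em(data)
```

Each iteration increases the observed-data likelihood; the estimates settle near the true values (0.5, 0, 4).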
LAN
contiguity
definition
check contiguity
Le Cam's first lemma
example: e^{N(\mu, \sigma^2)} (lognormal)
Le Cam's third lemma
assumption
conclusion
abstract version
normal version
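The normal version in full: if, under P_n,

```latex
\begin{pmatrix} T_n \\[2pt] \log\dfrac{dQ_n}{dP_n} \end{pmatrix}
\xrightarrow{d}
N\!\left( \begin{pmatrix} \mu \\ -\tfrac{\sigma^2}{2} \end{pmatrix},
\begin{pmatrix} \tau^2 & c \\ c & \sigma^2 \end{pmatrix} \right),
\qquad\text{then under } Q_n:\quad
T_n \xrightarrow{d} N(\mu + c,\ \tau^2),
```

i.e. switching to the contiguous alternative shifts the mean by the asymptotic covariance c and leaves the variance unchanged.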
DQM
def.
DQM is a property of P_theta
checking
Lemma 3.1
continuously differentiable for every x
I_theta is well defined & continuous in theta
common examples
Poisson
Cramer regular
location family
uniform is not DQM
LAN
def.
LAN is a property of P_n,theta
Theorem 3.1
in i.i.d. case, DQM implies LAN
open set assumption
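The LAN expansion delivered by Theorem 3.1, in the i.i.d. case with local parameter h:

```latex
\log \prod_{i=1}^n \frac{p_{\theta + h/\sqrt n}}{p_\theta}(X_i)
= h^{\mathsf T} S_n \;-\; \tfrac12\, h^{\mathsf T} I(\theta)\, h \;+\; o_{P_\theta}(1),
\qquad
S_n = \frac{1}{\sqrt n}\sum_{i=1}^n \dot\ell_\theta(X_i) \xrightarrow{d} N(0,\, I(\theta)),
```

so the local log-likelihood ratio is asymptotically a quadratic in h with random linear term, which is what drives the results under shrinking alternatives below.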
limiting distribution under shrinking alternative
asymptotic normality of local likelihood
matching
the MLE usually matches the local (LAN) maximizer
asymptotic normality of MLE
Hajek-Le Cam Lower Bound
locally regular
LAM
Nonparametrics
KDE
MSE
Taylor expansion
MISE
AMISE
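A self-contained KDE sketch with a Gaussian kernel and the normal-reference (Silverman) bandwidth h = 1.06 * sd * n^(-1/5), the bandwidth that minimizes the AMISE under a normal reference density; the N(0,1) sample and evaluation point are illustrative.

```python
import math
import random

random.seed(5)
xs = [random.gauss(0, 1) for _ in range(500)]   # illustrative N(0,1) sample

def kde(x, data, h):
    """Gaussian-kernel density estimate at x with bandwidth h."""
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data) / (
        len(data) * h * math.sqrt(2 * math.pi))

n = len(xs)
mean = sum(xs) / n
sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
h = 1.06 * sd * n ** (-1 / 5)        # normal-reference (Silverman) bandwidth

est = kde(0.0, xs, h)                # true value: phi(0) = 1/sqrt(2*pi)
```

The estimate at 0 carries the O(h^2) smoothing bias and O(1/(n h)) variance that the MSE/AMISE expansions above trade off.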
Dependent Data
stationary sequence
def. of stationary
m-dependent
m-dependent CLT
ergodic
Birkhoff's ergodic theorem (LLN)
mixing
alpha mixing
mixing CLT
Martingale Difference Sequence
def. of Martingale and MDS
CLT for MDS
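A simulation sketch of the MDS CLT using a textbook example (my choice, not from the outline): X_t = Z_t * Z_{t-1} with Z_t iid N(0,1) satisfies E[X_t | past] = 0, so it is an MDS with Var(X_t) = 1 even though consecutive terms are dependent.

```python
import math
import random
import statistics

random.seed(6)

def mds_sum(n):
    """S_n / sqrt(n) for X_t = Z_t * Z_{t-1}, Z_t iid N(0,1):
    a martingale difference sequence that is not independent."""
    z_prev = random.gauss(0, 1)
    s = 0.0
    for _ in range(n):
        z = random.gauss(0, 1)
        s += z * z_prev
        z_prev = z
    return s / math.sqrt(n)

draws = [mds_sum(500) for _ in range(1500)]
m, v = statistics.fmean(draws), statistics.pvariance(draws)
```

The normalized sums have mean near 0 and variance near 1, as the MDS CLT predicts despite the serial dependence.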
Semiparametrics