As stated in the previous section, different ways of approximating the nonparametric part yield corresponding estimators of $\beta$.
This section presents several macros and explains how the
estimates are calculated.
Let $K(\cdot)$
be a kernel function satisfying certain conditions, and $h$
a bandwidth parameter. The weight function is defined as
$$\omega_{ni}(t) = K\Big(\frac{T_i - t}{h}\Big) \Big/ \sum_{j=1}^{n} K\Big(\frac{T_j - t}{h}\Big).$$
Then the least squares estimator of $\beta$
is defined as
$$\hat\beta = (\tilde X^\top \tilde X)^{-1} \tilde X^\top \tilde Y,$$
where $\tilde X = (I - W)X$ and $\tilde Y = (I - W)Y$
with $W = \big(\omega_{nj}(T_i)\big)_{i,j=1,\dots,n}$,
and the nonparametric estimator of $g(t)$ is
$$\hat g(t) = \sum_{i=1}^{n} \omega_{ni}(t)\,\big(Y_i - X_i^\top \hat\beta\big).$$
Detailed discussions of the asymptotic theory
of the estimators $\hat\beta$
and $\hat g$
can be found in Gao, Hong and Liang
(1995) and Speckman (1988).
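The kernel-based estimator above can be illustrated with a short Python sketch. This is not the XploRe quantlet itself; the Gaussian kernel and the function name `plm_kernel` are assumptions made for illustration.

```python
import numpy as np

def plm_kernel(x, t, y, h):
    """Kernel-based estimator for the partially linear model
    y = x @ beta + g(t) + eps (illustrative sketch)."""
    # Weight matrix: w[i, j] = K((T_j - t_i)/h) / sum_k K((T_k - t_i)/h)
    u = (t[:, None] - t[None, :]) / h
    k = np.exp(-0.5 * u ** 2)                 # Gaussian kernel (assumption)
    w = k / k.sum(axis=1, keepdims=True)
    # Partial out the smooth-in-t component from x and y
    x_tilde = x - w @ x                       # (I - W) X
    y_tilde = y - w @ y                       # (I - W) Y
    beta = np.linalg.lstsq(x_tilde, y_tilde, rcond=None)[0]
    # Nonparametric part: smooth the partial residuals
    ghat = w @ (y - x @ beta)
    return beta, ghat
```

Here the smoother matrix plays the role of the weights $\omega_{ni}(\cdot)$, and $\hat\beta$ is obtained by ordinary least squares on the partialled-out data $(I-W)X$ and $(I-W)Y$.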
plmk
computes the estimates of the parameter $\beta$
and of the nonparametric part $g(t)$, using the kernel method to
handle $g(t)$.
res.hbeta:
vector or scalar, the estimate of $\beta$
res.hsigma: scalar,
the estimate of the variance of the error term
when it is homoscedastic
The next picture, "plmk-ex.xpl", gives
an example of
XploRe
code that generates a sample from the
PLM
model and then shows how to compute the
PLM
estimates by
plmk.
After saving the example as a file (plmk-ex.xpl) and clicking
"Execute",
the actual parameter estimates are shown in the second picture (XploRe_out).
An example using
plmk

The estimation result
of plmk-ex.xpl

Nonparametric fit by
plmk

The green curve stands
for the true value, blue for the parametric fit and red for the nonparametric
fit.
Suppose that $g$ has $m-1$ absolutely
continuous derivatives and an $m$-th derivative that is square integrable
and satisfies
$$\int_0^1 \big\{g^{(m)}(t)\big\}^2\,dt \le C$$
for a specified $C>0$. Via a Taylor expansion, the partially linear
model can be rewritten as
$$Y_i = X_i^\top \beta + \sum_{j=0}^{m-1} \alpha_j T_i^{\,j} + \mathrm{Rem}(T_i) + \varepsilon_i,$$
where
$$\mathrm{Rem}(t) = \frac{1}{(m-1)!} \int_0^t (t-s)^{m-1} g^{(m)}(s)\,ds.$$
By using a quadrature rule, $\mathrm{Rem}(t)$ can be approximated by
a sum of the form
$$\sum_{k=1}^{q} \delta_k\,(t - t_k)_+^{m-1}$$
for some set of coefficients $\{\delta_k\}$
and points $\{t_k\}$.
Let $\eta = (\alpha_0, \dots, \alpha_{m-1}, \delta_1, \dots, \delta_q)^\top$
and set
$$Z_i = \big(1,\, T_i,\, \dots,\, T_i^{m-1},\, (T_i - t_1)_+^{m-1},\, \dots,\, (T_i - t_q)_+^{m-1}\big)^\top.$$
The least squares spline estimator minimizes
$$\sum_{i=1}^{n} \big(Y_i - X_i^\top \beta - Z_i^\top \eta\big)^2.$$
It is convenient to use matrix notation: denote
$Z = (Z_1, \dots, Z_n)^\top$,
$X = (X_1, \dots, X_n)^\top$
and $Y = (Y_1, \dots, Y_n)^\top$.
The estimators $\hat\beta$
and $\hat\eta$
are found as the solution of the minimization of
$$\|Y - X\beta - Z\eta\|^2.$$
If the problem has a unique solution, its form is the same as
in
plmp. Otherwise, we may use the ridge idea to modify the estimator.
plmls
computes the estimates of the parameter $\beta$
and of the nonparametric function by fitting the nonparametric part with a least squares spline.
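The least squares spline idea can be sketched in Python as follows. This is an illustrative sketch, not the plmls implementation; the truncated power basis, the knot placement, the degree, and the function name are assumptions.

```python
import numpy as np

def plm_spline(x, t, y, knots, m=4):
    """Least squares spline fit of the partially linear model
    (illustrative sketch): regress y jointly on x and a truncated
    power basis of degree m-1 in t."""
    t = np.asarray(t)
    knots = np.asarray(knots)
    # Basis: 1, t, ..., t^(m-1) plus (t - knot)_+^(m-1) for each knot
    powers = np.vstack([t ** j for j in range(m)]).T
    trunc = np.maximum(t[:, None] - knots[None, :], 0.0) ** (m - 1)
    z = np.hstack([powers, trunc])
    # Joint least squares over (beta, eta)
    design = np.hstack([x, z])
    coef = np.linalg.lstsq(design, y, rcond=None)[0]
    beta, eta = coef[:x.shape[1]], coef[x.shape[1]:]
    return beta, eta, z @ eta                 # z @ eta = fitted g(T_i)
```

A single joint least squares fit over $(\beta, \eta)$ suffices because the spline basis enters the model linearly.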
We assume that $g$ is Hölder
continuous of order $p\,(= m + r)$; that is, let $r$
and $M$ denote nonnegative real constants
and $m$ a nonnegative integer such that
$$\big|g^{(m)}(s) - g^{(m)}(t)\big| \le M\,|s - t|^r \quad \text{for } s, t \in [0, 1].$$
Now we describe the piecewise polynomial approximation of the function
$g$
defined on $[0, 1]$. Given a positive integer
$M_n$,
divide $[0, 1]$ into $M_n$ intervals of equal length
$1/M_n$.
The estimator has the form of a piecewise polynomial of degree $m$
based on
these intervals, where the
coefficients are chosen by the method of least squares on the basis of
the data.
Let $I_{n\nu}(t)$
be the indicator function of the
$\nu$-th
interval, and $t_\nu$
the midpoint of the
$\nu$-th
interval, so that $I_{n\nu}(t) = 1$
or $0$ according to whether $t$ lies in the $\nu$-th interval
or not, for $\nu = 1, \dots, M_n$. Let
$$g_\nu(t) = g(t_\nu) + g'(t_\nu)(t - t_\nu) + \dots + \frac{g^{(m)}(t_\nu)}{m!}(t - t_\nu)^m$$
be the $m$-order Taylor expansion of $g(t)$ at the point
$t_\nu$.
Denote the coefficients by $b_{\nu j} = g^{(j)}(t_\nu)/j!$.
Consider the piecewise polynomial approximation
of $g$ of degree $m$ given by
$$g^*(t) = \sum_{\nu=1}^{M_n} I_{n\nu}(t)\, g_\nu(t) = \sum_{\nu=1}^{M_n} \sum_{j=0}^{m} I_{n\nu}(t)\, b_{\nu j}\,(t - t_\nu)^j.$$
Suppose we have $n$ observed data points
$(X_i, T_i, Y_i)$, $i = 1, \dots, n$.
Denote $b = (b_{10}, \dots, b_{1m}, \dots, b_{M_n 0}, \dots, b_{M_n m})^\top$
and let $Z$ be the $n \times M_n(m+1)$ matrix with rows
$$Z_i = \big(I_{n1}(T_i),\, I_{n1}(T_i)(T_i - t_1),\, \dots,\, I_{nM_n}(T_i)(T_i - t_{M_n})^m\big).$$
Then $g^*(T_i) = Z_i b$.
Hence we need to find $b$
and $\beta$
to minimize
$$\|Y - X\beta - Zb\|^2.$$
Suppose the minimization problem has a unique solution. Then the estimators
of $b$ and $\beta$
are
$$\hat b = (Z^\top Z)^{-1} Z^\top (Y - X\hat\beta)$$
and
$$\hat\beta = \big(X^\top (I - P) X\big)^{-1} X^\top (I - P) Y,$$
where $P = Z (Z^\top Z)^{-1} Z^\top$
and $I$ is the $n \times n$ identity matrix.
The estimate of $g(t)$ can be described as $\hat g(t) = z^\top \hat b$
for a suitable vector $z$ (the basis vector evaluated at $t$).
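The piecewise polynomial estimator can be sketched in Python. This is an illustration only, not the plmp code; the bin count, the degree, and all names are assumptions, and the data are taken to lie in $[0, 1]$.

```python
import numpy as np

def plm_piecewise(x, t, y, n_bins, degree=2):
    """Piecewise polynomial fit of the partially linear model on [0, 1]
    (illustrative sketch): on each of n_bins equal-length intervals, g is
    approximated by a polynomial of the given degree centered at the
    interval midpoint."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.searchsorted(edges, t, side="right") - 1, 0, n_bins - 1)
    mids = (edges[:-1] + edges[1:]) / 2.0
    n = len(t)
    # Block design matrix: column (nu, j) holds I_{n nu}(T_i) (T_i - t_nu)^j
    z = np.zeros((n, n_bins * (degree + 1)))
    for j in range(degree + 1):
        z[np.arange(n), idx * (degree + 1) + j] = (t - mids[idx]) ** j
    design = np.hstack([x, z])
    coef = np.linalg.lstsq(design, y, rcond=None)[0]
    beta, b = coef[:x.shape[1]], coef[x.shape[1]:]
    return beta, b, z @ b                     # z @ b = fitted g(T_i)
```

As in the spline case, one joint least squares fit over $(\beta, b)$ solves the minimization; `np.linalg.lstsq` returns a minimum-norm solution even if some interval happens to be empty.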
plmp
computes the estimates $\hat\beta$
and
$\hat g$.
res.hbeta:
vector or scalar, the estimate of $\beta$
The next picture, "plmp-ex.xpl", gives
an example of
XploRe
code that generates a sample from the
PLM
model and then shows how to compute the
PLM
estimates by
plmp.
After saving the example as a file (plmp-ex.xpl) and clicking "Execute",
the actual parameter estimates are shown in the second picture (XploRe_out).
An example using
plmp

The parameter estimate values are listed as follows

The result of the nonparametric fit is shown in the following
picture

Suppose that the
$(p+1)$-th derivative of $g(t)$ at the point $t_0$
exists. We then approximate the unknown regression function $g(t)$
locally by a polynomial of order $p$. A Taylor expansion gives, for
$t$ in a neighborhood of
$t_0$,
$$g(t) \approx \sum_{j=0}^{p} \frac{g^{(j)}(t_0)}{j!} (t - t_0)^j \equiv \sum_{j=0}^{p} \alpha_j (t - t_0)^j. \qquad (5)$$
To estimate $\beta$
and $g(t)$, we first estimate the
$\alpha_j$'s
as functions of
$\beta$,
denoted $\alpha_j(\beta)$,
by minimizing
$$\sum_{i=1}^{n} \Big\{ Y_i - X_i^\top \beta - \sum_{j=0}^{p} \alpha_j (T_i - t_0)^j \Big\}^2 K_h(T_i - t_0), \qquad (6)$$
where $h$ is a bandwidth controlling the size of the local neighborhood,
and $K_h(\cdot) = K(\cdot/h)/h$
with $K$ a kernel function assigning weights to each datum. Then minimize
$$\sum_{i=1}^{n} \Big\{ Y_i - X_i^\top \beta - \sum_{j=0}^{p} \hat\alpha_j(\beta) (T_i - t_0)^j \Big\}^2 \qquad (7)$$
with respect to $\beta$. Denote the solution of (7) by
$\hat\beta$.
Finally, let $\hat\alpha_j = \hat\alpha_j(\hat\beta)$
be the estimates of
$\alpha_j$,
and denote
$\hat g(t_0) = \hat\alpha_0$.
It is clear from the Taylor expansion in (5) that
$j!\,\hat\alpha_j$
is an estimator of
$g^{(j)}(t_0)$,
$j = 0, \dots, p$.
To estimate the entire function
$g(\cdot)$, we solve the above weighted least squares problem for all points
$t_0$ in the domain of interest.
It is more convenient to work with matrix notation.
Denote by
$$T_0 = \begin{pmatrix} 1 & (T_1 - t_0) & \cdots & (T_1 - t_0)^p \\ \vdots & \vdots & & \vdots \\ 1 & (T_n - t_0) & \cdots & (T_n - t_0)^p \end{pmatrix}$$
the design matrix of $T$ in problem (6). Denote
$Y = (Y_1, \dots, Y_n)^\top$
and put $\alpha = (\alpha_0, \dots, \alpha_p)^\top$.
Further, let
$W_0$
be the $n \times n$
diagonal matrix of weights:
$W_0 = \mathrm{diag}\{K_h(T_i - t_0)\}$.
Then the weighted least squares problems (6) and (7)
can be rewritten as
$$\min_{\alpha}\; (\tilde Y - T_0 \alpha)^\top W_0\, (\tilde Y - T_0 \alpha)$$
and
$$\min_{\beta}\; \big(Y - X\beta - T_0 \hat\alpha(\beta)\big)^\top \big(Y - X\beta - T_0 \hat\alpha(\beta)\big),$$
with $\tilde Y = Y - X\beta$.
The solution vectors are provided by weighted least squares and are given
by
$$\hat\alpha(\beta) = (T_0^\top W_0 T_0)^{-1}\, T_0^\top W_0\, (Y - X\beta).$$
plmlorg is designed to implement the
above arguments in
XploRe.
lpregest is used to estimate the nonparametric
regression function. An interpolation idea is employed to calculate the estimators
of $\beta$ and $g(t)$.
Considering the same example as in the previous sections,
we here approximate the nonlinear part by a local polynomial approximation of degree 2. The results for the parametric and nonparametric parts are listed as follows.
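The profile procedure of (6) and (7) can be sketched in Python. This is not the plmlorg/lpregest code; the Gaussian kernel and all names are illustrative assumptions. Since $\hat\alpha(\beta)$ is linear in $Y - X\beta$, collecting the first rows of the local fits into a smoother matrix $S$ reduces (7) to least squares on $(I - S)X$ and $(I - S)Y$.

```python
import numpy as np

def local_poly_smoother(t, h, p=2):
    """Smoother matrix S of a degree-p local polynomial fit: row i holds
    the weights that produce the fitted value of g at t[i] (sketch)."""
    n = len(t)
    s = np.zeros((n, n))
    for i in range(n):
        d = t - t[i]                              # columns of T_0
        w = np.exp(-0.5 * (d / h) ** 2)           # Gaussian K_h (assumption)
        t0 = np.vstack([d ** j for j in range(p + 1)]).T
        wt0 = t0 * w[:, None]
        # (T_0' W_0 T_0)^{-1} T_0' W_0; its first row yields the fit at t[i]
        s[i] = (np.linalg.pinv(t0.T @ wt0) @ wt0.T)[0]
    return s

def plm_local_poly(x, t, y, h, p=2):
    """Profile estimator of beta: least squares on (I - S)X and (I - S)Y."""
    s = local_poly_smoother(t, h, p)
    beta = np.linalg.lstsq(x - s @ x, y - s @ y, rcond=None)[0]
    ghat = s @ (y - x @ beta)                     # smooth the residuals
    return beta, ghat
```

The fitted values of $g$ at the observation points are recovered by smoothing the partial residuals $Y - X\hat\beta$ with the same matrix $S$.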

