Existing dimension reduction (DR) methods such as ordinary least squares and sliced inverse regression often perform poorly in the presence of outliers. DR theory also usually assumes that the predictors satisfy the condition of linearly related predictors: e.g., for 1D regression, E[x | beta'x] must be a linear function of beta'x. This dissertation develops outlier-resistant DR methods that can give useful results when the assumption of linearly related predictors is violated.
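The linearity condition mentioned in the abstract holds automatically when the predictors are elliptically symmetric, e.g. multivariate normal. A short simulation can illustrate this (a sketch only, not code from the dissertation; the choice of beta and the covariance factor below are arbitrary): for normal x, a regression of each coordinate x_j on beta'x should need no quadratic term.

```python
import numpy as np

# Sketch: check that E[x | beta'x] is linear in beta'x for
# multivariate normal predictors by fitting each coordinate of x
# on (1, t, t^2) with t = beta'x and confirming the quadratic
# coefficient is near zero. beta and A are arbitrary choices.
rng = np.random.default_rng(0)
n, p = 5000, 3
A = rng.normal(size=(p, p))            # random covariance factor
x = rng.normal(size=(n, p)) @ A.T      # x ~ N(0, A A')
beta = np.array([1.0, -2.0, 0.5])
t = x @ beta
t = (t - t.mean()) / t.std()           # standardize beta'x

quad_coefs = []
for j in range(p):
    X = np.column_stack([np.ones(n), t, t**2])
    coef, *_ = np.linalg.lstsq(X, x[:, j], rcond=None)
    quad_coefs.append(coef[2])
    print(f"x{j}: quadratic coefficient {coef[2]:+.4f}")
```

Each printed quadratic coefficient should be close to zero; with heavy-tailed or skewed predictors it generally is not, which is the situation the dissertation's resistant methods target.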
Title | : | Resistant Dimension Reduction |
Author | : | Jing Chang |
Publisher | : | ProQuest - 2006 |