Multivariate pattern recognition in chemometrics : illustrated by case studies /

Bibliographic Details
Imprint: Amsterdam ; New York : Elsevier, 1992.
Description: xi, 325 p. : ill. ; 25 cm.
Language: English
Series: Data handling in science and technology ; v. 9
Subject: Chemistry -- Statistical methods.
Multivariate analysis.
Chemistry -- Statistical methods -- Data processing.
Multivariate analysis -- Data processing.
Format: Print Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/1453006
Other authors / contributors: Brereton, Richard G.
ISBN: 0444897836 (hardback : acid-free paper)
0444897844 (paperback)
0444897852 (software suppl.)
0444897860 (paperback and software suppl.)
Notes: Includes bibliographical references and index.
Table of Contents:
  • Introduction / R.G. Brereton
  • Ch. 1. Introduction to Multivariate Space / P.J. Lewi
  • 1. Introduction. 2. Matrices. 3. Multivariate space. 4. Dimension and rank. 5. Matrix product. 6. Vectors as one-dimensional matrices. 7. Unit matrix as a frame of multivariate space. 8. Product of a matrix with a vector. Projection of points upon a single axis. 9. Multiple linear regression (MLR) as a projection of points upon an axis. 10. Linear discriminant analysis (LDA) as a projection of points on an axis. 11. Product of a matrix with a two-column matrix. Projection of points upon a plane. 12. Product of two matrices as a rotation of points in multivariate space. 13. Factor rotation. 14. Factor data analysis. 15. References
  • Ch. 2. Multivariate Data Display / P.J. Lewi
  • 1. Introduction. 2. Basic methods of factor data analysis. 3. Choice of a particular display method. 4. SPECTRAMAP program. 5. The neuroleptics case. 6. Principal components analysis (PCA) with standardization. 7. Principal components analysis (PCA) with logarithms. 8. Correspondence factor analysis (CFA). 9. Spectral map analysis (SMA). 10. References
  • Ch. 3. Vectors and Matrices : Basic Matrix Algebra / N. Bratchell
  • 1. Introduction. 2. The data matrix. 3. Vector representation. 4. Vector manipulation. 5. Matrices. 6. Statistical equivalents. 7. References
  • Ch. 4. The Mathematics of Pattern Recognition / N. Bratchell
  • 1. Introduction. 2. Rotation and projection. 3. Dimensionality. 4. Expressing the information in the data. 5. Decomposition of data. 6. Final comments. 7. References
  • Ch. 5. Data Reduction Using Principal Components Analysis / J.M. Deane
  • 1. Introduction. 2. Principal components analysis. 3. Data reduction by dimensionality reduction. 4. Data reduction by variable reduction. 5. Conclusions. 6. References
  • Ch. 6. Cluster Analysis / N. Bratchell
  • 1. Introduction. 2. Two problems. 3. Visual inspection. 4. Measurement of distance and similarity. 5. Hierarchical methods. 6. Optimization partitioning methods. 7. Conclusions. 8. References
  • Ch. 7. SIMCA - Classification by Means of Disjoint Cross Validated Principal Components Models / O.M. Kvalheim and T.V. Karstang
  • 1. Introduction. 2. Distance, variance and covariance. 3. The principal component model. 4. Unsupervised principal component modelling. 5. Supervised principal component modelling using cross-validation. 6. Cross validated principal component models. 7. The SIMCA model. 8. Classification of new samples to a class model. 9. Communality and modelling power. 10. Discriminatory ability of variables. 11. Separation between classes. 12. Detection of outliers. 13. Data reduction by means of relevance. 14. Conclusion. 15. Acknowledgements. 16. References
  • Ch. 8. Hard Modelling in Supervised Pattern Recognition / D. Coomans and D.L. Massart
  • 1. Introduction. 2. The data set. 3. Geometric representation. 4. Classification rule. 5. Deterministic pattern recognition. 6. Probabilistic pattern recognition. 7. Final remarks. 8. References
  • Software Appendices
  • Spectramap / P.J. Lewi. 1. Installation of the program. 2. Execution of the program. 3. Tutorial cases
  • Sirius / O.M. Kvalheim and T.V. Karstang
  • 1. Introduction. 2. Starting SIRIUS. 3. The data table. 4. Defining, selecting and storing a class. 5. Principal component modelling. 6. Variance decomposition plots and other graphic representations. 7. Summary. 8. Acknowledgements. 9. References.