Learning from Data: Concepts, Theory, and Methods

International Edition


About the Book

An interdisciplinary framework for learning methodologies, covering statistics, neural networks, and fuzzy logic. This book provides a unified treatment of the principles and methods for learning dependencies from data. It establishes a general conceptual framework within which various learning methods from statistics, neural networks, and fuzzy logic can be applied, showing that a few fundamental principles underlie most of the new methods being proposed today in statistics, engineering, and computer science. Over one hundred illustrations, case studies, and examples make this an invaluable text.

Table of Contents:
PREFACE. NOTATION.
1 Introduction. 1.1 Learning and Statistical Estimation. 1.2 Statistical Dependency and Causality. 1.3 Characterization of Variables. 1.4 Characterization of Uncertainty. 1.5 Predictive Learning versus Other Data Analytical Methodologies.
2 Problem Statement, Classical Approaches, and Adaptive Learning. 2.1 Formulation of the Learning Problem. 2.1.1 Objective of Learning. 2.1.2 Common Learning Tasks. 2.1.3 Scope of the Learning Problem Formulation. 2.2 Classical Approaches. 2.2.1 Density Estimation. 2.2.2 Classification. 2.2.3 Regression. 2.2.4 Solving Problems with Finite Data. 2.2.5 Nonparametric Methods. 2.2.6 Stochastic Approximation. 2.3 Adaptive Learning: Concepts and Inductive Principles. 2.3.1 Philosophy, Major Concepts, and Issues. 2.3.2 A Priori Knowledge and Model Complexity. 2.3.3 Inductive Principles. 2.3.4 Alternative Learning Formulations. 2.4 Summary.
3 Regularization Framework. 3.1 Curse and Complexity of Dimensionality. 3.2 Function Approximation and Characterization of Complexity. 3.3 Penalization. 3.3.1 Parametric Penalties. 3.3.2 Nonparametric Penalties. 3.4 Model Selection (Complexity Control). 3.4.1 Analytical Model Selection Criteria. 3.4.2 Model Selection via Resampling. 3.4.3 Bias–Variance Tradeoff. 3.4.4 Example of Model Selection. 3.4.5 Function Approximation versus Predictive Learning. 3.5 Summary.
4 Statistical Learning Theory. 4.1 Conditions for Consistency and Convergence of ERM. 4.2 Growth Function and VC Dimension. 4.2.1 VC Dimension for Classification and Regression Problems. 4.2.2 Examples of Calculating VC Dimension. 4.3 Bounds on the Generalization. 4.3.1 Classification. 4.3.2 Regression. 4.3.3 Generalization Bounds and Sampling Theorem. 4.4 Structural Risk Minimization. 4.4.1 Dictionary Representation. 4.4.2 Feature Selection. 4.4.3 Penalization Formulation. 4.4.4 Input Preprocessing. 4.4.5 Initial Conditions for Training Algorithm. 4.5 Comparisons of Model Selection for Regression. 4.5.1 Model Selection for Linear Estimators. 4.5.2 Model Selection for k-Nearest-Neighbor Regression. 4.5.3 Model Selection for Linear Subset Regression. 4.5.4 Discussion. 4.6 Measuring the VC Dimension. 4.7 VC Dimension, Occam’s Razor, and Popper’s Falsifiability. 4.8 Summary and Discussion.
5 Nonlinear Optimization Strategies. 5.1 Stochastic Approximation Methods. 5.1.1 Linear Parameter Estimation. 5.1.2 Backpropagation Training of MLP Networks. 5.2 Iterative Methods. 5.2.1 EM Methods for Density Estimation. 5.2.2 Generalized Inverse Training of MLP Networks. 5.3 Greedy Optimization. 5.3.1 Neural Network Construction Algorithms. 5.3.2 Classification and Regression Trees. 5.4 Feature Selection, Optimization, and Statistical Learning Theory. 5.5 Summary.
6 Methods for Data Reduction and Dimensionality Reduction. 6.1 Vector Quantization and Clustering. 6.1.1 Optimal Source Coding in Vector Quantization. 6.1.2 Generalized Lloyd Algorithm. 6.1.3 Clustering. 6.1.4 EM Algorithm for VQ and Clustering. 6.1.5 Fuzzy Clustering. 6.2 Dimensionality Reduction: Statistical Methods. 6.2.1 Linear Principal Components. 6.2.2 Principal Curves and Surfaces. 6.2.3 Multidimensional Scaling. 6.3 Dimensionality Reduction: Neural Network Methods. 6.3.1 Discrete Principal Curves and Self-Organizing Map Algorithm. 6.3.2 Statistical Interpretation of the SOM Method. 6.3.3 Flow-Through Version of the SOM and Learning Rate Schedules. 6.3.4 SOM Applications and Modifications. 6.3.5 Self-Supervised MLP. 6.4 Methods for Multivariate Data Analysis. 6.4.1 Factor Analysis. 6.4.2 Independent Component Analysis. 6.5 Summary.
7 Methods for Regression. 7.1 Taxonomy: Dictionary versus Kernel Representation. 7.2 Linear Estimators. 7.2.1 Estimation of Linear Models and Equivalence of Representations. 7.2.2 Analytic Form of Cross-Validation. 7.2.3 Estimating Complexity of Penalized Linear Models. 7.2.4 Nonadaptive Methods. 7.3 Adaptive Dictionary Methods. 7.3.1 Additive Methods and Projection Pursuit Regression. 7.3.2 Multilayer Perceptrons and Backpropagation. 7.3.3 Multivariate Adaptive Regression Splines. 7.3.4 Orthogonal Basis Functions and Wavelet Signal Denoising. 7.4 Adaptive Kernel Methods and Local Risk Minimization. 7.4.1 Generalized Memory-Based Learning. 7.4.2 Constrained Topological Mapping. 7.5 Empirical Studies. 7.5.1 Predicting Net Asset Value (NAV) of Mutual Funds. 7.5.2 Comparison of Adaptive Methods for Regression. 7.6 Combining Predictive Models. 7.7 Summary.
8 Classification. 8.1 Statistical Learning Theory Formulation. 8.2 Classical Formulation. 8.2.1 Statistical Decision Theory. 8.2.2 Fisher’s Linear Discriminant Analysis. 8.3 Methods for Classification. 8.3.1 Regression-Based Methods. 8.3.2 Tree-Based Methods. 8.3.3 Nearest-Neighbor and Prototype Methods. 8.3.4 Empirical Comparisons. 8.4 Combining Methods and Boosting. 8.4.1 Boosting as an Additive Model. 8.4.2 Boosting for Regression Problems. 8.5 Summary.
9 Support Vector Machines. 9.1 Motivation for Margin-Based Loss. 9.2 Margin-Based Loss, Robustness, and Complexity Control. 9.3 Optimal Separating Hyperplane. 9.4 High-Dimensional Mapping and Inner Product Kernels. 9.5 Support Vector Machine for Classification. 9.6 Support Vector Implementations. 9.7 Support Vector Regression. 9.8 SVM Model Selection. 9.9 Support Vector Machines and Regularization. 9.10 Single-Class SVM and Novelty Detection. 9.11 Summary and Discussion.
10 Noninductive Inference and Alternative Learning Formulations. 10.1 Sparse High-Dimensional Data. 10.2 Transduction. 10.3 Inference Through Contradictions. 10.4 Multiple-Model Estimation. 10.5 Summary.
11 Concluding Remarks.
Appendix A: Review of Nonlinear Optimization.
Appendix B: Eigenvalues and Singular Value Decomposition.
References. Index.


Product Details
  • ISBN-13: 9780471681823
  • ISBN-10: 0471681822
  • Sub Title: Concepts, Theory, and Methods
  • Publisher: John Wiley & Sons Inc
  • Publisher Date: 11 Sep 2007
  • Binding: Hardback
  • Language: English
  • No of Pages: 560
  • Height: 243 mm
  • Width: 165 mm
  • Spine Width: 34 mm
  • Weight: 961 g
  • Returnable: N


Customer Reviews

No reviews yet.