  • Python Machine Learning - Sebastian Raschka (US) - Professional Science & Technology - 文轩网 (Wenxuan.com)
  • Genuine Xinhua Bookstore edition
    • Author: Sebastian Raschka (US)
    • Publisher: Southeast University Press
    • Publication date: 2017-04-01

    Product Parameters
    • Author: Sebastian Raschka (US)
    • Publisher: Southeast University Press
    • Publication date: 2017-04-01
    • Edition: 1
    • Impression: 1
    • Print date: 2017-04-01
    • Word count: 553,000
    • Pages: 425
    • Format: 16开 (16mo)
    • Binding: Paperback
    • ISBN: 9787564170776
    • Country/Region: China
    • Rights provided by: Southeast University Press

    Python Machine Learning

    Author: Sebastian Raschka (US)
    List price: 87.00 (RMB)
    Publisher: Southeast University Press
    Publication date: April 1, 2017
    Pages: 425
    Binding: Paperback
    ISBN: 9787564170776
    Editor's Recommendation

    About the Book

    This book takes you into the world of predictive analytics and demonstrates why Python is one of the world's leading languages for data science. If you want to ask deeper questions of your data, or to improve and extend the capabilities of your machine learning systems, this practical book is invaluable. It covers a wealth of powerful Python libraries, including scikit-learn, Theano, and Keras, along with how-to guidance and tips on everything from sentiment analysis to neural networks, so you will soon be able to answer the most important questions facing you and your organization.
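
    For a taste of the scikit-learn workflow the book walks through, here is a minimal, illustrative sketch (not excerpted from the book; the Iris dataset and Perceptron classifier are assumptions chosen for brevity):

        # Illustrative only: train and evaluate a simple scikit-learn classifier.
        from sklearn.datasets import load_iris
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler
        from sklearn.linear_model import Perceptron
        from sklearn.metrics import accuracy_score

        # Load a small sample dataset and hold out 30% of it for testing
        X, y = load_iris(return_X_y=True)
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, random_state=1, stratify=y)

        # Standardize the features, then fit a simple linear classifier
        scaler = StandardScaler().fit(X_train)
        clf = Perceptron(eta0=0.1, random_state=1)
        clf.fit(scaler.transform(X_train), y_train)

        # Evaluate on the held-out test set
        print("Test accuracy: %.2f" % accuracy_score(
            y_test, clf.predict(scaler.transform(X_test))))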

    About the Author

    Sebastian Raschka is a PhD student at Michigan State University, where he develops new computational methods in the field of computational biology. He has been ranked as the number one most influential data scientist on GitHub by Analytics Vidhya. He has years of experience in Python programming and has conducted several seminars on the practical applications of data science and machine learning.

    Highlights

    Table of Contents
    Preface
    Chapter 1: Giving Computers the Ability to Learn from Data
    Building intelligent machines to transform data into knowledge
    The three different types of machine learning
    Making predictions about the future with supervised learning
    Classification for predicting class labels
    Regression for predicting continuous outcomes
    Solving interactive problems with reinforcement learning
    Discovering hidden structures with unsupervised learning
    Finding subgroups with clustering
    Dimensionality reduction for data compression
    An introduction to the basic terminology and notations
    A roadmap for building machine learning systems
    Preprocessing-getting data into shape
    Training and selecting a predictive model
    Evaluating models and predicting unseen data instances
    Using Python for machine learning
    Installing Python packages
    Summary
    Chapter 2: Training Machine Learning Algorithms for Classification
    Artificial neurons-a brief glimpse into the early history of machine learning
    Implementing a perceptron learning algorithm in Python
    Training a perceptron model on the Iris dataset
    Adaptive linear neurons and the convergence of learning
    Minimizing cost functions with gradient descent
    Implementing an Adaptive Linear Neuron in Python
    Large scale machine learning and stochastic gradient descent
    Summary
    Chapter 3: A Tour of Machine Learning Classifiers Using Scikit-learn
    Choosing a classification algorithm
    First steps with scikit-learn
    Training a perceptron via scikit-learn
    Modeling class probabilities via logistic regression
    Logistic regression intuition and conditional probabilities
    Learning the weights of the logistic cost function
    Training a logistic regression model with scikit-learn
    Tackling overfitting via regularization
    Maximum margin classification with support vector machines
    Maximum margin intuition
    Dealing with the nonlinearly separable case using slack variables
    Alternative implementations in scikit-learn
    Solving nonlinear problems using a kernel SVM
    Using the kernel trick to find separating hyperplanes in higher dimensional space
    Decision tree learning
    Maximizing information gain-getting the most bang for the buck
    Building a decision tree
    Combining weak to strong learners via random forests
    K-nearest neighbors-a lazy learning algorithm
    Summary
    Chapter 4: Building Good Training Sets-Data Preprocessing
    Dealing with missing data
    Eliminating samples or features with missing values
    Imputing missing values
    Understanding the scikit-learn estimator API
    Handling categorical data
    Mapping ordinal features
    Encoding class labels
    Performing one-hot encoding on nominal features
    Partitioning a dataset in training and test sets
    Bringing features onto the same scale
    Selecting meaningful features
    Sparse solutions with L1 regularization
    Sequential feature selection algorithms
    Assessing feature importance with random forests
    Summary
    Chapter 5: Compressing Data via Dimensionality Reduction
    Unsupervised dimensionality reduction via principal component analysis
    Total and explained variance
    Feature transformation
    Principal component analysis in scikit-learn
    Supervised data compression via linear discriminant analysis
    Computing the scatter matrices
    Selecting linear discriminants for the new feature subspace
    Projecting samples onto the new feature space
    LDA via scikit-learn
    Using kernel principal component analysis for nonlinear mappings
    Kernel functions and the kernel trick
    Implementing a kernel principal component analysis in Python
    Example 1-separating half-moon shapes
    Example 2-separating concentric circles
    Projecting new data points
    Kernel principal component analysis in scikit-learn
    Summary
    Chapter 6: Learning Best Practices for Model Evaluation and Hyperparameter Tuning
    Streamlining workflows with pipelines
    Loading the Breast Cancer Wisconsin dataset
    Combining transformers and estimators in a pipeline
    Using k-fold cross-validation to assess model performance
    The holdout method
    K-fold cross-validation
    Debugging algorithms with learning and validation curves
    Diagnosing bias and variance problems with learning curves
    Addressing overfitting and underfitting with validation curves
    Fine-tuning machine learning models via grid search
    Tuning hyperparameters via grid search
    Algorithm selection with nested cross-validation
    Looking at different performance evaluation metrics
    Reading a confusion matrix
    Optimizing the precision and recall of a classification model
    Plotting a receiver operating characteristic
    The scoring metrics for multiclass classification
    Summary
    Chapter 7: Combining Different Models for Ensemble Learning
    Learning with ensembles
    Implementing a simple majority vote classifier
    Combining different algorithms for classification with majority vote
    Evaluating and tuning the ensemble classifier
    Bagging-building an ensemble of classifiers from bootstrap samples
    Leveraging weak learners via adaptive boosting
    Summary
    Chapter 8: Applying Machine Learning to Sentiment Analysis
    Obtaining the IMDb movie review dataset
    Introducing the bag-of-words model
    Transforming words into feature vectors
    Assessing word relevancy via term frequency-inverse document frequency
    Cleaning text data
    Processing documents into tokens
    Training a logistic regression model for document classification
    Working with bigger data-online algorithms and out-of-core learning
    Summary
    Chapter 9: Embedding a Machine Learning Model into a Web Application
    Serializing fitted scikit-learn estimators
    Setting up a SQLite database for data storage
    Developing a web application with Flask
    Our first Flask web application
    Form validation and rendering
    Turning the movie classifier into a web application
    Deploying the web application to a public server
    Updating the movie review classifier
    Summary
    Chapter 10: Predicting Continuous Target Variables with Regression Analysis
    Introducing a simple linear regression model
    Exploring the Housing Dataset
    Visualizing the important characteristics of a dataset
    Implementing an ordinary least squares linear regression model
    Solving regression for regression parameters with gradient descent
    Estimating the coefficient of a regression model via scikit-learn
    Fitting a robust regression model using RANSAC
    Evaluating the performance of linear regression models
    Using regularized methods for regression
    Turning a linear regression model into a curve-polynomial regression
    Modeling nonlinear relationships in the Housing Dataset
    Dealing with nonlinear relationships using random forests
    Decision tree regression
    Random forest regression
    Summary
    Chapter 11: Working with Unlabeled Data-Clustering Analysis
    Grouping objects by similarity using k-means
    K-means++
    Hard versus soft clustering
    Using the elbow method to find the optimal number of clusters
    Quantifying the quality of clustering via silhouette plots
    Organizing clusters as a hierarchical tree
    Performing hierarchical clustering on a distance matrix
    Attaching dendrograms to a heat map
    Applying agglomerative clustering via scikit-learn
    Locating regions of high density via DBSCAN
    Summary
    Chapter 12: Training Artificial Neural Networks for Image Recognition
    Modeling complex functions with artificial neural networks
    Single-layer neural network recap
    Introducing the multi-layer neural network architecture
    Activating a neural network via forward propagation
    Classifying handwritten digits
    Obtaining the MNIST dataset
    Implementing a multi-layer perceptron
    Training an artificial neural network
    Computing the logistic cost function
    Training neural networks via backpropagation
    Developing your intuition for backpropagation
    Debugging neural networks with gradient checking
    Convergence in neural networks
    Other neural network architectures
    Convolutional Neural Networks
    Recurrent Neural Networks
    A few last words about neural network implementation
    Summary
    Chapter 13: Parallelizing Neural Network Training with Theano
    Building, compiling, and running expressions with Theano
    What is Theano?
    First steps with Theano
    Configuring Theano
    Working with array structures
    Wrapping things up-a linear regression example
    Choosing activation functions for feedforward neural networks
    Logistic function recap
    Estimating probabilities in multi-class classification via the softmax function
    Broadening the output spectrum by using a hyperbolic tangent
    Training neural networks efficiently using Keras
    Summary
    Index
