Support Vector Machines and Perceptrons, 1st ed. 2016
Learning, Optimization, Classification, and Application to Social Networks

SpringerBriefs in Computer Science Series

Authors:

Language: English

Approximate price: 52.74 €

In Print (Delivery period: 15 days).

Publication date: 2016
Format: Print on demand

This work reviews the state of the art in SVM and perceptron classifiers. The Support Vector Machine (SVM) is among the most popular tools for a variety of machine-learning tasks, including classification. SVMs work by maximizing the margin between two classes. The underlying optimization problem is convex, which guarantees a globally optimal solution. The SVM weight vector is obtained as a linear combination of some of the boundary and noisy vectors. Further, when the data are not linearly separable, tuning the coefficient of the regularization term becomes crucial. Although SVMs popularized the kernel trick, linear SVMs are widely used in most practical high-dimensional applications. The text examines applications to social and information networks. It also discusses another popular linear classifier, the perceptron, and compares its performance with that of the SVM in different application areas.
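As a brief sketch of the formulation the blurb refers to, the soft-margin SVM for training pairs (x_i, y_i) with labels y_i in {-1, +1} can be written as below. The symbols w, b, xi_i, C, and alpha_i are the standard ones from the SVM literature and are not taken from this book's notation.

% Soft-margin SVM (primal): a convex quadratic program, so any
% optimum found is globally optimal.
\begin{align}
  \min_{w,\, b,\, \xi} \quad & \frac{1}{2}\lVert w \rVert^{2} + C \sum_{i=1}^{n} \xi_{i} \\
  \text{subject to} \quad & y_{i}\,\bigl(w^{\top} x_{i} + b\bigr) \ge 1 - \xi_{i}, \qquad \xi_{i} \ge 0, \quad i = 1, \dots, n.
\end{align}
% Solving the dual expresses the weight vector as a linear
% combination of the training vectors with nonzero multipliers
% (the support vectors: boundary and misclassified/noisy vectors).
\begin{equation}
  w = \sum_{i=1}^{n} \alpha_{i}\, y_{i}\, x_{i}, \qquad 0 \le \alpha_{i} \le C.
\end{equation}
% The regularization coefficient C trades margin width against
% slack on training errors; choosing it well matters most when
% the data are not linearly separable.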

Introduction

Linear Discriminant Function

Perceptron

Linear Support Vector Machines

Kernel Based SVM

Application to Social Networks

Conclusion

Presents a review of linear classifiers, with a focus on those based on linear discriminant functions

Discusses the application of support vector machines (SVMs) in link prediction in social networks

Describes the perceptron, another popular linear classifier, and compares its performance with that of the SVM in different application areas

Includes supplementary material: sn.pub/extras