Decision boundary

From Wikipedia, the free encyclopedia

In a statistical-classification problem with two classes, a decision boundary or decision surface is a hypersurface that partitions the underlying vector space into two sets, one for each class. The classifier will classify all the points on one side of the decision boundary as belonging to one class and all those on the other side as belonging to the other class.

A decision boundary is the region of a problem space in which the output label of a classifier is ambiguous.[1]

If the decision surface is a hyperplane, then the classification problem is linear, and the classes are linearly separable.

Decision boundaries are not always clear-cut; the transition from one class in the feature space to another may be gradual rather than discontinuous. This effect is common in fuzzy-logic-based classification algorithms, where membership in one class or the other is a matter of degree rather than a hard assignment.
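A minimal sketch of such a gradual transition, using scikit-learn's logistic regression as a stand-in for a soft classifier (the data and model here are illustrative, not from the article):

    # A probabilistic classifier's output changes smoothly across the decision
    # boundary instead of jumping abruptly from one label to the other.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # Two overlapping one-dimensional Gaussian classes.
    X = np.concatenate([rng.normal(-1.0, 1.0, 200),
                        rng.normal(1.0, 1.0, 200)]).reshape(-1, 1)
    y = np.array([0] * 200 + [1] * 200)

    clf = LogisticRegression().fit(X, y)

    # Probe points crossing the boundary (P = 0.5 near x = 0).
    probe = np.linspace(-3.0, 3.0, 7).reshape(-1, 1)
    for x, p in zip(probe.ravel(), clf.predict_proba(probe)[:, 1]):
        print(f"x = {x:+.1f}  P(class 1) = {p:.2f}")

Near the boundary the predicted probability hovers around 0.5, which is exactly the ambiguity described above.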

Decision boundaries can be approximations of optimal stopping boundaries.[2] For a linear classifier with weight vector w, the decision boundary is the set of points at which the score w · x passes through zero.[3] Equivalently, the inner product between w and a point is zero for points on the decision boundary, and close to zero for points near it, so such points are orthogonal to w.[4]
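A minimal sketch of this zero-score characterization (the weight vector and points are made up for the example):

    # The decision boundary of a linear classifier through the origin is the
    # set of points x with w . x = 0; those points are orthogonal to w.
    import numpy as np

    w = np.array([2.0, -1.0])  # hypothetical weight vector

    def predict(x):
        # Label a point by the sign of its score w . x.
        return np.sign(w @ x)

    on_boundary = np.array([1.0, 2.0])   # score: 2*1 - 1*2 = 0 -> on the boundary
    off_boundary = np.array([3.0, 1.0])  # score: 2*3 - 1*1 = 5 -> positive side

    print(w @ on_boundary)        # 0.0, orthogonal to w
    print(predict(off_boundary))  # 1.0, classified on the positive side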

Decision boundary instability can be combined with generalization error as a criterion for selecting the most accurate and stable classifier.[5]

In neural networks and support vector machines

In the case of backpropagation-based artificial neural networks or perceptrons, the type of decision boundary that the network can learn is determined by the number of hidden layers the network has. If it has no hidden layers, then it can only learn linear problems. If it has one hidden layer, then it can learn any continuous function on compact subsets of R^n, as shown by the universal approximation theorem; thus it can have an arbitrary decision boundary.
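A minimal sketch of this contrast on the classic XOR problem, assuming scikit-learn (the layer size and seed are arbitrary choices for the example):

    # A linear model (no hidden layer) cannot separate XOR, while a network
    # with one hidden layer can learn the required nonlinear boundary.
    import numpy as np
    from sklearn.linear_model import Perceptron
    from sklearn.neural_network import MLPClassifier

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 0])  # XOR: not linearly separable

    linear = Perceptron(max_iter=1000).fit(X, y)
    mlp = MLPClassifier(hidden_layer_sizes=(8,), activation="tanh",
                        max_iter=5000, random_state=0).fit(X, y)

    print("no hidden layer:", linear.score(X, y))   # at most 0.75 on XOR
    print("one hidden layer:", mlp.score(X, y))     # typically 1.0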

In particular, support vector machines find a hyperplane that separates the feature space into two classes with the maximum margin. If the problem is not originally linearly separable, the kernel trick can be used to turn it into a linearly separable one by increasing the number of dimensions. Thus a general hypersurface in a low-dimensional space is turned into a hyperplane in a much higher-dimensional space.
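A minimal sketch of the kernel trick's effect, assuming scikit-learn (the concentric-circles dataset is a standard example of data that is not linearly separable in its original space):

    # Concentric circles cannot be split by a hyperplane in 2-D, but an RBF
    # kernel implicitly maps them to a higher-dimensional space where one exists.
    from sklearn.datasets import make_circles
    from sklearn.svm import SVC

    X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

    print("linear kernel:", SVC(kernel="linear").fit(X, y).score(X, y))  # ~0.5
    print("RBF kernel:", SVC(kernel="rbf").fit(X, y).score(X, y))        # ~1.0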

Neural networks try to learn the decision boundary that minimizes the empirical error, while support vector machines try to learn the decision boundary that maximizes the margin between the boundary and the data points.
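A minimal sketch contrasting the two objectives on the same linearly separable data, assuming scikit-learn (the dataset and seed are illustrative):

    # Logistic regression minimizes the empirical log loss over all points,
    # while a linear SVM maximizes the margin, so its boundary is determined
    # only by the points nearest to it (the support vectors).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal([-2.0, 0.0], 0.5, (50, 2)),
                   rng.normal([2.0, 0.0], 0.5, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)

    lr = LogisticRegression().fit(X, y)
    svm = SVC(kernel="linear").fit(X, y)

    print("logistic boundary normal:", lr.coef_[0])
    print("SVM boundary normal:", svm.coef_[0])
    print("support vectors:", len(svm.support_))  # only a few points matter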

References

  1. ^ Corso, Jason J. (Spring 2013). "Quiz 1 of 14 - Solutions" (PDF). Department of Computer Science and Engineering, University at Buffalo School of Engineering and Applied Sciences. Johnson, David.
  2. ^ Whittle, P. (1973). "An Approximate Characterisation of Optimal Stopping Boundaries". Journal of Applied Probability. 10 (1): 158–165. doi:10.2307/3212503. ISSN 0021-9002. JSTOR 3212503. Retrieved 2022-11-28.
  3. ^ Support vector machine lecture notes (PDF). INFO-4604, University of Colorado Boulder. https://cmci.colorado.edu/classes/INFO-4604/files/notes_svm.pdf
  4. ^ Laber, Eric B.; Murphy, Susan A. (2011). "Rejoinder". Journal of the American Statistical Association. 106 (495): 940–945. ISSN 0162-1459. JSTOR 23427564. Retrieved 2022-11-28.
  5. ^ Sun, Will Wei; Cheng, Guang; Liu, Yufeng (2018). "Stability Enhanced Large-Margin Classifier Selection". Statistica Sinica. arXiv:1701.05672. doi:10.5705/ss.202016.0260. ISSN 1017-0405. Retrieved 2022-11-28.

Further reading

  • Duda, Richard O.; Hart, Peter E.; Stork, David G. (2001). Pattern Classification (2nd ed.). New York: Wiley. pp. 215–281. ISBN 0-471-05669-3.