INTRODUCTION
- Instance-based learning methods such as nearest neighbor and locally weighted regression are conceptually straightforward approaches to approximating real-valued or discrete-valued target functions.
- Learning in these algorithms consists of simply storing the presented training data. When a new query instance is encountered, a set of similar related instances is retrieved from memory and used to classify the new query instance.
- Instance-based approaches can construct a different approximation to the target function for each distinct query instance that must be classified.
Advantages of Instance-based learning
- Training is very fast
- Can learn complex target functions
- No information from the training data is lost
Disadvantages of Instance-based learning
- The cost of classifying new instances can be high, because nearly all computation takes place at classification time rather than when the training examples are first encountered.
- Many instance-based approaches, especially nearest-neighbor approaches, typically consider all attributes of the instances when retrieving similar training examples from memory. If the target concept depends on only a few of the many available attributes, then instances that are truly most "similar" may well be a large distance apart, as illustrated in the sketch below.
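A small sketch of that last point, assuming NumPy and made-up feature values: two instances that agree exactly on the single relevant attribute end up far apart under Euclidean distance once many irrelevant attributes are included.

```python
import numpy as np

# Two instances that agree on the one attribute the target concept
# actually depends on (the first feature)...
relevant_a = np.array([1.0])
relevant_b = np.array([1.0])
print(np.linalg.norm(relevant_a - relevant_b))   # 0.0 -- truly identical

# ...but appended irrelevant, randomly varying attributes push them
# far apart under Euclidean distance.
rng = np.random.default_rng(0)
noisy_a = np.concatenate([relevant_a, rng.uniform(0.0, 10.0, size=20)])
noisy_b = np.concatenate([relevant_b, rng.uniform(0.0, 10.0, size=20)])
print(np.linalg.norm(noisy_a - noisy_b))         # much greater than 0
```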
Ques. Write a short note on instance-based learning.
Answer:
Instance-based learning is a family of learning algorithms that, instead of performing explicit generalization, compare new problem instances with instances seen in training, which have been stored in memory. They are sometimes referred to as lazy learning methods because they delay processing until a new instance must be classified. The nearest neighbors of an instance are defined in terms of Euclidean distance.
- No model is learned
- The stored training instances themselves represent the knowledge
- Training instances are searched for the instance that most closely resembles the new instance, as in the sketch below
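A minimal sketch of this lazy behaviour, assuming NumPy; the class name `OneNearestNeighbor` and the toy data are illustrative, not taken from any particular library.

```python
import numpy as np

class OneNearestNeighbor:
    """A minimal lazy learner: 'training' only stores the data; all work
    happens at query time, when the stored instances are scanned for the
    one closest (in Euclidean distance) to the query."""

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = list(y)
        return self

    def predict_one(self, query):
        distances = np.linalg.norm(self.X - np.asarray(query, dtype=float), axis=1)
        return self.y[int(np.argmin(distances))]

# Hypothetical toy data: two 2-D classes.
clf = OneNearestNeighbor().fit([[0, 0], [0, 1], [5, 5], [6, 5]],
                               ["a", "a", "b", "b"])
print(clf.predict_one([0.2, 0.4]))  # -> "a"
print(clf.predict_one([5.5, 5.1]))  # -> "b"
```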
Ques. Explain instance-based learning representation.
Answer:
Instance-based learning generates classification predictions using only specific stored instances; the algorithms do not maintain a set of abstractions derived from those instances. The stored training instances themselves are the representation of the learned knowledge. This approach extends the nearest neighbor algorithm, which has large storage requirements.
Ques. What are the performance dimensions used for instance-based learning algorithm?
Answer:
The time complexity of instance-based learning algorithms depends on the size of the training data. In the worst case, classifying a single new instance requires comparing it against every stored training item, giving a cost of O(n), where n is the number of training instances. Storage requirements also grow with n, since all training instances are retained.
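A rough sketch of this behaviour, assuming NumPy (absolute timings will vary by machine): the per-query cost of the linear scan grows roughly in proportion to the number of stored instances n.

```python
import time
import numpy as np

def query_cost(n, d=10):
    """Time one nearest-neighbor query against n stored instances."""
    rng = np.random.default_rng(0)
    stored = rng.standard_normal((n, d))   # the 'memorized' training data
    query = rng.standard_normal(d)
    start = time.perf_counter()
    np.argmin(np.linalg.norm(stored - query, axis=1))  # scan all n items
    return time.perf_counter() - start

# Doubling the number of stored instances roughly doubles the per-query
# cost, consistent with the O(n) worst case described above.
print(query_cost(200_000), query_cost(400_000))
```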
Ques. What are the functions of instance-based learning ?
Answer:
Instance-based learning refers to a family of techniques for classification and regression, which produce a class label/prediction based on the similarity of the query to its nearest neighbor(s) in the training set.
Its functions are as follows (see the sketch after this list):
- Similarity: a measure of how alike two or more instances are, computed with an algorithmic distance function (for example, Euclidean distance) in a nearest-neighbor approach.
- Classification: the process of categorizing a given set of data into classes; it can be performed on both structured and unstructured data. The process starts with predicting the class of given data points, and the classes are often referred to as targets, labels, or categories.
- Concept description: much of human learning involves acquiring general concepts from past experience; in instance-based learning the stored instances themselves serve as the concept description, which can then be used to predict the class labels of unlabeled cases.
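A short sketch tying the first two functions together, with hypothetical helper names and toy data: a distance-based similarity function is used to classify a query by the majority label among its k most similar stored instances.

```python
from collections import Counter
import math

def euclidean(a, b):
    """Similarity function: a smaller distance means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(stored, query, k=3):
    """Classification: majority label among the k most similar instances."""
    neighbors = sorted(stored, key=lambda item: euclidean(item[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

# Hypothetical stored training instances: (features, label) pairs.
stored = [((0, 0), "a"), ((1, 0), "a"), ((0, 1), "a"),
          ((5, 5), "b"), ((6, 5), "b"), ((5, 6), "b")]
print(knn_classify(stored, (0.5, 0.5)))  # -> "a"
print(knn_classify(stored, (5.5, 5.5)))  # -> "b"
```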
Ques. What are the advantages and disadvantages of instance-based learning ?
Answer:
Advantages of instance-based learning:
- It adapts easily to previously unseen data: a new instance can simply be stored, and an old instance can be dropped, without retraining a model.
Disadvantages of instance-based learning:
- Classification costs are high.
- A large amount of memory is required to store the training data, and each query involves building a local approximation from scratch.