Revision as of 10:39, 29 July 2010
Cross-Validation Tool
You use the Cross-Validation tool to assess the predictive ability of a model and to determine the number of principal components, latent variables, or factors to retain in the model.
For a given set of data, cross-validation involves a series of steps, called subvalidation steps, in which you remove a subset of objects from the set of data (the test set), build a model using the remaining objects (the model-building set), and then apply the resulting model to the removed objects. You note how the prediction errors accumulate as samples are left out to determine the number of principal components/latent variables/factors to retain in the model. Cross-validation typically involves more than one subvalidation step, and each step selects a different subset of samples for model building and model testing. In Solo, five cross-validation methods are available; they differ in how the sample subsets are selected for these subvalidation steps.
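The subvalidation loop described above can be sketched in Python. This is an illustrative sketch only, not Solo's implementation: the `fit` and `predict` callables and the PRESS (predictive residual error sum of squares) accumulator are assumptions made for the example.

```python
import numpy as np

def cross_validate(X, y, splits, fit, predict):
    """Generic cross-validation loop (illustrative sketch, not Solo's code).

    splits : list of index arrays; each array is the test set for one
             subvalidation step.
    fit    : builds a model from the model-building set.
    predict: applies a model to the removed (test) objects.
    Returns the accumulated squared prediction error (PRESS).
    """
    n = len(y)
    press = 0.0
    for test_idx in splits:
        # Model-building set = everything not in the current test set.
        train_idx = np.setdiff1d(np.arange(n), test_idx)
        model = fit(X[train_idx], y[train_idx])
        # Apply the model to the removed objects and accumulate the error.
        residuals = y[test_idx] - predict(model, X[test_idx])
        press += float(np.sum(residuals ** 2))
    return press
```

In practice you would compute this error for each candidate number of components and keep the number at which the error stops decreasing.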
1. To open the Cross-Validation tool, do one of the following:
Note: You must load data into the Analysis window before the Cross-Validation icon is available.
''Cross-validation icon in the Analysis window''
2. In the Cross-Validation dialog box, select the method of cross-validation that you want to use.
''Cross-Validation dialog box''
3. Use the slider bars to change the default values for the available parameters.
Note: Not all parameters are relevant for all cross-validation methods. The initial values that are specified for the available parameters are default values based on the dimensionality of the data. You can click Reset at any time to restore the parameters to their default settings. In the following descriptions, ''n'' is the number of objects in the set of data, ''s'' is the number of data splits (test sets), and ''r'' is the number of iterations.
''Cross-validation methods compared''

{| border="1"
! Cross-validation method
! Leave One Out
! Venetian Blinds
! Contiguous Block
! Random Subsets
! Custom
|-
! Description
| The default value. All samples in the set of data are used to build the model.
| Each test set is determined by selecting every ''s''th object in the set of data, starting at objects numbered 1 through ''s''.
| An alternative to Venetian Blinds. Each test set is determined by selecting contiguous blocks of ''n''/''s'' objects in the set of data, starting at object number 1.
| ''s'' different test sets are determined through random selection of ''n''/''s'' objects in the set of data, such that no single object is in more than one test set. This procedure is repeated ''r'' times, where ''r'' is the number of iterations.
| You manually define each of the test sets by assigning specific objects in your set of data to them.
|-
! Available Parameters
| Maximum Number of LVs
|
|
|
|
|-
! # of Subvalidation Steps
| n
| s
| s
| s*r
| s
|-
! # of Test Samples per Subvalidation
| 1
| n/s
| n/s
| n/s
| Varies. User-defined.
|}
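The selection rules in the table above can be sketched as index generators. This is a hedged illustration of the three automatic, non-exhaustive methods (Leave One Out is simply ''s'' = ''n''); the function names are made up for the example, and indices are 0-based here whereas the table counts objects from 1.

```python
import numpy as np

def venetian_blinds(n, s):
    # Test set k takes every s-th object, starting at object k (0-based).
    return [np.arange(k, n, s) for k in range(s)]

def contiguous_blocks(n, s):
    # Split objects 0..n-1 into s contiguous blocks of roughly n/s objects.
    return list(np.array_split(np.arange(n), s))

def random_subsets(n, s, r, seed=None):
    # r iterations; each iteration randomly partitions the n objects into
    # s disjoint test sets, so no object is in two test sets per iteration.
    rng = np.random.default_rng(seed)
    splits = []
    for _ in range(r):
        perm = rng.permutation(n)
        splits.extend(np.array_split(perm, s))
    return splits
```

For example, with ''n'' = 6 and ''s'' = 3, Venetian Blinds yields test sets {0, 3}, {1, 4}, {2, 5}, while Contiguous Block yields {0, 1}, {2, 3}, {4, 5}; Random Subsets with ''r'' iterations yields ''s''·''r'' subvalidation steps, matching the counts in the table.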
4. Do one of the following: