Tools Cross-Validation
__TOC__

[[TableOfContents|Table of Contents]] | [[ModelApplication_ValidationPhase|Previous]] | [[Tools_ModelRobustness|Next]]
==Cross-Validation Tool==
You use the Cross-Validation tool to:
* Assess the optimal complexity of a model (for example, the number of principal components in a PCA or PCR model, or the number of latent variables in a PLS model).
* Estimate the performance of a model when you apply the model to unknown data.
For a given set of data, cross-validation involves a series of steps, called subvalidation steps, in which you remove a subset of objects from the set of data (the test set), build a model using the remaining objects (the model building set), and then apply the resulting model to the removed objects. You note how the prediction errors accumulate as samples are left out to determine the number of principal components, latent variables, or factors to retain in the model. Cross-validation typically involves more than one subvalidation step, each of which involves the selection of a different subset of samples for model building and model testing. In Solo, five different cross-validation methods are available; these methods vary with respect to how the sample subsets are selected for the subvalidation steps.
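As an informal illustration only (this is not Solo code), the Python sketch below shows the subvalidation loop described above; the test_sets, fit, and predict inputs are hypothetical placeholders for whatever split method and model are in use.

<pre>
import numpy as np

def cross_validate(X, y, test_sets, fit, predict):
    """Generic subvalidation loop: for each test set, build a model from the
    remaining objects (the model building set), apply it to the removed
    objects (the test set), and accumulate the squared prediction errors."""
    press, n_pred = 0.0, 0
    n = len(X)
    for test_idx in test_sets:                     # one subvalidation step per test set
        mask = np.zeros(n, dtype=bool)
        mask[test_idx] = True
        model = fit(X[~mask], y[~mask])            # build on the retained objects
        resid = y[mask] - predict(model, X[mask])  # predict the removed objects
        press += float(np.sum(resid ** 2))
        n_pred += int(mask.sum())
    return np.sqrt(press / n_pred)                 # root-mean-square error of cross-validation
</pre>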
1. To open the Cross-Validation tool, do one of the following:
:* On the Analysis window, click Tools > Cross-Validation.
:* Click the Cross-Validation icon in the Analysis window.
Note: You must load data into the Analysis window before the Cross-Validation icon is available.
''Cross-validation icon in the Analysis window''

[[Image:Cross_validation_icon_Analysis_window.png|406x83px]]
:* In the Analysis window Flowchart pane, click Choose Cross-Validation.

2. In the Cross-Validation dialog box, select the method of cross-validation that you want to use.
''Cross-Validation dialog box''
[[Image:Cross_validation_icon_dialog_box.png|288x138px]]
3. Use the slider bars to change the default values for the available parameters.
Note: Not all parameters are relevant for all cross-validation methods. The initial values of the available parameters are defaults based on the dimensionality of the data. You can click Reset at any time to restore the parameters to their default settings. For the following descriptions:
* n is the total number of objects in the set of data.
* s is the number of data splits specified for the cross-validation procedure, which must be less than n/2.
* r is the number of iterations.
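For example (values chosen only for illustration), if n = 20 objects and s = 5 data splits, each test set contains n/s = 4 objects; with the Random Subsets method and r = 10 iterations, the procedure performs s*r = 50 subvalidation steps in total.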
''Cross-validation methods compared''
{|
|-
|
| '''Leave One Out'''
| '''Venetian Blinds'''
| '''Contiguous Block'''
| '''Random Subsets'''
| '''Custom'''
|-
| Cross-validation method
| [[Image:CV_Leave_One_Out.jpg|86x126px]]
| [[Image:CVB_VenetianBlinds.jpg|84x126px]]
| [[Image:CV_ContinguousBlocks.jpg|85x126px]]
| [[Image:CV_RandomSubsets.jpg|85x127px]]
|
|-
| Description
| The default method. Each object in the set of data is used once as the test set, and all of the remaining objects are used to build the model.
| Each test set is determined by selecting every sth object in the set of data, starting at objects numbered 1 through s.
| An alternative to Venetian Blinds. Each test set is determined by selecting contiguous blocks of n/s objects in the set of data, starting at object number 1.
| s different test sets are determined through random selection of n/s objects in the set of data, such that no single object is in more than one test set. This procedure is repeated r times, where r is the number of iterations.
| You manually define each of the test sets. You can assign specific objects in your set of data in one of three ways:
* To be in every test set.
* To never be in a test set.
* To not be used in the cross-validation procedure at all.
|-
| Available Parameters
| Maximum Number of LVs
|
* Maximum Number of LVs
* Number of Data Splits
|
* Maximum Number of LVs
* Number of Data Splits
|
* Maximum Number of LVs
* Number of Data Splits
* Number of Iterations
|
* Number of Data Splits
* Object membership for each split
* Total number of objects
|-
| # of Subvalidation Steps
| n
| s
| s
| (s*r)
| s
|-
| # of Test Samples per Subvalidation
| 1
| n/s
| n/s
| n/s
| Varies; user-defined.
|}
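As a rough illustration of how the four automatic methods in the table select test sets (this is a Python sketch, not the code Solo uses; the function names are invented for this example), the test-set indices could be generated from n, s, and r as follows. The Custom method has no generator here because you define the membership of each split yourself.

<pre>
import numpy as np

def leave_one_out(n):
    """n test sets, each holding a single object."""
    return [[i] for i in range(n)]

def venetian_blinds(n, s):
    """Every sth object, with starting offsets 1..s (0..s-1 zero-based)."""
    return [list(range(offset, n, s)) for offset in range(s)]

def contiguous_block(n, s):
    """s consecutive blocks of roughly n/s objects, starting at the first object."""
    return [list(block) for block in np.array_split(np.arange(n), s)]

def random_subsets(n, s, r, seed=0):
    """r iterations; in each, the objects are shuffled and divided into s
    disjoint test sets, so no object appears in more than one test set
    within an iteration.  Yields s*r test sets in total."""
    rng = np.random.default_rng(seed)
    splits = []
    for _ in range(r):
        order = rng.permutation(n)
        splits.extend(list(block) for block in np.array_split(order, s))
    return splits
</pre>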
4. Do one of the following:
:* Click Apply to apply these settings and keep the Cross-Validation dialog box open.
:* Click OK to apply these settings and close the Cross-Validation dialog box.