Constrainfit

===Purpose===


Finds '''A''' minimizing ||X-A*B'|| subject to constraints, given the small matrices ('''X''' ' '''B''') and ('''B''' ' '''B''')


===Synopsis===
 
:  [A,diagnostics]=constrainfit(XB,BtB,Aold,options);  % Constrained
:  [A,diagnostics]=constrainfit(XB,BtB,Aold); % Unconstrained
 


===Description===


CONSTRAINFIT solves the least squares problem behind bilinear, trilinear and other multilinear models. Assuming a model '''X''' = '''A'''*'''B''' ' and assuming that '''X''' and '''B''' are known, the least squares estimate of '''A''' is obtained. Rather than using '''X''' and '''B''', this algorithm uses the cross product matrices ('''X''' ' '''B''') and ('''B''' ' '''B'''), which are generally smaller and less memory-demanding, especially in multi-way models.
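The unconstrained case illustrates why these cross products suffice; the following is a minimal sketch of the idea (not the toolbox implementation), assuming XB and BtB are the two small matrices described above:

 % Minimizing ||X - A*B'|| gives the normal equations A*(B'*B) = X*B, so the
 % unconstrained A can be computed from the two small cross-product matrices alone
 A = XB/BtB;   % should agree with [A]=constrainfit(XB,BtB,Aold) in the unconstrained case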


CONSTRAINFIT can handle a number of general types of regression problems, such as nonnegativity-constrained regression, regression with column-orthogonality of '''A''', etc. These constraints are simply set in the options field 'type', e.g. options.type='nonnegativity'. Thus, for most problems, only the 'type' field needs to be set. CONSTRAINFIT will provide a least squares solution to most of these problems.


CONSTRAINFIT can also find '''A''' subject to different constraints on different columns. In this case, the update of '''A''' will be an improvement of the initially provided estimate '''Aold''', though not necessarily the least squares solution. As CONSTRAINFIT is used inside iterative algorithms, an improvement is sufficient to guarantee overall convergence.
 


====Inputs====
* '''XB''' = The matrix '''X''' ' '''B'''.
* '''BtB''' = The matrix '''B''' ' '''B'''.
* '''Aold''' = An initial estimate of '''A'''.


====Optional Inputs====
* '''options''' = provides definitions for which type of constraint to impose.

====Outputs====
* '''A''' = The improved estimate of '''A'''.

===Options===

options = a structure array with the following fields:


* '''type''': [ {'unconstrained'} | 'nonnegativity' | 'unimodality' | 'orthogonality' | 'columnorthogonal' | 'equality' | 'exponential' | 'rightprod' | 'columnwise']
::: provides quick access to most important settings
::: ''''unconstrained''''  - do unconstrained fit of '''A'''
::: ''''nonnegativity''''  - '''A''' is all nonnegative
::: ''''unimodality''''    - '''A''' has unimodal columns ''and'' is nonnegative
::: ''''unimodality_nonon''''  - '''A''' has unimodal columns
::: ''''orthogonality''''  - '''A''' is orthogonal ('''A''' ' '''A''' = '''I''')
::: ''''columnorthogonal'''' - '''A''' has orthogonal columns ('''A''' ' '''A''' = diagonal)
::: ''''equality''''        - columns in '''A''' are subject to equality constraints. Useful for e.g. imposing closure (see settings under options.equality below)
::: ''''exponential''''    - Columns are mono-exponentials
::: ''''rightprod''''      - Fitting '''A''' subject to being of the form '''F*D''', where '''D''' is predefined (must be set in options.advanced.linearconstraints.matrix). If imposed, then columnwise constraints (see below) are applied to the columns of '''F''' rather than '''A'''. Hence options.columnconstraints must be set appropriately.
::: ''''L1 penalty''''      - '''A''' is estimated using a constraint that '''A''' should be sparse (the higher options.L1.penalty, the sparser '''A''' will be). Note: '''A''' is also constrained to be nonnegative (see the sketch under Examples below).
::: ''''columnwise''''      - '''A''' has other constraints than the above. These have to be defined in options.columnconstraints (see below).
* '''columnconstraints''': cell where element f defines constraints on column f (only applicable if options.type = 'columnwise').
::: columnconstraints is a cell vector {f1,f2,f3, ... fF}. Each element f1, f2, etc. corresponds to one column of A: f1 defines constraints on the first column of A, f2 on the second, and so on. Each constraint on a column is defined by a number. For example, if f1 is 1, then nonnegativity is imposed on the first column (see definitions below). If f1 = [1 4], then first nonnegativity is imposed and then smoothness. The following constraints are available on individual columns:
::: a = 0 : Unconstrained
::: a = 1 : Nonnegativity
::: a = 2 : Unimodality
::: a = 3 : Inequality (every element >= scalar). Scalar has to be in options.inequality.scalar. This is a vector of size F, one scalar for each factor
::: a = 4 : Smoothness. options.smoothness.operator can be used to hold the smoothing operator (for speed; it will not have to be estimated each time). options.smoothness.alpha (0<alpha<1) sets the degree: zero means no smoothness while 1 means a high degree of smoothness.
:::  a = 5 : Fixed elements. The elements that are fixed are defined in options.fixed.
::: a = 6 : Not applicable
::: a = 7 : Approximate unimodality. See options.unimodality.weight
::: a = 8 : Normalize the loading vectors to norm one
::: a = 20: Functional constraint. Using simple pre- or user-defined functions, any functional constraint can be imposed on individual columns, for example that one column is exponential. Functional constraints require that a function is written (type HELP FITGAUSS for an example).
::'''Example:''' Fitting the second loading vector as being Gaussian:
<pre>
    NumberFactors=3;
    options.functional=cell(NumberFactors,1);
    ToFix = 2; % This constraint is for the second column
    options.functional{ToFix}.functionhandle = @fitgauss;
    % Define starting parameters
    center = 100;width = 100;height = .1;
    options.functional{ToFix}.parameters = [center width height];
    options.functional{ToFix}.additional=[]; % no additional input
</pre>
When a column has more than one constraint, these are generally imposed sequentially, starting with the first one in options.columnconstraints. For most constraints, the order will not be important; the advice is to input constraints with smaller numbers first.
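For example, a sketch (using the same XB, BtB and Aold inputs as in the Synopsis; the alpha value is only illustrative) that imposes nonnegativity followed by smoothness on the first of three columns:

 opt = constrainfit('options');
 opt.type = 'columnwise';
 opt.columnconstraints = {[1 4];0;0};  % column 1: nonnegativity (1) then smoothness (4)
 opt.smoothness.alpha = 0.5;           % illustrative; 0 = no smoothness, 1 = high smoothness
 [A] = constrainfit(XB,BtB,Aold,opt);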


* '''inequality''' : Defines a cutoff. If inequality is defined in columnwise constraints, all elements of that column will be > options.inequality.scalar. Thus, when set to zero, nonnegativity is imposed.
* '''nonnegativity''': defines which algorithm to use for imposing nonnegativity when options.type = 'nonnegativity'. If set to 0, the default NNLS algorithm is used. If set to 1, a faster columnwise update is used which only improves the current least squares fit. If set to 2, an ad hoc approach is used where '''A''' is estimated in a least squares sense and then negative numbers are set to zero; this will not provide a well-defined solution in terms of the least squares loss function. If set to 3, the NMF algorithm is used; this requires that all elements of the data array are nonnegative in order to work properly. (See the example under Examples below.)
* '''smoothness''': defines how much smoothness is imposed when smoothness is used as a column constraint. smoothness.alpha is a number between 0 (no smoothness) and 1 (full smoothness).
* '''fixed''': options.fixed.values is a matrix of the same size as the loading matrix ('''A''') with the actual numbers to be fixed in the positions corresponding to their positions in '''A'''. The remaining positions must be NaN. The degree to which elements are fixed is set in options.fixed.weight (0<weight<1); zero means not imposed whereas one means completely fixed. (See the example under Examples below.)
* '''advanced''': In the field advanced.linearconstraints, settings for options.type = 'rightprod' are given. If '''A''' is IxF, then linearconstraints.matrix must be an SxF matrix '''D'''. '''A''' is then found as '''F*D'''. E.g., if '''A''' has three columns and the predefined matrix is D = [1 1 0; 0 0 1], then the first and second of the three columns in '''A''' will be identical ('''F*D''' where '''F''' is to be estimated). (See the sketch under Examples below.)
* '''equality''': Settings for using options.type = 'equality'. Two fields are held in equality, C and d. When imposed, CONSTRAINFIT solves for the loading matrix '''A''' subject to A(i,:)*C' = d for all i. Hence, if you want to impose closure and have three factors, set C=[1 1 1] and d=1. (See the sketch under Examples below.)
* '''unimodality''': Set the weight in options.unimodality.weight. weight==1: exact unimodality; weight==0: no unimodality.
* '''functional''': For functional constraints (see above).
* '''definitions''': @optiondefs
 
===Examples===
 


'''Setting global constraints on A'''
  opt = constrainfit('options');
  opt.type='nonnegativity';
  [A]=constrainfit(XB,BtB,Aold,opt); % Nonnegative
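If desired, the particular algorithm used to impose nonnegativity can also be selected via the options.nonnegativity field (a sketch using the numeric codes documented under Options above):
 opt = constrainfit('options');
 opt.type='nonnegativity';
 opt.nonnegativity = 0;   % 0 = NNLS (default), 1 = columnwise update, 2 = clip negatives, 3 = NMF
 [A]=constrainfit(XB,BtB,Aold,opt); % Nonnegative, explicit algorithm choice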


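'''Sparse nonnegative fit of A with an L1 penalty'''

A sketch based on the 'L1 penalty' type described under Options; the penalty value used here is only illustrative:
 opt = constrainfit('options');
 opt.type='L1 penalty';
 opt.L1.penalty = 0.1;    % illustrative value; the larger the penalty, the sparser A
 [A]=constrainfit(XB,BtB,Aold,opt); % Sparse and nonnegative
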
'''Setting constraints on just one column of A'''
  opt = constrainfit('options');
  opt.type='columnwise';
  opt.columnconstraints={0;2;0}; % If three columns
  [A]=constrainfit(XB,BtB,Aold,opt); % Second column unimodal
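
'''Imposing closure with equality constraints'''

A sketch for three factors, following the '''equality''' option described above (each row of A is constrained to sum to one):
 opt = constrainfit('options');
 opt.type='equality';
 opt.equality.C = [1 1 1];  % three factors
 opt.equality.d = 1;        % enforce A(i,:)*C' = 1 for all i (closure)
 [A]=constrainfit(XB,BtB,Aold,opt); % Rows of A sum to one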

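'''Forcing columns of A to be identical (rightprod)'''

A sketch based on the '''advanced''' option described above; D is the example matrix from that description:
 opt = constrainfit('options');
 opt.type='rightprod';
 opt.advanced.linearconstraints.matrix = [1 1 0; 0 0 1]; % D (SxF); columns 1 and 2 of A become identical
 [A]=constrainfit(XB,BtB,Aold,opt); % A is found as F*D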

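'''Fixing individual elements of A'''

A sketch combining columnwise constraint 5 with the '''fixed''' option; the fixed element and weight chosen here are only illustrative:
 opt = constrainfit('options');
 opt.type='columnwise';
 opt.columnconstraints = {5;0;0};          % fix elements in column 1 only
 opt.fixed.values = NaN(size(Aold));       % NaN marks free elements
 opt.fixed.values(1,1) = 0;                % illustrative: pin A(1,1) to zero
 opt.fixed.weight = 1;                     % 1 = completely fixed, 0 = not imposed
 [A]=constrainfit(XB,BtB,Aold,opt);
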
'''Using columnwise constraints inside PARAFAC'''
<pre>
% Make a noisy dataset such that PARAFAC gives noisy loadings
load aminoacids
x = X.data;
x = x+randn(size(x))*100;

% define parafac options
op=parafac('options');

% set constraints in second mode to be defined columnwise
op.constraints{2}.type='columnwise';

% Define that first column is smooth, second and third unconstrained
op.constraints{2}.columnconstraints={4 0 0};

% Fit model
model = parafac(x,3,op);
</pre>


Note how the first loading in the second mode is smoother than the rest. If needed, smoothness can be turned up (towards one) or down (towards zero) by setting, e.g., op.constraints{2}.smoothness.alpha=0.6


===Notes===
* When attempting to use the fixed constraint with non-negativity or unimodality, use both the .fixed and .columnconstraints fields. That is, put the values to fix in constraints{n,1}.fixed.values, and add the corresponding constraint codes (as a cell array) in constraints{n,1}.columnconstraints using valid columnconstraint values as described above.


===See Also===


[[parafac]]
