Evri release management

From Eigenvector Research Documentation Wiki
Revision as of 11:06, 2 February 2021

Eigenvector Research Inc. (EVRI) follows industry standards for software production.

== Release Management ==

The EVRI release management process includes the following:

* Continuous development - EVRI staff all work from the latest version of products, so any changes are immediately tested in real-world use. Typically, a staff member will update from the repository at the beginning of the work day. Note that staff occasionally work from an existing release to maintain version compatibility with a customer.
* Version control - EVRI maintains modern version control software and processes to manage software development.
* Issue tracking - A software project management system for bug and enhancement tracking is used to document and manage issues.
* Testing - A mix of manual and automated testing is conducted to verify product quality. Each major release is regression-tested against prior releases to assess numeric consistency.

== Installation and Tracing Information ==

* All software is tested for common errors at the time of installation. A message appears in the installation window indicating that the tests are being run. If an issue is encountered, a warning dialog box appears; otherwise, the installation proceeds to open the main window.

[[Image:InstallationWindowTestingNote.jpg| | Installation Test Message]]

* Solo and PLS_Toolbox have some tools to help trace model development, but these are not part of a certified regulatory environment. Tracing model development can be done using the following functionality:
** The model [[Standard_Model_Structure#model.detail|detail.history]] field contains a log of datetime entries and the changes made to the model.
** The DataSet [[DataSet_Object_Fields#.history|.history]] field contains a running history of commands that have modified the DataSet contents.
** The [[Analysis_Window:_Model_Cache_Pane|Modelcache]] stores models and data upon calculation in the analysis interface. The model cache interface keeps a complete history of calculated models.
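As an illustration, the two history fields above can be inspected from the MATLAB command line when working with PLS_Toolbox. This is a minimal sketch, assuming <tt>model</tt> is a standard model structure returned by an analysis method and <tt>mydata</tt> is a DataSet object; the exact contents of each log depend on how the model and data were built.

```matlab
% Sketch: reviewing the traceability logs kept by PLS_Toolbox objects.
% 'model' and 'mydata' are assumed to already exist in the workspace.

% The model's detail.history field holds datetime-stamped entries
% describing changes made to the model:
model.detail.history

% The DataSet's .history field holds a running list of the commands
% that have modified the DataSet contents:
mydata.history
```

Reviewing these logs alongside the Model Cache provides a record of how a given model was produced, which can then be referenced in a customer's own validation documentation.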

== Model Validation Information ==

It is recommended that customers develop a model within Solo/PLS_Toolbox as COTS (commercial off-the-shelf) software, then validate the model to their own standards, paying special attention to how the model performs with previously unused data that spans the range of samples the model is expected to handle.

* Steps to consider when validating a model.