https://mooseframework.inl.gov
StochasticTools::GaussianProcess::GPOptimizerOptions Struct Reference

Structure containing the optimization options for hyperparameter-tuning. More...

#include <GaussianProcess.h>

Public Member Functions

 GPOptimizerOptions ()
 	Default constructor.

 GPOptimizerOptions (const bool show_every_nth_iteration=1, const unsigned int num_iter=1000, const unsigned int batch_size=0, const Real learning_rate=1e-3, const Real b1=0.9, const Real b2=0.999, const Real eps=1e-7, const Real lambda=0.0)
 	Construct a new GPOptimizerOptions object using input parameters that control the optimization.

Public Attributes

const unsigned int show_every_nth_iteration = 0
 	Switch to enable verbose output for parameter tuning at every n-th iteration (0 disables it).

const unsigned int num_iter = 1000
 	The number of iterations for the Adam optimizer.

const unsigned int batch_size = 0
 	The batch size for the Adam optimizer.

const Real learning_rate = 1e-3
 	The learning rate for the Adam optimizer.

const Real b1 = 0.9
 	Tuning parameter from the Adam paper: exponential decay rate for the first-moment estimate.

const Real b2 = 0.999
 	Tuning parameter from the Adam paper: exponential decay rate for the second-moment estimate.

const Real eps = 1e-7
 	Tuning parameter from the Adam paper: small constant guarding the division in the update.

const Real lambda = 0.0
 	Tuning parameter from the Adam paper.
 

Detailed Description

Structure containing the optimization options for hyperparameter-tuning.

Definition at line 47 of file GaussianProcess.h.

Constructor & Destructor Documentation

◆ GPOptimizerOptions() [1/2]

StochasticTools::GaussianProcess::GPOptimizerOptions::GPOptimizerOptions ( )

Default constructor.

◆ GPOptimizerOptions() [2/2]

StochasticTools::GaussianProcess::GPOptimizerOptions::GPOptimizerOptions (const bool show_every_nth_iteration = 1,
    const unsigned int num_iter = 1000,
    const unsigned int batch_size = 0,
    const Real learning_rate = 1e-3,
    const Real b1 = 0.9,
    const Real b2 = 0.999,
    const Real eps = 1e-7,
    const Real lambda = 0.0)

Construct a new GPOptimizerOptions object using input parameters that will control the optimization.

Parameters
    show_every_nth_iteration	To show the loss value at every n-th iteration; if set to 0, nothing is displayed
    num_iter	The number of iterations we want in the optimization of the GP
    batch_size	The number of samples in each batch
    learning_rate	The learning rate for parameter updates
    b1	Tuning constant for the Adam algorithm
    b2	Tuning constant for the Adam algorithm
    eps	Tuning constant for the Adam algorithm
    lambda	Tuning constant for the Adam algorithm

Definition at line 27 of file GaussianProcess.C.

GPOptimizerOptions::GPOptimizerOptions(
    const bool show_every_nth_iteration, const unsigned int num_iter,
    const unsigned int batch_size, const Real learning_rate,
    const Real b1, const Real b2, const Real eps, const Real lambda)
  : show_every_nth_iteration(show_every_nth_iteration), num_iter(num_iter),
    batch_size(batch_size), learning_rate(learning_rate),
    b1(b1), b2(b2), eps(eps), lambda(lambda)
{
}

Member Data Documentation

◆ b1

const Real StochasticTools::GaussianProcess::GPOptimizerOptions::b1 = 0.9

Tuning parameter from the Adam paper: exponential decay rate for the first-moment estimate.

Definition at line 82 of file GaussianProcess.h.

Referenced by StochasticTools::GaussianProcess::tuneHyperParamsAdam().

◆ b2

const Real StochasticTools::GaussianProcess::GPOptimizerOptions::b2 = 0.999

Tuning parameter from the Adam paper: exponential decay rate for the second-moment estimate.

Definition at line 84 of file GaussianProcess.h.

Referenced by StochasticTools::GaussianProcess::tuneHyperParamsAdam().

◆ batch_size

const unsigned int StochasticTools::GaussianProcess::GPOptimizerOptions::batch_size = 0

The batch size for the Adam optimizer.

Definition at line 78 of file GaussianProcess.h.

Referenced by GaussianProcessTrainer::GaussianProcessTrainer(), and StochasticTools::GaussianProcess::setupCovarianceMatrix().

◆ eps

const Real StochasticTools::GaussianProcess::GPOptimizerOptions::eps = 1e-7

Tuning parameter from the Adam paper: small constant guarding the division in the update.

Definition at line 86 of file GaussianProcess.h.

Referenced by StochasticTools::GaussianProcess::tuneHyperParamsAdam().

◆ lambda

const Real StochasticTools::GaussianProcess::GPOptimizerOptions::lambda = 0.0

Tuning parameter from the Adam paper.

Definition at line 88 of file GaussianProcess.h.

◆ learning_rate

const Real StochasticTools::GaussianProcess::GPOptimizerOptions::learning_rate = 1e-3

The learning rate for the Adam optimizer.

Definition at line 80 of file GaussianProcess.h.

Referenced by StochasticTools::GaussianProcess::tuneHyperParamsAdam().

◆ num_iter

const unsigned int StochasticTools::GaussianProcess::GPOptimizerOptions::num_iter = 1000

The number of iterations for the Adam optimizer.

Definition at line 76 of file GaussianProcess.h.

Referenced by StochasticTools::GaussianProcess::tuneHyperParamsAdam().

◆ show_every_nth_iteration

const unsigned int StochasticTools::GaussianProcess::GPOptimizerOptions::show_every_nth_iteration = 0

Switch to enable verbose output for parameter tuning at every n-th iteration.

Definition at line 74 of file GaussianProcess.h.

Referenced by StochasticTools::GaussianProcess::tuneHyperParamsAdam().


The documentation for this struct was generated from the following files:
GaussianProcess.h
GaussianProcess.C