A paper describing the intention and functionality of UOSLib appeared at the 23rd Workshop on Computational Intelligence in Dortmund in 2013:
Buschermoehle, A. and Huelsmann, J. and Brockmann, W.: UOSLib – A Library for Analysis of Online-Learning Algorithms. In: Proc. 23. Workshop on Computational Intelligence, Karlsruhe: KIT Scientific Publishing, 2013, pp. 355-369
The BibTeX entry for citation:
@inproceedings{UOSLib2013,
title={UOSLib – A Library for Analysis of Online-Learning Algorithms},
author={Buschermoehle, A. and Huelsmann, J. and Brockmann, W.},
booktitle={Proc. 23. Workshop Computational Intelligence},
year={2013},
publisher={KIT Scientific Publishing},
pages={355--369}
}
Manual
Documentation
The source files are divided into two groups, the basic framework and the specific
learning algorithms, which are located in the subdirectories src and algorithms, respectively.
The main file is icl_base, which can either be run as a
script for single experiments or called as a function for automatic evaluations.
The top of icl_base contains all setups necessary to run a single learning task:
- mode - choose CLA for classification or REG for regression
- learnMethod - string to choose the learning method (e.g. 'RLS')
- algSetup - parameters that are passed to the learning algorithm
- model - model structure used for approximation
- start - initial parameter vector
- targetFunc.target - string selecting the target function to be approximated (e.g. 'sine')
- targetFunc.ND - number of training data
- targetFunc.NG - number of ground truth data for comparison
- targetFunc.noise - variance of normally distributed noise on training data
- targetFunc.minPath - set to true to order the data in input space, false for random order
- livePlot - live visualization after every learning step
- fastmode - enable to skip generation of data loss and ground truth loss
- quiet - enable to suppress all messages and plots
- rSeed - seed for the random number generator to get reproducible results
Additionally, optional setups currently not available as function parameters (for specific investigations/methods) are:
- implementing different losses (see the variable loss in icl_base)
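As an illustration, the setup section at the top of icl_base might look like the following sketch. The variable names follow the list above, but the concrete values, the model structure, and the target-function options are examples only, not the library's actual defaults:

```matlab
% Sketch of a single-experiment setup (values are illustrative assumptions).
mode        = 'REG';       % REG for regression, CLA for classification
learnMethod = 'RLS';       % learning method to use
algSetup    = [];          % parameters passed to the learning algorithm
start       = [];          % initial parameter vector

targetFunc.target  = 'sine';  % target function to be approximated
targetFunc.ND      = 100;     % number of training data
targetFunc.NG      = 1000;    % number of ground truth data for comparison
targetFunc.noise   = 0.01;    % variance of normally distributed noise
targetFunc.minPath = false;   % false for random presentation order

livePlot = false;  % live visualization after every learning step
fastmode = false;  % skip generation of data loss and ground truth loss
quiet    = false;  % suppress all messages and plots
rSeed    = 42;     % seed for reproducible results
```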
Framework
- icl_base.m - This is the main function for running an experiment.
- icl_genGLT.m - Generate a grid-based lookup table approximation structure.
- icl_genGLTarb.m - Generate an arbitrary grid-based lookup table approximation structure.
- icl_genPoly.m - Generate a polynomial approximation structure.
- icl_initILS.m - Algorithm specific initialization of ILS (dummy file).
- icl_learn.m - Update the approximation with one data sample (dummy file).
- icl_loadDS.m - This is the data generator module to generate learning tasks.
- icl_predict.m - Predicts a label, i.e. evaluation of the approximator, for a given input.
- icl_showResult.m - Visualization module for low-dimensional learning tasks.
- icl_transform.m - Transforms a vector from input space to parameter space.
- icl_updateILS.m - Updates the ILS information with a new parameter vector.
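Based on the module descriptions above, a single experiment conceptually proceeds as in the following sketch. The function signatures shown here are assumptions for illustration, not the actual interface; consult the files themselves for the real arguments:

```matlab
% Conceptual flow of one experiment (signatures are illustrative only).
[data, groundTruth] = icl_loadDS(targetFunc);  % generate the learning task
ILS = icl_initILS(model, start, algSetup);     % algorithm-specific initialization

for k = 1:targetFunc.ND                        % one update per data sample
    ILS = icl_learn(ILS, data.x(k, :), data.y(k));
end

yHat = icl_predict(ILS, groundTruth.x);        % evaluate the approximator
```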
Learning Algorithms
- algorithms/icl_initILS_AROW.m - AROW specific initialization of ILS.
- algorithms/icl_initILS_CW.m - CW specific initialization of ILS.
- algorithms/icl_initILS_GH.m - GH specific initialization of ILS.
- algorithms/icl_initILS_IRMA.m - IRMA specific initialization of ILS.
- algorithms/icl_initILS_PA.m - PA specific initialization of ILS.
- algorithms/icl_initILS_Perceptron.m - Perceptron specific initialization of ILS.
- algorithms/icl_initILS_RLS.m - RLS specific initialization of ILS.
- algorithms/icl_learn_AROW.m - AROW learning algorithm.
- algorithms/icl_learn_CW.m - CW learning algorithm.
- algorithms/icl_learn_GH.m - GH learning algorithm.
- algorithms/icl_learn_IRMA.m - IRMA learning algorithm.
- algorithms/icl_learn_PA.m - PA learning algorithm.
- algorithms/icl_learn_Perceptron.m - Perceptron learning algorithm.
- algorithms/icl_learn_RLS.m - RLS learning algorithm.