Jensen: A toolkit with API support for Convex Optimization and Machine Learning. For further documentation, please see https://arxiv.org/abs/1807.06574
Copyright (C) Rishabh Iyer, John T. Halloran, and Kai Wei. Licensed under the Open Software License version 3.0. See COPYING or http://opensource.org/licenses/OSL-3.0
- Rishabh Iyer
- John Halloran
- Kai Wei
- Convex Function API
- Base class for convex optimization
L1LogisticLoss and L2LogisticLoss, L1SmoothSVMLoss and L2SmoothSVMLoss, L1HingeSVMLoss and L2HingeSVMLoss, L1ProbitLoss and L2ProbitLoss, L1HuberSVMLoss and L2HuberSVMLoss, L1SmoothSVRLoss and L2SmoothSVRLoss, L1HingeSVRLoss and L2HingeSVRLoss
- Convex Optimization Algorithms API
Trust Region Newton (TRON), LBFGS Algorithm, LBFGS OWL (L1 regularization), Conjugate Gradient Descent, Dual Coordinate Descent for SVMs (SVCDual), Gradient Descent, Gradient Descent with Line Search, Gradient Descent with Nesterov's algorithm, Gradient Descent with Barzilai-Borwein step size, Stochastic Gradient Descent, Stochastic Gradient Descent with AdaGrad, Stochastic Gradient Descent with Dual Averaging, Stochastic Gradient Descent with Decaying Learning Rate
- ML Classification API
L1 Logistic Regression, L2 Logistic Regression, L1 Smooth SVM, L2 Smooth SVM
- ML Regression API
L1 Linear Regression, L2 Linear Regression, L1 Smooth SVRs, L2 Smooth SVRs, L2 Hinge SVRs
- Install CMake
- Go to the main directory of jensen
- mkdir build
- cd build/
- cmake ..
- make
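The steps above can be run as a single shell session from the root of the jensen source tree (assumes CMake and a C++ toolchain are already installed):

```shell
# From the main directory of jensen (assumes cmake and make are installed)
mkdir build    # out-of-source build directory
cd build
cmake ..       # generate the build system
make           # build the library, tests, and examples
```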
Running make builds the entire library. Once the build completes, try out the example code in the build directory.
To test the optimization algorithms, run the test executables: ./TestL1LogisticLoss, ./TestL2LogisticLoss, ./TestL1SmoothSVMLoss, ./TestL2LeastSquaresLoss, etc.
You can also play around with the examples for testing classification and regression models. For example:

./ClassificationExample -trainFeatureFile ../data/heart_scale.feat -trainLabelFile ../data/heart_scale.label -testFeatureFile ../data/heart_scale.feat -testLabelFile ../data/heart_scale.label

Optionally, you can also vary the method (L1LR, L2LR, etc.), the algtype (LBFGS, TRON, etc.), the regularization, and so on.