This document will be used to keep track of changes made between release versions. I'll do my best to note any breaking changes!
- None
- Add a new `datasets` module behind a `datasets` feature flag.
- Add new classification scores: `precision`, `recall`, and `f1` (see the sketch after this list).
- Add a new `Transformer::fit` function to allow prefitting of a `Transformer` before use.
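The exact signatures of the new score functions live in the API docs; as a plain-Rust reference (not the crate's API), the three quantities are:

```rust
// Precision, recall and F1 for binary predictions.
// precision = tp / (tp + fp), recall = tp / (tp + fn), f1 = harmonic mean of the two.
fn scores(preds: &[bool], targets: &[bool]) -> (f64, f64, f64) {
    let (mut tp, mut fp, mut fn_count) = (0.0, 0.0, 0.0);
    for (&p, &t) in preds.iter().zip(targets) {
        match (p, t) {
            (true, true) => tp += 1.0,
            (true, false) => fp += 1.0,
            (false, true) => fn_count += 1.0,
            (false, false) => {}
        }
    }
    let precision = tp / (tp + fp);
    let recall = tp / (tp + fn_count);
    (precision, recall, 2.0 * precision * recall / (precision + recall))
}
```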
- None
- `LinRegressor` now uses `solve` instead of `inverse` for improved accuracy and stability.
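The change is internal to `train`, so typical usage is unchanged; a minimal sketch with made-up data:

```rust
use rusty_machine::learning::lin_reg::LinRegressor;
use rusty_machine::learning::SupModel;
use rusty_machine::linalg::{Matrix, Vector};

fn main() {
    // Toy data: y = 2x + 1.
    let inputs = Matrix::new(4, 1, vec![1.0, 2.0, 3.0, 4.0]);
    let targets = Vector::new(vec![3.0, 5.0, 7.0, 9.0]);

    let mut model = LinRegressor::default();
    // The normal equations are now solved directly rather than via an explicit inverse.
    model.train(&inputs, &targets).unwrap();
    let predictions = model.predict(&inputs).unwrap();
    println!("{:?}", predictions);
}
```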
- None
- Adding a new `confusion_matrix` module.
- None
- Updated rulinalg dependency to `0.3.7`.
- None
- None
- The regularization constant for GMM is now added only to the diagonal.
- Added better `Result` handling to GMM.
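A short sketch of the `Result`-based flow for GMM, with made-up data:

```rust
use rusty_machine::learning::gmm::GaussianMixtureModel;
use rusty_machine::learning::UnSupModel;
use rusty_machine::linalg::Matrix;

fn main() {
    // Two well-separated blobs in 2D.
    let inputs = Matrix::new(6, 2, vec![1.0, 2.0, 1.1, 1.9, 0.9, 2.1,
                                        5.0, 5.2, 4.9, 5.1, 5.1, 4.9]);

    let mut gmm = GaussianMixtureModel::new(2);
    gmm.set_max_iters(50);

    // Failures (e.g. a degenerate covariance) now surface as Err values rather than panics.
    gmm.train(&inputs).unwrap();
    let posteriors = gmm.predict(&inputs).unwrap();
    println!("{:?}", posteriors);
}
```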
This version includes no changes; it is a version bump due to a crates.io bug.
See the notes for 0.5.0 below.
This is another fairly large release. Thank you to everyone who contributed!
- The `SupModel` and `UnSupModel` traits now return `Result`s for the `train` and `predict` functions (see the sketch after this list).
- Updated to rulinalg v0.3 (see the rulinalg changelog for details).
- Adding RMSProp gradient descent algorithm. #121
- Adding cross validation. #125
- Adding a new `Shuffler` transformer. #135
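A sketch of the new `Result`-based flow, using K-Means as the example model (the data is made up):

```rust
use rusty_machine::learning::k_means::KMeansClassifier;
use rusty_machine::learning::UnSupModel;
use rusty_machine::linalg::Matrix;

fn main() {
    let inputs = Matrix::new(4, 2, vec![1.0, 1.0, 1.2, 0.9, 9.0, 9.1, 8.8, 9.0]);

    let mut model = KMeansClassifier::new(2);
    // `train` and `predict` now return a LearningResult instead of panicking on failure.
    model.train(&inputs).unwrap();
    let classes = model.predict(&inputs).unwrap();
    println!("{:?}", classes);
}
```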
- None
- Adding benchmarks.
- Initialize GMM with the sample covariance of the data (instead of the identity matrix).
- None
- Adding a new `Transformer` trait for data preprocessing.
- Adding a `MinMax` transformer.
- Adding a `Standardizer` transformer.
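A minimal sketch of the transformer flow, assuming `Standardizer` and the `Transformer` trait are exposed under `rusty_machine::data::transforms` (paths may differ between versions):

```rust
use rusty_machine::data::transforms::{Standardizer, Transformer};
use rusty_machine::linalg::Matrix;

fn main() {
    let inputs = Matrix::new(3, 2, vec![1.0, 10.0, 2.0, 20.0, 3.0, 30.0]);

    // Standardizer rescales each column to zero mean and unit variance.
    let mut standardizer = Standardizer::default();
    let transformed = standardizer.transform(inputs).unwrap();
    println!("{:?}", transformed);
}
```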
- None
- tafia (who is responsible for all changes in this release).
- None
- None
- Made neural nets more efficient by reducing clones and some restructuring.
- Removing unneeded copying in favour of slicing for performance.
- Using `iter_rows` in favour of manually iterating over rows by chunks.
- None
- None
- Fixed a significant bug in the K-Means algorithm. Centroids were not updating correctly during M-step.
- None
- Added experimental implementation of DBSCAN clustering (see the sketch after this list).
- Added new example for K-Means clustering in repo.
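A sketch of the experimental DBSCAN API, shown against the present-day interface where `train` returns a `Result` (the `eps`/`min_points` values are arbitrary):

```rust
use rusty_machine::learning::dbscan::DBSCAN;
use rusty_machine::learning::UnSupModel;
use rusty_machine::linalg::Matrix;

fn main() {
    let inputs = Matrix::new(6, 2, vec![1.0, 2.0, 1.1, 2.2, 0.9, 1.9,
                                        1.0, 2.1, -2.0, 3.0, -2.2, 3.1]);

    // eps = 0.5, min_points = 2 are arbitrary choices for this sketch.
    let mut model = DBSCAN::new(0.5, 2);
    model.train(&inputs).unwrap();

    // Each point is assigned Some(cluster index) or None for noise.
    let clustering = model.clusters();
    println!("{:?}", clustering);
}
```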
This is the biggest release so far, primarily because the `linalg` module has been pulled out into its own crate: rulinalg. In addition there have been a number of improvements to the `linalg` and `learning` modules in this release.
- The `linalg` module has been pulled out and replaced by re-exports of rulinalg. All structs are now imported at the `linalg` level, i.e. `linalg::matrix::Matrix` -> `linalg::Matrix`.
- Decomposition methods now return `Result` instead of panicking on failure.
- K-Means now has a trait for `Initializer`, which allows generic initialization algorithms.
- New error handling in both the `linalg` (now rulinalg) and `learning` modules.
- Bug fixed in eigendecomposition: it can now be used!
- K-means can now take a generic initialization algorithm (see the sketch below).
- Optimization and code cleanup in the decomposition methods.
- Some optimization in the K-Means model.
- ic (Added examples to repo!)
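A sketch of choosing an initializer explicitly, assuming the `new_specified` constructor and `KPlusPlus` initializer as found in later releases:

```rust
use rusty_machine::learning::k_means::{KMeansClassifier, KPlusPlus};
use rusty_machine::learning::UnSupModel;
use rusty_machine::linalg::Matrix;

fn main() {
    let inputs = Matrix::new(4, 2, vec![1.0, 1.0, 1.2, 0.8, 9.0, 9.0, 8.8, 9.2]);

    // 2 clusters, at most 100 iterations, k-means++ initialization.
    let mut model = KMeansClassifier::new_specified(2, 100, KPlusPlus);
    model.train(&inputs).unwrap();
    let classes = model.predict(&inputs).unwrap();
    println!("{:?}", classes);
}
```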
- Parameter methods now return `Option<&Type>` instead of `&Option<Type>` (see the sketch after this list).
- `MatrixSlice` and `MatrixSliceMut` now have `IntoIterator` methods.
- Adding examples to the repository.
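The practical effect of the parameters change, sketched with `LinRegressor`:

```rust
use rusty_machine::learning::lin_reg::LinRegressor;

fn main() {
    let model = LinRegressor::default();

    // Untrained models expose no parameters: `parameters` now returns Option<&_>
    // rather than a reference to an internal Option.
    assert!(model.parameters().is_none());
}
```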
- DarkDrek (Who is responsible for almost all changes in this release. Thank you!)
- `Matrix`: the `mean` and `variance` methods now take an `Axes` enum instead of a `usize` flag for the dimension (see the sketch after this list).
- Assignment operators (`+=`, `-=`, etc.) now implemented for `Vector`.
- Some optimizations to `variance` computation for `Matrix`.
- Some code cleanup - thanks to clippy.
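A sketch of the `Axes` change (import paths and trait bounds vary between the in-crate `linalg` of the time and later rulinalg releases):

```rust
use rulinalg::matrix::{Axes, Matrix};

fn main() {
    let m = Matrix::new(2, 2, vec![1.0, 2.0, 3.0, 4.0]);

    // Mean over the row axis: one value per column (previously selected with a usize flag).
    let column_means = m.mean(Axes::Row);
    println!("{:?}", column_means); // means of the two columns: 2.0 and 3.0
}
```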
- None
- New helper methods to access GMM distribution parameters.
- New GMM constructor to choose different prior mixture weights.
- Fixed a bug where GMM covariances were incorrectly computed when using diagonal constraint.
- All fields on `GradDesc` and `StochasticGD` are now private.
- Matrix slices now have the same lifetime as their target data.
- Adding new slice utility methods: `from_raw_parts` for `MatrixSlice`s and `as_slice` methods for `Matrix`.
- Adding a framework for regularization. Implementing regularization for nnets.
- Adding early stopping to gradient descent algorithms.
- Adding `AdaGrad` gradient descent algorithm.
- Implementing `Into` and `From` for `Matrix`, `Vector`, and `MatrixSlice`s.
- Bug fix in naive bayes: it no longer attempts to update an empty class.
- Removing unneeded trait bounds on `Matrix`/`Vector` implementations.
- The `new` constructors for `Matrix` and `Vector` now take an `Into<Vec>` generic type. May break some type inference (see the sketch after this list).
- Added row iterators for each matrix struct.
- Implemented OpAssign overloading for `Matrix` and `MatrixSliceMut`.
- Moved unit tests into respective modules.
- Modified slice iterators to make the `offset` usage safe(r).
- Removed some compiler warnings from the tests.
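What the `Into<Vec>` bound allows, sketched:

```rust
use rusty_machine::linalg::{Matrix, Vector};

fn main() {
    // `new` now accepts anything convertible into a Vec: a Vec, an array, etc.
    let from_vec = Matrix::new(2, 2, vec![1.0, 2.0, 3.0, 4.0]);
    let from_array = Vector::new([1.0, 2.0, 3.0]);

    println!("{:?} {:?}", from_vec, from_array);
}
```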
- None
- `Matrix` and `Vector` now implement PartialEq.
- Fixed a bug where eigendecomposition for 2x2 matrices was incorrect.
- None
- None
- Fixing a bug with matrix slice multiplication.
- Removing unneeded NumCast import.
- None
- Adding Naive Bayes classifiers.
- Adding a prelude for common imports.
- Adding MatrixSlice and MatrixSliceMut for efficient matrix views.
- Using matrixmultiply to get huge performance gains! Thanks bluss.
- Code refactor to split up the matrix module.
- vishalsodani (fixing some typos)
- danlrobertson (added the `KMeansClassifierBuilder`)
- None
- `KMeansClassifier` now has a builder!
- We're now using travis for CI.
- Deriving Debug, Clone, Copy for Gaussian and Exponential distributions.
- `mut_data` method now returns a mutable slice `&mut [T]` instead of a `Vec<T>` (see the sketch after this list).
- More vectorization and optimization of linear algebra.
- Copy and Clone now implemented where applicable.
- Added test coverage.
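A sketch of the new `mut_data` return type, shown against the current re-exports (the `BaseMatrixMut` import may not be needed in older versions):

```rust
use rusty_machine::linalg::{BaseMatrixMut, Matrix};

fn main() {
    let mut m = Matrix::new(2, 2, vec![1.0, 2.0, 3.0, 4.0]);

    {
        // A mutable view of the underlying buffer, not an owned Vec<T>.
        let data: &mut [f64] = m.mut_data();
        data[0] = 10.0;
    }

    println!("{:?}", m);
}
```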
- zackmdavis (contributed all features for this version, thank you!)
- None
- Can now debug print matrices and vectors.
- Can now pretty print matrices to a given precision.
- Fixed the dependency versions used in Cargo.toml.
- Updated the library documentation with complete list of ML tools.
- None
- Addition of Gaussian Mixture Models.
- Allow basic arithmetic to combine kernels.
- Added some missing documentation.
- Some code formatting.
- Minor improvements thanks to clippy.
- Neural network instantiation: the `new` method now requires a training algorithm to be specified (see the sketch after this list).
- Adding more kernels (for full list see API documentation).
- Generalized Linear Model.
- Updated model structures to allow more freedom in training algorithms.
- Some more documentation.
- Some minor code formatting.
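Roughly what this means for callers, sketched against the present-day constructor (layer sizes and data here are arbitrary):

```rust
use rusty_machine::learning::nnet::{BCECriterion, NeuralNet};
use rusty_machine::learning::optim::grad_desc::StochasticGD;
use rusty_machine::learning::SupModel;
use rusty_machine::linalg::Matrix;

fn main() {
    // Tiny XOR-style data set.
    let inputs = Matrix::new(4, 2, vec![0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0]);
    let targets = Matrix::new(4, 1, vec![0.0, 1.0, 1.0, 0.0]);

    // The training algorithm (here stochastic gradient descent) must now be given up front.
    let layers = &[2, 3, 1];
    let mut net = NeuralNet::new(layers, BCECriterion::default(), StochasticGD::default());

    net.train(&inputs, &targets).unwrap();
    let outputs = net.predict(&inputs).unwrap();
    println!("{:?}", outputs);
}
```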
- None
- Add Support Vector Machines.
- Minor code cleanup.
- Some micro optimization.
- None
- Added the stats module behind the optional feature flag `stats`. `stats` currently includes support for the Exponential and Gaussian distributions.
- Some rustfmt code cleanup.
- Removed the `new` constructor for the `LinRegressor`. This has been replaced by the `default` function from the `Default` trait.
- Added a `select` method for cloning a block from a matrix (see the sketch below).
- Implemented QR decomposition and eigenvalue decomposition.
- Implemented eigendecomposition (though it is only guaranteed to work for real-symmetric matrices).
- Optimizations to matrix multiplication.
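A sketch of `select`, shown with today's rulinalg re-exports (the `BaseMatrix` trait import may not be needed in older versions):

```rust
use rusty_machine::linalg::{BaseMatrix, Matrix};

fn main() {
    let m = Matrix::new(3, 3, (0..9).map(|x| x as f64).collect::<Vec<_>>());

    // Clone the block formed by rows 0 and 2 and columns 1 and 2.
    let block = m.select(&[0, 2], &[1, 2]);
    println!("{:?}", block);
}
```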