Fix biblio
gdalle committed Feb 22, 2024
1 parent f28be9b commit 95dbee6
Showing 2 changed files with 16 additions and 15 deletions.
27 changes: 14 additions & 13 deletions paper/HMM.bib
@@ -80,6 +80,18 @@ @book{cappeInferenceHiddenMarkov2005
file = {/home/gdalle/snap/zotero-snap/common/Zotero/storage/2HYZE7ZD/Cappé et al_2005_Inference in Hidden Markov Models.pdf;/home/gdalle/snap/zotero-snap/common/Zotero/storage/QRNV9CL8/Cappé et al. - 2006 - Inference in Hidden Markov Models.pdf}
}

+@software{changDynamaxStateSpace2024,
+  title = {Dynamax: {{State Space Models}} Library in {{JAX}}},
+  author = {Chang, Peter and Harper-Donnelly, Giles and Kara, Aleyna and Li, Xinglong and Linderman, Scott and Murphy, Kevin},
+  date = {2024-02-22T04:10:59Z},
+  origdate = {2022-04-11T23:42:29Z},
+  url = {https://github.com/probml/dynamax},
+  urldate = {2024-02-22},
+  abstract = {State Space Models library in JAX},
+  organization = {{Probabilistic machine learning}},
+  keywords = {hmm}
+}

@thesis{dalleMachineLearningCombinatorial2022,
type = {phdthesis},
title = {Machine Learning and Combinatorial Optimization Algorithms, with Applications to Railway Planning},
@@ -97,9 +109,9 @@ @thesis{dalleMachineLearningCombinatorial2022
file = {/home/gdalle/snap/zotero-snap/common/Zotero/storage/CEVJMUP4/Dalle - Machine learning and combinatorial optimization al.pdf}
}

-@software{hmmlearnHmmlearnHiddenMarkov2023,
+@software{hmmlearndevelopersHmmlearnHiddenMarkov2023,
   title = {Hmmlearn: {{Hidden Markov Models}} in {{Python}}, with Scikit-Learn like {{API}}},
-  author = {{hmmlearn}},
+  author = {{hmmlearn developers}},
date = {2023},
url = {https://github.com/hmmlearn/hmmlearn},
urldate = {2023-09-12},
@@ -139,17 +151,6 @@ @unpublished{ondelGPUAcceleratedForwardBackwardAlgorithm2021
file = {/home/gdalle/snap/zotero-snap/common/Zotero/storage/XRKC5QBG/Ondel et al. - 2021 - GPU-Accelerated Forward-Backward Algorithm with Ap.pdf}
}

-@software{ProbmlDynamax2024,
-  title = {Probml/Dynamax},
-  date = {2024-02-22T04:10:59Z},
-  origdate = {2022-04-11T23:42:29Z},
-  url = {https://github.com/probml/dynamax},
-  urldate = {2024-02-22},
-  abstract = {State Space Models library in JAX},
-  organization = {{Probabilistic machine learning}},
-  keywords = {hmm}
-}

@article{qinDirectOptimizationApproach2000,
title = {A {{Direct Optimization Approach}} to {{Hidden Markov Modeling}} for {{Single Channel Kinetics}}},
author = {Qin, Feng and Auerbach, Anthony and Sachs, Frederick},
4 changes: 2 additions & 2 deletions paper/paper.md
@@ -48,8 +48,8 @@ In this industrial use case, the observations were marked temporal point process

Unfortunately, nearly all implementations of HMMs we surveyed (in Julia and Python) expect the observations to be generated by a _predefined set of distributions_, with _no control dependency_.
In Julia, the reference package `HMMBase.jl` [@mouchetHMMBaseJlHidden2023] requires compliance with the `Distributions.jl` [@besanconDistributionsJlDefinition2021] interface, which precludes anything not scalar- or array-valued, let alone point processes.
-In Python, `hmmlearn` [@hmmlearnHmmlearnHiddenMarkov2023], `pomegranate` [@schreiberPomegranateFastFlexible2018a] each offer a catalogue of discrete and continuous distributions, but do not allow for easy extension by the user.
-The more recent `dynamax` [@ProbmlDynamax2024] is the only package adopting an extensible interface with optional controls, similar to ours.
+In Python, `hmmlearn` [@hmmlearndevelopersHmmlearnHiddenMarkov2023], `pomegranate` [@schreiberPomegranateFastFlexible2018] each offer a catalogue of discrete and continuous distributions, but do not allow for easy extension by the user.
+The more recent `dynamax` [@changDynamaxStateSpace2024] is the only package adopting an extensible interface with optional controls, similar to ours.

Focusing on Julia specifically, other downsides of `HMMBase.jl` include the lack of support for _multiple observation sequences_ or _sparse transition matrices_, and the mandatory use of _64-bit floating point numbers_.
Two other packages provide functionalities that `HMMBase.jl` lacks: `HMMGradients.jl` [@antonelloHMMGradientsJlEnables2021] contains a _differentiable loglikelihood function_, while `MarkovModels.jl` [@ondelGPUAcceleratedForwardBackwardAlgorithm2021] focuses on GPU acceleration.
