Research on integrating datalog & lambda calculus via monotonicity types
Updated Jun 21, 2022 - TeX
Constrained deep learning is an advanced approach to training deep neural networks by incorporating domain-specific constraints into the learning process.
Universal Dependency polarization for monotonicity-based natural language inference
Summaries and annotations of research papers across a broad spectrum of AI and ML.
Experiments for the ACL 2021 Findings paper "Language Models Use Monotonicity to Assess NPI Licensing"
A library for quickchecking lattice modules and associated operations
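The repository's actual API is not shown here, but quickchecking a lattice module generally means testing the lattice laws on randomly drawn elements. A minimal sketch, using integers with `min` as meet and `max` as join as a sample lattice (the function name and signature are hypothetical):

```python
import random

def check_lattice_laws(meet, join, samples, trials=1000):
    """Property-test the lattice laws on random triples from `samples`.
    Hypothetical sketch; meet = greatest lower bound, join = least upper bound."""
    samples = list(samples)
    for _ in range(trials):
        a, b, c = (random.choice(samples) for _ in range(3))
        assert meet(a, a) == a and join(a, a) == a          # idempotence
        assert meet(a, b) == meet(b, a)                     # commutativity
        assert join(a, b) == join(b, a)
        assert meet(a, meet(b, c)) == meet(meet(a, b), c)   # associativity
        assert join(a, meet(a, b)) == a                     # absorption
        assert meet(a, join(a, b)) == a
    return True

# Integers under min/max form a lattice, so all laws should hold.
print(check_lattice_laws(min, max, range(100)))  # True
```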
This repository contains the code to reproduce all of the results in our paper: Making Learners (More) Monotone, T J Viering, A Mey, M Loog, IDA 2020.
Quantifiers and monotonicity in reasoning tasks
Code of "Not Too Close and Not Too Far: Enforcing Monotonicity Requires Penalizing The Right Points"
Evaluate ranked-choice elections in a notebook interface. Imports a wide range of elections and detects non-monotonic results.
Implements an AutoIncrement counter class similar to PostgreSQL's sequence.
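A PostgreSQL sequence hands out strictly increasing values via `nextval()`. A minimal sketch of what such a counter class might look like (the class name and methods here are hypothetical, not the repository's actual API):

```python
class AutoIncrement:
    """Sketch of a sequence-like counter: monotonically increasing,
    configurable start and step, mirroring PostgreSQL's nextval()."""

    def __init__(self, start=1, step=1):
        self._next = start
        self._step = step

    def next_value(self):
        # Return the current value and advance the counter,
        # so successive calls never repeat or decrease.
        value = self._next
        self._next += self._step
        return value

seq = AutoIncrement(start=1)
print(seq.next_value())  # 1
print(seq.next_value())  # 2
```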
JumpBackHash: Say Goodbye to the Modulo Operation to Distribute Keys Uniformly to Buckets
Techniques for data mining.
General Constraint Regression Models
A simple program demonstrating O(n*log(n)) search on a monotonic matrix, versus the O(n**2) search required for a non-monotonic matrix.
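One reading of "monotonic matrix" that yields the stated bounds is a matrix whose rows are each sorted: binary-searching each of the n rows costs O(n*log(n)), while an unsorted matrix forces the full O(n**2) scan. A sketch under that assumption (the function name is illustrative, not the repository's code):

```python
from bisect import bisect_left

def search_monotonic(matrix, target):
    """Return (row, col) of target in a matrix with sorted rows,
    or None. One binary search per row: O(n*log(n)) overall,
    versus O(n**2) for an element-by-element scan."""
    for i, row in enumerate(matrix):
        j = bisect_left(row, target)
        if j < len(row) and row[j] == target:
            return (i, j)
    return None

m = [[1, 3, 5],
     [2, 4, 6],
     [7, 8, 9]]
print(search_monotonic(m, 4))   # (1, 1)
print(search_monotonic(m, 10))  # None
```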
This repository contains my seminar work (literature review) for Topics in Machine Learning, Pattern Recognition at Paderborn University. Each topic lives in a separate folder named after that topic.
IBM AI explainability