Rational-Hallucination

An experimental Python-Prolog hybrid model that explores adding reasoning capabilities to LLMs using Prolog knowledge bases.
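The idea is to keep facts and rules in a Prolog knowledge base and let Prolog's inference engine answer queries, with Python (and the LLM) sitting on top. The snippet below is a minimal sketch of that fact/rule/inference loop, assuming the pyswip bridge to SWI-Prolog; the repository may connect Python and Prolog differently.

```python
# Minimal sketch of the fact/rule/inference idea.
# Assumes SWI-Prolog plus the pyswip bridge (pip install pyswip);
# the actual bridge used in this project may differ.
from pyswip import Prolog

prolog = Prolog()

# Facts: ground statements the system can rely on.
prolog.assertz("parent(tom, bob)")
prolog.assertz("parent(bob, ann)")

# Rule: grandparents are derived by inference, not stored directly.
prolog.assertz("grandparent(X, Z) :- parent(X, Y), parent(Y, Z)")

# Inference: query the knowledge base instead of asking the LLM to guess.
for result in prolog.query("grandparent(tom, Who)"):
    print(result["Who"])  # -> ann
```

In this setup the LLM's role would be to translate natural-language statements into facts, rules, and queries, while the Prolog engine performs the actual deduction.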

License

This project is licensed under GPL-2 and is copyrighted by Ahmed Khalil Hafsi. Any project that depends on or uses this code must also be open source. See LICENSE for details.

About

An experimental Python-Prolog hybrid model that explores adding reasoning capabilities to LLMs using a fact, rule, and inference framework.
