Unifying language models with knowledge graphs

Unifying large language models (LLMs) and knowledge graphs (KGs) can address key shortcomings of LLMs, such as limited factual knowledge, hallucinations, and lack of interpretability. Integrating LLMs with KGs enhances accuracy, contextual understanding, and scalability by leveraging structured, interconnected data.

In this presentation, the main aspects discussed are:
1. KG-enhanced LLMs, which incorporate KGs during the pre-training and inference phases of LLMs
2. LLM-augmented KGs, which leverage LLMs for KG completion through entity discovery, relation extraction, end-to-end KG construction, and distilling KGs from LLMs
3. Synergized LLMs and KGs, a bidirectional approach focused on knowledge representation and reasoning.
GraphRAG techniques are designed to extract meaningful, structured data from unstructured text using an LLM in order to deliver substantial improvements in generation. The presentation elaborates on graph RAG pipelines built with open-source language models, especially small language models (SLMs), in combination with the Neo4j graph database.
It also discusses the evaluation of graph RAG in the context of generating educational pathways for school students aspiring to higher education.
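
To illustrate the kind of pipeline described above, here is a minimal graph RAG sketch in Python: an LLM extracts triples from text, the triples are stored in Neo4j via the official Python driver, and graph facts are retrieved to ground an answer. The `call_slm` function, the connection settings, and the `Entity`/`REL` schema are illustrative placeholders, not the pipeline presented in the talk.

```python
# Minimal GraphRAG sketch (illustrative only): extract triples with an SLM,
# store them in Neo4j, retrieve graph facts, and ground the final answer.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # assumed local Neo4j instance
AUTH = ("neo4j", "password")    # placeholder credentials


def call_slm(prompt: str) -> str:
    """Placeholder for whatever open-source small language model is used."""
    raise NotImplementedError


def extract_triples(text: str) -> list[tuple[str, str, str]]:
    """Ask the SLM for pipe-separated triples and parse its reply."""
    reply = call_slm(
        "Extract (subject | relation | object) triples, one per line:\n" + text
    )
    triples = []
    for line in reply.splitlines():
        parts = [p.strip() for p in line.split("|")]
        if len(parts) == 3:
            triples.append(tuple(parts))
    return triples


def store_triples(driver, triples):
    """MERGE each triple so repeated facts are not duplicated in the graph."""
    query = (
        "MERGE (s:Entity {name: $s}) "
        "MERGE (o:Entity {name: $o}) "
        "MERGE (s)-[:REL {type: $r}]->(o)"
    )
    with driver.session() as session:
        for s, r, o in triples:
            session.run(query, s=s, r=r, o=o)


def retrieve_context(driver, entity: str, limit: int = 10) -> str:
    """Pull facts mentioning the entity to ground the generated answer."""
    query = (
        "MATCH (s:Entity {name: $name})-[r:REL]->(o:Entity) "
        "RETURN s.name AS s, r.type AS rel, o.name AS o LIMIT $limit"
    )
    with driver.session() as session:
        rows = session.run(query, name=entity, limit=limit)
        return "\n".join(f"{row['s']} {row['rel']} {row['o']}" for row in rows)


def answer(driver, question: str, entity: str) -> str:
    """Prepend retrieved graph facts to the question before generation."""
    context = retrieve_context(driver, entity)
    return call_slm(f"Context:\n{context}\n\nQuestion: {question}\nAnswer:")


if __name__ == "__main__":
    # Requires a running Neo4j instance and a real call_slm implementation.
    driver = GraphDatabase.driver(URI, auth=AUTH)
    store_triples(driver, extract_triples("A B.Sc. in Physics prepares students for an M.Sc. in Astrophysics."))
    print(answer(driver, "What can follow a B.Sc. in Physics?", "B.Sc. in Physics"))
    driver.close()
```

Because Cypher does not allow parameterized relationship types, the sketch stores the relation name as a property on a generic REL edge; a production pipeline would likely map relations to typed edges instead.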

Finally, the presentation closes with a short summary and an outlook on further developments.

 

About the speaker

Aditya

Co-Owner at Naavi Network

NLP Summit

When

Online Event: September 24, 2024

 

Contact

nlpsummit@johnsnowlabs.com

Presented by

John Snow Labs