Listen "Graph of Thoughts"
Episode Synopsis
This episode explores Graph of Thoughts (GoT), a prompting scheme designed to enhance the reasoning abilities of large language models (LLMs). GoT is compared to other methods such as Chain-of-Thought (CoT), Self-Consistency with CoT (CoT-SC), and Tree of Thoughts (ToT). GoT improves performance through thought transformations such as aggregation, which enable larger thought volumes (the volume of a thought is the number of previous thoughts that can influence it). This gives GoT a better trade-off between latency (the number of reasoning steps) and volume, which translates into stronger task performance.

The episode also discusses GoT's practical applications, including set intersection, keyword counting, and document merging, with specific examples and prompts for each. GoT consistently outperforms the other prompting schemes in both accuracy and cost, demonstrating how its graph-based structure supports more complex and flexible reasoning.

Paper: https://arxiv.org/pdf/2308.09687
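To make the aggregation and volume ideas concrete, here is a minimal Python sketch of a thought graph. All names here (Thought, generate, aggregate, volume) are illustrative assumptions for this episode page, not the API of the paper's official codebase; in a real system, generate and aggregate would call an LLM rather than concatenate strings.

```python
# Minimal sketch of a Graph-of-Thoughts structure, under the assumptions
# stated above. Thoughts form a DAG; aggregation merges several thoughts
# into one, which is what lets a node's "volume" (ancestor count) grow.

from dataclasses import dataclass, field


@dataclass
class Thought:
    """A single LLM-generated partial solution (a node in the graph)."""
    content: str
    parents: list["Thought"] = field(default_factory=list)

    def volume(self) -> int:
        """Number of distinct earlier thoughts that can influence this one,
        i.e. the size of this node's ancestor set in the DAG."""
        seen: set[int] = set()
        stack = list(self.parents)
        while stack:
            t = stack.pop()
            if id(t) not in seen:   # id() used because Thought is unhashable
                seen.add(id(t))
                stack.extend(t.parents)
        return len(seen)


def generate(parent: Thought, k: int) -> list[Thought]:
    """Branching, as in Tree of Thoughts: spawn k refinements of one thought.
    A real system would prompt an LLM here; we fake the content."""
    return [Thought(f"{parent.content}->gen{i}", [parent]) for i in range(k)]


def aggregate(parents: list[Thought]) -> Thought:
    """Aggregation, the transformation specific to GoT: merge several
    thoughts into one, so the new node inherits all their ancestors."""
    merged = " + ".join(p.content for p in parents)
    return Thought(f"agg({merged})", list(parents))


if __name__ == "__main__":
    root = Thought("problem")
    branches = generate(root, 3)   # three independent partial solutions
    final = aggregate(branches)    # merge them back into a single thought
    # The aggregated thought is influenced by the root and all 3 branches:
    print(final.volume())          # -> 4
```

The design point the episode highlights is visible in the last lines: a tree (ToT) can only ever narrow to one branch, while aggregation lets the final thought draw on every branch at once, raising volume without adding depth (latency).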
More episodes of the podcast Agentic Horizons
AI Storytelling with DOME (19/02/2025)
Intelligence Explosion Microeconomics (18/02/2025)
Theory of Mind in LLMs (15/02/2025)
Designing AI Personalities (14/02/2025)
LLMs Know More Than They Show (12/02/2025)
AI Self-Evolution Using Long Term Memory (10/02/2025)