# World Model, Abstraction, Analogy - Research Base

* General Content
  * Melanie Mitchell book
* Analogy
  * [Melanie Mitchell: Concepts, Analogies, Common Sense & Future of AI | Lex Fridman Podcast #61 - YouTube](https://www.youtube.com/watch?v=ImKkaeUx1MU)
  * See Hofstadter
  * It seems that a large part of creativity comes from analogies. What do analogies require? Do they require a world model of sorts?
  * Could an LLM learn analogies? Can LLMs reason by analogy and apply it across domains (e.g., in physics, using the analogy of water flow in pipes to understand current in circuits)?
* World Model
  * Is learning a world model required? Can a transformer (with a predict-next-word architecture) learn a world model? See [Actually, Othello-GPT Has A Linear Emergent World Representation — LessWrong](https://www.lesswrong.com/s/nhGNHyJHbrofpPbRG/p/nmxzr2zsjNtjaHh7x)
  * Could an LLM create good internal concepts to utilize, and does it already have them? Does it need them?
  * A world model could allow for mental simulation and creative thinking (NGI pg 30)
  * MuZero actually *learns an environment model* ([MuZero: Mastering Atari, Go, Chess and Shogi by Planning with a Learned Model - YouTube](https://www.youtube.com/watch?v=We20YSAJZSE))

* Analogy
  1. Douglas Hofstadter is a cognitive scientist who has argued that human thought is based on analogy-making [1][3][5]. According to Hofstadter, cognition *is* analogy-making: we are constantly seeing, in new situations, the deep essence we are familiar with from previous experience [2]. He believes that analogy pervades every aspect of cognition, from simple everyday activities to deep scientific discoveries [5]. Hofstadter's ideas are so original and off-kilter that few yet know what to make of them [6]. He has written several books on the topic, including "Fluid Concepts and Creative Analogies" [6]. In summary, Hofstadter believes that analogy is the core of cognition and that it plays a fundamental role in human thought.
  2. Sources:
     1. https://www.qualcomm.com/news/onq/2022/09/Can-neural-networks-think-in-analogies
     2. https://www.solvingforpattern.org/2013/06/02/hofstadter-analogy-seeing-the-deep-essence/
     3. http://worrydream.com/refs/Hofstadter%20-%20Analogy%20as%20the%20Core%20of%20Cognition.pdf
     4. https://youtube.com/watch?v=n8m7lFQ3njk
     5. https://news2.rice.edu/2009/11/19/douglas-hofstadter-tells-rice-audience-analogy-is-key-to-cognition/
     6. https://www.wired.com/1995/11/kelly/
     7. Artificial Intelligence: A Guide for Thinking Humans (Melanie Mitchell)
* World Model
  1. [Actually, Othello-GPT Has A Linear Emergent World Representation — LessWrong](https://www.lesswrong.com/s/nhGNHyJHbrofpPbRG/p/nmxzr2zsjNtjaHh7x)
  2. [Large Language Model: world models or surface statistics?](https://thegradient.pub/othello/)
  3. [Actually, Othello-GPT Has A Linear Emergent World Representation — LessWrong](https://www.lesswrong.com/posts/nmxzr2zsjNtjaHh7x/actually-othello-gpt-has-a-linear-emergent-world#How_do_models_represent_features_) (section: "How do models represent features?")
  4. [How to check if a neural network has learned a specific phenomenon? - YouTube](https://www.youtube.com/watch?v=fL22NAtMNYo)
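The Othello-GPT result above rests on linear probing: if a network has an internal world model, world-state features (e.g. "this board square is mine") should be decodable from its hidden activations by a plain linear classifier. A minimal sketch of that idea on synthetic data — everything here (the dimensions, the injected feature direction) is an illustrative assumption, not taken from the post; with a real model the activations would be recorded from a forward pass instead:

```python
# Linear-probe sketch: can a world-state feature be read off activations
# with a linear classifier? (Synthetic stand-in for real model activations.)
import numpy as np

rng = np.random.default_rng(0)
d_model = 64       # width of a hypothetical residual stream (assumption)
n_samples = 2000

# Pretend the network encodes a binary world feature linearly along one
# fixed direction, on top of unrelated content -- the situation the
# Othello-GPT probes found.
feature_dir = rng.normal(size=d_model)
feature_dir /= np.linalg.norm(feature_dir)

labels = rng.integers(0, 2, size=n_samples)            # ground-truth world feature
acts = rng.normal(size=(n_samples, d_model))           # "activations": noise ...
acts += np.outer(3.0 * (2 * labels - 1), feature_dir)  # ... plus the linear feature

# Fit the probe: logistic regression via plain gradient descent.
w, b, lr = np.zeros(d_model), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(acts @ w + b)))          # predicted P(label = 1)
    w -= lr * acts.T @ (p - labels) / n_samples
    b -= lr * (p - labels).mean()

accuracy = ((acts @ w + b > 0) == labels).mean()
print(f"linear probe accuracy: {accuracy:.3f}")        # high => linearly decodable
```

The control experiment matters as much as the probe: on a model (or a feature) with no linear encoding, the same probe should fall to chance — high probe accuracy alone is evidence of a linearly represented feature, not of how the model uses it.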