Semantic memory—our organized knowledge about the world—is perhaps the most distinctly human cognitive capacity. It allows us to know that Paris is in France without remembering when we learned it, and to generalize that birds can fly even though we have never encountered every species.
The Structure of Semantic Knowledge
Cognitive scientists have long debated how semantic memory is organized. Feature-based models suggest we store concepts as bundles of attributes. Network models propose that concepts are nodes connected by relationships. Prototype models argue we organize around typical examples.
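The three proposals can be made concrete with a toy sketch. The concepts, features, and relations below are invented for illustration, not drawn from any particular dataset or model; the point is only to show how the same knowledge looks as feature bundles, as a relational network, and as a prototype.

```python
from collections import Counter

# Feature-based view: a concept is a bundle of attributes.
# (Hypothetical toy data for illustration.)
FEATURES = {
    "robin":   {"has_wings", "flies", "sings", "animal"},
    "penguin": {"has_wings", "swims", "animal"},
    "bat":     {"has_wings", "flies", "animal", "nocturnal"},
}

def feature_overlap(a, b):
    """Similarity as shared attributes (Jaccard index)."""
    fa, fb = FEATURES[a], FEATURES[b]
    return len(fa & fb) / len(fa | fb)

# Network view: concepts are nodes connected by labeled relations.
NETWORK = {
    ("robin", "is_a"):   "bird",
    ("penguin", "is_a"): "bird",
    ("bird", "is_a"):    "animal",
    ("bird", "can"):     "fly",
}

def inherits(concept, relation, value):
    """Walk is_a links, so 'robin can fly' follows from 'birds can fly'."""
    node = concept
    while node is not None:
        if NETWORK.get((node, relation)) == value:
            return True
        node = NETWORK.get((node, "is_a"))
    return False

# Prototype view: a category is summarized by its typical member.
def prototype(members):
    """Features shared by a majority of members form the prototype."""
    counts = Counter(f for m in members for f in FEATURES[m])
    return {f for f, c in counts.items() if c > len(members) / 2}
```

Each representation supports different inferences: feature overlap gives graded similarity, the network supports inherited facts, and the prototype predicts typicality effects. Note that `prototype(["robin", "penguin", "bat"])` includes `"flies"` even though penguins do not fly, which is exactly the kind of typicality-driven generalization prototype models are meant to capture.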
Most likely, semantic memory draws on all of these strategies, dynamically selecting the most useful representation for the task at hand.
Implications for Machine Knowledge
Current AI systems predominantly use embedding spaces—high-dimensional vectors that capture semantic similarity. This is closer to feature-based models than to network models. But human cognition suggests we may need hybrid approaches that can represent both graded similarity and explicit structured relationships.
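A minimal sketch of such a hybrid, under invented assumptions: the tiny three-dimensional vectors and the `capital_of` relation below are made up for illustration (real embeddings are learned and high-dimensional). The idea is simply to pair a similarity space with an explicit relation store, so the system can answer both "how related are these?" and "what exactly is the relationship?".

```python
import math

# Hypothetical toy embeddings; a real system would learn these.
EMBED = {
    "paris":  [0.9, 0.1, 0.3],
    "london": [0.8, 0.2, 0.3],
    "france": [0.7, 0.6, 0.1],
}

# Explicit relations the vectors alone cannot express reliably.
RELATIONS = {("paris", "capital_of"): "france"}

def cosine(a, b):
    """Standard cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def related(x, y):
    """Graded similarity from the embedding space."""
    return cosine(EMBED[x], EMBED[y])

def query(x, relation):
    """Exact structured lookup; None when the fact is not stored."""
    return RELATIONS.get((x, relation))
```

Here the embedding side would say Paris is more similar to London (another capital city) than to France, while only the relational side can state that Paris is the capital *of* France—the distinction between similarity and structure that the hybrid is meant to preserve.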
