Your Company's Neural Network Is Growing Right Now—Whether You're Managing It or Not
Neural networks are grown, not built—and we’re witnessing the emergence of hybrid organizational intelligence shaped by patterns of human-AI interaction that neither could create alone.
In my previous post, I explored Evan Ratliff’s experiment with an AI-only company, a cautionary tale about agents optimizing themselves into dysfunction. But beneath that warning lies something more insightful: we’re not just automating work, we’re growing new forms of organizational intelligence through patterns of human-AI interaction that have never existed before.
Neural networks are grown, not built. And every organization deploying AI is now cultivating a hybrid neural network that learns through the dance between human judgment and machine pattern recognition. These new patterns of creative intelligence transcend what either could achieve independently.
The question isn’t whether this is happening; it’s whether we’ll nurture it consciously or let it grow wild.
The Emergence of Hybrid Intelligence
Consider the ideal state of how AI-agent-infused customer service is supposed to operate today: An AI agent handles initial triage, recognizing patterns across thousands of interactions. A human specialist intervenes when the situation requires judgment. The AI observes how the human resolved the ambiguity. The human notices patterns the AI surfaced that weren’t previously visible. Both adjust their behavior based on this interaction.
Ideally, this isn’t just collaboration; it’s co-evolution. The organizational neural network forming through these interactions isn’t human intelligence augmented by AI, nor AI intelligence supervised by humans. It’s a genuinely new form of collective intelligence, with pathways that flow through both biological and artificial nodes, strengthened by patterns neither would reinforce alone.
The human learns to recognize which patterns matter for escalation. The AI learns which contextual factors trigger human intervention. Together, they develop organizational capabilities that exist only in the interaction space between them, a kind of dark energy: intelligence that lives in the pattern itself, not in either node individually.
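This co-evolving escalation loop can be sketched in miniature. Everything here is illustrative: the class names, the 0.6 threshold, and the simple update rule are assumptions made for the sake of the sketch, not any real product’s API.

```python
class TriageAgent:
    """Toy AI node: routes tickets and learns which topics need a human."""

    def __init__(self, escalation_threshold=0.6):
        self.escalation_threshold = escalation_threshold
        self.escalation_rate = {}  # learned odds that a topic needs human judgment

    def confidence(self, topic):
        # Unseen topics start maximally uncertain.
        return 1.0 - self.escalation_rate.get(topic, 0.5)

    def handle(self, topic, human):
        if self.confidence(topic) >= self.escalation_threshold:
            return f"auto-resolved: {topic}"
        # Escalate: the human resolves it, and the agent observes the outcome.
        resolution, needed_human = human.resolve(topic)
        self.learn(topic, needed_human)
        return resolution

    def learn(self, topic, needed_human, lr=0.3):
        # Reinforce the pattern: did this topic actually require judgment?
        prior = self.escalation_rate.get(topic, 0.5)
        self.escalation_rate[topic] = prior + lr * (float(needed_human) - prior)


class HumanSpecialist:
    """Toy human node: resolves escalations and notices surfaced patterns."""

    def __init__(self):
        self.seen_patterns = set()

    def resolve(self, topic):
        self.seen_patterns.add(topic)  # the human learns what the AI surfaces
        needed_judgment = topic in {"refund dispute", "policy exception"}
        return f"human-resolved: {topic}", needed_judgment


agent, human = TriageAgent(), HumanSpecialist()
print(agent.handle("password reset", human))  # human-resolved: topic is unseen
print(agent.handle("password reset", human))  # auto-resolved: pattern learned
print(agent.handle("refund dispute", human))  # human-resolved: judgment needed
```

The point of the sketch is the feedback in both directions: the agent’s routing table and the human’s repertoire of seen patterns change only through their interaction, not in either node alone.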
The Patterns That Shape Intelligence
In traditional organizations, intelligence was grown through apprenticeship whereby junior employees learned by observing senior ones, patterns reinforced through repeated exposure. The organizational neural network formed slowly, through human-to-human interaction patterns that took years to establish.
AI fundamentally accelerates and transforms this growth. Patterns that would take months to reinforce through human observation can strengthen in days through repeated human-AI interaction. But speed isn’t the only change; the patterns themselves are different.
When humans train humans, we transfer not just what to do but why: the underlying reasoning, the edge cases, the judgment calls. When humans interact with AI, we’re forced to externalize patterns we normally wouldn’t articulate. Today’s AI can’t reliably infer context, so humans must make it explicit. This externalization changes the pattern itself: what was tacit becomes encoded, what was intuitive becomes structured.
Meanwhile, AI surfaces patterns humans never see: correlations across thousands of interactions, subtle signals in customer behavior, second-order effects of policy changes. When humans engage with these AI-discovered patterns and incorporate them into their judgment, they’re growing organizational intelligence that transcends the limits of human pattern recognition.
The resulting neural network combines human capacity for contextual judgment with AI capacity for cross-domain pattern recognition. Neither could grow this intelligence alone.
The Evolution of Organizational Memory
Perhaps the most profound shift is in how organizational intelligence persists. In traditional organizations, memory lived primarily in people. When employees left, neural pathways weakened or disappeared entirely. The organization had to constantly regrow its intelligence as turnover occurred.
With hybrid human-AI neural networks, memory distributes differently. Some patterns persist in AI systems, which, assuming they aren’t retrained away, don’t forget, don’t take vacations, and don’t change jobs. Other patterns remain distinctly human: the contextual judgment, the relationship understanding, the value frameworks that determine when rules should bend.
But the most interesting patterns live in the interaction space: the learned dance of when to escalate, when to override, when to trust, when to question. These patterns exist only in the dynamic between human and AI nodes, and they form the most valuable organizational intelligence. This dark energy is a massive hidden source of value.
This creates both possibility and peril. The possibility is that organizations can develop and maintain sophisticated intelligence that survives individual employee transitions. The peril is that when those patterns reinforce poor judgment, they persist and strengthen rather than naturally degrading, as bad human habits eventually do.
Designing for Emergence
The visionary challenge isn’t to architect these neural networks; they’re organic systems that will grow regardless. The challenge is to create conditions where the intelligence that emerges through human-AI interaction patterns serves genuine value for everyone rather than just optimization metrics.
This requires a fundamentally different approach to organizational design. Not command-and-control, not even human-centered design exactly, but cultivation of the conditions where beneficial patterns can emerge and harmful patterns can be recognized and pruned.
It means treating every human-AI interaction as an opportunity to grow intelligence, not just to complete a transaction. It means measuring network health as the quality of pattern formation, and not just network output. It means preserving space for pattern diversity, knowing that the richest neural networks maintain multiple pathways rather than converging on single optimized solutions.
Most importantly, it means accepting that we’re no longer designing systems that do what we specify. We’re cultivating living systems that will develop capabilities we didn’t explicitly program, through patterns of interaction we can influence but not control.
The Frontier of Organizational Evolution
We’re at the threshold of something genuinely new: organizations whose intelligence is neither purely human nor purely artificial, but emerges from patterns of interaction between the two. This hybrid intelligence has capabilities neither could achieve alone: the scale and pattern recognition of AI combined with the judgment and values of human cognition.
But neural networks are grown, not built. Which means the intelligence your organization develops will be shaped by the patterns you allow to form through repeated human-AI interaction. Every escalation protocol, every override decision, every feedback loop is training your organizational neural network.
The visionaries of this era won’t be those who deploy the most sophisticated AI. They’ll be those who understand they’re cultivating new forms of intelligence, and who design the conditions where that intelligence can grow toward human flourishing rather than just mechanical efficiency.
We’ve spent centuries building organizations as machines. We’re now learning to grow them as gardens, where human and artificial intelligence intertwine through patterns of interaction, creating something neither could produce alone.
The question is what we’ll choose to grow.



This framing of distributed organizational memory is brilliant, but it raises a tension you touch on only lightly: while human memory degrades naturally, these hybrid patterns could fossilize. The risk isn't just that poor judgment persists; it's that the interaction patterns themselves become harder to question once they're encoded in both human habit and machine logic. Organizations might need deliberate mechanisms for unlearning, a kind of scheduled entropy to prevent premature convergence on local maxima that look efficient but miss deeper value creation.
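One way to picture that "scheduled entropy" is a periodic decay of learned routing confidence back toward uncertainty, so encoded patterns must re-earn their place through fresh human review. This is a hypothetical mechanism sketched in Python; the function name, the half-life rule, and the numbers are all assumptions, not an established technique.

```python
def decay_toward_uncertainty(learned_rates, half_life_periods=10, prior=0.5):
    """Pull each learned rate partway back toward the uncertain prior.

    After `half_life_periods` applications, a rate has moved half of the
    way back to `prior`, so stale patterns start triggering human review
    again instead of fossilizing into automatic behavior.
    """
    keep = 0.5 ** (1.0 / half_life_periods)  # per-period retention factor
    return {topic: prior + keep * (rate - prior)
            for topic, rate in learned_rates.items()}


# Hypothetical learned escalation odds after heavy reinforcement.
rates = {"refund dispute": 0.95, "password reset": 0.05}
for _ in range(10):  # ten review periods with no fresh evidence
    rates = decay_toward_uncertainty(rates)
print(rates)  # both rates are now halfway back toward 0.5
```

The design choice worth noting: patterns that keep receiving fresh evidence can be re-reinforced and stay strong, while patterns nobody revisits drift back into the escalation zone.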