In 1858, Queen Victoria sent a telegram to President James Buchanan across the first transatlantic cable. The message took sixteen hours to transmit, and the reply took ten. Both were celebratory, ceremonial, and empty. The line failed a few weeks later. Saltwater had leaked into the gutta-percha insulation, and the engineers had overcompensated by raising the voltage. The cable burned out, unnoticed, beneath the Atlantic. It took almost a decade before another would take its place.
But by then, the damage had already been done. Not to the technology, but to the idea of it. What had begun as a project to connect continents and enable faster diplomacy had quickly collapsed into a PR stunt, and then into a cautionary tale. Newspapers ran editorials lamenting the unreliability of the medium. Pessimists called it premature. Historians now look back and call it progress.
What interests me is not whether the early cable was a technical failure. It was. What matters more is that the idea was sound, and the system around it was brittle. This is the phenomenon I want to explore: when a good idea meets a system that can only partially contain it, it doesn’t merely fail. It distorts.
There is a metaphor in information theory that I keep returning to. In a noiseless channel, a signal can be transmitted perfectly. But in the real world, every channel has noise. Shannon's insight was that some noise can be corrected for, up to a point; past that point, what is lost is lost for good. Some channels degrade the signal predictably. Others destroy it irreversibly.
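You can watch the compounding in a few lines of code. Here is a minimal sketch, assuming a chain of binary symmetric channels: each relay flips bits with a small, fixed probability, the way a leaky cable garbles dots and dashes. Each hop is mild and predictable; the accumulation is not.

```python
import random

def transmit(bits, flip_prob):
    """Binary symmetric channel: flip each bit independently with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def fidelity(sent, received):
    """Fraction of bits that still match the original message."""
    return sum(s == r for s, r in zip(sent, received)) / len(sent)

random.seed(42)
message = [random.randint(0, 1) for _ in range(10_000)]

signal = message
for hop in range(1, 6):
    signal = transmit(signal, flip_prob=0.05)  # 5% noise at each relay
    print(f"hop {hop}: fidelity = {fidelity(message, signal):.3f}")

# Each hop alone costs little. Compounded, fidelity drifts toward 0.5,
# the point at which the output says nothing at all about the input.
```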
A good idea, when passed through a noisy system, emerges warped. An education reform becomes a standardized test. A public health campaign becomes a moral panic. A participatory platform becomes a surveillance economy. The original intent is not only lost—it becomes unintelligible.
There is a temptation to call this corruption. But that lets the system off too easily. Corruption implies agency, intent, a choice to betray the idea. Degradation is subtler. It happens through feedback loops, incentive misalignments, coordination problems, and time. It happens even when everyone involved believes they are doing their job.
Consider the fate of peer review. In theory, it is a mechanism for scientific self-correction. In practice, it is a gatekeeping structure prone to conservatism, nepotism, and conformity. The idea was noble. The system became noisy. Publishable results now trend toward the flashy, the novel, the confirmatory. The signal is still there. But it takes a trained eye to read through the noise.
The Social Panopticon of Innovation
This is how bureaucracies degrade innovation: A proposal must be framed in the language of ROI, impact metrics, and stakeholder alignment. Its risk profile must be quantifiable. Its outcome must be predictable. Its deliverables must be spreadsheet-compatible. Under these constraints, only a certain kind of idea can survive: the kind that mimics previous successes. And so the truly new gets filtered out by a series of well-meaning, rational, professional constraints.
A friend once described his job at a large foundation as "writing grant proposals for ideas we can't afford to execute." He meant that the only way to get funding was to present ideas as low-risk and high-certainty, which disqualified most of the projects they actually cared about. By the time the proposals were approved, they no longer resembled the original concepts.
In other words: the system degraded the signal, and everyone involved learned to transmit a different signal entirely.
Consider the internet. More specifically, consider what it could have been. In the 1990s, the web was imagined as a decentralized, permissionless publishing system. Anyone could post. Anyone could read. The idea was that information would flow freely, and good ideas would surface organically.
Fast forward thirty years, and the information environment resembles something closer to a casino than a library. You refresh your feed. You pull the lever. You get dopamine. Or not.
The good ideas are still there, but they are submerged. The system favors the clickable over the thoughtful, the enraging over the true, the repeated over the novel. The best content strategy is still outrage.
You can trace this back to simple incentives. Platforms monetize attention. Algorithms optimize for engagement. The result is not a deliberate distortion. It is a predictable one. Like overamplifying a telegraph signal until it fries the wire, platforms boost engagement until the ideas themselves start to burn out.
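None of this needs villains, which is easy to show with a toy model. In the sketch below, the click-through rates are invented and the ranker is a caricature of any real recommendation system; the point is only that a loop rewarding clicks, plus a small difference in clickability, is enough.

```python
import random

# Two content types with invented click-through rates for this sketch.
CLICK_PROB = {"thoughtful": 0.05, "outrage": 0.15}

random.seed(1)
score = {"thoughtful": 1.0, "outrage": 1.0}  # the ranker's learned engagement scores

for _ in range(5_000):
    # Mostly show whatever currently scores higher; explore 10% of the time.
    if random.random() < 0.1:
        shown = random.choice(list(score))
    else:
        shown = max(score, key=score.get)
    clicked = random.random() < CLICK_PROB[shown]
    # The feedback loop: clicks nudge a type's score up, ignored items drift down.
    score[shown] += 0.1 if clicked else -0.01

print(score)  # outrage wins by a wide margin; no one decided that it should
```

The ranker never expresses a preference for outrage. It just measures, boosts, and measures again, and the distortion follows.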
Again: the degradation is systemic, not conspiratorial. It arises from design choices made in ignorance, business models developed in haste, and feedback loops no one bothered to interrupt.
In 1955, C. Northcote Parkinson wrote that "work expands so as to fill the time available for its completion." But Parkinson's real insight was deeper. In his satirical essays on the British civil service, he showed how organizations could grow while becoming less efficient, how decision-making structures could reward caution over insight, and how institutional inertia could outlast the people who built it.
This is where ideas go to die: in systems that have no memory of why they were created. The nonprofit that becomes more obsessed with fundraising than impact. The government agency that polices forms instead of outcomes. The startup that pivots into a derivative product because the board demands it. These are not tragedies. They are patterns.
Every system has a kind of entropy. It tends toward self-preservation. It substitutes metrics for meaning. And in doing so, it degrades the clarity of the original signal.
It’s tempting to think that better systems could solve this. And in some cases, they can. Wikipedia is a better model for collaborative knowledge than most corporate intranets. The scientific preprint server arXiv allows faster dissemination than most journals. Some crypto protocols enforce transparency better than legacy financial systems.
But even these are vulnerable. Every system creates new incentives. Every incentive creates new distortions. There is no perfect channel.
What are we left with?
We can build systems that degrade more slowly. We can design channels where the noise is predictable enough to correct for. We can teach people to recognize distortion and compensate for it. We can reward clarity, transparency, and epistemic humility.
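For the predictable-noise case, the oldest remedy is redundancy. Here is a sketch using a repetition code, the crudest member of the error-correcting family (real codes are far more efficient, but the principle is identical): say everything several times and majority-vote at the far end.

```python
import random

def noisy(bits, p=0.1):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ 1 if random.random() < p else b for b in bits]

def encode(bits, k=5):
    """Repetition code: transmit each bit k times."""
    return [b for b in bits for _ in range(k)]

def decode(bits, k=5):
    """Majority vote over each group of k received copies."""
    return [int(sum(bits[i:i + k]) > k // 2) for i in range(0, len(bits), k)]

random.seed(7)
msg = [random.randint(0, 1) for _ in range(2_000)]

bare = noisy(msg)                    # the raw channel: ~10% of bits flip
coded = decode(noisy(encode(msg)))   # the same channel, with redundancy added

def error_rate(a, b):
    return sum(x != y for x, y in zip(a, b)) / len(a)

print(f"bare channel error rate: {error_rate(msg, bare):.3f}")   # about 0.10
print(f"with repetition code:    {error_rate(msg, coded):.3f}")  # about 0.01
```

The channel is exactly as noisy as before. The system around it just stopped pretending otherwise.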
But more than anything, we need to recognize the problem. Good ideas do not fail simply because they were wrong. Sometimes they fail because the environment was wrong. Because the structure was brittle. Because the channel was noisy.
And unless we learn to distinguish between bad ideas and bad transmission, we will keep burning the cables, congratulating ourselves on the sparks.