In the natural world, we often see similar solutions evolve across many species because the solution space for challenges such as movement tends to be fairly small. This phenomenon, known as convergent evolution, illustrates that nature tends to converge on a small set of optimal strategies when faced with similar types of problems. One such strategy is the development of systems for coordination. The ability to act as a unified whole turns out to be a very useful adaptation for the functioning of any complex organism. Let’s take a look at why that is.
To thrive, animals must coordinate the actions of countless cells, tissues, and organs. This coordination is made possible by the nervous system and the brain, which integrate sensory input, process information, and orchestrate responses. Without such systems, a complex organism would collapse into chaos. Imagine a human body where each organ acted independently: the heart pumps without regard for oxygen levels, the lungs breathe without synchronizing with the muscles, and the limbs move without direction. Such an organism would have a very short existence. Coordination proves to be essential for orchestrating complex dynamic systems.
Of course, not all large organisms require such intricate systems. Take the Armillaria ostoyae, a fungus that spans thousands of acres. This organism thrives in a relatively static environment, relying on a network of mycelium to absorb nutrients and reproduce. Its structure is homogeneous, and its ability to adapt to rapid change is limited. While it is vast, it lacks the adaptability of animals. The need for coordination arises from the demands of the environment and the complexity of the tasks at hand. In dynamic, unpredictable environments, the ability to act in coordinated fashion becomes a survival imperative.
We can extend this principle beyond individual organisms to societies, which can be thought of as metaorganisms. Just as cells and organs work together within a body, individuals within a society collaborate to achieve shared goals. Societies, like organisms, compete for resources, and their competition exerts selective pressure. Those that can effectively coordinate labor and resources are more likely to persist and thrive. In small societies, coordination can be relatively simple. A tribe might have a leader who helps organize tasks, but much of the work is distributed among autonomous individuals, each specializing in a specific role, like hunting, crafting, or farming. The structure is flat, and communication is direct.
However, as societies grow, so too does the need for more sophisticated coordination. The transformation from a small tribe to a large civilization is a shift where quantity transforms into quality. With more people comes greater specialization, and with specialization comes interdependence. A blacksmith in a small town might work independently, but in a large society, blacksmiths become part of a broader network of producers, traders, and consumers. This interdependence demands systems to manage complexity, much like a nervous system manages the complexity of a multicellular organism. A group of people specializing in a particular profession is akin to an organ within a living organism.
This pattern emerges in all types of human organizations, from companies to governments. In a small team, direct communication suffices. Each member knows their role, and decisions can be made collaboratively. But as the organization grows, the number of possible lines of communication grows much faster than the headcount: a group of n people has n(n-1)/2 potential pairwise channels. What works for five people becomes unmanageable for fifty, and impossible for five hundred. At this point, delegation becomes necessary. Departments form, each with its own leader, and these leaders coordinate with one another. Such hierarchical structure necessarily emerges as a solution to the problem of scale. It mirrors the way an organism relies on a brain and the nervous system to manage its many parts.
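As a quick illustration (my own arithmetic, not anything drawn from the argument above), here is that pairwise-channel count for the three group sizes mentioned:

```python
# Number of possible pairwise communication channels in a group of n people.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for n in (5, 50, 500):
    print(f"{n:4d} people -> {channels(n):7,d} possible pairwise channels")
# 5 -> 10, 50 -> 1,225, 500 -> 124,750
```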
The need for coordination, in turn, gives rise to the need for authority. Authority is not inherently oppressive; it is a tool for managing complexity. In a software development project, for example, dozens of individuals might work on interconnected tasks. Frontend developers rely on backend developers to provide data, while backend developers depend on database administrators to manage information. If one team member fails to deliver, the entire project can stall. To prevent such breakdowns, the team must agree on shared norms, schedules, and decision-making processes. These agreements require a team lead to take charge in order to resolve disputes, set priorities, and ensure that everyone is aligned. This authority is not arbitrary; it emerges from the practical demands of coordination.
The same principle applies to large-scale industries. Modern factories, with their complex machinery and hundreds of workers, cannot function without a clear chain of command. Independent action gives way to combined action, and combined action requires organization. Authority, in this context, is not a top-down imposition but a bottom-up necessity. It arises from the material conditions of production: the scale, complexity, and interdependence of the tasks involved.
Critics of authority often argue for absolute autonomy, but such arguments overlook the real and tangible need for coordination. Authority and autonomy are not opposites; they exist on a spectrum, and their balance shifts with the needs of the group. In a small, simple society, autonomy might dominate. In a large, complex one, authority becomes indispensable. To reject authority outright is to ignore the lessons of both biology and history: that coordination is the foundation of complexity, and that complexity, in turn, demands systems to manage it.
Authority, far from being a mere social construct, is a natural response to the challenges of scale and complexity. It is not inherently good or evil. Rather, it is an effective tool for addressing the needs of the group and the demands of the environment.
Some other writers on complexity can also be thought-provoking. As I’ve mentioned, complexity science is an emerging field, so I don’t think a single school of thought has come to dominate yet. That also means that I, as a layperson, may just be following quacks. So keep that in mind.
Here are a few other writers on complexity in case anyone is interested. Something to note is that you’ll find a bit of anti-Sovietism in these writers. They tend to have a Hayekian worldview in which socialism is seen as unable to handle complexity, and this is why the USSR failed, etc. etc… But immanent critique of the field is a good way forward, so learning how these authors think about complexity can still be useful.
My first introduction to thinking about complexity was this article on complexity, scale, and cybernetic-communism. It is leftist, but still anti-Soviet. It does include many citations, though, if you want to go down some rabbit holes. If you are up for the mathematics, the final section is an exploration of mathematical measures of complexity.
Some main ideas that are cited in the above article:
1.) Studies using historical data from the Seshat Global History Databank suggest that the growth of complex societies follows a repeating two-phase cycle. In the first phase, societies grow in scale but not in information-capacity; any increase in complexity is simply due to increasing scale. Their given information-capacity induces a “scale threshold”, a maximum scale beyond which the society cannot progress, which causes the society to ‘stagnate’. This stagnation is the second phase, in which scale stays the same but information-capacity may grow. If a society can advance its information-capacity, it can continue to grow in scale until it meets the next scale threshold, and the cycle repeats. (There is a small toy sketch of this cycle just after this list.)
I’ve thought about this being another view (in terms of complexity and information) of the Marxist idea that
Or another way of thinking about the quantity to quality idea in dialectics.
2.) In order to build cybernetic-communism we need better “instruments of complexity” beyond money and markets. In the most general definition, complexity is a measure of how much information it takes to “describe” a system. The last section of the article goes into some attempts at quantifying this, and some criticisms of complexity measures such as KL divergence, Kolmogorov complexity, etc.
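To make point 1 concrete for myself, here is a tiny toy loop. It is entirely my own invention (the growth rate, the ceiling rule, and the pace of capacity growth are all made up), not the Seshat analysis or anything from the article; it only illustrates the alternation between a growth phase and a stagnation phase:

```python
# Toy two-phase cycle: scale grows until it hits a ceiling set by the current
# information-capacity, then stalls while capacity slowly builds up a level.
scale, capacity, progress = 1.0, 1, 0
for step in range(30):
    if scale < 10 ** capacity:           # below the scale threshold for this capacity level
        phase = "growth"                 # phase 1: scale grows, capacity does not
        scale *= 1.3
    else:
        phase = "stagnation"             # phase 2: scale stalls while capacity builds up
        progress += 1
        if progress == 5:                # after enough build-up, capacity jumps a level
            capacity, progress = capacity + 1, 0
    print(f"{step:2d} {phase:11s} scale={scale:10.1f} capacity={capacity}")
```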
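And for point 2, one crude way to get a feel for “how much information it takes to describe a system” is to use compressed size as a stand-in for description length. This is my own toy example, not one of the article’s measures, and it also happens to show the standard criticism of such measures: pure noise scores highest, even though noise is not what we intuitively mean by “complex”:

```python
import random
import zlib

# Compressed size in bytes, as a rough stand-in for description length.
def description_length(s: str) -> int:
    return len(zlib.compress(s.encode()))

random.seed(0)
samples = {
    "uniform":   "A" * 1000,                                          # total order
    "patterned": "ABBA" * 250,                                        # simple repeating structure
    "random":    "".join(random.choice("AB") for _ in range(1000)),   # pure noise
}
for name, s in samples.items():
    print(f"{name:9s} {description_length(s):4d} bytes to describe")
```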
I’ve also listened to some episodes of the General Intellect Unit podcast, where they discuss cybernetics from a leftist perspective (but still manage to be anti-Soviet). I can’t give a general recommendation, but it is another resource.
Other writers in complexity science that I have found are Alexander Siegenfeld and Yaneer Bar-Yam. Bar-Yam is the founding president of the New England Complex Systems Institute, and he definitely has his own ‘school of thought’ within complexity science. You may find it fruitful to go through some of his arguments, even if only to build a better critique. He would definitely fit into the ‘anti-authority’ camp, and would view hierarchies as a limitation on the information-capacity of a network, and hence a limitation on complexity. So he would be a counter to your views on systems. You can definitely find anti-Sovietism in his work as well.
Siegenfeld and Bar-Yam wrote an introductory paper to complex systems that may be of interest, and doesn’t require mathematical knowledge.
One of the big ideas in this paper is developing a more intuitive understanding of what complexity is. You can think of complexity as something like the division of labor that you mentioned: a correlation of various functions between a system’s parts. It isn’t randomness, and it isn’t uniform cohesion.
I think the biggest idea to take away from Bar-Yam’s work is a scale-based extension of Ashby’s Law of Requisite Variety from cybernetics. This means that the complexity of a system is a function of its components, their interactions, and the scale of the system. A system may be very complex at a large scale but lack the necessary complexity at small scales, and so on. This is a view of complexity that focuses on Ashby’s idea of variety: the number of possible actions, or states, that a system can take. Bar-Yam extends variety to include scale, so there are small-scale actions (actions of single individuals) and large-scale actions (actions of a state).
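Here is a little sketch of the scale idea. It is entirely my own toy, only loosely in the spirit of Bar-Yam’s “complexity profile” rather than his actual construction: compare how much variety two systems show to a close-up observer (a few agents seen in full detail) versus a far-away observer (only the overall level of activity):

```python
import math
import random

# Empirical Shannon entropy (in bits) of a list of observations.
def entropy(samples):
    counts = {}
    for s in samples:
        counts[s] = counts.get(s, 0) + 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

N, TRIALS = 100, 5000

def independent_system():
    return [random.randint(0, 1) for _ in range(N)]     # every agent decides on its own

def coordinated_system():
    return [random.randint(0, 1)] * N                   # all agents copy a single decision

def far_view(agents):
    # A distant observer only sees whether overall activity is low, middling, or high.
    frac = sum(agents) / len(agents)
    return "low" if frac < 0.25 else "high" if frac > 0.75 else "mid"

random.seed(0)
for name, system in [("independent", independent_system), ("coordinated", coordinated_system)]:
    runs = [system() for _ in range(TRIALS)]
    close_up = entropy([tuple(r[:4]) for r in runs])     # small scale: four agents in detail
    far_away = entropy([far_view(r) for r in runs])      # large scale: the whole system, coarsely
    print(f"{name:12s} close-up ~ {close_up:.2f} bits   far away ~ {far_away:.2f} bits")
```

In this toy, the uncoordinated system looks rich up close (about 4 bits from just four agents) but shows essentially nothing at the scale of the whole, while the coordinated system has only a single bit of variety, but that bit remains visible at every scale.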
Ashby’s Law (with or without taking scale into account) is about autopoiesis, a system’s ability to maintain itself in an environment. According to Ashby’s Law, a system in an environment (which itself is also a system) must be able to react or respond to actions from the environment at the appropriate scale. For example, the climate is an environment that our mode of production (the system) is within (and also part of). Climate change creates certain actions (wildfires, global temperature increase, flooding) at a large (regional to global) scale. Capitalism, to maintain itself, has to respond to each environmental action at its scale. If it fails, then changes occur within capitalism, the system: it may adapt, evolve, change the environment, or fragment. For socialism (a system) to survive within its environment (global capitalism), it must be able to respond, at the appropriate scale, to capitalism’s “attacks”. The variety of a system must match or exceed the variety of its environment in order for it to survive, and this applies at all scales.
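For what it’s worth, here is a tiny brute-force toy of the quantitative form of the law (my own example, not anything from Ashby or Bar-Yam): with the made-up outcome table below, a regulator with only 3 possible responses facing 9 possible disturbances can never confine the outcome to fewer than 9 / 3 = 3 distinct states, no matter how cleverly it chooses its responses:

```python
from itertools import product

DISTURBANCES = range(9)   # what the environment can throw at the system
RESPONSES = range(3)      # what the system can do about it

def outcome(disturbance, response):
    # A made-up outcome table: a response can only shift the result a little.
    return (disturbance + response) % 9

best = None
for strategy in product(RESPONSES, repeat=len(DISTURBANCES)):   # every way of answering each disturbance
    outcomes = {outcome(d, strategy[d]) for d in DISTURBANCES}
    if best is None or len(outcomes) < best:
        best = len(outcomes)

print("fewest distinct outcomes any strategy can force:", best)   # prints 3
```

In this toy, only by adding more responses (more variety) could the regulator squeeze the outcomes into fewer states.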
The scale-based version of Ashby’s Law suggests that sometimes a system can “flex” its variety at a particular scale in order to outcompete another system (or its environment). One example given to illustrate the scale-based law is guerrilla warfare. In warfare, certain environments may favor smaller-scale actions over large-scale actions. Guerrilla fighters may have more variety (actions/states/options) at small scales than larger armies do. Other environments, like open fields, may favor larger armies, so a large army will have more variety at a large scale than a small one, and so on.
Other papers by Bar-Yam then talk about ways of calculating the complexity, or variety I suppose, of a system at its various scales.
At this point, I’m not certain how accepted these ideas are in complexity science. They may be approaching quackery? But I’ve found them interesting to chew on and to try to incorporate into my understanding of Marxism and systems.