Most engineering projects are complex, but when they get really complex an approach that falls under the term “systems engineering” is called for. I was introduced to the term by Guru Madhavan, a senior policy advisor and director of studies at the National Academies of Sciences, Engineering, and Medicine and the author of Applied Minds: How Engineers Think. Guru and I go way back: he received his Ph.D. at my university and made the most of EvoS, the campus-wide evolutionary studies program that I direct.
I have become intrigued by systems engineering because it can be seen as a form of artificial cultural group selection. Group selection concerns the evolution of traits due to the differential survival and reproduction of groups, as opposed to the differential survival and reproduction of individuals within groups. Group selection can take place for culturally derived traits in addition to genetically derived traits. Indeed, cultural group selection is an exceptionally strong force in our own species1.
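To make the two levels of selection concrete, here is a minimal sketch in Python of group selection acting against within-group selection. The population structure, the cost of cooperating, and the group-level benefit are arbitrary illustrative assumptions, not a model of any particular species or dataset.

```python
import random

# A toy two-level selection model: within each group, selfish individuals (0)
# out-reproduce cooperators (1), but groups containing more cooperators
# contribute more offspring to the next generation. All parameter values are
# illustrative assumptions.

N_GROUPS, GROUP_SIZE, GENERATIONS = 50, 20, 100
COOP_COST = 0.1       # within-group reproductive cost paid by cooperators
GROUP_BENEFIT = 0.5   # extra group output when a group is fully cooperative

def next_generation(groups):
    offspring_pool = []
    for group in groups:
        coop_fraction = sum(group) / len(group)
        # Within-group selection: cooperators reproduce at a discount.
        weights = [1.0 - COOP_COST if individual else 1.0 for individual in group]
        # Between-group selection: more cooperative groups produce more offspring.
        n_offspring = int(GROUP_SIZE * (1.0 + GROUP_BENEFIT * coop_fraction))
        offspring_pool.extend(random.choices(group, weights=weights, k=n_offspring))
    # Offspring disperse and are reassembled into new groups of equal size.
    random.shuffle(offspring_pool)
    return [offspring_pool[i:i + GROUP_SIZE]
            for i in range(0, N_GROUPS * GROUP_SIZE, GROUP_SIZE)]

groups = [[random.randint(0, 1) for _ in range(GROUP_SIZE)] for _ in range(N_GROUPS)]
for _ in range(GENERATIONS):
    groups = next_generation(groups)

cooperator_frequency = sum(sum(g) for g in groups) / (N_GROUPS * GROUP_SIZE)
print(f"cooperator frequency after {GENERATIONS} generations: {cooperator_frequency:.2f}")
```

Depending on the parameter values, either the within-group advantage of selfishness or the between-group advantage of cooperation can dominate, which is the basic tension that multi-level selection describes.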
The distinction between “natural” and “artificial” selection is relatively clear-cut for genetic evolution. The environment is the selective agent for natural selection (e.g., the evolution of cryptic coloration in moths) and humans are the selective agents for artificial selection (e.g., the evolution of domestic flower varieties). These two forms of selection are not mutually exclusive, however. For example, the domestication of wolves might well have begun with natural selection for individuals that remained close to human habitations, before artificial selection could get started2.
Natural and artificial selection are even more intermingled for human cultural evolution. To some extent, our current groups are the result of many unplanned social experiments, a few that hung together compared to many that fell apart. These groups work well without having been consciously designed by anyone, like the aerodynamic design of a bird’s wing. Yet, humans are inveterate intentional planners, so some aspects of our groups are consciously designed, like the constitution of the United States.
Against this background, systems engineering can be seen as an exceptionally pure form of artificial cultural group selection, which explicitly treats a physical or a social system as the unit of selection and employs highly refined processes for evolving the system’s component parts. My use of the word “evolving” in the previous sentence goes beyond its colloquial meaning. The toolkit of the systems engineer probably requires a process of “blind variation and selective retention,” which is the substrate-free definition of an evolutionary process3.
Currently, there is very little overlap between the systems engineering community and the growing interdisciplinary community of scientists thinking about cultural group selection. My conversation with Guru aims to integrate the two communities.
David Sloan Wilson: Greetings, Guru, and thanks for making yourself available for this conversation!
Guru Madhavan: My pleasure, David—thank you for this opportunity.
DSW: Before we explore the concept of systems engineering as a form of cultural group selection, please set the stage by describing systems engineering against the background of other forms of engineering. What makes it distinctive?
GM: There’s a saying that “the way we see the problem is the problem.” This is fundamentally what systems engineering seeks to address. It does so by clarifying “dependencies” and judiciously tackling trade-offs in a network of issues that interact with one another. It’s as technical as it is social in practice. Engineering fields are also now becoming super-specialized, and that’s why the practice of systems engineering—operating at the nexus of disciplines and technical denominations—is increasingly valuable. A systems-level perspective, much like group selection, enables engineers to take into account the issues of the parts and the whole that have been missed, ignored, or forgotten. The end product in systems engineering may not necessarily be an object or a thing.
Systems engineers typically spend a lot of time from the get-go trying to understand the requirements, desires, and endpoints relating to the possible solutions. There’s generally a wide appreciation in systems engineering that local fixes aren’t going to produce a globally viable solution, which depends on the context anyway. There’s usually a long list of “ilities” to be considered as a package that other specialties of engineering may not always entertain—as in systems quality, reliability, usability, testability, utility, scalability, maintainability, recoverability, verifiability, safety, and so on. Naturally, these factors—and the underlying discipline—are applied to guard against failures that can be very unforgiving.
As a robust operational discipline—some even call it a practical philosophy—systems-level engineering has existed for millennia and has been practiced across cultures. The modern successes of systems engineering exploded during wartime, reaching wider and wider scales between the 1950s and the 1970s, notably through defense, aerospace, urban planning, and manufacturing. Coincidentally, this is also around the time when the concepts of multi-level selection theory in biology—what I consider a close analog of systems engineering—were emerging with great contention. It’s interesting how these concepts, similar in spirit, were received in their times: one treated with great criticism in theory and one greatly valued in practice.
DSW: Thanks. That’s very helpful. There are two main features that qualify systems engineering as an exceptionally pure form of artificial cultural group selection. First, the whole system is the unit of selection. Second, processes of “blind variation and selective retention” are used to evolve the system. Could you please comment on these two main features in turn?
GM: One could readily consider systems engineering as deliberate cultural group selection, though often under the constraints of complexity, costs, schedules, and requirements. The “system,” as the unit of selection, is defined by the goals and objectives it needs to achieve.
Clearly, a kind of requirement-driven engineering works well in the assembly of a hard system like an aircraft. However, imagine the more complex scenario of managing air travel, where the aircraft, previously a system on its own, is now part of a broader sociotechnical system. Air travel began with the development of aircraft with different capabilities, with competitive military and civil markets determining which would succeed. Supporting each successful aircraft is a complex network of testing, maintenance and repair, fueling, and eventually scheduling and sequencing in carrier service. But routing design cannot take place in a vacuum, since multiple airlines use the same airports and airspace. Thus, we soon evolved toward centralized control of airspace use, overseen by the Federal Aviation Administration and internationally coordinated treaties, which shows you how the system-level unit of selection works. But that system must, in turn, rely on construction, maintenance, and ground operations of airports around the country, which in turn invokes issues of environmental protection, noise control, ground transportation, and urban planning at the local and state level. So these improvements emerge as a result of variation and selection, oftentimes deliberate and deadline-driven, but sometimes organic and slow.
For example, one can achieve a list of specifications or constructive cost models in the detailed design phase through simulations, if the requirements are well known. For more complex scenarios, evolutionary algorithms are applied to “search” for a range of desired solutions. Call these managed or controlled evolution, but in either case, the key notion for selection and variation is that of “requisite variety” among the constituents of the system and their consequential states. This is an evolutionarily inspired concept that’s key to understanding complexity.
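To illustrate what such a managed evolutionary search might look like in miniature, here is a toy requirements-driven design search in Python. The three “ilities,” their target levels, and the scoring rule are hypothetical stand-ins for the far richer requirement sets and simulation models used in real systems engineering practice.

```python
import random

# A toy requirements-driven design search by "blind variation and selective
# retention." The design attributes ("ilities"), their target levels, and the
# scoring rule are hypothetical illustrations.

REQUIREMENTS = {"reliability": 0.95, "usability": 0.80, "maintainability": 0.70}

def random_design():
    """Blind variation: propose a candidate with random attribute levels in [0, 1]."""
    return {ility: random.random() for ility in REQUIREMENTS}

def mutate(design, step=0.05):
    """Blind variation around an existing candidate."""
    return {k: min(1.0, max(0.0, v + random.uniform(-step, step)))
            for k, v in design.items()}

def fitness(design):
    """Selective retention criterion: penalize any shortfall against a requirement."""
    return -sum(max(0.0, target - design[ility])
                for ility, target in REQUIREMENTS.items())

def evolve(generations=200, population_size=30, keep=10):
    population = [random_design() for _ in range(population_size)]
    for _ in range(generations):
        # Retain the candidates that best satisfy the requirements...
        survivors = sorted(population, key=fitness, reverse=True)[:keep]
        # ...and refill the population with blind variations of the survivors.
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(population_size - keep)]
    return max(population, key=fitness)

best = evolve()
print(best, "shortfall score:", fitness(best))
```

The loop is exactly “blind variation and selective retention”: candidates are varied at random, and only those that best satisfy the stated requirements are retained to seed the next round.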
DSW: Great! This concrete example adds a lot to your previous more general description. Based on my limited understanding, systems engineering is distinctive in other ways, beyond these two main features; for example, the need to organize a system into a number of modules. Could you please comment on this and other distinctive aspects that go beyond “the system is the unit of selection” and “blind variation and selective retention processes are required”?
GM: Deconstructing and reconstructing a system into its component parts—modular systems engineering—is a standard practice to gain insights into its behavior by function, by interaction, and over time. It’s a diagnostic process in a way, again to understand the necessary trade-offs, especially when the issues are complex and contentious. Should we add capacity at a near-city airport, or build new facilities further away? Each option involves considerations far afield from airplanes that are nonetheless part of the aviation system.
A friend likes to remind me that even if we had the ability to instantly produce—or “3-D print”—a ship, it would still take 3 or more years to deploy it due to the protocols, paperwork, and bureaucracy that come with it. The human factors really loom large. The availability of technology itself is only part of the issue, as can be seen through the continuous efforts to manage traffic congestion during peak hours in major cities. Building new roads and bridges is insufficient and doesn’t resolve the issue. Hence, the idea of charging people higher tolls or fares for in-demand routes during peak hours came into practice. A basic supply-demand concept was advanced into practice by engineers. In this case, a city was an organism, and the modeling studies, through cameras, transponders, satellite images, and user-provided data, involved “seeing” and “listening” to the city. The city can be better understood through its modules, and how they individually and collectively perform.
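As a minimal sketch of the supply-demand logic behind peak-hour pricing, consider the toy loop below. The corridor capacity, the linear demand curve, and the toll-adjustment rule are assumed for illustration only and do not describe any actual city’s scheme.

```python
# A toy supply-demand loop for peak-hour congestion pricing. The capacity,
# demand curve, and toll-adjustment rule are assumed for illustration only.

CAPACITY = 4000          # vehicles per hour the corridor carries smoothly
BASE_DEMAND = 6000       # vehicles per hour that would enter at a zero toll
PRICE_SENSITIVITY = 400  # vehicles deterred per dollar of toll (assumed elasticity)

def demand(toll):
    """Simple linear demand curve: higher tolls divert or delay some trips."""
    return max(0.0, BASE_DEMAND - PRICE_SENSITIVITY * toll)

def adjust_toll(toll, step=0.25):
    """Raise the toll while demand exceeds capacity; lower it when there is slack."""
    if demand(toll) > CAPACITY:
        return toll + step
    return max(0.0, toll - step)

toll = 0.0
for _ in range(24):  # iterate toward a toll that keeps flow near capacity
    toll = adjust_toll(toll)

print(f"toll: ${toll:.2f}, expected flow: {demand(toll):.0f} vehicles/hour")
```

In practice, the demand curve is estimated continuously from the cameras, transponders, and user-provided data mentioned above, and the price is adjusted again as travelers adapt to it.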
DSW: My conversation with Geoff Mulgan, author of Big Mind: How Collective Intelligence Can Change Our World, touches upon the same themes.
GM: Systems engineering, in a way, is a form of generating, applying, and adjusting collective intelligence. This sort of successive optimization—fueled by data—takes the form of an evolutionary selection, because sooner or later, people are going to react to the changes in the set price. They may even become intolerant or insensitive to them. That’s a behavioral as well as a political challenge to the solution that still needs to be tackled.
Engineers have been dealing with these “human factors” and related design issues for a long time (setting aside the recent popularity of “behavioral economics”). After all, what motivates us to do one thing over another? Modern megaprojects are replete with such fundamental questions. Similarly, public policy issues confront tough trade-offs: what disease control effort should we be prioritizing? What vaccines and drug interventions should one develop? What kinds of research should we be funding? Given the sheer variety of voices, biases, and interests surrounding any policy issue, it sometimes becomes difficult to step back and take a total-systems and evolutionary approach. So, how then do you generate a set of “requirements” for a solution that people will agree to sign on to, and eventually work toward? A brute-force approach isn’t going to be received well. One may have to organically “evolve” toward a set of conducive requirements in the possible solution space.
Sometimes, ignoring a broader systemic analysis, we seem to tunnel in on narrow matters akin to specific aircraft designs, or even worse, on sub-sub-problems like seating density within an airplane or what food and beverage service to provide to passengers, when the bigger issue is air traffic control. Public policy analysis is a ripe area for progress, and an explicit partnership between evolutionary principles and systems engineering is vital to guide some new thinking. This could well be part of what you and colleagues have been envisioning as a “new synthesis.”
DSW: How often are the “systems” of systems engineering purely physical systems, as opposed to systems that involve a human social system component? Could you give one or more examples of each?
GM: The physical and social divisions of systems are becoming very blurred. Of course, traditional systems engineering focused on quality, efficiency, and related concepts works very well in, say, an oil rig or refinery, but the human element cannot be ignored. That said, how can one intuitively apply the tools of complex systems engineering to guide preparedness efforts to deal with a pandemic? In this case, systems engineering—much like cultural group selection—needs to be evolved into an acceptable social form. It’s simultaneously a coordination problem and a behavioral problem, and has less to do with available or planned technologies themselves.
In the recent Ebola outbreaks in West Africa, for example, we had all kinds of surveillance systems, barcode tracking, and satellite imaging. We also had a potential vaccine that had been shelved years before for various reasons. However, the role of local burial rites and customs (which contributed to at least 20 percent of new infections, according to the World Health Organization) was not considered among the top factors in economic analyses. Similarly, cleaning a major river (or rather a socioeconomic system) like the Ganges in my home country of India involves more than trash skimming and sewage treatment; it requires a change in people’s perception of the pollution as viewed through their religious beliefs. These social factors are, in my view, some of the new and significant challenges systems engineering faces.
DSW: Your mention of the Ebola epidemic allows me to bring PROSOCIAL into our conversation. This is a method of rapid cultural evolution that we are beginning to employ around the world. It enabled one village in the Bo district of Sierra Leone to “mutate” its funeral practices by substituting a banana trunk as a symbolic representation of the deceased body. Then the new practice was spread to other villages by a network of facilitators, probably saving many lives, although this was difficult to monitor under the circumstances. Go here for an article that includes a powerful video. Not only is this an important example in its own right, but it illustrates the general need for what is sometimes called “co-production”, where the top-down organizers of the system engage with the participants of the system on an equal footing. What do you have to say on this subject?
GM: This is very interesting. Another way to advance “co-design” is for planners and organizers to work with the community and identify approaches to make the burial rites safer and more dignified while respecting longstanding faith systems and traditions.
DSW: Now I’d like to explore the “added value” of thinking about systems engineering as a pure form of artificial cultural group selection. This will be very much a matter of perspective. The added value for a systems engineer will be different than for a cultural group selectionist, which in turn will be different than for a free market economist, for example. Let’s begin with a systems engineering perspective. Is it already obvious that the system must be the unit of selection and that processes of “blind variation and selective retention” are required to evolve the system? If so, then is there any added value of seeing it as a form of cultural group selection?
GM: The evolutionary elements are not fully obvious across systems engineering; they are not explicitly recognized or communicated at this point either. The richness of perspective one could derive from insights from group selection would help transcend the sometimes overly technical and procedural approaches of systems engineering. Perhaps the group selection framework could also help engineering develop a better sense of our social meaning systems—be it scientific evidence, public values, or the way we make collective decisions in society. Evolutionary concepts can help broaden the social contributions of engineering beyond the hardware and the software, the mechanics and the electronics, and the picos and the petas in everything around us. Such cultural, anthropological, and philosophical perspectives should not be mere add-ons or electives but should be vitally integrated. This is a crucial necessity if we are to, as you say, become “wise managers” of our evolution as a society.
DSW: Very well put. A key concept for both the systems engineer and the cultural group selectionist is contained in the phrase “the evolution of evolvability”. Inheritance systems are themselves the product of evolution. This is true not only for cultural inheritance systems, which are products of genetic group selection, but also for the mechanisms of genetic evolution, which are far more elaborate than when they first arose during the origin of life. When we examine our genetic capacities for cultural evolution, they work only at a small spatial and temporal scale. We don’t spontaneously compare alternative practices over a wide geographical area or select them with long-term and large-scale outcomes in mind. Expanding the scale of cultural inheritance systems requires cultural evolution—or systems engineering, if you like—using our genetically innate and previously evolved cultural capacities as building blocks. Examples already exist at medium scales, some a product of “natural” cultural group selection, others a product of “artificial” cultural group selection, or a mixture of both. What’s needed now is to recognize the whole earth as the ultimate complex system and to create a cultural inheritance system that works at the planetary scale. Integrating the complex systems engineering community with the evolutionary community is an important step in that direction.
GM: I fully agree. Even for routine decision making, this alliance between systems engineering and cultural evolution seems crucial to me and holds great promise. This could be a platform for the next generation of comprehensive planning support, and importantly, user-centered design. After all, for our products and policies to be useful and relevant, it’s helpful to remember John Dewey, who once said it’s the person who wears the shoe, not the shoemaker, who knows where it pinches.