Who’s Afraid of the Quark: Simplifying Complexity

When the word “quark” comes up in everyday conversation, it often elicits a kind of self-conscious chuckle. Who really knows what a quark is, after all, but a particle physicist? As a word, it more or less symbolizes the unknowably complex.

In fact, I challenge you to think of a time when it wasn’t being used to explain a highly complicated concept – except perhaps as a way to make fun of someone for being a nerd (I speak from experience as the nerd in question). The deeper you dig into the nature of quarks, the more unfamiliar things get: hadrons, baryons, leptons, mesons, and more sci-fi-sounding but very real words – try Googling them if you don’t believe me.

The co-discoverer of the quark, Nobel Prize winner Murray Gell-Mann, was no stranger to complexity, in both the abstract and physical worlds. In addition to his well-known work in quantum physics and beyond, later in life he took on a more all-encompassing topic: quantifying complexity.

It’s easy enough to find complexity in our daily lives: traffic patterns, the physical universe, human emotions and interactions, the stock market, or – in my case – simply trying to assemble my daughter’s new dream dollhouse. Dr. Gell-Mann was focused on measuring complexity to get at the essence of it, and thereby understand the relative complexity or simplicity of a system. In doing so, he hoped to help with what is essentially everyone’s most basic goal: simplifying complexity.

What is complexity all about?

Definitionally, complexity is “the degree to which a system or component has a design or implementation that is difficult to understand and verify.” In its vagueness, the definition itself is fairly complex, so let’s take it a step further and look at the main factors that make something “difficult to understand”.

A complex system could have many moving parts or tasks; it could involve numerous and changing variables; and certain aspects of the process could be only partially understood, or might be time-consuming, or require specialized knowledge. All of these things affect a system’s complexity, and it’s usually a combination of these and other factors that together overwhelm the human mind and its ability to keep ideas organized.

The key to tackling complexity is to understand the rules and variables which underlie the phenomenon being analyzed. Generally speaking, all things in our known world behave according to rules driven by variables. In fact, many believe that nothing in the universe is truly random.

Take gambling, for example – I’m sure plenty of us (myself included!) would love to fully grasp the “rules” associated with dice rolls before hitting the craps tables. Of course, if I could rapidly quantify the effect of friction, starting position, speed, and the thousands of other variables that drive the outcome of each roll, I’d be lying on the beach in the Caribbean right now instead of writing this piece.

The fact of the matter is, the sheer level of granular understanding required to do this and the scope of possible outcomes drive an exponential number of results that’s functionally impossible to predict. This is what makes the outcomes of complex systems appear to be “random”.
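The idea that deterministic rules plus many sensitive variables can look random is easy to sketch. Here is a toy Python illustration (the “physics” below is entirely made up for demonstration, not a real model of a dice roll): a fully deterministic function of speed, angle, and friction whose output swings wildly when any input changes by a tiny amount.

```python
import math

def dice_outcome(speed: float, angle: float, friction: float) -> int:
    """Toy deterministic 'physics' that maps initial conditions to a die face.

    The rule is fully deterministic, but the result is so sensitive to
    tiny changes in the inputs that outcomes look random in practice.
    """
    # Mix the variables chaotically (illustrative only, not real physics).
    x = speed * 1000 + angle * 7919 + friction * 104729
    x = math.sin(x) * 1e6
    return int(abs(x)) % 6 + 1

# Two rolls whose starting speed differs by one part in a million:
a = dice_outcome(3.000000, 0.75, 0.12)
b = dice_outcome(3.000001, 0.75, 0.12)
```

Both calls are perfectly repeatable, yet without measuring every input to absurd precision you cannot predict the face that comes up – which is exactly why the craps table still feels random.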

So, how do you simplify a complex enterprise?

Endeavor Consulting Group is often called in to untangle exactly this kind of situation. Our solutions often make use of both agile and waterfall approaches, but there are a few key principles that really drive our success.

Understand the entire problem – the requirements, domain, and foundation – so that the solution model can evolve over time.

Whenever you’re trying to improve a system, untangling the real requirements from what are simply historical practices is the first step in creating a robust solution. Using Domain-Driven Design (DDD) principles for software solutions, you must then establish a clear bounded context and ubiquitous language for your maps, design, and architecture.

This approach is built on classic object-oriented design (OOD) but takes it much further; with DDD, all subsequent and parallel work is driven from a common point of reference and framework, one that leaves proper room for future options, additions, and changes. Objects, naming, and functions are all organized and driven by common rules and language; this accounts for future use cases and ensures the same schema remains applicable.
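A minimal sketch of what this looks like in code, using a hypothetical “Ordering” bounded context (the names `Order`, `OrderLine`, and `OrderStatus` are invented for illustration): every identifier comes straight from the ubiquitous language agreed on with domain experts, and domain rules are expressed in domain terms rather than database terms.

```python
from dataclasses import dataclass, field
from enum import Enum

class OrderStatus(Enum):
    DRAFT = "draft"
    SUBMITTED = "submitted"
    FULFILLED = "fulfilled"

@dataclass
class OrderLine:
    sku: str
    quantity: int

@dataclass
class Order:
    """Aggregate root for the hypothetical Ordering context."""
    order_id: str
    lines: list = field(default_factory=list)
    status: OrderStatus = OrderStatus.DRAFT

    def submit(self) -> None:
        # A domain rule stated in the ubiquitous language, not in
        # persistence or UI terms.
        if not self.lines:
            raise ValueError("an Order must have at least one OrderLine")
        self.status = OrderStatus.SUBMITTED

order = Order(order_id="ORD-1", lines=[OrderLine(sku="SKU-42", quantity=2)])
order.submit()
```

Because every team – design, build, and sustainment – shares these names and rules, later additions (a new status, a new line type) slot into the same schema instead of forking it.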

Recognize and model the critical variables of the process or system, and then take into account their effect on your results.

At ECG, we’ve seen a lot of the same requirements pop up from customer to customer, and project to project. This experience serves as a valuable resource when it comes to predicting outcomes, even taking into account contextual nuances and differences. By grouping significant variables and clearly documenting and quantifying their impact on complex scenarios, we’re able to drive design patterns that will work for expected – and even unexpected! – circumstances.
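One simple way to make that grouping-and-quantifying concrete is a weighted score. The sketch below is hypothetical (the group names and weights are invented, and a real engagement would calibrate them against documented project data), but it shows the shape of the idea: cluster the significant variables, document each group’s impact, and let the numbers guide where design effort goes.

```python
from dataclasses import dataclass

@dataclass
class VariableGroup:
    name: str
    count: int     # number of significant variables in the group
    impact: float  # documented impact weight, 0.0 to 1.0

def complexity_score(groups) -> float:
    """A simple weighted sum; real projects would calibrate these weights."""
    return sum(g.count * g.impact for g in groups)

groups = [
    VariableGroup("regulatory", count=4, impact=0.9),
    VariableGroup("scheduling", count=10, impact=0.3),
    VariableGroup("cosmetic", count=25, impact=0.05),
]
score = complexity_score(groups)  # 4*0.9 + 10*0.3 + 25*0.05 = 7.85
```

Note that the small regulatory group outweighs the much larger cosmetic one – which is the point: counting variables alone misleads; quantifying their impact is what drives sound design patterns.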

Define a holistic framework that accounts for exceptions, but doesn’t over-engineer based on them.

When process and system owners talk about complexity, “exceptions” always come up – instances in which a unique set of conditions drives certain behaviors. The frequency of these conditions and their actual impact on the standard scenario need to be taken into account when you’re designing software solutions.

It can be tempting to try to automate every single variant, no matter how rare, but doing so can be a costly mistake. Instead, learn to identify true “exceptions” and treat them as the infrequent events that they are by defining procedural methods of accounting for them rather than over-automating.
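In code, that principle can be as simple as a routing decision. This hypothetical sketch (the order types and queue are invented for illustration) automates the common paths and hands true exceptions to a documented manual procedure instead of growing the automation to cover every rare variant:

```python
# Common variants worth automating; everything else is a true exception.
AUTOMATED_TYPES = {"standard", "expedited"}

# Rare variants are recorded for a defined procedural (manual) workflow.
manual_queue = []

def process(order_type: str) -> str:
    if order_type in AUTOMATED_TYPES:
        return f"automated:{order_type}"
    manual_queue.append(order_type)
    return "routed-to-manual-procedure"

results = [process(t) for t in ["standard", "hand-engraved", "expedited"]]
```

The design choice here is deliberate: the exception still gets handled, reliably and visibly, but the cost of handling it stays proportional to how often it actually occurs.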

Test and sustain!

The closer a project comes to the finish line, the more deadlines (and the praise for meeting them) begin to matter more than thoroughness or caution. All too often proper testing — of the entire framework, not just the current relevant use cases — is ignored in favor of a speedy delivery. This is a mistake. The DDD framework established and defined earlier in the process must be intimately understood by the entire team, from design to sustainment; quick knowledge transfer sessions just don’t cut it. The documentation and framework must be treated as living governance documents rather than legacy project documentation.

From quarks to QA, in pursuit of simplification.

While it may be a bit presumptuous to equate Dr. Gell-Mann’s principles of quarks and subatomic particles to software solutions, his work on complexity theory does provide us with inspiration. Complexity in business today is a reality, and, like the quark, it should not be feared but rather embraced as an opportunity to find simpler solutions. When we strive to use design principles to simplify a system as described above, we come to better understand the behaviors and values that guide it.

Even if you never employ any of these tactics, Dr. Gell-Mann’s work – including quantum physics terms like “Poincaré symmetry” and “isospin” – will, at the very least, give you something smart to talk about at parties. At best, you’ll leave your guests with something important to think about; at worst, they may simply end up teasing you for your nerdy interests. After all, what’s more complex than human interactions?

Chris Chambers is a Systems Solutions Architect and Managing Partner at Endeavor Consulting Group. Follow Chris on Twitter @clchamb13. This blog was originally published at https://www.endeavorcg.com/blog/whos-afraid-of-the-quark-simplifying-complexity.
