Sunday, January 31, 2016

Deceptively Simple Complexity

I seem to have taken all of 2015 off to write three books. While researching them, I came to the realization that how people conceptualize complexity in systems depends heavily on their paradigm(s) of science. This got me looking into various existing scientific paradigms -- how they enable or constrain what we can see, and how they shape our understanding of the "stuff" in and of the world.

The historian of science Thomas Kuhn wrote of paradigms, their communities, and their revolutions. I see these processes going on -- simmering beneath the surface -- almost everywhere I look. In the evolution of paradigms, many won't survive, and those that do won't survive for very long. Perhaps the biggest surprise was confirming, for myself, the principle of deception in distant objectives and in the paths of objective search -- first identified and investigated by Kenneth Stanley and Joel Lehman.

In addressing the deception that comes from lack of knowledge, for example, I have come to see the complexity in systems as a means to solutions (or resolutions) of that deception -- not always as a problem to be overcome. The hard part is understanding what makes complexity "interesting".
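To make the deception idea concrete, here is a minimal sketch in Python -- with a made-up one-dimensional fitness landscape of my own choosing, not one of Stanley and Lehman's benchmarks. A search that greedily follows the objective gets lured onto a false peak, while a search that ignores the objective and simply explores finds the real one.

```python
import random

# A deliberately "deceptive" 1-D fitness landscape (hypothetical example):
# the global optimum sits near x = 9, but a local peak at x = 2 lures any
# search that simply climbs the objective.
def fitness(x):
    local_lure = 5.0 - abs(x - 2.0)          # tall-looking foothill
    global_peak = 20.0 - 8.0 * abs(x - 9.0)  # the real objective
    return max(local_lure, global_peak)

def hill_climb(x, steps=200, step=0.1):
    # Purely objective-driven search: accept a move only if it doesn't hurt.
    for _ in range(steps):
        candidate = x + random.choice([-step, step])
        if fitness(candidate) >= fitness(x):
            x = candidate
    return x

def exploratory_search(steps=200):
    # A crude stand-in for novelty-style search: sample widely,
    # consulting the objective only at the very end.
    samples = [random.uniform(0.0, 10.0) for _ in range(steps)]
    return max(samples, key=fitness)

random.seed(1)
print("objective-driven search ends near:", round(hill_climb(0.0), 2))
print("exploratory search ends near:     ", round(exploratory_search(), 2))
```

The objective-driven climber reliably parks itself on the lure at x = 2; the exploratory search, indifferent to the objective along the way, lands near the true peak.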

Wednesday, March 5, 2014

What is science?

What a silly question!  Science is ... you know ... "the answer to life, the universe and everything" (Douglas Adams).

It's harder to answer than you would expect.  Most people think it is the search for unassailable truth.  They will often be disappointed. The real, underlying question is: "What do people expect from science?" 

I just completed a book, "Foundations for a Science of Systems".  In order to answer that question, I had to identify the "systems of science" - showing that science is the interaction between four "worlds" - the Physical World, the Mental World, a Conceptual World (the three "worlds" of Karl R. Popper), and a relatively recent addition, the "Platonic World of Forms" - a world of structure described by the Platonist physicist Roger Penrose, among others.

These four worlds are connected by six "languages" - percepts, schema, forms, cognitions, mathematics, and a complex series of languages that analyze, normalize, and synthesize intermediate results with evaluation and realization (ANSwERs).  Hey, what about natural language? All of these languages start out being described in whatever natural language - words, drawings, music, dance - the communicator is comfortable with.  Note, too, that there are many mathematical languages.  I use Category Theory, but others work too.
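As a toy illustration only -- and not the mapping in the book -- the structure can be sketched as four worlds connected pairwise, which conveniently yields exactly six links, one per language. The particular pairing of languages to world-pairs below is a placeholder, not my claim about which language joins which worlds.

```python
from itertools import combinations

# The four "worlds" named above.
worlds = ["Physical", "Mental", "Conceptual", "Platonic-Forms"]

# Four worlds connected pairwise give exactly six links, matching the six
# "languages".  The assignment of each language to a particular pair here
# is only a placeholder for illustration.
languages = ["percepts", "schema", "forms", "cognitions", "mathematics", "ANSwERs"]

science_system = {pair: lang for pair, lang in zip(combinations(worlds, 2), languages)}

for (a, b), lang in science_system.items():
    print(f"{a:15s} <--{lang}--> {b}")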

It is said that the objective of science "is to separate the demonstrably false from the probably true" -- which seems the most correct and complete statement from many points of view. Science is a system - a complex, coupled system - with an identifiable "form" (a formal system) that is used to build "valid" knowledge.  You can't tell whether something is true if the process isn't valid. Validity means that the processes are consistent, if not complete.

So, how do you know you've reached "the answer"?  Science only asymptotically approaches the truth. Answers are either sufficient for today, or they are not; they get better, or are revised, over time. That's probably the best one can reasonably expect from science.

Saturday, December 1, 2012

Hysteresis in complex interactions

Recently, a group of us have been discussing the flows of monies and values in and between high-tech entrepreneurial enterprises and various economic, financial, social/societal, and legal contexts.  This is harder to do than one would initially suppose, because these concepts are easily conflated -- value is usually measured in money. The problem when modeling and simulating these flows is that they exhibit hysteresis. One might say that money tends to follow value, but that is not always the case. In other words, money may "lead" or "lag" in its correspondence to value. Why is this important in modeling complex market systems?

Value, as in the realization of a value proposition, is a problem because it is usually measured a posteriori, or after the fact -- meaning after it has been realized -- and often in rather subjective, qualitative metrics. In other words, the realization of value leads or lags in a possibility space.  We see this, in a primitive conceptualization, as a level at some distance along Rogers' Innovation Adoption Curve.  So, how can we measure potential value, if not directly by money?

How about value as generative of a probability distribution function, not as a location on a probability distribution?  We can do this by composing functions, hyper-dimensionally, in an indirectly encoded compositional pattern producing network (h-CPPN) according to open-ended fitness functions in each of the contexts of interest.  Value becomes an evolutionary function in a competitive, co-evolutionary system, meaning value is also a comparative function over alternatives.
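A rough sketch of the idea, and only that: the little composition below is a stand-in for an h-CPPN (it is not indirectly encoded, not evolved, and its activation functions and weights are arbitrary), but it shows what "value as generative of a distribution" looks like when you compose functions over a value space and normalize the result.

```python
import numpy as np

# Toy stand-in for a CPPN-style composition: a few simple activation
# functions composed over a 1-D "value space", then normalized so the
# output reads as a probability distribution over potential value.
def gaussian(x):
    return np.exp(-x ** 2)

def ramp(x):
    return np.maximum(0.0, x)

def toy_cppn(x, w1=2.0, w2=0.7, w3=1.3):
    hidden = np.sin(w1 * x) + gaussian(w2 * x - 1.0)
    return ramp(w3 * hidden)

value_space = np.linspace(-3.0, 3.0, 601)
dx = value_space[1] - value_space[0]

raw = toy_cppn(value_space)
distribution = raw / (raw.sum() * dx)          # normalize to a pdf

expected_value = float((value_space * distribution).sum() * dx)
print("expected potential value:", round(expected_value, 3))
```

Change the weights (the genome, in an evolutionary setting) and the whole distribution of potential value changes shape -- which is the point: value is the generator, not a point on someone else's curve.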

There seems to be an adjunction in the natural transformations between functions of value and the functions that generate revenue.  Identifying those natural transformations tells us something about the complex system coupling monetary flows and flows of value - dynamics similar to those seen in Lotka-Volterra systems.
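Here is the kind of minimal simulation I have in mind, with purely illustrative coefficients: treat the flow of value and the flow of money as a Lotka-Volterra-style coupled pair, and the lead/lag (hysteresis-like) behavior falls out on its own.

```python
# A hedged sketch: "flow of value" (v) and "flow of money" (m) as a
# Lotka-Volterra-style coupled pair, so money chases value with a lag.
# Coefficients are illustrative only, not calibrated to any market.
def step(v, m, dt=0.01, a=1.0, b=0.6, c=0.8, d=0.5):
    dv = ( a * v - b * v * m) * dt   # value grows, is "harvested" by money
    dm = (-c * m + d * v * m) * dt   # money decays unless it finds value
    return v + dv, m + dm

v, m = 2.0, 1.0
trace = []
for _ in range(5000):
    v, m = step(v, m)
    trace.append((v, m))

# Locate local maxima; in this setup the money peaks trail the value peaks.
peaks_v = [t for t in range(1, 4999)
           if trace[t][0] > trace[t - 1][0] and trace[t][0] > trace[t + 1][0]]
peaks_m = [t for t in range(1, 4999)
           if trace[t][1] > trace[t - 1][1] and trace[t][1] > trace[t + 1][1]]
print("value peaks at steps:", peaks_v[:3])
print("money peaks at steps:", peaks_m[:3])
```

The point of the sketch is only the phase relationship: money neither tracks value instantaneously nor ignores it; it cycles behind it.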

Just a thought, but fast becoming a simulation ....



Thursday, March 1, 2012

Networked computation

Everyone has heard the term "neurocomputation" - referring to an artificial brain made from artificial, electronic neurons.

I deal with a form of computation that may - or may not - have anything to do with the way a natural brain works. It is, simply, network computation. Calling such a thing an artificial "neural" network is a stretch. It is simply an artificial computational network, and that probably has nothing to do with thinking at all.

That doesn't mean that pattern recognition and other sophisticated functions are beyond the range of network computation; they aren't. In fact, very sophisticated physical, logical, and mathematical manipulations are possible. But every time I try to tell people about my work, they leap to a science-fiction-induced fantasy about artificial "brains".
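For instance, a tiny computational network with hand-set weights computes XOR -- a logical function, no "brain" required. (The weights and the threshold activation here are just one convenient choice for the sketch.)

```python
import numpy as np

# A small computational network with hand-set weights that computes XOR.
# Nothing "neural" about it beyond the wiring metaphor: it is just a
# composition of weighted sums and thresholds -- network computation.
def threshold(x):
    return (x > 0).astype(float)

W1 = np.array([[1.0, -1.0],
               [1.0, -1.0]])     # hidden units: OR-like and NAND-like
b1 = np.array([-0.5, 1.5])
W2 = np.array([1.0, 1.0])        # output unit: AND of the two hidden units
b2 = -1.5

def network(a, b):
    hidden = threshold(np.array([a, b]) @ W1 + b1)
    return int(threshold(hidden @ W2 + b2))

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, "xor", b, "=", network(a, b))
```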

Pardon me. Thinking is not coherently congruent with computation. There. I said it.

Friday, March 18, 2011

What is complex?

A couple of days ago, I was reading a document regarding a "complex system". The document's author was describing a "complicated" system, but hardly a complex one. It seems to me the casual term "complex" has become so broad as to be almost meaningless, which is increasingly unfortunate. Years ago, Dr. Seth Lloyd (no relation) used to collect definitions of the term - interesting, enlightening, but ultimately short of the goal. Still, S. Lloyd's collection does reveal patterns that become interesting.

I suggest we start with some characterizations of patterns of complexity. The first separates out "subjective complexity" - aspects of complexity caused by incompleteness, lack of understanding or information, uncertainty, and probabilistic behavior. This "simple" aspect of complexity is the focus of most books on complexity. Compare those aspects to "objective complexity" - aspects that emerge from three or more mutual, non-linear couplings (sometimes referred to as non-linear recurrence). Careful study shows that the subjective and objective aspects are themselves coupled into what becomes a foundation for potentially complex systems. Yet this, too, is incomplete.

A major contribution came from Steven Strogatz in his book "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering" (Perseus, 1994). The problem was the word "chaos", which Strogatz attributed to three or more mutually coupled, non-linear systems. Strogatz's chaotic systems are only complex and chaotic at some energetic or thermodynamic non-equilibrium.

Chaos becomes associated with the exchange of entropy, information, or energy at some "distance from equilibrium"(1) that acts upon the potential complex system. Contextually, chaos has some relationship with stability, in that systems at or near informational, entropic, and energetic equilibrium are not ultimately complex, regardless of how complicated they are (how many parts exist). For example, we may speak of a potential instability that is currently at equilibrium (an egg balanced on its end).
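The Lorenz system makes a convenient illustration (my choice of example, though it also appears in Strogatz): three mutually coupled, non-linear equations in which the parameter rho plays the role of distance from equilibrium. Below a threshold, nearby trajectories settle down together; well above it, they diverge.

```python
import numpy as np

# The Lorenz system: three mutually coupled, non-linear equations.  The
# parameter rho stands in for "distance from equilibrium" in this sketch.
def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return np.array([x + dx * dt, y + dy * dt, z + dz * dt])

def run(rho, x0, steps=25000, dt=0.002):
    state = np.array(x0, dtype=float)
    for _ in range(steps):
        state = lorenz_step(state, dt, rho=rho)
    return state

# Two copies started a hair apart: near equilibrium they end up together,
# far from equilibrium they diverge (sensitive dependence).
for rho in (5.0, 28.0):
    a = run(rho, [1.0, 1.0, 1.0])
    b = run(rho, [1.0, 1.0, 1.0 + 1e-6])
    print(f"rho = {rho:5.1f}  final separation: {np.linalg.norm(a - b):.3e}")
```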

When we couple the subjective incompleteness with the energy, entropy, and informational dynamics of three or more mutually coupled, non-linear objective systems, the potential complex system becomes a realized complex system - or simply a "complex system". The three-way coupling of these aspects implies that complex systems are potentially meta-complex.


(1) See the Brussels-Austin Group and pioneering work by Ilya Prigogine.

Thursday, July 15, 2010

Sensemaking of Complex Systems

Sensemaking seems very much related to pattern recognition - which obviously assumes you are cognizant of having seen that pattern before. Note that this is not saying, "I have seen this exact phenomenon before." For example, one might have seen collective swarming behavior in fish, in birds, in ants, even in people - there is a pattern to which we have given the name "swarm", characterized by some combination of synchrony, orientation (direction), attraction, bifurcation (or n-furcation), and dispersion.
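A minimal sketch of that swarm pattern, with illustrative weights and radii rather than anything calibrated to fish or birds: three local rules -- orientation, attraction, and dispersion -- acting on a population of agents, with "synchrony" measured as the polarization of their headings.

```python
import numpy as np

# Minimal "swarm" sketch: the same three local rules regardless of whether
# the agents are fish, birds, ants, or people.  Parameters are illustrative.
rng = np.random.default_rng(0)
N = 50
pos = rng.uniform(0.0, 10.0, size=(N, 2))
vel = rng.normal(0.0, 0.2, size=(N, 2))

def step(pos, vel, r=2.5, w_align=0.2, w_attract=0.05, w_disperse=0.02, dt=0.2):
    new_vel = vel.copy()
    for i in range(N):
        d = pos - pos[i]
        dist = np.linalg.norm(d, axis=1)
        near = (dist > 0) & (dist < r)
        if near.any():
            new_vel[i] += w_align * (vel[near].mean(axis=0) - vel[i])      # orientation
            new_vel[i] += w_attract * d[near].mean(axis=0)                 # attraction
            new_vel[i] -= w_disperse * (d[near] / dist[near, None] ** 2).mean(axis=0)  # dispersion
    return pos + new_vel * dt, new_vel

for _ in range(300):
    pos, vel = step(pos, vel)

# Polarization: 1.0 means all headings aligned, near 0 means disordered.
headings = vel / np.linalg.norm(vel, axis=1, keepdims=True)
print("polarization:", round(float(np.linalg.norm(headings.mean(axis=0))), 3))
```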

How people go about the business of sensemaking is often quite different from the way artificial intelligence goes about sensemaking. By this I mean that humans often have much more sensory and contextual information upon which to base their classifications. This is different from raw information, such as that stored in computer memory, in that human contextual information is encoded in highly coupled networks - the real neural network. Computer memory is discrete, rank and file - its substrate effectively independent and identically distributed (iid).

How would an artificial neural network go about sensemaking regarding the swarming behavioral pattern? The "sense" would have to be made within some of many other contextual patterns. One such contextual pattern is the "shape" of a connective Compositional Pattern Producing Network, as found in HyperNEAT. However, that is just one context, and we need a network of contexts for sensemaking. Moreover, these contexts exist at various timescales, from nearly instantaneous to universally constant.

Some equate thought with computation. I'm not sure I agree. There is a composition between networked computation and linear computation (serial and/or parallel) that seems necessary for categorical sensemaking. And, of course, just because something makes sense doesn't mean it is true or the right thing to do. That takes some interstitial experimentation - or meta-computing - compared against some real-world data.

So, I will go back to my Chinese Room and continue working on that categorical composition.

Saturday, March 27, 2010

Complexity vs. Chaos

Complexity and chaos seem to be different phenomena, and I would like to explore the difference in this post. First, I admit to a Brussels-Austin perspective on thermodynamics. The measure of chaos seems to be a distance from thermodynamic equilibrium. In open systems, this is the distance - in degrees - above absolute zero. In closed systems, it is the thermodynamic difference (in heat and other kinetic motion) between the system elements.

Compare this to objective complexity, which is defined simply by three (or more) degrees of mutual, non-linear coupling between system elements. This three-body problem defines the simplest form of complexity.

We can compare the complexity/chaos pair of problems by considering the three-body problem at thermodynamic equilibrium: nothing is moving, no mutual orbits. The coupling mechanism may be underdetermined (subjective complexity), but it may be treated as a thermodynamic perturbation (even though the perturbation has nothing to do with the heat component). As the thermodynamic drive increases, motion emerges in the three-body system. The complexity within the system moves from "potential" complexity to "actual" complexity - a form of realization.
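A sketch of that thought experiment, under assumptions of my own choosing (equal masses, G = 1, and the rotating "Lagrange" equilateral configuration as the starting point): the exact configuration just circles -- everything in balance -- but the balance is unstable. Nudge one body, and the tangled, realized dynamics of the three-body problem take over; the nudge plays the role of the perturbation above.

```python
import numpy as np

# Three equal masses (Newtonian gravity, G = 1) in the rotating equilateral
# "Lagrange" configuration, an exact but unstable solution.  Compare the
# unperturbed run with one in which a single body is nudged slightly.
def accelerations(pos):
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += r / (np.linalg.norm(r) ** 3 + 1e-6)  # tiny softening
    return acc

def simulate(nudge, steps=30000, dt=0.001):
    s3 = np.sqrt(3.0)
    pos = np.array([[0.0, 1.0], [-s3 / 2, -0.5], [s3 / 2, -0.5]])   # unit triangle
    w = 3.0 ** -0.25                      # angular speed of the exact rotating solution
    vel = w * np.array([[-1.0, 0.0], [0.5, -s3 / 2], [0.5, s3 / 2]])
    pos[0, 0] += nudge                    # the perturbation
    history = []
    for _ in range(steps):
        vel = vel + accelerations(pos) * dt   # semi-implicit Euler
        pos = pos + vel * dt
        history.append(pos.copy())
    return np.array(history)

exact = simulate(0.0)
nudged = simulate(1e-3)
for t in (5000, 15000, 29999):
    gap = np.linalg.norm(exact[t] - nudged[t])
    print(f"after {t * 0.001:5.1f} time units, separation = {gap:.4f}")
```

The growing separation between the two runs is the "realization": the coupling was there all along, but only the perturbation turns potential complexity into actual complexity.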

Of course there is subjective complexity, referring to the uncertainty, stochastic nature or lack of knowledge in systems - but this happens in simple systems as well as complex systems.

The term complexity is often used indiscriminately to describe both complexity and the coupling between complexity and chaos. If it were up to me, I would create a different word for the complex/chaos coupled system - something like "chomplexity" - but I hate neologisms, so I merely add a footnote to distinguish the two.

It makes sense that the Inuits have many words for snow. We have overloaded our one word, complexity, almost to the breaking point. Maybe it's time to rethink our lexicon.