
 

Thursday, December 22, 2005

 

 The BCNGroup Beadgames

 

 

Challenge Problem →

 

 

Lattice of ontologies

 

Function/structure descriptions

 

 

 

Dick Ballard wrote:

 

"Context" is indeed an artifact of natural language based reading, writing, and listening. But is very badly compromised by confusing two (or more) issues: (1) sub-language domain-based, noun phrase disambiguation and normalization versus (2) piece-wise topic specialization by context narrowing,

 

Paul Prueitt wrote:

 

"Relationship" starts off as binary co-occurrence of terms or phrases, or of patterns of gene expression. Then various techniques are applied to produce n-aries.

 

Judith writes:

 

I think maybe there is a danger here in making this subject overly technical. One thing I see developing in the discussion Paul cited (at http://www.ontologystream.com/beads/nationalDebate/319.htm) is a tendency to narrow down what these words are allowed to mean, in an attempt to solidify a definition. I think that oversimplification of this type is almost never a good idea. For example, the word "context" is not an artifact of natural-language-based activities, except perhaps under a very specialized set of circumstances. Granted, Dr. Ballard may have been speaking about precisely those circumstances, and it is not possible for me to know that from this thread.

 

However, I would still argue that the word "context" was created IN natural language for a compelling reason: it was needed to refer to an aspect of reality that is ubiquitous and that people therefore must be able to communicate about.

 

These two ideas (relationship and context) are, of course, related. However, they describe different aspects of reality (or "causality," if you prefer). The context in which some relation exists will specify the nature of the relation and shape all relational effects; in a sense, the context becomes part of those effects. Einstein showed that the relation of observer to system becomes part of any measurement the observer makes of the system.

 

It goes far deeper than that, but that proof is extremely valuable because it points out one of the main features of the universe: the causal potency of relational aspects in general (the specifics of which, in any given situation, become the "context").

 

The concepts of relation and context also carry differing contextual information of their own, as words, based on how they relate to what they describe. In human interaction, for example, the word "relationship" can have multiple meanings, many of which may apply simultaneously in any given situation. The situational context will specify the meaning or meanings, and the definitions therefore have to be kept elastic enough to morph as needed. This also points out how the semantic content of language defies restrictive formalization.
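A toy sketch may make the point about elastic definitions concrete. This is my illustration, not anything proposed in the thread; the sense labels and cue words are invented assumptions. A fixed one-sense dictionary would force a single meaning, while even a crude context key shows the same word resolving differently:

    # Toy word-sense resolution: context selects among candidate meanings.
    senses = {
        "relationship": {
            "database": "an association between tables or entities",
            "family": "a kinship or personal bond between people",
            "math": "a relation, i.e. a set of ordered tuples",
        }
    }

    def resolve(word, context_words):
        """Pick the sense whose cue word appears in the surrounding words."""
        for cue, meaning in senses.get(word, {}).items():
            if cue in context_words:
                return meaning
        return "ambiguous without context"

    print(resolve("relationship", ["schema", "database", "table"]))
    print(resolve("relationship", ["mother", "family"]))
    print(resolve("relationship", []))  # no context: no determinate meaning

Even this crude mechanism shows that the meaning is not in the word alone; strip the context away and the last call can only report ambiguity.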

 

I think this defines one of the most pressing problems having to do with computation: computation is now synonymous with digitization (i.e., rendering in zeros and ones). By its very nature, then, computation relies on converting all semantic elements to syntax. This is only possible in a very limited way. In fact, the complete conversion of semantic aspects to syntax can only be accomplished for systems that don't rely on contextual elements to a significant degree: machines and all other simple systems.

 

Where complex systems are concerned, however, such limitations amount to a requirement for radical oversimplification, which causes the digitized version to diverge from the original system's behavior; the divergence typically increases as a function of time. Think about computer modeling and forecasting of weather systems: the forecast based on the computer model may be relatively accurate in the short term but is almost always inaccurate once the projected span exceeds 48 hours.
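The time-dependent divergence can be demonstrated with a toy sketch (mine, not the weather models discussed here): in the chaotic logistic map, two initial states differing by a single rounding error of 1e-10 separate exponentially, so a short-range "forecast" holds and a long-range one does not. The parameter and step counts below are illustrative choices.

    # Two copies of a chaotic system, differing only by a rounding error.
    r = 3.9                           # logistic-map parameter, chaotic regime
    x_true = 0.5
    x_model = 0.5 + 1e-10             # the "digitized" copy

    for step in range(1, 61):
        x_true = r * x_true * (1 - x_true)
        x_model = r * x_model * (1 - x_model)
        if step % 10 == 0:
            print(f"step {step:2d}: divergence = {abs(x_true - x_model):.3e}")

    # Early steps agree to many digits; by around step 50 the two trajectories
    # are uncorrelated, mirroring forecasts that fail beyond a fixed horizon.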

 

In order to usefully model complex systems with digital computers, it is necessary to constantly generate new models that follow the natural system over time. Robert Rosen compared this process to mapping a sphere (like planet Earth) using flat planes. It can be done, and very usefully, but it is important to know how the flattening changes the nature of what is represented, relative to the original system.

 

Where maps are concerned, we know that using a single flat plane deforms the geometry of the original the most, as the plane extends out in all directions from the original "point of contact." The least deforming way to map in this situation is to use a great many small planes, changing their orientation each time to follow the natural curve of the sphere.
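A back-of-the-envelope sketch (my numbers, not Rosen's) puts figures on the analogy. Under a gnomonic projection onto a tangent plane, a point at angular separation theta from the point of contact sits at planar distance R*tan(theta), where the sphere's true great-circle distance is R*theta; one big plane distorts badly at its edges, while many small re-oriented planes keep every point near a point of tangency.

    import math

    R = 6371.0  # Earth's radius in km (it cancels in the relative error)

    def distortion(theta):
        """Relative error of the flat-plane distance at angular separation theta."""
        return (R * math.tan(theta) - R * theta) / (R * theta)

    # One plane reaching 60 degrees outward vs. many planes each reaching 5.
    for deg in (5, 15, 30, 60):
        theta = math.radians(deg)
        print(f"patch reaching {deg:2d} deg: distance error = {distortion(theta):6.2%}")

    # The error grows from about 0.3% at 5 degrees to about 65% at 60 degrees:
    # shrinking the patches, and re-orienting each plane, keeps the map faithful.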

 

Natural human languages are complex systems. Trying to make language "fit" inside computational limitations for all definitions is a mistake, because doing so will ultimately destroy the language. What is required is more akin to making computation fit the requirements of language, it seems to me.

 

Judith Rosen