
 

Tuesday, December 13, 2005

 

 The BCNGroup Beadgames

 

 

Challenge Problem  →

 

Proposal:  To identify a common, widely used, and available concept ontology to be used with UDEF (Universal Data Element Framework) and other finite coding systems underneath.  (See [298].)

 

Objective: a core-hub-common “concept ontology” plus basic structural data interoperability

 

BCNGroup December 12, 2005
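The hub-and-spoke arrangement behind this proposal can be sketched in a few lines of Python (a minimal sketch; the concept codes and field names below are invented for illustration and are not actual UDEF identifiers): each local schema maps its own field names to a shared concept code, so any two systems translate through the common hub rather than maintaining pairwise mappings.

```python
# Sketch of core-hub data interoperability: each local vocabulary maps its
# field names to a shared concept code, and translation between any two
# systems goes through the hub.  Codes and names here are hypothetical,
# not real UDEF identifiers.

HUB = {
    # local field name -> shared concept code
    "systemA": {"cust_nm": "c.1", "inv_amt": "c.2"},
    "systemB": {"customerName": "c.1", "invoiceTotal": "c.2"},
}

def translate(record, src, dst):
    """Translate a record from one local vocabulary to another via the hub."""
    to_code = HUB[src]
    from_code = {code: name for name, code in HUB[dst].items()}
    return {from_code[to_code[k]]: v for k, v in record.items()}

print(translate({"cust_nm": "Acme", "inv_amt": 100}, "systemA", "systemB"))
# {'customerName': 'Acme', 'invoiceTotal': 100}
```

With N systems, the hub requires N mappings instead of the N(N-1)/2 pairwise mappings that direct translation would need; that economy is the structural-interoperability part of the proposal.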

 

 

In a previous discussion, we talked a bit about Rosen’s referential entailment.  For references, see Stephen Kercel, “Does Incomputable Mean Not Engineerable?”, and the Robert Rosen site:  { URL }

 

The way I portray the notion of entailment relates to those things that exist structurally in the world at the time of an emergence.  So, to use previous language, structural entailment is “situated” in a present moment (i.e. it is not abstract), and IS the pragmatic axis.  It is those things that actually impinge, in reality, on the situation being modeled.  In this sense, this definition of the pragmatic axis shares a property with consciousness: it ONLY exists in the present moment.  Because structural entailment only exists in the present moment (it is the present moment), any abstraction about real-time structural entailment needs a very careful recognition mechanism, in particular “perceptional measurement”. 

 

The paragraph above should communicate clearly the difference between structural (or physical) entailment and logical entailment.  A large literature exists on this difference. 

 

If the “emergence” is a negotiated web service, using a UDDI registry for example, and the web service requires only an alignment of data structure, then we have one class of web services: those achieved purely by a few computational steps.  If the web service requires a level of understanding about the purpose of the service request, using a process ontology for example, then one has a second class of web services.

 

A first-class web service is one that requires only the information necessary to achieve data interoperability.

 

A second-class web service is one that requires an anticipation of the use of the data, where more than one possible use exists.  The second-class web service, so defined, has degeneracy as defined in the scientific works of Gerald Edelman.  Also see: “On human memory”.
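The distinction between the two classes can be sketched in code (a toy illustration; the request fields and purposes are hypothetical): a first-class service applies a single fixed structural mapping, while a second-class service must select among several possible uses of the same data, and that one-to-many structure is its degeneracy.

```python
# First-class service: purely structural alignment, a fixed deterministic
# mapping from the request's data structure to the provider's.
def first_class_service(request):
    return {"total": request["amount"], "currency": request["ccy"]}

# Second-class service: the same data admits more than one use, so the
# service must anticipate purpose.  The variant handlers below stand in
# for that degeneracy (the purposes are invented for illustration).
HANDLERS = {
    "billing":  lambda r: {"invoice_total": r["amount"]},
    "auditing": lambda r: {"flagged": r["amount"] > 1000},
}

def second_class_service(request, purpose):
    # Selection over a population of variant handlers, not a single mapping.
    return HANDLERS[purpose](request)
```

The same request yields different results depending on the anticipated use, which is exactly what a purely structural mapping cannot express.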

 

To quote from the second paragraph of the first of the links above (published in 2005, “Biochemistry and the Sciences of Recognition”):

 

“There is a theme that weaves through these layers < … of organization of biochemical processes … >, which in retrospect I see has shaped my interest.  It is the Darwinian approach of population thinking, based on the notion that species, categories, and even molecular interactions in living systems arise by selection acting over time on populations consisting of large numbers of variants.  The idea that variation is not noise but is rather the substrate for the emergence of biological form and function provides an underlying theme that is central to and defining of what I have called the sciences of the recognition.  These include evolution itself, embryology (particularly morphogenesis), immunology, and the neurobiology of complex brains.  In all of these arenas, recognition at molecular, cellular, and organismal levels occurs through selective processes.  In each case, the substrate is biochemical although the higher order rules are governed by variation and selection.”

 

 

We make bold the word “category” because it is precisely here that we have the natural theory of category formation that the Greeks could not have come up with, because they did not have the details of biochemistry that we have today.  Certainly the Greeks could use introspection to see and say something about category formation as part of the thought processes.  But they would not have been able to see the biochemical details, and it is in these details that the importance of response degeneracy (real, but structured, indeterminacy) is found. 
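Edelman's point, that variation is the substrate and selection the mechanism, can be caricatured in a few lines (a deliberately crude sketch; the variants and the scoring function are invented for illustration): many structurally different variants can all perform the recognition, and which ones survive is decided by selection against the environment, not by a prior logical specification.

```python
import random

# A population of structurally different recognizers (variants) that may
# nonetheless perform the same function: Edelman's degeneracy.
# The variants and the scoring below are invented purely for illustration.
random.seed(0)
population = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(20)]

def fitness(variant, stimulus):
    # How well this variant "recognizes" the stimulus (toy inner product).
    return sum(v * s for v, s in zip(variant, stimulus))

def select(population, stimulus, keep=5):
    # Selection acting on the population: the best recognizers survive.
    return sorted(population, key=lambda v: fitness(v, stimulus), reverse=True)[:keep]

survivors = select(population, stimulus=[1, 0, -1, 0])
# Several structurally distinct variants remain: degenerate solutions
# to the same recognition problem.
```

Nothing in the sketch specifies in advance which variant is "correct"; the category is carved out by repeated selection, which is the sense in which category formation here is biological rather than logical.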

 

We also note that if OASIS, or IEEE, or W3C had a formalism that allowed one to model well the substrate of biological form and function, then we would have all that is necessary for an Internet-based, ontology-mediated environment for human collaboration in any area of endeavor. 

 

This environment cannot be the Semantic Web that the W3C has been developing, because first-order logic is not such a formalism.  Sowa’s lattice of theories [300] simply fails, as he recognizes, to model function-structure degeneracy.  Or can the Tarski/Sowa lattice of theories in fact be used to model degeneracy?  Well, that is an open question.  Certainly we need not address this question while we are providing universal data interoperability.  I think this is the point that John is making with his recent communications about a Unified Framework. 

 

Barry Smith, a member of the ONTAC working group, produced an ontology called Basic Formal Ontology (BFO), which separates static and dynamic realities.  But I do not feel that the BFO takes into account selectionism, recognition (in the sense that Edelman is talking about), or degeneracy (which I have talked about in this forum with no response from anyone).  I may be wrong, but so far the issue has not been recognized in the ONTAC discussion.  But having this conversation is really important IF the semantic web / anticipatory web of information on the Internet is to serve humanity as well as is possible.

 

So again, the BCNGroup proposition is that until the W3C, OASIS, IEEE, and ISO standards all recognize the need to separate concepts from logic, we will not be able to see the two classes of web services within a universally available and stable set of tools.

 

Once we see the first class (structural interoperability, as best seen in the Core System presentation that was part of the SafeNet discussion in 2004), then we have two things:

 

1)  real data interoperability that is beyond the ability of competitive proprietary forces to ruin

2)  an opening into “selectionist” programs that use degeneracy in web service definition to express issues related to roles, objectives, intents, and similar marked-up human behavior. 

 

Edelman is one of the most profound of all biochemists, as is recognized by his Nobel Prize.  But more importantly, he has uniquely defined certain process models that can be incorporated into “stratified ontological models,” as are recommended for immediate development in the BCNGroup RoadMap.