
 

Monday, November 21, 2005

 

 The BCNGroup Beadgames


 

 

 

 

Discussion at ONTAC forum

ONTAC stands for Ontology and Taxonomy Coordinating Working Group

It is a working group of

Semantic Interoperability Community of Practice (SICoP)

 

Communication to the working group from Paul Prueitt

 

John Sowa's elaboration of the problem of N-squared communication paths touches on each of the issues that one might feel should be addressed; see also Ken Ewell's remarks at → [233]

 

(1) information flow mapping,

(2) evolution of best global solutions,

(3) relationship [1] of vocabulary / logic + data structure to external / internal processes.

 

 

****

 

(not posted to general ONTAC discussion)

 

 

 

1) The measurement of information flow is available through open market sampling. 

1.1) This sampling is occurring informally; soon, one imagines, it will be public and formal.

 

1.1.1) The sampling will precisely define only certain parts of internal entity computations.

1.1.2) If (when) human language exchange is also "harvested," the overall ontology of interaction will become more transparent.

1.1.3) The Roadmap that I developed for US Customs was designed to map all commodity transactions across national borders.  Again, the notion is that internal processes are important to this mapping activity only if the process is immediately connected to a bill of lading (which was the "data structure" to be monitored).

1.1.4) The resulting "information flow" reveals the commodity flow, and thus would be available to our government, and to corporations, for analysis.  The analysis might at first seem sinister to some people, but it is actually a step toward transparent B-2-B transaction modeling that will (clearly) produce a decrease in costs for end-product commodities such as cars.  (A small sketch of this flow mapping follows point 1.1.5.)

1.1.5) Threat and vulnerability analysis, with ontology mediation, is then possible using other measured elements (such as the conversations of human analysts), along with the mapping of the commodity flow.
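
To make the mapping concrete, the following toy sketch (in Python; the record fields and entity names are hypothetical and are not the actual Customs data structures) shows how bill-of-lading records might be aggregated into a commodity-flow map while leaving the internal processes of each entity invisible.

# Toy sketch: aggregating cross-border commodity flow from bill-of-lading
# records.  The field names (shipper, consignee, commodity_code, quantity)
# are hypothetical illustrations, not the actual US Customs data structure.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class BillOfLading:
    shipper: str          # exporting entity
    consignee: str        # importing entity
    commodity_code: str   # e.g., an HS-style commodity classification
    quantity: float       # units shipped

def commodity_flow(bills):
    """Sum quantities along each (shipper, consignee, commodity) edge.

    Only the transaction surface is observed; the internal processes of
    either entity never appear in the resulting map.
    """
    flow = defaultdict(float)
    for b in bills:
        flow[(b.shipper, b.consignee, b.commodity_code)] += b.quantity
    return dict(flow)

if __name__ == "__main__":
    sample = [
        BillOfLading("FactoryA", "RetailerB", "8703", 120.0),  # cars
        BillOfLading("FactoryA", "RetailerB", "8703", 80.0),
        BillOfLading("FactoryC", "RetailerB", "8708", 500.0),  # car parts
    ]
    print(commodity_flow(sample))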

 

1.2) Information flow can be considered (so that there is language to talk about this) as the exophysics of a complex system of systems.  The internal machinery would then be producing an endophysics which is "encapsulated" by the entity.  This language is consistent with work done at the Santa Fe Institute and Los Alamos and in the "complexity theory" community.  The complexity language seems natural to use here, to clearly make the distinction between internal processes and processes that are open and public. 
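
A minimal illustration of this distinction as I am using it (the class and method names below are my own, purely for exposition): the entity's internal machinery is encapsulated and private, and an open-market sampler can measure only the transactions the entity emits.

# Illustrative sketch of the endophysics / exophysics distinction.
# Names are invented for exposition only.
class Entity:
    def __init__(self, name):
        self.name = name
        self._internal_state = {}     # endophysics: encapsulated, never public

    def _internal_update(self, key, value):
        # Internal machinery; invisible to any external observer.
        self._internal_state[key] = value

    def transact(self, other, payload):
        # Exophysics: the only thing an open-market sampler can measure.
        self._internal_update("last_partner", other.name)
        return {"from": self.name, "to": other.name, "payload": payload}

# The sampler sees only the transaction records, i.e., the exophysics of
# the system of systems; the endophysics stays hidden inside each entity.
a, b = Entity("A"), Entity("B")
observed = [a.transact(b, "order:42"), b.transact(a, "invoice:42")]
print(observed)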

 

2) The evolution of best global solutions depends on an openness of the system to perturbation of underlying processes and causes.  Edelman (to mention this point about degeneracy again) and others suggest that Darwin's own notions included a requirement that function/structure mechanisms have many-to-many mappings.

2.1:  It should be possible to do the same thing in multiple ways.  Otherwise the evolution will get locked into a local basin of attraction that random perturbations of underlying causes and processes will not force the "emerging" system to climb out of.  (A toy example follows point 2.3.)

2.2:  Monopoly is the practical consequence of not having "response degeneracy".

2.3:  Which monopoly occurs "first" is determined accidentally, since full knowledge of all processes has never been available to intentional decision making.   Further discussion at → [231]
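
A toy illustration of the degeneracy point (my own construction, not drawn from Edelman or any cited work): a system that can change in only one small way stays trapped in a local basin of attraction, while a system with several alternative ways of responding can climb out under random perturbation.

# Toy illustration of "response degeneracy" and local basins of attraction.
# The fitness landscape and move operators are invented for exposition.
import random

def fitness(x):
    # Two basins: a shallow optimum near x = 1, a deeper one near x = 6.
    return -((x - 1) ** 2) if x < 3.5 else 10 - (x - 6) ** 2

def evolve(moves, start=1.0, steps=2000, seed=0):
    rng = random.Random(seed)
    x = start
    for _ in range(steps):
        candidate = rng.choice(moves)(x, rng)
        if fitness(candidate) >= fitness(x):   # accept non-worsening changes
            x = candidate
    return round(x, 2), round(fitness(x), 2)

small_step = lambda x, rng: x + rng.uniform(-0.1, 0.1)   # one way to change
big_jump   = lambda x, rng: x + rng.uniform(-5.0, 5.0)   # an alternative way

print("single move type   :", evolve([small_step]))            # stays near x = 1
print("degenerate move set:", evolve([small_step, big_jump]))  # reaches x near 6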

 

3) Data formats can be optimized for regular transactions between entities doing business with each other.  Here it is nice to have web services negotiate the form, the bits and bytes, that data will take in transit between entities.  Clearly this is a problem that is already partially solved.

 

3.1: Ballard, for example, has technical work (implemented in a program) that evolves to find a minimal data footprint and then to link the structure of that footprint to semantics.  (This is in theory, and has not been shown - to me - to work as yet.)

3.2: Klausner's work requires a community-negotiated minimal data footprint.  This footprint is linked to an iconic behavioral language called Cubicon.  The community negotiation involves data exchanges and behavioral design by master designers.  The Cubicon approach seems more feasible than the Knowledge Foundations approach (Ballard's company), but here there has to be a "social" system in place that helps the negotiation.

3.3: But with Klausner's work, as with Ballard's, the structure of the data exchanges is only "suggested" to be optimal.

3.4: In (Ballard’s) theory, optimality “MUST” reflect the meaning as determined by external forces.

3.5:  As in a genetic program, the optimal data structure footprint (i.e., in bit size and in format) evolves to imply the actual "semantics" in the limited transaction space (the exophysics governing all entity-to-entity transactions); a sketch follows.
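
As a hedged sketch of this idea (a toy of my own construction, not Ballard's or Klausner's actual method), the following genetic-style search drops fields from a transaction record until it reaches a minimal footprint that still distinguishes every transaction seen in a limited transaction space.

# Toy genetic-style search for a minimal "data footprint": the smallest
# subset of fields that still keeps all sample transactions distinct.
# Field names and sample transactions are hypothetical.
import random

FIELDS = ["buyer", "seller", "commodity", "price", "currency", "timestamp"]

TRANSACTIONS = [
    {"buyer": "B1", "seller": "S1", "commodity": "steel", "price": 10,
     "currency": "USD", "timestamp": 1},
    {"buyer": "B1", "seller": "S2", "commodity": "steel", "price": 12,
     "currency": "USD", "timestamp": 2},
    {"buyer": "B2", "seller": "S1", "commodity": "wheat", "price": 5,
     "currency": "EUR", "timestamp": 3},
]

def distinguishes(subset):
    """True if the field subset keeps every sample transaction distinct."""
    keys = {tuple(t[f] for f in subset) for t in TRANSACTIONS}
    return len(keys) == len(TRANSACTIONS)

def fitness(subset):
    # Reward small footprints, but only if the "semantics" (distinctness)
    # of the limited transaction space survives.
    return -len(subset) if subset and distinguishes(subset) else -len(FIELDS) - 1

def evolve(generations=200, seed=0):
    rng = random.Random(seed)
    best = list(FIELDS)                                   # full footprint
    for _ in range(generations):
        child = [f for f in best if rng.random() > 0.3]   # mutate: drop fields
        if fitness(child) >= fitness(best):
            best = child
    return best

print("minimal footprint:", evolve())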

 

(Sorry that the language is long and complex.)

 

 

I hope I have merely restated the points you have made, John.   

 

 



[1] The relationship is stratified