
 

Thursday, December 08, 2005

 

 The BCNGroup Beadgames Index

 

 

 

Emergency Medical Ontology Project,

 

To be Proposed

 

 

 

Communication from Dr Paul Prueitt to Dr Richard Ballard, Knowledge Foundations Inc

 

Dick,

 

Let us take this discussion [2] one step forward, assuming that the monopoly that you have been seeking (for decades) materializes.

 

First, the reality: the societies of the world do not yet have anything like the functionality of a foundational core of all possible concept structures... and this is what you are looking for.

 

Your theory of information extends Shannon’s theory in a specific fashion, and ties together results from fundamental physics (of the most advanced nature) to create underlying computable structure (as seen in Mark 1 and Mark 2, and of which we will see more in Mark 3).

 

The concept structures themselves are, as you say, written into a compressed data structure but expressed as n-aries, < r, a(1), a(2), . . . , a(n) >, so there is a universal "standard" for the basic "concept".  I should say the basic "piece" of information rather than "concept", but perhaps we can all agree on my definition of ontology as being a set of concepts.  As sets of concepts become empirically observed, we will find the interface in the Islamic community social discussion where recruitment to terrorism is occurring.  We will also see the recruitment patterns being expressed by other fundamentalist groups.  We will see the response patterns as a society of tens of millions of people wakes up to the huge struggle that the people along the Gulf Coast are dealing with.  We may see the onset of new flu contagion as medical centers and insurance centers communicate knowledge of patient cases.
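
For concreteness only, here is a minimal Python sketch of how one such n-ary unit might be recorded.  Every name in it (NAry, relation, arguments, and the medical example values) is my own placeholder, not a structure taken from Mark 1 or Mark 2.

    from dataclasses import dataclass
    from typing import Tuple

    # One n-ary unit < r, a(1), a(2), ..., a(n) >: a relation symbol r
    # together with an ordered tuple of arguments.  Placeholder names only.
    @dataclass(frozen=True)
    class NAry:
        relation: str               # the r in < r, a(1), ..., a(n) >
        arguments: Tuple[str, ...]  # the ordered arguments a(1) ... a(n)

    # A single basic "piece" of information (hypothetical medical example).
    unit = NAry("reports_symptom", ("clinic_14", "patient_203", "fever"))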

 

All of this must, in my opinion, leave the profound misperceptions of logic and artificial intelligence far behind.  Knowledge representation becomes a form of natural language, having immediate and transformational transparency into real-time natural reality.  Truth becomes less of an opinion and more of a realization.

 

Inference is redefined to be question-answer paths along lines of least resistance, and this knowledge traversal maps perfectly to the Lagrangian concept of particle motion in free space.  Thus the deep tie-in to Paul Werbos’s discussions, and yes, even Paul’s unique way of dealing with Bell paradoxes by invoking at least the concept of restricted backward flow of information and perhaps some types of pure energy.  It is understood that there is a need to address not only the Bell inequality (which has to be considered solid science) but also other related paradoxes concerning emergence.
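
To spell out the analogy I am drawing (this is just the textbook statement of the principle of least action, offered as the analogy and not as anything taken from Mark 2):

    S[q] = \int_{t_1}^{t_2} L(q, \dot{q}) \, dt , \qquad \delta S = 0 .

In the same spirit, a question-answer traversal would be the path through the concept space that extremizes a comparable "resistance" functional; which functional that is remains, of course, to be worked out.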

 

So we both see the possible set of all concepts being enumerated within the computable structures at Knowledge Foundations.  This is the goal.  This goal will be achieved, and soon.

 

Then I will say that these concepts are always aggregate-able from the 2-aries { < a, r, b > } that are the basis for my Orb technology.  I agree that these units do not hold semantics, only the parts that get expressed as n-aries.  I do not need to get sidetracked here on 2-aries versus n-aries; we (Ballard and I) have discussed this many times.  I just want to say that the general framework theory that I have been developing (generalizing the Sowa, Ballard and Zachman frameworks) needs 2-aries at the atomic (primitive) level, and that these are intermediate results.  The intermediate results are found aggregated as event chemistry.  (I have written about this extensively.)
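
As a sketch only, assuming the simplest possible aggregation rule (group triples that share a relation and a leading argument), the step from 2-aries to n-aries might look like the Python below.  The rule, the helper name aggregate, and the example values are mine, not the Orb or Mark 2 machinery.

    from collections import defaultdict

    def aggregate(triples):
        """Group 2-aries < a, r, b > sharing (a, r) into n-aries < r, a, b1, b2, ... >."""
        grouped = defaultdict(list)
        for a, r, b in triples:
            grouped[(r, a)].append(b)
        return [(r, a, *bs) for (r, a), bs in grouped.items()]

    triples = [("case_7", "symptom", "fever"),
               ("case_7", "symptom", "cough"),
               ("case_7", "location", "gulf_coast")]
    print(aggregate(triples))
    # [('symptom', 'case_7', 'fever', 'cough'), ('location', 'case_7', 'gulf_coast')]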

 

John (Sowa) and I and others would say that some type of transformation operators over the n-ary sets will provide concept searching as well as concept extraction, AND a means to propagate that outcome between members of a community of practice.
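
For concreteness only, one trivially simple such operator is a pattern match over a store of n-aries.  The wildcard convention (None matches anything) and the names match and search are my placeholders, not the operators John or Dick have formalized.

    def match(nary, pattern):
        """True if the n-ary fits the pattern; None positions match anything."""
        return (len(nary) == len(pattern)
                and all(p is None or p == v for p, v in zip(pattern, nary)))

    def search(naries, pattern):
        return [n for n in naries if match(n, pattern)]

    store = [("symptom", "case_7", "fever", "cough"),
             ("location", "case_7", "gulf_coast")]
    print(search(store, ("symptom", None, None, None)))
    # [('symptom', 'case_7', 'fever', 'cough')]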

 

Given that Knowledge Foundations holds this monopoly, will the propagation of knowledge bits between humans allow the humans to make novel and unexpected alterations in the underlying "knowledge space"?  I remind you that natural language MIGHT allow such modifications.

 

Well (grinning), then we have Tom Adi’s work.  In this work, and in work that is similar (around the world and very far from the mainstream), we have the notion that the human consciousness is mostly merely a processor that is processing very stable atomic (substructural) forms in ways that are also quite stable.  So the speech we generate, and the way we generate this speech, is NOT AN INVENTION of ourselves at the time we utter the speech.

 

Even with this deterministic viewpoint, there is the (response) degeneracy that Gerald Edelman talks about in any emergence of an aggregation of substructural elements into a form that is driven by environmental needs.  This is the function/structure issue.  If this issue is addressed without stratification theory then, in my opinion, it is addressed wrongly.  Stratification allows one to see how the actual universe works, and why (for example) the number pi, or Planck’s constant, is an emergent feature of this universe (I smile at Paul Werbos).

 

Well, this is how Tom Adi and Richard Ballard’s work might fit together.

 

Comments?

 

Paul Prueitt

 

 

 

