
On the separation of syntax and semantics and the issue of (real time) pragmatics

 

 

Tuesday, August 17, 2004

 

The BCNGroup Beadgames

 

 

 

Communicated to The Institute for Defense Analyses and The Arlington Institute

 

On the issue of inhibition by Incumbent Group Think

 

 

 

We have to keep working these issues until there is some resolution that allows a testing of the hypothesis that PriMentia technology brings the optimal data encoding for ontological structure, which has the form of a set { < a, r, b > } where a and b are "locations" and r is a relational variable.  Scientists like Ballard, Gudwin, Rao, Krieg, Werbos, Kohout, Mallery, Long, Szu, McGuinness, Aha, Lund, Goertzel, Citkin, and Sowa are certainly available to test the hypothesis.
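
To make the form of the hypothesis concrete, here is a minimal sketch, in Python, of an ontological structure held as a set of < a, r, b > triples; the node and relation names are invented for illustration and are not PriMentia data.

```python
# A set of ordered triples <a, r, b>: a and b are "locations" (plain string
# labels here) and r is a relational variable.  Names are illustrative only.
from typing import Set, Tuple

Triple = Tuple[str, str, str]  # (a, r, b)

ontology: Set[Triple] = {
    ("report_17", "mentions", "location_42"),
    ("location_42", "near", "location_7"),
    ("report_17", "authored_by", "analyst_3"),
}

def related(structure: Set[Triple], a: str, r: str) -> Set[str]:
    """Return every b such that <a, r, b> is in the structure."""
    return {b for (x, rel, b) in structure if x == a and rel == r}

print(related(ontology, "report_17", "mentions"))  # {'location_42'}
```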

 

As stated in recent postings [71] in the bead game, the PriMentia encoding was granted a patent in 2003 on something that seems too easy to have been missed: regarding an ASCII string as a base-n number.  The way the PTO works, the "innovation" must be a surprise.  I claim that most do not get the surprise at first.  Then, when the surprise does occur, there is great resistance to using it in the context of inference, due to the cultural bias that has developed around the issue of induction [73].
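
For readers who want to see the arithmetic of the surprise, the sketch below regards an ASCII string as a base-128 number, one code point per digit.  This is my own illustration of the general idea, not the patented PriMentia encoding itself; the choice of base 128 is an assumption.

```python
# Regard an ASCII string as a base-n number (here n = 128), most significant
# character first.  Illustrative sketch, not the patented encoding.

def string_to_int(s: str, base: int = 128) -> int:
    """Fold each character's code point in as one digit of a base-`base` integer."""
    value = 0
    for ch in s:
        code = ord(ch)
        if code >= base:
            raise ValueError(f"character {ch!r} is outside base {base}")
        value = value * base + code
    return value

def int_to_string(value: int, base: int = 128) -> str:
    """Invert string_to_int by peeling the digits back off."""
    chars = []
    while value > 0:
        value, code = divmod(value, base)
        chars.append(chr(code))
    return "".join(reversed(chars))

key = string_to_int("cat")
print(key)                 # one integer now stands for the whole string
print(int_to_string(key))  # 'cat'
```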

 

If one does get the surprise about the encoding, one can then take the next step, which is the definition of the Anticipatory Web of encoded Hilbert data.

 

Using the PriMentia technology to encode the structure of patterns in data (without imposing theories of semantics) allows one to develop data mining processes over these encoded structures and to both use and develop ontology in the form of graphs.  Metaphorically, one has a perception system that produces an image of what is in one's face, or in one's data, without the imposition of emotions or cognition.  This is the way that nature addressed the separation of perception from cognition (private and long-standing discussion with Karl Pribram; see also his Brain and Perception, 1991).
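
As a toy illustration of structure extracted without semantics, the sketch below only records which tokens co-occur in raw records, writing the result as < a, r, b > triples that can then feed graph-based data mining; the relation label and the records are invented assumptions.

```python
# Extract structural co-occurrence from raw records and keep it as triples,
# with no interpretation of what the tokens mean.  Toy data, invented labels.
from itertools import combinations
from typing import List, Set, Tuple

records: List[List[str]] = [
    ["cat", "mat", "hat"],
    ["cat", "hat"],
]

def cooccurrence_triples(data: List[List[str]]) -> Set[Tuple[str, str, str]]:
    """Emit <a, 'co-occurs', b> for every pair of tokens seen together in a record."""
    triples: Set[Tuple[str, str, str]] = set()
    for record in data:
        for a, b in combinations(sorted(set(record)), 2):
            triples.add((a, "co-occurs", b))
    return triples

print(cooccurrence_triples(records))
```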

 

I have, in the past, called this “synthetic intelligence” because it leaves out the semantics completely.  One can then address the semantics separately, or, as I often said when I was senior scientist at Object Sciences,

 

“What one leaves out in the beginning, one can often add back later on with some extraordinary benefit.”

 

The common data encoding reveals itself both:

 

1) As a simple platform for very high-speed data mining, and

2) As a common and very simple, perhaps provably optimal, encoding into memory (see the sketch below).
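
To show what point (2) could look like in practice, the sketch below keys an in-memory index by the base-128 integer encoding of each triple's first location, so retrieval reduces to a single integer-keyed lookup.  The Python dictionary stands in for whatever memory layout PriMentia actually uses; that detail is an assumption here.

```python
# Key an in-memory index by the integer encoding of each subject string,
# so that mining queries become single hashed-integer lookups.
from collections import defaultdict
from typing import Dict, List, Tuple

def encode(s: str, base: int = 128) -> int:
    """Regard the string as a base-`base` number (as in the earlier sketch)."""
    value = 0
    for ch in s:
        value = value * base + ord(ch)
    return value

triples: List[Tuple[str, str, str]] = [
    ("report_17", "mentions", "location_42"),
    ("report_17", "authored_by", "analyst_3"),
]

index: Dict[int, List[Tuple[str, str, str]]] = defaultdict(list)
for a, r, b in triples:
    index[encode(a)].append((a, r, b))     # subject's encoding is the memory key

print(index[encode("report_17")])          # every triple whose first location is report_17
```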

 

The notational approach that I have developed also gives the community of information scientists a new formal foundation for talking about the issues related to induction, deduction, and the real-time aggregation of information into contextual forms, as in the work by Klausner on the concept of a CoreSystem to replace the peer nodes within a distributed, provably secure and resilient information system.

 

At the hearings today, the Acting DCI, McLaughlin, mentioned the concept of a Common Information Architecture.  One might note that a Joint Intelligence Virtual Architecture (JIVA) based on RDF (the Resource Description Framework) has been around for some time.  Very large expenditures have been made on JIVA, and there are several objective evaluations stating that this joint architecture has not been successful.

 

JIVA has been largely abandoned because it did not serve the purposes for which the system was designed.

 

Otherwise the Acting DCI would not be answering a Senator's question about his greatest need with a discussion of the need for a single Common Information Architecture.  Right?  That was supposed to be JIVA, right?  How much was spent on JIVA?  We know the history of JIVA very well, and so we know that principled criticism of the architectural design was raised from the very beginning and yet was ignored.  The principled criticism was that RDF could not express semantics in real time, because the pragmatics of real time cannot be defined rigidly in advance.

 

What the Anticipatory Web does that JIVA could not do is produce a schema-independent encoding of observed facts in such a way as to make those facts available for real-time convolution operating over the Hilbert encoding.
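
Reading “convolution” loosely here, as the incremental folding of each newly observed fact into the already-encoded store so that context accumulates in real time rather than being fixed by a schema in advance, the sketch below shows that reading in code.  Both the reading and the names are my assumptions, not a description of the Anticipatory Web implementation.

```python
# Schema-independent store: facts arrive as triples and are folded into a
# running aggregate; "context" is whatever has accumulated around a location.
from collections import Counter
from typing import Tuple

Triple = Tuple[str, str, str]

class FactStore:
    def __init__(self) -> None:
        self.counts: Counter = Counter()

    def observe(self, fact: Triple) -> None:
        """Fold one newly observed fact into the running aggregate."""
        self.counts[fact] += 1

    def context_of(self, location: str) -> Counter:
        """Everything aggregated so far that involves the given location."""
        return Counter({t: n for t, n in self.counts.items()
                        if location in (t[0], t[2])})

store = FactStore()
for fact in [("unit_9", "sighted_at", "location_42"),
             ("unit_9", "sighted_at", "location_42"),
             ("report_17", "mentions", "location_42")]:
    store.observe(fact)

print(store.context_of("location_42"))
```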