National Knowledge Project

 

Open question in knowledge representation


Someone said:

 

 

Did the intelligence community actually fail systemically?

 

If so, it sounds like time to reread the organizational cybernetics literature and build in automatic control mechanisms together with more redundancy at the system level while, at the personnel level, providing superior leadership with integrity and courage.

 

 

Reply:

 

 

Our sense is that the problem is not specific misdeeds but a systemic confusion about what computers and knowledge are and potentially have to do with each other. 

 

http://www.bcngroup.org/area3/pprueitt/kmbook/Chapter2.htm

 

The controversy over the work of category theorist Robert Rosen and the books of Sir Roger Penrose points to a broader social analysis made within an extended community of scholars.  One can observe that very few in the intelligence procurement and funding community are even aware of the issues that leading natural scientists can pose very clearly.  But everyone understands the nature of procurement, the interest in money, and how to get along in the culture. 

 

At the same time the community is faced with two seemingly intractable problems:

 

1) how to recruit, organize, and train humans who, individually and as an organization, produce high-resolution, high-fidelity knowledge in real time about emerging threats.

 

2) how to procure software that increases the productivity of the individual and the overall system without creating false sense-making consequences.

 

In both cases, the broad social science community has important contributions to make.  The call for a National Project to establish the knowledge sciences goes back to 1992, when I was at the Neural Network Research Facility at Georgetown University.  Over the last decade, we have talked about this issue with several hundred of the leading scholars.  The first step towards this project is a conference where a dedicated effort is made to open new horizons. 

 

Those who are primarily interested in acquiring contracts should not attend this conference.  The science needs to be allowed to work out hard problems without narrow self-interest being allowed to become the controlling factor.

 

The proposal "Initiative to Develop 21st Century Natural Language Understanding Capabilities" provides a comprehensive list of open problems and the status of current progress.  However, a community of scientists has been suggesting that many of the open problems, and the intractability of current progress in Natural Language Processing, could yield to a breakthrough brought about by thinking about the problem of intelligence in a different way. 

 

In particular, there are many observations, from many schools of thought, that formal systems based on Hilbert mathematics and first-order logic help up to a point, but beyond that point the mathematics and logic themselves get in the way of seeing what needs to be done. 

 

What is needed is new thinking that exploits certain "by-passes".

 

Perhaps the easiest by-pass is the one that never develops a structured database to encode models of either structural invariances or representations of meaning.

 

Most who do serious work have moved to hash tables.  But there is an additional by-pass here: recognize that if one simply treats text as a base-64 number, then one can easily have key-less hash tables with no need for empty containers and no collisions.  (This is the essence of the 2003 "Hilbert engine" patent awarded to Primentia Inc.)
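
As a rough, hedged sketch of this by-pass: treating text over a fixed 64-symbol alphabet as a bijective base-64 integer makes the encoding one-to-one, so the number itself can serve as a key with no collisions, and a sparse map avoids pre-allocated empty containers.  The alphabet, the function names, and the use of a Python dictionary below are illustrative assumptions, not a description of the Primentia patent.

# A minimal sketch, assuming a fixed 64-symbol alphabet; the alphabet,
# names, and use of a Python dict as the sparse store are illustrative
# assumptions, not the patented design.
ALPHABET = (
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "abcdefghijklmnopqrstuvwxyz"
    "0123456789 ."
)
INDEX = {ch: i for i, ch in enumerate(ALPHABET)}

def text_to_number(text: str) -> int:
    """Encode text as a bijective base-64 integer.

    Using digit values 1..64 (bijective numeration) makes the mapping
    one-to-one, so distinct strings never share a number: no collisions.
    """
    n = 0
    for ch in text:
        n = n * 64 + (INDEX[ch] + 1)
    return n

def number_to_text(n: int) -> str:
    """Invert the encoding, recovering the original text losslessly."""
    chars = []
    while n > 0:
        n, d = divmod(n - 1, 64)
        chars.append(ALPHABET[d])
    return "".join(reversed(chars))

# The number itself is the address: a sparse map holds only occupied
# positions, so there is no table of pre-allocated empty containers.
store = {}
key = text_to_number("emerging threat")
store[key] = "some record about this phrase"
assert number_to_text(key) == "emerging threat"

Because the mapping is one-to-one and invertible, look-up is a single access on the computed number, and the original text can always be recovered from it.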

 

Search becomes almost immediate (a small number of machine cycles).  Almost all algorithmic tasks are reduced by two or three orders of magnitude.  Memory requirements go down through a fractal compression of data into structural information that is lossless and will reproduce the original materials. 

 

The compression process is precise, not statistical.  The speed and precision then allow short-duration human-machine interaction in cycles where training categorization and recognition constructions previously took hours or days. 

 

So we have the possibility that many different algorithmic processes can be run at the same time, with results still occurring within a short action-perception cycle involving the human's selective attention.  At that point, the roles of natural intelligence and human cognitive acuity come into play, and we are no longer talking about artificial intelligence at all, but about machine-aided information production.
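
To make the shape of such a cycle concrete, here is a small sketch (my own construction, not a description of any deployed system) that runs several analysis functions concurrently and keeps only the results that are ready within a short attention window; the three functions and the 0.25-second budget are assumptions made for the example.

# A rough sketch of one action-perception cycle: run several analyses
# concurrently and keep whatever finishes within a short time budget.
# The analysis functions and the 0.25 s budget are illustrative assumptions.
import concurrent.futures
import time

def categorize(text: str) -> str:
    return f"category guess for {len(text)} characters"

def extract_terms(text: str) -> str:
    return f"{len(set(text.split()))} distinct terms"

def summarize(text: str) -> str:
    time.sleep(1.0)          # deliberately too slow for this cycle
    return "summary"

def one_cycle(text: str, budget_s: float = 0.25) -> dict:
    """Run all analyses in parallel; return only results ready in time."""
    analyses = {"category": categorize,
                "terms": extract_terms,
                "summary": summarize}
    results = {}
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn, text)
                   for name, fn in analyses.items()}
        deadline = time.monotonic() + budget_s
        for name, fut in futures.items():
            remaining = deadline - time.monotonic()
            try:
                results[name] = fut.result(timeout=max(remaining, 0))
            except concurrent.futures.TimeoutError:
                results[name] = "(not ready this cycle)"
    return results

print(one_cycle("signals collected during the last reporting period"))

In a real system the slower analysis would simply deliver its result in a later cycle; in this toy sketch the pool just waits for it when it shuts down.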

 

http://www.bcngroup.org/area2/KSF/HIP.htm

 

etc...

 

Just as AC (alternating current) technology displaced DC (direct current) machines, Human-centric Information Production (HIP) changes everything. 

 

HIP changes the cost of the MIT proposal from 10 B to perhaps 200 M, while still creating the same foundation for far-ranging economic benefits. 

 

Government and industry leadership will soon fund the development of innovative foundational computing technologies beyond the current response range of industry, and will revitalize basic research in the fundamental computing and networking sciences.

 

We believe that the CoreSystem platform can provide the 21st-century provably secure computer architecture, net-centric operating environment, iconic programming language, and intelligent user interface that enable ultra-high-productivity software development, agile software adaptation, and personalized end-user services.  HIP is expressed in a natural fashion as part of the CoreSystem architecture.

 

http://128.242.106.181/Coretalk/CoreSystemIntro.PC.zip

http://128.242.106.181/Coretalk/CoreSystemIntro.MAC.zip

 

HIP is simple, and in spite of some early patents, software based on HIP will be available at little or no cost to everyone within a few years.  Using HIP we have produced demonstrations of semiotic control systems based on categorical abstraction and event chemistry.

 

Other untapped mature technologies include schema logic and polylogics (from Germany), among others.