

Saturday, November 19, 2005


 The BCNGroup Beadgames

National Project →

Challenge Problem →

Center of Excellence Proposal →





Discussion at ONTAC forum

ONTAC stands for Ontology and Taxonomy Coordinating Working Group

It is a working group of

Semantic Interoperability Community of Practice (SICoP)



Why is this discussion important to the Washington Post? → [218]



ONTAC Mission statement:

This working group is “to provide a mechanism for voluntary coordination of all activities within the Federal Government and among other interested parties in developing Knowledge Classification and Representation systems such as ontologies, taxonomies, thesauri, and graphical knowledge representations. The name for this working group will be “The Ontology and Taxonomy Coordinating WG”, abbreviated as “ONTAC”. The lead for this WG will be Pat Cassidy, Ontologist, MITRE (pcassidy@mitre.org)”




Note from John Sowa:

With footnotes containing comments from Paul Prueitt [1]




Your statement: “An ontology can be merely a set of well defined concepts, without logic.”


This may be a true statement, but it is irrelevant [2] to the problems that the ONTAC project is trying to address.


You said:

I have a different viewpoint, and that is that inference and logic have failed to be applied to any formalism sufficient to address the types of process questions that come up in real life in everyday situations. (Prueitt, 1995)


The point is that *every* program that has ever been or ever will be implemented on a digital computer is absolutely precise down to the last bit.  There is not the slightest possibility of having a vague or incompletely defined program [3].


However -- and this is a very big *HOWEVER* -- there is an enormous gap between what a program does and what the programmer intended it to do.  The goal of any standards project, including ONTAC, is not to *discover* how everything works, but to *legislate* a particular way that computer things shall work. [4]


Ontology has a very long history of trying to analyze existence and to categorize the things that exist.  It has many points of contact with the physical sciences and with psychology.  The questions of what things exist and how they behave have become the province of science, and the questions of how people think about what exists have become the province of psychology and sociology.


Formal ontology is a development of the 20th century. On the one hand, it can be used to clarify the underlying issues of the broader field of ontology, including those aspects that have already become the province of science.


On the other hand -- which is the hand holding the money that funds projects like ONTAC -- formal ontology can be applied to the task of *legislating* how various computer programs are specified to handle the much narrower, much more specialized, and very much more precise categories that are implemented in computer systems that are required to interoperate.


Paul said:

I have the opinion that logic and mathematics fail to provide the formalism required to model complex systems (as in living biology or social systems). At a minimum, there is nothing as yet that succeeds for living systems in the way that logic and mathematics were successful in the engineering sciences.


That may be true.  But ONTAC is very much an engineering project [5].  The goal of ONTAC is *not* to understand how people think or how they should think.  It is to legislate how computer systems that are required to interoperate shall interpret a given set of labels in various data streams.


And those labels are *not* English words, even when they may look like English words.  Trying to understand how people use natural languages is a different project altogether.  It is without question a very important project, but it is not the ONTAC project.
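Sowa's point about legislating the interpretation of labels can be made concrete with a small sketch. In the sketch below, two interoperating systems agree, by standard, that each label in a data stream is bound to exactly one interpretation; the label names and formats here are purely hypothetical illustrations, not anything defined by ONTAC.

```python
from datetime import datetime

# A "legislated" interpretation: each label in the data stream is bound,
# by the standard, to exactly one parser. The labels "DOB" and "WEIGHT_KG"
# and their formats are hypothetical examples.
LABEL_INTERPRETATION = {
    "DOB": lambda v: datetime.strptime(v, "%Y-%m-%d").date(),  # ISO date, by fiat
    "WEIGHT_KG": lambda v: float(v),                           # kilograms, by fiat
}

def interpret(record):
    """Interpret a raw data-stream record under the agreed label standard."""
    return {label: LABEL_INTERPRETATION[label](value)
            for label, value in record.items()}

record = {"DOB": "1959-11-19", "WEIGHT_KG": "72.5"}
parsed = interpret(record)
print(parsed["WEIGHT_KG"])  # any conforming system recovers the same value: 72.5
```

The labels function only within the standard's mapping; whether "DOB" resembles an English abbreviation is immaterial to how a conforming system processes it, which is exactly Sowa's point.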


John Sowa


[1] First, I have the greatest admiration for Dr. John Sowa; however, the issue that has arisen has to do with both honesty within the Semantic Web community and the use of taxpayer money by IT consulting firms.

[2] The representation of standards processes is that there is an open discussion and evaluation of a set of issues that are truthfully presented. The “irrelevance” may be a forced judgment motivated by powerful industries whose will is to pervert academic activity and to define a set of facts on the ground that deny principled discussion not consistent with the perceived interests of these powerful forces (those who have the money). The perceived interest is to control the behavior of the communication mechanisms that will support a future business AND political environment. The core mission of the Semantic Interoperability Community of Practice (SICoP) is derived from the notion that electronic government will be the means through which the people of the United States interact with the government. The hijacking is that these powerful forces wish to control both the means of interacting with the government and the economic system. In my opinion, the Democracy is lost if this hijacking is successful.

[3] The connectionist paradigm, exemplified by artificial neural networks, is designed by the programmer to receive input in the form of unanticipated data patterns. From this input, behavior occurs that was not designed by the programmer!

   The semantic extraction technology that is my field of expertise has the same property. The key point that you and the working groups are ignoring is that human-in-the-loop interaction will cause the programs to behave in a fashion that is not anticipated by the programmer. This is what “loose coupling” means to me. The notion that “hard coupled” computational systems should control the interactions of commerce, our political expression, and our communication is not correct, in my opinion.

   The alternative is to claim that ontology is a set of concepts without logic. Ontology, as a set of controlled vocabularies, would then serve as a dictionary supporting the purpose of human communication.

   The IT notion of web services is almost always represented as going beyond mere data interoperability. We already have data interoperability, except in cases of non-interoperability by design.

   In “Web Services and Service-Oriented Architectures”, Douglas Barry starts out with a scenario whereby a traveling salesman’s complete schedule is computed by “web services” and then recomputed as things change in the world. Tim Berners-Lee’s famous article in Scientific American has web services making medical prescriptions. Computer-mediated processes would support these types of activities, but not hard-wired computational programs without iterated human interaction and decision making.
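   The connectionist point in footnote [3] can be illustrated with a minimal sketch: the programmer writes only a learning rule, and the function the program ends up computing is determined by the data it is shown, not by any decision logic the programmer wrote. The training data and learning parameters below are illustrative, not drawn from any system discussed above.

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Classic perceptron rule: the weights, and hence the behavior,
    emerge from the data rather than from code written by the programmer."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# An OR-like pattern: nowhere below is "OR" coded explicitly.
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 1, 1, 1]
w, b = train_perceptron(samples, labels)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(*s) for s in samples])  # learned behavior: [0, 1, 1, 1]
```

   Feed the same learning rule different samples and a different decision boundary emerges; this is the sense in which the program's behavior is not anticipated in the source code.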

[4] This claim about the purpose of ALL standards projects is newsworthy. It is true in almost all cases. The consequences of this truth are startling, given the money that is spent unwisely trying to do things that are not possible (such as creating intelligence as a product of completely autonomous computer programs). That goal belongs to a runaway academic discipline called Artificial Intelligence. The AI community has taken on the status of a religion.

[5] This is why ONTAC and all other such government-supported activities should be terminated. The effect of trying to engineer human communication and economic activity is to create a world that makes money for those who are part of the system (members of the religion), but which continues the alienation of all those who do not pay into a membership. This changes the notion of our Democracy from We the People to It the System. The alternative is to support the BCNGroup’s call for a National Project that changes the course of our development of information and communication systems.