Thread: Mud-AI/NLP
Old 08-13-2003, 07:03 AM   #15
Yazracor
New Member
 
Join Date: Apr 2002
Posts: 18
Well, from my experiments it's not so much the storing of the information
as the finding of a "fitting response" that causes the difficulty. The
example of the miniature elephant demonstrates this nicely. In my system,
the connection miniature->elephant => small would have worked; the problem
is that several dozen other "facts" would match just as closely as this
one. You could argue that you could reduce the database size to cut down
these "wrong hits", but even a database just large enough to "converse"
with would still be a "small world".
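
To make the "wrong hits" problem concrete, here is a toy sketch in Python (not my actual system; the fact list and scoring are invented for the example):

[code]
# Toy sketch of keyword-overlap fact lookup. The "facts" are invented;
# in a realistic database many of them tie for best match.

FACTS = [
    ("miniature elephant", "size", "small"),
    ("elephant", "colour", "grey"),
    ("elephant", "eats", "leaves"),
    ("dwarf", "size", "small"),
    ("mouse", "size", "small"),
]

def matching_facts(question):
    """Return every fact sharing at least one word with the question."""
    words = set(question.lower().split())
    hits = []
    for subject, relation, value in FACTS:
        fact_words = set(subject.split()) | {relation, value}
        overlap = len(words & fact_words)
        if overlap:
            hits.append((overlap, subject, relation, value))
    hits.sort(reverse=True)
    return hits

# The intended fact scores highest, but every fact mentioning "elephant"
# or "small" also turns up - the wrong hits described above.
for hit in matching_facts("how small is the miniature elephant"):
    print(hit)
[/code]

With five facts the right one wins easily; with a few thousand, dozens of equally plausible candidates appear.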
The only solution would be a concept of "context". In a MUD, it is feasible to create a knowledge base that contains the whole world, including its temporal and spatial relationships. Simple questions could, of course, be answered by querying this database, and that is what NL interfaces to databases already do.
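
As a rough illustration of what I mean by "context": the same kind of lookup could be restricted to facts tagged with the room the conversation takes place in (the locations and facts below are, again, made up):

[code]
# Toy illustration of spatial context: only facts tagged with the room
# the player is standing in are considered. All data is invented.

WORLD_FACTS = [
    # (location, subject, relation, value)
    ("market", "miniature elephant", "size", "small"),
    ("market", "fruit stall", "sells", "apples"),
    ("temple", "stone elephant", "size", "huge"),
]

def answer(question, location):
    """Best-matching fact among those sharing the asker's location."""
    words = set(question.lower().split())
    best, best_score = None, 0
    for loc, subject, relation, value in WORLD_FACTS:
        if loc != location:
            continue  # context filter: ignore facts from other rooms
        score = len(words & (set(subject.split()) | {relation, value}))
        if score > best_score:
            best, best_score = (subject, relation, value), score
    return best

print(answer("how big is the elephant", "market"))  # miniature elephant fact
print(answer("how big is the elephant", "temple"))  # stone elephant fact
[/code]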
The question is whether you want to go to the length of creating this
huge database, the difficult parsing algorithm, a "right-sized" semantic
network and a complex sentence generator just so people can ask mobs
simple questions, or whether the system described above, which simply
checks for keywords and tries to find a fitting sentence from a database
as an answer, would give a better ratio of effort spent to result.
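
For comparison, the keyword approach costs almost nothing to build; a rough sketch (the response table is invented, and a real mob would load its own) could look like this:

[code]
# Rough sketch of the cheap alternative: scan the player's line for
# keywords and return a pre-written sentence. The table is invented.

RESPONSES = [
    ({"elephant", "miniature"},
     "Aye, the miniature elephant - small enough to fit in your pack."),
    ({"temple", "where"},
     "The temple lies north of the market square."),
]

DEFAULT = "The shopkeeper shrugs and goes back to polishing apples."

def reply(line):
    """Return the canned sentence whose keywords best overlap the input."""
    words = set(line.lower().split())
    best_text, best_hits = DEFAULT, 0
    for keywords, text in RESPONSES:
        hits = len(keywords & words)
        if hits > best_hits:
            best_text, best_hits = text, hits
    return best_text

print(reply("tell me about the miniature elephant"))
[/code]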