Despite being a languages powerhouse that produced some of the best compilers in the industry (for example, IBM's optimizing and debugging compilers for PL/1 on System/370 and its successors), IBM's expertise in this area is segmented and does not spread evenly across the whole product line. Where TEC is concerned, IBM looks especially bad: it never managed to make writing TEC rules clear or easy. Compiler-level diagnostics are simply dismal, and the whole rule-writing environment is extremely primitive and smells of cheap shareware. Moreover, IBM currently cannot do much with the Prolog engine, because it does not own BIM Prolog; the latter is owned by BMC. IBM had a chance to buy Borland Turbo Prolog and get rid of BIM Prolog, which is owned by a competitor, but it never did.
There appears to be an internal split in IBM, with factions advocating different approaches to event correlation and processing. One faction promoted "everything Java," and a Java-based engine is now used on gateways. The other faction sticks with the existing Prolog engine but, paradoxically, does nothing to improve it. It might be the same faction that pushed for buying Micromuse and Candle, acquisitions that created more integration trouble than profit for IBM and pushed some users toward competitors.
Gartner believes that another, unstated reason for the Micromuse acquisition was the need to increase the scalability of Tivoli's event management. Micromuse's functionality greatly overlaps TEC. The TEC Prolog-based engine and the Micromuse engine can coexist and exchange events, but the TEC engine has 2012 as its end-of-life date (as is often the case, this might be quietly extended by IBM at the request of influential customers).
All in all, TEC rules programming is a convoluted, expensive, badly documented, and error-prone procedure: a real Eldorado for Tivoli consultants ;-). It might constitute up to 50% of TEC maintenance costs.
IBM runs two classes on writing rules; while they are better than nothing, the materials in those classes are very basic. The Advanced Rules Programming class has nothing advanced in it. The only real way to learn is to study the examples provided by IBM:
Real-world examples of rules, BAROC files, and event relationship diagrams are available on the event server host in the $BINDIR/TME/TEC/samples/correlation directory.
To determine the current rulebase, its path, and its rulesets, use the following commands:
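A minimal sketch, assuming the standard wrb options documented in the IBM Tivoli Enterprise Console Command and Task Reference (the rule base name Operations is just a placeholder):

```shell
# Show the currently loaded rule base
wrb -lscurrb

# List all rule bases known to the event server, with their directory paths
wrb -lsrb -path

# List the rule sets imported into a particular rule base
wrb -lsrbrule Operations
```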
It is important to understand that TEC rules are written in a macro language that is compiled into Prolog (BIM Prolog). You cannot understand the TEC engine well without understanding the Prolog language.
IBM's effort to simplify Prolog by introducing procedural elements makes sense and helps a little, but you still need a feel for Prolog to appreciate the power of rules programming in TEC. Learning Prolog in a university class can get you much farther than any number of IBM classes (that does not mean IBM's classes are unimportant; on the contrary, they provide an excellent, but entry-level, overview of rule writing).
Whether Prolog is effective for correlation of events is another important question, discussed elsewhere. As used in Tivoli it definitely is not, and the introduction of the Prolog engine might be considered an engineering mistake: a fashion-based decision by the original Tivoli designers. Of course, this decision was made when Prolog was a very fashionable language associated with the Japanese Fifth Generation push, but the return on investment proved questionable. The road to hell is paved with good intentions. There were various attempts to adapt Prolog to system administration tasks. Among the early ones I would note the work of Alva Couch and Michael Gilfix, first presented at the LISA 1999 Conference. They supposedly created:
"... a system administration library that allows one to perform system administration tasks in Prolog. This is much more powerful and flexible than using other current tools, and has the advantage that the resulting Prolog programs are much closer to describing actual policies than CFEngine configuration files or PIKT scripts."
The material below is adapted from "Rule Developer's Guide". Obsolete information about rulepacks was deleted.
Rules are organized into collections called rulesets. Each ruleset is a flat file. The order of execution (or, more exactly, of merging into a single rules file during compilation) is specified in a separate file, and it matters: you can greatly improve the speed of the engine by discarding as many useless events as early as possible. Rulesets should generally be specialized for a particular type of event, and if no rule matches a particular event, that event should be discarded rather than processed by further rules. Before a rulebase can be used by TEC, it must be compiled from the rules macro language into Prolog. This target representation is what the TEC rule engine uses to process events.
There can be several rulesets, but logically they are all merged into Prolog during compilation and comprise the so-called rulebase. One TEC instance can have only one active rulebase. You usually use the wrb command for rulebase manipulation. Unless you understand exactly what you are doing, it is not recommended to modify with a text editor any files used internally by an event server to manage rulebases.
Note: When importing an object into a rule base (for example, rule sets, event classes, rule packs into rule base targets, and so forth), an object that already exists in the rule base must be deleted before you can replace it with a newer version of the object.
A rule base target is the actual rule base used by a rule engine. After compilation of the rule base on the IBM Tivoli Enterprise Console event server, targets are located in the .rbtargets subdirectories of the rule base directory (note the leading period in the .rbtargets subdirectory name).
As mentioned before, rulesets are flat text files that contain rules. Typically, only related rules are contained within a ruleset. Rule sets can be imported into a rule base target using the wrb command. When a rule base is compiled, rule sets are replicated to the .rbtargets directories.
When a rule base is being used by the TEC rule engine, the rules are processed in the order defined within a rule set and within the order of how the rule sets were imported into the rule base target. The regular rule processing order can be altered with the use of certain predicates called from within rules.
Note: The order of rule sets defined for a rule base target is important. The placement of rule sets determines the evaluation order used by the rule engine, and thus efficiency. Rules matching high-frequency events should be placed first, in the first ruleset.
A default set of rule sets is provided by IBM with the default_rb rule base.
The following procedure describes how to modify a rule set:
1) wrb -delrbrule rule_set rule_base
2) wrb -imprbrule /dev_dir/rule_set.rls rule_base
3) wrb -comprules rule_base
4) wrb -loadrb -use rule_base
All of the rule base targets defined for a rule base must use the same set of BAROC classes. When an event is received at an event server and its class is not in that event server's rule base, the event is given a status of PARSING_FAILED in the event server's reception log and the event is discarded.
Because all rule base targets for a rule base use the same set of classes, all rule builder and wrb commands manipulate BAROC files at the rule base level on the IBM Tivoli Enterprise Console event server. When the rule base is compiled, the event classes are replicated to the rule base targets defined in the rule base.
For additional information about BAROC files, see Event class concepts.
Like event classes, rule language predicates provided by IBM and predicates that you create must be stored at the rule base level on the IBM Tivoli Enterprise Console event server. When the rule base is compiled, these predicates are replicated to the rule base targets defined in the rule base. There are no commands for manipulating predicates. Predicates are imported automatically when you create a rule base.
Rule language predicates are described in Rule language predicates.
Some of the Prolog built-in predicates can be used as building blocks to create your own predicates. The Prolog predicates are described in Appendix A. Using Prolog in rules.
This example describes a process for creating a rule base using the wrb command and some of its options. There are multiple ways to create a rule base with the wrb command; this example simply shows one way. To fully understand the many capabilities of the wrb command, see the IBM Tivoli Enterprise Console Command and Task Reference. For information about rule base manipulation procedures using the rule builder, see Rule base manipulation procedures using the rule builder.
The event management environment for this example consists of an IBM Tivoli Enterprise Console event server and two Tivoli Availability Intermediate Manager event servers. The Tivoli Availability Intermediate Manager event servers are configured to process most events locally, sending only events of significance to the IBM Tivoli Enterprise Console event server for further processing. Events of significance were identified during the analysis phase of event management design and are detected by logic programmed into the rules.
The following list describes the requirements for this example:
The following list describes the assumptions for this example:
To create the Operations rule base:
wrb -crtrb -path /tec_rule/Operations Operations
wrb -imprbclass /data/TME/TEC/default_rb/TEC_CLASSES/tecad_nt.baroc Operations
wrb -imprbrule c:/tec_rule_dev/rls/tec_server.rls Operations
wrb -imprbrule c:/tec_rule_dev/rls/aim_ops_perf.rls Operations
wrb -imprbrule c:/tec_rule_dev/rls/aim_ops_sec.rls Operations
wrb -imprbrule c:/tec_rule_dev/rls/aim_pers.rls Operations
wrb -imprbrule c:/tec_rule_dev/rls/aim_acct.rls Operations
$ wrb -lsrbrule Operations
Rule Set files
--------------
tec_server.rls
aim_ops_perf.rls
aim_ops_sec.rls
aim_pers.rls
aim_acct.rls
wrb -comprules Operations
wrb -loadrb -use Operations
The active rule base is the rule base in memory being used by a rule engine. The loaded rule base is a rule base stored on disk in a known location by the event server. The loaded rule base can be a different rule base than the active rule base. The following examples explain this concept in a little more detail:
Again, loading a rule base copies rule base directories and files to a specific location known by the event server. Stopping and restarting the event server, or activating a loaded rule base if no BAROC file modifications were done, causes the rule base in this known location on disk to be placed into memory for use by the rule engine. Specifying the option to activate as well as load a rule base copies the files to the known location and automatically stops and restarts certain components of the event server so the rule base can be placed into memory and used.
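The loaded/active distinction can be sketched with the wrb options used elsewhere in this document (the rule base name Operations is just a placeholder):

```shell
# Copy the compiled rule base to the location known by the event server,
# without activating it; the previously active rule base keeps running
wrb -loadrb Operations

# Load AND activate: components of the event server are restarted so the
# rule engine reads the new rule base from the known disk location into memory
wrb -loadrb -use Operations
```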
When you create a rule base, you give it a meaningful name and specify where it is located on disk. The name is associated with the storage location so the name is used in subsequent references to the rule base.
For rule bases created on UNIX operating systems, the directories are created with permission values of 755. You can override these default permissions with the TEC_UMASK environment variable, with one restriction: the owner permissions cannot be altered with TEC_UMASK and always have a value of 7; only the group and other permissions can be changed. For example, to set the permissions on a rule base directory structure to 750, perform the following steps:
1) odadmin environ get > temp_file
2) Edit the file with a text editor and add the appropriate TEC_UMASK line.
3) odadmin environ set < temp_file
This tip looks at using assertion lists to prevent events from being displayed in the Event Console until a certain threshold has been met.
Importing TEC rules for new ITM 6.1 users Posted by: john_willis on Apr 18, 2006 - 03:03 PM
If you have created a new rule for TEC to process events from ITM 6.1, you will need to import it into the TEC server's rulebase. In this article I am going to discuss how to do this the hard way (i.e., the documented way) and the easy way (the undocumented way).
The documented way
First you have to import the ruleset into the rulebase using the wrb command. Then you have to import the same rule into the default target rulebase (i.e., EventServer). Here is an example.
I have a new ruleset called itm61New.rls; here is what I would do to import it. Notice that with -imptgtrule you don't use the file name, you use the rule set name (i.e., itm61New).
1) wrb -imprbrule itm61New.rls MyRB
2) wrb -imptgtrule itm61New EventServer MyRB
3) wrb -comprules MyRB
4) wrb -loadrb -use MyRB
If you want to make a change to the itm61New.rls you then have to do the following:
1) wrb -deltgtrule itm61New EventServer MyRB
2) wrb -delrbrule itm61New MyRB
3) Make your changes to your original itm61New.rls file
4) wrb -imprbrule itm61New.rls MyRB
5) wrb -imptgtrule itm61New EventServer MyRB
6) wrb -comprules MyRB
7) wrb -loadrb -use MyRB
The undocumented way
If you have a test and a production set of TEC servers, you can save a lot of time if you just edit the rulebase directory directly. Here are the steps:
1) Copy your itm61New.rls to the TEC_RULES directory in your rulebase directory.
cp /export/home/gbs100/itm61New.rls /usr/local/Tivoli/RB/MyRB/TEC_RULES
2) Edit the rule_sets file in the TEC_RULES directory.
e.g., vi /usr/local/Tivoli/RB/MyRB/TEC_RULES/rule_sets
add this line in the rule_sets file:
rule_set('itm61New', 'itm61New.rls', active).
3) [Optional] Edit the rule_sets_EventServer file in the TEC_RULES directory.
e.g., vi /usr/local/Tivoli/RB/MyRB/TEC_RULES/rule_sets_EventServer
add this line in the rule_sets_EventServer file:
rule_set('itm61New', 'itm61New.rls', active).
Note: This step isn't even necessary if you delete the rule_sets_EventServer file and the /usr/local/Tivoli/RB/MyRb/.rbtargets directory. The target rulebase is a legacy feature related to a now-defunct product called AIM. Since AIM no longer exists, there is no reason to use target rulebases.
4) wrb -comprules MyRb
5) wrb -loadrb -use MyRb
If you want to make a change to the itm61New.rls you then have to do the following:
1) Make your changes to the itm61New.rls file
2) wrb -comprules MyRb
3) wrb -loadrb -use MyRb
Whether you use ALS Prolog, BIM Prolog, GNU Prolog, SWI-Prolog, or most other PC-based Prologs, the process of writing a Prolog program is the same in principle. None of these systems (at least as installed on campus) has an IDE, but with a windowing environment you don't need one to get some of the benefits. Follow these steps when writing a Prolog program.
- Create a file containing the Prolog database of facts and rules. ALS and BIM prefer the extension .pro; GNU and SWI use .pl.
- Run a copy of your prolog so that it uses the directory in which your program file is as its working directory. The easiest way to do this is to create an icon for the executable and set the properties appropriately. You can start ALS from the command line in both Solaris and Wintel. So if your path is set right that gives you another way to have the working directory be what you want. (You can't start SWI this way.)
GNU Prolog is in /opt/local/products/gprolog-1.0.0/bin/. To run it, make sure that this directory is on your path. You can start GNU Prolog by typing gprolog at the command line.
- Load your file into the prolog environment. Suppose that your file is called myprog.pro for ALS or myprog.pl for SWI. You load it into the environment by typing at the ?- prompt
- In ALS (BIM, Quintus) type: reconsult(myprog).
- In SWI, GNU type: [myprog]. This also works in ALS, BIM, and Quintus, but the semantics is a little different.
reconsult does not work in SWI, and [myprog] works in ALS but has the effect of consult: if you do it twice you get two copies of the file loaded, not just one. (This didn't get standardized, sorry!)
With ALS and BIM because they can be started on the command line you can load the file when you start Prolog by typing alspro myprog or BIMprolog myprog. You can create a version of GNU prolog that does this too if you want. See the documentation in /opt/local/products/gprolog-1.0.0/doc.
- If the program has syntax errors the Prolog environment will tell you about them. You can re-edit the file to get rid of them and then reload, as above.
- Now run your goals at the ?- prompt. Editing and reloading as necessary.
Where to find the Prologs
- ALS prolog, SUN version is in /macslocal/
- ALS Prolog, DOS version is on drive M:
- BIM prolog (SUN 4.1) is in /macslocal/Bprolog3.1. There is stuff you need to put in your .login to get this to work. See me for more information. Use turing for this.
- SWI can be downloaded from
(NT, '95 binary) etc. but check for more recent versions.
- There are other public domain versions, such as BinProlog, which can be obtained from
GNU Prolog can be found at Daniel Diaz's site
It is also on maxwell in /opt/local/products/gprolog-1.0.0/bin/
A new release of OpenESM for Prolog (V1.1) is now available at Sourceforge. This new release includes a Prolog engine for the TEC 3.9 State Based Correlation Engine (SCE).
You can download the new release at:
Here are some notes from the Readme:
SCEProlog - Prolog for the State Based Correlation Engine
This library implements a Prolog environment as a State Based Correlation Engine custom action. Why would you want to use Prolog at the TEC gateway or adapter level? First, a Prolog environment allows you to leverage most of your existing Prolog facts and logic to enrich events before true event correlation. Second, the Prolog language provides a flexible and powerful way to manipulate event objects. Finally, since Prolog is the base language for the TEC rule language, it is familiar to every seasoned TEC rule writer.
The underlying Prolog implementation for SCEProlog is GNU Prolog for Java (http://gnuprologjava.sourceforge.net/) by Constantine A. Plotnikov. While this project seems stagnant, I found it the simplest to integrate with the State Based Correlation Engine. Included with this distribution is the gnuprolog.jar file. If you desire to see the source of the GNU Prolog for Java library it is available for download from the original project website.
The SCEProlog environment implements most of the ISO standard, with the following additional predicates:
BIM Prolog compatibility:
State based Correlation Engine:
IP Address Name Resolution:
valid RegexFlag values:
You can see the rest of the notes in the readme that is part of the gb_08MAR2006.zip file.
TEC Prolog Push and Pop Predicates GulfBreeze Software Tivoli Training
Those of you who have been dealing with TEC rules over the years know there are a lot of undocumented predicates available in the Prolog interpreter that TEC uses (i.e., BIM Prolog V4.0). Every once in a while I run into a few neat trinkets that open up a whole new area for programming with TEC rules...
Recently I stumbled across a few undocumented record database predicates. I found there are push and pop predicates that can be used to manipulate arrays. The record_push and record_pop predicates can be used to process global stacks and arrays. The following example shows how record_push/3 and record_pop/3 can be used to simulate array-like functions.
record(hashtable1, key1, []),
record_push(hashtable1, key1, value1),
record_push(hashtable1, key1, value2),
record_push(hashtable1, key1, value3),
recorded(hashtable1, key1, _arraylist),
record_pop(hashtable1, key1, _first),
record_pop(hashtable1, key1, _second),
recorded(hashtable1, key1, _newarraylist).
If you run this Prolog code from the BIMprolog.exe interpreter, you will get the following output:
?- record(hashtable1, key1, []),
record_push(hashtable1, key1, value1),
record_push(hashtable1, key1, value2),
record_push(hashtable1, key1, value3),
recorded(hashtable1, key1, _arraylist),
record_pop(hashtable1, key1, _first),
record_pop(hashtable1, key1, _second),
recorded(hashtable1, key1, _newarraylist).
_arraylist = [value3,value2,value1]
_first = value3
_second = value2
_newarraylist = [value1]
If you have any questions about this entry feel free to discuss this on the gulfsoft.com/blog.
This article introduces the major features and facets of the Agent Building and Learning Environment (ABLE). It discusses the ABLE architecture, and how to manipulate data beans, rule beans, and learning beans to be used in a wide variety of applications. The author describes how you can create rules to be processed by rule engines, allowing you to separate business rules and policies from programming logic. He discusses the ABLE editors, and briefly describes how ABLE can be used with the Autonomic Management Engine (AME), IBM WebSphere Portal, and OS/400. After reading this article, the reader will have a solid understanding of the basic ABLE elements and be prepared to work with each of these elements represented as examples in the ABLE Examples Project.
Adding rules to applications
Write and run simple business rules or complex inferencing rules using the Agent Building and Learning Environment (ABLE) and its ABLE Rule Language (ARL). Example rulesets show ARL's syntax and capabilities, how to work with Java™ objects from ARL, how to write and debug rules in Eclipse, and how to run rulesets from Java applications; they also demonstrate procedural and inferencing rule engines and the benefits of using rules written for inferencing rule engines.
Introducing ABLE rules
The Agent Building and Learning Environment (ABLE) is a Java-based framework, component library, and productivity toolkit for building intelligent agents that can use machine learning and reasoning. ABLE is designed to be used by applications involved in autonomic computing, business rules, data mining, diagnostics, forecasting, planning, policy management, retail, and resource balancing. ABLE's reasoning component includes a rule-based programming language known as ABLE Rule Language (ARL). This article introduces you to ARL and shows you how to use the Eclipse plug-in rule editor provided with the alphaWorks® download.
Google matched content
[PDF] Standard and Custom Monitoring with Tivoli
[PDF] Smarter Rules
Managing intrusion detection sensors with Tivoli SecureWay Risk Manager
TME 10 Enterprise Console Rule Builder's Guide, Version 3.6, September 1998
Table of Contents
TME 10 Enterprise Console Overview
Creating Rules with the Graphical Rule Builder
Rule Bases and Rule Base Administration
Event Class Definition Files (BAROC)
Rule Types, Structure, and Syntax
Testing and Debugging New Rules
Rule Builder Examples
Appendix A. Default Rules
Appendix B. Commands
Appendix C. Rule Templates and Event Specifiers
Appendix D. Utility Templates
Appendix E. User-defined Templates
Comp.compilers Re Compiling Prolog-like languages
BIM Prolog, where Andre Marien devised what was at the time the best known way of compiling unification
[Aug 13, 1997] FAQ Prolog Implementations 2-2 [Monthly posting]
ProLog by BIM is a high-performance and robust implementation of the Prolog language. It compiles to native machine code for maximum execution speed, and provides flexible memory management with automated expansion, shrinking, garbage collection and user-definable parameters. The ProLog by BIM environment comprises a GUI including an execution monitor and debugger, an on-line help facility, an extended emacs interface and a profiler. ProLog by BIM also includes a bi-directional external language interface, which is used for the included interfaces to graphics, windowing and RDBMS packages. The system also comes with a large library of Prolog source code which contains many of the most commonly used predicates. Stand-alone run-times without royalties and embeddability allow problem-free end-user delivery. ProLog by BIM comes with Carmen, a WYSIWYG GUI-Generator delivering Prolog code that allows notifiers and call-backs in Prolog and serves as a powerful rapid-prototyping aid. ProLog by BIM runs on SPARC, Intel PC running Solaris 2.x, HP700 and IBM RS/6000. BIM provides both training and consultancy on Prolog and Prolog-based development efforts. For more information write to BIM Engineering Europe sa/nv, ProLog by BIM dept., avenue A. Einstein 4, B - 1348 Louvain-la-Neuve, Belgium, call +32 10 47 06 11, fax to +32 10 47 08 11 or email to firstname.lastname@example.org.
[May-June 1992] Language Industry Monitor article: BIM testing the NLP waters
As databases get bigger and more complex, users need commensurately more sophisticated tools to retrieve information from them. For Belgian systems house BIM, that is where natural language processing comes in.
"Five years ago, I came to BIM with the belief that all the software applications which could benefit from natural language processing should make use of it," says Jean Louis Binot. "But only now do I think we finally have the technology to realize that." As manager of the R&D group at BIM, Binot has overseen the development of Loqui, a natural language interface for databases. If it is up to Binot, Loqui will represent one more tool in an arsenal of many to help usher in the next generation of information technology systems.
Located on the outskirts of Brussels, BIM is a fifteen year-old, privately-owned company of about 180 people. It has a strong R&D program, partly supported by its bread-and-butter systems integration activities and partly through government funding. Among other things, BIM is the developer of "a world-class Prolog" environment, ProLog by BIM, which has been designated the standard Prolog by IBM for its R/6000 line of RISC-based Unix workstations. Up until recently, BIM has focussed primarily on the Belgian market; it is now starting to look further afield, with offices established in France, Holland and the United States. Among the happy mixture of Flemings, Walloons, and foreigners at BIM, English alternates with French as the medium of communication.
This past March, BIM announced the commercial release of Loqui, which runs on Unix workstations. Loqui differs in a number of significant ways from similar products, such as nli's Natural Language and aiCorp's Intellect. For one, it does not use the standard database query language SQL to search and retrieve information from databases. Rather, it interacts with BIM Prolog, the language in which the package has been developed. BIM Prolog, in turn, has access to standard rdbmss via its proprietary interfaces.
Second, BIM does not position Loqui as a shrink-wrapped solution for end-users, but rather as an environment for developers to build and maintain natural language front-ends for large systems. "It's more a technology than a tool," explains Lieve Debille of BIM R&D. Significantly, the team refer to developing a Loqui application as "porting" it to a given context. They regard this as a job for a specialist, not an end-user.
Demonstrating Loqui on a Sun sparcstation at BIM's offices, Debille explains their goal in developing Loqui was to allow users to access a database without knowing anything about such matters as field and table names. "We're trying to free end-users from having to know about the structure of a database," she explains. The physical location of information as well as a basic set of operators are all mapped to "plain English" words. This active vocabulary can be viewed on-screen to give users a quick indication of the kind of questions they can ask. "Essentially, we are putting a supplementary database between the main database and the user," says Debille. The size of the vocabulary and the number of synonyms would depend on the specific application.
Loqui has two important features, explains Debille. The first is the response generator designed to offer unambiguous replies on the basis of dialog rules. "Such a system should be cooperative and intelligent," says Debille. "If you ask a question about an employee who is not in the employee database, the system should not simply return a negative answer. It should clearly indicate that the subject is not in the database." This leads Debille to explain the group's reason for shunning SQL in favor of their own Prolog interface: while SQL offers a certain degree of control over when and why database queries fail, Prolog makes a more cooperative dialog possible. "Global answers are just not enough," she maintains.
Anaphora but not ellipses
The second feature is a contextual interpreter which enables Loqui to handle different types of anaphoric reference. "Experience shows, however, that many SQL users tend to back away from the idea of using anaphora," says Debille. "Strangely enough, people seem to distrust a technique which represents an enormous timesaver in man-machine interaction. It's a challenge to get users who are accustomed to SQL to break their request down into a number of simple questions employing anaphora instead of formulating one long SQL-type query. The advantage of this approach is that it is much easier to figure out where you went wrong if you don't get the reply you want. You can much more easily backtrack."
While Loqui can handle long chains of antecedents, it cannot handle ellipsis. You cannot, for example, ask "Who earns the most in the East?" followed by "And in the West?" Bart Vandecapelle says that they had incorporated this functionality into Loqui at one point, but removed it because it was not reliable enough. For a commercial application, an accuracy rate of fifty or even seventy percent is not enough, he says. The group is keenly aware of some of the misrepresentations and unfulfilled promises in the field of nl and have no interest in propagating further misunderstanding.
While Debille and Vandecapelle - both Eurotra Belgium veterans - address technical development, Bruno Schröder is handling strategic business development within the nl group - marketing, in other words. It will be his task to try to interest potential customers in Loqui. Where will he be looking? "We're seeing large corporations now beginning to use their computers for more than just accounting and transaction-processing purposes. There's a lot of interest now in document image processing and full text retrieval as well as logistics and decision support. It's becoming clear, however, that something like only ten percent of the employees in a company use the corporate information system. This information is inaccessible to both management and staff because of the difficulty in retrieving it; they are dependent on SQL programmers within the centralized MIS departments."
"Moreover, the form-based query facilities common to traditional database systems are simply too rigid for these new applications. The structure of the information is too complex for them to be effective. We think the key to making this information more easily accessible will be nl. Decision support, project management - these applications are well-suited to nl." Schröder cites as an example a huge public database being developed in Germany containing information about pollution; a system of this size and complexity would be a ripe candidate for an nl interface such as Loqui. He believes that an nl interface is the best way to allow people to easily obtain diverse information from large, complex databases.
Jean-Louis Binot carries the discussion a step further. "In reality, we see nl being just one aspect of many in a user interface. This is what we refer to as multimodality. Certain information can more easily be displayed graphically, so why not use that technology? Ultimately, with such a system, graphic objects should also be active vocabulary, so you can refer to them using words as well." Asked whether the drive to establish standards and develop common resources within the NLP world is simply an academic exercise, Binot replies that it is indeed an important development, something that is sorely needed. The lack of tools is a burden; you should not have to redefine a noun phrase every time you write a grammar. Another ongoing project at BIM, a Eurotra spinoff called et9, is in fact the development of a linguist workbench. While it is still about twenty months from completion, Binot says that the project's benefactors at the ec would like to see it become a de facto standard within the NLP world.
Binot feels there is much yet to be done in terms of awareness. "The French have a sense of the Language Industry," he points out. "There, you have the government taking a leadership role with prestigious, high-profile projects like the French electronic Yellow Pages developed by GSI-Erli. But in Belgium and in other European countries, there isn't that kind of high-level interest yet."
BIM, Kwikstraat 4, B-3078 Everberg, Belgium. Tel +32 2 759 5925, Fax +32 2 759 4795