Pedram Abrari of Corticon is presenting.
He started with a quick review of Ron Ross’ definition of business rule and then of some of the reasons why business rule engines came to be.
Shortcomings of first generation BRMS:
- Difficult to use
- Proprietary programming languages
- Complex engine algorithms
- Proprietary rule development infrastructures
- Specialized debugging techniques to get the rules right
The proposed solution: model-driven BRMS

Performance shortcomings:
- Performance bottleneck
- Deep expertise required for performance tuning
- Architectural performance ceiling: the “Rete wall”
The proposed solution: the DeTI algorithm
He then explained the “declarative vs procedural” paradigm and argued that business rules need to be declarative. He then discussed how the first generation of business rule engines evolved from expert systems, and suggested that these might not be well suited for general-purpose business rule automation.
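As a rough illustration of that distinction (my own hypothetical rules, not Corticon syntax), the same discount policy can be written both ways: in the procedural version the author controls evaluation order explicitly, while in the declarative version the rules are unordered condition/action pairs and a naive engine decides when each one fires.

```python
# Procedural: the author sequences the logic by hand.
def discount_procedural(customer):
    if customer["years"] >= 5:
        customer["status"] = "gold"
    if customer.get("status") == "gold":
        customer["discount"] = 0.10
    return customer

# Declarative: unordered (condition, action) pairs; note the rules are
# deliberately listed in the "wrong" order.
RULES = [
    (lambda c: c.get("status") == "gold", lambda c: c.update(discount=0.10)),
    (lambda c: c["years"] >= 5,           lambda c: c.update(status="gold")),
]

def run_engine(customer, rules):
    # Toy engine: keep firing rules until no rule changes the data,
    # so declaration order no longer matters.
    changed = True
    while changed:
        changed = False
        for cond, action in rules:
            before = dict(customer)
            if cond(customer):
                action(customer)
                if customer != before:
                    changed = True
    return customer

print(run_engine({"years": 6}, RULES))
```

Both versions reach the same result here; the point is only that the declarative author never had to think about ordering.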
This part of the talk was based on a Forrester Research paper, “The truth about business rules algorithm.”
First Generation BRMS:
- Inferencing algorithms: authoring flexibility but possible performance issues
- Sequential algorithms: less authoring flexibility but better performance

Second Generation BRMS:
- Extended sequential algorithms: authoring flexibility with performance
- Deployment-time inferencing: high-performance, scalable inferencing
He defined inferencing as the engine’s ability to determine the sequence of rule processing based on the logical dependencies among the rules, allowing more rule authoring flexibility.
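To make that concrete, here is a small sketch (my own, with hypothetical rule and attribute names; real inferencing engines such as Rete work quite differently): each rule declares which attributes it reads and which it writes, and a topological sort of the resulting dependency graph yields a firing sequence, so the author never sequences the rules by hand.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

RULES = {
    # name: (attributes read, attributes written) -- hypothetical names
    "compute_discount": ({"status"}, {"discount"}),
    "assign_status":    ({"years"},  {"status"}),
    "compute_total":    ({"discount", "price"}, {"total"}),
}

def execution_order(rules):
    graph = {}
    for name, (reads, _) in rules.items():
        # A rule depends on every rule that writes an attribute it reads.
        graph[name] = {
            other for other, (_, writes) in rules.items()
            if reads & writes
        }
    return list(TopologicalSorter(graph).static_order())

# assign_status must run before compute_discount, which must run
# before compute_total, regardless of declaration order.
print(execution_order(RULES))
```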
Rule engines typically perform two types of iteration:
- Fact iteration (95% of business decisions only require fact iteration)
- Logical loop iteration (rule recursion)
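A minimal sketch of the difference, using made-up facts and thresholds: fact iteration applies one rule across a collection of facts, while a logical loop re-fires a rule on its own output until its condition no longer holds.

```python
# Fact iteration: one rule evaluated against each fact in a collection.
orders = [{"amount": 120}, {"amount": 80}, {"amount": 300}]
for order in orders:
    # Rule: orders over 100 get free shipping.
    order["free_shipping"] = order["amount"] > 100

# Logical loop iteration (rule recursion): the rule's action changes the
# very data its condition tests, so it keeps re-firing until a fixpoint.
balance = 1000.0
years = 0
while balance < 2000.0:   # rule condition, re-evaluated each cycle
    balance *= 1.05       # rule action feeds the next evaluation
    years += 1

print(years)  # cycles needed before the condition stops holding
```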
He then performed a short demo of a simplified version of Miss Manners (the benchmark) to explain some of the principles behind “sequential algorithms” and “inferencing algorithms” (and the extended sequential algorithm).
He then covered the evolution of the first generation rule engines, from very technical tools to the capabilities that have been added since then, such as natural language support, decision tables, etc.
He also covered some potential problems of first generation rule engines. And obviously he showed how Corticon’s tool solves all these problems.
To me, putting down the algorithms used by other very successful rule engines in order to show that your own approach might be better is not the best way to get your point across. If next generation rule engines are really about modeling the rules in a new way, then they should be presented as such.