JRules 7 introduces Decision Validation Services (DVS), which replaces the Rule Scenario Manager (RSM) of previous versions.
The main purpose of DVS is to test rules and validate their results against expected results. It also supports simulations, which compare the results of executing two rulesets.
As a note for RSM users, there is a way to migrate an RSM archive to a scenario file that is compatible with DVS. I haven’t tried this process, but it is available.
DVS uses Excel files to feed input data and expected results through the rules. A user generates an Excel template, fills it in with input data and expected results, then runs the tests and gets a report on which tests passed or failed. This approach can work well in some environments, as long as your business users don’t mind entering the test data in the Excel template.
One thing that is unclear to me is how this Excel file evolves when your BOM evolves. For example, if you add a new field, is it possible to add it to the Excel file without regenerating the template and without breaking things? I’m pretty sure that with some Excel gymnastics you could copy and paste the contents, so re-entering everything would not be required, but the documentation is not clear on what steps are needed if you already have a filled-in Excel template.
Should you wish to go that route, there is also an API that can be used to populate the Excel template, and a tool that is supposed to help auto-populate the Excel file with data. The sample code would have to be customized to fit the needs of other projects.
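The population tool and API are specific to ILOG, but the general idea of programmatically filling a scenario template is straightforward. Here is a rough sketch using CSV as a stand-in for the Excel sheets; the column names and scenario data are invented for illustration and are not DVS’s own:

```python
import csv
import io

# Hypothetical scenario data: one dict per test case, mixing input
# fields and an expected result (names are illustrative, not DVS's).
scenarios = [
    {"customer.age": 25, "customer.income": 40000, "expected.approved": "true"},
    {"customer.age": 17, "customer.income": 0, "expected.approved": "false"},
]

def populate_template(scenarios, columns):
    """Write scenario rows under fixed column headings, the way a
    template-population tool would (CSV standing in for an Excel sheet)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns)
    writer.writeheader()
    for row in scenarios:
        writer.writerow(row)
    return buf.getvalue()

columns = ["customer.age", "customer.income", "expected.approved"]
print(populate_template(scenarios, columns))
```

In practice the sample code shipped with DVS plays this role against real workbooks, which is why it needs per-project customization: the columns come from your BOM, not from a fixed list.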
I tried DVS for testing on the samples that ILOG provides, and things obviously work relatively well: generate a template based on a BOM, fill in the template with input data and expected results, run the test, and voilà, a nice report indicating which tests passed or failed.
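The pass/fail report at the end of that flow is essentially a per-row comparison of expected values against what the ruleset actually produced. A minimal sketch of that comparison, with an invented stand-in “ruleset”:

```python
def run_report(cases, execute):
    """Run each case's input through the ruleset and compare the
    actual result with the expected one; return (index, passed) pairs."""
    report = []
    for i, case in enumerate(cases):
        actual = execute(case["input"])
        report.append((i, actual == case["expected"]))
    return report

# A stand-in 'ruleset': approve anyone 18 or older (invented logic,
# not an actual JRules execution).
ruleset = lambda inp: {"approved": inp["age"] >= 18}

cases = [
    {"input": {"age": 25}, "expected": {"approved": True}},
    {"input": {"age": 17}, "expected": {"approved": False}},
    {"input": {"age": 16}, "expected": {"approved": True}},  # deliberately wrong
]
print(run_report(cases, ruleset))  # [(0, True), (1, True), (2, False)]
```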
The main problem I have with DVS is its limitations in generating the expected-results template structure. If your output is not a “flat” structure, you may be in for a lot of fun. If your main output entity contains other entities, the scenario might not be generated properly when the number of child entities is not bounded. In the real-life projects I worked with, the output has multiple levels and some unbounded elements, and the wizard can’t generate the template properly because it attempts to “flatten” it. There might be a workaround using virtual attributes (which basically means that you end up flattening the structure manually), but I was not going to start doing that for 180 attributes.
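The “virtual attribute” workaround amounts to exposing each leaf of the nested output as its own flat column. A rough sketch of that flattening (my own illustration, not DVS code), which also shows why unbounded lists hurt:

```python
def flatten(obj, prefix=""):
    """Flatten a nested dict/list into dotted column names, the way a
    'virtual attribute' exposes one leaf of the output per flat column.
    Unbounded lists are the pain point: every index becomes a new column."""
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            flat.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            flat.update(flatten(value, f"{prefix}{i}."))
    else:
        flat[prefix.rstrip(".")] = obj
    return flat

# Invented nested output with an unbounded child collection.
result = {"quote": {"premium": 120.0,
                    "surcharges": [{"code": "A"}, {"code": "B"}]}}
print(flatten(result))
# columns like 'quote.premium', 'quote.surcharges.0.code', 'quote.surcharges.1.code'
```

With 180 attributes spread over several levels, maintaining one virtual attribute per leaf by hand is exactly the chore the wizard was supposed to spare you.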
For those interested, I am reproducing the limitation texts here:
From the main readme.html located in the install directory:
| Limitations | Comments and workarounds |
| --- | --- |
| Cannot test the content in collections of complex type in Excel scenario file templates. | The column headings of the Excel scenario file template are defined by the attributes of the BOM class. Each column corresponds to an attribute. Each attribute for which a test is defined in the Expected Results sheet: if the attribute is a collection of simple type ( |
And from the documentation:
WebSphere ILOG JRules V7.0 > Rule Studio online help > Testing, validating, and auditing rules > Tasks > Adding or removing columns from the Excel file
Besides this limitation, DVS is very promising.
Simulations also leverage DVS, but in this case to perform “what if” analysis of the rules. In other words, they analyze how a new ruleset will behave, usually based on some historical data. You identify and create KPIs that perform the measurements you are looking for, and the results are made available to you after running the simulation.
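At its core, a KPI is just an aggregation over the decisions a simulation produces. A toy sketch of that idea (all names, data, and the KPI itself are invented for illustration), comparing one KPI across two ruleset versions:

```python
# Toy decision outputs from running the same historical cases
# through two ruleset versions (data invented for illustration).
baseline = [{"approved": True}, {"approved": False}, {"approved": True}]
candidate = [{"approved": True}, {"approved": True}, {"approved": True}]

def approval_rate(decisions):
    """KPI: fraction of decisions that were approvals."""
    return sum(1 for d in decisions if d["approved"]) / len(decisions)

print(f"baseline:  {approval_rate(baseline):.2f}")   # 0.67
print(f"candidate: {approval_rate(candidate):.2f}")  # 1.00
```

The “what if” question is then read straight off the KPIs: the candidate ruleset approves every historical case where the baseline rejected one of three.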
I haven’t tried the simulations at this point; if I get a chance, I will in the future and report my findings here.
The Decision Warehouse can be used to troubleshoot or to have a look at the execution trace of a ruleset. After enabling some properties on the ruleset and deploying it to the RES, you can have a look at the trace of the execution of the rules.
The execution details include the “Decision Trace”, which basically shows you which rules fired and in what order, and the input and output parameters, which let you see the details of what went into and came out of the rule execution.
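To make that concrete, a decision trace boils down to a record like the one below. The structure and field names are invented for illustration; the real trace format is DVS’s own:

```python
import json

# Illustrative shape of a decision trace: which rules fired, in order,
# plus the input and output parameters of the execution.
trace = {
    "decisionId": "a1b2c3",
    "rulesFired": ["eligibility.minimumAge", "pricing.baseRate", "pricing.discount"],
    "input": {"customer": {"age": 42, "state": "CA"}},
    "output": {"quote": {"premium": 118.5}},
}
print(json.dumps(trace, indent=2))
```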
I am really interested in this tool because I think it may allow business users to see which rules were fired, and some of the technical details, without tracing the execution of the rules step by step from Rule Studio. It can help identify the source of a problem, although the real troubleshooting might still have to be done in Rule Studio.
Each decision is now tagged with an execution ID or decision ID, which can be used to locate the decision in the Decision Warehouse. Once you are done troubleshooting, you can turn off the properties so that execution stops gathering all of that information.
Enabling the trace obviously slows execution, but it can facilitate some of the troubleshooting that sometimes needs to take place. A nice addition to the troubleshooting tools.