Geeks With Blogs

Charles Young
Dr. Gopal Gupta from the University of Texas at Dallas spoke first on “Programming Rules Using a Constraints-Embedded Spreadsheet Interface”. Non-experts should be able to program rules. Dr Gupta described a constraint satisfaction problem (CSP) approach; CSP techniques solve problems where the objective is to find a solution that satisfies a given set of constraints. His PlanEx work focuses on resource allocation problems; good examples are timetabling, scheduling, etc. Dr Gupta’s work uses a data-centric spreadsheet paradigm, manipulating data through a tabular structure. Programming is done by replication; replication is parameterised, and a number of built-in functions are provided. Spreadsheets are very familiar to business users, but current spreadsheets are limited to arithmetic calculations. The idea was to extend this with CSP functionality (sanity-check: CSP extensions and plug-ins for spreadsheets are as old as the hills). Dr Gupta provided examples of his CSP tools being used to resolve scheduling problems in Excel.
PlanEx runs in Excel and is a .NET tool. Cells are thought of as variables and can contain permissible values, constants or constraints. Once all permissible values and constraints are entered and replicated, the user generates solutions. The problem is solved using a back-end Prolog engine. The tool appears to have gone under the name ‘ExSched’ previously. We were shown a demo of the tool by Abhilash Tiwari. PlanEx provides a toolbar interface to enter and manage constraints.
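The cells-as-variables model maps naturally onto classic constraint satisfaction. As a rough illustration only (this is not PlanEx’s actual API, and the toy timetabling problem and function names are invented here), a minimal backtracking solver over cell domains might look like:

```python
# Minimal CSP sketch: "cells" are variables with permissible values,
# and constraints are predicates over assigned cells. All names are
# hypothetical; PlanEx itself delegates solving to a Prolog engine.

def solve(domains, constraints, assignment=None):
    """Backtracking search: assign each cell a value from its domain
    such that every applicable constraint holds."""
    assignment = dict(assignment or {})
    unassigned = [c for c in domains if c not in assignment]
    if not unassigned:
        return assignment
    cell = unassigned[0]
    for value in domains[cell]:
        candidate = {**assignment, cell: value}
        # Only check constraints whose cells are all assigned so far.
        if all(check(candidate) for scope, check in constraints
               if all(c in candidate for c in scope)):
            result = solve(domains, constraints, candidate)
            if result is not None:
                return result
    return None

# Toy timetabling problem: two meetings, three slots; they must not
# clash, and meeting A must come before meeting B.
domains = {"A": [1, 2, 3], "B": [1, 2, 3]}
constraints = [
    (("A", "B"), lambda a: a["A"] != a["B"]),
    (("A", "B"), lambda a: a["A"] < a["B"]),
]
print(solve(domains, constraints))  # {'A': 1, 'B': 2}
```

Real CSP engines add constraint propagation and clever variable ordering on top of this, which is where a Prolog back end earns its keep.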
Next up was Jason Morris, a very active Jess proponent, talking about “SINFERS: The Implementation of a Practical and Modern Rule-Based Expert System”. SINFERS stands for Soil Inferencing System, used to predict soil properties from existing known properties. Soil has a profile composed of layers called ‘horizons’; horizons have samples, and samples have properties. Pedotransfer functions (PTFs) are linear, multivariate, statistical regression functions derived from experimental sample data and used to predict results from a small number of inputs. Essentially, PTFs do a kind of pattern matching against a knowledge base. Each computed value has an average value and an error associated with it; PTFs essentially handle uncertainty. Jason described some problems, including an ‘inbreeding’ issue to do with replacing properties with more certain properties where the more certain property is parameterised with the less certain property. He talked about automation of various tasks, such as updating the rulebase directly from the database.
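To make the PTF idea concrete: a PTF is just a fitted regression that returns both an estimate and its error. The sketch below is illustrative only; the coefficients, property names and error term are invented here, not taken from SINFERS:

```python
# Sketch of a pedotransfer function: a linear regression over known
# soil properties, returning a predicted value plus its standard error.
# Coefficients and properties below are made up for illustration.

def ptf_bulk_density(sample):
    """Hypothetical PTF: predict bulk density (g/cm^3) from clay and
    organic-carbon content (%), with a fixed regression error term."""
    intercept, b_clay, b_oc = 1.60, -0.004, -0.05  # invented fit
    prediction = (intercept
                  + b_clay * sample["clay_pct"]
                  + b_oc * sample["organic_carbon_pct"])
    standard_error = 0.08  # an error accompanies every estimate
    return prediction, standard_error

value, err = ptf_bulk_density({"clay_pct": 25.0, "organic_carbon_pct": 2.0})
print(f"{value:.2f} ± {err:.2f}")  # 1.40 ± 0.08
```

The ‘inbreeding’ problem arises when a PTF like this is fed a property that was itself predicted by another PTF, so the stated error understates the true uncertainty.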
Every domain has its own vocabulary, and it is important to get the ontology sorted out (sorry, Jason, I mean ‘fixed’ – you know, the slide on Australian English could just about pass for British English as well – although we say ‘dog’s dinner’). Design metaphors can be used to create a common understanding of the way forward and help to conceptualize problems and solution spaces. A question is how we program in the domain vocabulary rather than the low-level implementation vocabulary. SINFERS provides a Rule Engine Manager that wraps and controls the rules engine. It delegates to the Working Memory Manager and the Rule Manager class. An Evaluator interface is provided which uses the Java Expression Parser. It provides a ProcessStrategy interface to vary how each element in a soil profile is processed. SINFERS uses a ‘propose and revise’ strategy; the control strategy is characterised as a ‘challenger vs. incumbent’ contest. Jason described various aspects of SINFERS, including the use of auto-focus, a monitor module that looks for various types of anomalies, and a maintenance module.
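The ‘challenger vs. incumbent’ idea can be sketched in a few lines. This is my own illustrative reduction of the strategy, not SINFERS code; all names here are invented:

```python
# Sketch of a 'challenger vs. incumbent' control strategy: a newly
# inferred value (the challenger) replaces the current estimate (the
# incumbent) only if it carries a smaller error.

def revise(store, prop, challenger_value, challenger_error):
    """Propose a value; keep it only if it beats the incumbent's error."""
    incumbent = store.get(prop)
    if incumbent is None or challenger_error < incumbent[1]:
        store[prop] = (challenger_value, challenger_error)
        return True   # challenger wins
    return False      # incumbent stands

store = {}
revise(store, "bulk_density", 1.45, 0.10)  # no incumbent: accepted
revise(store, "bulk_density", 1.40, 0.08)  # lower error: accepted
revise(store, "bulk_density", 1.38, 0.12)  # higher error: rejected
print(store["bulk_density"])  # (1.4, 0.08)
```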
Edson Tirelli and Adam Mollenkopf presented next on “Temporal Reasoning: A Requirement for Complex Event Processing”. The talk started with a brief overview of CEP from Adam and an explanation of a CEP use case from FedEx. This involved time-specific deliveries for critical freight using exclusive non-stop door-to-door services, blended surface and air services, etc. The demo illustrated the use of complex event processing techniques to monitor various aspects of freight shipments, including route information, speed, temperature, etc. The system provides notification of problems as they are detected and feedback on estimated arrival times. It is implemented using Drools Fusion, uses a number of knowledge bases, and combines temporal, geo-spatial and (in future) probabilistic reasoning. It uses 30-minute sliding windows and surfaces data to the end user using Flex.
Edson then took over and talked about temporal reasoning in Drools. He discussed semantics, the expressivity of temporal relationships, the use of reference clocks and support for temporal dimensions. Fusion uses discrete time slices and a combination of point-in-time and interval-based events, applied to Allen’s 13 temporal operators. It provides a reference clock and a session clock interface with two implementations (RealTimeClock and PseudoClock). It will have a HeartbeatClock in future, and supports custom clocks. The session clock supports the use of sliding windows. The use of sliding windows means that negative patterns may require delaying rule firing. The hardest requirement to meet was support for the temporal dimension, where a rule might match at one time and then not match at a later time. CEP scenarios are stateful by nature, but events are only of interest for a period of time. Edson described the use of temporal dependency matrices and temporal distance.
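For readers unfamiliar with Allen’s interval algebra: each of the 13 relations is a simple comparison of two intervals’ endpoints. Here is a sketch of three of them (the event tuples and names are my own illustration, not Drools Fusion’s implementation):

```python
# A few of Allen's 13 interval relations, over events represented as
# (start, end) timestamp pairs. Temporal rule languages expose these
# as operators such as 'before', 'during' and 'overlaps'.

def before(a, b):
    """a ends strictly before b starts."""
    return a[1] < b[0]

def during(a, b):
    """a lies strictly inside b."""
    return b[0] < a[0] and a[1] < b[1]

def overlaps(a, b):
    """a starts first and they overlap without containment."""
    return a[0] < b[0] < a[1] < b[1]

pickup, transit = (0, 10), (5, 30)
print(before(pickup, transit), overlaps(pickup, transit))  # False True
```

The remaining relations (meets, starts, finishes, equals, and the inverses) follow the same endpoint-comparison pattern.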
Carlos Serrano-Morales from FICO was next up, talking about “Business Rules in the Cloud”. The competitive survival of the applications we write will depend on their ability to anticipate situations and to react quickly to change. Increasingly, we are looking at the advent of ‘big data’ and explosive growth. The data is often loosely structured or is not semantically unified. However, this data is often the most valuable asset in an organisation. Another issue is that organisations want to pay only for what is needed and to reduce the hit on capex by focussing on opex. The future is big data, asynchronous processing and extended enterprise architectures. It will be enabled by cloud computing, new enterprise architectures and new technologies. Cloud computing satisfies the economic imperative of elastic resource provision. Carlos introduced the ideas of Infrastructure as a Service, Platform as a Service and Software as a Service, and their relationship to traditional Internet companies. Enterprise architectures will evolve to a mixed on-premises/off-premises model. Traditional SQL stores don’t scale well for big data in the cloud, and cloud platforms are therefore evolving non-traditional forms of data storage. Carlos described Map-Reduce, and gave Microsoft Research’s DryadLINQ technology an honourable mention in despatches.
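For anyone who hasn’t met Map-Reduce: a mapper turns each record into key/value pairs, the framework groups values by key, and a reducer collapses each group. A minimal single-process sketch (word count, the canonical example; not any particular framework’s API):

```python
# Minimal map-reduce sketch: map each record to key/value pairs,
# group by key, then reduce each group. Real frameworks distribute
# the map and reduce phases across many machines.
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            groups[key].append(value)          # shuffle/group phase
    return {key: reducer(key, values) for key, values in groups.items()}

counts = map_reduce(
    ["big data", "big rules"],
    mapper=lambda line: [(word, 1) for word in line.split()],
    reducer=lambda word, ones: sum(ones),
)
print(counts)  # {'big': 2, 'data': 1, 'rules': 1}
```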
Carlos then moved on to the impact of cloud computing on decision management. Cloud computing needs agile decision services. The challenge will be to deploy decision services in an efficient and scalable fashion within the cloud. This will involve a decentralised approach with centralised management. Carlos suggested that we don’t yet have solutions for this in the rule community, but that we will have to work on this in the near future.
The afternoon was taken up with a Q&A session followed by the Thursday Think Tank on "Needful Things". I was a little surprised to be asked onto the panel at the last minute (James said it was a mistake that my name wasn’t on the list), so I didn’t get to record the discussion in detail. This year has been much better than last year with respect to discussions, thanks to the regular Q&A sessions, but I think the consensus is that we can do better next year regarding the think tank. It was unmoderated and lacked detailed planning. Some feel that there were too many people on the panel and that it became too wide-ranging. So, for next year, we need to consider various possibilities, such as pre-defining the questions and agenda, having a panel chairperson, breaking out into smaller interest groups with perhaps panels of three, etc. Anyway, the session ranged up and down with some good comments and observations, and a fair degree of agreement on the need, for example, to get smarter about the ways in which we select and combine different technologies to meet different needs. The discussion ran into some difficulty over low-level details of implementation, and I certainly felt that we began to lose people at that point. The lack of pre-planned questions meant that some issues were not addressed very fully. We finished off with...yes, of course...a discussion on benchmarks!
Posted on Friday, October 30, 2009 5:31 AM

Comments on this post: October Rules Fest: Day 4

# re: October Rules Fest: Day 4
Just found this tidbit thanks to Ken Kellis. Thanks, Ken. The T3 started off about September or so with some topics that I felt were needed, but I got shouted down: I was "constraining" the thoughts by forcing them into a set pattern of subjects. So, I backed off and adopted a "free-flowing format" without a moderator.

BIG MISTAKE!! That won't happen next year. Next year I hope that either myself or someone else who is basically independent will moderate the T3 and will ensure that the topics are focused on what we can do to IMPROVE the industry as a whole; NOT on what Drools or Open Rules or OPSJ or Advisor or Jess needs.

I really do appreciate you and Eric Charpentier blogging on ORF 2009. Now, if someone would just send some cash through the mail to help finance next year. :-)

Left by jco on Oct 31, 2009 12:48 AM

# re: ORF: Day 4 + Whiteboard Woes
I think James Owen wrote: "...Next year I hope that either myself or someone else who is basically independent will moderate the T3 and will ensure that the topics are focused on what we can do to IMPROVE the industry as a whole..."

I'd be happy to do it. Just please don't refer to me as the "Jess Guy" or the "Jess Guru" :-) I am Jason Morris of Morris Technical Solutions. At this point in time, I just happen to be supporting Jess, but I'm not officially a Sandia employee. True, Sandia (well Ernest and Craig) has been very good to me and so I'm very loyal to them, but business is business and they know that. I have no problem being vendor neutral since Sandia does not consider itself a vendor in competition with the likes of Drools, JRules, FICO, etc.

Jason Morris
Morris Technical Solutions LLC
Left by Jason Morris on Oct 31, 2009 10:37 PM

# re: October Rules Fest: Day 4
Yeah, I know the feeling. Last year I was singled out as the 'Microsoft' guy and got some strange reactions from one or two people (I will never forget the intensity in the eyes of the guy who told me that all .NET developers have, by definition, low IQs). I've never worked for Microsoft. I currently work for a Microsoft gold partner that makes its living from its expertise in certain MS tools and technologies. Don't get me wrong. I'm certainly an MS fanboy. What is nice about ORF, though, is that we have the opportunity for a few days to concentrate on the technology as it applies across the industry and not on commercial sectarianism.
Left by Charles Young on Nov 01, 2009 6:11 AM

# re: October Rules Fest: Day 4
I thought Karl Reich was our token Microsoft Guy. :)
Left by Greg Barton on Nov 06, 2009 2:50 PM


Copyright © Charles Young