I'm dealing with Boolean functions which I can only (but safely) assume come in as a sum of products (SOP) and contain no negations (e.g. (A && B && C) || (A && B && D)). The number of disjuncts is usually > 5, the number of conjuncts within each usually > 10.
Because in my case computing the value of each variable is hard and the result is to be considered ephemeral, I need to be able to minimize said functions with respect to variable occurrence. The result of this minimization does not need to be in any normal form and is allowed to be nested arbitrarily deep.
Having asked a similar question before, SO points to general solutions using fanout minimization, Karnaugh maps, Quine–McCluskey (QM) or BDDs. Before dealing with these approaches - which would blow up my code extensively - I'd like to double-check whether the facts known a priori about the input function allow for a smaller, though less general, approach to minimization.
AFAICS applying the absorption and distributivity laws will always provide the minimal form. Is there a way to exploit the fact that the functions come in as SOPs and have no negations? It appears to me that there should be a recursive algorithm built from simple intersection and union operations on the variable sets that will yield the desired result.
Can one describe that algorithm?
Edit: Request for comments: Having done some research on the topic, it appears to me that the question asked here is equivalent to finding an optimal variable ordering for the reduced BDD of the given function.
Background: The minimized function is passed on to a job queue to figure out the value of all required variables. The function is evaluated afterwards. Consider the application examples:
- The input function (A && B && C) || (A && B && D) can be written as A && B && (C || D), which eliminates having to evaluate A and B twice. Evaluation of C and D is serialized in the job queue because only one of them needs to be proven true.
- (A && B && C) || (A && B && C && D) || (A && B && X && E) is reduced to A && B && (C || (X && E)). The evaluation of X && E is considered harder and is therefore placed behind the evaluation of C in the queue; the evaluation of D is dropped.
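The two transformations above can be sketched with plain set operations (class and method names are my own; this sketch applies absorption first and then factors out only the variables common to all remaining terms - a full solution would recurse on the remainders to allow arbitrarily deep nesting):

```java
import java.util.*;

public class SopFactorizer {

    // Each conjunction is a set of variable names; the SOP is a list of such sets.

    // Absorption: any term that is a strict superset of another term is redundant,
    // e.g. (A && B && C && D) is absorbed by (A && B && C).
    static List<Set<String>> absorb(List<Set<String>> terms) {
        List<Set<String>> kept = new ArrayList<>();
        for (Set<String> t : terms) {
            boolean redundant = false;
            for (Set<String> u : terms) {
                if (t != u && t.containsAll(u) && !u.containsAll(t)) {
                    redundant = true;
                    break;
                }
            }
            if (!redundant && !kept.contains(t)) kept.add(t);
        }
        return kept;
    }

    // Reverse distributivity: factor out the variables common to ALL terms,
    // e.g. (A && B && C) || (A && B && D)  ->  A && B && (C || D).
    static String factor(List<Set<String>> terms) {
        terms = absorb(terms);
        if (terms.size() == 1) return String.join(" && ", terms.get(0));

        Set<String> common = new LinkedHashSet<>(terms.get(0));
        for (Set<String> t : terms) common.retainAll(t);

        List<String> parts = new ArrayList<>();
        for (Set<String> t : terms) {
            Set<String> rest = new LinkedHashSet<>(t);
            rest.removeAll(common);   // never empty here: absorption already ran
            parts.add(rest.size() > 1 ? "(" + String.join(" && ", rest) + ")"
                                      : rest.iterator().next());
        }
        String tail = "(" + String.join(" || ", parts) + ")";
        return common.isEmpty() ? tail : String.join(" && ", common) + " && " + tail;
    }
}
```

On the two examples above this yields A && B && (C || D) and A && B && (C || (X && E)); ordering the disjuncts by evaluation cost in the job queue would be a separate step.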
Given your assumptions, you'll need a function that inspects the structure of the expression before evaluating the required variables.
There's no ready-made algorithm that will do this for you, at least not in Java, so you'll need to codify the simplification rules yourself and keep iterating until you reach the most general abstraction.
Boolean algebra
There you have all the laws used in logic, the first three being the most useful for you since you don't use the NOT operation. I hope this helps.
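For instance, since your terms have no negations, idempotence and absorption both reduce to a simple subset check on the product terms (a sketch; the class and method names are my own):

```java
import java.util.*;

public class SopLaws {
    // Idempotence: X || X == X.  Absorption: X || (X && Y) == X.
    // With each product term modeled as a set of variables, both laws
    // reduce to: drop any term that contains another term as a subset.
    static List<Set<String>> simplify(List<Set<String>> terms) {
        List<Set<String>> kept = new ArrayList<>();
        for (Set<String> t : terms) {
            // t is redundant if it contains some already-kept term
            boolean redundant = false;
            for (Set<String> u : kept) {
                if (t.containsAll(u)) { redundant = true; break; }
            }
            if (!redundant) {
                // a new, smaller term may in turn absorb earlier ones
                kept.removeIf(u -> u.containsAll(t));
                kept.add(t);
            }
        }
        return kept;
    }
}
```

For example, (A && B && C) || (A && B && C && D) || (A && B) collapses to just A && B.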