I've been solving previous years' exam questions for practice and I came across one problem that I /suspect/ I can't solve without a number theory relation that I am not aware of.
The problem is:
Given an array of N integers, find the smallest element that is a divisor of all the elements before it in the array.
Now the problem is that if I can't cache any results from the modulo operations, the complexity becomes O(n^2), which is too slow to pass the problem's automatic tests given the time limit and a potential input size of 3 million elements.
Are there any f and g for which f(d, g(a1, a2, a3, a4, ..., an)) is true if and only if d | a1, d | a2, d | a3, ..., d | an? If not, do you guys have any other suggestions on approaches to the problem?
Any help is appreciated!
An element an divides all the previous elements if and only if it divides the greatest common divisor of those elements. So you need to keep track of a running gcd(a1, a2, ..., an) and the minimum an for which an | gcd(...).
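Here's a minimal sketch of that idea in C++ (the input format, N followed by N positive integers on standard input, is an assumption; so is letting the first element count vacuously, since it has no previous elements; adjust both to the actual problem statement):

```cpp
#include <iostream>
#include <numeric>   // std::gcd (C++17)
#include <vector>

int main() {
    std::ios::sync_with_stdio(false);
    std::cin.tie(nullptr);

    // Assumed input format: N followed by N positive integers.
    std::size_t n;
    std::cin >> n;
    std::vector<long long> a(n);
    for (auto &x : a) std::cin >> x;

    long long running_gcd = 0;  // gcd of everything seen so far; gcd(0, x) == x
    long long best = -1;        // smallest element dividing all previous ones; -1 = none yet

    for (std::size_t i = 0; i < n; ++i) {
        // a[i] divides every previous element iff it divides their gcd.
        // For i == 0 the condition holds vacuously (there are no previous elements);
        // whether the first element should count is an assumption about the statement.
        if (i == 0 || running_gcd % a[i] == 0) {
            if (best == -1 || a[i] < best) best = a[i];
        }
        running_gcd = std::gcd(running_gcd, a[i]);
    }

    std::cout << best << '\n';  // prints -1 if no element qualifies
    return 0;
}
```

Each element is visited once and the running gcd only ever shrinks, so the whole pass is roughly O(n log max(ai)), which should handle 3 million elements comfortably within a typical time limit.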