I encountered an algorithms question which seemed easy at first, but I'm puzzled and don't know what to do! The question is as follows:
Suppose we have a number n, and a matrix with n columns and 2^n rows. The rows are like the power set of n bits (e.g. for n=2 the rows are 10, 11, 00, 01), but they don't necessarily appear in any particular order. We can access each matrix member in O(1) time. Now we are given only (2^n)-1 of the rows, and we want to find the missing row in the least possible time; note that there are n*((2^n)-1) members in total, but we only have O(2^n) time. How can we do this? (We have O(n) memory, though I'm not sure whether that's enough; an answer that uses any amount of memory is also fine.)
Example input:
3
101
110
100
000
111
011
010
Example output: 001
I tried to model it on the popular missing-number problem (by XOR-ing the matrix members) but couldn't make much progress.
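For concreteness, here is roughly the XOR idea I was trying, as a minimal Python sketch. It does produce the missing row, but only by reading every one of the n*((2^n)-1) members, so it doesn't fit inside the O(2^n) budget:

def missing_by_xor(rows, n):
    # Classic missing-number trick: XOR together all values 0..(2^n)-1,
    # then XOR out every value that is present; what remains is the
    # missing value. This touches every cell, i.e. Theta(n * 2^n) reads.
    acc = 0
    for v in range(2 ** n):
        acc ^= v
    for r in rows:
        acc ^= int(r, 2)   # reads all n bits of the row
    return format(acc, '0' + str(n) + 'b')

print(missing_by_xor(["101", "110", "100", "000", "111", "011", "010"], 3))  # prints 001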
P.S. An implementation of the algorithm in C/C++/Java/Python would be very much appreciated, as it would clarify things.
P.P.S.: Can you also cite some sources, or tell me how you came up with the answer and how you think about such questions?
The constraints are a little vague, but you're probably looking for something like this:

Scan the first column of every row and count how many rows start with 0 and how many start with 1. The complete set of 2^n rows would contain exactly 2^(n-1) of each, so whichever bit value appears only 2^(n-1) - 1 times is the first bit of the missing row. Keep just the rows that start with that bit (there are 2^(n-1) - 1 of them) and repeat the same counting argument on the second column, then the third, and so on; each pass fixes one more bit of the answer and halves the set of candidate rows.

If you sum up the number of elements read, you get 2^n + 2^(n-1) + 2^(n-2) + ..., which is less than 2^(n+1) and therefore O(2^n).
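Here is a minimal Python sketch of that column-by-column filtering. It keeps a list of candidate row indices, so it uses on the order of 2^n extra memory rather than O(n), which the question allows:

def find_missing_row(matrix, n):
    # matrix: list of (2^n)-1 distinct n-bit strings, in any order.
    candidates = list(range(len(matrix)))   # rows consistent with the bits found so far
    missing = []
    for col in range(n):
        zeros, ones = [], []
        for i in candidates:
            (zeros if matrix[i][col] == '0' else ones).append(i)
        # Rows sharing the prefix found so far would split evenly between
        # 0 and 1 in this column if none were missing; the smaller group
        # is the one missing a row, so its bit is the next bit of the answer.
        if len(zeros) <= len(ones):
            missing.append('0')
            candidates = zeros
        else:
            missing.append('1')
            candidates = ones
    return ''.join(missing)

print(find_missing_row(["101", "110", "100", "000", "111", "011", "010"], 3))  # prints 001

The first pass reads (2^n)-1 cells, the second about 2^(n-1), and so on, matching the sum above.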