Measure the tradeoff between accuracy and speed of algorithms


I have a group of algorithms A, B, C and D. Each has a measured execution time and a measured accuracy (MSE, so lower is better). Is there a formal way of calculating the tradeoff between execution time (speed) and accuracy?

For example, suppose A has an MSE of 0.1 and a computation time of 3 s, while algorithm B has a slightly better MSE of 0.095 but needs 150 s to execute. Although B is slightly more accurate, the tradeoff should favor A, since A takes considerably less time to execute.

Is there any equation or formal approach I can use to calculate this tradeoff?
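
To make this concrete, here is the kind of formalization I have in mind (my own sketch, not an established method I know of): either scalarize the two objectives into a single score, score = MSE + lambda * time, where the weight lambda says how much one second of runtime is "worth" in MSE units, or keep only the algorithms that are Pareto-optimal in (MSE, time). The Python sketch below uses the numbers from my example; the value of lambda is made up.

```python
# Sketch of two standard ways to formalize a time/accuracy tradeoff.
# The weight LAMBDA is an illustrative assumption, not a principled value.

LAMBDA = 0.0005  # MSE-equivalent "cost" of one second of runtime (assumed)

# (name, mse, time_seconds) -- numbers from the example above
candidates = [("A", 0.100, 3.0), ("B", 0.095, 150.0)]

# 1) Weighted scalarization: lower combined score is better.
def score(mse, seconds, lam=LAMBDA):
    return mse + lam * seconds

best = min(candidates, key=lambda c: score(c[1], c[2]))
print("scalarized winner:", best[0])  # -> A for this choice of LAMBDA

# 2) Pareto filtering: discard any algorithm that another one beats
#    (or ties) on both objectives while strictly beating it on one.
def pareto_front(cands):
    front = []
    for name, mse, t in cands:
        dominated = any(
            m2 <= mse and t2 <= t and (m2 < mse or t2 < t)
            for _, m2, t2 in cands
        )
        if not dominated:
            front.append(name)
    return front

print("Pareto-optimal set:", pareto_front(candidates))  # -> ['A', 'B']
```

With this weighting, A wins, but a much smaller lambda would favor B. That subjectivity in choosing the weight is exactly what I'd like a formal way around.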

1 Answer

Although it may not be appropriate in every situation, I've found a paper that could be useful to future readers looking for a time-accuracy tradeoff. It was published by researchers at MIT.

Sidiroglou-Douskos, S., Misailovic, S., Hoffmann, H., and Rinard, M. (2011). Managing Performance vs. Accuracy Trade-offs with Loop Perforation. In Proceedings of the ACM SIGSOFT Symposium and the European Conference on Foundations of Software Engineering (ESEC/FSE '11), pages 124–134. ACM.
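
For readers unfamiliar with the technique: loop perforation speeds a program up by executing only a fraction of a loop's iterations and accepting the resulting loss of accuracy, which gives a tunable knob between the two objectives. The toy Python below is my own illustration of the idea, not code from the paper.

```python
# Toy illustration of loop perforation: run only every k-th iteration
# of an aggregating loop, trading accuracy for speed.
# This is a sketch of the general idea, not the paper's implementation.

def mean_perforated(values, skip_factor=1):
    """Approximate the mean using every skip_factor-th element.

    skip_factor=1 is the exact computation; larger values do
    proportionally less work but introduce approximation error.
    """
    sampled = values[::skip_factor]
    return sum(sampled) / len(sampled)

data = [float(i % 97) for i in range(1_000_000)]
exact = mean_perforated(data, skip_factor=1)
fast = mean_perforated(data, skip_factor=10)  # roughly 10x less work
print(f"exact={exact:.4f}  perforated={fast:.4f}  error={abs(exact - fast):.4f}")
```

Sweeping skip_factor over a range and recording (time, error) pairs gives exactly the kind of tradeoff curve the question asks about, which can then be scored or Pareto-filtered as above.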