Why are GPUs only used for linear algebra as opposed to nonlinear calculations?


I keep hearing that GPUs are useful because they are quick at linear algebra.

I see how a GPU can be utilised to quickly perform linear calculations, and I see why that is useful, but I don't see why these calculations need to be linear.

Why can't we have each GPU core take in 4 numbers a, b, c, d and compute a^b + c^d, or any other nonlinear function?

If the answer is that linear algebra is more efficient: how is linear algebra more efficient and how would one utilise linear algebra to compute or approximate an arbitrary nonlinear function (if specificity is required, assume the function is a nonlinear polynomial)?
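(For the record, nothing stops a GPU from evaluating a nonlinear function elementwise; "elementwise kernels" do exactly this. A minimal sketch of the idea using NumPy on the CPU as a stand-in for a GPU kernel — GPU array libraries such as CuPy expose the same vectorized style:)

```python
import numpy as np

# Four input arrays; "lane" i independently computes a[i]**b[i] + c[i]**d[i].
# On a GPU, each core would handle one (or several) of these lanes in parallel.
a = np.array([2.0, 3.0, 4.0])
b = np.array([3.0, 2.0, 0.5])
c = np.array([1.0, 2.0, 3.0])
d = np.array([5.0, 3.0, 2.0])

result = a**b + c**d  # a nonlinear function, applied elementwise
print(result)  # [ 9. 17. 11.]
```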

2 Answers

BEST ANSWER

GPUs are used for pretty much everything. Your observation is not really about GPUs or programming; it's about the books and articles written on the subject.

Here are some reasons why you mostly see examples about linear algebra.

  1. Linear algebra is relatively simple, so it is easy to explain how massive parallelism helps.

  2. Linear algebra is used for a lot of things. For some practical applications, speeding up just the linear algebra already yields a massive performance win, even when the matrices involved are assembled on the CPU with scalar code.

  3. Linear algebra is simple enough to be abstracted away in a library like cuBLAS. Arbitrary nonlinear functions tend to require custom compute kernels, which is harder than just consuming a library someone else wrote.
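As for the question's last part: a nonlinear polynomial can be evaluated with linear algebra by putting the powers of x into a matrix. A sketch (illustrative only; on a GPU the matrix-vector product below would be a single BLAS call, e.g. through cuBLAS):

```python
import numpy as np

# Evaluate p(x) = 1 + 2x + 3x^2 at many points at once.
coeffs = np.array([1.0, 2.0, 3.0])  # polynomial coefficients
xs = np.array([0.0, 1.0, 2.0])      # evaluation points

# Vandermonde matrix: row i is [1, x_i, x_i^2]; the nonlinearity lives here.
V = np.vander(xs, N=3, increasing=True)

# The evaluation itself is now linear: one matrix-vector product.
values = V @ coeffs
print(values)  # p(0)=1, p(1)=6, p(2)=17
```

The nonlinear work (raising x to powers) is done once while building the matrix; everything after that is a standard, highly optimized linear operation.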

SECOND ANSWER

GPUs are useful when the computations they need to perform can be parallelized, i.e., executed with a "divide et impera" (divide and conquer) approach: the problem is divided into subproblems, each subproblem is solved separately, and the partial results are combined into the solution of the original problem.

Linear algebra makes intensive use of matrix multiplication, which is a classic example of a problem that can be solved by parallelization. That is why GPUs are so efficient in practical applications that depend on it, such as deep learning.
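The divide-and-combine structure of matrix multiplication can be sketched as follows: each output block depends only on a row of blocks of A and a column of blocks of B, so the four block products below could be computed by four independent groups of GPU cores (an illustrative NumPy sketch, not actual GPU code):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Split each matrix into 2x2 blocks.
A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

# Each output block is an independent subproblem: compute, then combine.
C = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])

assert np.allclose(C, A @ B)  # same result as the direct product
```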