Performance issues processing & updating huge dataset in JS


I have a Python Tkinter app that loads a dataset with 10 million data points into a plot using Matplotlib. In addition, it allows the user to repeatedly apply various signal-processing filters at runtime that manipulate the data and update the plot.

I'm re-building this app as a web platform using React and a microservices architecture. The data files are stored in S3 and loaded by the client side.

The data filters that should be applied are:

  • Butterworth
  • Trace normalize
  • Standard deviation
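For scale: two of the three filters above are a few lines of plain JS each, so they need no library at all. A minimal sketch on typed arrays (the function names and the population-vs-sample standard deviation choice are my own assumptions, not scipy's exact semantics):

```javascript
// Population standard deviation of a signal.
function stdDev(data) {
  let sum = 0;
  for (let i = 0; i < data.length; i++) sum += data[i];
  const mean = sum / data.length;
  let sqSum = 0;
  for (let i = 0; i < data.length; i++) {
    const d = data[i] - mean;
    sqSum += d * d;
  }
  return Math.sqrt(sqSum / data.length);
}

// Trace normalization: scale the trace so its peak absolute amplitude is 1.
function traceNormalize(data) {
  let maxAbs = 0;
  for (let i = 0; i < data.length; i++) {
    const a = Math.abs(data[i]);
    if (a > maxAbs) maxAbs = a;
  }
  const out = new Float64Array(data.length);
  if (maxAbs === 0) return out; // all-zero trace stays zero
  for (let i = 0; i < data.length; i++) out[i] = data[i] / maxAbs;
  return out;
}
```

Plain indexed loops over `Float64Array` like these JIT-compile well, so single passes over 10 million points are typically in the tens of milliseconds.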

The Python libraries currently in use are:

  • scipy
  • NumPy
  • matplotlib

The problems I'm facing are as follows:

  • There are no apparent JS libraries that suit my needs for complex signal processing the way Python does
  • Doing the processing anywhere other than the client side will result in unnecessary network traffic.
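On the first point, the math itself is portable even without a library: a second-order Butterworth low-pass, for example, can be hand-rolled from the standard bilinear-transform biquad coefficients. A hedged sketch (function name and parameters are mine; this is a causal single pass, so it will not match `scipy.signal.filtfilt`'s zero-phase output):

```javascript
// Single-pass 2nd-order Butterworth low-pass (Direct Form I biquad).
// Coefficients follow the standard bilinear-transform formulas with
// Q = 1/sqrt(2), which is what makes the response Butterworth.
function butterworthLowpass(data, cutoffHz, sampleRateHz) {
  const w0 = (2 * Math.PI * cutoffHz) / sampleRateHz;
  const cosw0 = Math.cos(w0);
  const alpha = Math.sin(w0) / Math.SQRT2; // sin(w0) / (2Q), Q = 1/sqrt(2)
  const a0 = 1 + alpha;
  const b0 = (1 - cosw0) / 2 / a0;
  const b1 = (1 - cosw0) / a0;
  const b2 = b0;
  const a1 = (-2 * cosw0) / a0;
  const a2 = (1 - alpha) / a0;

  const out = new Float64Array(data.length);
  let x1 = 0, x2 = 0, y1 = 0, y2 = 0; // filter state (previous samples)
  for (let i = 0; i < data.length; i++) {
    const x0 = data[i];
    const y0 = b0 * x0 + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
    out[i] = y0;
    x2 = x1; x1 = x0;
    y2 = y1; y1 = y0;
  }
  return out;
}
```

Higher-order Butterworth filters are just cascades of such biquads, so the same pattern extends to whatever order the desktop app uses.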

What is the right approach for applying complex mathematical calculations to a huge dataset live in JS?

My initial thinking was to send an HTTP request to a Python microservice with the chosen filters, which it would apply to the file before sending the data back to the client. The biggest problem with that approach is the huge amount of network traffic used just to make a small change. For comparison, the desktop app takes less than half a second to process such a change.

My web app is written in ReactJS and the plot library I'm using is Plotly.js, which is working just fine, but the real-time data manipulation is the big challenge.
