I've read in several places, including Stack Overflow posts, that it is no longer true that Node.js has better performance than .NET in the .NET Core era.
I've also read that part of the reason Node.js had an advantage in the first place is that it doesn't have the overhead of context switching, thanks to its single-threaded model.
But in .NET Core, although we use await and free up the thread while waiting for an I/O operation, there is still no single-threaded model like Node.js's, so it still carries the burden of context switching.
My question is: how can .NET Core compare to Node.js in performance when it has a context-switching overhead that doesn't exist in Node.js?