Big-file streaming, processing, and multithreading.

When I was at Bosch, some large files needed to be processed, but they were too big for the memory allocated to the application: trying to read a whole file into memory threw an OutOfMemoryError. A related failure is StackOverflowError, which occurs when recursion goes too deep and the call stack fills up, even though the heap often still has vast free space.

So the file has to be read as a stream: keep reading from it and parse it line by line, one line at a time.
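A minimal sketch of this in Java, using a buffered reader so only one line is ever held in memory (the class name and the line-counting "work" are illustrative, not the original code):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class LineStream {
    // Reads the file as a stream, one line at a time, so memory use stays
    // constant regardless of file size. Here the per-line "processing" is
    // just counting; a real job would parse or transform each line.
    public static long countLines(Path file) throws IOException {
        long count = 0;
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            while ((line = reader.readLine()) != null) {
                count++; // process(line) would go here
            }
        }
        return count;
    }
}
```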

To accelerate this, the file can be split into large blocks, each processed by its own thread, with the per-block outputs combined at the end. File APIs support seeking to a byte offset and reading from there, so each thread can jump directly to its block's start. When a thread finishes, it can signal completion, for example by releasing a semaphore or counting down a latch; once every thread has signaled, the partial outputs are merged into the final result.
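A sketch of the block strategy, under some assumptions not in the original: a CountDownLatch plays the role of the semaphore, chunk boundaries are re-aligned to line starts so no line is processed twice, and the per-thread work is again just line counting:

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.file.Path;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicLongArray;

public class ChunkedProcessor {
    // Splits the file into byte ranges, one thread per range. Each thread
    // seeks to its range, skips the tail of a line owned by the previous
    // chunk, processes the lines that start inside its range, and counts
    // down a latch when done. The main thread waits on the latch and then
    // merges the partial results.
    public static long countLinesParallel(Path file, int threads)
            throws InterruptedException {
        long size = file.toFile().length();
        long chunk = Math.max(1, size / threads);
        AtomicLongArray partial = new AtomicLongArray(threads);
        CountDownLatch done = new CountDownLatch(threads);
        for (int i = 0; i < threads; i++) {
            final int id = i;
            final long start = i * chunk;
            final long end = (i == threads - 1) ? size : (i + 1) * chunk;
            new Thread(() -> {
                try (RandomAccessFile raf =
                         new RandomAccessFile(file.toFile(), "r")) {
                    if (start > 0) {
                        // Back up one byte and discard a line: this lands
                        // exactly on the first line starting at or after
                        // `start`, even when `start` is itself a line start.
                        raf.seek(start - 1);
                        raf.readLine();
                    }
                    long count = 0;
                    // A line belongs to this chunk if it *starts* before `end`.
                    while (raf.getFilePointer() < end) {
                        String line = raf.readLine();
                        if (line == null) break;
                        count++; // process(line) would go here
                    }
                    partial.set(id, count);
                } catch (IOException e) {
                    throw new RuntimeException(e);
                } finally {
                    done.countDown(); // signal "this block is finished"
                }
            }).start();
        }
        done.await(); // wait for every block's signal
        long total = 0;
        for (int i = 0; i < threads; i++) total += partial.get(i);
        return total;
    }
}
```

The latch here is interchangeable with a `Semaphore` where the main thread acquires `threads` permits; `CountDownLatch` just states the "wait for N completions" intent more directly.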

This is similar in spirit to JavaScript's Promise.all: even though JavaScript is single-threaded, each task can carry an ordinal so the outputs can be sorted back into submission order once everything has finished.
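In Java the same ordering guarantee comes for free from `ExecutorService.invokeAll`, which returns futures in the order the tasks were submitted, regardless of the order in which they complete (the uppercase "work" below is a stand-in):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class OrderedResults {
    // Like Promise.all: tasks run concurrently, but the returned futures
    // are in submission order, so the merged output is already sorted.
    public static List<String> processInOrder(List<String> inputs)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Callable<String>> tasks = new ArrayList<>();
            for (String in : inputs) {
                tasks.add(() -> in.toUpperCase()); // stand-in for real work
            }
            List<Future<String>> futures = pool.invokeAll(tasks);
            List<String> results = new ArrayList<>();
            for (Future<String> f : futures) {
                results.add(f.get()); // blocks until that task is done
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```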