Sharing a bytearray in multiprocessing


I have two functions and one variable of bytearray type, `data`.

The first function reads a file and appends bytes to `data`; the second takes bytes from `data` and deletes them. These functions should run at the same time, so I use multiprocessing and want to share the variable as a Value. But the typecode_or_type argument of Value does not support a bytearray type.
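One thing worth knowing here (a hedged sketch, not necessarily the right fit for this use case): while Value has no bytearray typecode, multiprocessing.Array with typecode 'B' does give a shared buffer of raw bytes. The catch is that it is fixed-size, so it cannot grow with extend() or shrink with del the way a bytearray can.

```python
# Hedged sketch: Array('B', n) is a fixed-size shared byte buffer.
# It supports slice reads/writes but cannot be resized like a bytearray.
from multiprocessing import Array

buf = Array('B', 8)            # 8 shared unsigned bytes, zero-initialized
buf[:3] = list(b'abc')         # write bytes via slice assignment
print(bytes(buf[:3]))          # read them back as bytes
```

Because the size is fixed, this only works if an upper bound on the buffered data is known in advance.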

Okay, I started looking on Stack Overflow and found a few approaches:

  • Proxy
  • Shared memory management (SharedMemory)
  • Numpy

There may be other ways I just scrolled past. There is a lot of information and I got confused. Which approach should I explore further and use?
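To make the second option from the list above concrete, here is a minimal sketch of SharedMemory (Python 3.8+): one process writes bytes into a shared block, the parent reads them back. Chunking and synchronization, which a real reader/writer pair would need, are left out for brevity.

```python
# Hedged sketch of the SharedMemory approach: a child process writes a
# payload into a shared block created by the parent, which then reads it.
from multiprocessing import Process
from multiprocessing.shared_memory import SharedMemory

def producer(name, payload):
    shm = SharedMemory(name=name)         # attach to the existing block
    shm.buf[:len(payload)] = payload      # copy bytes into shared memory
    shm.close()

if __name__ == '__main__':
    payload = b'hello from the child process'
    shm = SharedMemory(create=True, size=len(payload))
    p = Process(target=producer, args=(shm.name, payload))
    p.start()
    p.join()
    print(bytes(shm.buf[:len(payload)]))  # bytes written by the child
    shm.close()
    shm.unlink()                          # free the block when done
```

Like Array, a SharedMemory block is fixed-size once created, so a growing/shrinking bytearray would have to be emulated on top of it (e.g. as a ring buffer with shared read/write indices).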

THX!

import multiprocessing
from multiprocessing import Value

def reader(stream):
    # s3_obj is an S3 object opened elsewhere
    for b in s3_obj._raw_stream:
        stream.extend(b)
        print('Add bytes to stream')

def writer(stream):
    # BUFFER_SIZE and process (a subprocess with a stdin pipe) are defined elsewhere
    while True:
        if len(stream) >= BUFFER_SIZE:
            process.stdin.write(stream[:1024])
            del stream[:1024]
            print('Dump bytes to pipe')


if __name__ == '__main__':
    stream = bytearray()
    data = Value('????', stream)  # no typecode for bytearray -- this is the problem
    p1 = multiprocessing.Process(name='p1', target=reader, args=(data, ))
    p2 = multiprocessing.Process(name='p2', target=writer, args=(data, ))
    p1.start()
    p2.start()
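One more option, not in the list above but worth naming: since the pattern in the code is a single producer feeding a single consumer, the shared bytearray can be avoided entirely by passing chunks through a multiprocessing.Queue. A hedged sketch, with the S3 stream and the pipe write replaced by stand-ins:

```python
# Hedged alternative sketch: pass byte chunks through a Queue instead of
# sharing one mutable bytearray. The chunk source and sink here are
# stand-ins for s3_obj._raw_stream and process.stdin from the question.
import multiprocessing

def reader(q):
    for chunk in (b'abc', b'def'):  # stand-in for the S3 raw stream
        q.put(chunk)
    q.put(None)                     # sentinel: no more data

def writer(q):
    while True:
        chunk = q.get()             # blocks until a chunk arrives
        if chunk is None:
            break
        print('Dump %d bytes to pipe' % len(chunk))

if __name__ == '__main__':
    q = multiprocessing.Queue()
    p1 = multiprocessing.Process(name='p1', target=reader, args=(q, ))
    p2 = multiprocessing.Process(name='p2', target=writer, args=(q, ))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
```

This also removes the busy-wait in the original writer loop, because q.get() blocks until data is available.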
