I was hoping for a clean solution for throttling a specific type of producer while the consumer is busy processing, without writing a custom block of my own. I had hoped the code below would do exactly that, but once SendAsync
blocks after the capacity limit is hit, its task never completes, implying that the postponed message is never consumed.
_block = new TransformBlock<int, string>(async i =>
{
    // Send the next request to our own input queue
    // before processing this request, or wait
    // while the pipeline is full.
    // Do not start processing if the pipeline is full!
    await _block.SendAsync(i + 1);
    // Process this request and pass it on to the
    // next block in the pipeline.
    return i.ToString();
},
// A TransformBlock has input and output buffers. Limit both;
// otherwise requests that cannot be passed on to the next
// block in the pipeline are cached in this block's output
// buffer and never throttle this block.
new ExecutionDataflowBlockOptions { BoundedCapacity = 5 });
// This block is linked to the output of the
// transform block.
var action = new ActionBlock<string>(async s =>
{
    // Do some very long processing on the transformed element.
    await Task.Delay(1000);
},
// Limit the buffer size, and consequently throttle the previous
// blocks in the pipeline.
new ExecutionDataflowBlockOptions { BoundedCapacity = 5 });
_block.LinkTo(action);
// Start running.
_block.Post(0);
I was wondering if there is any reason why the linked ActionBlock
does not consume the postponed message.
I faced the same problem as you. I didn't dig deep into the implementation of LinkTo, but I think it propagates a message only when the source block receives a new one. That is, there may be a case where the source block has messages in its input queue, but it will not process them until the next Post/SendAsync it receives. And that's your case.
Here is my solution, and it's working for me.
First declare the "engine".
Then a producer that uses the engine.
Now you can use the producer with a consumer (e.g. an ActionBlock).
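The original code for the engine and producer is not shown here, so below is a minimal self-contained sketch of the idea as I understand it, with names of my own invention (ProduceAsync and the rest are not the answer's actual code): instead of the block feeding itself, an external producer loop awaits SendAsync into a bounded pipeline, so backpressure suspends the producer rather than leaving a postponed message that is never revisited.

```csharp
using System;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

public static class Program
{
    // "Engine": pushes items into a bounded target and awaits SendAsync,
    // so the producer is suspended while the pipeline is full instead of
    // postponing a message inside the block's own delegate.
    public static async Task ProduceAsync(ITargetBlock<int> target, int count)
    {
        for (var i = 0; i < count; i++)
        {
            // Completes only when the target actually accepts the item.
            await target.SendAsync(i);
        }
        target.Complete();
    }

    public static async Task Main()
    {
        var transform = new TransformBlock<int, string>(
            i => i.ToString(),
            new ExecutionDataflowBlockOptions { BoundedCapacity = 5 });

        var action = new ActionBlock<string>(
            async s => await Task.Delay(1000), // simulate slow consumer
            new ExecutionDataflowBlockOptions { BoundedCapacity = 5 });

        transform.LinkTo(action,
            new DataflowLinkOptions { PropagateCompletion = true });

        // The producer is throttled by the bounded capacities downstream.
        await ProduceAsync(transform, 20);
        await action.Completion;
    }
}
```

The key design point is that the loop producing new items lives outside the TransformBlock's delegate, so a full pipeline simply delays the awaited SendAsync rather than deadlocking the block on its own input queue.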
Hope it helps!