What is the size limitation for Azure Durable Entities?


Azure Durable Entities are stateful components offered by Azure Functions. They can hold state, which must be JSON-serializable and is persisted in reliable storage.

My questions are:

  1. How large can the state of an entity be?
  2. What is the pricing model for entities? Can I rely on the GB-s reported by the Azure Portal for pricing, or is the storage used by the entities also priced separately?

There are 3 answers below

On BEST ANSWER

JayakrishnaGunnam-MT's pricing answer is correct for normal functions, but Azure Durable Entities/Functions are additionally billed at normal storage account rates for storage and for transactions against tables and queues. https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-billing

From what I could find about the maximum size of a single entity, the only limit I was able to identify was the size of a blob. I could not find any other limit in the DurableEntityContext implementation, though I might be missing something.

UPDATE: After some testing, it seems that durable entity states under 64 KB are stored inside Table storage; above that, they are moved to a blob. My simple entity holding just a large string could only store about 50-100 MB, and at around 100 MB it started to have issues: sometimes I received an OutOfMemoryException (on the Consumption plan).
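To illustrate the observed (not officially documented) behaviour above, here is a small sketch that measures the JSON-serialized size of a state and classifies where it would likely land. The state shapes and the `storage_tier` helper are hypothetical; the 64 KB threshold is the one observed in the testing described:

```python
import json

# Observed (not officially documented) threshold from the testing above:
# states under ~64 KB appear to stay in Table storage, larger ones move to a blob.
TABLE_THRESHOLD_BYTES = 64 * 1024

def serialized_size(state) -> int:
    """Size in bytes of the JSON-serialized entity state."""
    return len(json.dumps(state).encode("utf-8"))

def storage_tier(state) -> str:
    """Classify where the state would likely be stored, per the observation above."""
    return "table" if serialized_size(state) < TABLE_THRESHOLD_BYTES else "blob"
```

For example, a small counter state stays well under the table threshold, while a state holding a 200 KB string would end up in a blob.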

On

Assuming that you will be reading your durable entities back (using ReadEntityStateAsync<T>()), there is a practical limit on how much data you can store in a durable entity.

IDurableEntityClient documentation for ReadEntityStateAsync<T>() says:

Tries to read the current state of an entity. Returns default(<typeparamref name="T" />) if the entity does not
exist, or if the JSON-serialized state of the entity is larger than 16KB.

So it is advisable to store only the small amounts of data you actually need, such as counters or dates, depending on your application.
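Given the 16 KB limit quoted above, one practical option is to check the serialized size of a state before persisting it, so it does not silently become unreadable via ReadEntityStateAsync<T>(). A minimal sketch of such a guard (the `assert_readable` helper is illustrative, not part of the SDK):

```python
import json

# ReadEntityStateAsync<T>() is documented to return default(T) when the
# JSON-serialized state exceeds 16 KB, so states above that become unreadable.
READ_LIMIT_BYTES = 16 * 1024

def assert_readable(state) -> None:
    """Raise before persisting a state too large to read back via ReadEntityStateAsync<T>()."""
    size = len(json.dumps(state).encode("utf-8"))
    if size > READ_LIMIT_BYTES:
        raise ValueError(
            f"State is {size} bytes; ReadEntityStateAsync<T>() would return "
            f"default(T) above {READ_LIMIT_BYTES} bytes."
        )
```

A counter-style state passes the guard, while a state holding, say, a 20 KB string would raise.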

On
  1. The storage size limits for entities come down to how much data can fit into Azure Table storage; I don't think there is a separate threshold for using them.
  2. Under the pricing model, Azure Functions are charged based on memory size, execution time, and number of executions per month.

Functions are billed based on observed resource consumption measured in gigabyte-seconds (GB-s). Observed resource consumption is calculated by multiplying average memory size in gigabytes by the time in milliseconds it takes to execute the function. Memory used by a function is measured by rounding up to the nearest 128 MB, up to the maximum memory size of 1,536 MB, with execution time calculated by rounding up to the nearest 1 ms. The minimum execution time and memory for a single function execution are 100 ms and 128 MB respectively. Functions pricing includes a monthly free grant of 400,000 GB-s.

Link to the pricing calculator: https://azure.microsoft.com/en-us/pricing/details/functions/