Too many open files - KairosDB

On running this query:

{ "start_absolute":1359695700000, "end_absolute":1422853200000, "metrics":[{"tags":{"Building_id":["100"]},"name":"meterreadings","group_by":[{"name":"time","group_count":"12","range_size":{"value":"1","unit":"MONTHS"}}],"aggregators":[{"name":"sum","align_sampling":true,"sampling":{"value":"1","unit":"Months"}}]}]}

I am getting the following response:

500 {"errors":["Too many open files"]}

In this link it is written that one should increase the size of file-max.

My file-max output is:

cat /proc/sys/fs/file-max
382994

It is already very large, so do I need to increase this limit further?
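
As far as I understand, file-max is only the system-wide cap; each process has its own open-files limit (often 1024 by default), and that is usually what triggers this error. It can be checked for the running KairosDB process like this (<PID> is a placeholder for the process ID):

# Soft and hard open-file limits of the KairosDB process
grep 'Max open files' /proc/<PID>/limits

# Soft limit inherited from the current shell
ulimit -n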

There is 1 answer below.

What version are you using? Are you using a lot of group-bys in your queries? You may need to restart KairosDB as a workaround.
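
For the restart workaround, with a stock install this would be something like the following (the script location may differ on your setup):

# Stop and start KairosDB using the bundled script
bin/kairosdb.sh stop
bin/kairosdb.sh start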

Can you check whether you have deleted (ghost) file handles? Replace <PID> with the KairosDB process ID in the command below:

ls -l /proc/<PID>/fd | grep kairos_cache | grep '(deleted)' | wc -l
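
To see how close the process is to its limit, you can also compare the total descriptor count against it (a rough check; <PID> is again the KairosDB process ID):

# Total open file descriptors held by the process
ls /proc/<PID>/fd | wc -l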

There was a fix in 0.9.5 for unclosed file handles. There's a fix pending for the next release (1.0.1).

cf. https://github.com/kairosdb/kairosdb/pull/180, https://github.com/kairosdb/kairosdb/issues/132, and https://github.com/kairosdb/kairosdb/issues/175.