I use llama-cpp-python to run LLMs locally on Ubuntu. While generating responses, it prints llama.cpp's logs to the terminal.
How can I stop these logs from being printed?
I found a way to suppress the log output for llama.cpp itself, but not for llama-cpp-python. I just want the generated response to be printed.