CMAKE in requirements.txt file: Install llama-cpp-python for Mac


I have put my application into a Docker container and therefore created a requirements.txt file. Now I need to install llama-cpp-python for Mac, since I load my LLM with from langchain.llms import LlamaCpp.

My installation command specifically for Mac is:

"CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install llama-cpp-python"

But this does not work if I put the line in my requirements.txt file.
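
CMAKE_ARGS and FORCE_CMAKE are shell environment variables rather than pip options, so an equivalent way to run the install in a terminal (just a shell sketch of the same command) is:

export CMAKE_ARGS="-DLLAMA_METAL=on"
export FORCE_CMAKE=1
pip install llama-cpp-python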

My requirements.txt file looks as follows:

chromadb==0.4.14
langchain==0.0.354
pandas==2.0.3
python-dotenv==1.0.0
python_box==7.1.1
PyYAML==6.0.1
streamlit==1.29.0
torch==2.1.0
sentence-transformers==2.2.2
faiss-cpu==1.7.4
CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install llama-cpp-python # Does not work like this

My Dockerfile looks like this after @ivvija's comment:

FROM python:3.11.5
WORKDIR /app
COPY . .
RUN pip3 install -r requirements.txt
RUN CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install llama-cpp-python
EXPOSE 8501
HEALTHCHECK CMD curl --fail http://localhost:8501/_stcore/health
ENTRYPOINT ["streamlit", "run", "streamlit_app.py", "--server.port=8501", "--server.address=0.0.0.0"]

This results in the following error: ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects.
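
To see the actual CMake error hidden behind that message, one thing that can help (just a debugging sketch, not part of my setup) is to run the failing step with pip's verbose output:

RUN CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1 pip install --verbose llama-cpp-python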

How can I get this installation working?

There is 1 answer below.


I finally found a solution: in your Dockerfile, add the llama-cpp-python installation for Mac like this:

FROM python:3.11.5
WORKDIR /app
COPY . .
ENV CMAKE_ARGS="-DLLAMA_METAL=on" FORCE_CMAKE=1
RUN pip install llama-cpp-python
RUN pip3 install -r requirements.txt
EXPOSE 8501
HEALTHCHECK CMD curl --fail http://localhost:8501/_stcore/health
ENTRYPOINT ["streamlit", "run", "streamlit_app.py", "--server.port=8501", "--server.address=0.0.0.0"]
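
To verify the result, the image can then be built and run like this (the image name myapp is just an example):

docker build -t myapp .
docker run -p 8501:8501 myapp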