I am trying to deploy my small project with Docker Compose on AWS EC2. The main containers are Nginx, Flask, MySQL, and TensorFlow Serving.
I have successfully connected the Flask container to the database in the MySQL container: a SELECT query is sent to the DB and the correct result comes back to Flask.
However, I am failing to connect Flask to TensorFlow Serving: I want to send the input as an HTTP POST request to a saved Keras model (a folder containing a .pb file) and get the prediction back in Flask.
Here is my docker-compose.yml:
version: "3"
services:
nginxproxy:
depends_on:
- nginx
- db
image: nginx:alpine
container_name: proxyserver
restart: always
ports:
- "80:80"
- "443:443"
volumes:
- ./nginx/nginx.conf:/etc/nginx/nginx.conf
- ./certbot-etc:/etc/letsencrypt
- ./myweb:/usr/share/nginx/html
nginx:
image: nginx:latest
container_name: mywebserver
restart: always
volumes:
- ./myweb:/usr/share/nginx/html
flask:
build: ./flask_docker
restart: always
container_name: myflask
command: gunicorn -w 1 -b 0.0.0.0:80 wsgi:server
links:
- tf
tf:
image: tensorflow/serving
restart: always
container_name: mytf
volumes:
- ./exportedmodel:/models/test_model
environment:
- MODEL_NAME=test_model
expose:
- "8501"
db:
image: mysql:5.7
container_name: mysqldb
volumes:
- mydb:/var/lib/mysql
restart: always
environment:
MYSQL_ROOT_PASSWORD: rootpw
MYSQL_DATABASE: db
MYSQL_USER: user
MYSQL_PASSWORD: pw
volumes:
mydb:
For the connection between Flask and the DB in the MySQL container, the Python code below in Flask has worked well; I got the correct result for my SELECT query.
import pymysql

db = pymysql.connect(
    host='mysqldb',  # container name
    port=3306,
    user='root',
    passwd='rootpw',
    db='mydatabase',
    charset='utf8')
cursor = db.cursor()
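(For completeness, the cursor is then used in the usual pymysql way; the query below is only a placeholder, my real query reads from my own tables:)

cursor.execute("SELECT VERSION()")   # placeholder query instead of my real SELECT
print(cursor.fetchall())
db.close()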
However, for the connection between Flask and TensorFlow Serving, the code below in Flask raises an error.
import json
import requests

data = json.dumps({"signature_name": "serving_default", "instances": [input_id]})  # [input_id] is the input for the DL model
headers = {"content-type": "application/json"}
json_response = requests.post('http://mytf:8501/v1/models/test_model:predict', data=data, headers=headers)  # /models/test_model contains the exported Keras model (.pb file)
result = json_response.json()
The error message is: HTTPConnectionPool(host='mytf', port=8501): Max retries exceeded with url: /v1/models/test_model:predict (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f4aecd595e0>: Failed to establish a new connection: [Errno 111] Connection refused'))
Why does the POST request fail, even though I have set expose for the port and links in docker-compose.yml?
(P.S. I did not install the full TensorFlow package in the Flask container, because when I tried, the Flask container was killed, probably because it ran out of memory. So installing the full TensorFlow is not an option.)
I have tried changing 'mytf' in requests.post to localhost, 127.0.0.1, 0.0.0.0, and the container ID, but all of these attempts failed; the HTTPConnectionPool error remains.
I found the answer myself.
Previously, the files including saved_model.pb were placed directly in the folder 'exportedmodel'.
I created a folder named '0001' under 'exportedmodel' and moved the files into 'exportedmodel/0001'.
(Note: I did not change anything in docker-compose.yml.)
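So the mounted folder now looks roughly like this (the files inside '0001' are whatever the Keras export produced; '0001' is just the version number I chose):

exportedmodel/
└── 0001/
    ├── saved_model.pb
    ├── variables/
    └── assets/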
Then, surprisingly, it works!
As far as I can tell, the reason is that TensorFlow Serving expects every model under its base path (/models/test_model here) to sit in a numbered version subdirectory, and it serves the highest version it finds; with the files lying directly in the base path, no version could be loaded, which would also explain why nothing was answering on port 8501. (The 'v1' in the request URL is the version of the REST API, not of the model, so it was not the cause.)
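To confirm that the model is actually loaded, TensorFlow Serving's model status endpoint can be queried on the same REST port (a quick check run from inside the Flask container, using the same requests library as above):

import requests

# GET the model status; once the version folder exists, the reported state should be AVAILABLE
status = requests.get('http://mytf:8501/v1/models/test_model')
print(status.status_code, status.text)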
I referred to chapter 19 of "Hands-On Machine Learning" (2nd Ed.) written by Aurélien Géron.