Deploying a Recommendation System as a Microservice Using Docker


A REST-Based Architecture for Serving ML Model Outputs at Scale

*Note: This is a working draft article.

Docker makes it easy for the components of an application to communicate with one another and to be composed and scaled independently. Modern cloud applications often rely on REST-based microservice architectures built from Docker containers, and data scientists can use the same techniques to efficiently bring their machine learning models into production applications.

Architecture

So let’s take a look at the building blocks of the microservice:

  • An nginx proxy in front of the API (nginx)
  • A Python Flask API (from here on called api)
  • A Redis database
  • A Python machine learning model (from here on called model)

When a client makes a request, it first connects to the nginx proxy component. Below is a rough sketch of how the components work together.
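client --> nginx proxy --> Flask api --> Redis (in-memory store)
                                           ^
                                           | writes recommendations
                                     model container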

The first container to run is "redis:alpine" from the official Docker repository. Redis serves as an in-memory database that holds the item-to-item recommendations generated by the model container.
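As a purely illustrative example of how those recommendations could be laid out in Redis, the sketch below stores one Redis list of recommended item ids per source item. The key pattern rec:<item_id> and the item ids are assumptions made for this draft, not something dictated by Redis or by the model.

import redis

# Hostname and port match the redis service defined in the docker-compose file below.
r = redis.StrictRedis(host="redisDB", port=6379, decode_responses=True)

# Hypothetical key layout: one list of recommended item ids per source item.
r.delete("rec:item42")
r.rpush("rec:item42", "item7", "item13", "item99")

print(r.lrange("rec:item42", 0, -1))  # ['item7', 'item13', 'item99']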

Model Container

Recommendations are produced by the model container and saved in the Redis container. When new items arrive, the model needs to be retrained. There is no downtime during training because the training process happens in a separate container/host. A sketch of what such a training script could look like is shown after the Dockerfile below.

# Base image with the scientific Python stack preinstalled
FROM frolvlad/alpine-python-machinelearning

RUN mkdir -p /usr/src/app
WORKDIR /usr/src/app

# Install the remaining Python dependencies
COPY ./requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Print logs immediately instead of buffering them
ENV PYTHONUNBUFFERED 1
COPY *.py .

CMD ["python3", "tfidf1.py"]

# Standalone run (e.g. for ad-hoc retraining), mounting ./input into the container:
#docker run --rm -v "$PWD"/input/:/usr/src/app/input -w /usr/src/app tfidf1 python3 tfidf1.py
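The training script tfidf1.py itself is not included in this draft. Below is a minimal sketch of what an item-to-item TF-IDF recommender could look like; the input file input/items.csv, its item_id and description columns, the top-5 cut-off, and the rec:<item_id> key layout are all assumptions made for illustration.

import pandas as pd
import redis
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import linear_kernel

# Hypothetical input: one row per item with a free-text description.
items = pd.read_csv("input/items.csv")  # columns: item_id, description

# Build TF-IDF vectors and compute item-to-item cosine similarities.
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(items["description"].fillna(""))
similarities = linear_kernel(tfidf, tfidf)

# Write the top-5 most similar items for every item into Redis,
# reusing the hypothetical rec:<item_id> key layout sketched earlier.
r = redis.StrictRedis(host="redisDB", port=6379, decode_responses=True)
for idx, item_id in enumerate(items["item_id"]):
    ranked = similarities[idx].argsort()[::-1]
    top = [str(items["item_id"].iloc[i]) for i in ranked if i != idx][:5]
    key = "rec:{}".format(item_id)
    r.delete(key)
    if top:
        r.rpush(key, *top)

Retraining then amounts to re-running this container against a fresh input directory, while the api keeps serving whatever keys are currently in Redis.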

Flask API Container

Every request is handled by the Flask API. This is a really simple RESTful server that reads recommendations from the Redis server. The architecture is easily scalable: if another API server is needed, we simply spin up another Docker container. A minimal sketch of app.py is shown after the Dockerfile below.

FROM frolvlad/alpine-python3

# Build-time defaults for the Redis connection; substituted into config.py below
ENV REDIS_HOST redisDB
ENV REDIS_PORT 6379
ENV REDIS_PWD XXX

COPY . /usr/src/api
WORKDIR /usr/src/api
RUN pip install -r requirements.txt

# Bake the Redis connection settings into config.py at build time
RUN sed -i 's/REDIS_HOST/'"${REDIS_HOST}"'/' /usr/src/api/config.py
RUN sed -i 's/REDIS_PORT/'"${REDIS_PORT}"'/' /usr/src/api/config.py
RUN sed -i 's/REDIS_PWD/'"${REDIS_PWD}"'/' /usr/src/api/config.py

EXPOSE 5000

CMD ["python3", "app.py"]

Docker Compose

version: '2'
services:
  redis:
    image: "redis:alpine"
    networks:
      - jktnotebook-net
    container_name: redisDB


  model:
    build: model
    networks:
      - jktnotebook-net
    container_name: tfidf1
    volumes:
      - ./input:/usr/src/app/input
    links:
      - redis


  flask-api:
    build: api
    networks:
      - jktnotebook-net
    container_name: flask-api
    ports:
      - 80:5000
    links:
      - redis

networks:
  jktnotebook-net:

Deployment
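With the compose file above in place, the whole stack can be built and started from the project root with docker-compose up -d --build. Compose pulls redis:alpine, builds the model and api images from the model and api directories, and attaches all three containers to the jktnotebook-net network; the Flask API is then reachable on port 80 of the host (for example http://localhost/recommend/<item_id>, using the illustrative route sketched above). Running docker-compose down stops and removes the stack.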