Use of Docker to do Parallel Processing

  • Ansari Aadil
  • Topic Author
01 Sep 2021 04:10 #10025 by Ansari Aadil
Hi,

I am using Python to run a global parametric study for various PV technologies and orientations. The drawback is that the parametric analysis takes a very long time (weeks), so I thought about using Docker to run multiple simulations in parallel and reduce the total simulation time. Here is what I am doing:

1) First I define the Docker image using a Dockerfile:
FROM python:3
WORKDIR /src
COPY . .
RUN pip install --no-cache-dir -r requirements.txt
ENTRYPOINT ["python", "mycode.py"]

mycode.py is the same as my original Python script; the only change is the directory paths, which I updated to match the file layout inside the Docker image.
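For completeness, the image defined above would be built and smoke-tested locally with something like the following (the image tag `mypycode` comes from the service command in step 2; the bind-mount path is assumed to match it as well):

```shell
# Build the image from the Dockerfile in the current directory
docker build -t mypycode .

# Run one container locally to verify mycode.py starts correctly,
# mounting the host data directory the same way the service will
docker run --rm --mount type=bind,source=/opt/data/,target=/opt mypycode
```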

2) Then I run the image as a service using Docker Swarm (Swarm is Docker's built-in orchestration tool for deploying containers across multiple nodes):
docker service create --replicas 10 --mount type=bind,source=/opt/data/,target=/opt --name pyapp mypycode

The problem is that the 10 replicas all run the same code with the same inputs, so instead of each replica working on a different part of the parameter space, they all repeat the same simulations. Is there a way to make them run independently?
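For what it's worth, one common pattern (an assumption on my part, not something I have working yet) is to give each replica a distinct index using Swarm's template placeholders, e.g. adding `--env TASK_SLOT={{.Task.Slot}}` to the `docker service create` command, and then have mycode.py use that index to select its own slice of the parameter grid. The parameter names below are purely illustrative placeholders for my actual PV study:

```python
import itertools
import os

# Hypothetical parameter grid for the PV study (illustrative values only)
TECHNOLOGIES = ["mono-Si", "poly-Si", "CdTe", "CIGS"]
TILTS = [0, 15, 30, 45]
AZIMUTHS = [90, 180, 270]


def cases_for_slot(slot, n_replicas):
    """Return the subset of parameter combinations assigned to one replica.

    Swarm numbers task slots starting at 1, so convert to a 0-based
    offset and stripe the full grid across the replicas: replica k
    takes every n_replicas-th combination starting at index k - 1.
    """
    all_cases = list(itertools.product(TECHNOLOGIES, TILTS, AZIMUTHS))
    return all_cases[slot - 1::n_replicas]


if __name__ == "__main__":
    # Each replica reads its own slot number from the environment
    slot = int(os.environ.get("TASK_SLOT", "1"))
    for tech, tilt, azimuth in cases_for_slot(slot, n_replicas=10):
        print(f"slot {slot}: simulating {tech} tilt={tilt} azimuth={azimuth}")
        # run_simulation(tech, tilt, azimuth)  # placeholder for the real model call
```

With this striping, every combination is simulated exactly once across the 10 replicas, and no two replicas duplicate each other's work.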

P.S. I have attached the python file below for your reference. 
