This is my first question on Stack Overflow. I have a repo containing JSON spec files for multiple APIs, let's say Offers and Continuity. I created Offers.Dockerfile and Continuity.Dockerfile (the contents of Offers.Dockerfile are posted below) and built the images with `docker build -f Offers.Dockerfile -t offers-apis-sandbox .`

I then installed nginx on my Mac with `brew install nginx`, edited the config with `sudo nano /usr/local/etc/nginx/nginx.conf` (file attached below), saved it, and reloaded nginx with `sudo nginx -s reload`.

Then I ran `docker run -p 8081:8081 continuity-apis-sandbox` and `docker run -p 8080:8081 offers-apis-sandbox`. Both started Prism and are running. For continuity the output is:

[7:20:22 PM] › [CLI] … awaiting Starting Prism…
[7:20:22 PM] › [CLI] ℹ info GET http://127.0.0.1:4010/continuity/programs?a=voluptas&t=corrupti
[7:20:22 PM] › [CLI] ℹ info GET http://127.0.0.1:4010/continuity/program/balance?a=similique&t=laudantium
[7:20:22 PM] › [CLI] ℹ info POST http://127.0.0.1:4010/continuity/program/optin?a=odio&t=est
[7:20:22 PM] › [CLI] ▶ start Prism is listening on http://127.0.0.1:4010
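
For completeness, this is the full sequence of commands I ran (the continuity image was built the same way, just with Continuity.Dockerfile):

docker build -f Offers.Dockerfile -t offers-apis-sandbox .
brew install nginx
sudo nano /usr/local/etc/nginx/nginx.conf
sudo nginx -s reload
docker run -p 8081:8081 continuity-apis-sandbox
docker run -p 8080:8081 offers-apis-sandbox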

I tested `GET http://127.0.0.1:4010/continuity/programs?a=voluptas&t=corrupti` in Postman, but I get this error message:

Could not send request
Error: connect ECONNREFUSED 127.0.0.1:4010

I want to route ports 8080 and 8081 through a single port, 8081 (I've put a rough sketch of what I'm imagining after the nginx.conf below). Please correct me if I'm wrong; this is my first time attempting something like this.

This is Offers.Dockerfile

# Use the official Node.js LTS (Long Term Support) image
FROM node:14-slim

# Set working directory inside the container
WORKDIR /app

# Install Prism globally
RUN npm install -g @stoplight/prism-cli

# Copy your API JSON files to the container
#COPY ./reference/Offers/Offers-APIs.v1.json /app/Offers-APIs.v1.json
COPY . /app
# Expose the port that Prism will listen on
EXPOSE 8081

# Command to run Prism and serve your API JSON file
CMD ["prism", "mock", "/app/reference/Offers/Offers-APIs.v1.json", "-d"]

This is the nginx.conf file

worker_processes 1;

events {
    worker_connections 1024;
}

http {
    include mime.types;
    default_type application/octet-stream;

    sendfile on;
    keepalive_timeout 65;

    server {
        listen 80;
        server_name localhost;

        location / {
            root html;
        }
    }

    # Reverse Proxy Configurations for APIs
    server {
        listen 80;
        server_name api-offers.example.com;

        location / {
            proxy_pass http://localhost:8080;
        }
    }

    server {
        listen 80;
        server_name api-continuity.example.com;

        location / {
            proxy_pass http://localhost:8081;
        }
    }
}
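
Instead of the three server blocks above, what I think I actually want is a single server that listens on one port and routes by path to the two containers. This is only a rough sketch of what I have in mind; the path prefixes and the 8082 port are assumptions on my part (the continuity container would have to be re-published on another host port, e.g. 8082, so that nginx itself can own 8081, and I'm assuming the Offers spec prefixes its paths with /offers/ the way Continuity uses /continuity/):

# inside the existing http { } block
server {
    listen 8081;
    server_name localhost;

    # /offers/... -> offers container published on host port 8080
    location /offers/ {
        proxy_pass http://localhost:8080;
    }

    # /continuity/... -> continuity container, assumed re-published on host port 8082
    location /continuity/ {
        proxy_pass http://localhost:8082;
    }
}

Is that the right way to do this, or am I approaching it wrong?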