Setup Server with same instance and different address


I want to develop my application separately (API, JOBS, WEB), so that it stays in this manner:

  • API: api.myaddress.com
  • JOBS: jobs.myaddress.com
  • WEB: myaddress.com

I know how to do that with separate instances on Amazon and Google Compute Engine. However, I was wondering if I could set up a single instance to do all of that, with each DNS name pointing to a different port on that machine, like:

  • api.myaddress.com resides in xxx.xxx.xxx.xxx:8090
  • jobs.myaddress.com resides in xxx.xxx.xxx.xxx:8080
  • myaddress.com resides in xxx.xxx.xxx.xxx:80

Also, if that is possible, I don't know where I should configure it. Is that done in DNS, or through a specific setup on my instance in Amazon/Google?
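For reference, the DNS side of this is just pointing all three names at the same machine; a hypothetical zone fragment (with a documentation IP standing in for the real one) would look like:

```
; All three names resolve to the same instance IP.
; DNS carries only the address -- the port split must happen on the server.
api.myaddress.com.   300  IN  A  203.0.113.10
jobs.myaddress.com.  300  IN  A  203.0.113.10
myaddress.com.       300  IN  A  203.0.113.10
```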


There are 3 answers below.

Answer 1:

What you are looking for is a load balancer (ELB, in Amazon's case).

Set up the load balancer to forward traffic to the different ports, and at the DNS level create CNAME records for your services that point to the load balancers you have.
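The CNAME side of that setup might look like the following (the ELB hostnames are hypothetical; AWS generates the real ones when the load balancers are created):

```
api.myaddress.com.   300  IN  CNAME  my-api-elb-1234.us-east-1.elb.amazonaws.com.
jobs.myaddress.com.  300  IN  CNAME  my-jobs-elb-5678.us-east-1.elb.amazonaws.com.
; Note: the zone apex (myaddress.com) cannot be a CNAME; Route 53 offers
; ALIAS records for that case.
```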

Answer 2:

Why do you want them to go to different ports? It's certainly not necessary. You can use DNS to point all of those domains/subdomains at a single server/IP address, and then, through your web server configuration, bind each subdomain name to its particular website on that server.

In IIS you create the bindings in the IIS Manager tool, and Apache has a similar ability:

http://httpd.apache.org/docs/2.2/vhosts/examples.html
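A sketch of what that looks like in Apache 2.2 (the `DocumentRoot` paths are hypothetical; this is the name-based virtual host pattern from the linked documentation):

```apache
# One IP, three sites, selected by the incoming Host: header rather than
# by port. NameVirtualHost is required in Apache 2.2 (dropped in 2.4).
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName myaddress.com
    DocumentRoot /var/www/web
</VirtualHost>

<VirtualHost *:80>
    ServerName api.myaddress.com
    DocumentRoot /var/www/api
</VirtualHost>

<VirtualHost *:80>
    ServerName jobs.myaddress.com
    DocumentRoot /var/www/jobs
</VirtualHost>
```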

Answer 3:

It sounds like what you are looking for is an HTTP reverse proxy. This would be a web server on your machine that binds to port 80 and, based on the incoming Host: header (or other criteria), forwards the request to the appropriate Node.js instance, each of which is bound to a different port of its own.

There are several alternatives. A couple that immediately come to mind are HAProxy and Nginx.
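A hypothetical Nginx version of this, assuming the ports from the question (note that once Nginx owns port 80, the WEB app itself has to move to a private port, 3000 here):

```nginx
# Route by Host header to the Node.js instance on each service's port.
server {
    listen 80;
    server_name api.myaddress.com;
    location / { proxy_pass http://127.0.0.1:8090; }
}
server {
    listen 80;
    server_name jobs.myaddress.com;
    location / { proxy_pass http://127.0.0.1:8080; }
}
server {
    listen 80;
    server_name myaddress.com;
    location / { proxy_pass http://127.0.0.1:3000; }
}
```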

DNS cannot be used to control which port a request arrives at: A and CNAME records map names to IP addresses only, and HTTP clients default to port 80 (or 443 for HTTPS) unless a port appears explicitly in the URL.

Another approach, which is (arguably) unconventional but would nonetheless work, is to set up 3 CloudFront distributions, one for each hostname. Each distribution forwards requests to an "origin server" (your server), and the destination port can be specified for each one. Since CloudFront is primarily intended as a caching service, you would need to return Cache-Control: headers from Node to disable that caching where appropriate... but you could also see performance improvements on responses that CloudFront is allowed to cache for you.