I am building a web application for a client. I want to host it publicly at a spare domain, so they can access it and review it as it progresses. I want it to be as close as possible to how they will experience it when it goes live on their real domain.
The application is built in Laravel, and most of the pages require a login, so they are automatically protected. Some pages, however, are publicly available. I'll obviously set up a robots.txt to discourage indexing by search engines, but as we know that's not infallible.
What options are there for providing access to my customer in a secure way? Here's my list so far:
- Implement HTTP Basic Auth for routes that are not already protected. But application-level auth is already complex enough (with RBAC, etc.), so I would rather not.
- Modify .htaccess to allow IP addresses only from their work network. They have a mobile workforce though, so this is not a great solution.
- Set up a VPN. This is not my strong suit, and it seems like a lot of work.
- Give them access to start/stop the virtual host, or put the application in development mode. This would mean their team members would have to coordinate their test sessions, which would be too restrictive.
What other options are there?
The options you mentioned seem reasonable to me. I'd just add that if you have control over the web server (e.g. Nginx), you could also restrict access to a specific IP (or range of IPs) with something like the sketch below (the domain and addresses are placeholders):
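```nginx
# Access restriction for the staging vhost; PHP-FPM handling trimmed for brevity.
# staging.example.com and the IP addresses below are placeholders — substitute your own.
server {
    listen 80;
    server_name staging.example.com;
    root /var/www/app/public;

    # Only these addresses may reach the app; everyone else gets a 403.
    allow 203.0.113.0/24;   # client's office range
    allow 198.51.100.7;     # your own address
    deny  all;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }
}
```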
That way you don't need to worry about crawlers reaching your site, setting up VPNs, or checking which routes would need Basic Auth. You just grant a set of IPs access to the full web app without any additional restrictions.
Another thing you could do is set up a firewall rule that denies all traffic except from a specific IP. Or you could serve the app on some obscure port that nobody would think to try. You might want to look at ufw (Uncomplicated Firewall) for both of these options; for example (again, the addresses below are placeholders for the client's range and your own IP):
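```sh
# ufw sketch: default-deny inbound, then open only what's needed.
sudo ufw default deny incoming
sudo ufw allow from 198.51.100.7 to any port 22 proto tcp      # keep SSH open for yourself (placeholder IP)
sudo ufw allow from 203.0.113.0/24 to any port 8325 proto tcp  # client's office range (placeholder)
sudo ufw enable
```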
and then serve the app on port 8325.