Multi-machine Vagrant project not provisioning as per docs


I’ve been trying to set up a multi-machine Vagrant project. According to the docs (https://www.vagrantup.com/docs/multi-machine/), provisioning is “outside in”, meaning any top-level provisioning scripts are executed before the provisioning scripts in the individual machine blocks.

The project contains a Laravel project, and a Symfony project. My Vagrantfile looks like this:

require "json"
require "yaml"

confDir = $confDir ||= File.expand_path("vendor/laravel/homestead", File.dirname(__FILE__))

homesteadYamlPath = "web/Homestead.yaml"
homesteadJsonPath = "web/Homestead.json"
afterScriptPath = "web/after.sh"
aliasesPath = "web/aliases"

require File.expand_path(confDir + "/scripts/homestead.rb")

Vagrant.configure(2) do |config|
  config.vm.provision "shell", path: "init.sh"

  config.vm.define "web" do |web|
    web.ssh.forward_x11 = true

    if File.exists? aliasesPath then
      web.vm.provision "file", source: aliasesPath, destination: "~/.bash_aliases"
    end

    if File.exists? homesteadYamlPath then
      Homestead.configure(web, YAML::load(File.read(homesteadYamlPath)))
    elsif File.exists? homesteadJsonPath then
      Homestead.configure(web, JSON.parse(File.read(homesteadJsonPath)))
    end

    if File.exists? afterScriptPath then
      web.vm.provision "shell", path: afterScriptPath
    end
  end

  config.vm.define "api" do |api|
    api.vm.box = "ubuntu/trusty64"

    api.vm.provider :virtualbox do |vb|
      vb.customize ["modifyvm", :id, "--memory", "2048"]
    end

    api.vm.network "private_network", ip: "10.1.1.34"
    api.vm.network "forwarded_port", guest: 80, host: 8001
    api.vm.network "forwarded_port", guest: 3306, host: 33061
    api.vm.network "forwarded_port", guest: 9200, host: 9201

    api.vm.synced_folder "api", "/var/www/api"

    api.vm.provision "shell", path: "api/provision.sh"
  end
end

I have a block (web) for the Laravel project, where I’ve copied the contents of the Homestead-based Vagrantfile, and an api block that uses the “standard” Vagrant configuration.

To bootstrap the projects, I created a simple shell script (init.sh) that just clones the Git repositories into git-ignored directories. Given the documentation says provisioning works outside-in, I’d expect that script to run first, followed by the machine-specific blocks, but this doesn’t seem to be happening. Instead, on vagrant up, I receive the following error:

There are errors in the configuration of this machine. Please fix the following errors and try again:

vm:
* A box must be specified.

It seems it’s still trying to configure the individual machines before running the shell script. I know the shell script isn’t being called, because I added an echo statement to it and its output never appears. Instead, the terminal just outputs the following:

Bringing machine 'web' up with 'virtualbox' provider...
Bringing machine 'api' up with 'virtualbox' provider...

So how can I get Vagrant to run my shell script first? I think it’s failing because the web block checks whether my web/Homestead.yaml file exists and, if so, uses the values in it for configuration (including the box name). Because my shell script hasn’t been run yet, the repository hasn’t been cloned and that file doesn’t exist, so no box is specified, which is what Vagrant complains about.
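For reference, init.sh itself is trivial. A minimal sketch of what it does (the repository URLs here are placeholders, not the real ones):

```shell
#!/bin/sh
# Bootstrap: clone the two application repositories into the git-ignored
# directories the Vagrantfile expects. URLs below are placeholders.
set -e

[ -d web ] || git clone https://example.com/acme/web.git web
[ -d api ] || git clone https://example.com/acme/api.git api
```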

1 answer below:

The issue is that you do not define a box for the web machine. You need either to define the box in the outer (top-level) scope, like

config.vm.box = "ubuntu/trusty64"

if you plan to use the same box/OS for both machines, or to define it within the web scope:

web.vm.box = "another box"

EDIT

Using the provision property runs the script inside the VM, which is not what you want here: you want the script to run on your host. (And because it runs in the VM, the VM needs to be booted first.)

The Vagrantfile is just a Ruby script, so you could call your init script directly from Ruby. A potential issue with that is you cannot guarantee when it executes, and in particular that it will complete before Vagrant does its work on the VMs.

A possibility is to use the vagrant-triggers plugin and execute your shell script before the up event:

  config.trigger.before :up do
    info "Running init.sh before bringing the machines up..."
    run  "init.sh"
  end

Run this way, Vagrant will wait for the script to finish before it runs its own part of the up command.

You would need to add a check to your script so it only does work when needed; otherwise it will run every time you start the machine (i.e. on every vagrant up). For example, you could check for the presence of the YAML file.
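That check could be sketched as a small guard at the top of init.sh (should_bootstrap is a hypothetical helper name, not something from the question):

```shell
# Idempotency guard for init.sh: only bootstrap when the web repository
# has not been cloned yet, i.e. web/Homestead.yaml is missing.
# should_bootstrap is a hypothetical helper name.
should_bootstrap() {
  [ ! -f "$1/Homestead.yaml" ]
}

# Demonstration against a scratch directory:
scratch=$(mktemp -d)
should_bootstrap "$scratch" && echo "needs bootstrap"
touch "$scratch/Homestead.yaml"
should_bootstrap "$scratch" || echo "already bootstrapped"
```

With a guard like this, running the script on every vagrant up is harmless: it exits early once the repositories are in place.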