I'm trying to deploy my Laravel application on AWS's serverless platform. I'm using dynamic imports and code splitting in laravel-mix to compile the assets. Following the steps in the Bref package documentation, I installed the required library as instructed and synced my public directory with my S3 bucket:
npm run prod
aws s3 sync public/ s3://<bucket-name>/ --delete --exclude index.php
and configured my .env file to:
MIX_ASSET_URL=https://<bucket-name>.s3.amazonaws.com
ASSET_URL=https://<bucket-name>.s3.amazonaws.com
Next, I configured my Blade file to:
<script src="{{ asset('nits-assets/js/app.js') }}"></script>
And my webpack.mix.js file has:
const mix = require('laravel-mix');
const webpack = require('webpack');
const path = require('path'); // required for path.resolve below

const ASSET_URL = process.env.ASSET_URL + "/";

mix.js('resources/js/app.js', 'public/nits-assets/js')
    .postCss('resources/sass/app.css', 'public/nits-assets/css', [
        require("tailwindcss"),
    ])
    .webpackConfig({
        output: {
            chunkFilename: 'nits-assets/chunks/[name].[contenthash].js',
            publicPath: ASSET_URL,
        },
        resolve: {
            symlinks: false,
            alias: {
                NitsModels: path.resolve(__dirname, 'Models'),
            },
        },
        plugins: [
            new webpack.DefinePlugin({
                "process.env.ASSET_PATH": JSON.stringify(ASSET_URL),
            }),
        ],
    })
    .sourceMaps()
    .version();
Since I'm splitting the code into chunks, I'm having trouble fetching them: the initial app.js file loads from the S3 bucket, but the chunks are requested from the public directory instead. How can I configure laravel-mix to load chunks from the S3 bucket synced with my public directory?
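One technique worth trying (my suggestion, not something from the Bref docs) is to set webpack's runtime public path explicitly at the very top of the entry file. Webpack reads the free variable `__webpack_public_path__` at runtime when it builds chunk URLs, so this forces lazy chunks to resolve against the S3 URL regardless of where the page itself is served from:

```javascript
// resources/js/app.js — first lines of the entry file (a sketch).
// Assumes process.env.ASSET_PATH is injected by the DefinePlugin in
// webpack.mix.js; webpack prepends this value to every chunk request.
__webpack_public_path__ = process.env.ASSET_PATH;
```

With this in place, a lazy chunk should be fetched from `https://<bucket-name>.s3.amazonaws.com/nits-assets/chunks/...` rather than the relative public path.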
Edit:
As suggested in the answer, I changed my serverless.yml, and it now looks like this:
service: laravel

provider:
    name: aws
    # The AWS region in which to deploy (us-east-1 is the default)
    region: ap-south-1
    # The stage of the application, e.g. dev, production, staging… ('dev' is the default)
    stage: dev
    runtime: provided.al2
    environment:
        AWS_BUCKET: # environment variable for Laravel
            Ref: Storage
    iamRoleStatements:
        # Allow Lambda to read and write files in the S3 buckets
        - Effect: Allow
          Action:
              - s3:PutObject
              - s3:ListBucket
              - s3:GetObject
          Resource:
              - Fn::GetAtt: Storage.Arn # the storage bucket
              - Fn::Join: [ '', [ Fn::GetAtt: Storage.Arn, '/*' ] ] # everything in the storage bucket

resources:
    Resources:
        Storage:
            Type: AWS::S3::Bucket

package:
    # Directories to exclude from deployment
    exclude:
        - node_modules/**
        - public/storage
        - resources/assets/**
        - storage/**
        - tests/**

functions:
    # This function runs the Laravel website/API
    web:
        handler: public/index.php
        timeout: 28 # in seconds (API Gateway has a timeout of 29 seconds)
        layers:
            - ${bref:layer.php-74-fpm}
        events:
            - httpApi: '*'
    # This function lets us run artisan commands in Lambda
    artisan:
        handler: artisan
        timeout: 120 # in seconds
        layers:
            - ${bref:layer.php-74} # PHP
            - ${bref:layer.console} # The "console" layer

plugins:
    # We need to include the Bref plugin
    - ./vendor/bref/bref
Now I'm getting:
An error occurred: ArtisanLambdaFunction - Value of property Variables must be an object with String (or simple type) properties.
Edit 2:
The problem was the indentation (a tab) in serverless.yml at:

environment:
    AWS_BUCKET: # environment variable for Laravel
        Ref: Storage
iamRoleStatements:
    # Allow Lambda to read and write files in the S3 buckets
    - Effect: Allow
      Action: s3:*
But now I'm getting a separate issue:
Error: The CloudFormation template is invalid: Template error: every Fn::GetAtt object requires two non-empty parameters, the resource name and the resource attribute
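One likely cause of this error (my reading, not confirmed by the original poster) is the inline shorthand `Fn::GetAtt: Storage.Arn`, which some serverless-framework/CloudFormation versions only accept in the two-parameter list form. A sketch of the same IAM statement using the list form:

```yaml
iamRoleStatements:
    - Effect: Allow
      Action:
          - s3:PutObject
          - s3:ListBucket
          - s3:GetObject
      Resource:
          # Two-parameter form: [LogicalResourceName, Attribute]
          - Fn::GetAtt: [Storage, Arn]
          - Fn::Join: ['', [{ Fn::GetAtt: [Storage, Arn] }, '/*']]
```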
A 403 error suggests that you are unauthorized or there is some kind of permissions error.
Based on this page:
https://bref.sh/docs/frameworks/laravel.html
Can you confirm that you have set appropriate permissions for lambda to read and write from s3 buckets that you use ?
The documentation goes on to say that you will have to add the token line from the following snippet in config/filesystems.php. Have you done so?
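For context, a sketch of what that disk entry typically looks like — the surrounding keys are the standard Laravel s3 disk defaults, and the `token` line is the addition the Bref docs describe (Lambda provides temporary credentials via AWS_SESSION_TOKEN):

```php
// config/filesystems.php — 's3' disk (sketch; only the 'token' line is new)
's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'token' => env('AWS_SESSION_TOKEN'), // <- the line to add for Lambda
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
],
```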
I don't use Laravel, so the above suggestions are just based on looking at your error and reading the documentation. But if you provide more information, I'm happy to take a look