I am trying to upload a JAR to S3 via a GitLab CI pipeline. In the `deploy` stage I run a job whose script calls `mvn -s ../aws-settings.xml deploy`.
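For context, the relevant part of my `.gitlab-ci.yml` looks roughly like this (the job name and image are placeholders, not necessarily what I actually use):

```yaml
deploy:
  stage: deploy
  image: maven:3.9-eclipse-temurin-17   # placeholder; any image with Maven and a JDK works
  script:
    - mvn -s ../aws-settings.xml deploy
```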
My `aws-settings.xml` file looks like this (the AWS access keys are set as environment variables in GitLab CI):
```xml
<settings>
  <servers>
    <server>
      <id>artifact-s3-repo</id>
      <username>${env.AWS_ACCESS_KEY_ID}</username>
      <password>${env.AWS_SECRET_ACCESS_KEY}</password>
      <configuration>
        <region>${env.AWS_DEFAULT_REGION}</region>
      </configuration>
    </server>
  </servers>
</settings>
```
In my `pom.xml` I set the distribution management section as:
```xml
<distributionManagement>
  <repository>
    <id>artifact-s3-repo</id>
    <url>s3://<myartifactbucketname>/</url>
  </repository>
</distributionManagement>
```
My pipeline fails at the `deploy` stage with the following output:
```
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.7:deploy (default-deploy) on project <my-project-name>: Failed to deploy artifacts/metadata: Cannot access s3://<myartifactbucketname>/ with type default using the available connector factories: BasicRepositoryConnectorFactory: Cannot access s3://<myartifactbucketname>/ using the registered transporter factories: WagonTransporterFactory: java.util.NoSuchElementException
```
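From what I can tell, the `java.util.NoSuchElementException` thrown by `WagonTransporterFactory` means that stock Maven has no transporter registered for the `s3://` scheme; out of the box it only speaks protocols like `http`, `https`, and `file`. The usual fix appears to be registering an S3 wagon as a build extension in `pom.xml`, along these lines (the coordinates and version below are an assumption I have not verified myself):

```xml
<build>
  <extensions>
    <!-- assumed example: an S3 wagon that teaches Maven the s3:// scheme;
         verify the coordinates and the latest version before relying on this -->
    <extension>
      <groupId>com.github.seahen</groupId>
      <artifactId>maven-s3-wagon</artifactId>
      <version>1.3.3</version>
    </extension>
  </extensions>
</build>
```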
I need some hints on how to configure the connection to AWS properly (ideally without adding extra plugins or extensions). Thank you in advance.
I didn't manage to publish the artifact to S3 via `mvn deploy`, but I implemented the upload with AWS CLI commands in my pipeline script instead.
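For completeness, the workaround looks roughly like this (the image, jar path, and S3 key are placeholders; the bucket name stays as in the question):

```yaml
deploy:
  stage: deploy
  image:
    name: amazon/aws-cli   # placeholder; any image with the AWS CLI installed works
    entrypoint: [""]       # override the image's `aws` entrypoint so the script runs in a shell
  script:
    # the CLI picks up AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and
    # AWS_DEFAULT_REGION from the GitLab CI environment variables
    - aws s3 cp target/my-project-1.0.0.jar s3://<myartifactbucketname>/releases/my-project-1.0.0.jar
```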