Getting Started Spring Boot Application on AWS — S3 (Part 1)

VAIBHAV KURKUTE
7 min read · Nov 15, 2020

In this series, we will develop a Web Application with the Java-based framework Spring Boot, which will interact with various AWS services as the project requires.

For Java development, AWS provides an SDK for connecting to its cloud services: https://aws.amazon.com/sdk-for-java/

Spring Boot, with its cloud support, helps auto-configure and bootstrap the clients required to use AWS cloud services, abstracting away the low-level API provided by the AWS Java SDK.

Pre-Requisites:

  1. An AWS Free-Tier/Standard account
  2. Basic knowledge of Java/Spring (MVC, data access)
  3. Familiarity with AWS services such as S3, SQS, and DynamoDB

Let’s get started!

What we are going to build: a Web Application that uploads files to S3.

Step 1: Create a Spring Boot App

Visit https://start.spring.io/ for a quick project template, and select either build tool (Gradle/Maven) as per your experience.

From the Dependencies section, select the following:

  1. Spring Web

Then “Generate” the project and import it into your Eclipse/STS IDE.

AWS Steps to Create a New Bucket to Store our Objects/Files

  1. Login to AWS Console & Search S3
AWS Console — S3 Service

2. S3 Service Dashboard & Create Bucket

Here you will see the buckets created in your account and the region each one is in; if you are new, you may see nothing listed. (Also note that a bucket name must be unique across the entire globe.)

S3 Dashboard — Bucket / Region / Access

Click on “CREATE BUCKET”

3. Create New Bucket (the name must be globally unique)

Type a bucket name and select a region, preferably one near you.

  • (Make a note of the bucket name and the region in which it is created.)
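
Bucket names also have to follow S3’s naming rules: 3–63 characters; lowercase letters, digits, hyphens, and dots; beginning and ending with a letter or digit. A rough sketch of a validator for these core rules (this helper is illustrative, not part of the project):

```java
public class BucketNameCheck {
    // Core S3 naming rules: 3-63 chars, lowercase letters/digits/dots/hyphens,
    // must start and end with a letter or digit.
    public static boolean isValidBucketName(String name) {
        if (name == null || name.length() < 3 || name.length() > 63) {
            return false;
        }
        return name.matches("[a-z0-9][a-z0-9.-]*[a-z0-9]");
    }

    public static void main(String[] args) {
        System.out.println(isValidBucketName("my-demo-bucket-2020")); // true
        System.out.println(isValidBucketName("My_Bucket"));           // false
    }
}
```

(The full rules have a few more cases, e.g. no IP-address-like names, but the sketch covers the ones that trip people up most often.)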

4. Block Public Access to Bucket

Make sure the tick is there by default.

  • With this, no file/object inside the bucket is publicly accessible, and no one can download it unless and until you share a URL or grant access.
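
When you do share access, every object is addressable by a virtual-hosted-style URL built from the bucket name, region, and key. A small illustrative helper showing the format (for private objects you would instead generate a pre-signed URL via the SDK):

```java
public class S3Url {
    // Virtual-hosted-style URL: https://<bucket>.s3.<region>.amazonaws.com/<key>
    public static String objectUrl(String bucket, String region, String key) {
        return "https://" + bucket + ".s3." + region + ".amazonaws.com/" + key;
    }

    public static void main(String[] args) {
        System.out.println(objectUrl("my-demo-bucket", "us-east-1", "docs/report.pdf"));
        // https://my-demo-bucket.s3.us-east-1.amazonaws.com/docs/report.pdf
    }
}
```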

5. Versioning & Tag

By default, Versioning is disabled; leave it that way for now. S3 stores objects in a flat namespace, which means that if you upload a file with the same name to the same bucket, it is simply overwritten; with versioning disabled, the older copy is lost.

But if you wish to preserve older versions (n of them), you need to enable this feature.

Tags are just extra key-value attributes you can assign to resources: what a bucket stores, what it is used for, and so on. They help with identification and, later, with monitoring and cost allocation.

6. Encryption Settings

AWS provides extra safety for your objects/files with server-side encryption; you can enable it if needed. By default it is disabled. (Encryption with S3-managed keys, SSE-S3, is free of charge; KMS-managed keys incur KMS costs.)

7. Done, let's create the Bucket

Wow! You just created a new bucket in your AWS account.

(Instead of the UI, you can also create a bucket with the AWS CLI, e.g. aws s3 mb s3://your-bucket-name, or through code.)

Let's move to the programming part.

Step 2: Add the AWS SDK Dependencies (build.gradle or pom.xml)

Note: I avoid the all-in-one SDK (com.amazonaws:aws-java-sdk) provided by AWS, as it pulls in every service module and increases the size of the JAR build.

  1. S3 SDK
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>${aws.sdk.version}</version>
</dependency>

2. JSON Dependency

In our app we will send responses to the front-end (Angular/React or anything else) as JSON.

<dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20190722</version>
</dependency>
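
If you chose Gradle instead of Maven, the same two dependencies translate to build.gradle as follows (a sketch: awsSdkVersion is a placeholder property, mirroring ${aws.sdk.version} in the pom):

```groovy
dependencies {
    // S3-only module of the AWS SDK, same coordinates as the Maven snippet above
    implementation "com.amazonaws:aws-java-sdk-s3:${awsSdkVersion}"
    // org.json for building JSON responses
    implementation 'org.json:json:20190722'
}
```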

Step 3. AWS Service Clients

Before we can use any AWS service API in Java, we need to set up a client, which implements how to interact with AWS infrastructure for any action, e.g. uploading/downloading a file, creating a bucket, and so on.

In our app we are using S3, so let's create the S3 client.

S3Config.java

Reference: https://gist.github.com/vaibhavdes/702b0a7024719b9670d7bf77057d0426

@Value("${cloud.aws.region.static}")
String region;

@Bean
public AmazonS3 s3client() {
    AmazonS3 s3Client = AmazonS3ClientBuilder
            .standard()
            .withCredentials(new DefaultAWSCredentialsProviderChain())
            .withRegion(Regions.fromName(region))
            .build();
    LOGGER.info("--- S3 Configuration Completed ---");
    return s3Client;
}

@Bean
public AmazonS3Client amazonS3Client() {
    return (AmazonS3Client) AmazonS3ClientBuilder.standard()
            .withCredentials(new DefaultAWSCredentialsProviderChain())
            .withRegion(Regions.fromName(region))
            .build();
}

A. In the code above you will notice .withCredentials(…). Any action on AWS must be authenticated and authorized. In the console we sign in with a username/password, but when interacting with AWS services from code, AWS instead gives us a credential pair:

a. Access Key ID & b. Secret Access Key

There are two ways to authenticate and configure the client:

  1. With static credentials

// Avoid hard-coding real keys in source code; prefer the
// DefaultAWSCredentialsProviderChain shown next.
AWSCredentials credentials = new BasicAWSCredentials(
        "thisIsMyAccessKey",
        "thisIsMySecretKey");

AmazonS3 s3client = AmazonS3ClientBuilder
        .standard()
        .withCredentials(new AWSStaticCredentialsProvider(credentials))
        .withRegion(Regions.US_EAST_2)
        .build();

2. With DefaultAWSCredentialsProviderChain

AmazonS3 s3Client = AmazonS3ClientBuilder
        .standard()
        .withCredentials(new DefaultAWSCredentialsProviderChain())
        .withRegion(Regions.fromName(region))
        .build();

a. Use this method if you have already set up the AWS CLI on the machine: aws configure writes your keys to a credentials file under “C:\Users\UserName\.aws”, and the default chain reads them from there.
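
For reference, the credentials file written by aws configure (C:\Users\UserName\.aws\credentials on Windows, ~/.aws/credentials on Linux/macOS) is a plain INI file; the key values below are placeholders:

```ini
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```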

b. The second advantage: if you run/deploy this application on an EC2 instance, there is no need to pass static credentials; the chain picks them up automatically. Along with that, you need to attach an IAM role to the EC2 instance granting access to the service, in this case S3.

B. You may also have noticed .withRegion() in the code above. An S3 bucket lives in a particular AWS region, which we need to mention/pass in code, e.g. us-east-1.

Step 4. Let's Write a Few Methods That Use the S3 API to Upload/Download Files

Every method takes the name of the bucket on which to perform the storage operation, and we also have to handle any exception that occurs.

S3FileOperation.java

Reference: https://gist.github.com/vaibhavdes/a411669aa24c589c8a841e9e33458380

// AUTO-WIRE S3 Client
@Autowired
AmazonS3 s3client;

// UPLOAD FILE
@Override
public void uploadSingleFile(String bucketName, String fileName, MultipartFile file) {
    LOGGER.info(" --- Uploading a new file to S3 Bucket --- ");
    if (s3client.doesBucketExistV2(bucketName)) {
        try {
            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setContentLength(file.getSize());
            s3client.putObject(new PutObjectRequest(bucketName, fileName, file.getInputStream(), metadata));
        } catch (IOException e) {
            LOGGER.error("IOException: " + e.getMessage());
        } catch (AmazonServiceException a) {
            LOGGER.error("Error Message: " + a.getMessage());
            LOGGER.error("HTTP Status Code: " + a.getStatusCode());
            LOGGER.error("AWS Error Code: " + a.getErrorCode());
            LOGGER.error("Error Type: " + a.getErrorType());
            LOGGER.error("Request ID: " + a.getRequestId());
        }
    }
}

// DOWNLOAD FILE
@Override
public byte[] downloadFile(String key, String bucketName) {
    try {
        if (s3client.doesBucketExistV2(bucketName)) {
            LOGGER.info(" --- Downloading a file from bucket --- ");
            S3Object obj = s3client.getObject(new GetObjectRequest(bucketName, key));
            S3ObjectInputStream stream = obj.getObjectContent();
            LOGGER.info("Content-Type: " + obj.getObjectMetadata().getContentType());
            byte[] content = IOUtils.toByteArray(stream);
            obj.close();
            return content;
        }
    } catch (IOException e) {
        // IOUtils.toByteArray and obj.close() can throw IOException
        LOGGER.error("IOException: " + e.getMessage());
    } catch (AmazonServiceException a) {
        LOGGER.error("Error Message: " + a.getMessage());
        LOGGER.error("HTTP Status Code: " + a.getStatusCode());
        LOGGER.error("AWS Error Code: " + a.getErrorCode());
        LOGGER.error("Error Type: " + a.getErrorType());
        LOGGER.error("Request ID: " + a.getRequestId());
    }
    return null;
}
// DELETE FILE
@Override
public void deleteFile(String bucketName, String key) {
    LOGGER.info(" --- Deleting file from bucket --- ");
    if (s3client.doesBucketExistV2(bucketName)) {
        s3client.deleteObject(bucketName, key);
    }
}
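
The download method leans on IOUtils.toByteArray; in plain Java that utility amounts to a buffered copy into a ByteArrayOutputStream, roughly:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamToBytes {
    // Read an InputStream fully into a byte array, 4 KB at a time.
    public static byte[] toByteArray(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] bytes = toByteArray(new ByteArrayInputStream("hello s3".getBytes()));
        System.out.println(bytes.length); // 8
    }
}
```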

Step 5. Finishing Touches: a RestController to Integrate with a UI or External API

S3Controller.java

@RestController
@CrossOrigin(maxAge = 3600)
public class S3Controller {

    @Value("${bucket.name}")
    String bucketName;

    @Autowired
    S3FileOperation fileOperation;

    @PostMapping(path = "/upload", consumes = { MediaType.MULTIPART_FORM_DATA_VALUE })
    public Map<String, String> uploadFile(
            @RequestPart(value = "file", required = false) MultipartFile[] files) throws IOException {
        Map<String, String> result = new HashMap<>();
        if (files == null || files.length == 0) {
            result.put("error", "Invalid Data");
            return result;
        }
        for (MultipartFile file : files) {
            fileOperation.uploadSingleFile(bucketName, file.getOriginalFilename(), file);
            result.put("key", file.getOriginalFilename());
        }
        return result;
    }

    @GetMapping(path = "/download")
    public ResponseEntity<byte[]> downloadFile(@RequestParam(value = "file") String file,
            @RequestParam(value = "bucket") String bName) {
        byte[] data = fileOperation.downloadFile(file, bName);
        if (data == null) {
            return ResponseEntity.noContent().build();
        }
        return ResponseEntity.ok()
                .header("Content-disposition", "attachment; filename=\"" + file + "\"")
                .body(data);
    }
}
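
One caveat in the download endpoint: the filename is interpolated into the Content-disposition header as-is, so a name containing a double quote would produce a malformed header. A small hypothetical helper (not part of the gist) to build the header safely:

```java
public class HeaderUtil {
    // Escape backslashes and double quotes so the quoted-string stays valid.
    public static String contentDisposition(String filename) {
        String safe = filename.replace("\\", "\\\\").replace("\"", "\\\"");
        return "attachment; filename=\"" + safe + "\"";
    }

    public static void main(String[] args) {
        System.out.println(contentDisposition("report.pdf"));
        // attachment; filename="report.pdf"
    }
}
```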

You can write additional APIs for deleting a file, creating/listing buckets, etc. Don't forget the region you chose (you can always check it via the AWS Console).

We have provided reference code in GitHub Gists.

Step 6. Config Related to Spring Boot App and AWS Config

resources/application.properties

spring.application.name=awsdemo
server.port=8080
bucket.name=YourBucketNameHere
cloud.aws.region.static=us-east-1
cloud.aws.region.auto=true
cloud.aws.credentials.useDefaultAwsCredentialsChain=true

In the properties above, you can change the bucket name, service region, and credentials-chain config as per your needs.

Step 7. Run your application and test it via Postman, or the UI if you have created one.

We'll look at deployment on an EC2 instance in the next part of this blog.

For the complete project, check our GitHub repo.

Cleanup:

  1. Remove the S3 bucket
