File Upload In GraphQL With Apollo Server Using S3 Bucket & Node.js

GraphQL has become increasingly common thanks to features that solve under- and over-fetching issues. It also enables simple caching, federation, versionless APIs, subscriptions, and more.

I studied numerous articles and guides written by GraphQL community members on how to create GraphQL services. However, none of these resources mentioned that uploading a file was possible with GraphQL.

When I was assigned to develop a new feature that involved GraphQL file uploading, I spent nearly a week reading tutorials, including the Apollo docs, the graphql-upload repository on GitHub, Google, and Stack Overflow. Although this feature was introduced in Apollo Server 2.0, most of the resources covering it were scattered across several repositories and tutorials, often leaving out critical steps.

This article will walk you through the steps required to set up a GraphQL server that can handle file uploads and stream the data into an S3 bucket.

What is an S3 Bucket?

Amazon S3 (Simple Storage Service) is a cloud storage service offered by Amazon Web Services (AWS). It gives developers and businesses access to data, images, videos, and other files from anywhere at any time.

Many applications need to be able to upload files to an S3 bucket, and with the AWS SDK and the graphql-upload middleware this can be accomplished in an Apollo GraphQL server with ease.

Installing the dependencies

Let’s install the following dependencies. I’ll use npm for this, but you can do the same with yarn as well:

npm install apollo-server apollo-server-express graphql-upload aws-sdk uuid

Amazon S3 configuration

The next step is to create an S3 bucket and set up the required IAM permissions so that your server can access the bucket. The AWS documentation contains detailed instructions on how to do this.
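The exact permissions depend on your account setup, but a minimal policy sketch that lets the server write publicly readable objects into a single bucket might look like this (YOUR_BUCKET_NAME is a placeholder for your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
```

The s3:PutObjectAcl action is included because the client configuration below sets an ACL of public-read on uploaded objects.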

Now let’s begin with the setup of the S3 client:

const AWS = require("aws-sdk");

module.exports = new AWS.S3({
  credentials: {
    // All secret keys should be kept in the .env file.
    accessKeyId: process.env.AWS_ACCESS_KEY,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  },
  region: process.env.AWS_BUCKET_REGION,
  params: {
    ACL: "public-read",
    Bucket: process.env.AWS_BUCKET_NAME,
  },
});
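The configuration above reads its values from environment variables. A sample .env file might look like this (all values are placeholders for your own credentials):

```
AWS_ACCESS_KEY=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_BUCKET_REGION=us-east-1
AWS_BUCKET_NAME=your-bucket-name
```

Never commit this file to version control; keep it in your .gitignore.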

Configure the Apollo server

This index.js file listens on port 4000 and contains an instance of ApolloServer with type definitions and resolvers.

const express = require("express");
const { ApolloServer } = require("apollo-server-express");
const { graphqlUploadExpress } = require("graphql-upload");
const typeDefs = require("./graphql/typeDefs/schema");
const resolvers = require("./graphql/resolvers/Query");

const PORT = process.env.PORT || 4000;

async function startServer() {
  // Error handling: hide internal database errors from clients.
  const overrideError = (err) => {
    if (err.message.startsWith("Database Error: ")) {
      return new Error("Internal server error");
    }
    return err;
  };

  const server = new ApolloServer({
    typeDefs,
    resolvers,
    formatError: overrideError,
  });
  await server.start();

  const app = express();
  app.use(graphqlUploadExpress());
  server.applyMiddleware({ app });

  await new Promise((r) => app.listen({ port: PORT }, r));
  console.log(`Server ready at http://localhost:${PORT}${server.graphqlPath}`);
}

startServer(); // http://localhost:4000/graphql

Creating the GraphQL schema

const { gql } = require("apollo-server");

const typeDefs = gql`
  scalar Upload

  type File {
    status: Int!
    url: String!
  }

  type Mutation {
    uploadFile(files: Upload!): File
  }
`;

module.exports = typeDefs;

You might have noticed that we have defined a scalar named Upload in our schema. This scalar will be “mapped” to the implementation provided by graphql-upload.

Now that our schema has been created, we can start creating our resolvers.

Upload file resolver

Let’s import the necessary modules and dependencies first.

const { GraphQLUpload } = require("graphql-upload");
const { v4: uuid } = require("uuid");
const s3 = require("./config/awsS3/s3");

module.exports = {};

Next, we’ll map our scalar Upload to the implementation from graphql-upload.

const { GraphQLUpload } = require("graphql-upload");
const { v4: uuid } = require("uuid");
const s3 = require("./config/awsS3/s3");

module.exports = {
  Upload: GraphQLUpload,
};

We can now begin working on our mutation, reading the file from the resolver’s arguments.

const { GraphQLUpload } = require("graphql-upload");
const { v4: uuid } = require("uuid");
const s3 = require("./config/awsS3/s3");

module.exports = {
  Upload: GraphQLUpload,
  Mutation: {
    uploadFile: async (_, { files }) => {
      const unique = uuid(); // Generate a random UUID.
      // graphql-upload resolves the argument to a promise of the file details.
      const { createReadStream, filename, mimetype } = await files;
      const { Location } = await s3
        .upload({
          Body: createReadStream(),
          Key: `${unique}/${filename}`,
          ContentType: mimetype,
        })
        .promise();
      return {
        status: 200,
        url: Location,
      };
    },
  },
};

Testing your File/Image Upload

Alright, now it’s time to have a taste of it. But wait: file uploads are not supported by Apollo Playground’s interface. So you must test your file upload using Postman, or you can do the same thing with a curl command:

curl --location --request POST 'http://localhost:4000/graphql' \
--form 'operations={ "query": "mutation ($files: Upload!) { uploadFile(files: $files) { status url } }", "variables": { "files": null } }' \
--form 'map={ "0": ["variables.files"] }' \
--form '0=@./images/mohd-nadeem.jpg'
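If the upload succeeds, the server responds with a payload shaped like the File type we defined. The URL shown here is only illustrative; the real one depends on your bucket name, region, and the generated UUID key:

```json
{
  "data": {
    "uploadFile": {
      "status": 200,
      "url": "https://your-bucket-name.s3.us-east-1.amazonaws.com/7d44.../mohd-nadeem.jpg"
    }
  }
}
```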

Conclusion

With this approach, you can now upload files in GraphQL and access them through S3 URLs. I hope you found it interesting. If you noticed any errors in this article, please mention them in the comments section.
