Uploading to Your S3 Bucket
PDFBolt offers the option to upload generated PDFs directly to your own S3-compatible storage, providing enhanced security and compliance, especially with GDPR requirements. With this feature, your documents are stored in your own environment and you retain full control over them.
Advantages of Uploading to Your S3 Bucket
- Enhanced Security: By storing PDFs in your own S3 bucket, you reduce the risk associated with transferring files over networks and relying on third-party storage.
- GDPR Compliance: Direct uploads to your storage mean that PDFBolt doesn't store your documents longer than necessary, aligning with GDPR's data minimization principles.
- Control Over Data: You have full control over access permissions, storage duration, and data handling policies for your documents.
How It Works
1. Generate a Pre-Signed URL: Create a pre-signed URL that allows PDFBolt to upload the PDF directly to your S3 bucket without requiring access to your storage credentials.
2. Include customS3PresignedUrl in Your Request: Add the generated pre-signed URL to the customS3PresignedUrl parameter in your API request to PDFBolt.
3. PDFBolt Uploads to Your Bucket: Once the PDF is generated, PDFBolt uses the pre-signed URL to securely upload the document to your S3 bucket.
4. Access Your PDF: Retrieve and manage the PDF from your S3 bucket using your usual workflows.
Supported S3-Compatible Storage
PDFBolt supports any S3-compatible storage, not just AWS S3. This means you can use services like:
- Amazon S3
- DigitalOcean Spaces
- MinIO
- Wasabi
- Backblaze B2
This list is not exhaustive. Other S3-compatible storage providers are also supported.
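With the AWS SDK, switching between providers usually comes down to changing the client's endpoint (and, for some providers, enabling path-style addressing). The sketch below shows two illustrative configurations; the endpoints, region values, and forcePathStyle setting are assumptions for the example, so check your provider's documentation for the exact values.

const { S3Client } = require('@aws-sdk/client-s3');

// Example: DigitalOcean Spaces (endpoint is region-specific; NYC3 is assumed here)
const spaces = new S3Client({
  endpoint: 'https://nyc3.digitaloceanspaces.com',
  region: 'us-east-1', // Spaces doesn't use AWS regions, but the SDK requires a value
  credentials: { accessKeyId: 'your-key', secretAccessKey: 'your-secret' }
});

// Example: self-hosted MinIO (path-style addressing is typically required)
const minio = new S3Client({
  endpoint: 'http://localhost:9000', // assumed local MinIO address
  region: 'us-east-1',
  forcePathStyle: true,
  credentials: { accessKeyId: 'your-key', secretAccessKey: 'your-secret' }
});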
Example: Generating a Pre-Signed URL in Node.js
Below is a step-by-step guide on how to generate a Pre-Signed URL for uploading files to any S3-compatible storage using Node.js and the AWS SDK for JavaScript v3.
Step 1️⃣ Install Required Packages
To start, ensure that you have the latest versions of the necessary packages for AWS SDK v3:
npm install @aws-sdk/client-s3
npm install @aws-sdk/s3-request-presigner
Step 2️⃣ Configure the S3 Client
Set up the S3 client with the details of your storage provider. Replace the placeholder values with your actual storage credentials and endpoint:
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const s3 = new S3Client({
  endpoint: 'https://your-s3-compatible-endpoint.com', // Replace with your storage endpoint
  region: 'your-region', // Replace with your region
  credentials: {
    accessKeyId: 'your-access-key-id', // Your Access Key ID
    secretAccessKey: 'your-secret-access-key' // Your Secret Access Key
  }
});
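Hard-coding credentials like this is fine for a quick local test, but in anything shared or deployed you would typically load them from the environment instead. A minimal sketch, assuming environment variable names of your own choosing (they are not required by the SDK or PDFBolt):

const s3 = new S3Client({
  endpoint: process.env.S3_ENDPOINT,
  region: process.env.S3_REGION,
  credentials: {
    accessKeyId: process.env.S3_ACCESS_KEY_ID,
    secretAccessKey: process.env.S3_SECRET_ACCESS_KEY
  }
});

On AWS itself you can also omit the credentials block entirely and let the SDK's default credential provider chain resolve them (for example, from an attached IAM role).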
Step 3️⃣ Create a Function to Generate the Pre-Signed URL
Create a function that generates a Pre-Signed URL. This function specifies the bucket name, file name, and additional parameters like content type and expiration time.
async function generatePresignedUrl(bucketName, fileName) {
  const command = new PutObjectCommand({
    Bucket: bucketName, // Name of your bucket
    Key: fileName, // Name of the file to upload
    ContentType: 'application/pdf' // Specify the file type
  });

  try {
    const url = await getSignedUrl(s3, command, { expiresIn: 3600 }); // URL valid for 1 hour. Adjust expiresIn as needed.
    console.log('Pre-signed URL:', url);
    return url;
  } catch (error) {
    console.error('Error generating Pre-Signed URL:', error);
  }
}
Step 4️⃣ Call the Function to Generate a URL
- Invoke the generatePresignedUrl function by passing the name of your bucket and a unique file name. To ensure the file name is unique, you can append a timestamp to the name.
generatePresignedUrl('your-bucket-name', `document-${Date.now()}.pdf`);
- To run the script, save it in a file (e.g., generatePresignedUrl.js) and execute it in your terminal:
node generatePresignedUrl.js
- If everything is configured correctly, this will generate a Pre-Signed URL similar to the following:
https://your-bucket.s3.amazonaws.com/path/to/your-document.pdf?AWSAccessKeyId=YOUR_ACCESS_KEY&Expires=1609459200&Signature=YOUR_SIGNATURE
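If you want to sanity-check the URL before handing it to PDFBolt, you can upload a file through it yourself with a plain HTTP PUT. This step is optional and not something PDFBolt requires; note that the Content-Type of the PUT generally has to match the ContentType the URL was signed with. A minimal sketch, assuming Node.js 18+ (for the global fetch) and a local sample.pdf:

const fs = require('fs/promises');

async function testPresignedUrl(presignedUrl) {
  const body = await fs.readFile('./sample.pdf'); // any local PDF will do
  const response = await fetch(presignedUrl, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/pdf' }, // must match the signed ContentType
    body
  });
  console.log('Upload status:', response.status); // 200 means the object was stored
}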
Complete Code Example
const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

// Configure the S3 client for your storage provider
const s3 = new S3Client({
  endpoint: 'https://your-s3-compatible-endpoint.com', // Replace with your storage endpoint
  region: 'your-region', // Replace with your region
  credentials: {
    accessKeyId: 'your-access-key-id', // Your Access Key ID
    secretAccessKey: 'your-secret-access-key' // Your Secret Access Key
  }
});

async function generatePresignedUrl(bucketName, fileName) {
  const command = new PutObjectCommand({
    Bucket: bucketName, // Name of your bucket
    Key: fileName, // Name of the file to upload
    ContentType: 'application/pdf' // Specify the file type
  });

  try {
    const url = await getSignedUrl(s3, command, { expiresIn: 3600 }); // URL valid for 1 hour
    console.log('Pre-signed URL:', url);
    return url;
  } catch (error) {
    console.error('Error generating Pre-Signed URL:', error);
  }
}

// Example usage
generatePresignedUrl('your-bucket-name', `document-${Date.now()}.pdf`);
Using the Pre-Signed URL with PDFBolt
Use the generated URL as the customS3PresignedUrl parameter in your API request to PDFBolt:
{
  "url": "https://example.com",
  "customS3PresignedUrl": "https://your-bucket.s3.amazonaws.com/path/to/your-document.pdf?AWSAccessKeyId=YOUR_ACCESS_KEY&Expires=1609459200&Signature=YOUR_SIGNATURE"
}
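As a rough sketch of how that request could be sent from the same Node.js script, the example below uses the global fetch available in Node.js 18+. The endpoint path and the API-KEY header name are assumptions made for illustration; use the exact URL and authentication scheme from the PDFBolt API documentation.

const PDFBOLT_ENDPOINT = 'https://api.pdfbolt.com/v1/direct'; // assumed endpoint -- verify in the API docs
const PDFBOLT_API_KEY = 'your-pdfbolt-api-key';

async function generatePdfToS3(presignedUrl) {
  const response = await fetch(PDFBOLT_ENDPOINT, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'API-KEY': PDFBOLT_API_KEY // assumed header name -- verify in the API docs
    },
    body: JSON.stringify({
      url: 'https://example.com',
      customS3PresignedUrl: presignedUrl
    })
  });

  if (!response.ok) {
    throw new Error(`PDFBolt request failed with status ${response.status}`);
  }
  return response.json(); // the response shape depends on the PDFBolt API
}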
- Permissions: Ensure that the IAM user or role you use to generate the pre-signed URL has permission to perform the s3:PutObject operation on the specified bucket and key (see the example policy sketched below).
- Expiration: Adjust the expiration time (expiresIn) for the pre-signed URL based on your specific use case.
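For AWS S3 specifically, a minimal IAM policy granting only that permission might look like the following; the bucket name is a placeholder, and other providers have their own access-control mechanisms:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}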
For more details on how to generate pre-signed URLs in other languages, refer to the AWS SDK documentation or your storage provider's SDK.