Import/Export from AWS S3

Modified on Wed, 07 Jun 2023 at 03:52 PM

INTRODUCTION

Gigasheet can import files directly from an AWS S3 bucket and export files back to S3. Both directions use the API: imports go through the /maven/import endpoint and exports through /maven/export, and each requires granting specific permissions on your bucket, as described below.

TABLE OF CONTENTS

  • Import from AWS S3

  • Export to AWS S3

Import from AWS S3

API Request

To import from an AWS S3 bucket into Gigasheet, use the /maven/import endpoint; see the API documentation for full endpoint details.

For the S3 connector, the request body will look like:

{
  "connector": "S3",
  "params": {
    "bucketPath": "s3://MYBUCKET/path/"
  }
}

  • The bucketPath must include the S3 bucket and prefix, if any. If the path ends with a “/” or is only a bucket name, it is treated as a prefix and the importer will copy all objects that match it. Otherwise, the importer will look for a single object whose key exactly matches the provided path. A full request sketch follows.
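For example, the full API call might look like the Python sketch below. The base URL and the X-GIGASHEET-TOKEN header name are assumptions here, not taken from this article; check the API documentation for the exact host and authentication scheme.

import requests

API_BASE = "https://api.gigasheet.com"  # assumed base URL
API_KEY = "YOUR_API_KEY"                # your Gigasheet API key

# Trigger an import of everything under s3://MYBUCKET/path/
resp = requests.post(
    f"{API_BASE}/maven/import",
    headers={"X-GIGASHEET-TOKEN": API_KEY},
    json={
        "connector": "S3",
        "params": {"bucketPath": "s3://MYBUCKET/path/"},
    },
)
resp.raise_for_status()
print(resp.json())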

When successful, the import creates a folder in your library called “import-TIMESTAMP”, where TIMESTAMP is the UTC time of the import request written as YYYYMMDDhhmmss (for example, an import requested at 2023-06-07 15:52:00 UTC lands in “import-20230607155200”). Inside that folder there will be a folder named after the bucket, mirroring the folder structure of the full S3 path, for up to 25 objects from your bucket.

From there, you can rearrange the files however you’d like!

S3 Permissions

In order for Gigasheet to read from your bucket, you’ll need to grant it read access to the bucket (or bucket path). There are three steps:

  1. By default, AWS blocks all public access to S3 buckets. You’ll need to allow public access by disabling all of the “block” settings; see the AWS documentation on S3 Block Public Access for details.
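If you’d rather script this step than click through the console, a minimal boto3 sketch (assuming your AWS credentials are already configured) looks like:

import boto3

s3 = boto3.client("s3")

# Disable all four "block public access" settings on the bucket.
s3.put_public_access_block(
    Bucket="MYBUCKET",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": False,
        "IgnorePublicAcls": False,
        "BlockPublicPolicy": False,
        "RestrictPublicBuckets": False,
    },
)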

  2. You need to explicitly tag the bucket with your Gigasheet account email address(es) so we can verify that your account is tied to the bucket. To do this, we use bucket tags: go to your bucket in AWS, open Properties, and add the tags. (When logged into the console, you can get there directly via https://s3.console.aws.amazon.com/s3/bucket/BUCKETNAME/property/tagging/edit?region=REGION, substituting your own BUCKETNAME and REGION.)

You must add a tag called GigasheetUsers. The value will be a space-separated list of all Gigasheet email addresses that should have access to this bucket.

When done, the GigasheetUsers tag will appear in the bucket’s properties.
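The same tag can be set with boto3; note that put_bucket_tagging replaces the bucket’s entire tag set, so include any existing tags you want to keep. The email addresses below are placeholders:

import boto3

s3 = boto3.client("s3")

# Tag the bucket with the Gigasheet accounts that may access it.
s3.put_bucket_tagging(
    Bucket="MYBUCKET",
    Tagging={
        "TagSet": [
            {
                "Key": "GigasheetUsers",
                "Value": "you@example.com teammate@example.com",  # placeholder emails
            }
        ]
    },
)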

  3. You need to allow Gigasheet’s import role (arn:aws:iam::522801257022:role/gigasheet-import) access to the bucket (or bucket path). Here is an example policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCrossAccountRead",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::522801257022:role/gigasheet-import"
            },
            "Action": [
                "s3:GetObject",
                "s3:ListBucket",
                "s3:GetBucketTagging"
            ],
            "Resource": [
                "arn:aws:s3:::MYBUCKET",
                "arn:aws:s3:::MYBUCKET/path/*"
            ]
        }
    ]
}
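To attach this policy programmatically, a boto3 sketch might look like the following. Note that put_bucket_policy overwrites any existing bucket policy, so merge these statements into your current policy if you already have one:

import boto3
import json

# The example read policy from above, with MYBUCKET as a placeholder.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCrossAccountRead",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::522801257022:role/gigasheet-import"},
            "Action": ["s3:GetObject", "s3:ListBucket", "s3:GetBucketTagging"],
            "Resource": [
                "arn:aws:s3:::MYBUCKET",
                "arn:aws:s3:::MYBUCKET/path/*",
            ],
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket="MYBUCKET", Policy=json.dumps(policy))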

Limitations

  • S3 Import is limited to buckets in us-east-1

  • S3 Import is limited to objects < 5 GB

Export to AWS S3

API Request

To export from Gigasheet to an AWS S3 bucket, use the /maven/export endpoint; see the API documentation for full endpoint details.

For the S3 connector, the request body will look like:

{
  "connector": "S3",
  "params": {
    "bucketPath": "s3://MYBUCKET/path/"
  },
  "exportHandle": "GIGASHEET_EXPORT_HANDLE"
}

  • The bucketPath must include the S3 bucket and prefix, if any. Gigasheet will add a new folder based on the UTC timestamp (YYYYMMDDhhmmss) and place the exported zip file in it.

  • The exportHandle is the handle of the export you have already created in Gigasheet. A full request sketch follows.
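For example, the full export call might look like the Python sketch below; as with import, the base URL and the X-GIGASHEET-TOKEN header are assumptions here, so check the API documentation for the exact values.

import requests

API_BASE = "https://api.gigasheet.com"  # assumed base URL
API_KEY = "YOUR_API_KEY"                # your Gigasheet API key

# Push a previously created export to s3://MYBUCKET/path/
resp = requests.post(
    f"{API_BASE}/maven/export",
    headers={"X-GIGASHEET-TOKEN": API_KEY},
    json={
        "connector": "S3",
        "params": {"bucketPath": "s3://MYBUCKET/path/"},
        "exportHandle": "GIGASHEET_EXPORT_HANDLE",
    },
)
resp.raise_for_status()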

S3 Permissions

In order for Gigasheet to write to your bucket, you’ll need to grant it write access to the bucket (or bucket path). The steps mirror the import setup:

  1. By default, AWS blocks all public access to S3 buckets. You’ll need to allow public access by disabling all of the “block” settings, exactly as in the import section above.

  2. You need to explicitly tag the bucket with your Gigasheet account email address(es) so we can verify that your account is tied to the bucket. To do this, we use bucket tags: go to your bucket in AWS, open Properties, and add the tags, exactly as in the import section above. (When logged into the console, you can get there directly via https://s3.console.aws.amazon.com/s3/bucket/BUCKETNAME/property/tagging/edit?region=us-east-1, substituting your own BUCKETNAME.)

You must add a tag called GigasheetUsers. The value will be a space-separated list of all Gigasheet email addresses that should have access to this bucket.

When done, the GigasheetUsers tag will appear in the bucket’s properties.

  3. You need to allow Gigasheet’s export role (arn:aws:iam::522801257022:role/gigasheet-export) access to the bucket (or bucket path). Here is an example policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCrossAccountWrite",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::522801257022:role/gigasheet-export"
            },
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::MYBUCKET/path/*"
        },
        {
            "Sid": "AllowCrossAccountGetBucketTags",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::522801257022:role/gigasheet-export"
            },
            "Action": "s3:GetBucketTagging",
            "Resource": "arn:aws:s3:::MYBUCKET"
        }
    ]
}
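Before triggering the export, you can sanity-check the setup with boto3; this sketch simply confirms that the GigasheetUsers tag and a bucket policy are in place:

import boto3

s3 = boto3.client("s3")

# Verify the GigasheetUsers tag exists on the bucket.
tags = s3.get_bucket_tagging(Bucket="MYBUCKET")["TagSet"]
assert any(t["Key"] == "GigasheetUsers" for t in tags), "GigasheetUsers tag missing"

# Print the current bucket policy for a visual check.
print(s3.get_bucket_policy(Bucket="MYBUCKET")["Policy"])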

Limitations

  • S3 Export is limited to buckets in us-east-1

  • S3 Export is limited to objects < 5 GB
