Archiving Logs to Amazon S3

Once logs age past your log retention period, they’re no longer accessible. What if you still need them? Loggly can facilitate log archiving by sending logs to an Amazon Web Services (AWS) S3 bucket. Logs in an S3 bucket are kept until you remove them, so you’ll always have a copy handy if you need them for historical trend analysis, auditing, or other purposes. The S3 bucket is a separate product maintained by Amazon; we cannot help create or maintain accounts within AWS. However, we’ll give you an overview of how to set up archiving on S3 and point you to Amazon’s extensive documentation on all things AWS where necessary. We assume that you already have an Amazon account; if you don’t, you’ll need to create one first.

Note: Loggly’s Amazon S3 archiving feature is available to the Enterprise tier only. It offers the extensive capabilities described below, including filtering, compression, splitting logs into folders, and more. Only account owners can set up archiving within Loggly; if that’s not you, contact your account owner.

Overview

The configuration page is located at “Account” -> “Archiving”. Here you can create new S3 Archiving jobs and manage existing ones.


The Archiving page shows the details of each configured S3 Archiving job. The following information is available on this page:

Status:

A check mark indicates that the configuration is working successfully.

A warning icon indicates that there is a problem that needs urgent attention.

Enabled:

Toggle the check box to enable or disable S3 Archive job(s).

Destination: S3 Archive destination name.

Filter: Specifying a tag ensures that we only archive events that match that tag.

Split into folders by: Logs may be split into folders within your bucket based on the values of a field.

Compression: Shows the chosen compression format.

Output: Shows the chosen output format.

Last Written: The date when data was last written to the destination.

Actions:

The edit button lets you edit the corresponding S3 Archive configuration.

The cross button permanently deletes the corresponding configuration. This cannot be undone.

 

Adding a new Archiving destination

To set up S3 Archiving, proceed to the “Account” -> “Archiving” tab and click the “Add New” button.


Step 1.

Head over to Amazon and grant permissions to aws@loggly.com so that we can write to your bucket.

A. Create an S3 bucket

After you’ve set up an account, you’ll need to create a bucket that we can send logs to. Check out Amazon’s documentation on setting up a new bucket. Ignore the “Set Up Logging” button; you’ll come back into our product to do that.
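
If you prefer to script bucket creation, here’s a minimal sketch using Python and boto3 (our own tooling choice, not something Loggly requires; the bucket name my-loggly-archive is a placeholder):

```python
import boto3

# Assumes AWS credentials are already configured
# (e.g., via ~/.aws/credentials or environment variables).
s3 = boto3.client("s3", region_name="us-east-1")

# Bucket names are globally unique across AWS, so pick your own.
s3.create_bucket(Bucket="my-loggly-archive")

# Outside us-east-1, S3 requires a location constraint instead:
# s3.create_bucket(
#     Bucket="my-loggly-archive",
#     CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
# )
```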

B. Give Loggly permission within AWS to write to the bucket

Once you have the bucket created:

  • Select the bucket in the buckets panel and click the “Properties” tab
  • Click the “Permissions” tab
  • Click the “Add More Permissions” button
  • In the “Grantee” field, enter aws@loggly.com
  • Check the boxes for “List”, “Upload/Delete”, and “View Permissions”
  • Click “Save” in the lower-right corner

Should you need further help with this, AWS has documentation on editing bucket permissions.
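
These console steps can also be scripted. Below is a sketch, assuming Python with boto3 and a placeholder bucket name; the three console checkboxes correspond to the S3 ACL permissions READ (“List”), WRITE (“Upload/Delete”), and READ_ACP (“View Permissions”):

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-loggly-archive"  # placeholder; use your bucket name

# Fetch the current ACL so existing grants are preserved.
acl = s3.get_bucket_acl(Bucket=bucket)
grants = acl["Grants"]

# READ = "List", WRITE = "Upload/Delete", READ_ACP = "View Permissions"
for permission in ("READ", "WRITE", "READ_ACP"):
    grants.append({
        "Grantee": {
            "Type": "AmazonCustomerByEmail",
            "EmailAddress": "aws@loggly.com",
        },
        "Permission": permission,
    })

s3.put_bucket_acl(
    Bucket=bucket,
    AccessControlPolicy={"Grants": grants, "Owner": acl["Owner"]},
)
```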

Step 2.

Set up how you want your archive to be stored in your Amazon bucket.

Once you’ve set up an account and an S3 bucket, you’ll need to give Loggly some information so we can write to the bucket.

A. Enter the name of the Destination:

This is the S3 bucket that you created in Step 1. The destination name must match your Amazon S3 bucket name exactly.
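
If you want to double-check the name before entering it, a quick sketch with boto3 (again a tooling assumption on our part; the bucket name is a placeholder) confirms the bucket exists and is reachable:

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
try:
    # Succeeds only if the bucket exists and you have access to it.
    s3.head_bucket(Bucket="my-loggly-archive")  # placeholder name
    print("Bucket found; use this exact name as the Destination.")
except ClientError as err:
    print("Problem with bucket name:", err.response["Error"]["Code"])
```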


Once you have entered the name of the S3 bucket, you can test the permissions using the Test Permissions button.


If the test result shows Success, Loggly has verified access to your new S3 bucket and can archive your logs automatically. Continue to the next step to configure optional settings, or click Save if you just want to use the basic configuration to archive all Loggly data to your S3 bucket.
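
If the test fails, it’s worth re-checking the ACL grant from Step 1. Here is a small sketch, again assuming boto3 and a placeholder bucket name. Note that S3 resolves email grantees to canonical user IDs on read-back, so the Loggly grant appears under a display name rather than the email address:

```python
import boto3

s3 = boto3.client("s3")
acl = s3.get_bucket_acl(Bucket="my-loggly-archive")  # placeholder name

# Look for a grantee holding READ, WRITE, and READ_ACP permissions.
for grant in acl["Grants"]:
    grantee = grant["Grantee"]
    who = grantee.get("DisplayName") or grantee.get("URI") or grantee.get("ID")
    print(who, "->", grant["Permission"])
```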

B. Apply Filters (Optional):

Filtering allows you to send only a subset of your logs to the destination bucket. You may specify a field value, with an optional wildcard, as the filter. For example, filtering on a tag value of apache* would archive only events whose tags begin with “apache”.


C. Split Into Folders By (Optional):

Logs may also be split into folders within your bucket based on the values of a field. Select one of the options under “Split Into Folders By”. For example, splitting by syslog.host creates a unique folder for each syslog host.

Below this field, you will see advanced options to select the Date Format.

Important: Enter a default folder name into which to store events that do not contain the field by which we are splitting. If no folder name is specified, we will store these events in a folder named “no_match.”

You can also supply a prefix or suffix for the split-by folder name. This will help you better manage your log archives.

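
Once archiving has run, you can confirm the resulting folder layout by listing the top-level prefixes in the bucket; a sketch assuming boto3 and a placeholder bucket name:

```python
import boto3

s3 = boto3.client("s3")

# Delimiter="/" groups keys by their first path segment, i.e. the
# top-level "folders" that the split-by setting produced.
resp = s3.list_objects_v2(Bucket="my-loggly-archive", Delimiter="/")
for prefix in resp.get("CommonPrefixes", []):
    print(prefix["Prefix"])
```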

Click Save when you have selected the desired options. An entry will be created on the Archiving page. Refer to the Status field if you encounter any issues.

Step 3: We send logs to your S3 bucket

That’s all you need to do. Once we verify access to your S3 bucket, we’ll write logs in batches every half hour. After you first set up an S3 bucket, it may take up to 8 hours before you start seeing logs in your bucket.

Step 4: Find an S3 Client

When you’re ready, find an S3 client so that you can read the logs written to your S3 bucket. Remember, once logs are deleted from Loggly’s search index, they are no longer accessible from our site. There are various clients available for Mac OS, Windows, and *nix systems. At Loggly, we use S3cmd, an open-source command-line tool for managing data stored in S3.
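
If you’d rather script it than use a GUI client, the sketch below lists and reads archived objects with boto3 (our own tooling choice; the bucket name is a placeholder, and the gzip step assumes you chose gzip as the compression format):

```python
import gzip

import boto3

s3 = boto3.client("s3")
bucket = "my-loggly-archive"  # placeholder; use your bucket name

# List a few archived objects.
resp = s3.list_objects_v2(Bucket=bucket, MaxKeys=10)
objects = resp.get("Contents", [])
for obj in objects:
    print(obj["Key"], obj["Size"], obj["LastModified"])

# Download the first object and read it line by line,
# assuming gzip was chosen as the compression format.
if objects:
    s3.download_file(bucket, objects[0]["Key"], "archive.log.gz")
    with gzip.open("archive.log.gz", "rt", errors="replace") as fh:
        for line in fh:
            print(line.rstrip())
```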

 

Troubleshooting Amazon S3 Archiving

If you don’t see any data show up in your S3 bucket, check for these common problems; a quick scripted check follows the list.

  • Wait a while; archives are written in batches every half hour, and a new destination can take up to 8 hours to receive its first logs.
  • Check if the Archive job is enabled under the “Archiving” tab.
  • Check if the Archive job’s status is healthy.
  • Check if you have log data to archive in Loggly.
  • Check the Account overview page to see if you are exceeding the data volume limit of your plan.
  • Check for errors on the Archiving page and correct them.
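
As a quick programmatic check, you can look at the timestamp of the most recent object in the bucket. This sketch assumes boto3 and a placeholder bucket name, and uses the 8-hour initial window mentioned in Step 3:

```python
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Find the most recently written object in the bucket.
latest = None
for page in paginator.paginate(Bucket="my-loggly-archive"):  # placeholder
    for obj in page.get("Contents", []):
        if latest is None or obj["LastModified"] > latest:
            latest = obj["LastModified"]

if latest is None:
    print("No archives found yet.")
elif datetime.now(timezone.utc) - latest > timedelta(hours=8):
    print("Last write was", latest, "- check the job status in Loggly.")
else:
    print("Archiving looks healthy; last write at", latest)
```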
