Amazon S3 Ingestion
Loggly can automatically retrieve new log files or objects added to your S3 bucket(s). Amazon S3 can be configured to send an event to Loggly whenever a new object is added to a bucket. When we receive the notification, we download the log file and ingest it into Loggly. To make event delivery reliable, events are sent through Amazon’s Simple Queue Service (SQS), which holds each event until we can retrieve it. Our script will set up all the permissions and the queue automatically.
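For illustration, the pieces the script wires together can be sketched with the AWS CLI. The bucket name, queue name, region, and account ID below are placeholders, and the actual script also grants S3 permission to write to the queue and Loggly permission to read from the queue and bucket:

```shell
# Hedged sketch of the setup the script automates (names are placeholders).
# 1. Create the SQS queue that will buffer S3 event notifications.
aws sqs create-queue --queue-name loggly-s3-queue

# 2. Tell S3 to send an event to the queue whenever a new object is created.
aws s3api put-bucket-notification-configuration \
  --bucket my-log-bucket \
  --notification-configuration '{
    "QueueConfigurations": [{
      "QueueArn": "arn:aws:sqs:us-east-1:123456789012:loggly-s3-queue",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
```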
At this time our ingestion service only supports Elastic Load Balancer (Classic) and CloudFront logs in the following regions: us-east-1, us-west-2, us-west-1, eu-west-1, ap-southeast-1, ap-northeast-1, ap-southeast-2, sa-east-1. Please note that ELB (Application) logs are not supported. Alternatively, you may use our S3 Lambda script in regions where AWS Lambda is supported.
You can also configure your AWS account yourself by following the manual setup instructions.
The configuration page is located at “Source Setup” -> “S3 Sources”. Here you will be able to create new AWS S3 sources and manage existing ones:
The image above shows the details of the configured AWS sources. The following information is available on this page:
Indicates that the configuration is successful.
Indicates that there is a problem and urgent attention is needed.
Toggle the checkbox to enable or disable AWS source(s).
Path: Location of the S3 logs.
Tags: Tag(s) related to the S3 bucket.
Actions: Edit or Delete the corresponding source entry.
Adding a new AWS source
This can be accomplished using the script option described below. If you prefer doing it manually, please click the “Manual” tab and follow instructions for the manual setup.
To set up S3 Ingestion, proceed to the “Source Setup” -> “S3 Sources” tab and click on the “Add New” button.
Enter the name of the S3 bucket from which you would like to send logs. Optionally, you can also provide a prefix. A prefix operates much like a folder: if you add one here, only keys (files) inside that folder will be ingested by Loggly. The prefix can also contain multiple folders separated by slashes, for example “loggly/2017/01”:
Note: Only one prefix per bucket is allowed; if you change the prefix, only keys with the new prefix will be ingested.
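To see which keys a prefix would match, you can list the bucket under that prefix with the AWS CLI (the bucket name here is a placeholder):

```shell
# Only keys under the given prefix are considered; this lists what would
# fall under the example prefix "loggly/2017/01".
aws s3 ls s3://my-log-bucket/loggly/2017/01/ --recursive
```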
Enter the AWS account ID for the account containing the bucket. Your account ID will be used to set up the queue to send notifications about the new objects in your bucket. You can find your account ID on the Security Credentials page:
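If you have the AWS CLI configured, you can also print the account ID directly instead of looking it up in the console:

```shell
# Print the 12-digit account ID for the currently configured credentials.
aws sts get-caller-identity --query Account --output text
```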
If you have multiple active tokens, please choose the customer token you would like to use to send logs to Loggly. For example, select the appropriate token from the drop-down field:
If you have only one active token, it will be used by default and this step will not appear on the page.
The script requires administrator access to configure your account. If you haven’t already, install the AWS CLI tool and run the aws configure command using administrator credentials:
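For example (pip is one installation option; see the AWS documentation for platform-specific installers):

```shell
# Install the AWS CLI.
pip install awscli

# Configure it with administrator credentials; you will be prompted for an
# access key ID, secret access key, default region, and output format.
aws configure
```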
Run the script below to give Loggly access to read from your chosen S3 bucket and to set up an SQS queue so that we receive a notification whenever new objects are added to the bucket:
s3bucket: The name of the bucket from which you would like to send logs.
acnumber: Your AWS account number, which you can get from your AWS account page in the console.
user (optional): The IAM username that Loggly should use when accessing the queue and bucket. Please use a dedicated user for Loggly. The default is loggly-s3-user.
sqsname (optional): The name of the SQS queue that Loggly will receive notifications from when objects are added to the bucket. Please use a dedicated queue for Loggly. The default is loggly-s3-queue.
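A hypothetical invocation using the parameters above might look like the following. The script name and flag syntax here are illustrative assumptions, not the actual interface; match them to the script you downloaded:

```shell
# Illustrative only: script name and flags are placeholders.
./loggly-s3-setup.sh \
  --s3bucket my-log-bucket \
  --acnumber 123456789012 \
  --user loggly-s3-user \
  --sqsname loggly-s3-queue
```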
Enter SQS Queue name:
You may optionally provide one or more comma-separated tags that describe your data to make it easier to search:
Click Save after you have entered the information. You will be returned to the AWS Sources page, and if the configuration was successful, you will see a green checkmark for this source.
Troubleshooting Amazon S3 Ingestion
If you don’t see any data in the Search tab, check for these common problems.
- Wait a few minutes in case indexing needs to catch up.
- Try the manual method if the script method doesn’t help.
- Check if the AWS source is enabled under the AWS Sources tab.
- Check the log files to make sure they exist and you have the right path.
- Objects added before the source was configured will not be sent to Loggly, so test only by adding new logs.
- Check the Account overview page to see if you are exceeding the data volume limit per your subscription.
- Check for errors on the page and correct them.
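The checks above can be spot-checked from the AWS CLI as well. The bucket name, queue URL, region, and account ID below are placeholders for your own resources:

```shell
# 1. Confirm new objects are actually landing under the configured path.
aws s3 ls s3://my-log-bucket/loggly/ --recursive | tail -5

# 2. Confirm S3 events are reaching the queue. A steadily growing backlog
#    suggests Loggly cannot read from the queue; a count near zero may
#    simply mean ingestion is keeping up.
aws sqs get-queue-attributes \
  --queue-url https://sqs.us-east-1.amazonaws.com/123456789012/loggly-s3-queue \
  --attribute-names ApproximateNumberOfMessages
```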
Still Not Working?
- Search or post your own Amazon S3 Ingestion questions in the community forum.