Send Amazon S3 Logs with an AWS Lambda Script
You can send your Amazon Simple Storage Service (S3) logs to Loggly using an AWS Lambda script. This requires you to set up and maintain a Lambda script in your AWS account. The script reads the logs from your S3 bucket and sends them to Loggly over our HTTP/S Bulk Endpoint. It sends the data in raw format, and you can use our automated parsing or derived fields to parse it. Lambda scripts currently run only in specific AWS regions, and they can only pull logs from S3 buckets in the same region. If you prefer to run on your own server infrastructure, you can use our bash script instead. We also offer an automatic service for ingesting logs from your bucket.
Option 1: AWS Lambda Script
1. Get the Lambda Code
Clone the git repo
git clone https://github.com/loggly/S3ToLoggly.git
cd S3ToLoggly
Install the required npm packages (npm install in the repo directory).
Zip up your code
zip -r S3ToLoggly.zip S3ToLoggly.js node_modules
The resulting zip (S3ToLoggly.zip) is what you will upload to AWS in step 2 below.
2. Configure the Lambda Function
Go to the AWS Lambda Console and click the “Create a Lambda function” button, then choose “Upload a .ZIP file”. Fill in the following details:
- Name: S3ToLoggly
- Upload lambda function (zip file you made above in Step 1)
- Handler*: S3ToLoggly.handler
- Role*: In the drop-down, choose “S3 execution role”. (This opens a new window to create the role; click Allow.)
- Memory: 128 MB
- Timeout: 10 seconds
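If you prefer the AWS CLI, the same function can be created from the zip built in step 1. This is a sketch, not the official instructions: the role ARN is a placeholder you must replace with your own S3 execution role, and the runtime shown is an assumption — use whichever Node.js runtime your script targets.

```shell
# Create the Lambda function from the zip built in step 1.
# The role ARN is a placeholder -- substitute your own S3 execution role.
aws lambda create-function \
  --function-name S3ToLoggly \
  --runtime nodejs18.x \
  --handler S3ToLoggly.handler \
  --zip-file fileb://S3ToLoggly.zip \
  --role arn:aws:iam::123456789012:role/lambda-s3-execution-role \
  --memory-size 128 \
  --timeout 10
```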
Next, configure an event source to call S3ToLoggly when logs are added to your S3 bucket. Go to the AWS Lambda Console, make sure the S3ToLoggly function is selected, then click “Actions -> Add event source” and fill in the following details:
- Event source type: S3
- Bucket: Choose the S3 bucket that contains your logs.
- Event type: ObjectCreated (All)
Then, in the S3 Management Console, click the S3 bucket that contains your logs. Under Properties -> Events, select “Send to Lambda Function” and choose the Lambda function you created.
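The same notification can be wired up from the AWS CLI. This is a hedged sketch: the account ID, bucket name, and region below are placeholders you must replace with your own values.

```shell
# Grant S3 permission to invoke the function, then add the bucket notification.
# Account ID, bucket name, and region are placeholders.
aws lambda add-permission \
  --function-name S3ToLoggly \
  --statement-id s3-invoke \
  --action lambda:InvokeFunction \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::my-log-bucket

aws s3api put-bucket-notification-configuration \
  --bucket my-log-bucket \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:S3ToLoggly",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
```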
3. Tag Your Logs in S3
The Lambda script looks for your customer token in an S3 bucket tag and uses it to send data to your account. It also reads a Loggly tag, which makes the logs easier to find in search. In the S3 Management Console, click the S3 bucket that contains your logs. Under Properties -> Tags, add the following tags:
- Key: loggly-customer-token, Value: your customer token from the source setup page
- Key: loggly-tag, Value: the tag to apply to these logs in Loggly (for example, s3_lambda_tag, which is used in the verification step below)
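You can also set these tags from the AWS CLI. This is a sketch with placeholder values: replace the bucket name and TOKEN with your own, and note that put-bucket-tagging replaces the bucket's entire tag set, so include any existing tags you want to keep.

```shell
# Tag the bucket so the Lambda script can find your customer token.
# Bucket name and TOKEN are placeholders; this call replaces the whole
# tag set, so include any existing tags as well.
aws s3api put-bucket-tagging \
  --bucket my-log-bucket \
  --tagging 'TagSet=[{Key=loggly-customer-token,Value=TOKEN},{Key=loggly-tag,Value=s3_lambda_tag}]'
```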
4. Verify Events
Search Loggly for events with the tag s3_lambda_tag over the past 20 minutes. It may take a few minutes to index the events. If it doesn’t work, see the troubleshooting section below.
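You can also check programmatically by querying Loggly's search API. This is a hedged sketch: SUBDOMAIN, USERNAME, and PASSWORD are placeholders for your own account details, and it assumes the apiv2 search interface, which initiates a search and returns an rsid you can use to retrieve the matching events.

```shell
# Start a search for events tagged s3_lambda_tag over the past 20 minutes.
# SUBDOMAIN, USERNAME, and PASSWORD are placeholders.
curl -s -u USERNAME:PASSWORD \
  "https://SUBDOMAIN.loggly.com/apiv2/search?q=tag:s3_lambda_tag&from=-20m&until=now"
```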
Option 2: Bash Script
Use this script if you prefer to run on your own server infrastructure without using Lambda. It currently only supports uncompressed line-separated text files.
1. Run The Script
Run our automatic configure-s3-file-monitoring script below to continuously read the files in your bucket and send the logs to Loggly through your syslog daemon. Alternatively, you can follow our manual configuration instructions below.
curl -O https://www.loggly.com/install/configure-s3-file-monitoring.sh
sudo bash configure-s3-file-monitoring.sh -a SUBDOMAIN -u USERNAME -s3url S3URL -s3l S3ALIAS
- SUBDOMAIN: your account subdomain that you created when you signed up for Loggly
- USERNAME: your Loggly username, which is visible at the top right of the Loggly console
- S3URL: an s3 URL that points to either a file or bucket. For example, s3://bucket/file.
- S3ALIAS: an easy-to-recognize alias that must be unique for each S3 URL
You will need to enter your system root password so the script can update your rsyslog configuration. It will then prompt for your Loggly password.
2. Verify Events
Search Loggly for events with the s3file tag over the past 20 minutes. It may take a few minutes to index the event. If it doesn’t work, see the troubleshooting section below.
Advanced AWS S3 Logging Options
- s3cmd – A command-line tool for downloading data from S3
- GitHub Source – View the source or suggest improvements
- Loggly Libraries Catalog – New libraries are added to our catalog
- Search or post your own S3 logging question in the community forum.
Troubleshooting
If you don’t see any data show up in the verification step, check for these common problems:
- Wait a few minutes in case indexing needs to catch up
- Make sure you’ve included your own customer token
- Make sure you have configured the roles as described above.
- Search or post your own Amazon S3 logging questions in the community forum.