Sending FreeBSD logs to Amazon's CloudWatch Logs service


Currently I have an old legacy process that syncs server logs from cloud servers back to a centralized server using rsync/ssh. I want to make this pain go away.

In a perfect world, since our servers mostly live in EC2, I would love to point all the logs we care about at AWS's CloudWatch Logs service. The problem I run into is that AWS does not provide a FreeBSD script or package to do this, only Linux. AWS uses a Python script to make this bit happen. I am not much of a coder, just enough to break stuff, so I thought I would ask the community the following questions.

1. Has anyone found an alternate way to send FreeBSD logs to AWS's CloudWatch Logs service?

2. If anyone is interested, I have attached the actual Python install script to this thread (changed extension to .txt to upload) for you Python experts out there. If anyone is interested in taking a stab at modifying the script for BSD, maybe we can work out some compensation.
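(For reference, the API the AWS agent wraps can also be called directly with boto3, which runs fine on FreeBSD. Below is a rough sketch, not the official agent: the log group/stream names are made up, and it assumes AWS credentials are already configured.)

```python
# Sketch: push the tail of a FreeBSD log file to CloudWatch Logs with boto3.
# Group/stream names here are placeholders, not anything AWS prescribes.
import time

def to_events(lines):
    """Convert raw log lines to CloudWatch Logs event dicts.
    Each event needs an epoch-milliseconds timestamp and a message."""
    now_ms = int(time.time() * 1000)
    return [{"timestamp": now_ms, "message": line.rstrip("\n")}
            for line in lines if line.strip()]

def push(lines, group="freebsd/messages", stream="web01"):
    import boto3  # pkg install py39-boto3, or pip install boto3
    client = boto3.client("logs")
    # Create group/stream if they do not exist yet; ignore "already exists".
    try:
        client.create_log_group(logGroupName=group)
    except client.exceptions.ResourceAlreadyExistsException:
        pass
    try:
        client.create_log_stream(logGroupName=group, logStreamName=stream)
    except client.exceptions.ResourceAlreadyExistsException:
        pass
    client.put_log_events(logGroupName=group, logStreamName=stream,
                          logEvents=to_events(lines))

if __name__ == "__main__":
    with open("/var/log/messages") as f:
        push(f.readlines()[-100:])  # ship the last 100 lines
```

This could be run from cron as a stopgap; it does not tail the file continuously the way the agent does.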

Thanks for your time,




  • awslogs-agent-setup.txt
    46.9 KB
Hi Lee,
I made some hacks to the script to get it to run on FreeBSD, so it now installs everything it does on a Linux system. However, the files that are created/installed also have to be modified to run in a FreeBSD environment. This is a much bigger job than hacking the Python script. Should you still be interested in the script so far, let me know.

Henk Dijkstra
Did you ever make progress with this project? I am looking into using AWS CloudWatch Logs with FreeBSD. Up to now I have successfully used it with CentOS and Amazon Linux; it would be great to leverage your work if possible, ljimber and Henk Dijkstra. Thank you.
Hi Lee,
I have not continued working on the log script (so it is at the state of December 2016, but the FreeBSD log entries are still being sent to CloudWatch).
We are now fully occupied with transferring our old datacenter (all FreeBSD) to AWS.
We have learned a lot about how to use FreeBSD in the cloud. We can now fire up EC2 instances in an autoscale group just by running a boto3 Python script (and we finally found out how to pass the userdata to the new instance at boot time).
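(To give an idea of what that looks like: passing userdata at launch is just an argument to `run_instances`. This is a sketch with placeholder AMI/key names, not our production script; how the userdata actually gets executed at first boot depends on how the FreeBSD image was built.)

```python
# Sketch: launch a FreeBSD EC2 instance with a userdata script via boto3.
# AMI ID, key name, and instance type below are placeholders/assumptions.

USER_DATA = """#!/bin/sh
# Runs at first boot, provided the image has a firstboot/cloud-init
# mechanism wired up to consume EC2 userdata.
echo "hello from userdata" >> /var/log/firstboot.log
"""

def launch(ami="ami-xxxxxxxx", key="mykey", instance_type="t3.micro"):
    import boto3  # imported here so the sketch is importable without boto3
    ec2 = boto3.client("ec2")
    resp = ec2.run_instances(
        ImageId=ami,
        InstanceType=instance_type,
        KeyName=key,
        MinCount=1, MaxCount=1,
        UserData=USER_DATA,  # boto3 base64-encodes this for you
    )
    return resp["Instances"][0]["InstanceId"]
```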
We found out how to use a Lambda function to change the IP address in Route 53 for a server we instantiate with such boto3 scripts, and we found out how to create multiple tiers (we currently use EFS to share data between the tiers, but for new development we are considering SQS).
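(The Route 53 part boils down to an UPSERT of an A record from a Lambda handler. A minimal sketch, assuming the Lambda is triggered by an EC2 state-change event and that the zone ID and record name are placeholders:)

```python
# Sketch: Lambda that points a Route 53 A record at a freshly
# launched instance's public IP. Zone ID and hostname are made up.

def change_batch(name, ip, ttl=60):
    """Build the UPSERT change batch for an A record."""
    return {
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": name,
                "Type": "A",
                "TTL": ttl,
                "ResourceRecords": [{"Value": ip}],
            },
        }]
    }

def handler(event, context):
    import boto3  # available in the Lambda Python runtime
    # Assumes an EC2 instance state-change event as the trigger.
    instance_id = event["detail"]["instance-id"]
    ec2 = boto3.client("ec2")
    ip = (ec2.describe_instances(InstanceIds=[instance_id])
          ["Reservations"][0]["Instances"][0]["PublicIpAddress"])
    boto3.client("route53").change_resource_record_sets(
        HostedZoneId="ZXXXXXXXXXXXXX",  # placeholder hosted zone ID
        ChangeBatch=change_batch("web01.example.com.", ip),
    )
```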
There is still a lot of work to be done, but I think we are making good progress.


  • awslogs-agent-setup.txt
    44.3 KB