My IT team recently activated the MySQL general log to meet regulatory requirements. However, we’ve encountered significant performance issues:
- After 20+ hours of activation, the general log generated over 3 million rows.
- Today, when my team attempted to query the general log data, our application experienced errors.
- Our Grafana dashboard showed 100% CPU usage during the issue.
Our database server has the following hardware specifications:
- vCPU
- GB RAM
It’s clear that the current hardware is insufficient to handle the load caused by the general log. I’ve explored alternatives like using Fluentd to forward logs, but I’m unsure if this is the right approach.
Here is my current MySQL general log configuration:
general_log_file = /var/log/mysql/mysql.log
general_log = 1
log_output = 'table'
Here are my questions:
- Should I consider upgrading the server hardware, or is there a better way to manage the general log efficiently?
- Are there any best practices or tools that can help offload and process the MySQL general log data without impacting server performance?
2 Answers
You definitely need to deploy on server hardware that is capable of serving your traffic. If you have a regulatory requirement to keep query logs, then that’s what you need to do.
You could process query logs on a different server. This is what I’d do. It would reduce the impact on your database.
First, use file-based query logging:

log_output = FILE

Periodically, rename the current log file (query.log) to a dated name such as query-YYYYMMDD.log, then run:

FLUSH LOGS;

This closes the old log file and creates a new log file with the original name. You can then scp the dated log file to your log-archive server, where you can do whatever you want with it.

You might want to investigate using a log-search product such as Graylog or another product. Check them out and do some proof-of-concept tests to see which one meets your needs. Stack Overflow is not a place for technology product recommendations, because you know your needs better than we do. It’s up to you to evaluate several options.
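The rotate-flush-ship steps described above can be sketched as a small shell function. The log path, archive destination, and date format are placeholders for this example, and the function assumes the `mysql` client can run `FLUSH LOGS` without prompting for credentials (e.g. via a login path or option file):

```shell
#!/bin/sh
# Sketch: rotate the MySQL general query log and ship it to an archive host.
# Paths and the archive destination are placeholders, not values from the question.

rotate_general_log() {
    log_file=$1        # current general log, e.g. /var/log/mysql/mysql.log
    archive_dest=$2    # e.g. backup@log-archive:/archive/mysql/
    dated_file="$(dirname "$log_file")/query-$(date +%Y%m%d).log"

    # Rename the current log; mysqld keeps writing to the open file handle.
    mv "$log_file" "$dated_file"

    # FLUSH LOGS closes the old file and reopens one under the original name.
    mysql -e 'FLUSH LOGS;'

    # Copy the dated file off-box so log analysis never touches the DB server.
    scp "$dated_file" "$archive_dest"
}
```

You could run something like `rotate_general_log /var/log/mysql/mysql.log backup@log-archive:/archive/mysql/` from a daily cron job, so the file on the database server stays small and all searching happens on the archive host.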
You said your application "experienced errors" when searching the log file, but that’s not enough description to make a guess at what a solution could be. It’s like telling your auto mechanic that your car "makes a funny sound."
I don’t recommend saving the general log into tables, as it has a significant impact on MySQL’s performance. Set log_output to FILE to store the general log in files. You can then use a log tool to collect it in real time and analyze it elsewhere, instead of querying MySQL’s mysql.general_log table. Both reading from and writing to mysql.general_log consume I/O and CPU, and thus affect the performance of MySQL.
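For reference, the file-based equivalent of the configuration in the question would look like this (same file path as the question; this is a sketch, not a tested config):

```ini
[mysqld]
general_log      = 1
log_output       = FILE            # write to a file instead of the mysql.general_log table
general_log_file = /var/log/mysql/mysql.log
```

Note that log_output and general_log are dynamic system variables, so you can also switch without a restart using `SET GLOBAL log_output = 'FILE';` followed by `SET GLOBAL general_log = ON;`.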