Overview
In most Linux distributions, the MySQL server installed from the default packages is configured conservatively to ensure stable operation, and the query cache is typically disabled or set to a very small size. As a result, the database uses little system memory to cache repeated queries. Enabling the cache and tuning it to the needs of the application can produce a visible improvement in most applications, particularly Magento eCommerce stores.
For initial testing, a cache size of 32M was decided on as a starting point. Prior to any changes, the database cache parameters located in /etc/my.cnf were configured as:
    query_cache_limit    1M
    query_cache_size     0
    query_cache_type     ON
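As a quick check, the values the running server is actually using can be read directly from MySQL. The snippet below is a minimal sketch; it assumes a root login is available, and the credentials on a given install will differ:

    # Show the query cache variables the running server is using
    mysql -u root -p -e "SHOW VARIABLES LIKE 'query_cache%';"

    # Optionally, show the cache hit/insert counters to judge how effective it is
    mysql -u root -p -e "SHOW STATUS LIKE 'Qcache%';"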
Changing the parameters
To make the changes, the file /etc/my.cnf was edited to include the following parameters and the database was restarted. After the changes, the database reports the following:
    query_cache_limit    2M
    query_cache_size     32M
    query_cache_type     ON
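For reference, one way the edit might look in /etc/my.cnf is sketched below. The section layout and the service name used to restart MySQL (mysqld versus mysql) vary between distributions, so treat this as an assumption rather than a verbatim copy of the configuration used here:

    [mysqld]
    # Enable the query cache and let it grow to 32M,
    # caching individual result sets of up to 2M each
    query_cache_type  = 1
    query_cache_limit = 2M
    query_cache_size  = 32M

    # Restart MySQL so the new settings take effect
    service mysqld restart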
Testing
To test the changes, we installed a Magento eCommerce store with the default demo product catalogue and timed the fetch of every product page using siege. Increasing the query cache size shaved a further 200ms off the site's response time. This simple change suggests there is more room for improvement in the parameters and layout of the database sub-system when optimising it specifically for Magento, bearing in mind that tuning for Magento can be a tricky task at best!
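As an illustration of the kind of run involved, the command below is a sketch only: the urls.txt file (a list of product page URLs) and the concurrency and repetition figures are assumptions, not the exact values from this test.

    # Fetch each product URL listed in urls.txt with 10 concurrent users,
    # repeating the list 5 times, and report the average response time
    siege -c 10 -r 5 -f urls.txt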
Traps!
The popular MySQLTuner script (as detailed in our MySQL Performance Tuning and Optimisation guide) was run and indicated that changing the thread count would be beneficial; however, this change resulted in a performance drop of 290ms!
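For context, MySQLTuner is a Perl script run against the live server. A typical invocation, assuming the script has already been downloaded from its project page, looks like this:

    # Run MySQLTuner against the local server and review its recommendations
    perl mysqltuner.pl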
Conclusion
Out of the box, the MySQL database configuration is good, but it can be significantly better. Increasing the query cache size provided an immediate benefit. Beyond database tuning, caching technologies such as Memcached and Redis can further improve performance by caching frequently referenced data.