My website can handle 40,000 or more simultaneous users and stay fast, but a search-engine bot will kill MySQL. It's been driving me insane: once the bots arrive, the site shows "Could not connect: too many connections" and I have to manually restart mysqld to get the website back up. I have been fighting this for a year. I have made countless Apache and MySQL tuning adjustments and nothing seems to work. I have raised max_connections from 300 to 1800 to 10000, and that does not fix the bot problem.
I use Amazon Linux on a very large instance, so RAM is not an issue. I have been through countless tech-support sessions and they never find anything wrong, so I have to assume it has something to do with my programming. I do not use WordPress; I built my site from scratch, but like I said, it handles 40,000 people no problem. Bots, though, crash it.
My connect script is simple (credentials replaced with placeholders here):

$link = mysql_connect('localhost', 'username', 'password')
    or die('Could not connect: ' . mysql_error());
The odd thing is, the current-connections count always shows "1", even when there are 2,000 people on the site. That is why I feel like I'm doing something wrong with how I connect to the database.
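For reference, this is roughly how I'm sampling that number (a quick sketch using the standard mysql CLI; user and password flags are placeholders):

```shell
# Connections open right now
mysql -u root -p -e "SHOW STATUS LIKE 'Threads_connected';"

# Who holds each connection and what it is running
mysql -u root -p -e "SHOW FULL PROCESSLIST;"

# Peak simultaneous connections since the server last started
mysql -u root -p -e "SHOW STATUS LIKE 'Max_used_connections';"
```

If Max_used_connections is also low, then the bots are presumably exhausting connections in short bursts between my samples rather than holding them open.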
Does anyone have experience or advice on keeping a site running at all times under heavy bot traffic? PLEASE!!! I repeat: this is not a raise-max_connections issue.