I am the sole dev working with a very large warehouse management system for a small business. Our software is housed on a local server that is only accessible within the building, but I also have a local copy of the system on my machine to use for development purposes before things get pushed to a git repository to be pulled into the live system. Pretty standard stuff.
The database is huge: it contains every order, product, and customer the company has handled since it started, around 10 years ago. I obviously have to keep my local database up to date for troubleshooting specific orders, etc., and I can’t import a database with millions of records through phpMyAdmin, so I use BigDump to stagger the import whenever I need to sync my local dev db with the production db.
Here is my problem: Before I can import the updated DB with BigDump, I have to completely drop the dev db and create a blank one with the same name. So I run the script:
DROP DATABASE db_name;
CREATE DATABASE db_name;
I KNOW this is very dangerous, and I am terrified that one day I will accidentally run this on the production db and completely delete our entire warehouse db. (We of course back up the production db regularly, but we get orders every minute or so, and if I drop the db at the wrong moment, there is a good chance we will lose a lot of data.)
What should I add to the script to make sure it only runs against my local machine’s development db server? I know I could add a USE statement for the db_name, or just make the db_name different on my dev server, but this is a legacy piece of software that I didn’t design, and for various reasons I have to keep the db_name the same in both environments. Is there a way to write something like a USE statement for the server name? Both servers are named localhost, so does that throw another wrench into the situation? As you may be able to tell, server configuration is not my strongest skillset.
Here are my two DB server specs (if that is helpful):
Local Dev DB Server
Server: Localhost via UNIX socket
Server type: MySQL
Server version: 5.7.26 – MySQL Community Server (GPL)
Protocol version: 10
Server charset: UTF-8 Unicode (utf8)

Live Production DB Server
Server: MariaDB 10 (Localhost via UNIX socket)
Server type: MariaDB
Server version: 10.0.32-MariaDB – Source distribution
Protocol version: 10
Server charset: UTF-8 Unicode (utf8)
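One hedged idea, given that the two servers above report different version strings (MySQL 5.7 on dev, MariaDB 10 on production): the script itself can check @@version and refuse to run when it sees MariaDB. The procedure name below is invented for illustration, and this assumes the script is run through a client that stops on the first error (the mysql batch client does by default, unless invoked with --force):

```sql
-- Sketch only: guard based on the fact that dev is MySQL and prod is MariaDB.
-- The procedure name assert_dev_server is made up for this example.
DELIMITER //
CREATE PROCEDURE assert_dev_server()
BEGIN
    IF @@version LIKE '%MariaDB%' THEN
        SIGNAL SQLSTATE '45000'
            SET MESSAGE_TEXT = 'Refusing to run: this looks like the production server';
    END IF;
END//
DELIMITER ;

CALL assert_dev_server();  -- errors out on MariaDB, aborting the rest of the script
DROP DATABASE db_name;
CREATE DATABASE db_name;
```

Note this guard only works because the two environments happen to run different server software; if production ever moves to MySQL, it silently stops protecting you.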
2 Answers
This seems like more of a DBA question than a PHP question, IMHO, but it is relevant: I would revoke the DROP privilege (drop_priv) from any remote users of the production database.
I would look at this link for more details.
Prevent Drop Databases From MySQL
And what I would actually do is make sure there are two different user accounts, one per environment. On the production database, only ever EVER access it from your 'dev-name'@'my-pc-name-or-ip' account if you need to reach it from your machine, and make sure that user cannot drop databases. Strict adherence to 'user-name'@'ip-address' permissions, instead of 'username'@'%', will go a long way here.
Do not ever use the full-permission root user on the production database; there’s no need. There’s also no need for any external user to have DROP permission on your production database, really. If it were me, running the risk of dropping all tables in production in some corner of a workflow, I would make sure the only user with DROP privileges is 'root'@'localhost'.
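A hedged sketch of what that restricted production account might look like (the account name, IP, and password are placeholders; adjust the privilege list to what your app actually needs — the point is that DROP is absent):

```sql
-- Hypothetical remote dev account on the PRODUCTION server.
-- Note: DROP is deliberately not in the grant list.
CREATE USER 'dev-name'@'192.168.1.50' IDENTIFIED BY 'change-me';
GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, ALTER, INDEX
    ON db_name.* TO 'dev-name'@'192.168.1.50';

-- Verify: DROP should not appear anywhere in this output.
SHOW GRANTS FOR 'dev-name'@'192.168.1.50';
```

On your local dev server, the equivalent account can keep DROP, so the reimport script succeeds there and fails with a privilege error anywhere else.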
Also, I would check your backup policy for the worst case. You should know how often it’s backing up.
Try to use environment variables. Here is a nice article:
Using Environment Variables in PHP
You should define your env., like
ENV=dev
or ENV=prod
etc., in the configuration of your Apache or Nginx virtual host.
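A minimal sketch of the Apache side, assuming mod_env is enabled (the server name and document root are placeholders):

```apache
# Dev machine's virtual host ONLY -- the production vhost would set
# "SetEnv ENV prod" instead, or set nothing at all.
<VirtualHost *:80>
    ServerName warehouse.local
    DocumentRoot /var/www/warehouse
    SetEnv ENV dev
</VirtualHost>
```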
Then just check your environment in the code.
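A hedged sketch of that check in PHP, assuming the vhost sets ENV=dev as above (how the variable reaches PHP can differ between mod_php and FastCGI setups, so verify getenv sees it in yours):

```php
<?php
// Refuse to run the destructive reimport step unless the ENV variable
// set in the dev virtual host says this is the dev box.
if (getenv('ENV') !== 'dev') {
    die("Refusing to drop the database: ENV is not 'dev'.\n");
}
// Safe to proceed with DROP DATABASE / CREATE DATABASE here.
```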