We have customers, represented by Customer objects. One of these, which I created in a migration as Customer(account='INTERNAL', ...), is a placeholder for when we manufacture stuff for internal consumption.
Customers come and go, so the objects need to be deletable. But this single row must not be deleted.
What is the best way to stop it being accidentally deleted? I could replace every Customer.objects.get(account='INTERNAL')
with get_or_create
so the row is self-resurrecting, but that feels like storing up trouble for the future: if there are ForeignKey references to it and it gets deleted, they will break. (I can't use on_delete=models.PROTECT on those ForeignKeys, because other customers must remain deletable.)
Tagged [postgresql] because that's what we use. I suspect there might be a way to use database functionality to do what I want, but I'd appreciate it if the answer showed how to do that as a Django migration (if that's possible); I know more about Django than SQL and Postgres extensions.
[update] As an imperfect fix, I have overridden the Customer.delete method so it raises an error if the account name is INTERNAL. But this isn't bullet-proof. It may be good enough, but it doesn't feel like the right thing.
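For concreteness, here is a framework-free sketch of the stop-gap the update describes (the class and error names are my own, not from the post). In real Django code the same check would sit in Customer.delete() before calling super().delete(); note that bulk deletes via QuerySet.delete() bypass Model.delete(), which is one reason this guard is not bullet-proof.

```python
class ProtectedRowError(Exception):
    """Raised when code tries to delete the INTERNAL sentinel row."""


class Customer:
    """Minimal stand-in for the Django model, to show the guard logic."""

    def __init__(self, account):
        self.account = account
        self.deleted = False

    def delete(self):
        # The guard: refuse to delete the sentinel, allow everything else.
        if self.account == "INTERNAL":
            raise ProtectedRowError("the INTERNAL customer must not be deleted")
        self.deleted = True  # stand-in for the real row deletion
```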
3 Answers
You can add an extra field, e.g. customer_type, and assign the value "internal" to the row that must not be deleted and some other value to all other customers. When deleting, check this column.
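A minimal framework-free sketch of this answer's idea (the function and exception names are assumptions): the check on the customer_type column happens before any delete is carried out.

```python
class DeleteForbidden(Exception):
    """Raised when a delete targets a row flagged as internal."""


def delete_customer_row(row):
    """row is any object with customer_type and delete(), e.g. a model instance."""
    # Check the flag column before deleting, as the answer suggests.
    if row.customer_type == "internal":
        raise DeleteForbidden("internal customers cannot be deleted")
    row.delete()
```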
You can create a special model for this case, with a ForeignKey to Customer declared with
on_delete=models.PROTECT
The related record will prevent deletion of that Customer for as long as it exists. When you try to remove the protected customer, Django will raise an error; other customers, which have no such related record, stay freely deletable.
I would suggest covering this requirement with authorization (and eventually authentication). This can be done at the DBMS level (PostgreSQL) or at the API level.
Example: in your API you design/develop the following two methods:
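The original answer does not show its two methods, so here is one hypothetical reading (every name below is invented for illustration): an ordinary delete endpoint that refuses the sentinel account, and a separate, privileged endpoint that is only reachable by authorized callers.

```python
# Accounts that the public API must never delete.
PROTECTED_ACCOUNTS = {"INTERNAL"}


class Forbidden(Exception):
    """Raised when the public API is asked to delete a protected account."""


def delete_customer(repo, account):
    """Public API method: ordinary callers can never delete the sentinel."""
    if account in PROTECTED_ACCOUNTS:
        raise Forbidden(f"{account} may not be deleted through this API")
    repo.delete(account)


def admin_delete_customer(repo, account):
    """Privileged API method: exposed only behind separate authorization."""
    repo.delete(account)
```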
At the DBMS level, I'm inclined to believe you can achieve this as well with grants and/or views.
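Since the question asks for DBMS-level enforcement expressed as a Django migration, here is a hedged sketch using a different Postgres mechanism than grants/views: a BEFORE DELETE trigger. The table name (myapp_customer), app label, and migration dependency are assumptions you would adapt; EXECUTE FUNCTION requires PostgreSQL 11+, so this fragment is not runnable without a Postgres database behind it.

```python
from django.db import migrations

FORWARD_SQL = """
CREATE FUNCTION forbid_internal_delete() RETURNS trigger AS $$
BEGIN
    IF OLD.account = 'INTERNAL' THEN
        RAISE EXCEPTION 'the INTERNAL customer row must not be deleted';
    END IF;
    RETURN OLD;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER forbid_internal_delete
    BEFORE DELETE ON myapp_customer
    FOR EACH ROW EXECUTE FUNCTION forbid_internal_delete();
"""

REVERSE_SQL = """
DROP TRIGGER forbid_internal_delete ON myapp_customer;
DROP FUNCTION forbid_internal_delete();
"""


class Migration(migrations.Migration):
    # Dependency name is hypothetical; point it at your latest migration.
    dependencies = [("myapp", "0002_create_internal_customer")]

    operations = [
        migrations.RunSQL(sql=FORWARD_SQL, reverse_sql=REVERSE_SQL),
    ]
```

Unlike a Model.delete() override, the trigger also blocks bulk deletes, cascades, and raw SQL, since it lives in the database itself.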