Something to share.
Not sure about your site, but I believe most custom-made sites don't handle record duplication and deletion effectively.
1. Take a scenario where we have a few tables related to each other. Every time someone updates one table, the script needs to go to the other tables and add an entry/entries accordingly. Deleting a record should work the same way, but some sites don't delete the related entries in the other tables. As a result, there are a lot of unwanted records in the db.
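One way to avoid hand-coding the related deletes is to let the db itself cascade them. A minimal sketch with SQLite (table and column names here are made up for illustration):

```python
import sqlite3

# Let the database cascade the delete instead of relying on
# application code to remember every related table.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # off by default in SQLite

conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("""CREATE TABLE comments (
    id INTEGER PRIMARY KEY,
    post_id INTEGER NOT NULL REFERENCES posts(id) ON DELETE CASCADE,
    body TEXT)""")

conn.execute("INSERT INTO posts (id, title) VALUES (1, 'hello')")
conn.execute("INSERT INTO comments (post_id, body) VALUES (1, 'first')")
conn.execute("INSERT INTO comments (post_id, body) VALUES (1, 'second')")

# Deleting the post removes its comments too -- no orphans left behind.
conn.execute("DELETE FROM posts WHERE id = 1")
orphans = conn.execute("SELECT COUNT(*) FROM comments").fetchone()[0]
print(orphans)  # 0
```

Most engines (MySQL/InnoDB, PostgreSQL) support `ON DELETE CASCADE` too, so this isn't SQLite-specific.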
2. Another tricky part is when there are links to images in the db. Deleting the record means deleting the actual file as well, not just removing the entry from the table. Most sites don't remove the actual file/image, so the site keeps growing and growing with many unwanted images.
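The fix is just to look up the file path before dropping the row, then remove both. A rough sketch (schema and paths are hypothetical):

```python
import os
import sqlite3
import tempfile

upload_dir = tempfile.mkdtemp()
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE photos (id INTEGER PRIMARY KEY, path TEXT)")

# Simulate an upload: a real file on disk plus a row pointing at it.
path = os.path.join(upload_dir, "cat.jpg")
with open(path, "wb") as f:
    f.write(b"fake image bytes")
conn.execute("INSERT INTO photos (id, path) VALUES (1, ?)", (path,))

def delete_photo(photo_id):
    # Fetch the path first, delete the row, then the file itself.
    row = conn.execute("SELECT path FROM photos WHERE id = ?",
                       (photo_id,)).fetchone()
    if row is None:
        return
    conn.execute("DELETE FROM photos WHERE id = ?", (photo_id,))
    conn.commit()
    if os.path.exists(row[0]):
        os.remove(row[0])

delete_photo(1)
print(os.path.exists(path))  # False -- file is gone with the row
```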
3. End users love the "duplicate" feature. Sometimes, setting up a new post/record involves adding records to about 5-6 tables. When we duplicate a record, we should duplicate the image as well and change the link to a filename with a new timestamp, so that when we remove the duplicate we do NOT remove the original image. This is not happening on many sites at the moment, I think because it is too troublesome to code.
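The image part of the duplicate is only a few lines though. A sketch of the timestamped-copy idea (filenames are made up):

```python
import os
import shutil
import tempfile
import time

upload_dir = tempfile.mkdtemp()
original = os.path.join(upload_dir, "banner.jpg")
with open(original, "wb") as f:
    f.write(b"fake image bytes")

def duplicate_image(src):
    # Copy the file under a new timestamped name,
    # e.g. banner.jpg -> banner_1700000000.jpg
    base, ext = os.path.splitext(os.path.basename(src))
    copy = os.path.join(os.path.dirname(src),
                        f"{base}_{int(time.time())}{ext}")
    shutil.copyfile(src, copy)
    return copy  # store this path in the duplicated record

copy = duplicate_image(original)
os.remove(copy)                  # deleting the duplicate later...
print(os.path.exists(original))  # True -- original survives
```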
As a result of not removing the actual images or other related records when a record is deleted, the site only gets bigger and the db gets flooded with zombie entries.
One option is to write a script to do a file or db cleanup every 3-6 months or so. I think this can get fairly complicated because each table's requirements are different, especially when a typical site has something like 80 tables.
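For the file side, the cleanup pass can be as simple as comparing the upload folder against what the db still references, and deleting the difference. A rough sketch under those assumptions (table and folder names are invented):

```python
import os
import sqlite3
import tempfile

upload_dir = tempfile.mkdtemp()
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE photos (id INTEGER PRIMARY KEY, filename TEXT)")

# One referenced file, one orphan left behind by an old delete.
for name in ("kept.jpg", "orphan.jpg"):
    open(os.path.join(upload_dir, name), "wb").close()
conn.execute("INSERT INTO photos (filename) VALUES ('kept.jpg')")

# Cleanup pass: remove any file no row points at any more.
referenced = {row[0] for row in conn.execute("SELECT filename FROM photos")}
for name in os.listdir(upload_dir):
    if name not in referenced:
        os.remove(os.path.join(upload_dir, name))

print(sorted(os.listdir(upload_dir)))  # ['kept.jpg']
```

The db side is harder, as said above: with ~80 tables you need to know each table's parent/child relationships, which is exactly the information foreign keys would have encoded for free.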
This is the conclusion I reached from working on a framework at a company.
Hope some people find this information useful.