In the following post, I explore and elaborate on the guidelines and practices that have become part of my daily workflow (as a GIS Analyst/GIS Technician). These apply to projects with a medium to long lifespan; they might not be necessary for short projects that will never be revisited.
The first concept, and maybe the most important when working on projects with large data collections: every file used must have a standardized information structure, linked or related to the rest of the data. This makes verification, interoperability, and cross-referencing far easier.
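To make this concrete, here is a minimal sketch of what an automated structure check could look like. The naming pattern (project_layer_YYYYMMDD.gpkg) and the folder layout are assumptions I made for illustration; your own standard will look different.

```python
import re
from pathlib import Path

# Hypothetical naming standard: <project>_<layer>_<YYYYMMDD>.gpkg
# e.g. "riverbasin_landuse_20240115.gpkg"
NAME_PATTERN = re.compile(r"^[a-z0-9]+_[a-z0-9]+_\d{8}\.gpkg$")

def check_structure(data_dir: str) -> list[str]:
    """Return the files in data_dir that break the naming standard."""
    offenders = []
    for path in Path(data_dir).glob("*.gpkg"):
        if not NAME_PATTERN.match(path.name):
            offenders.append(path.name)
    return offenders

if __name__ == "__main__":
    for name in check_structure("./project_data"):
        print(f"Non-standard file name: {name}")
```

A check like this can run before every delivery, so a file that cannot be cross-referenced never slips into the project unnoticed.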
Next comes building a versioning system that also encompasses backups. When uploading data to update a file, a historic log helps in understanding the changes that made the file what it is, and the choices behind them. Furthermore, some “corrections” are done without the needed care and end up turning a dataset into trash. In these cases, you need a safe place, out of reach of the everyday user, from which you can recall the old version with the correct information.
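A minimal sketch of that idea: before any destructive edit, copy the current version into a restricted backup folder and append one line to a change log. The paths and the log format here are assumptions, not a prescription.

```python
import shutil
from datetime import datetime
from pathlib import Path

# Assumed layout: backups live in a folder everyday users do not touch.
BACKUP_DIR = Path("./backups")
LOG_FILE = Path("./changes.log")

def backup_before_edit(file_path: str, reason: str) -> Path:
    """Copy the current version aside and record why it is being changed."""
    src = Path(file_path)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    BACKUP_DIR.mkdir(exist_ok=True)
    dest = BACKUP_DIR / f"{src.stem}_{stamp}{src.suffix}"
    shutil.copy2(src, dest)  # copy2 preserves timestamps along with content
    with LOG_FILE.open("a", encoding="utf-8") as log:
        log.write(f"{stamp}\t{src.name}\t{reason}\n")
    return dest

# Usage, before every risky edit:
# backup_before_edit("riverbasin_landuse_20240115.gpkg",
#                    "corrected land-use codes in zone 4")
```

The log gives you the historic trail; the backup folder gives you the safe place to recall an old version from when a “correction” goes wrong.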
However, possessing all the backups is not enough; you must also keep the supporting data for every edit and change made to a file. This information should be placed in the metadata of each file. Of course, doing this can take up to twice the time, but it takes a tenth of the time to retrieve that information when you need it.
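One lightweight way to do this is a sidecar record next to each file. This is only a sketch; a formal workflow would write this into the file's proper metadata with dedicated tooling, but the principle is the same: every edit leaves a trace.

```python
import json
from datetime import datetime
from pathlib import Path

def record_edit(file_path: str, editor: str, description: str,
                source: str) -> None:
    """Append one edit record to a sidecar .meta.json next to the file."""
    meta_path = Path(file_path).with_suffix(".meta.json")
    records = []
    if meta_path.exists():
        records = json.loads(meta_path.read_text(encoding="utf-8"))
    records.append({
        "date": datetime.now().isoformat(timespec="seconds"),
        "editor": editor,
        "description": description,
        "source": source,  # where the supporting data came from
    })
    meta_path.write_text(json.dumps(records, indent=2), encoding="utf-8")

# Usage:
# record_edit("riverbasin_landuse_20240115.gpkg", "jdoe",
#             "reclassified parcels 102-118", "cadastre export 2024-01")
```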
And to finish this post, I recommend never underestimating metadata. While writing it is cumbersome and frustrating (or a guilty pleasure), metadata is a validation step that serious projects need. So treat it with the seriousness it deserves: investigate any local standards, find proper software, and fill in as many fields as you think are needed (keep it simple, but complete).
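As a starting point, this is the kind of simplified field set I have in mind, loosely inspired by ISO 19115. Treat the names and values as illustrative and defer to your local standard whenever one exists.

```python
# A simple but complete starter set of metadata fields (illustrative only).
dataset_metadata = {
    "title": "Land use, river basin study area",
    "abstract": "Polygons of land-use classes digitized from 2024 imagery.",
    "creation_date": "2024-01-15",
    "creator": "GIS Unit",
    "coordinate_system": "EPSG:25830",
    "lineage": "Digitized at 1:5000 over 2024 orthophotos.",
    "keywords": ["land use", "river basin"],
    "contact": "gis@example.org",
}
```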