A “double post” is the act of sending a file in for processing twice on a production system.
Most operators consider a “double post” to be far worse than a missing file or missing transmission, because files sent in for internal processing often cannot be cleanly backed out. Double posts involving hundreds or thousands of duplicate payment, payroll or provisioning transactions are relatively common, and they are feared at all levels of management because they take considerable time, expense and loss of face to clean up.
There are many technologies and techniques used today to guard against double posts. These include:
Remembering the cryptographic hashes of recently transmitted files. This allows file transfer software to catch double-posts of identical files and to quarantine and send alerts appropriately.
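This first technique can be sketched in a few lines. The class name, the in-memory store and the quarantine message below are illustrative assumptions; a production system would persist recent hashes and bound how long they are remembered.

```python
import hashlib

class DoublePostGuard:
    """Remembers SHA-256 hashes of recently received files so an exact
    duplicate can be quarantined instead of processed (hypothetical sketch)."""

    def __init__(self):
        self._seen = {}  # hex digest -> filename of the original transmission

    def check(self, filename: str, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest in self._seen:
            # Identical content already received: quarantine and alert.
            return f"QUARANTINE: duplicate of {self._seen[digest]}"
        self._seen[digest] = filename
        return "ACCEPT"

guard = DoublePostGuard()
guard.check("payroll_0601.csv", b"emp1,100\nemp2,200\n")   # accepted
guard.check("payroll_0601b.csv", b"emp1,100\nemp2,200\n")  # quarantined
```

Note that hashing only catches byte-identical resubmissions; a file regenerated with a new timestamp or header slips past this check, which is why the naming and record-level techniques below are also used.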
Enforcing naming and key-record schemes on incoming files. This often prevents external systems from blindly sending the same file or batch of records again and again.
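A minimal sketch of such a naming scheme follows. The filename convention (date plus an increasing sequence number) is an assumption chosen for illustration; real schemes vary by institution.

```python
import re

# Assumed convention: payroll_YYYYMMDD_NNNN.csv (date + sequence number)
PATTERN = re.compile(r"^payroll_(\d{8})_(\d{4})\.csv$")

def validate(filename: str, last_seq_by_date: dict) -> str:
    """Reject files with bad names or non-increasing sequence numbers,
    so the same file cannot be blindly posted twice (hypothetical sketch)."""
    m = PATTERN.match(filename)
    if not m:
        return "REJECT: filename does not match naming scheme"
    date, seq = m.group(1), int(m.group(2))
    if seq <= last_seq_by_date.get(date, 0):
        return "REJECT: duplicate or out-of-order sequence number"
    last_seq_by_date[date] = seq
    return "ACCEPT"
```

Because the sender must mint a new sequence number for each transmission, a straight resend of yesterday's file fails validation at the edge, before any records reach internal processing.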
Synchronizing internal knowledge of records processed with external file transfer systems. This advanced technique is EDI-ish in nature, as it requires file transfer technology to crack, read and interpret incoming files. However, it handles exceptions (such as possible “ignore and go on” cases) better than simpler “accept/reject file” workflows.
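The record-level approach can be sketched as follows. The record format and the shared set of processed keys are assumptions; in practice the keys would live in a database shared with the processing system.

```python
def process_file(records, processed_keys):
    """Post each record unless its key has already been processed,
    implementing an "ignore and go on" policy for duplicates instead of
    rejecting the whole file (hypothetical sketch)."""
    results = []
    for key, payload in records:
        if key in processed_keys:
            results.append((key, "IGNORED"))  # duplicate: skip, keep going
            continue
        processed_keys.add(key)               # record as processed
        results.append((key, "POSTED"))       # new record: post it
    return results
```

Unlike whole-file acceptance or rejection, this lets a resent file containing a mix of old and new records be partially processed: the duplicates are ignored while the genuinely new records still post.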