It seems quite difficult to find comparisons between system-versioned temporal tables and the older options, such as DB triggers and CDC. I don't currently have the time to run extended tests on SQL Server 2016, so I thought I'd ask about it here.
Basically, the typical advantages of triggers are that they are easier to manage in both stand-alone and clustered/AlwaysOn environments, they are synchronous and therefore real-time, and they have access to session data such as the user ID.
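For illustration, here's a minimal sketch of what I mean by a trigger-based audit, assuming a hypothetical `dbo.Employee` table and a matching `dbo.Employee_Audit` table (both names are mine, not from any real schema):

```sql
-- Hypothetical audit trigger: runs synchronously in the modifying
-- transaction, so the audit row commits together with the change
-- and session data (e.g. the login name) is available.
CREATE TRIGGER trg_Employee_Audit
ON dbo.Employee
AFTER UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- "deleted" holds the pre-change rows for UPDATE/DELETE.
    INSERT INTO dbo.Employee_Audit (EmployeeId, Name, ChangedBy, ChangedAt)
    SELECT d.EmployeeId, d.Name, SUSER_SNAME(), SYSUTCDATETIME()
    FROM deleted AS d;
END;
```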
CDC, on the other hand, while requiring a bit more management and being asynchronous, is much lighter and thus performs far better. So if there's any doubt at all that the bottlenecking caused by triggers could become a problem, CDC is basically the superior solution. In terms of hardware requirements, CDC adds only a negligible extra space overhead from the transaction log and the CDC change tables it uses to track changes.
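Again purely as a sketch, enabling CDC on the same hypothetical table looks roughly like this; a capture job then harvests the changes from the transaction log asynchronously:

```sql
-- Enable CDC at the database level, then on the hypothetical table.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Employee',
    @role_name     = NULL;  -- no gating role, for simplicity

-- Changes land asynchronously in cdc.dbo_Employee_CT and can be read
-- with cdc.fn_cdc_get_all_changes_dbo_Employee(@from_lsn, @to_lsn, N'all').
```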
The question: how do temporal tables compare to the two options above in terms of speed, performance, and storage space usage? When should I use temporal tables instead of triggers or CDC, and when should I not?
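For reference, a minimal system-versioned temporal table on SQL Server 2016+ looks roughly like this (same hypothetical table and column names as above):

```sql
-- System-versioned temporal table: the engine maintains the history
-- table automatically on every UPDATE/DELETE, no trigger or job needed.
CREATE TABLE dbo.Employee
(
    EmployeeId INT NOT NULL PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL,
    ValidFrom  DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo    DATETIME2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Employee_History));

-- Point-in-time query spanning the base table and its history:
SELECT *
FROM dbo.Employee
FOR SYSTEM_TIME AS OF '2021-01-01T00:00:00';
```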
I understand that something as potentially complex as the business requirements and technical limitations behind DB auditing isn't going to have one easy answer, as it depends largely on the scope of the project. But anything that sheds more light on the questions above would be appreciated. Thanks!
(Note: I came back to this in 2021 and added what I had learned as an answer to the original question. You'll find it below.)