Hi Folks - I am pretty new to TDMS. While I understand the basics, when I compare this type of tool to what we traditionally use to refresh/transfer data to target systems (a full DB refresh), I have one concern, and I am not quite sure how easy or realistic it is to overcome.
- With the full DB refresh process, we KNOW that end users in the target environment are getting an exact copy of PROD, so it has ALL the data they need to test with, and nothing will be missed in the testing process because the data isn't there.
- But with TDMS used to build out the target boxes, it seems the quality of testing will only be as good as the users' knowledge of exactly what data they need on the target to test with, and how to make sure it gets copied there. In other words, if their "subset" of data is 1% of the whole DB, they have to be POSITIVE that this 1% is ALL they need refreshed to the target, and SURE that nothing else is needed to make their testing valid. I would question whether the average tester has that grand an understanding of all the data in the 30-50k tables in the DB, and of all the interdependencies involved - a seemingly small subset can pull in far more than expected once you follow the dependencies, something like the toy sketch below.
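To illustrate the interdependency point (this is just a minimal sketch, not how TDMS itself works - the table names and the dependency map are made up for the example): even if a tester starts from one "core" table they care about, following the foreign-key style dependencies quickly drags in many more tables than the original 1% they had in mind.

```python
# Minimal sketch (NOT actual TDMS logic): walk a hand-built dependency map to
# see how many tables a "small" seed subset really requires. All table names
# and dependencies below are hypothetical, for illustration only.
from collections import deque

# Hypothetical map: table -> tables it needs consistent rows from
DEPENDS_ON = {
    "SALES_ORDERS":  ["CUSTOMERS", "MATERIALS"],
    "CUSTOMERS":     ["ADDRESSES", "CREDIT_LIMITS"],
    "MATERIALS":     ["PLANTS", "VALUATIONS"],
    "VALUATIONS":    ["COMPANY_CODES"],
    "ADDRESSES":     [],
    "CREDIT_LIMITS": ["COMPANY_CODES"],
    "PLANTS":        ["COMPANY_CODES"],
    "COMPANY_CODES": [],
}

def required_closure(seed_tables):
    """Return every table transitively required by the seed set."""
    needed, queue = set(seed_tables), deque(seed_tables)
    while queue:
        table = queue.popleft()
        for dep in DEPENDS_ON.get(table, []):
            if dep not in needed:
                needed.add(dep)
                queue.append(dep)
    return needed

if __name__ == "__main__":
    seed = {"SALES_ORDERS"}          # what the tester *thinks* they need
    full = required_closure(seed)    # what actually has to be copied
    print(f"Seed tables: {len(seed)}, tables actually required: {len(full)}")
    print(sorted(full))
```

With real production systems the dependency graph is vastly larger than this toy example, which is exactly why I wonder whether the average tester can judge the completeness of a subset.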
Thoughts?