Database analyses are the hardest part of our work, in my opinion. I've mostly managed analyses of large government databases so far - like Medicare claims data and data from the Healthcare Cost and Utilization Project (HCUP). The real difficulty is that it's hard to know when you've gotten the wrong answer. If a result looks right, you pretty much have to trust it once you've checked all the SAS code and any formulas used to carry the results into Excel or wherever they're being displayed.
It's different from clinical trial data analysis because in a clinical trial, you always have a source document to go back to. The source data in a large government database are often so numerous that it's really hard to go back and verify that the analysis is correct. I guess it's similar to modeling in some respects, but models can often be recreated in another language to verify the results. Databases could conceivably be analyzed in multiple programs (like SAS and Stata, for example), but that would probably add 10-25% to the timeline and budget. In my small company, at least, we don't have that luxury.
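For what it's worth, the dual-program check doesn't have to mean duplicating the whole analysis by hand - even just diffing the two programs' summary outputs catches a lot. Here's a minimal sketch in Python of what that comparison step could look like; the statistic names and numbers are entirely hypothetical, and in practice you'd load the values from whatever files SAS and Stata export:

```python
import math

def compare_summaries(stats_a, stats_b, rel_tol=1e-6):
    """Compare two dicts of summary statistics produced by independent
    analysis runs. Returns a list of (name, value_a, value_b) tuples for
    any statistic that disagrees beyond the tolerance (or is missing
    from one side); an empty list means the runs agree."""
    mismatches = []
    for key in sorted(set(stats_a) | set(stats_b)):
        a = stats_a.get(key)
        b = stats_b.get(key)
        # A statistic present in one output but not the other is
        # itself a red flag, so report it as a mismatch.
        if a is None or b is None or not math.isclose(a, b, rel_tol=rel_tol):
            mismatches.append((key, a, b))
    return mismatches

# Toy example: mean length of stay agrees, total charges do not.
sas_run = {"mean_los": 4.82, "total_charges": 1250340.00}
stata_run = {"mean_los": 4.82, "total_charges": 1250900.00}
print(compare_summaries(sas_run, stata_run))
# → [('total_charges', 1250340.0, 1250900.0)]
```

An empty mismatch list doesn't prove the analysis is right, of course - both programs could share the same wrong cohort definition - but it does catch the coding slips that are otherwise nearly invisible in data this large.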
At least the error I caught today was detected before we published the results!