This is common knowledge to anyone who has worked in the field - it's like asking for a citation for the claim that eating too much junk food leads to obesity. But here are two data points:
http://blogs.sciencemag.org/pi...
https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fen.wikipedia.org%2Fwiki%2F...
So to begin with, less than 20% of approved drugs are discovered in academia. Academic labs aren't large-scale operations - a single-investigator R01 grant from the NIH might be $5 million over 5 years, and most investigators won't have more than a handful of these. For the really big superstar labs, let's assume a very generous upper bound of $10 million per year (not all of which is necessarily from the government). If it's a big multi-investigator project, maybe double that. Except for a handful of big centers (like the NIH itself, or genome sequencing centers), academia just doesn't operate at a large scale - a typical university research department is just an aggregation of many smaller, largely autonomous units. The hidden advantage of these organizational limitations is that failed projects usually fail before anyone spends too much money on them. So let's hypothesize that, at the extreme, academics spend no more than $50 million per drug candidate. Compare that to the numbers in the Wikipedia article.
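If it helps to see the arithmetic laid out, here's the back-of-the-envelope version as a quick Python sketch. The dollar figures are the same rough assumptions as above, and the ~$2 billion industry figure is just an illustrative round number in the ballpark of the estimates the Wikipedia article discusses, not a precise value:

    # Back-of-the-envelope comparison; all figures are rough assumptions, not data.
    r01_per_year = 5_000_000 / 5            # single-investigator R01: ~$5M over 5 years
    superstar_lab_per_year = 10_000_000     # generous upper bound for a big lab
    multi_pi_per_year = 2 * superstar_lab_per_year  # big multi-investigator project

    years_before_candidate = 2.5            # assume a few years of work per candidate
    academic_upper_bound = multi_pi_per_year * years_before_candidate  # ~$50M at the extreme

    # Industry cost per approved drug (including failures); published estimates vary
    # widely, so this is just an illustrative round number.
    industry_per_approved_drug = 2_000_000_000

    print(f"Academic upper bound: ~${academic_upper_bound / 1e6:.0f}M per candidate")
    print(f"Industry estimate:    ~${industry_per_approved_drug / 1e9:.1f}B per approved drug")
    print(f"Ratio:                ~{industry_per_approved_drug / academic_upper_bound:.0f}x")

Even under these generous assumptions, the academic figure comes out to a few percent of the per-drug industry estimates.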
Now, you could of course argue that because drug development is informed by the public-domain knowledge generated by taxpayer-funded researchers, drug companies are leeching off the public in that way too. I guess that's technically true (albeit difficult-to-impossible to quantify), but you might as well argue that because the government invented digital computers, companies like IBM and Intel should have been nationalized. (Note that the salary gap between academia and big pharma is relatively large - to shift more drug development to academia, you'll need to raise salaries, or find a lot of scientists willing to work for an academic salary while doing grunt work on massive projects that will most likely fail.)
To pick a more specific example, the NIH spends approximately $1.2 billion per year on aging-related research (including but not limited to Alzheimer's):
https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Fwww.nia.nih.gov%2Fabout%2F...
Most of that will be single-investigator grants, and as anyone who has worked in basic research can tell you, the majority of the grants that are funded won't lead to any immediate treatments, although they may provide useful information in the long term. In contrast, here is an estimate that puts the total cost per approved Alzheimer's drug at $5.7 billion (including failures, and keep in mind the overwhelming bulk of that is spent by drug companies) - a rough comparison is sketched after the link:
https://ancillary-proxy.atarimworker.io?url=https%3A%2F%2Falzres.biomedcentral.c...
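For scale, this is just the ratio of the two rough figures above (nothing beyond those two estimates goes into it):

    # Illustrative only: express the per-drug cost estimate in years of NIH aging budget.
    nih_aging_budget_per_year = 1.2e9    # approximate annual NIH aging-related spending
    cost_per_alzheimers_drug = 5.7e9     # estimated total cost per approved drug, incl. failures

    years_equivalent = cost_per_alzheimers_drug / nih_aging_budget_per_year
    print(f"One approved Alzheimer's drug ~ {years_equivalent:.1f} years "
          f"of the entire NIH aging research budget")   # roughly 4.8 years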
This isn't to argue that taxpayer funding of basic research isn't valuable - it's absolutely essential IMHO. But most of what it produces isn't going to lead directly to new drugs or treatments.
Obligatory disclaimer: I do not work for a drug company, but I did receive funding from them as a government scientist, and receive a small bonus from IP licensing fees every year. Frankly it was far more trouble than it was worth; drug companies are kind of a pain in the ass to deal with, even if you only talk to the scientists.