I found this article to summarize how I view the problem: http://www.economist.com/news/briefing/21588057-scientists-think-science-self-correcting-alarming-degree-it-not-trouble . It's not just parapsychology; it's the way we do science today. To summarize: a study with a negative result is more likely than not to go unpublished, and this has created a meta-statistical crisis. At a 95% confidence level, roughly 1 in 20 tests of a true null hypothesis will produce a false positive by chance alone, which is why repeating studies matters. Now cut back grant funding across the country, creating a horde of desperate professors cranking through studies to find something to hang their next grant on; if they run 20 studies and find 1 with a positive result, that's the one they publish. You end up with a pile of non-reproducible studies: http://www.jove.com/blog/2012/05/03/studies-show-only-10-of-published-science-articles-are-reproducible-what-is-happening Parapsychology is only the tip of a far bigger crisis in how science is currently conducted, from funding to publication. What can you do? Don't buy into pop-news sensationalist claims based on a single study. Start believing a result only when multiple studies report it, and remain skeptical.
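The 1-in-20 arithmetic is easy to check for yourself. Here's a quick simulation (my own sketch, not from the article): every "study" samples from a population where the true effect is exactly zero, yet a two-sided test at the 95% level still flags about 5% of them as "significant". If only those get published, the literature fills up with noise.

```python
import random
import statistics

random.seed(42)

def run_study(n=50):
    """One hypothetical study of a true-null effect: sample n measurements
    from a population with true mean 0, then declare a 'positive finding'
    if the sample mean sits more than ~1.96 standard errors from 0."""
    sample = [random.gauss(0, 1) for _ in range(n)]
    mean = statistics.fmean(sample)
    se = statistics.stdev(sample) / n ** 0.5
    return abs(mean / se) > 1.96  # "significant" purely by chance

studies = 10_000
false_positives = sum(run_study() for _ in range(studies))
print(f"False-positive rate: {false_positives / studies:.3f}")
```

The printed rate lands near 0.05, so a lab that runs 20 null studies should expect about one publishable "hit", which is exactly the publication-bias mechanism described above.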
Kahneman's letter called for better reproducibility of priming effects (observed when you ask one person whether Mount Everest is taller or shorter than 50,000 feet and then ask them how tall it is, ask another person whether it is taller or shorter than 5,000 feet and then ask them how tall it is, and the second person tends to give lower estimates). The request was enthusiastically answered by the Many Labs Replication Project, an international group of labs that revisited 13 classic experimental results. Ten effects were reproduced, with some of the anchoring effects (like the one about Mount Everest) demonstrated even more strongly than in the earlier studies. John Ioannidis, mentioned in the Economist article, has done heroic work exposing bad science in medicine. We recently had a discussion about a simple and effective technique that may be ignored because of the way medical research works. (P.S. Your link to the Economist article is broken because of the period at the end.)