And who can forget the LK-99 thing last summer? That's what I thought your post was about, bfx. But nope, a completely different superconductivity letdown. I'm impressed someone laid out this particular saga in longform.

lol this is so commonly true. So many grad students will run the labs for like 80 hours a week, gather the data sets they were told to, and then have no idea what any of it means. Instead of the advisor telling them, the boss'll just swoop in and publish, and the lab rats will be lucky if they get a coauthorship.

> Several other researchers told the news team that the principal investigator does not typically produce all the plots. “That’s weird,” Canfield says.

It's also a reminder that my advisor was so badass that he couldn't be stopped from making his own plots. He wrote actual code, not just toggled image and graph settings. It's perfect that the story ends with him lying about his work on Twitter. Primo perfecto.
Yeah, condensed matter is full of fraud stories. And it's almost always interesting, if disheartening, to read about them. It's a shame showing a lack of successful measurement isn't rewarded or even encouraged. I end up hearing through the grapevine how the idea I thought was worth revisiting was already tried by some small team back in the '90s, and it's only mentioned at the back of the supplemental materials. There's a binder (and database) on my desk (laptop) that catalogues excerpts and mentions of such misses, and it may end up being my biggest contribution to the field.

> So many grad students will run the labs for like 80 hours a week, gather the data sets they were told to, and then have no idea what any of it means.

To be fair, grad students span the gamut from 'wait, why isn't B a constant?' out-of-their-depth beginners to the likes of you, who probably shake their heads at visiting professors' inexperience with methodology. Not really trying to defend how some people run labs, but I know in my heart there were times when my prof wasted his breath explaining my role in the grand scheme of things.
> It's a shame showing a lack of successful measurement isn't rewarded or even encouraged.

Truth. I got scooped once, by a matter of days. I was just about to submit to a journal, and one of my advisors said, basically, "oh well. next time." I was like "well it's kind of a complementary paper, reconfirming the same physics", and they said "so what? you gotta be first." Same idea, though. Even though the paper would have contributed to the field, I was discouraged from publication. I should've published anyway, in hindsight, just like all the null results and other reconfirmations. But especially so, because the paper was already written and everything.

> grad students span the gamut from 'wait, why isn't B a constant?' out-of-their-depth beginners to the likes of you, who probably shake their heads at visiting professors' inexperience with methodology.

Eh, not common at all. Only once has this very notably happened, I think, when some theorists with no idea how particle spectrometers work were trying to use our data to do something with relativistic gauge invariance. They got shot down pretty badly at a conference. I just went off googling, and I can see they never published. Righteous, the process works! But it's very true that grad students in physics are selected primarily through their skills in mathematics, which is obviously necessary, but I've seen how often some of the students very skilled with maths struggle when they get into research. Creativity, critical thinking, and math skillz rolled into a single person is super rare. There were only one or two people in my class of twenty that had all three, and it sure as hell wasn't me.

By the way, it's funny because I'm still in grad school, hah, for just one more week! Fell off the wagon for a few years. Went to rehab for booze. Doing much better. I should probably write a pretty lengthy post about rehab, though. My god, what a funny experience.
Yeah this article is pretty amazing. Fun what happens when you fuck over all your students :) :) :) For the first half of the article I was wondering where the grift would be and... lo and behold:

> The paper was published on 14 October 2020 to fanfare. Dias and a co-author, Ashkan Salamat, a physicist at the University of Nevada, Las Vegas (UNLV), also announced their new venture: Unearthly Materials, a Rochester-based company established to develop superconductors that operate at ambient temperatures and pressures.
Nobody shoots private industry in the foot like private industry.
I'm gonna push back on that idea a little. Firstly, this was an academic scandal, with an eye toward industry. But more to the point, I have worked in academia and private industry both fairly extensively, and I've found that private industry generally does more rigorous science (though often not as exciting). The caveat is that I work in biotech, so I don't know how that relates to physics. I would imagine it's not so different, though, because the incentive structures dictate everyone's behavior (but to be fair, biology experiments are notoriously opaque and hard to reproduce even when the hypothesis is rock solid, so there's a lot more room for obfuscation than in a harder experimental science).

In academia the financial incentives come from grants, which generally result from publications, which generally result from high-impact discoveries. So the incentive boils down to "make high-impact discovery." In industry the incentive is to move product, and moving product doesn't happen if the product doesn't work. The product won't work if the science behind it is faulty. So the incentive is to weed out bad science and only pursue the most reproducible work. This leads to a relative lack of risk-taking, but generally more faith that what comes out of it is solid.

I can tell you from years of experience that the attitude in academia is "defend this at all costs" and in industry it's "kill this at all costs". Totally different mentalities. But again, this is biotech. I realize fully that not all industries follow this trend, especially, say, venture-backed tech. I would imagine that the academic side of tech is way more upstanding than the industry side, but that's a hunch based on little-to-no firsthand knowledge.
I dunno, man. My comment was mostly a flippant quip, and yeah, of course I'll concede that most fields run most of the gamut of incentive structures, but the distribution can vary widely between fields. If that makes sense. My discipline's definitely more of an outlier; very little overlap with industry. Ain't nobody but the taxpayer gonna foot the bill, because there's never going to be a product. It's pure research. Pretty sure I could convince people that the $2 billion for the main experiment I work on was well spent, if I'm granted 30 minutes and a whiteboard.

But anyway. The profit incentives that come with for-profit products seem problematic to a lot of people, especially when the amount of potential profit is billions of dollars, like for a room-temp superconductor. It's not like the guy in this article, Dias, thought he had anything worth two shits, but he still decided to ride the hype train for some short-term recognition. I'd be tempted to say "the system works!" if it didn't damage public trust and perception. Maybe his calculus is that he'll still be able to land a gig at a private company, because he's toast in academia.

I know this is childish, oversimplified, and probably at least a bit of something I tell myself to feel better about having very little income compared to private industry salaries, but I still think there's a nobility in academia. At least in the hard sciences. No, the academic system is not as infallible as I once thought, but it's been really nice to realize that my peers aren't doing what they do for the money. Some of them make a very comfortable amount of money, don't get me wrong, but most of them could earn a lot more in industry, and for less work.

Fuck tenured professorship though. I'm not sure where the stereotype of the lazy tenured prof comes from (humanities? note: this is not my perception, I'm trying to guess common opinion), but half of the profs I had in physics seemed miserable. Overworked. Health problems.

> I can tell you from years of experience that the attitude in academia is "defend this at all costs" and in industry it's "kill this at all costs".

What is "this"? All of the best researchers that I know in the public sector have very little issue with taking an L and moving on if they were wrong. Again, there's probably a major difference inside academia between fields with a lot of industry overlap vs. not.

I'm cool with not having a huge income, but I think having kids would change the game. Already made the choice not to ever do that.
“This” is whatever hypothesis you’ve tied your success to. Some of the issues I’ve discussed above are maybe unique to pharma, where I work. And in no way did I mean to imply that academia isn’t a good career populated with almost all good people. Just that the incentive structure, which can affect good people too, is such that it encourages putting one’s best data forward, say. This is obviously a ton harder in a hugely collaborative gazillion-dollar field such as the one you work in… A lot of small-time scientists work sort of on an island.

Also, the bigger the claim, the easier it is to poke holes in. The superconductor thing reminds me of this thing that happened maybe 10 years ago where a researcher in my field claimed she could make induced stem cells by just bathing them in acid in some specific way. It took like 5 minutes to rip to pieces. Some people really want that paper in Nature.

For real, don’t mind me though. I’m just kind of a jaded asshole because of some experiences I’ve had when my (very well connected and basically bulletproof) data didn’t align with the big lab’s big-money hypothesis. They’ll skewer you.
I'm always shocked by the laziness that goes into made-up data. Like here you have multiples of a number. In bio you often see stretched or inverted images that are portrayed as individual replicates, say. You'd think that if you were going to go to the level of just plain cheating that you'd put your back into it. There are statistical tests, e.g., that can tell you with pretty good precision whether numbers are random, say, or whether a large group of numbers is spaced the way you'd expect a natural set of data to be spaced (e.g., Benford's law: small leading digits show up a lot more often than big ones). I doubt it would be that hard to fake a set of data if you reverse-engineered it to pass the standard battery of smell-test statistics. But you never see that. Or maybe only the dumb ones get caught?
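For the curious, here's a rough sketch of the kind of smell test I mean: check the leading digits of a data set against Benford's law with a chi-square goodness-of-fit test. This is just an illustrative sketch in Python, not anybody's actual fraud-detection pipeline; the helper names (benford_check, leading_digit) and the example data are invented.

```python
# Sketch of a Benford's-law smell test: compare observed leading-digit
# counts against the expected log distribution with a chi-square test.
import numpy as np
from scipy import stats

def leading_digit(x):
    # First significant digit of a nonzero number, via scientific notation
    return int(f"{abs(x):e}"[0])

def benford_check(values):
    digits = np.array([leading_digit(v) for v in values if v != 0])
    observed = np.array([(digits == d).sum() for d in range(1, 10)])
    # Benford's law: P(d) = log10(1 + 1/d), which sums to 1 over d = 1..9
    expected = np.log10(1 + 1 / np.arange(1, 10)) * len(digits)
    return stats.chisquare(observed, expected)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Made-up example data: values spanning several orders of magnitude
    # tend to follow Benford; uniformly drawn three-digit numbers do not.
    natural = rng.lognormal(mean=3, sigma=2, size=5000)
    uniform = rng.uniform(100, 999, size=5000)
    print("lognormal data:", benford_check(natural))
    print("uniform data:  ", benford_check(uniform))
```

Point being, data that spans several orders of magnitude comes out Benford-ish, while something like uniformly drawn three-digit numbers gets flagged immediately. It's a smell test, not proof of anything.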
I mean... there's no way to not get caught faking superconductivity. But apparently SpinLaunch and Theranos (and so many others) are allowed to swindle investors for long periods of time. Maybe Dias got inspired. I should also say that obviously a lot of good can come from private industry and profit motives. And occasionally there are public-sector flops, like the faster-than-light neutrinos, for example. Those guys seemed to know they were wrong and just wanted help figuring out why, though. Honestly, I'm having a hard time thinking of why someone would lie if they intend to stay funded by grants.
Yeah, what’s stopping people from using AI-generated synthetic data?
Nothing. Can't find it now, but I saw just the other day that someone had gotten a paper past peer review with the phrase "As a large language model, I can't..." sitting somewhere in the meat of it. Apparently this here comment is the first on hubski to introduce the concept of enshittification. It's not only affecting online or social media platforms, obviously.
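To the actual question, though: as a toy illustration (everything here is invented, and it assumes a naive leading-digit check like the sketch upthread), a handful of numpy lines will produce "measurements" that sail right past that kind of test:

```python
# Toy illustration only; all values are invented. Draw "measurements" from a
# log-uniform distribution (log10(x) uniform over several decades), which
# matches Benford's leading-digit statistics almost exactly, then add a bit
# of multiplicative noise so it looks vaguely instrument-like.
import numpy as np

rng = np.random.default_rng(42)

# log10(x) uniform on [0, 4]  ->  x spans four orders of magnitude
fake = 10 ** rng.uniform(0, 4, size=2000)

# 1% multiplicative "noise"
fake *= rng.normal(1.0, 0.01, size=fake.size)

# Save it like any other data file (hypothetical filename, obviously)
np.savetxt("totally_real_measurements.csv", fake)
```

Which is kind of the point made upthread: the standard smell tests mostly catch the lazy version.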