comment by asdfoster

EDIT2: This comment has started a bit of an unrelated debate, so I would like to quickly clarify that my original complaint in this comment was more of a semantic issue I had with the author repeatedly saying "science" when he meant psychology/medicine specifically. This is something I've seen a lot more from people in some fields than in others, and it is a pet peeve of mine.

Original comment:

It's important to note that this is an indicator for social sciences like psychology, but not really for harder sciences like Physics/Astronomy/Chemistry. It always annoys me that people just say "science" when talking about soft sciences as if the shortcomings there apply to the hard sciences as well.

This shows that when studying people, it's very easy to do it wrong and to get bad results and false positives. It does not say anything about the harder sciences. It doesn't mean that things like climate change could just be a placebo effect. Human biases don't change thermometers, but they might change the more subjective criteria of a softer science field.

EDIT: The important thing that needs to be remembered here is that these fields operate differently. In the softer sciences (especially psychology) the only evidence comes from "doing things to people and watching what happens". The problem is that people are very complicated, and it's easy to fuck it up and accidentally include a bias. With harder sciences, we know more about the system, have a more solid mathematical/theoretical foundation that can predict things, and can approach situations from a larger variety of observational vantage points to get a fuller picture.

wasoxygen  ·  3181 days ago

    social sciences like psychology, but not really for harder sciences

I understand the idea, but the problem is that chemists are themselves soft. People who study the "hard" sciences are just as subject to bias and error as sociologists.

From Cargo Cult Science:

    We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It's a little bit off because he had the incorrect value for the viscosity of air. It's interesting to look at the history of measurements of the charge of an electron, after Millikan. If you plot them as a function of time, you find that one is a little bit bigger than Millikan's, and the next one's a little bit bigger than that, and the next one's a little bit bigger than that, until finally they settle down to a number which is higher.

Someone plotted it.

One might argue that human behavior is so much more complicated than anything that can happen in a Petri dish that more errors in social sciences are inevitable, but (1) I am not convinced that this is true and (2) it assumes that experimenters do not account for complexity when drawing conclusions in their work.

In practice, we always begin with a personal judgement about the reliability of the evidence we observe, so we never escape the ouroboros.

doommaggot  ·  3181 days ago

Trying to draw a line between them shows you don't understand science as well as you think you do. The same shortcomings do apply to the hard sciences as well. There is a reason why double-blind studies have so much significance attached to them. There is a reason why people look to see if studies have been independently verified, and why attempts are made to verify them. The biases of the people performing a study can affect the results no matter what is being studied.
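
To make the "biases can affect the results no matter what is being studied" point concrete, here is a minimal Python sketch (an added illustration, not something from the thread; all numbers are invented) of how a small, unconscious rater bias creates an apparent effect where none exists, which is exactly what blinding is meant to prevent:

    # Minimal sketch: no true treatment effect exists, but an unblinded
    # rater unconsciously adds a small bias to treatment-group scores.
    import random
    import statistics

    random.seed(0)

    def run_trial(n=50, rater_bias=0.3):
        # Both groups' outcomes are pure noise; `rater_bias` is the
        # unblinded rater's unconscious nudge, not a real effect.
        control   = [random.gauss(0, 1) for _ in range(n)]
        treatment = [random.gauss(0, 1) + rater_bias for _ in range(n)]
        return statistics.mean(treatment) - statistics.mean(control)

    biased  = statistics.mean(run_trial() for _ in range(1000))
    blinded = statistics.mean(run_trial(rater_bias=0.0) for _ in range(1000))
    print("apparent effect, unblinded rater:", round(biased, 3))   # ~0.3
    print("apparent effect, blinded rater:  ", round(blinded, 3))  # ~0.0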

(Of course the evidence in favour of Anthropogenic Global Warming is overwhelming to the point of being irrefutable.)

asdfoster  ·  3181 days ago

I'm not saying that the line is anywhere near distinct, but the extremes of the hard/soft science spectrum are intrinsically and fundamentally different.

While physics has a good mathematical and theoretical backing, and the experiments and the theory build off of and check each other, you don't have nearly as much variety in sources of information in the softer sciences.

You mention double-blind studies, but again, those only exist in the softer sciences because of the systematics and complexity of dealing with human/biological subjects. The article mentions meta-studies as the top of the evidential pyramid, but they don't exist in the harder sciences because they don't make any sense in that context.
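
For concreteness, here is what the "top of the evidential pyramid" step actually computes: a minimal Python sketch (an added illustration, not from the article or this thread) of fixed-effect, inverse-variance pooling, with study numbers invented purely for illustration:

    # Minimal sketch of the core of a meta-analysis: pool per-study effect
    # estimates, weighting each study by the inverse of its variance.
    import math

    # (effect estimate, standard error) for each hypothetical study
    studies = [(0.20, 0.10), (0.35, 0.15), (0.10, 0.08), (0.25, 0.12)]

    weights   = [1.0 / se ** 2 for _, se in studies]   # precision weights
    pooled    = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    print(f"pooled effect: {pooled:.3f} +/- {pooled_se:.3f}")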

Of course all experiments are subject to systematics. I would never claim anything different. However, the point that I raised in my edit (posted before your reply) still applies. The very grounds on which science is done are fundamentally different in different fields because of the type of research possible and the types of data available.

For example, in Astronomy (my field), you can't really set up an experiment where you create a nebula and watch it form into a star system. You also can't watch one system go through its entire life because of the timescales involved. Experiments don't really work on this scale. Instead, you have to rely on observational and theoretical techniques to study how things work. Further, in Astronomy (and chemistry and atmospheric science and others) you have physics to fall back on to predict how things will happen. (EDIT: To clarify, this was my point with the climate change example in the first post. The human researcher's beliefs won't change what the thermometer will read, and they won't change how the winds will blow. They might change how another human will react. An important note is that they might also affect which data get recorded: for example, if the researcher takes Christmas off every year they might miss something important in the data at that time, or if the researcher only takes data once every few days they might miss some of the smaller-scale/shorter-period signals, etc.)
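
As a concrete illustration of that sampling point (an added sketch in Python, not part of the original discussion; the signal and sampling rates are made up): a signal with a one-day period, sampled only once every three days, can disappear from the record entirely.

    # Minimal sketch of undersampling: a daily cycle sampled every 3 days
    # aliases down to an apparently constant value.
    import math

    def signal(t_days):
        return math.sin(2 * math.pi * t_days)   # true period: 1 day

    hourly       = [signal(h / 24) for h in range(24 * 30)]   # every hour for 30 days
    every_3_days = [signal(d) for d in range(0, 30, 3)]       # once every 3 days

    print("peak-to-peak, hourly sampling:     ",
          round(max(hourly) - min(hourly), 3))                # ~2.0
    print("peak-to-peak, every-3-day sampling:",
          round(max(every_3_days) - min(every_3_days), 3))    # ~0.0, signal invisible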

The physics and the math don't change because of the placebo effect. Unfortunately, people do.

In harder fields you study these more objective things; in softer fields you study more subjective things that can be more easily influenced.

Again, I'm not saying that there is no room for misinterpretation or for systematics (although in harder sciences, the systematics are more physical and quantifiable in nature), just that many of the specific grievances in the article are less applicable to a harder science.

That said, there are still human/researcher biases to take into account with the harder sciences, they're just very different.

doommaggot  ·  3181 days ago

Meta analyses absolutely are done in hard science fields.

Use of statistics and mathematical backing is very much done in social science fields.

Biology makes use of blinded studies. Physics has used some blinded studies as well.

The amount of heterogeneity in hard science studies and in soft science studies is virtually identical.

Citation
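
To make "amount of heterogeneity" concrete: in a meta-analysis it is usually quantified with Cochran's Q and the I² statistic, roughly the share of between-study variation not attributable to sampling noise. A minimal Python sketch (an added illustration, not taken from the citation; study numbers are invented):

    # Minimal sketch: Cochran's Q and I^2 for a set of hypothetical studies.
    studies = [(0.15, 0.09), (0.40, 0.14), (0.05, 0.07)]   # (effect, standard error)

    weights = [1.0 / se ** 2 for _, se in studies]
    pooled  = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)

    # Cochran's Q: weighted squared deviations from the pooled effect
    Q  = sum(w * (e - pooled) ** 2 for (e, _), w in zip(studies, weights))
    df = len(studies) - 1
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0    # % variation beyond sampling noise

    print(f"Q = {Q:.2f}, I^2 = {I2:.0f}%")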

asdfoster  ·  3181 days ago

Interesting read in the citation.

Yes, I concede that many physics papers will compile the results of previous groups to get more realistic numbers as a combination of their work and earlier experiments. Although it isn't called a meta-analysis in the hard science fields, it is basically the same thing. Forgive me for not thinking about that very thoroughly; it's 6 AM for me and I've been awake for a while :)

From my experience with the soft sciences (which is admittedly not much beyond an undergraduate level, unlike the hard sciences), their mathematical backing is less rigorous than you would find in, say, physics. You won't really find any mathematical proofs in a biology paper. All fields APPLY math to model and statistics to approximate errors and variations, but only the harder sciences USE mathematical logic and proof structure to predict a relationship between unstudied things from the mathematical/geometrical underpinnings of the universe. The softer fields use the math to explain, but only the harder fields can use it to predict and to build off of the previous knowledge base.

If you have any counterexamples to the things in my last paragraph I would love to see them, for I know of none.
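
One standard illustration of that kind of predictive use of mathematics (an added example, not from the comment): Kepler's third law follows from Newtonian gravity and predicts a planet's orbital period from nothing but its measured semi-major axis.

    % Kepler's third law, derived from Newton's law of gravitation:
    \[
        T^2 = \frac{4\pi^2}{G M_\odot}\, a^3
        \quad\Longrightarrow\quad
        T\,[\text{yr}] = \left(a\,[\text{AU}]\right)^{3/2}
    \]
    % For Mars, a is about 1.524 AU, so the predicted period is
    \[
        T_{\text{Mars}} \approx 1.524^{3/2} \approx 1.88\ \text{yr},
    \]
    % which matches the observed period of about 687 days.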

asdfoster  ·  3181 days ago

My original complaint was more of a semantic issue that I had with him repeatedly saying "science" when he meant psychology/medicine specifically. This is something I've seen a lot more from people in some fields than in others, and it is a pet peeve of mine.