I wonder, like you, how many of us have squashed our own ideas because they'll never meet the criteria of "evidence-based," mainly because evidence is defined so narrowly and with special interests in mind.
Indeed. I've struggled with this for as long as I have been in the field. While I want things that work, I also know there's a lot out there in our communities that does work but will never find its way into JAMA.
It's pretty fascinating how we'll let others define what works. I'm as much a scientist as I am an artist in the mental health world, and we have a long way to go in figuring out what it means to experience health and well-being.
Loved and lived this thought process. I was approached by a recovery house with 65 male residents for ideas on creative ways to impact recovery. I suggested introducing music to reduce isolation, encourage community, and develop a positive outlook. Their response: we don't have money, instruments, or an instructor, and we don't have any clinical evidence that it would work. So I bought all the instruments, hired an instructor, and developed and implemented a music program. After 6 months we did a university-sanctioned survey that produced data showing an 8.3 Net Promoter Score, a 7 out of 10 on "Helping My Recovery," and 92% desiring more music, more instruments, and more instruction. Now that we have "evidence," we have a large grant application to expand to many more recovery centers. Eventually, we will get to an RCT, but in the meantime, we are changing lives...
I'm reminded of the wisdom of philosopher Arne Naess, who emphasized the idea that doing something intuitively obvious should require no justification.1 I also addressed this topic of RCTs in a recent editorial where I recommended that any study title indicate the type of research described.2
1. Naess A. Ecology of Wisdom. Great Britain: Penguin Books, 2016.
2. Smith RC. Using the term "evidence-based" in the communication literature. Patient Educ Couns 2023;110:107684. (https://www.ncbi.nlm.nih.gov/pubmed/36857857).
The rigidity and inflexibility of the scientific method hurts as many people as it helps.
So well said, David!
Thanks for your thoughts; we hear you.
Great piece, Ben.