An excellent example of using past misconduct cases to teach students how to behave ethically when performing science.
In a new paper published in the Journal of College Science Teaching, three professors at Clayton State University in Morrow, Georgia, namely Stephen Burnett, Richard H. Singiser, and Caroline Clower, discuss why retractions are good case studies for teaching ethics and examining the scientific process in class.
The Tuskegee Experiment.
An example of what *not* to do.
The Tuskegee experiment was a research project involving both the Tuskegee Institute and the US Public Health Service. Originally the study was to last six months and to chart the progression of syphilis in Black men, under the project title “Tuskegee Study of Untreated Syphilis in the Negro Male”. Why just Black men, you’re asking? At the time it was believed that the disease progressed in a different manner in white males. Little was known about the natural progression of the disease in Black males, and the researchers wanted to obtain enough data to justify a treatment program specifically for them. So, at the start, they had good intentions. Although allowing anyone with a disease to go untreated for any length of time seems cruel, it is the only way to study the natural progression; and if a subject is fully aware of the risks and gives consent, then data collection can commence.

However, although the men consented to the study, there was no informed consent. The subjects were not told they had syphilis. They were told they had ‘bad blood’ and were enticed to take part by the offer of free health care, meals, and burial insurance. At some points they were even discouraged from seeking treatment elsewhere. Further, the study lasted far longer than the allotted six months: it ran for 40 years, until an epidemiologist, Peter Buxtun, blew the whistle, leading to an investigation. In 1972, these experiments were finally deemed “ethically unjustified” and then terminated. A nice piece by the CDC summarizes the timeline for this project and the actions taken in the aftermath.
As horrific an example of unethical research as this is, I don’t think the scientists who took part should be demonized as people who deliberately chose to destroy the lives of many. If anything, they had utilitarian mindsets: they were essentially sacrificing a few in order to understand the disease more completely, so they could develop a treatment that would protect and save many, many more. Holding such a utilitarian mindset is dangerous because it lets you justify almost any means in the research you are performing. In science, we should instead apply deontological ethics, adhering to our obligation and duty to fully disclose the details of a study to our subjects and to ensure that informed consent is truly informed and not misleading. This may mean we cannot attract as many subjects to high-risk studies, but it is the only way to guard ourselves against repeating a Tuskegee situation.
Thanks to the 1974 National Research Act, the Presidential Bioethics Commission, and the Office for Human Research Protections, this sort of thing should never happen again. What happened is so horrifying to those of us who practice science today that it seems we shouldn’t need to discuss it; surely it must be impossible that anyone would even attempt to perform research in a manner even close to this. Yet as I say this, I am only too aware that, although they may not be on the scale of the Tuskegee transgression, ethical breaches still occur.
This year, the SUPPORT study (Surfactant Positive Airway Pressure and Pulse Oximetry Trial) was found to have failed in its duty to give participants complete details of the risks involved, bringing into question the extent to which the participants’ consent could be considered informed. The study tested the outcomes of using different oxygen levels on premature babies; at the extremes used, either blindness or death could result. Parents were informed of these risks, but it was not made clear enough that, had they not taken part in the study, their babies would have been treated with oxygen levels within those extremes, where the risks of blindness and death were somewhat lower. There are risks for premature babies whatever course of care is taken, but more detailed information about the risks both in and out of the study should have been provided.
If we are to always function in an ethical manner, from time to time we need to examine the dark places in the scientific past and remember how easy it can be to fall into unethical situations when judgement is clouded by ambition towards a goal that once seemed honorable.
Science fails without trust. Members of a lab must be able to trust one another: that the batches of shared chemicals produced are as stated on the labels, that equipment is calibrated and maintained correctly, and that their colleagues’ data are genuine. PIs must be able to trust that lab members are performing the research they are funded to do, using appropriate, legal methods and appropriate resources. Other scientists must be able to trust that what is presented in scientific articles truly reflects what was performed and that the results have not been massaged or invented. The public, whose taxes pay for much of the research, must be able to trust that the funding is being used appropriately and not misappropriated.
When I started my PhD I had to attend some ethics courses; my PI made them a mandatory requirement for everyone in his group. I’m glad I attended the classes, not only because they helped clarify what the expected behavior was but because they gave me an opportunity to learn about philosophy. Returning to academia after having worked in the pharmaceutical industry, I was already well versed in scientific ethics and in good laboratory practice (GLP) and good manufacturing practice (GMP), but attending these classes and hearing some of the questions other new PhD students asked, I realized there are grey areas that new graduates are not so clear about.
In industry you can’t set foot in a lab until you’ve read the company standard operating procedures (SOPs) for labbook reporting requirements, usage of filesystems, storage of data and other standard GMP procedures that you need to adhere to. Likewise, every student, before starting any research, should be made to attend ethics classes and should be taught what we mean by good laboratory practices.
When I first started in industry, I was shocked that I had made it through a whole undergraduate degree without being taught how to keep a labbook to GLP or GMP standards, or how to write an SOP effectively. In some of my very first interviews for the pharmaceutical industry as a fresh graduate, I was repeatedly asked whether I had written SOPs and whether I was familiar with standard laboratory practices. It wasn’t until I was in industry that it was hammered home to me how crucial accurate labbook reporting is, right down to recording the serial numbers on balances and their calibration details.
In academia we are supposed to work to GLP, which is a little more lax than the standards of the pharmaceutical industry, where reporting must be very stringent: a drug you are working on could harm someone down the line if it is not handled or labelled properly. Nevertheless, I am shocked at how little training in GLP standards is given at the undergraduate level. It should therefore be no surprise that students starting graduate school often keep their labbooks poorly and often do not understand what standards and rules they should be adhering to.
There has been much discussion of the importance of ethics in scientific research, such as the guidance published by the NIH, but what I would like to see is professors with industrial backgrounds encouraged to hold classes dedicated to teaching industry-standard practices and the ethical conduct of research.