Lab Rats and Slaughterhouses: The importance of animal studies in science

Hello Everyone,

I’ve been on the sidelines for the past few weeks as the lab has been engaged in a massive animal study.  It has become so big that a large portion of the staff has been dedicated to it full time for the past three weeks.  I’ve been wanting to comment on the use of animals in laboratory experiments for a while now.  I’ll state up front that I, personally, am not against using animals for experiments.  I’m not a vegetarian, and I don’t object to eating animals or animal products.  There is, however, a line that needs to be respected.  We humans have been domesticating animals for centuries, and historically we respected and appreciated the burden that animals bear in providing food, labor, and other materials for us.  We’ve forgotten this, however, and we take it for granted at our own peril.

Animal studies are vital to the health sciences.  A quick search of Thomson Reuters Web of Science for “animal studies” returns over eighty thousand papers (1), and the true number of studies conducted over the past century is undoubtedly far larger.  Animal studies are a crucial part of the FDA’s pre-clinical approval process (2).  Animal testing helped the United States avoid the worst of the thalidomide-induced birth defects by strongly influencing the decision not to allow its sale in the US.  Studies on animals have also produced numerous vaccines and therapies that have greatly improved human health.  We should be more grateful for the sacrifices that other species have borne to advance our own.

Despite all of this, there are those who conduct animal experiments with an indifference bordering on malice.  To some, the animals in these studies are no more than a number.  An egregious example comes from the recent allegations against Santa Cruz Biotechnologies (3).  I’ll go on record again to say I am not a fan of this company’s business practices.  Under pressure from the USDA to prove that its animal handling facilities were up to standard, several thousand of the company’s animals “vanished.”  These animals have not been found, but one can guess what likely happened to them.  Again, while I am not against eating animals or using them for laboratory testing, I am against the wholesale killing of huge numbers of animals, especially to potentially hide the misdeeds of a few people.  The death of an animal in a study needs to be balanced against the usefulness of that study for human health.  Money should not be the only factor!

There are standards in place to ensure that animals are respected and not killed with wanton abandon.  Every university and institution that wants government funding (which is nearly all of them) must establish an animal care and use committee (commonly called an IACUC, for Institutional Animal Care and Use Committee) to review proposed animal studies.  You need to show the committee that your study requires animals, that performing experiments on them is the only way to answer the health questions you are asking.  The committee also audits and oversees the studies to ensure that minimal harm comes to the animals.  I think some people may not realize that these standards were put in place by scientists who do respect animals and understand their importance to human beings.  I admit that a few have lost sight of this fact.  We all need to recognize when we’ve crossed the line and forgotten just how important animals are to us.

As always, thanks for reading, and feel free to comment!

Sincerely,

GRW

 

(1) Web of Science.  Accessed April 25, 2016.  http://apps.webofknowledge.com/Search.do?product=UA&SID=4FMm14tlwI4DZfVpa92&search_mode=GeneralSearch&prID=9b6bba28-3b5f-4509-ba77-25b126ae7312

(2) FDA, 2016.  http://www.fda.gov/Drugs/ResourcesForYou/Consumers/ucm295473.htm

(3) Reardon, S.  “Thousands of goats and rabbits vanish from major biotech lab.”  Nature, February 19, 2016.

Data, data everywhere but how to process and represent it?

Hi Everyone,

I’ve been writing a lot lately (papers, presentations, late blog posts), and I wanted to talk briefly about the way we process data.  I’ve used many different data-processing tools in the past, from Excel to Matlab to Igor and Origin.  I wanted to write a brief post about the one I’m getting used to right now: R.  R is a programming language created in the early 90s, and it’s open source, for all of you open-source aficionados out there.  This means that no matter where you are in the world, as long as you have administrator access to your computer, you can download and install it for free.  That gives us the flexibility to process data and produce high-quality images on any number of devices.  Getting familiar with the basics of R is a good first step towards conducting science professionally.
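For anyone who has never opened R, here is a minimal sketch of what a first session might look like; the numbers and labels are made up purely for illustration:

# A vector of made-up percentages and a quick numerical summary
percent <- c(12.5, 30.1, 57.4)
summary(percent)

# Bundle the values into a data frame (a small table) with labels
df <- data.frame(Type = c("A", "B", "C"), Num = percent)
print(df)

Everything else, from statistics to publication-quality figures, builds on these same few structures.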

I say professional science because making the figures for a paper is often quite different from producing a figure for a lab report.  As an undergraduate you can get away with creating pretty poor graphs as long as they display the data you need to show.  Take this simple bar graph, for instance, Figure 1.  It shows all of the data you want, Percentage vs. Type with the correct values, but it’s pretty ugly to look at.  The lines don’t give a great sense of where the values fall or how large the error bars are, but it communicates the basic idea.  Now take a look at Figure 2 instead.  This one is made in R with ggplot2, a package that implements the Grammar of Graphics.  All I have to do is load my data from a .txt file, and I can copy and paste pre-written code to make a very clean-looking graph.  It takes a little initial setup to write the code (see below), but after that I can just paste it into R and it will make my image for me!  This speeds up graph production and also helps standardize it across labs: instead of emailing Excel files back and forth, one can email a short block of code and get the same graph.

Figure 1: the basic bar graph of Percentage vs. Type.  Figure 2: the same data plotted in R with ggplot2.
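The graphing code at the end of this post assumes the data already live in a data frame called df with columns Type, Num, and Error.  Getting there from a tab-delimited .txt file takes one line; the file name below is just a placeholder for whatever your instrument produces:

# Read a tab-delimited text file with columns Type, Num, and Error
# ("mydata.txt" is a placeholder file name)
df <- read.delim("mydata.txt")
head(df)   # quick check that the columns came in correctly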

 

There is one pro and one con of this program that I’d like to discuss.  The pro of using R comes from the last line of the code at the end of this post: it specifies not only where to save the image but also its DPI, or “dots per inch.”  Publications will not accept a blurry, pixelated image that you copied and pasted into a Word document; there are minimum image-quality requirements that have to be met.  If you really love Excel, you can control the DPI with a separate image-processing program, or switch to a different data-processing program, but it’s nice to be able to do it all in one shot with R.  The major downside to R, which I haven’t mentioned yet, is that it is NOT a spreadsheet program.  You can’t save a spreadsheet of data in R the way you can in Excel, so you need to keep .txt files (or other data files) alongside the R code you use to process them.  This can be both a help and a hindrance, since most instruments will output data as a .txt file or in some exotic file format.  The nice thing about R is that there is a large community of people constantly writing new add-on packages.  So don’t despair!  There is a good chance that somebody has already written code to read your exotic data file into a form that R can work with.
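As one example (and it is only one of many contributed packages), readxl can read a sheet from an Excel workbook straight into a data frame; the file and sheet used here are hypothetical:

# Install (once) and load a contributed package for reading Excel files
install.packages("readxl")
library(readxl)

# Read the first sheet of a hypothetical workbook into a data frame
df <- read_excel("lab_results.xlsx", sheet = 1)
head(df)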

Thanks for reading!  Once things quiet down I’ll start talking a bit more about some new methods that I’m looking into for these combinatorial chemistry projects!  Please feel free to send me comments on this or any other post!

Sincerely,

GRW

Code:

library(ggplot2)

# Error-bar limits computed from the Num and Error columns of df
limits <- aes(ymax = Num + Error, ymin = Num - Error)

# Bar graph of Num vs. Type with a small base font size for publication
p <- ggplot(data = df, aes(x = Type, y = Num)) +
  theme(text = element_text(size = 8)) +
  geom_bar(colour = "black", fill = "#0072B2", stat = "identity",
           position = position_dodge(width = 0.9))

# Add error bars and axis labels
fig <- p +
  geom_errorbar(limits, position = position_dodge(width = 0.9), width = 0.25) +
  labs(x = "Type", y = "Percent")

print(fig)

# Save at publication quality: width and height in inches, 500 dots per inch
ggsave(filename = "Example.png", plot = fig, width = 3.2, height = 2.4, dpi = 500)