I’m sorry, what?

Written by: Howard Li

Two weeks ago, I met up with a friend whom I hadn’t seen in a long time. We met in first year and both studied life science, albeit in different departments. Facing the onslaught of tedious assignments, ruthless exams, and frankly ridiculous lab reports, we drifted apart in our second and third years. So, after an awkward exchange of pleasantries, I asked him how his research was going, hoping to spark a casual conversation about the fun times a research lab offers an undergraduate student. Little did I know that he would launch into a passionate barrage of technical terms describing his work, one that took me back to my childhood years of trying to parse English from Chinese and not understanding much of either language.

“I’m sorry, what?” was all that I could pathetically mutter when he finished. I watched his face change into a smirk as he realized, undoubtedly from my blank expression, that I must not have understood a lick of what he said (to my credit, I understood that he was doing something with viruses). He asked me what I thought of his project and I hit him with another cheeky “I’m sorry, what?” to confirm that I indeed had no idea what he was saying. With the awkwardness broken, we went on to catch up about each other’s lives, and I was even able to get a confused expression out of him when I tried to describe my project at the lab.

Now, I think that we’re both pretty on top of our classes and that we’re generally pretty savvy in keeping up with science (i.e. we’re both pretty big nerds). So I was surprised that we had such a hard time describing our undergraduate-level research projects to each other. Granted, mine is a biochemistry project while his is more focused on immunology. But with both fields falling under the umbrella of life science, I thought it was reasonable to assume they were related enough that two newbies to science could freely converse with each other. And we have both taken biochemistry and immunology classes! However, it seems that nowadays science is so specialized that, beyond the very introductory ideas, the knowledge bases and mindsets of people from different fields are completely different.

If two undergraduate students studying closely related sciences had such a hard time talking to each other, then imagine the breakdown in communication when a problem requires experts from vastly different fields of science and engineering to work together. And while we all know how badly the media butchers and misrepresents scientific findings, can we really fault them? If scientists have trouble understanding other scientists, then how can we expect the general audience to understand with anything short of a universal translator? To most people, science may as well be an entirely alien language.

While I joke, I think that scientific communication is of vast importance to our future. More and more problems now absolutely require the input of experts from all over science and engineering to tackle. And with issues such as climate change urgently knocking on our doorstep, science needs to play a key role in informing the public, politicians, and policy makers. The only way to do this is for us to learn to overcome this language barrier and to communicate science in a simple but accurate way, both to the public and to other scientists.

Right now, we are students. Our job is to learn so that in the future, we can contribute to society. Part of that learning needs to be in effective scientific communication. When we graduate (hopefully), we will become engineers, researchers, professors, and doctors, and we will need to work with people from a wide variety of disciplines in order to face the challenges that await us. So at the very least, we should all be speaking the same language, so that “I’m sorry, what?” ceases to be a response.


An Investigation into Our Love for Blackboards

By: Mathilde Papillon

The blackboard. This archaic teaching tool is in practically every single class of any science student. It also furnishes most math departments and shows up in theoretical labs everywhere. Why is that? How is it that scientists and mathematicians working on the finest, newest technologies still bother with messy chalk? Today, there are so many other presentation tools available to us, and yet this 1801 invention remains a favourite. As it turns out, there are a few good reasons for this choice.

 

Many scientists and mathematicians explain that this love is rooted in the tool’s sheer simplicity. As Harvard math professor Oliver Knill puts it, no other method of communication allows for such freedom in the expression of thought. No reliance on batteries, projectors, paper, or erasable marker ink. Just plain old chalk with a wooden eraser. The audience’s attention is funnelled towards this “point of focus,” as writer Lewis Buzbee describes it in an article for Slate.

This Californian author also points out the blackboard’s contribution to how we teach. This tool allows for a true, authentic development of an idea, whether that be solving an equation or stating a proof. The speaker exerts full control over the lesson’s progress and has the liberty of emphasizing any aspect with a simple dab of the chalk. As the subject at hand unfurls itself onto the boards, the drawings, equations, and definitions appear as an ensemble to the student, facilitating otherwise abstract connections.

Not too surprisingly, the blackboard presents a lot of advantages for the audience as well. First off, as Knill points out, the blackboard forces the speaker to slow down, allowing students to better process the material at hand. As mathematician and historian Donald MacKenzie notes in the essay Dusty Discipline, the large gestures involved with blackboards, like sliding boards around or erasing, allow for structured pauses and break the material down into smaller bites. Furthermore, most students will agree that chalk is simply easier to perceive than ink, the latter often leading to messy, smudged writing. Knill actually illustrates this with images from the movie Arrival, in which a whiteboard renders rather simple equations messy and difficult to decipher.

On this note, it would appear that many of the future’s brightest innovations will continue to be developed (and then explained) on this timeless tool, by scientists enamoured of the simplicity and structure it lends to a lesson.

Reference List:

http://mbarany.com/DustyDisciplineBWM15.pdf

http://www.math.harvard.edu/~knill/pedagogy/blackboards/index.html

https://slate.com/human-interest/2014/10/a-history-of-the-blackboard-how-the-blackboard-became-an-effective-and-ubiquitous-teaching-tool.html

The Experience of Research as an Undergrad

Ah, research! Cutting-edge technology, exciting chemicals, pushing the limits of knowledge with your own two hands! But is that all there is to it?

The reality is that pushing the limits of knowledge requires a lot of inspiration, and takes a really freaking long time. Doing research is not your run-of-the-mill undergraduate lab. There, you do one experiment, say, synthesizing aspirin, which has already been well characterized and carried out numerous times by a vast number of people. You then write about your specific attempt and all is well and good. In research, you don’t have the luxury of previous renditions of the same experiment; after all, if it has already been done, what’s the point of doing it again?

Instead, you need to find a new topic to study so that you can appease: 1) your supervisor, 2) your advisory panel, and 3) a funding agency, if you get there. Each of these requires a more original and “exciting” experiment, and those are often quite hard to find. In fact, it is entirely possible to lose your research topic because someone else has already studied it. Usually, people find new topics by looking into similar topics and tweaking them slightly, or by delving deeper into a topic that has only been covered in general terms. This involves reading A LOT of papers so that you can become an expert on the current status of the research area you’re interested in. Also, keep in mind that while you will have help along the way, ultimately you must decide on the topic on your own because it is YOUR project, not your supervisor’s; otherwise, what’s the point?

Finally, after trawling through piles of papers, you have solidified your research topic and you are pretty confident that it will be exciting enough to earn you a degree (let’s not get ahead of ourselves to the grant stage yet). However, since your topic is so new and exciting, you have no idea how you’re going to do it or whether it will even work. You can ask the people around you for help, but odds are they are not familiar enough with your topic. After all, you chose this topic specifically because it is new and nobody has really researched it yet. So how do you proceed? By, guess what, reading more papers! In this case, reading papers is like an extension of asking the people around you. You won’t get the exact answer you’re looking for, but you can get an approximation of what you can do to get results. Also, thanks to modern technology, there are now internet resources such as ResearchGate to help you in addition to reading a ton of papers, so all is not lost.

After much scrounging around, you are finally ready to plunge into research, exciting! Time to collect data!

…but collecting significant data also takes a long time, and along the way you will inevitably have experiments that fail, reagents that degrade, parts of your project that get scooped, and so on. Eventually, you will succeed in enough experiments to get data to write a thesis and earn your graduate degree, but if you plan on pursuing academia, look forward to doing this all again for your PhD! And then your post-doc! And maybe another post-doc! And then, if a university accepts you into their faculty, your assistant professorship, which is like a more intense post-doc! And at the very end of the road, when you are finally offered tenure, you’ll realize that the things you have been doing this whole time are the same things you’ll be doing from now on as well: reading papers, writing experiment proposals, reading more papers, doing experiments, and so on. This is also why professors always seem so old: getting to that stage takes a long time.

All this may sound really daunting and maybe even discouraging, but this is just a run-through of the drier parts of research. When you’re knee-deep in some sprawling experiment (and they always become sprawling), it’ll seem like there is never enough time. When your experiment fails and you have to read more papers, you’ll learn cool things you didn’t know even from all your previous education. You’ll meet people who are experts on things you’ve never even heard of. You’ll get to use cutting-edge technologies and exciting chemicals just like you thought you would. And at the end, when your experiments do succeed and you have collected enough data on top of all the knowledge you have amassed along the way, you will really have discovered something that nobody has ever seen before and nobody yet knows about, until YOU tell them about it! Now tell me that isn’t the coolest thing ever. (You really can’t.) The long path of research definitely has many downer moments and dry patches, but it is equally full of excitement and discovery. As long as you have patience and are undaunted by occasional failures, you truly will be on the front line of pushing the boundaries of human knowledge.

Evidence that New Doctors Cause Increase in Mortality Rate in the UK

In England, there is a commonly held belief that it is unsafe to be admitted to hospital on “Black Wednesday”, the first Wednesday of August. Each year, this is the day when newly certified doctors begin working in National Health Service (NHS) hospitals. One study compared the likelihood of death for patients admitted on the last Wednesday of July with that of patients admitted on the first Wednesday of August. The study found a 6% higher mortality rate for patients admitted on Black Wednesday.

There are 1600 hospitals and specialist care centres operating under the NHS, and each routinely collects administrative data when admitting patients; over 14 million records are collected each year. A group of researchers conducted a retrospective study using the archived hospital admission data from 2000 to 2008. Two cohorts of patients were tracked: the first comprised patients admitted as emergency (unplanned, non-elective) cases on the last Wednesday of July, and the second comprised patients admitted as emergency cases on the first Wednesday of August. Transferred patients were accounted for to avoid double counting.

Each cohort was then tracked for one week. If a patient had not died by the following Tuesday, they were counted as a survivor; otherwise, the case was counted as a death. The study only tracked patients for one week because this was deemed the best way to “capture errors caused by failure of training or inadequate supervision” on the part of the junior doctors. Keeping the study short-term also avoided possible biases from seasonal effects that would complicate the analyses.

The study only analyzed emergency admissions to ensure randomness in the data. They wanted to avoid bias that could have resulted from differences in planned admissions due to administrative pressures.

Across both cohorts, the study analyzed 299,741 hospital admissions: 151,844 in the last week of July and 147,897 in the first week of August. There were 4,409 deaths in total: 2,182 in the July cohort and 2,227 in the August cohort.

The study found small, non-significant differences in the crude odds ratio of death between the two cohorts. However, after adjusting for year, gender, age group, socio-economic deprivation, and co-morbidity, patients admitted on Black Wednesday had 6% higher odds of death: an adjusted odds ratio of 1.06, with a 95% confidence interval of 1.00 to 1.15 and a p value of approximately 0.05.
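To make these numbers concrete, here is a minimal sketch (my own illustration, not code from the study) showing how the crude odds ratio can be recomputed from the counts quoted above; the August death count of 2,227 is inferred by subtracting the 2,182 July deaths from the 4,409 total.

```python
# Recompute the crude odds ratio of death from the counts quoted in this article.
# Note: the August death count is derived here as 4,409 total deaths minus the
# 2,182 deaths in the July cohort.

def odds_of_death(deaths, admissions):
    """Odds of dying within the one-week follow-up: deaths / survivors."""
    return deaths / (admissions - deaths)

july_admissions, july_deaths = 151_844, 2_182        # last Wednesday of July cohort
august_admissions, august_deaths = 147_897, 2_227    # first Wednesday of August cohort

crude_or = odds_of_death(august_deaths, august_admissions) / \
           odds_of_death(july_deaths, july_admissions)
print(f"Crude odds ratio (August vs. July): {crude_or:.2f}")  # roughly 1.05
```

The crude ratio comes out close to 1; the 6% figure quoted above is the adjusted odds ratio the study reports after accounting for patient characteristics.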

In short, for hospitals in the NHS from 2000 to 2008, it was found that there was a small, but still statistically significant, increase in the risk of death for patients who were admitted on Black Wednesday, over patients who were admitted the week prior.

 

Source: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0007103

Sustainable Farming (feat. Rocks!)

Climate change is one of this generation’s most persistent and pressing problems. It not only affects sea levels, habitats, and wildlife, but also resources vital to human survival. One of these resources is food: as we deplete fertile land, waste fresh water, and cause severe weather changes, we put our global food security at risk.

The rapid growth of the human population means that food security will soon become a concern for developing and developed countries alike. To address this issue, Dr. David J. Beerling and his colleagues from the University of Sheffield are researching agricultural practices that not only preserve the environment, but also act to undo human pollution. In a paper published in Nature Plants on 17 January 2018, the team put forth a farming practice that uses silicate rocks to remove carbon dioxide from the atmosphere.

The process involves regularly adding small pieces of calcium- and magnesium-bearing silicate rock to the soil. The silicate rocks react with carbon dioxide in the atmosphere to form stable alkaline forms of carbon (namely bicarbonate and carbonate), which are then carried with the rest of the soil runoff into the ocean. This process therefore helps reduce the amount of carbon dioxide in the atmosphere, a major driver of Earth’s changing climate.
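As a rough illustration (a textbook simplification on my part, using wollastonite, CaSiO₃, as a stand-in for the calcium- and magnesium-bearing silicates discussed in the paper), the overall weathering reaction can be written as:

CaSiO₃ + 2 CO₂ + H₂O → Ca²⁺ + 2 HCO₃⁻ + SiO₂

Each formula unit of the silicate ties up two molecules of atmospheric carbon dioxide as dissolved bicarbonate, which runoff eventually delivers to the ocean.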


Image courtesy of Nature

Dr. Beerling’s research also indicates that his team’s process improves crop performance and can act as a substitute for fertilizers. The silicate rocks can also improve the crops’ protection against pests and disease. Dr. Beerling hopes that these benefits will create an incentive for farmers to adopt the practice.

Of course, there are financial and practical issues preventing this novel process from being adopted. For instance, a substantial amount of silicate rock is required to accomplish the carbon sequestration (the removal of carbon dioxide from the atmosphere): to capture 10 to 30 tonnes of carbon dioxide per hectare of crop per year, 9-27 petagrams of silicate rock are needed. Moreover, a cost-effective way to obtain these rocks does not yet exist. Our current rock mining, grinding, and spreading technologies would likely produce carbon emissions equivalent to 10-30% of the carbon that the silicate rocks would sequester. The research paper consequently emphasizes the need for industrial innovation in sustainable rock mining practices.
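As a back-of-the-envelope sketch (my own arithmetic, not a figure from the paper), here is what that 10-30% processing penalty implies for the net carbon removal per hectare:

```python
# Net CO2 removal per hectare once the emissions from mining, grinding, and
# spreading the rock are subtracted from the gross sequestration.

gross_removal_t_per_ha = 20.0   # t CO2/ha/yr, midpoint of the 10-30 range above
processing_penalty = 0.20       # fraction re-emitted by current technology (10-30%)

net_removal_t_per_ha = gross_removal_t_per_ha * (1 - processing_penalty)
print(f"Net removal: {net_removal_t_per_ha:.0f} t CO2 per hectare per year")  # ~16
```

In other words, under current technology a meaningful slice of the climate benefit is given back before the rock ever reaches the field, which is why the paper calls for cleaner rock-processing methods.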

Finally, because this idea is so novel, further research and greater public acceptance are needed for it to become common practice. If effective, however, silicate rocks have the potential to reshape sustainable agricultural practices.

Resources

https://www.nature.com/articles/s41477-018-0108-y

Alcohol and Potential DNA Damage

A recent study completed by the Medical Research Council (MRC) Laboratory of Molecular Biology in Cambridge suggests a novel reason why alcohol consumption increases the risk of cancer. In a study published in Nature on 3 January 2018, the Cancer Research UK-funded team found that alcohol consumption causes DNA damage in stem cells. In particular, the DNA of haematopoietic stem cells (blood stem cells) is adversely affected by alcohol consumption.

Previous studies investigating the carcinogenic effects of alcohol used cell cultures for their experiments. The MRC laboratory adopted a novel approach and instead exposed live mice to ethanol. After chromosome analysis and DNA sequencing of the mice’s genetic material, the team noticed permanent chromosome alterations in the blood stem cells. In particular, the acetaldehyde produced by the body upon consuming alcohol breaks double-stranded DNA and causes chromosome rearrangements. These mutations increase the risk of cancer because the stem cells become faulty.

The MRC laboratory experiment also examined the role of the enzyme aldehyde dehydrogenase (ALDH) in the body’s response to alcohol. The team noticed that mice lacking a functioning ALDH enzyme had four times as much DNA damage as mice with the working enzyme. This confirms our understanding that ALDH is one way the body mitigates the effects of alcohol: ALDH converts acetaldehyde into acetate, which the body uses as energy.
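For context (this is a standard summary of alcohol metabolism rather than a finding specific to this study), the body clears alcohol in two enzymatic steps:

ethanol → acetaldehyde (via alcohol dehydrogenase, ADH) → acetate (via ALDH)

When ALDH is absent or non-functional, the reactive intermediate acetaldehyde accumulates, which fits with the extra DNA damage observed in the ALDH-deficient mice.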

The insight into ALDH’s function in the body complements our current understanding of the enzyme. For example, a large proportion of South East Asians, who on average have lower alcohol tolerances, lack functional versions of ALDH enzymes. This study may also suggest that, based on one’s inherited ability to produce ALDH enzymes, some individuals are more prone to the carcinogenic effects of alcohol than others.

Lastly, the study did recognize that cells have DNA repair systems. However, not everyone carries a seamless DNA repair system, as these systems can be lost through chance mutations. Further, with enough alcohol exposure, these systems may be overwhelmed (as they were in the mice), resulting in DNA damage.

The study did not conclude whether such DNA damage was hereditary, as the lab only looked at blood stem cells. Nevertheless, Cancer Research UK has publicized this study as a compelling reason to control alcohol intake and consume in moderation.

Resources

https://www.nature.com/articles/nature25154

https://www.sciencedaily.com/releases/2018/01/180103132629.htm

Not Sure About SURE?

McGill’s Summer Undergraduate Research in Engineering (SURE) Award gives undergraduate students a 16-week, full-time internship position at an engineering research lab at McGill. Awarded as a scholarship, SURE provides recipients with funding valued at a minimum of $5,625 and the opportunity to work in a lab for the summer.

The 2018 SURE Application period opened on 16 January, initiated by an information session held on the same day. This year, the Faculty of Engineering is offering 125 awards: a substantial jump from the 90 offered last year. The decade-old program is funded by the NSERC Undergraduate Summer Research Award Program, the Faculty of Engineering, the Trottier Institute for Sustainable Engineering and Design, and other donors.

Overview

The “summer research traineeships” provide students with exposure to research and the graduate school experience. For the first time ever, SURE will also be recognized with an entry on students’ transcripts.

SURE participants work on one of the many research projects associated with the program. The research projects for 2018 were posted on the Faculty of Engineering website on 16 January. There are projects from the Departments of Architecture, Bioengineering, Chemical Engineering, Civil Engineering, Electrical and Computer Engineering, Mechanical Engineering, Mining and Materials Engineering, and Urban Planning. Each project has an associated professor, and some require a minimum study year.

Application Process

Interested students need to contact the supervising professors of the projects they are interested in, up to a maximum of 3 projects. Supervisors must first agree that the student should apply to the project before the student can complete the Online Student Application.

Once the student has filled out the application, they will submit it to their selected supervisor. The deadline to apply is 26 January 2018, and the first round of awards will be announced after 19 February.

If you would like more information about SURE, or to access its application, please visit the Faculty of Engineering’s website here.