The Experience of Research as an Undergrad

Ah, research! Cutting-edge technology, exciting chemicals, pushing the limits of knowledge with your own two hands! But is that all there is to it?

The reality is that pushing the limits of knowledge requires a lot of inspiration, and takes a really freaking long time. Doing research is not your run-of-the-mill undergraduate lab. There, you do one experiment, say, synthesize aspirin, which has already been well characterized and performed numerous times by a vast number of people. You then write about your specific attempt, and all is well and good. In research, you don't have the luxury of previous renditions of the same experiment: if it had already been done, what would be the point?

Instead, you need to find a new topic to study so that you can appease: 1) your supervisor, 2) your advisory panel, and 3) a funding agency, if you get there. Each of these requires a more original and "exciting" experiment, and those are often quite hard to find. In fact, losing your research topic because someone else has already studied it (getting "scooped") is a real hazard. Usually, people find new topics by looking into similar topics and tweaking them slightly, or by delving deeper into a topic that has only been covered generally. This involves reading A LOT of papers so that you become an expert on the current state of the research area you're interested in. Also, keep in mind that while you will have help along the way, ultimately you must decide on the topic on your own, because it is YOUR project, not your supervisor's; otherwise, what's the point?

Finally, after digging through piles of papers, you have solidified your research topic, and you are pretty confident that it will be exciting enough to earn you a degree (let's not get ahead of ourselves to the grant stage yet). However, since your topic is so new and exciting, you have no idea how you're going to do it or whether it will even work. You can ask for help from the people around you, but odds are they are not familiar enough with your topic. After all, you chose this topic specifically because it is new and nobody has really researched it yet. So how do you proceed? By, guess what, reading more papers! In this case, reading papers is like an extension of asking the people around you. You won't get the exact answer you're looking for, but you can get an approximation of what you can do to get results. Also, thanks to modern technology, there are now internet resources such as ResearchGate to help you in addition to reading a ton of papers, so all is not lost.

After much scrounging around, you are finally ready to plunge into research, exciting! Time to collect data!

…but collecting significant data also takes a long time, and along the way you will inevitably have experiments that fail, reagents that degrade, parts of your project getting scooped, etc. Eventually, you will succeed in enough experiments to get data to write a thesis and get your graduate degree, but if you plan on pursuing academia, look forward to doing this all again for your PhD! And then your post-doc! And maybe another post-doc! And then, if a university accepts you into their faculty, your assistant professorship, which is like a more intense post-doc! And at the very end of the road, when you are finally offered tenure, you'll realize that the things you have been doing this whole time are the same things you'll be doing from now on as well: reading papers, writing experiment proposals, reading more papers, doing experiments, etc. This is also why professors always seem so old – getting to that stage takes a long time.

All this may sound really daunting and maybe even discouraging, but this is just a run-through of the drier parts of research. When you're knee-deep in some sprawling experiment (and they always become sprawling), it'll seem like there is never enough time. When your experiment fails and you have to read more papers, you'll learn cool things you didn't know even from all your previous education. You'll meet people who are experts in things you've never even heard of. You'll get to use cutting-edge technologies and exciting chemicals just like you thought you would. And, at the end, when your experiments do succeed and you have collected enough data on top of all the knowledge you have amassed during the process, you will really have discovered something that nobody has ever seen before and nobody yet knows about, until YOU tell them about it! Now tell me that isn't the coolest thing ever. (You really can't.) The long path of research definitely has many downer moments and dry patches, but it is equally full of excitement and discovery. As long as you have patience and are undaunted by occasional failures, you truly will be on the frontline of pushing the boundaries of human knowledge.


Evidence that New Doctors Cause an Increase in Mortality Rate in the UK

In England, there is a commonly held belief that it is unsafe to be admitted to hospital on "Black Wednesday", the first Wednesday of August. Each year, this is the day when newly certified doctors begin working in National Health Service (NHS) hospitals. One study compared the likelihood of death for patients admitted on the final Wednesday of July with that for patients admitted on the first Wednesday of August. The study found a 6% higher mortality rate for patients admitted on Black Wednesday.

There are 1600 hospitals and specialist care centres operating under the NHS. Each centre routinely collects administrative data when admitting patients, amounting to over 14 million records per year. A group conducted a retrospective study using the archived hospital admission data from 2000 to 2008. Two cohorts of patients were tracked: the first comprised patients admitted as emergencies (unplanned, non-elective) on the last Wednesday of July; the second comprised patients admitted as emergencies on the first Wednesday of August. Transferred patients were accounted for to avoid double counting.

Each cohort was then tracked for one week. If a patient had not died by the following Tuesday, they were considered alive; if they had passed away by then, it was counted as a death. The study only tracked patients for one week because this was deemed the best way to "capture errors caused by failure of training or inadequate supervision" on the part of the junior doctors. Keeping the study short-term also avoided possible biases from seasonal effects that would complicate the analyses.

The study only analyzed emergency admissions to ensure randomness in the data. They wanted to avoid bias that could have resulted from differences in planned admissions due to administrative pressures.

Across both cohorts, the study analyzed 299,741 hospital admissions: 151,844 in the last week of July and 147,897 in the first week of August. There were 4,409 deaths in total, with 2,182 in the last week of July and 2,227 in the first week of August.

The study found a small, non-significant difference in the crude odds ratio of death between the two cohorts. However, after adjusting for year, gender, age group, socio-economic deprivation, and co-morbidity, patients admitted on Black Wednesday had a 6% higher odds of death, with a 95% confidence interval ranging from 1.00 to 1.15.
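As a sanity check, the crude (unadjusted) odds ratio can be recomputed from the reported counts. This is a sketch only: the August death count is inferred from the 4,409 total, and the simple Wald interval below is an assumption, not the method of the original paper.

```python
import math

# Admission and death counts reported above.
# August deaths are inferred from the 4409 total (an assumption).
july_adm, july_deaths = 151844, 2182
aug_adm, aug_deaths = 147897, 4409 - 2182

# Odds of dying within one week of admission for each cohort
odds_july = july_deaths / (july_adm - july_deaths)
odds_aug = aug_deaths / (aug_adm - aug_deaths)
crude_or = odds_aug / odds_july

# Wald 95% confidence interval on the log odds ratio
se = math.sqrt(1 / july_deaths + 1 / (july_adm - july_deaths)
               + 1 / aug_deaths + 1 / (aug_adm - aug_deaths))
lo = math.exp(math.log(crude_or) - 1.96 * se)
hi = math.exp(math.log(crude_or) + 1.96 * se)

print(f"crude OR = {crude_or:.3f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

The crude interval straddles 1, consistent with the study's report that only the adjusted odds ratio reached significance.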

In short, for hospitals in the NHS from 2000 to 2008, it was found that there was a small, but still statistically significant, increase in the risk of death for patients who were admitted on Black Wednesday, over patients who were admitted the week prior.

 

Source: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0007103

Observing Molecular Machines using Cryo-EM

In 2017, the Nobel Prize in Chemistry was awarded to Jacques Dubochet, Joachim Frank, and Richard Henderson for their development of cryo-electron microscopy (cryo-EM). In 1986, Ernst Ruska, and Gerd Binnig and Heinrich Rohrer, won the Nobel Prize in Physics for designing the first electron microscope and the scanning tunneling microscope, respectively. In 1982, Aaron Klug won the Nobel Prize in Chemistry for developing crystallographic electron microscopy. With so many Nobel Prizes already awarded for electron microscopy, what makes this most recent development different from the other two?

The keywords for cryo-EM are proteins and resolution. Binnig and Rohrer's scanning tunneling microscope, designed way back in 1981, had a maximal lateral resolution of 0.1 nm and a maximal depth resolution of 0.01 nm, high enough to resolve individual atoms. It does this by slowly hovering a needle, only one atom wide at the tip, over an extremely flat surface of a solid made of a uniform lattice of atoms, and reading the disturbances in the tunneling current between the surface and the microscope tip.

However, proteins (and other macromolecules, but we are less excited about them; "another DNA structure, woooooo," said nobody ever after the double-helix structure was determined in 1953) are these wild things that require atomic-level resolution to determine how the individual amino acids are oriented, but are also absolutely gigantic molecules where the 3-dimensional configurations of said amino acids matter just as much as, if not more than, their identities. Therefore, it is no surprise that slowly hovering a really thin needle over a uniform layer of proteins doesn't get you to this 3D structure. Not to mention that proteins are really delicate flowers and will precipitate into some amorphous aggregate if you so much as look at them funny (okay, I exaggerate, but only slightly). A good resolution for a protein structure averages around 2.5 angstroms (Å; 1 Å = 0.1 nm), though many structures have been solved at higher resolution. For reference, a covalently bonded carbon atom has a diameter of about 1.5 Å.

The current gold standard for protein structure determination is x-ray crystallography, where an x-ray beam is fired into a protein crystal; due to the crystalline arrangement of the proteins, the single beam diffracts in many different directions and is caught on a detector. The diffraction angles and intensities can then be measured to produce a 3D electron-density map, from which the positions of the individual atoms can be reconstituted to eventually yield a complete 3D protein structure. However, protein crystallization often requires poisonous salts and precipitants, and extreme pHs that would never be found in living organisms, and very large proteins and membrane proteins are often impossible to crystallize. (Also, the actual crystallization process is very luck-based and, even for regular-sized proteins, may or may not happen. Trying to grow protein crystals is very good for building character.) This is where cryo-EM, our star, finally comes onto the scene.

Electron microscopy uses some form of electron interaction between a source and the object being imaged. In order not to disturb these delicate interactions, the imaging has to take place in a high vacuum, which, for a hydrated protein sample, falls squarely under the "will aggregate protein" condition category. Instead, a pure sample of the protein of interest is flash-frozen, and an electron beam is fired at the frozen protein to produce an image, a "trace", on a detector. Other than the freezing, this sounds like pretty standard electron microscopy, no? The key advancements were figuring out how to flash-freeze water-soluble proteins (because normal freezing could also fall into the "will aggregate protein" category, and may produce ice crystals, which are NOT protein) and how to make detectors good enough to achieve both the very high resolution and the very large field of view required to image proteins.

The protein in the frozen sample is caught in various orientations. By taking images of hundreds of different orientations of the same protein using the new, high-tech detectors, a computer can generate an electron density map comparable to those obtained through x-ray crystallography, which can then be used to reconstruct the true structure of the protein with high resolution and accuracy.
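The statistical heart of this averaging step can be sketched with a toy model (illustrative only, not the actual reconstruction algorithm): averaging N independent noisy measurements of the same signal shrinks the noise by roughly √N, which is why combining thousands of faint single-particle images yields a clean map.

```python
import random
import statistics

# Toy model: each "image" is the true signal plus Gaussian noise.
# Averaging many of them recovers the signal, which is the statistical
# idea behind combining thousands of single-particle images.
random.seed(0)
TRUE_SIGNAL = 1.0

def noisy_measurement():
    return TRUE_SIGNAL + random.gauss(0, 0.5)

for n in (1, 100, 10000):
    avg = statistics.mean(noisy_measurement() for _ in range(n))
    print(f"N = {n:5d}: average = {avg:.3f}")
```

In real cryo-EM, the images are 2D projections at unknown orientations, so the software must also classify and align them before averaging, but the noise-reduction principle is the same.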


An example of cryo-EM images of a protein that together can be used to generate a 3D structure. Image: Maofu Liao, Harvard Medical School.

The beauty of cryo-EM is that it can do everything x-ray crystallography cannot: image proteins in a non-poisonous environment and image very large proteins. It also doesn't require painstakingly screening every possible combination of precipitant, salt, and pH, hoping and praying that out of those potentially thousands of combinations a crystal will grow within your tenure in that lab.

But you may be wondering, "who cares about protein structures anyway?" Protein structures are very important in drug development, and in understanding the molecular mechanisms of both healthy and diseased states. Viruses are also basically protein assemblies, and determining their structures is very important in understanding their pathology and behaviour on a molecular scale. Also, they look cool! (Protein aesthetics is usually how people get suckered into structural biology.) Appreciate this nice picture of the Zika virus obtained through cryo-EM.

Cryo-EM structure of the Zika virus. PDB: 5IRE. Sirohi et al. (2016) The 3.8 angstrom resolution cryo-EM structure of Zika virus. Science 352, 467-470.

Happy lurking the cryo-EMs!

Sustainable Farming (feat. Rocks!)

Climate change is one of this generation's most persistent and pressing problems. It affects not only sea levels, habitats, and wildlife, but also resources vital to human survival. One of these resources is food: as we deplete fertile land, waste fresh water, and cause severe weather changes, we put our global food security at risk.

The rapid growth of the human population means that food security will soon become a concern for developing and developed countries alike. To address this issue, Dr. David J. Beerling and his colleagues at the University of Sheffield are researching agricultural practices that not only preserve the environment but also act to undo human pollution. In a paper published in Nature Plants on 17 January 2018, the team put forth a farming practice that uses silicate rocks to remove carbon dioxide from the atmosphere.

The process involves regularly adding small pieces of calcium- and magnesium-bearing silicate rock to the soil. The rocks react with carbon dioxide (dissolved in soil water) to form stable alkaline forms of carbon (namely bicarbonate and carbonate), which are then carried with the rest of the soil runoff into the ocean. The process therefore helps reduce carbon dioxide in the atmosphere, a major driver of Earth's severe climate change.
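The underlying chemistry is silicate weathering. A representative reaction, using wollastonite (CaSiO₃) as a stand-in calcium silicate (the paper considers basalt and similar rocks rather than this specific mineral), is:

```latex
\mathrm{CaSiO_3} + 2\,\mathrm{CO_2} + \mathrm{H_2O} \longrightarrow \mathrm{Ca^{2+}} + 2\,\mathrm{HCO_3^-} + \mathrm{SiO_2}
```

The bicarbonate ions are the stable, alkaline form of carbon that ultimately washes out to the ocean.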


Image courtesy of Nature

Dr. Beerling's research also indicates that the process improves crop performance and can act as a substitute for fertilizers. The silicate rocks can also increase the crops' protection against pests and disease. Dr. Beerling hopes these benefits will give farmers an incentive to adopt the practice.

Of course, there are financial and practical issues preventing this novel process from being adopted. For instance, a substantial amount of silicate rock is required to accomplish the carbon sequestration (the removal of carbon dioxide from the atmosphere): for 10 to 30 tonnes of carbon dioxide per hectare of crop per year, 9-27 petagrams of silicate rock would be needed. Moreover, no cost-effective way to obtain these rocks currently exists. Our current rock mining, grinding, and spreading technologies would likely yield carbon emissions equivalent to 10-30% of the carbon that would be sequestered by the rocks obtained. The paper consequently emphasizes the need for industrial innovation in sustainable rock mining practices.
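The emissions overhead mentioned above can be made concrete with a quick back-of-the-envelope calculation, using the article's figures (the 30 t value is the upper end of the quoted per-hectare range):

```python
# Net sequestration after accounting for the 10-30% of captured CO2
# that mining, grinding, and spreading the rock would re-emit.
gross = 30.0  # tonnes CO2 sequestered per hectare per year (upper estimate)
for overhead in (0.10, 0.30):
    net = gross * (1 - overhead)
    print(f"{overhead:.0%} overhead -> net {net:.0f} t CO2/ha/yr")
```

Even in the worst case, the practice still removes far more carbon than it emits; the problem is cost, not the carbon balance.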

Finally, because this idea is so novel, further research and greater public acceptance are needed for it to become common practice. If effective, however, silicate rocks have the potential to reshape sustainable agricultural practices.

Resources

https://www.nature.com/articles/s41477-018-0108-y

Alcohol and Potential DNA Damage

A recent study completed by the Medical Research Council (MRC) Laboratory of Molecular Biology in Cambridge suggests a novel reason why alcohol consumption increases the risk of cancer. In a study published in Nature on 3 January 2018, the Cancer Research UK-funded experiment found that alcohol consumption causes DNA damage in stem cells. In particular, the DNA of haematopoietic stem cells (blood stem cells) is adversely affected by alcohol consumption.

Previous studies investigating the carcinogenic effects of alcohol used cell cultures for their experiments. The experiment conducted by the MRC laboratory adopted a novel approach and exposed live mice, instead of cultures, to ethanol. After chromosome analysis and DNA sequencing, the team noticed permanent chromosome alterations in the mice's blood stem cells. In particular, the acetaldehyde produced by the body upon consuming alcohol causes double-strand DNA breaks and chromosome rearrangements. These mutations increase the risk of cancer because the stem cells become faulty.

The MRC laboratory experiment also examined the role of the enzyme aldehyde dehydrogenase (ALDH) in the body's response to alcohol. Mice lacking a functioning ALDH enzyme had four times as much DNA damage as mice with the working enzyme. This confirms our understanding that ALDH is one way the body mitigates the effects of alcohol: ALDH converts acetaldehyde into acetate, which the body can use as energy.

This insight into ALDH's function in the body complements our current understanding of the enzyme. For example, a large portion of East Asians, who on average have lower alcohol tolerance, lack functional versions of ALDH enzymes. The study may also suggest that, based on one's inherited ability to produce ALDH enzymes, some individuals are more prone to the carcinogenic effects of alcohol than others.

Lastly, the study did recognize that cells have DNA repair systems. However, not everyone carries a seamless DNA repair system, as these systems can be lost through chance mutations. Further, with substantial enough alcohol exposure, the systems may fail (as they did in the mice) and result in DNA damage.

The study did not conclude whether such DNA damage was hereditary, as the lab only looked at blood stem cells. Nevertheless, Cancer Research UK has publicized this study as a compelling reason to control alcohol intake and consume in moderation.

Resources

https://www.nature.com/articles/nature25154

https://www.sciencedaily.com/releases/2018/01/180103132629.htm

Alan Guth and the Multiverse

Feature Photo: The Atlantic

The content from this article was produced by Mathilde Papillon.

On the evening of January 18, 2018, Alan Guth, a famous American theoretical physicist and cosmologist, visited McGill University to deliver a talk entitled "Inflationary Cosmology: Is Our Universe Part of a Multiverse?". Over the course of his career, Guth has won several prestigious awards in physics. He is currently a professor at MIT and is recognized as the originator of the theory of inflation. Across the scientific community, it is largely agreed that inflation is humanity's best guess to date of how the universe came to be.

The talk took place in McGill University's biggest lecture hall, Leacock 132. Notably, the room was packed, and organizers had to turn away dozens of people due to a lack of seating. The talk was part of the Anna I. McPherson Lectures in Physics, a series on hot topics in physics that McGill has hosted for twenty years now.

Guth's talk addressed three main subjects: the theory of inflation, the evidence for it, and the resulting possibility of a multiverse. He began by distinguishing the conventional Big Bang theory, which only addresses the aftermath of the "bang", from inflation, which describes what happened during the bang. By the laws of general relativity, gravitational repulsion is theoretically possible; in this regime, gravity works opposite to what we are all used to.

The theory of inflation states that in the beginning, matter consisted of tiny patches of material with negative pressure, on the order of 10⁻²⁸ cm across, that underwent exponential expansion. The phenomenon is driven by repulsive gravity.

The "second miracle of physics", and the other main idea at the heart of the theory of inflation, is negative energy. This simply states that negative energy exists, allowing the total amount of energy in the universe to be zero. All the energy that people are "familiar with" is counterbalanced by negative energy. It is theorized that at the beginning of time, there was an exponential expansion of both positive and negative energies.

Photo: Mathilde Papillon

Next, Guth presented evidence for inflation. He asked a series of questions that are left unanswered by the conventional Big Bang theory, and proceeded to show how inflation can resolve or explain these gaps in our knowledge.

  1. In a macroscopic sense, why is the universe so uniform? Inflation suggests that uniformity was established in a tiny region while it was still small enough for its contents to interact, and that this region was then stretched out to encompass everything we observe.
  2. Why is the universe flat? If we define Ω to be the ratio between the universe's measured mass density and the critical mass density required for flatness, we find that Ω equals 1 to 16 significant digits. Inflation's gravitational repulsion drives Ω toward 1, pushing the universe's mass density toward the critical density required for flatness.
  3. On a small scale, why is the universe so non-uniform? Inflation uses a quantum mechanical approach based on probability, so at the beginning of the universe, tiny random fluctuations in density were unavoidable. Slightly denser regions had a slightly stronger gravitational pull and grew denser still. These fluctuations are known as quantum fluctuations, and evidence for them is found in the universe's cosmic background radiation.
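The flatness mechanism in point 2 can be illustrated with a toy calculation (not from the talk): during inflation, the deviation |Ω − 1| shrinks roughly as e^(−2N), where N is the number of e-folds of expansion; the starting deviation of 0.5 below is an arbitrary illustrative value.

```python
import math

# Toy flatness calculation: even a universe that starts far from flat
# (|Omega - 1| = 0.5) is driven extremely close to Omega = 1 after the
# ~60 e-folds of expansion usually quoted for inflation.
initial_deviation = 0.5
for n_efolds in (0, 10, 30, 60):
    deviation = initial_deviation * math.exp(-2 * n_efolds)
    print(f"N = {n_efolds:2d}: |Omega - 1| ~ {deviation:.3e}")
```

After 60 e-folds the deviation is smaller than 10⁻⁵⁰, which is why any measurement today finds Ω indistinguishable from 1.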

After addressing these questions, Guth described the possibility of a multiverse as suggested by inflation. Assuming inflation is correct, once the universe has started to inflate, it should inflate forever. Physicists have determined that the basis for inflation, the material with negative pressure, has a half-life and decays. However, the rate of inflation is so high that by the time one half-life has gone by, the remaining half that is still "active" has grown to be bigger than the lost half. Therefore, it is possible for the universe to inflate forever.
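The bookkeeping behind "inflates forever" can be sketched with illustrative numbers (the specific values are assumptions, not figures from the talk): all that matters is that the volume-doubling time is much shorter than the half-life of the inflating material.

```python
# Over one half-life, half of the inflating material decays, but the
# surviving half has meanwhile doubled in volume many times over.
t_half = 1.0      # half-life of the inflating material (arbitrary units)
t_double = 0.01   # volume-doubling time, much shorter than t_half

growth = 2 ** (t_half / t_double)   # volume growth factor over one half-life
active = 0.5 * growth               # surviving fraction times its growth

print(f"after one half-life, active volume has grown x{active:.3g}")
```

As long as the active volume ends each half-life larger than it started, the inflating region grows without bound even though individual patches keep decaying.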

In the process of inflation, it is possible for pieces of inflating material to break off, creating “pocket universes” on their own. From this, it is possible that our universe is one of these pockets.

Guth kept the large audience engaged for the hour he spoke, receiving several rounds of applause. He closed his talk with a question period, in which an audience member asked for his thoughts on humanity's religious and philosophical beliefs. Guth believes his work shows how small and insignificant humanity is, but that humanity is important to ourselves; as such, it is important to keep building a civilization that we wish to keep living in.

A Weekend of Engineering: MEC 2017

The Feature photo was taken from the McGill Engineering Competition Facebook page.

Each year, a handful of McGill engineering students organize the McGill Engineering Competition (MEC): a three-day event open only to students in the Engineering Faculty. The 2017 MEC ran from 24 to 26 November. Over the course of a weekend, participants competed in one of eight categories for the chance to represent McGill at the Quebec Engineering Competition in late January.

The eight competition categories were: Junior Design, Senior Design, Consulting Engineering, Impromptu Debate, Engineering Communication, Innovative Design, Re-Engineering, and Scientific Research Presentation. Some categories allowed competitors to prepare beforehand, while others presented challenges on the day of the event. For example, the Junior Design category challenged competitors to build an environmentally friendly boat that could hold up to one kilogram without sinking.

Teams presented their projects to a volunteer judging panel of company representatives and McGill alumni. Teams were scored based on a predefined rubric distributed at the beginning of the competition. The top three teams in each event were announced at the awards ceremony on Sunday evening.

Registration cost $25 and could be purchased during tabling hours in the McConnell Engineering Building or online. The fee included a T-shirt, a lanyard, and complimentary meals for the weekend.

MEC is an annual competition at McGill University, held near the end of the Fall term. For more information, please visit the McGill Engineering Competition Facebook page.