Tuesday, September 6, 2011

Why we need to get rid of tenure

It's another start to a new school year, although it really doesn't feel like it for most grad students, since we've been working through the summer. But with all the back-to-school hype, one can't help but feel a little excited going into campus today (although seeing the lineup for the bus may bring that excitement down rather quickly).

I still remember my first day and my first semester. Yes, I enjoyed going to the classes I was enrolled in, and also the occasional class in which I was not. Learning was fun and it still is, and I firmly believe that access to education should be a basic human right, not a way for colleges and universities to make millions while putting students into major debt before they're even out of school.

But one particular professor still comes to mind from first year: my first-year bio prof. Now, he wasn't a bad prof; he taught (or, to be accurate, read) the slides of each lecture, day in and day out. He got the job done and got all the information across, but I thought university was the time for engaging lectures and intellectual development. For the most part it was, but I saw many of my fellow first-years start to hate this class and stop going altogether, only to do very poorly and have to take it again the next semester. I remember thinking that I could do a better job, and I didn't even have a degree! This is also one of the reasons I aspire to teach at the university level.

Over the years, I saw a trend, one that may be evident to most students as well. If the prof is relatively new, the subject is taught with passion and interest, while the senior profs tend to deliver very monotonous, straight-from-the-text lectures. The reasons for this may vary, but I believe it mostly comes down to tenure. What is tenure? A prof can acquire a tenured position at a university after a history of continuous teaching there for a certain number of years. What this means is that the tenured prof is pretty much guaranteed a position at that university for the rest of their career. The new prof is trying to get tenure, so they work very hard to really engage students, while the tenured senior prof just doesn't care, because they can't be fired (unless they do something that justifies it).

In my opinion, we need to get rid of this tenure position. It creates two big problems. First, the tenured prof no longer has any motivation to teach 'outside the box' and really engage young minds, and second, senior tenured profs remain well past retirement age and refuse to step down and make way for new profs to begin their careers. I believe we need to change how tenure works: perhaps profs can still get tenure, but with yearly reviews under which their contracts can be terminated. And instead of tenure, we could have extended contracts of 5, 10, or 20 years, like in sports. At the end of the contract, the university decides whether or not to extend it based on the yearly reviews. If not, the prof can continue as nonteaching (research-only) faculty until a new contract is offered, or apply somewhere else. This would also prevent contracts from being extended for profs past retirement, making room for new profs, and it might encourage the exchange of profs between universities, thereby increasing intellectual diversity across them.

Friday, September 2, 2011

Medical school reform

The bus is taking quite a long time to show up, so I thought I'd post something taken from a brief conversation with my coworker and colleague Connor. Basically, it's about the medical school system, which I personally believe needs to be seriously looked at. Now, don't get me wrong, the medical school system isn't the only one that needs to change. How grad school is structured, and the way researchers and young scientists are abused and manipulated, also needs to change, but that is for another post. I'd also caution readers to take this post with a grain of salt: these are only my thoughts, and I view the medical system from the other side, as a grad student. However, I have worked with many medical students in the lab during my first year of grad school, my department contains many medical students so I've had the chance to take classes with them, and the building in which I work is where most of the medical school lectures occur. Also keep in mind that my information may not be accurate for all medical schools, but it is for the one I'm closest to.

The medical system as it stands requires the premed student to satisfy three major requirements: good marks in undergrad (to the point where I've seen premed students sabotage other premed students; I hope this is only a rare occurrence), the MCAT, and volunteer work. But there is another factor that most students forget: the number of seats available to you depends on where you live, what university you went to, and what degree(s) you have completed. Speaking of which, right off the bat, I want to mention that students in nonscience disciplines (especially Arts) should NOT be given a chance to apply to med school. This is not because I hate the arts, but you CANNOT compare a student who has taken four years of full-on science training to a student who has only taken the science courses required to gain admittance to med school. We need to streamline students into medicine and health from undergrad.

Getting back to my original point, universities will restrict seats if you're out of province, if the medical school doesn't acknowledge your home university's grading system, or if you've completed a PhD prior to applying. What this basically means is that you're forced to fit a predetermined entrance "model". On top of this, each medical school has its own board that decides who should get in (which, needless to say, can be full of bias). This system can keep a very capable and ambitious premed student out of a very good medical school simply because he/she doesn't live close enough.

This needs to be changed. The Canadian government already administers the MCAT; I suggest that it take over the admission system too. Basically, all your marks (which the government already has), along with your MCAT scores, would be attached to your medical school application, and one government board would decide your entrance into medical school. The student would also name their top three universities. This way, if a student in Victoria wanted to study medicine only at McGill, the Univ. of Toronto, or Queen's, he/she would have an equal chance with a student already living and studying in Ontario. Plus, different universities currently have different admission requirements: some put more weight on marks over volunteer work or the MCAT, while others do something different. Having one government body evaluate med admissions would also standardize the requirements, which would just be a bonus.

I also realize that this might be a lot of work for that committee, but consider this: medical schools pretty much take a year just to let students know whether they're admitted or not. I'm sure this committee, being responsible for only one task each year, could easily finish it in the same time frame.
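To make the idea a little more concrete, here's a toy sketch in Python of how a single central board might do a first-pass allocation: score every applicant with one standardized formula, then walk down the ranking and place each student at the highest of their three choices that still has a seat. To be clear, the weights, school capacities, and applicant names here are all made up for illustration; this is just a sketch of the mechanism, not a real admission formula.

```python
# Toy sketch of a centralized, score-based med school match.
# All weights, seat counts, and applicants below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Applicant:
    name: str
    gpa: float          # undergrad marks, on a 4.0 scale
    mcat: float         # MCAT score, normalized to 0-1 here
    volunteer: float    # volunteer/extracurricular score, 0-1
    choices: list[str] = field(default_factory=list)  # top 3 schools, in order

def score(a: Applicant) -> float:
    # One standardized formula applied to everyone (weights are made up).
    return 0.5 * (a.gpa / 4.0) + 0.35 * a.mcat + 0.15 * a.volunteer

def allocate(applicants: list[Applicant], seats: dict[str, int]) -> dict[str, str]:
    """Greedy first pass: best-scoring applicants are placed first,
    each at the highest-ranked choice that still has a seat left."""
    placements: dict[str, str] = {}
    remaining = dict(seats)
    for a in sorted(applicants, key=score, reverse=True):
        for school in a.choices:
            if remaining.get(school, 0) > 0:
                remaining[school] -= 1
                placements[a.name] = school
                break
    return placements

seats = {"McGill": 1, "Toronto": 1, "Queen's": 1}   # tiny example capacities
applicants = [
    Applicant("A (Victoria)", 3.9, 0.92, 0.7, ["McGill", "Toronto", "Queen's"]),
    Applicant("B (Ontario)",  3.7, 0.88, 0.9, ["Toronto", "McGill", "Queen's"]),
    Applicant("C (Alberta)",  3.8, 0.80, 0.6, ["Toronto", "Queen's", "McGill"]),
]
print(allocate(applicants, seats))
# Note: where an applicant lives plays no role in the score or the placement.
```

A real board would presumably want something closer to the stable-matching algorithm already used for residency placements rather than this greedy pass, but even the toy version makes the point: one formula, one queue, and province of residence never enters the calculation.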

Anyways, just my thoughts. I may continue this topic again sometime in the future.

Tuesday, August 30, 2011

And so I've returned

So it seems it's been quite a while since my last post. It's somewhat understandable why I wasn't able to post sooner, with deadlines coming up, such as the paper I am submitting, my committee meeting, my comp exam, and the various other experiments that need to be done as I transition my project into a different field, but I still feel a little bad that I wasn't able to keep this blog updated. No more: I shall try my best to keep this going, even though it's likely that no one really reads this. But hey, writing improves happiness (at least for me), which improves neurogenesis in the brain (I'm sure there is a reference for that somewhere; I believe it falls under the finding that environmental enrichment can increase neurogenesis in rodent brains, which incidentally is the field my project is transitioning into), and increasing neurogenesis will hopefully lead to an increase in productivity and output. That's the hope at least. This whole idea is thanks to a TED talk by Shawn Achor linked to me by my good friend Steve Kim. I've attached the link if anyone wants to watch; it's actually a very good short talk! http://www.youtube.com/watch?v=GXy__kBVq1M

It's really interesting how a research project develops over time. I originally started looking at a mouse model of multiple sclerosis, which, without going into too much detail, didn't really pan out. And this is one of the problems plaguing research using animal models. First off, animal models are just that: a system manipulated to simulate certain pathological mechanisms of a human disease so that we can study it and uncover more mechanisms or ways to "treat" the model disease. By no means is the animal model even remotely equivalent to the human disease; in fact, it could be very far from it. But most of the time we restrict ourselves to the idea that we're asking a specific question, so the model is the closest thing we have to test, which is reasonable, I guess. This always reminds me of a comment from my colleague Wulin Teo at the Univ. of Calgary, who said he almost wishes we could study the actual human disease: just dissect out the human spinal cord and take a look at what pathologies are really going on (we were talking about MS in this instance). All ethical issues aside, I kind of agree; without studying afflicted humans, how can we hope to treat or even cure the human disease? Of course, many doctors (and people in general) would be appalled by this comment, and by no means am I saying that we should go out and start experimenting on patients, but I feel there is a need for more directed and focused experimentation on human samples. Clinical research needs to merge more with biomedical research to have any real input into translational medicine.

I personally feel clinical research needs this influx from biomedical research, and, with no offense intended, I think clinical research as it is right now is fickle at best. It needs more stringent research guidelines, like the ones found in biomedical research. Scientists who have published papers, or are in the process of doing so, know what I am talking about. For example, to validate a point or finding for publication, you must show that your hypothesis stands up across multiple different experiments, such as in vitro and in vivo experiments. In clinical research (as far as I know and from what I've seen), doing one experiment on one or a few patient samples is enough to publish something. Now how is that possible? The patients are likely very different genetically and may have different environmental factors that impact their specific illness, and what about the location of the sample, or what drugs the patients were on when the biopsy was taken? Essentially, there isn't a consistent and controlled foundation from which to validate your hypothesis in clinical research (though I understand there are limitations when it comes to getting your hands on human samples).

On that note, I'd like to acknowledge a temporary but very intelligent supervisor I've had the privilege of working for: Dr. Peter Rieckmann. His philosophy was to coordinate and facilitate communication and idea sharing between medical doctors and graduate researchers/postdocs. I really enjoyed working with my medical colleagues and fellow researchers as a team in our biweekly lab meetings, where everyone presented what they had managed to complete during that time. It was very helpful when, for example, you'd notice an experimental phenotype and one of the medical doctors would chime in and say that what you saw looked very similar to what they noticed in patients, and that as the disease progresses they also see these other features. This not only provided feedback from a clinical angle, but also allowed you to think "outside the box" when it comes to research.

I think it was Albert Einstein who said "imagination is more important than knowledge". The ability to troubleshoot a problem with creativity and imagination is a real asset to researchers, yet it is unfortunately one of the traits in danger of being lost when going through graduate school.