My Dominant Hemisphere

The Official Weblog of 'The Basilic Insula'

Archive for the ‘My Best Posts’ Category

Meeting Ghosts In The Chase For Reality


Sunrise (via faxpilot @ Flickr CC BY-NC-ND license)

Watching the morning sun beaming through the clouds during today’s morning jog, I was struck by an epiphany. What ultimately transpired was a stream of thoughts that left me with an overwhelming sense of awe and humility at its profound implications.

Perhaps the rejuvenating air, the moist earth from the previous night’s rains and the scent of the fresh Golden Flamboyant trees lining my path made the sun’s splendor all the more striking. It was like a photograph coming to life, where objects elsewhere in the scene enhance the main subject’s impact.

As I gazed in its direction wondering about the sunspots that neither I nor anyone else around me could see (but that I knew were really there, from reading the work of astronomers), I began thinking about my own positional coordinates. So this was the East, I found. But how did I know that? Well as you might have guessed, from the age old phrase: “the sun rises in the East and sets in the West”. Known in Urdu as “سورج مشرق میں نکلتا ہے اور مغرب میں ڈوبتا ہے ” or in Hindi, “सूरज पूरव में निकलता है और पश्चिम में डूबता है” and indeed to be found in many other languages, we observe that man has come to form an interesting model to wrap his mind around this majestic phenomenon. Indeed, many religious scriptures and books of wisdom, from ancient history to the very present, find use of this phrase in their deep moral teachings.

But we’ve come to think that this model is not really “correct”, haven’t we? We’ve come to develop this thinking with the benefit of hindsight (a relative term, given Einstein’s famous theory, by the way. One man’s hindsight could actually be another man’s foresight!). We’ve ventured beyond our usual abode and looked at our planet from a different vantage point – that of Space, from the Moon and from satellites. The sun doesn’t actually rise or set. That experience occurs because of our peculiar vantage point, as relatively slow or immobile creatures grounded here on Earth. One could say that it is an interesting illusion. Indeed, you could sit on a plane and, with the appropriate speed, chase that sliver of sunlight as Sol (as the sun is lovingly called by scientists) appears or disappears on the horizon, never letting it vanish from view, essentially indefinitely.

Notes In The Margin About Language

Coming back, for a moment, to this amusing English phrase that helped me gauge my position, I thought about how language itself can shape one’s thinking. A subject matter upon which I’ve reflected before. There really comes a point when our models of the world and the universe get locked within the phraseology of a language that can actually reach the limits of its power of expression fairly unexpectedly. Speak in English and your view is different from somebody who can speak in Math. Even within Math, the coming about of algebra expanded the language’s power of expression incredibly from its meager beginnings. New models get incorporated into the lexicon of a language and because we tend to feed off of such phrases to make sense of ourselves and our universe, there is the potential for an inertia to develop, whereby it becomes easy to stay put with our abstractions of reality and not move on to radically new ones – models that are beyond the power of expression of a language and that haven’t yet been captured in its lexicon. In a way we find that models influence languages and languages themselves influence models and ultimately there is this interesting potential for a peculiar steady state to be reached – which may or may not be such a good thing.

So when it comes to this phenomenon, we’ve moved from one model to another. We began with “primitive” maxims, perhaps during a time when people used to think of the Earth as flat and of stars as pin-point objects too. We then progressed to geocentrism and later heliocentrism, both formulated from careful and detailed observations of the sky – first with the naked eye and later with telescopes – long before the luxury of satellites and space travel came into being. And now that we see the Earth from this improved vantage point – that of Space – our model for understanding reality has been refined. Actually, it has shifted in profound ways.

So what does this all mean? It looks like reality is one thing, something that exists out there. And we as humans make sense of reality through abstractions or models. How accurate we are with our abstractions really depends on how much information we’ve been able to gather. New information (gathered through ever more detailed experiments or observations and indeed, as Godel and Poincare showed, sometimes by pure thought alone) drives us to alter our existing models. Sometimes in radically different ways (a classic example is our model of matter: one moment a particle, the next a wave). There is a continuous flux in how we make sense of the cosmos, and it will likely go on this way until the day mankind has been fully informed – which may never really happen if pondered upon objectively. There have been moments in the past where man has thought that this precipice had finally been reached, that he was at last fully informed, only to realize with utter embarrassment that this was not the case. Can man ever know, by himself, that he has finally reached such a point? Especially given that this is like a student judging his performance at an exam without the benefit of an independent evaluator? The truth is that we may never know. Whether we think we will ever reach such a precipice really does depend on a leap of faith. And scientists and explorers who would like to make progress depend on this faith – that either such a precipice will one day be reached or at least that their next observation or experiment will add to their information on the path to such a glorious point. When, at last, a gestalt vision of all of reality can be attained. It’s hard to stay motivated otherwise, you see. And you thought you heard that faith had nothing to do with science or vice versa!

It is indeed quite remarkable the extent to which we get stuck in this or that model and keep fooling ourselves about reality. No sooner do we realize that we’ve been had and move on from our old abstraction to a new one (one that we think is much better) than we are struck with another blow. This actually reminds me of a favorite quote by a stalwart of modern Medicine:

And not only are the reactions themselves variable, but we, the doctors, are so fallible, ever beset with the common and fatal facility of reaching conclusions from superficial observations, and constantly misled by the ease with which our minds fall into the rut of one or two experiences.

William Osler in Counsels and Ideals

The World According To Anaximander (c. 610-546 BCE)

The phenomenon is really quite pervasive. The early cartographers who divided the world into various regions believed some funny things by today’s standards. But you’ve got to understand that that’s how our forefathers modeled reality! And whether you like it or not, someday, many generations after our time, we will be looked upon with similar eyes.

Watching two interesting Royal Society lectures by Paul Nurse (The Great Ideas of Biology) and Eric Lander (Beyond The Human Genome Project: Medicine In The 21st Century) the other day, this thought kept coming back to me. Speaking about the advent of Genomic Medicine, Eric Lander (who trained as a mathematician, by the way) talked about the discovery of the EGFR gene and the realization that its mutations strongly increase the risk for a type of lung cancer called adenocarcinoma. He mentioned how clinical trials of the drug Iressa – a drug whose mechanism of action scientists weren’t yet sure of but which was nevertheless proposed as a viable option for lung adenocarcinomas – failed to show statistically significant differences from standard therapy. Well, that was because the trial’s subjects were members of the broad population of all lung adenocarcinoma cases. Many doctors, realizing the lack of conclusive evidence of a greater benefit, felt no reason to choose Iressa over standard therapy and drastically shift their practice. Which is what evidence-based medical practice would have led them to do, really. But soon after the discovery of the EGFR gene, scientists decided to do a subgroup analysis using patients with EGFR mutations, and it was rapidly learned that Iressa did have a statistically significant effect in decreasing tumor progression and improving survival in this particular subgroup. A significant subset of patients could now have hope for a cure! And doctors suddenly began to prescribe Iressa as the therapy of choice for them.
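To make the statistical point concrete, here is a minimal sketch in Python (my own illustration with made-up counts, not the actual Iressa trial data): a real effect confined to a biomarker-positive subgroup can wash out when the whole trial population is pooled, yet show up clearly once that subgroup is analyzed on its own.

    import math

    def two_proportion_z(responders_a, n_a, responders_b, n_b):
        """Normal-approximation z-test for a difference in response rates."""
        p_a, p_b = responders_a / n_a, responders_b / n_b
        p_pool = (responders_a + responders_b) / (n_a + n_b)
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        return (p_a - p_b) / se

    # Hypothetical counts: 200 patients per arm, 40 of whom carry the mutation.
    # Drug arm   : 24/40 mutation carriers respond, 48/160 of the rest respond.
    # Control arm: 12/40 mutation carriers respond, 48/160 of the rest respond.
    drug_all, ctrl_all = 24 + 48, 12 + 48

    print("whole population  z =", round(two_proportion_z(drug_all, 200, ctrl_all, 200), 2))  # about 1.3, below the 1.96 cutoff
    print("mutation subgroup z =", round(two_proportion_z(24, 40, 12, 40), 2))                # about 2.7, significant at the 5% level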

As I was thinking about what Lander had said, I remembered that Probability Theory as a science, which forms the bedrock of such things as clinical trials and indeed many other scientific studies, had not even begun to develop until after the Middle Ages. At least, so far as we know. And modern probability theory really began much later, in the early 1900s.

Front page of "The Doctrine of Chances: or, a Method of Calculating the Probability of Events in Play" by Abraham de Moivre (London, 1718), the first textbook on Probability Theory

You begin to realize what a quantum leap this was in our history. We now think of patterns and randomness very differently from ancient times. Which is pretty significant, given that for some reason our minds are drawn to looking for patterns even where there might not be any. Over the years, we’ve developed the understanding that clusters (patterns) of events or cases could occur in a random system just as in a non-random one. Indeed, such clusters (patterns) would be a fundamental defining characteristic of a random process. An absence of clusters would indicate that a process wasn’t truly random. Whether such clusters (patterns) would fit with a random process as opposed to a non-random one would depend on whether or not we find an even greater pattern in how these clusters are distributed. A cluster of cases (such as an epidemic of cholera) would be considered non-random if, by hypothesis testing, we found that the probability of such a cluster coming about by random chance was so small as to be negligible. And even when thinking about randomness, we’ve learned to ask ourselves if a random process could be pseudo-random as opposed to truly random – which can sometimes be a difficult thing to establish. So unlike our forefathers, we don’t immediately jump to conclusions about what look to our eyes like patterns. It’s all quite marvelous to think about, really. What’s even more fascinating is that Probability Theory is in a state of flux and continues to evolve to this day, as mathematicians gather new information. So what does this mean for the validity of our models that depend on Probability Theory? If a model could be thought of as a chain, it is obvious that such a model would only be as strong as the links of which it is made! And so statisticians keep finding errors in how old epidemiologic studies were conducted and interpreted. And the science of Epidemiology itself improves as Probability Theory is continuously polished. This goes to show that the validity of our abstractions keeps shifting as the foundations upon which they are based themselves continue to transform. A truly intriguing idea when one thinks about it.
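A tiny Python sketch makes the cluster point tangible (my own illustration, not something from the lectures): scatter cases at random over the weeks of a year and an apparent “cluster” shows up surprisingly often by chance alone.

    import random
    from collections import Counter

    random.seed(1)

    WEEKS, CASES, TRIALS = 52, 52, 10_000

    def busiest_week():
        """Drop CASES cases uniformly at random into WEEKS weeks; return the largest weekly count."""
        counts = Counter(random.randrange(WEEKS) for _ in range(CASES))
        return max(counts.values())

    print("one random year: busiest week had", busiest_week(), "cases")  # typically 3 or 4, not 1

    # How often does pure chance hand us a "cluster" of 4 or more cases in a single week?
    hits = sum(busiest_week() >= 4 for _ in range(TRIALS))
    print("P(some week gets >= 4 cases by chance alone) is roughly", hits / TRIALS)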

Some other examples of the shifting of abstractions with the gathering of new information come to mind.

An image from Andreas Vesalius's "De Humani Corporis Fabrica" (1543), page 190

Like early cartographers, anatomists never really understood human anatomy very well back in the days of cutting open animals and extrapolating their findings to humans. There were these weird ideas that diseases were caused by a disturbance in the four humors. And then Vesalius came along and, by stressing the importance of dissecting cadavers, revolutionized how anatomy came to be understood and taught. But even then, our models for the human body were until recently plagued by ideas such as the concept that the seat of the soul lay in the pineal gland, and some of the other stuff now popularly characterized as folk medicine. In our models for disease causation, we’ve progressed over the years from looking at purely environmental factors to purely DNA factors and now to a multifactorial model that stresses the idea that many diseases are caused by a mix of the two.

The Monty Hall paradox, about which I’ve written before, is another good example. You’re presented with new information midway through the game and you use this new information to re-adjust the old model of reality that you had in your mind. The use of decision trees in genetic counseling is yet another example. Given new information about a patient’s relatives and their genotype, your model for what is real improves in accuracy. You become better at diagnosis with each bit of new information.
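For readers who would like to see that re-adjustment happen in code, here is a quick Python simulation of the standard game (a sketch of my own, not taken from the earlier post): sticking with your first pick wins about one time in three, while switching after the host’s reveal wins about two times in three.

    import random

    def monty_hall(switch, trials=100_000):
        wins = 0
        for _ in range(trials):
            car = random.randrange(3)                 # door hiding the car
            pick = random.randrange(3)                # contestant's first choice
            # Host opens a door that is neither the pick nor the car.
            opened = next(d for d in range(3) if d != pick and d != car)
            if switch:
                pick = next(d for d in range(3) if d != pick and d != opened)
            wins += (pick == car)
        return wins / trials

    print("stay  :", monty_hall(switch=False))   # about 1/3
    print("switch:", monty_hall(switch=True))    # about 2/3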

The phenomenon can often be found in how people understand Scripture too. The mathematician Gary Miller has an interesting article that describes how some scholars examining the word Iram have gradually transformed their thinking based on new information gathered by archeological excavations.

So we see how abstractions play a fundamental role in our perceptions of reality.

One other peculiar thing to note is that sometimes, as we try to re-shape our abstractions to agree better with any new information we get, there is the tendency to stick with the old as much as possible. A nick here or a nudge there is acceptable, but at heart we are usually loath to discard our old model entirely. There is a potential danger in this, because it could be that we inherit flaws from our old model without even realizing it, thus constraining the new one in ways yet to be understood. Especially when we are unaware of what these flaws could be. A good example of abstractions feeding off of each other is the pairing of the space-time fabric of relativity theory with the jitteriness of quantum mechanics. In our quest for a new model – a unified theory or abstraction – we are trying to mash these two abstractions together in curious ways, such that a serene space-time fabric exists when zoomed out, but when zoomed in we should expect to see it behave erratically, with jitters all over the place. Our manner of dealing with such inertia when it comes to building new abstractions is basically to see if these mash-ups agree with experiments or observations much better than our old models did. Which is an interesting way to go about doing things and could be something to think about.

Making Sense Of Reality Through The Looking Glass (via Jose @ Flickr, CC BY-SA-NC license)

Listening to Paul Nurse’s lecture, I also learned how Mendel chose pea plants for his studies on inheritance, rather than more complicated vegetation, because of the simplicity and clarity with which one could distinguish their phenotypes, making the experiment much easier to carry out. Depending on how one crossed them, one could trace the inheritance of traits – color of fruit, height of plant, etc. – very quickly and very accurately. It actually reminded me of something I learned a long time ago about the various kinds of data in statistics: that these data could be categorized into various types based on the amount of information they contain. The highest amount of information is seen in Ratio data. The lowest is seen in Nominal data. The implication of this is that the more your experiment or scientific study uses Ratio data rather than Nominal data, the more accurate will your inferences about reality be. The more information you throw out, the weaker will your model be. So we see that there is quite an important caveat when we build abstractions by keeping things simple and stripping away intricacy – when we are stuck having to use an ape’s thumb on a fine instrument. It’s primitive, but it often gets us ahead in understanding reality much faster. The cost we pay, though, is that our abstraction agrees better with a simpler and more artificial version of the reality that we seek to understand. And reality usually is quite complex. So when we limit ourselves to examining a bunch of variables in, say, the clinical trial of a drug, and find that it has a treatment benefit, we can be a lot more certain that this would be the case in the real world too provided that we prescribe the drug to a patient pool as similar as possible to the one in our experiment. Which rarely happens, as you might have guessed! And that’s why you find so many cases of treatment failure and unpredictable disease outcomes. How the validity of an abstraction is influenced by the KISS (“Keep It Simple, Stupid”) principle is something to think about. Epidemiologists get sleepless nights when pondering over it sometimes. And a lot of time is spent in trying to eliminate selection bias (i.e. when errors of inference creep in because the pool of patients in the study doesn’t match, to an acceptable degree, the kinds of patients doctors would interact with out in the real world). The goal is to make an abstraction agree with as much of reality as possible, but in doing so not to make it so far removed from the KISS principle that carrying out the experiment would be impractical or impossible. It’s such a delicate and fuzzy balance!
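A small Python sketch of that nominal-versus-ratio point, with made-up blood-pressure readings (my own example, not from the lectures): once ratio data are collapsed into categories, two clearly different groups can become indistinguishable, because the magnitude information has been thrown away.

    groups = {
        "mild":   [141, 142, 144, 145, 147],   # systolic BP in mmHg (ratio data)
        "severe": [165, 172, 178, 181, 190],
    }
    CUTOFF = 140                               # collapse to nominal data: hypertensive yes/no

    for name, values in groups.items():
        mean = sum(values) / len(values)
        hypertensive = sum(v >= CUTOFF for v in values)
        print(f"{name:6s} mean = {mean:5.1f} mmHg, hypertensive = {hypertensive}/{len(values)}")

    # Both groups come out 5/5 "hypertensive" once dichotomized, even though their
    # means differ by more than 30 mmHg; the nominal view cannot tell them apart.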

So again and again we find that abstractions define our experiences. Some people get so immersed in and attached to their models of reality that they make them their lifeblood, refusing to move on. And some people actually wonder if life as we know it is itself an abstraction :-D! I was struck by this when I came upon the idea of the Holographic principle in physics – that in reality we and our universe are bound by an enveloping surface and that our real existence is on this plane. That what we see, touch or smell in our common experience is simply a projection of what is actually happening on that surface. That these everyday experiences are essentially holograms :-D! Talk about getting wild, eh :-D?!

The thought that I ultimately came away with at the end of my jog was that of maintaining humility in knowledge. For those of us in science, we find that it is very common for arrogance to creep in, when the fact is that there is so much about reality that we don’t know anything about, and that our abstractions may never agree with it to full accuracy, ever! When pondered upon deeply, this is a very profound and humbling thing to realize.

Even the arrogance in Newton melted away for a moment when he proclaimed:

If I have seen a little further it is by standing on the shoulders of Giants.

Isaac Newton in a letter to rival Robert Hooke

Here’s to Isaac Newton for that spark of humility, even if it was rather fleeting :-). I’m guessing there must have been times when he might have had stray thoughts of cursing at himself for having said that :-)! Oh well, that’s how they all are …


Copyright Firas MR. All Rights Reserved.

“A mote of dust, suspended in a sunbeam.”




Written by Firas MR

November 16, 2010 at 12:18 am

Seeking Profundity In The Mundane


Seeking A New Vision (via Jared Rodriguez/Truthout CC BY-NC-SA license)

The astronomer Carl Sagan once said:

It has been said that astronomy is a humbling and character-building experience. There is perhaps no better demonstration of the folly of human conceits than this distant image of our tiny world. To me, it underscores our responsibility to deal more kindly with one another, and to preserve and cherish the pale blue dot, the only home we’ve ever known.

– in the Pale Blue Dot

And likewise Frank Borman, astronaut and Commander of Apollo 8, the first mission to fly around the Moon, said:

When you’re finally up on the moon, looking back at the earth, all these differences and nationalistic traits are pretty well going to blend and you’re going to get a concept that maybe this is really one world and why the hell can’t we learn to live together like decent people?

Why is it, I wonder, that we, the human race, have the tendency to reach such profound truths only when placed in an extraordinary environment? Do we have to train and become astronomers or cosmonauts to appreciate our place in the universe? To find respect for and to cherish what we’ve been bestowed with? To care about each other, our environment and this place that we are loath to remember is the one home for all of life as we know it?

There is much to be learned by reflecting upon this idea. Our capacity to gain wisdom and feel impressed really does depend on the level to which our experiences deviate from the banal, doesn’t it? Ask what a grain of food means to somebody who has never had the luxury of a mediocre middle-class life. Ask a lost child what it must be like to have finally found his mother. Or question the rejoicing farmer who has just felt rain-drops on his cheeks, bringing hope after a painful drought.

I’m sure you can think of other examples that speak volumes about the way we, consciously or not, program ourselves to look at things.

The other day, I was just re-reading an old article about the work of the biomathematician Steven Strogatz. He mentioned how, as a high-school student studying science, he was asked to drop down on his knees and measure the dimensions of floors, graph the time periods of pendulums and figure out the speed of sound from resonating air columns in hollow tubes partly filled with water, etc. Each time, the initial reaction was one of dreariness and insipidity. But he would then soon realize how these mundane experiments in reality acted as windows to profound discoveries – such as the idea that resonance is something without which atoms wouldn’t come together to form material objects, or how a pendulum’s time period, when graphed, reflects a specific mathematical equation.
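The article doesn’t spell out which equation Strogatz had in mind, but the textbook small-angle result for a simple pendulum, T = 2π√(L/g), is the sort of thing those graphs reveal: plot T² against L and you get a straight line of slope 4π²/g. A few lines of Python show the pattern (my own illustration):

    import math

    g = 9.81                                     # m/s^2
    for L in (0.25, 0.50, 1.00, 2.00):           # pendulum lengths in metres
        T = 2 * math.pi * math.sqrt(L / g)       # small-angle period in seconds
        print(f"L = {L:4.2f} m   T = {T:4.2f} s   T^2/L = {T**2 / L:5.2f}")
    # T^2/L comes out the same (about 4.02) for every length: it is 4*pi^2/g.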

There he was – peering into the abstruse and finding elegance in the mundane. The phenomenon reminded me of a favorite quote:

The real voyage of discovery consists not in seeking new landscapes, but in having new eyes.

Marcel Proust

For that’s what Strogatz, like Sagan and Borman, was essentially experiencing. A new vision about things. But with an important difference – he was doing it by looking at the ordinary. Not by gazing at extra-ordinary galaxies and stars through a telescope. Commonplace stuff that, when examined closely, suddenly was ordinary no more. Something that had just as much potential to change man’s perspective of himself and his place in the universe.

I think it’s important to realize this. The universe doesn’t just exist out there among the celestial bodies that lie beyond normal reach. It exists everywhere. Here, on this earth. Within yourself and your environment and much closer to home.

Perhaps, that’s why we’ve made much scientific progress by this kind of exploration. By looking at ordinary stuff using ordinary means. But with extra-ordinary vision. And successful scientists have proven again and again, the value of doing things this way.

The concept of hand-washing to prevent the spread of disease, for instance, wasn’t born out of a sophisticated randomized clinical trial, but out of a modest accounting of mortality rates in a much less developed epidemiologic study. The obstetrician who stumbled upon this profound discovery, long before Pasteur later postulated the germ theory of disease, was called Ignaz Semmelweis, later to be known as the “savior of mothers”. His new vision led to the discovery of something so radical that the medical community of his day rejected it, and his results were never seriously looked at during his lifetime (so much for peer-review, eh?). The doctor struggled with this till his last breath, suffering at an insane asylum and ultimately dying at the young age of 47.

That smoking is tied to lung cancer was first conclusively established by an important prospective cohort study that was largely done by mailing a series of questionnaires to smoking and non-smoking physicians over a period of time, asking how they were doing. Yes, even questionnaires, when used intelligently, can be more than just unremarkable pieces of paper; they can be gateways that open our eyes to our magnificent universe!

From the polymath and physician Copernicus’s seemingly pointless calculations on the positions of planets, to the dreary routine of looking at microbial growth in petri dishes by the physician Koch, to the physicist and polymath Young’s proposal of a working theory for color vision, to the physician John Snow’s phenomenal work on preventing cholera by studying water wells long before the microbe was even identified, time and time again we have learned about the enormous implications of science on the cheap. And science of the mundane. There’s wisdom in applying the KISS (Keep It Simple Stupid) principle to science after all! Even in the more advanced, technologically replete scientific studies.

More on the topic of finding extraordinary ideas in ordinary things: I was reminded recently of a couple of enchanting papers and lectures. One was about finding musical patterns in the sequence of our DNA. And the second was an old but interesting paper [1] that proposes a radical model for the biology of the cell and that seeks to reconcile the paradoxes that we observe in biological experiments. That there could be some deep logical underpinning to the maxim, “biology is a science of exceptions”, is really quite an exciting idea:

Surprise is a sign of failed expectations. Expectations are always derived from some basic assumptions. Therefore, any surprising or paradoxical data challenges either the logical chain leading from assumptions to a failed expectation or the very assumptions on which failed expectations are based. When surprises are sporadic, it is more likely that a particular logical chain is faulty, rather than basic assumptions. However, when surprises and paradoxes in experimental data become systematic and overwhelming, and remain unresolved for decades despite intense research efforts, it is time to reconsider basic assumptions.

One of the basic assumptions that make proteomics data appear surprising is the conventional deterministic image of the cell. The cell is commonly perceived and traditionally presented in textbooks and research publications as a pre-defined molecular system organized and functioning in accord with the mechanisms and programs perfected by billions of years of biological evolution, where every part has its role, structure, and localization, which are specified by the evolutionary design that researchers aim to crack by reverse engineering. When considered alone, surprising findings of proteomics studies are not, of course, convincing enough to challenge this image. What makes such a deterministic perception of the cell untenable today is the massive onslaught of paradoxical observations and surprising discoveries being generated with the help of advanced technologies in practically every specialized field of molecular and cell biology [12-17].

One of the aims of this article is to show that, when reconsidered within an alternative framework of new basic assumptions, virtually all recent surprising discoveries as well as old unresolved paradoxes fit together neatly, like pieces of a jigsaw puzzle, revealing a new image of the cell–and of biological organization in general–that is drastically different from the conventional one. Magically, what appears as paradoxical and surprising within the old image becomes natural and expected within the new one. Conceptually, the transition from the old image of biological organization to a new one resembles a gestalt switch in visual perception, meaning that the vast majority of existing data is not challenged or discarded but rather reinterpreted and rearranged into an alternative systemic perception of reality.

– (CC BY license)

Inveigled yet :-) ? Well then, go ahead and give it a look!

And as mentioned earlier in the post, one could extend this concept of seeking out phenomenal truths in everyday things to many other fields. As a photography buff, I can tell you that ordinary and boring objects can really start to get interesting when viewed up close and magnified. A traveler who takes the time to immerse himself in the communities he’s exploring, much like Xuan Zang or Wilfred Thesiger or Ibn Battuta, suddenly finds that what is to be learned is vast and all the more enjoyable.

The potential to find and learn things with this new way to envision our universe can be truly revolutionary. If you’re good at it, it soon becomes hard to ever get bored!

Footnotes:

  1. Kurakin, A. (2009). Scale-free flow of life: on the biology, economics, and physics of the cell. Theoretical Biology and Medical Modelling, 6(1), 6. doi:10.1186/1742-4682-6-6


Copyright Firas MR. All Rights Reserved.

“A mote of dust, suspended in a sunbeam.”




Written by Firas MR

November 13, 2010 at 10:48 am

Contrasts In Nerdity & What We Gain By Interdisciplinary Thinking


Where Do You Fit In This Paradigm? (via xkcd CC BY-NC license)

I’ve always been struck by how nerds can act differently in different fields.

An art nerd is very different from a tech nerd. Whereas the former could go on and on about brush strokes, lighting patterns, mixtures of paint, which drawing belongs to which artist, etc., the latter can engage in discussions ad infinitum about the architecture of the internet, how operating systems work, whose grip on Assembly is better, why their code works better, etc.

And what about math and physics nerds? They tend to show their feathers off by displaying their understanding of chaos theory, why imaginary numbers matter, and how we are all governed by “laws of nature”, etc.

How about physicians and med students? Well, like most biologists, they’ll compete with each other by showing off how much anatomy, physiology, biochemistry or drug properties they can remember, who’s up to date on the most recent clinical trial statistics (sort of like a fan of cricket/baseball statistics), and why their technique of proctoscopy is better than somebody else’s, the latest morbidity/mortality rates following a given procedure, etc.

And you could actually go on about nerds in other fields too – historians (who remembers what date or event), political analysts (who understands the Thai royal family better), farmers (who knows the latest in pesticides), etc.

Each type has its own traits that reflect the predominant mindset (at the highest of intellectual levels) when it comes to approaching their respective subject matter. And nerds, being who they are, can tend to take it all to their heads and think they’ve found that place — of ultimate truth, peace and solace. That they are, at last, “masters” of their subjects.

I’ve always found this phenomenon to be rather intriguing. Because in reality, things are rarely that simple – at least when it comes to “mastery”.

In medicine for instance, the nerdiest of nerds out there will be proud and rather content with the vast statistics, nomenclature, and learn-by-rote information that he has finally been able to contain within his head. Agreed, being able to keep such information at the tip of one’s tongue is an achievement, considering the bounds of average human memory. But what about the fact that he has no clue as to what fundamentally drives those statistics, why one drug works for a condition whereas another drug with the same properties (i.e. properties that medical science knows of) fails or has lower success rates, etc.? A physicist nerd would approach this matter as something that lies at the crux of an issue — so much so that he would get sleepless nights without being able to find some model or theory that explains it mathematically, in a way that seems logical. But a medical nerd? He’s very different. His geekiness just refuses to go there, because of the discomforting feeling that he has no idea whatsoever! More stats and names to learn by rote please, thank you!

I think one of the biggest lessons we learn from the really great stalwarts in human history is that, they refused to let such stuff get to their heads. The constant struggle to find and maintain humility in knowledge was central to how they saw themselves.

… I can live with doubt and uncertainty and not knowing. I think it’s much more interesting to live not knowing than to have answers which might be wrong. I have approximate answers and possible beliefs and different degrees of certainty about different things, but I’m not absolutely sure of anything and there are many things I don’t know anything about, such as whether it means anything to ask why we’re here, and what the question might mean. I might think about it a little bit and if I can’t figure it out, then I go on to something else, but I don’t have to know an answer, I don’t feel frightened by not knowing things, by being lost in a mysterious universe without having any purpose, which is the way it really is so far as I can tell. It doesn’t frighten me.

Richard Feynman speaking with Horizon, BBC (1981)

The scientist has a lot of experience with ignorance and doubt and uncertainty, and this experience is of great importance, I think. When a scientist doesn’t know the answer to a problem, he is ignorant. When he has a hunch as to what the result is, he is uncertain. And when he is pretty darn sure of what the result is going to be, he is in some doubt. We have found it of paramount importance that in order to progress we must recognize the ignorance and leave room for doubt. Scientific knowledge is a body of statements of varying degrees of certainty – some most unsure, some nearly sure, none absolutely certain.

Now, we scientists are used to this, and we take it for granted that it is perfectly consistent to be unsure – that it is possible to live and not know. But I don’t know whether everybody realizes that this is true. Our freedom to doubt was born of a struggle against authority in the early days of science. It was a very deep and very strong struggle. Permit us to question – to doubt, that’s all – not to be sure. And I think it is important that we do not forget the importance of this struggle and thus perhaps lose what we have gained.

What Do You Care What Other People Think?: Further Adventures of a Curious Character by Richard Feynman as told to Ralph Leighton

An Interdisciplinary Web of a Universe (via Clint Hamada @ Flickr; CC BY-NC-SA license)

Besides being an important aspect for high-school students to consider when deciding what career path to pursue, I think that these nerd-personality-traits also illustrate the role that interdisciplinary thinking can play in our lives and how it can add tremendous value in the way we think. The more one diversifies, the more his or her thinking expands — for the better, usually.

Just imagine a nerd who’s cool about art, physics, math or medicine, etc. — all put together, in varying degrees. What would his perspective of his subject matter and of himself be like? Would he make the ultimate translational research nerd? It’s not just the knowledge one could potentially piece together, but the mindset that one would begin to gradually develop. After all, we live in an enchanting web of a universe, where everything intersects everything!


Copyright Firas MR. All Rights Reserved.

“A mote of dust, suspended in a sunbeam.”




Written by Firas MR

November 12, 2010 at 12:00 am

How Our Languages Shape Our Thinking


Group of early 20th century Ceylon Moors (via Wikipedia)


Greetings, friends!

As you may have gathered, this is my first blog post in Hindi. I am trying to make use of all the languages I know on this blog.

A few days ago I was watching an excellent lecture titled “Urdu Politics In Hyderabad State”, that is, the politics of the Urdu language in Hyderabad State. By Hyderabad State I mean the period when it was a separate country run by the Nizam’s government (but under British oversight). The presenter was Kavita Datla, and she described how there was once a time when Hindi and Urdu were regarded as one and the same tongue. A time when it was understood that the only difference between the two lay in how they were written. How people who knew Urdu could also read and write Hindi, and likewise how those who spoke Hindi were also masters of Urdu. And how, when the British set foot on this remarkable subcontinent of South Asia, this was their assessment too, as we can find in the English books of that era. The library of the Royal Asiatic Society Of Calcutta is full of such books. Urdu and Hindi are so intertwined that the maturing of one depends on the progress of the other.

Later in the lecture came the question of when and how these two languages began to turn away from this unbreakable and beautiful relationship. Yes, it is true that even today there are a few people who do not discriminate much between the two and tie both equally to their identity, which is a very noble thing. But most people today believe that the difference between the two is religious in character, even though no religious distinction between them existed before. Ms. Datla traces this history: how we build our identities out of language, and how that identity keeps shifting over time for political reasons, without the common man even being aware of it.

The lecture also had very interesting things to say about the history of the universities that promoted Urdu in the world of learning and the distinguished scholars associated with them, about the changing relationship between Urdu and Hindi, and about the effect of all this on social psychology. How artificial roots of division were planted among people, and how, partly because of them, the people of a magnificent subcontinent had to endure a bloody Partition that ranks among the greatest bloodbaths in human history.

What I found most interesting about this lecture is that it offers an important lesson on language and its influence on social psychology and on the spirit. A lesson that teaches us about man’s smallness, and about the foolish and terrible potential within him that could destroy the human race by its own hand.

I have written before about computer programming. But I did not bring up a thought related to today’s topic at the time, even though there too we find a rich example of the interplay between language and mental disposition. Perhaps deep down I felt it would be better as a separate blog post. The fact is that a person who programs in a language like Python thinks and reasons differently from someone who programs in C. It is as if the boundaries of thought are tied directly to the language. Only the person who thinks in machine language truly knows, in the real sense, the inner workings of the computer, because he begins to think like the computer. Which path our thinking takes and what shape it assumes depends a great deal on the language in which we dress our thoughts.
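A tiny illustration of this point (kept entirely in Python, the language mentioned above, so it can only gesture at the contrast): the same sum written the way a Python programmer tends to think of it, as an operation on a whole collection, and the way a C programmer tends to think of it, as memory walked one cell at a time.

    numbers = [3, 1, 4, 1, 5, 9]

    pythonic_total = sum(numbers)        # think in terms of the whole collection at once

    c_style_total = 0
    i = 0
    while i < len(numbers):              # think index by index, the way C nudges you to
        c_style_total += numbers[i]
        i += 1

    assert pythonic_total == c_style_total == 23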

I watched another excellent lecture related to the history of Urdu, full of interviews with several respected Urdu scholars of the city of Delhi. This conversation with the writers is available under the title “Delhi’s Mother Tongue: The Story Of Urdu”, directed by Mr. Varun. Describing the history of the language, the scholars say that Hindi and Urdu are historically one and the same manner of speech. In their view, on the timeline Urdu was born before Hindi, at the time when the Sultans arrived in this region. The Sultans’ army could not understand the language of the people, which in that era was Braj Bhasha. And they wanted (perhaps for military advantage) to create, for building ties with the populace, a language that would be a unique blend of their own Turkic tongue and the local speech. And thus the Urdu language came into the world for the first time. In the beginning the emphasis of this language was simply on easing everyday speech, and reading and writing came later. Only when reading and writing arrived did people adopt various forms of script, two of which are the ones we now recognize as the Urdu script and the Hindi script. Gradually, the popularity of both languages grew, and the progress of one kept influencing the other. So much so that the same process continues even today!

Talk of scripts reminds me of something else. Did you know that the people of Sri Lanka who identify themselves as Ceylon Moors once adopted the Arabic language? And the fun part is that while the speech was indeed Arabic, the script was Tamil! That is, they used to write spoken Arabic in the Tamil script. Over time their connection with Arabic weakened, and they left Arabic behind and moved entirely to the Tamil tongue. Interesting, isn’t it?

I hope all of you enjoyed today’s blog post. That’s all for today. See you next time!



Copyright Firas MR. All Rights Reserved.

“A mote of dust, suspended in a sunbeam.”




Written by Firas MR

November 9, 2010 at 11:39 pm

Revitalizing Science Education


[Video]
Richard Feynman: “… But you’ve gotta stop and think about it. About the complexity to really get the pleasure. And it’s all really there … the inconceivable nature of nature! …”

And when I read Feynman’s description of a rose — in which he explained how he could experience the fragrance and beauty of the flower as fully as anyone, but how his knowledge of physics enriched the experience enormously because he could also take in the wonder and magnificence of the underlying molecular, atomic, and subatomic processes — I was hooked for good. I wanted what Feynman described: to assess life and to experience the universe on all possible levels, not just those that happened to be accessible to our frail human senses. The search for the deepest understanding of the cosmos became my lifeblood [...] Progress can be slow. Promising ideas, more often than not, lead nowhere. That’s the nature of scientific research. Yet, even during periods of minimal progress, I’ve found that the effort spent puzzling and calculating has only made me feel a closer connection to the cosmos. I’ve found that you can come to know the universe not only by resolving its mysteries, but also by immersing yourself within them. Answers are great. Answers confirmed by experiment are greater still. But even answers that are ultimately proven wrong represent the result of a deep engagement with the cosmos — an engagement that sheds intense illumination on the questions, and hence on the universe itself. Even when the rock associated with a particular scientific exploration happens to roll back to square one, we nevertheless learn something and our experience of the cosmos is enriched.

Brian Greene, in The Fabric of The Cosmos

When people think of “science education”, they usually tend to think about it in the context of high school or college, when in reality it should be thought of as encompassing education life-long; for, if analyzed deeply, we all realize that we never cease to educate ourselves no matter what our trade. Because we understand that what life demands of us is the capacity to function efficiently in a complex society. As we gain or lose knowledge, our capacities keep fluctuating, and we always desire, and often strive, for them to stay right at the top of that graph.

When it comes to shaping attitudes towards science, which is what I’m concerned about in this post, I’ve noticed that this begins quite strongly during high school, but as students get to college and then university, it gradually begins to fade away, even in some of the more scientific career paths. By then I guess, some of these things are assumed (at times you could say, wrongfully). We aren’t reminded of it as frequently and it melts into the background as we begin coping with the vagaries of grad life. By the time we are out of university, for a lot of us, the home projects, high-school science fests, etc. that we did in the past as a means to understand scientific attitude, ultimately become a fuzzy, distant dream.

I’ve observed this phenomenon as a student in my own life. As med students, we are seldom reminded by professors of what it is that constitutes scientific endeavor or ethic. Can you recall the last time you had a didactic discussion on the topic?

I came to realize this vacuum early on in med school. And a lot of times this status quo doesn’t serve us well. Take Evidence-Based-Medicine (EBM) for example. One of the reasons why people make errors in interpreting and applying EBM, in my humble opinion, is precisely the naivete that such a vacuum allows to fester. What ultimately happens is that students remain weak in EBM principles, go on to become professors who cannot teach EBM to the extent that they ought to, and a vicious cycle ensues whereby the full impact of man’s progress in Medicine is never realized. And the same applies to how individuals, departments and institutions implement auditing, quality-assurance, etc. as well.

A random post that I recently came across in the blogosphere touched upon the interesting idea that when you really think about it, most practicing physicians are ultimately technicians whose job it is to fix and maintain patients (like how a mechanic oils and fixes cars). The writer starts out with a provocative beginning,


Is There A Doctor In The House?


[...]

Medical doctors often like to characterize themselves as scientists, and many others in the public are happy to join them in this.

I submit, however, that such a characterization is an error.

[...]

and divides science professionals into,

[...]

SCIENTIST: One whose inquiries are directed toward the discovery of new facts.

ENGINEER: One whose inquiries are directed toward the new applications of established facts.

TECHNICIAN: One whose inquiries are directed toward the maintenance of established facts.

[...]

and then segues into why even if that’s the case, being a technician in the end has profound value.

Regardless of where you find yourself in that spectrum within this paradigm, I think it’s obvious that gaining skills in one area helps you perform better in others. So as technicians, I’m sure that practicing physicians will find that their appraisal and implementation of EBM will improve if they delve into how discoverers work and learn about the pitfalls of their trade. The same could be said of learning about how inventors translate this knowledge from the bench to the bedside as new therapies, etc. are developed, and the caveats involved in the process.

Yet it is precisely in these aspects that I find that medical education requires urgent reform. Somehow, as if by magic, we are expected to do the work of a technician and to get a grip on EBM practices without a solid foundation for how discoverers and inventors work.

I think it’s about time that we re-kindled the spirit of understanding scientific attitude at our higher educational institutions and in our lives (for those of us who are already out of university).

From self-study and introspection, here are some points and questions that I’ve made a note of so far, as I strive to re-invigorate the scientific spirit within me, in my own way. As you reflect on them, I hope that they are useful to you in working to become a better science professional as well:

  1. Understand the three types of science professionals and their roles. Ask where in the spectrum you lie. What can you learn about the work professionals in the other categories do to improve how you yourself function?
  2. Learning about how discoverers work helps us get an idea of the pitfalls of science. Ultimately, questions are far more profound than the answers we keep coming up with. Do we actually know the answer to a question? Or is it more correct to say that we think we know the answer? What we think we know changes all the time. And this is perfectly acceptable, as long as you’re engaged as a discoverer.
  3. What are the caveats of using language such as the phrase “laws of nature”? Are they “laws”, really? Or abstractions of even deeper rules and/or non-rules that we cannot yet touch?
  4. Doesn’t the language we use influence how we think?
  5. Will we ever know if we have finally moved beyond abstractions to deeper rules and/or non-rules? Abstractions keep shifting, sometimes in diametrically opposite directions (e.g. from Newton’s concepts of absolute space-time to Einstein’s concepts of relative space-time, the quirky and nutty ideas of quantum mechanics such as the dual nature of matter and the uncertainty principle, and concepts of disease causation progressing from the four humours to microbes and DNA and ultimately a multifactorial model for etiopathogenesis). Is it a bad idea to pursue abstractions in your career? Just look at String Theorists; they have been doing this for a long time!
  6. Develop humility in approach and knowledge. Despite all the grand claims we make about our scientific “progress”, we’re just a tiny speck amongst the billions and billions of specks in the universe and limited by our senses and the biology of which we are made. The centuries old debate among philosophers of whether man can ever claim to one day have found the “ultimate truth” still rages on. However, recently we think we know from Kurt Godel’s work that there are truths out there in nature that man can never arrive at by scientific proof. In other words, truths that we may never ever know of! Our understanding of the universe and its things keeps shifting continuously, evolving as we ourselves as a species improve (or regress, depending on your point of view). Understanding that all of this is how science works is paramount. And there’s nothing wrong with that. It’s just the way it is! :-)
  7. Understand the overwhelming bureaucracy in science these days. But don’t get side-tracked! It’s far too big of a boatload to handle on one’s own! There are dangers that lead people to leave science altogether because of this ton of bureaucracy.
  8. Science for career’s sake is how many people get into it. Getting a paper out can be a good career move. But it’s far more fun and interesting to do science for science’s own sake, and the satisfaction you get by roaming free, untamed, and out there to do your own thing will be ever more lasting.
  9. Understand the peer-review process in science and its benefits and short-comings.
  10. Realize the extremely high failure rate in terms of the results you obtain. Over 90% by most anecdotal accounts – be that in terms of experimental results or publications. But it’s important to inculcate curiosity and to keep the propensity to question alive. To discover. And to have fun in the process. In short, the right attitude; despite knowing that you’re probably never going to earn a Fields medal or Nobel prize! Scientists like Carl Friedrich Gauss were known to dislike publishing innumerable papers, favoring quality over quantity. Quite contrary to the trends that Citation Metrics seem to have a hand in driving these days. It might be perfectly reasonable to not get published sometimes. Look at the lawyer-mathematician Pierre de Fermat, of Fermat’s Last Theorem fame. He kept notes and wrote letters but rarely if ever published in journals. And he never did publish the proof of Fermat’s Last Theorem, claiming that it was too large to fit in the margin of a copy of the book he was reading when the thought occurred to him. He procrastinated until he passed away, when it became one of the most profound math mysteries ever to be tackled, only to be solved about 358 years later by Andrew Wiles. But the important thing to realize is that Fermat loved what he did, and did not judge himself by how many gazillion papers he could or could not have had to his name.
  11. Getting published does have a sensible purpose though. The general principle is that the more peer-review the better. But that peer-review does not necessarily have to take the form of hundreds of thousands of journal papers. There’s freedom in how you go about getting it, if you get creative. And yes, sometimes peer-review fails to serve its purpose, due to egos and politics. The famous mathematician Évariste Galois was so fed up with it that he chose to circulate a lot of his work privately. And the rest, as they say, is history.
  12. Making rigorous strides depends crucially on a solid grounding in Math, Probability and Logic. What are the pitfalls of hypothesis testing? What is randomness and what does it mean? When do we know that something is truly random as opposed to pseudo-random? If we conclude that something is truly random, how can we ever be sure of it? What can we learn from how randomness is treated in inflationary cosmology, where quantum “jitter” matters over tiny distances but begins to fade over larger ones (cf. inhomogeneities in space)? Are there caveats involved when you build models or conceptions of things on one or another definition of randomness? How important is mathematics to biology and vice versa? There’s value in gaining these skills for biologists. Check out this great paper [1] and my own posts here and here. Also see the following lecture, which stresses the importance of teaching probability concepts for today’s world and its problems (and, just after the video, a little simulation sketch of my own on one of these pitfalls):


    [Video]
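To make one of those pitfalls concrete, here is a tiny simulation of my own in Python (just a sketch, not something from the lecture): if you run enough significance tests on pure noise, a handful of them will come out looking “significant” purely by chance.

    import math, random

    random.seed(42)  # pseudo-random: re-running with the same seed reproduces the same "experiments"

    def p_value_fair_coin(heads, flips=100):
        # Two-sided p-value for "this coin is fair", via the normal approximation
        z = (heads - flips * 0.5) / math.sqrt(flips * 0.25)
        return math.erfc(abs(z) / math.sqrt(2))

    experiments = 1000
    false_alarms = 0
    for _ in range(experiments):
        heads = sum(random.random() < 0.5 for _ in range(100))  # a perfectly fair coin
        if p_value_fair_coin(heads) < 0.05:
            false_alarms += 1

    # Roughly 5% of tests on pure noise come out "significant" at p < 0.05
    print(false_alarms / experiments)

Nothing real is going on in any of those 1000 experiments, yet roughly 50 of them would have made for a publishable-sounding “positive finding” – which is exactly why this pitfall matters.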

  13. Developing collaborative skills helps. Lateral reading, attending seminars and discussions at various departments can help spark new ideas and perspectives. In Surely You’re Joking Mr. Feynman!, the famous scientist mentions how he always loved to dabble in other fields, attending random conferences, even once working on ribosomes! It was the pleasure of finding things out that mattered! :-)
  14. Reading habits are particularly important in this respect. Diversify what you read. Focus on the science rather than the dreary politics of science. It’s a lot more fun! Learn the value of learning-by-self and taking interest in new things.
  15. Like it or not, it’s true that unchecked capitalism can ruin balanced lives, often rewarding workaholic self-destructive behavior. Learning to diversify interests helps take off the pressure and keeps you grounded in reality and connected to the majestic nature of the stuff that’s out there to explore.
  16. The rush that comes from all of this exploration has the potential to lead to unethical behavior. It’s important not to lose sight of the sanctity of life and the sanctity of our surroundings. Remember all the gory examples that WW2 gave rise to (from the Nazi doctors to all of those scientists whose work ultimately led to the loss of life that we now remember simply as “Hiroshima and Nagasaki”). Here’s where diversifying interests also helps. Think how a nuclear scientist’s perspective on his work could change if he spent a little time taking an interest in wildlife and the environment. Also, check this.
  17. As you diversify, try seeing science in everything – eg: when you think about photography, think not just about the art but about the nature of the stuff you’re shooting; the wonders of the human eye (the arrangement of its rods and cones, the consequences of the eyeball being round, its tonal range compared to that of spectral cameras); the math of perspective; the math of symmetry; and so on.
  18. Just like setting photography assignments helps to ignite the creative spark in you, set projects and goals in every avenue that you diversify into. There’s no hurry. Take it one step at a time. And enjoy the process of discovery!
  19. How we study the scientific process/method should be integral to the way we think about education. A good analogy, although a sad one, is conservation and how biology is taught at schools. Very few teachers and schools go out of their way to emphasize and interweave solutions for sustainable living and conserving wildlife into the material they teach, even though they will more than easily get into the nitty-gritty of the taxonomy, the morphology, etc. You’ll find paragraphs and paragraphs of verbiage on the latter but hardly a word on the former. This isn’t the model to replicate IMHO! There has to be a balance. We should be constantly reminded of what constitutes proper scientific ethics throughout our education, and it should never fade away into the background.
  20. The current corporate-driven, public-interest research model is a mixed bag. Science shouldn’t in essence be something for the privileged or be monopolized in the hands of a few. Good ideas have the potential to get dropped if they don’t make business sense. Understand public and private funding models and their respective benefits and shortcomings. In the end, realize that there are so many scientific questions out there to explore that there’s enough to fill everybody’s plate! It’s not going to be the end of the world if your ideas or projects don’t receive the kind of funding you desire. It’s ultimately pretty arbitrary :-) ! Find creative solutions to modify your project or set more achievable goals. The other danger in monetizing scientific progress is the potential to inculcate the attitude of doing science for money. Doing science for the joy of it is much more satisfying than doing it for material gain IMHO. But different people have different preferences. It’s striking a balance that counts.
  21. The business model of science leads us into the whole arena of patent wars and Intellectual Property issues. IMHO there’s much value in having a free-culture attitude to knowledge, such as the open-access and open-source movements. Imagine what the world would be like if Gandhi had patented Satyagraha, requiring random licensing fees or other forms of bondage! :-)
  22. It’s important to pursue science projects and conduct fairs and workshops even at the university level (just as much as it is emphasized in high school; I would say to an even greater degree actually). Just to keep the process of discovery and scientific spirit vibrant and alive, if for no other reason. Also, the more these activities reflect the inter-relationship between the three categories of science professionals and their work, the better. Institutions should recognize the need to encourage these activities for curricular credit, even if that means cutting down on other academic burdens. IMHO, on balance, the small sacrifice is worth it.
  23. Peer-review mechanisms currently reward originality. But at the same time, I think it’s important to reward repeatability/reproducibility, and to reward the publication of statistically insignificant (negative) findings. This not only helps remove bias from published research, but also helps keep the science community motivated in the face of the high failure rate in experiments, etc.
  24. Students should learn the art of benchmarking progress on a smaller scale, i.e. in the experiments, projects, etc. that they do. In the grand scheme of things however, we should realize that we may never be able to see humongous shifts in how we are doing in our lifetimes! :-)

    Srinivasa Ramanujan

  25. A lot of stuff that happens at Ivy League universities can be classified as voodoo and marketing. So it’s important not to fret if you can’t get into your dream university. The ability to learn lies within, and if appropriately tapped and channeled it can be used to accomplish great stuff regardless of where you end up studying. People who graduate from Ivy League institutes form a wide spectrum, with a surprising number who could easily be regarded as brain-dead. IMHO what can be achieved depends a lot more on the person than on the institution he or she goes to. Where there’s a will, there’s a way! :-) Remember, some of science’s most famous stalwarts, like Michael Faraday and Srinivasa Ramanujan, were largely self-taught!
  26. Understand the value of computing in science. Not only has this aspect been neglected at institutes (especially in Biology and Medicine), but it’s fast becoming indispensable because of the sheer volume of data that one has to sift through and process these days. I’ve recently written about bioinformatics and computer programming here and here.
  27. It’s important to develop a level of honesty and integrity that can withstand the onward thrust of cargo-cult science.
  28. Learn to choose wisely who your mentors are. Factor in student-friendliness, the time they can spend with you, and what motivates them to pursue science.
  29. I usually find myself repelled by demagoguery. But if you must, choose wisely who your scientific heroes are. Are they friendly to other beings and the environment? You’d be surprised how many evil scientists there are out there! :-)

I’m sure there are many many points that I have missed and questions that I’ve left untouched. I’ll stop here though and add new stuff as and when it occurs to me later. Send me your comments, corrections and feedback and I’ll put them up here!

I have academic commitments headed my way and will be cutting down on my blogular activity for a while. But don’t worry, not for long! :-)

I’d like to end now, by quoting one of my favorite photographers, George Steinmetz:

[Video]
George Steinmetz: “… I find that there is always more to explore, to question and, ultimately, to understand …”

Footnotes:

  1. Bialek, W., & Botstein, D. (2004). Introductory Science and Mathematics Education for 21st-Century Biologists. Science, 303(5659), 788-790. doi:10.1126/science.1095480


Copyright Firas MR. All Rights Reserved.

“A mote of dust, suspended in a sunbeam.”




Written by Firas MR

November 6, 2010 at 5:21 am

Let’s Face It, We Are Numskulls At Math!

with one comment

Noted mathematician, Timothy Gowers, talks about the importance of math

I’ve often written about Mathematics before (see the Footnotes below). As much as math helps us better understand our world (Modern Medicine’s recent strides have a lot to do with applied math, for example), it also tells us how severely limited man’s common thinking is.

Humans, and yes some animals too, are born with or soon develop an innate ability for understanding numbers. Yet, just like animals, our proficiency with numbers seems to stop short of the stuff that goes beyond our immediate activities of daily living (ADL) and survival. Because we are a higher form of being (or allegedly so, depending on your point of view), our ADLs are a lot more sophisticated than, say, those of canaries or hamsters. And consequently you can expect to see a little more refined arithmetic being used by us. But fundamentally, we share this important trait – of being able to work with numbers from an early stage. A man who has a family with kids knows almost by instinct that if he has two kids to look after, that means breakfast, lunch and dinner times 2 in terms of putting food on the table. He would have to buy two sets of clothes for his kids. A kid soon learns that he has two parents. And so on. It’s almost natural. And when someone can’t figure out their way through simple counting or arithmetic, we know that something might be wrong. In Medicine, we have a term for this: it’s called acalculia, and it often indicates the presence of a neuropsychiatric disorder.

It’s easy for ‘normal’ people to do 2 + 2 in their heads. Two oranges AND two oranges make a TOTAL of four oranges. This basic stuff helps us get by day-to-day. But how many people can wrap their heads around 1 divided by 0? If you went to school, yeah, sure, your teachers must have hammered an answer into you: infinity (strictly speaking, 1/0 is undefined; it’s the limit of 1/x as x shrinks toward zero that blows up to infinity). But how do you visualize it? Yes, I know it’s possible. But it takes unusual work. I think you can see my point, even with this simple example. We haven’t even begun to speak about probability, wave functions, symmetries, infinite kinds of infinities, multiple space dimensions, time’s arrow, quantum mechanics, the Higgs field or any of that stuff yet!
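If it helps, here’s a quick way (my own sketch in Python) to see what that really means: the literal division is undefined, but 1/x grows without bound as x shrinks toward zero – and that runaway growth is what “infinity” is shorthand for.

    x = 1.0
    for _ in range(8):
        print(f"1/{x} = {1 / x}")
        x = x / 10  # shrink x toward zero: the quotient keeps blowing up

    # The literal division, on the other hand, is simply undefined:
    try:
        print(1 / 0)
    except ZeroDivisionError:
        print("1/0 is not a number at all")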

As a species, we quite obviously aren’t at all good at math. It’s almost as if we construct our views of the universe through a tunnel vision that helps us in our day-to-day tasks, but fails otherwise.

We tend to think of using math as an ability, when really it should be thought of as a sensory organ – something as vital to understanding our surroundings as our eyes, ears, noses, tongues and skin. And despite lacking this sense, we tend to go about living as though we somehow understand everything; as though we are aware of all there is to be aware of. This can often lead to trouble down the road. I’ve talked before about numerous PhDs failing at the Monty Hall Paradox. But a recent talk I watched touched upon something with far graver consequences: people being wrongfully convicted because of stunted interpretation of DNA evidence, fingerprint evidence, etc. by none other than “expert” witnesses. In other words, serious life and death issues. So much for our expertise as a species, eh?! (A quick Monty Hall simulation follows after the video below.)

How the human mind struggles with math!
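For anyone who hasn’t met the Monty Hall Paradox, the quickest cure for disbelief is to simulate it. A minimal sketch of my own in Python (three doors, one car, and a host who always opens a goat door):

    import random

    def monty_hall(trials=100_000):
        stay_wins = switch_wins = 0
        for _ in range(trials):
            car = random.randrange(3)   # the prize sits behind one of three doors
            pick = random.randrange(3)  # the contestant's initial choice
            # The host opens a door that is neither the contestant's pick nor the car
            opened = next(d for d in range(3) if d != pick and d != car)
            # Switching means taking the one remaining unopened door
            switched = next(d for d in range(3) if d != pick and d != opened)
            stay_wins += (pick == car)
            switch_wins += (switched == car)
        return stay_wins / trials, switch_wins / trials

    print(monty_hall())  # roughly (0.33, 0.67)

Sticking with your first pick wins about a third of the time; switching wins about two-thirds – the very answer that has tripped up so many PhDs.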

We recently also learned that the hullabaloo over H1N1 pandemic influenza had a lot to do with our naive understanding of math, the pitfalls of corporate-driven, public-interest research notwithstanding.

Anyhow, one of my main feelings is that honing one’s math not only helps us survive better, but can also teach us about our place in the universe – because we can then begin to fully use it as a sensory organ in its own right. That is why a lot of pure scientists have argued that doing math for math’s own sake can not only be great fun (if done the right way, of course :-P) but should also be considered necessary: such research has the potential to reveal entirely new vistas that can enchant us and surprise us at the same time (take Cantor’s work on infinity, for example). For in the end, discovery, really, is far more enthralling than invention.

UPDATE 1: Check out the Khan Academy for a virtually A-Z education on math — and all of it for free! This is especially a great resource for those of us who can’t even recall principles of addition, subtraction, etc. let alone calculus or any of the more advanced stuff.

Copyright © Firas MR. All rights reserved.


# Footnotes:

  1. ذرا غور فرمائیے اپنے انسان ہونے کی حیثیت پر (Just Take A Moment To Reflect On Your Standing As A Human Being)
  2. Decision Tree Questions In Genetics And The USMLE
  3. The Story Of Sine
  4. On The Impact Of Thinking Visually
  5. A Brief Tour Of The Field Of Bioinformatics
  6. Know Thy Numbers!


ذرا غور فرمائیے اپنے انسان ہونے کی حیثیت پر (Just Take A Moment To Reflect On Your Standing As A Human Being)

with 2 comments

How small man really is

(An important note: to view this post in its proper form, readers will need to download this font and install it on their system. It is a font designed specifically for easy reading on a computer screen.)

Greetings, friends,

A thought crossed my mind a few moments ago. We humans think of ourselves as ‘Ashraf-ul-Makhluqat’ – the noblest of creation – and indeed we are taught this in religious terms as well. We often become quite arrogant about it, forgetting that this title was not bestowed upon us for nothing. Do we really deserve it?

Truth be told, we can only claim to deserve it when our thinking and our actions clearly prove it. But it seems to me that most of us neither realize this nor take any interest in it. We are so lost in living our day-to-day lives that we forget to play the role of ‘the noblest of creation’, and forget to pay attention to the things on which, by spending a little time, we could at least attempt to play that role.

Yesterday I watched a talk by the famous photographer Joel Sartore. In it he explained that the speed at which we are destroying the natural environment is many times greater than the speed at which we are discovering new species of living animals and plants. What an astonishing thing that is. Who knows how many natural wonders man will live out his days without ever seeing or reflecting upon. He also mentioned that within the next ten years, fifty percent of all the world’s amphibian species may vanish into extinction. This is no trivial matter!

In the medical world too, we have only recently begun to understand genetic codes, and new ways of thinking about our diseases are only now emerging. Who knows what further progress lies ahead and what new things we will discover.

Do we realize how little we know about the living world?

The famous photographer Joel Sartore’s superb talk on protecting the natural environment.

This magnificent film by the renowned photographer Yann Arthus-Bertrand shows us what we are putting at stake. Watch the full film here.

The world of physics also teaches us about our surroundings. Consider this: whatever the event or object, it can be thought of in terms of a probability wave. This is a remarkable idea, because the human mind doesn’t work that way. It refuses to accept that something could exist in more than one place at the same instant. Yet experiments have shown this to be possible (as in Wheeler’s Experiment), and scholars still debate what it means. And yes, experiments have also shown that two separate objects, no matter how far apart, can sometimes share a kind of connection known as quantum entanglement. The mind refuses to accept it, but it is true. There are many other fascinating ideas as well, such as multiple space dimensions, time dilation, wormholes and so on. What’s more, we can explore these things without leaving home. On a dark night, cast your eyes toward the stars and consider how long their light must have taken to reach your eyes. Are the stars you see still there after all that time, or have they already died? Scanning the sky may seem a daunting task, but distant objects like the Orion Nebula are right there in front of you! Just look!

Do we realize how little we know about the physical universe?

The Elegant Universe - PBS NOVA

An excellent film on physics from PBS NOVA

In the world of mathematics, we have only recently learned that infinity itself comes in different kinds. In fact, there are infinitely many kinds of infinity! One infinity can be larger or smaller than another. Consider two circles, A & B. One is large, the other small; that is, one has a larger circumference than the other. But isn’t it true that every circumference contains infinitely many sides? So the infinite sides of the large circle are somehow more than the infinite sides of the small circle, aren’t they? How fascinating. And there are who knows how many such things of which we still have no inkling today.

Do we realize how little we know about mathematics?

The Story of Maths - BBC

The BBC’s magnificent film, The Story of Maths

Marcus du Sautoy explains the importance of symmetry

The day before yesterday I watched another talk in which the speaker raised the point that every two weeks, a person dies who takes their language and its literature with them out of this world. They are the sole and last remaining speaker of their language, and after they are gone, their descendants lose their connection to that language forever – and with them, so does the rest of the world. The wisdom accumulated over centuries and bound up in that language effectively vanishes forever.

Do we realize that, sitting idly by, we are letting centuries of knowledge slip through our hands?

Every two weeks a language passes from this world

Truth be told, the answer to this question is usually no.

And rather than reflecting on these things, man is forever ready to waste his time and energy on things like quarrelling and waging war with one another, or, intoxicated by the pursuit of unethical economic supremacy, crushing one another or tripping each other up, and so on.

Is this the mark of an intelligent species? By behaving this way, aren’t we turning our backs on our role as ‘the noblest of creation’? Sometimes it seems this is how things will remain until the very end – and it will, so long as we don’t realize the need to set our life’s priorities right.

From this point another thought came to mind. Who knows how much of human intellect depends on past experience – that is, how much does the output of intelligence depend on the input of the past?

If a person had, in their past, remained unaware of their surroundings and of their own condition and station, would it really be surprising if they wasted their time on petty, unpleasant things and forgot to play their true role? If you think about it, we can find this phenomenon in artificial intelligence programming as well. My guess is that many AI systems work this way: they must contain code that helps the robot perceive the outside world, and as soon as it senses a change in its environment, it changes its behaviour. Perhaps we are the same way – our awareness and senses simply need to be awakened further.
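(If you’d like to picture that guess in code, here is a toy sketch of my own in Python of such a “sense and react” loop. It isn’t any particular AI system – just the general idea of behaviour driven by what is perceived.)

    import random, time

    def sense_environment():
        # Stand-in for a real sensor reading of the outside world
        return random.choice(["all quiet", "obstacle ahead"])

    previous = None
    for _ in range(5):
        reading = sense_environment()
        if reading != previous:  # a change in the surroundings...
            print(f"Environment now: {reading} -> adjusting behaviour")
            previous = reading
        time.sleep(0.1)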

I never thought I would write such a long philosophical piece – so much time passed while writing! Well friends, that’s it for today; see you next time. Every now and then we must take time out of this fast-paced life to think about fulfilling our role: do we truly deserve the title of ‘the noblest of creation’ or not? Do we understand our Lord and His universe or not? And how do we want to spend our time?

Before you go, here is a delightful English song on the same theme:

… The sky calls to us,
If we do not destroy ourselves,
We will one day venture to the stars …

Copyright Firas MR. All rights reserved.

Beginning Programming In Plain English

with 3 comments

Part 1 of an introductory series on programming using the Python language via SciPy @ Archive.org Special Thanks

Before I begin today’s discussion (since it concerns another book), a quick plug for Steve McCurry, whose photography I deeply admire and whose recent photo-essays on the subject of reading are especially inspirational and worth checking out. I quote:

Fusion: The Synergy of Images and Words Part III « Steve McCurry’s Blog

“Reading is a means of thinking with another person’s mind; it forces you to stretch your own.” — Charles Scribner

Susan Sontag said: “The camera makes everyone a tourist in other people’s reality.” The same can be said for reading books.

Every once in a while, I receive feedback from readers as to how much they appreciate some of my writing on non-clinical/non-medical subjects. Sometimes, the subject matter concerns books or web resources that I’ve recently read. Occasionally, I also like taking notes as I happen to read this material. And often, friends, family and colleagues ask me questions on topics that I’ve either read a book about or have made notes on. Note-taking is a good habit as you grow your comprehension of things. In my opinion, it also helps you skeletonize reading material – sort of like building a quick ‘Table Of Contents’ – that you can utilize to build your knowledge base as you assimilate more and more.

If you’ve ever visited a college bookstore in India, you’ll find dozens and dozens of what are popularly referred to as “guides” or “guidebooks”. These contain summaries and notes on all kinds of subjects – from medicine to engineering and beyond. They help students:

  1. Cut through the verbosity in their main coursebooks (often written in English that is more befitting the Middle Ages) and focus on skeletonized material
  2. Cram before exams

I tend to think of my notes and summaries of recently-read books as guidebooks – anchor points that I (and often family or friends) can come back to later on, sometimes when I’ve long forgotten a lot of the material!

I write this summary in this spirit. So with all of that behind us, let’s begin.

I stumbled upon an enticing little book recently, called “Learning the BASH shell“, by Cameron Newham & Bill Rosenblatt. Being the technophile that I am, I just couldn’t resist taking a peek.

I’ve always been fascinated by the innards of computers – from how they’re made and assembled to how they are programmed and used. My first real foray into them began with learning some of the fundamentals of DOS and BASIC on an old 286 (I think) as a 7th grader. Those were the days of pizza-box styled CPU-case form factors, monochrome monitors that had a switch that would turn text green, hard disks that were in the MB range, RAM that was measured in KB, and when people thought 3.5 inch floppies were cool. Oh boy, I still do remember the way people used to go gaga over double-sided, high-density, pre-formatted and stuff! As I witnessed the emergence of CDs and then later DVDs and now SSDs and portable HDs, I got my hands dirty on the 386, the 486, the Pentium 1, the Pentium 3, the Pentium 4 (still working!) and my current main workstation, which is a Core 2 Duo. Boy, have I come a long way! Over the years I’ve read a number of books on computer hardware (this one and this one recently – more on them for a future post) and software applications and Operating Systems (such as this one on GIMP, this one on GPG, this one, this one and this one on Linux and this one and this one on FreeBSD – again, more on them later!). But there was always one cranny that seemed far too daunting to approach. Yup, programming. Utterly jargoned, the world of modern programming has seemed esoteric and complicated to me ever since the old days, when BASIC and dBASE could fill your plate. And when you’ve forgotten >95% of the BASIC you once knew, that doesn’t help either. Ever since reading about computational biology or bioinformatics (see my summary of a book on the topic here), I’ve been convinced that getting at least a superficial handle on computer programming concepts can mean a lot in terms of having a competitive edge if you ever contemplate being in the research world. This interplay between technology and biology, and the level to which our research has evolved over the past few decades, was further reinforced by something I read recently in an interview of Kary Mullis, the inventor of PCR, who eventually won the Nobel Prize for his work:

Edge: Eat Me Before I Eat You! A New Foe For The Bad Bugs, A Talk with Kary Mullis

[...]

What I do personally is the research, which I can do from home because of the Internet, which pleases me immensely. I don’t need to go to a library; I don’t need to even talk to people face to face.

[...]

There are now whole books and articles geared towards programming and biology. I recommend the great introductory essay, Why Biologists Want to Program Computers, by James Tisdall.

“Learning the BASH shell” is a fascinating, newbie-friendly introduction to the world of programming and assumes only an extremely rudimentary familiarity with how computers work or with computer programming in general. It certainly helps if you have a working understanding of Linux or any one of the Unix operating system flavors, but if you’re on Windows you can get by using Cygwin. I’ve been using Linux for the last couple of years (originally beginning with Ubuntu 6.06, then Arch Linux and Debian, Debian being my current favorite), so this background certainly helped me grasp some of the core concepts much faster.

Defining Programming

So what exactly is programming anyway? Well, think of programming as a means to talk to your computer to carry out tasks. Deep down, computers understand nothing but the binary number system (eg: copy this file from here to there translates into gibberish like .…010001100001111000100110…). Not something that most humans would find even remotely appealing (apparently some geeks’ favorite pastime is reverse-engineering human-friendly language from binary!). Now most of us are familiar with using a mouse to point-and-click our way to getting tasks done. But sometimes it becomes necessary to speak to our computers in more direct terms. This ultimately comes down to entering a ‘programming environment’, typing words in a special syntax (depending on what programming language you use) using this environment, saving these words in a file and then translating the file and the words it contains into language the computer can understand (binary language). The computer then executes tasks according to the words you typed. Most languages can broadly be divided into:

  1. Compiler-based: Words in the programming language need to be converted into binary using a program called a ‘compiler’. The binary file can then be run independently. (eg. the C programming language)
  2. Interpreter-based: Words in the programming language are translated on-the-fly into binary. This on-the-fly conversion occurs by means of an intermediary program called an ‘interpreter’. Because of the additional resources required to run the interpreter program, it can sometimes take a while before your computer understands what exactly it needs to do. (eg. the Perl or Python programming languages)
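To make the distinction concrete, here is a trivial sketch of my own (not from the book): with an interpreter-based language there is no separate compile step at all.

    # hello.py -- an interpreter-based example.
    # You hand this file straight to the interpreter (for instance: python hello.py)
    # and it is translated into machine-understandable form on the fly.
    # A compiler-based language like C would instead need a separate step first
    # (something like: gcc hello.c -o hello), producing a binary you then run on its own.
    print("Hello, world!")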

If you think about it, a lot of the stuff we take for granted is actually similar to programming languages. HTML (the stuff of which most web pages are made) and LaTeX (used to make properly typeset, professional-quality documents) are called text mark-up languages. By placing the typed words in your document between various tags (i.e. by ‘marking’ text), you tell your web browser’s HTML-rendering engine or your LaTeX program’s rendering engine to interpret the document’s layout, etc. in a specific way. It’s all actually similar to interpreter-based programming languages. JavaScript, the language that’s used to ask your browser to open up a pop-up, etc., is also pretty similar.

What is BASH?

BASH is first and foremost a ‘shell’. If you’ve ever opened up a Command-Prompt or CLI (Command Line Interface) on Windows (Start Menu > Accessories > Command Prompt), then you’ve seen what a shell looks like. Something that provides a text interface to communicate with the innards of your operating system. We’re used to doing stuff the GUI way (Graphical User Interface), using attractive buttons, windows and graphics. Think of the shell as just an alternative means to talk to your computer. Phone-line vs. paper-mail, if that metaphor helps.

Alright, so we get that BASH provides us with an interface. But what else does it do? Well, BASH is also an interpreted programming language! That is amazing because what this allows you to do is use your shell to create programs for repetitive or complicated multi-step tasks. A little segue into Unix philosophy bears merit here. Unix-derivative operating systems, unlike others, basically stress breaking complicated tasks into tiny bits. Each bit is worked on by a program that specializes in that given component of a task. sort is a Unix program that sorts text. cut snips off a chunk of text from a larger whole. grep is used to find text. sed is used to replace text. The find program is used to find files and directories. And so on. If you need to find a given file, then look for certain text in it, yank out a portion of it, replace part of this chunk, and then sort the result, all you do is combine find, grep, sed, cut and sort using the proper syntax. But what if you didn’t really want to replace text? Then all you do is omit sed from the workflow. See, that’s the power of Unix-based OS(s) like Linux or FreeBSD. Flexibility.

The BASH programming language takes simple text files as its input. Then an interpreter called bash translates the words (commands, etc.) into machine-readable instructions. It’s really as simple as that. Because BASH embraces the Unix philosophy, it assumes you’ll need to use the various Unix-type programs to get stuff done. So at the end of the day, a BASH program looks a lot like the following (the plain-English steps are kept as comments, marked with #, above the commands they describe):

# execute the Unix program date and assign its output (the current hour) to variable x
x=$(date +%H)
# if x = 8 AM
if [ "$x" -eq 8 ]; then
    # then execute Unix programs in this order (find, grep, sed, cut, sort, etc.)
    find . -name '*.txt' | xargs grep 'draft' | sed 's/draft/final/' | cut -c1-40 | sort
fi

Basic Elements of Programming

In general, programming consists of breaking down complicated tasks into bits using unambiguous language in a standard syntax.

The fundamental idea (using BASH as an example; a short illustrative sketch follows this list) is to:

  1. Construct variables.
  2. Manipulate variables. Add, subtract, change their text content, etc.
  3. Use Conditions such as if/then (referred to in technobabble as “Flow Control”)
  4. Execute Unix programs based on said Conditions
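The same four elements show up in pretty much any language. Here is the promised sketch – my own, and in Python rather than BASH, purely for illustration – that constructs a variable, manipulates it, branches on a condition and then executes a Unix program (it assumes a Unix-like system where the date program exists):

    import subprocess
    from datetime import datetime

    # 1. Construct a variable
    hour = datetime.now().hour
    # 2. Manipulate it (derive a friendly label from it)
    label = "morning" if hour < 12 else "afternoon/evening"
    # 3. Flow control: act only when a condition holds
    if hour == 8:
        # 4. Execute a Unix program (here, date) based on that condition
        subprocess.run(["date"])
    print(f"It is currently the {label} (hour {hour}).")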

All it takes to get going is learning the syntax of framing your thoughts. And for some languages this can get hairy.

This explains why some of the most popular programming languages out there try to emulate human language as much as possible in their syntax. And why a popular language such as Perl was in fact developed by a linguist!

This was just a brief and extremely high-level introduction to basic concepts in programming. Do grab yourself a copy and dive into “Learning the BASH shell” with the aforementioned framework in mind. And before you know it, you’ll soon start putting two and two together and be on your way to developing your own nifty programs!

I’m going to end for today with some of the additional excellent learning resources that I’m currently exploring to take my quest further:

  1. Steve Parker’s BASH tutorial (extremely easy to follow along)
  2. Greg’s BASH Guide (another one recommended for absolute noobs)
  3. Learning to Program Using Python – A Tutorial for Hobbyists, Self-Starters, and All Who Want to Learn the Art of Computer Programming by Alan Gauld
  4. How to think like a Computer Scientist – Learning with Python by Jeffrey Elkner, Allen B. Downey, and Chris Meyers

UPDATE 1: If you’re looking for a programming language to begin with and have come down to either Perl or Python, but are finding it difficult to choose one over the other, then I think you’ll find the following article by the famous Open Source Software advocate, Eric S. Raymond, a resourceful read: Why Python?

UPDATE 2: A number of resourceful, science-minded people at SciPy conduct workshops aimed at introducing Python and its applications in science. They have a great collection of introductory videos on Python programming concepts & syntax here. Another group, called FOSSEE, has a number of workshop videos introducing Python programming here. They also have a screencast series on the subject here.

UPDATE 3: AcademicEarth.org has quite a number of useful lecture series and Open Courseware material on learning programming and basic Computer Science concepts. Check out the MIT lecture, “Introduction to Computer Science and Programming” which is specifically designed for students with little to no programming experience. The lecture focuses on Python.

Copyright Firas MR. All rights reserved.


# Player used is Stream Player licensed under the GPL. Special thanks to Panos for helping me get the embedded video to work! Steps I followed to get it working:

  • Download the Stream Player plugin as a zip. Extract it locally. Rename the player.swf file to player-swf.jpg
  • Upload player-swf.jpg to your WordPress.com Media Library. Don’t worry, WordPress.com will not complain since it thinks it’s being given a JPG file!
  • Next insert the gigya shortcode as explained at Panos’ website. I inserted the following between square brackets, [ ] :
  • gigya  src="http://mydominanthemisphere.files.wordpress.com/2010/11/player-swf.jpg"  width="512" wmode="transparent" allowFullScreen="true" quality="high"  flashvars="file=http://ia311014.us.archive.org/1/items/scipy09_introTutorialDay1_1/scipy09_introTutorialDay1_1_512kb.mp4&image=http://ia311014.us.archive.org/1/items/scipy09_introTutorialDay1_1/scipy09_introTutorialDay1_1.thumbs/scipy09_introTutorialDay1_1_000180.jpg&provider=http"

  • Parameters to flashvars are separated by ampersands like flashvars="file=MOVIE URL HERE&image=IMAGE URL HERE". The provider="http" parameter to flashvars states that we would like to enable skipping within the video stream.

The Doctor’s Apparent Ineptitude

leave a comment »

ineptitude

via Steve Kay@Flickr (by-nc-nd license)

As a fun project, I’ve decided to frame this post as an abstract.

AIMS/OBJECTIVES:

To elucidate factors influencing perceived incompetence on the part of the doctor by the layman/patient/patient’s caregiver.

MATERIALS & METHODS:

Arm-chair pontification and a little gedankenexperiment based on prior experience with patients as a medical trainee.

RESULTS:

Preliminary analyses indicate widespread suspicions among patients on the ineptitude of doctors no matter what the level of training. This is amply demonstrated in the following figure:

As one can see, perceived ineptitude forms a wide spectrum – from most severe (med student) to least severe (attending). The underlying perceptions of incompetence do not seem to abate at any level however, and eyewitness testimonies include phrases such as ‘all doctors are inept; some more so than others’. At the med student level, exhausted patients find their anxious questions being greeted with a variety of responses ranging from the dumb ‘I don’t know’, to the dumber ‘well, I’m not the attending’, to the dumbest ‘uhh…mmmm..hmmm <eyes glazed over, pupils dilated>’. Escape routes will be meticulously planned in advance both by patients and more importantly by med students to avert catastrophe.

As for more senior medics such as attendings, evasion seems to be just a matter of hiding behind statistics. A gedankenexperiment was conducted to demonstrate this. The settings were two patients A and B, undergoing a certain surgical procedure and their respective caregivers, C-A and C-B.

Patient A

Consent & Pre-op

C-A: (anxious), Hey doc, ya think he’s gonna make it?

Doc: It’s difficult to say and I don’t know that at the moment. There are studies indicating that 95% live and 5% die during the procedure though.

C-A: ohhh kay (slightly confused) (murmuring)…’All this stuff about knowing medicine. What does he know? One simple question and he gives me this? What the heck has this guy spent all these years studying for?!’

Post-op & Recovery

C-A: Ah, I just heard! He made it! Thank you doctor!

Doc: You’re welcome (smug, god-complex)! See, I told ya 95% live. There was no reason for you to worry!

C-A: (sarcastic murmur) ‘Yeah, right. Let him go through the pain of not knowing and he’ll see. Look at him, so full of himself – as if he did something special; luck was on our side anyway. Heights of incompetence!’

Patient B

Consent & Pre-op

C-B: (anxious) Hey doc, ya think he’s gonna make it?

Doc: It’s difficult to say and I don’t know that at the moment. There are studies indicating that 95% live and 5% die during the procedure though.

C-B: ohhh kay (slightly confused) (murmuring)…’All this stuff about knowing medicine. What does he know? One simple question and he gives me this? What the heck has this guy spent all these years studying for?!’

Post-op & Recovery

C-B: (angry, shouting numerous expletives) What?! He died on the table?!

Doc: Well, I did mention that there was a 5% death rate.

C-B: (angry, shouting numerous expletives).. You (more expletives) incompetent quack! (murmuring) “How convenient! A lawsuit should fix him for good!”

The Doctor’s Coping Strategy

Although numerous psychology models can be applied to understand physician behavior, the Freudian model reveals some interesting material. Common defense strategies that help doctors include:

Isolation of affect: eg. Resident tells Fellow, “you know that patient with the …well, she had a massive MI and went into VFib..died despite ACLS..poor soul…so hey, I hear they’re serving pizza today at the conference…(the conference about commercializing healthcare and increasing physician pay-grades for ‘a better  and healthier tomorrow’)”

Intellectualization: eg. Attending tells Fellow, “so you understand why that particular patient bled to death? Yeah it was DIC in the setting of septic shock….plus he had a prior MI with an Ejection Fraction of 33% so there was that component as well..but we couldn’t really figure out why the antibiotics didn’t work as expected…ID gave clearance….(ad infinitum)…so let’s present this at our M&M conference this week..”

Displacement: eg. Caregiver yells at Fellow, “<expletives>”. Fellow yells at intern, “You knew that this was a case that I had a special interest in and yet you didn’t bother to page me? Unacceptable!…” Intern then yells at med student, “Go <expletives> disimpact Mr. X’s bowels…if I don’t see that done within the next 15 minutes, you’re in for a class! Go go go…clock’s ticking…tck tck tck!”

We believe there are other coping mechanisms that are important too, but in our observations these appear to be the most common. Of the uncommon ones, we think med students as a group are particularly vulnerable to Regression & Dissociation, after duly accounting for confounding factors.

All of these form a systematic, ego-syntonic pattern of behavior that, for reasons we are still exploring, is not included in the DSM-IV manual’s section on Personality Disorders.

CONCLUSIONS:

Patients and their caregivers seem to think that ALL doctors are fundamentally inept, period. Ineptitude follows a wide spectrum however – ranging from the bizarre to the mundane. Further studies (including but not limited to arm-chair pontification) need to be carried out to corroborate these startling results and the factors that we have reported. Other studies need to elucidate remedial measures that can be employed to save the doctor-patient relationship.

NOTE: I wrote this piece as a reminder of how the doctor-patient relationship is experienced from the patient’s side. In our business-as-usual frenzy, we as medics often don’t think about these things. And these things often DO matter a LOT to our patients!

Copyright © Firas MR. All rights reserved.

USMLE – Designing The Ultimate Questions

leave a comment »

Question

Shot courtesy crystaljingsr @ Flickr (Creative Commons Attribution, Non-Commercial License)

 

There are strategies that examiners can employ to frame questions that are designed to stump you on an exam such as the USMLE. Many of these strategies are listed out in the Kaplan Qbook and I’m sure this stuff will be familiar to many. My favorite techniques are the ‘multi-step’ and the ‘bait-and-switch’.

The Multi-Step

Drawing on principles of probability theory, examiners will often frame questions that require you to know multiple facts and concepts to get the answer right. As a crude example:

“This inherited disease exclusive to females is associated with acquired microcephaly and the medical management includes __________________.”

Such a question would be re-framed as a clinical scenario (an outpatient visit) with other relevant clinical data such as a pedigree chart. To get the answer right, you would need:

  1. Knowledge of how to interpret pedigree charts and identify that the disease manifests exclusively in females.
  2. Knowledge of Mendelian inheritance patterns of genetic diseases.
  3. Knowledge of conditions that might be associated with acquired microcephaly.
  4. Knowledge of medical management options for such patients.

Now taken individually, each of these steps – 1, 2, 3 and 4 – has (let’s say) a probability of 50% that you could get it right purely by random guessing. Combined together however, which is what is necessary to get the answer, the probability would be 50% * 50% * 50% * 50% = 6.25% [the combined probability of independent events]. So now you know why they actually prefer multi-step questions over one- or two-liners! :) Notice that this doesn’t necessarily have anything to do with testing your intelligence, as some might think. It’s just being able to recollect hard facts and then being able to put them together. They aren’t asking you to prove a math theorem or calculate the trajectory of a space satellite :P !
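A quick back-of-the-envelope check of that arithmetic (the 50%-per-step figure is, of course, just a simplifying assumption; with, say, a 1-in-4 blind guess per step the combined odds shrink even further):

    # Combined probability of independently guessing every step right
    p_per_step = 0.5
    steps = 4
    print(p_per_step ** steps)  # 0.0625 -> 6.25%

    # With a 1-in-4 chance per step instead:
    print(0.25 ** steps)        # 0.00390625 -> about 0.4%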

The Bait-and-Switch

Another strategy is to pack the question chock-full of irrelevant data. You could have paragraph after paragraph describing demographic characteristics, anthropometric data, and ‘bait’ data that’s planted there to persuade you to think along certain lines, and as you grind yourself to ponder over these things you are suddenly presented with an entirely unrelated sentence at the very end, asking a completely unrelated question! Imagine being presented with the multi-step question above with one added fly in the ointment. As you finally finish the half-page-length question, it ends with ‘<insert-similar-disease> is associated with the loss of this enzyme and/or body part: _______________’. Very tricky! Questions like these give flashbacks and déjà vu of days from 2nd year med school, when that patient with a neck lump begins by giving you his demographic and occupational history. As an inexperienced med student you immediately begin thinking: ‘hmmm..okay, could the lump be related to his occupation? …hmm…’. But wait! You haven’t even finished the physical exam yet, let alone the investigations. As medics progress along their careers they tend to phase out this kind of analysis in favor of more refined ‘heuristics’, as Harrison’s puts it. A senior medic will often wait to formulate opinions until the investigations are done, and will focus on triaging problems and asking if management options are going to change them. The keyword here is ‘triage’. Just as a patient’s clinical information in a real office visit is filled with much irrelevant data, so too are many USMLE questions. That’s not to say that demographic data, etc. are irrelevant under all conditions. Certainly, an occupational history of being employed at an asbestos factory would be relevant in a case that looks like a respiratory disorder. If the case looks like a respiratory disorder but the question mentions an occupational history of being employed as an office clerk, then this is less likely to be relevant to the case. Similarly, if it’s a case that overwhelmingly looks like an acute abdomen, then a stray symptom of foot pain is less likely to be relevant. Get my point? That is why many recommend reading the last sentence or two of a USMLE question before reading the entire thing. It helps you establish what exactly is the main problem that needs to be addressed.

Hope readers have found the above discussion interesting :). Adios for now!

Copyright © Firas MR. All rights reserved.
